srcsim / qual_task1

Overview

SRC qualification 1 is an image processing task. The qualification loads with R5 standing in front of a console. Located on the console are a number of LEDs and screens, all of which are black.

The task is started when an empty message is sent to the /srcsim/qual1/start topic. At this point the large screen in the center of the console will turn white. This transition signals that the task is starting.

One at a time, LEDs will turn on and then off. Each LED stays on for between 5 and 20 seconds and is either red, green, or blue. While an LED is on, you must report its color and the location of its center, in meters, relative to R5's head frame. Color values are in the range [0, 1]; for example, red is r=1, g=0, b=0. Your score is based on the accuracy of the reported locations and colors.

Important Note

Raw image data is inverted because the multisense head is mounted upside-down. Make sure to adjust the image data accordingly in order to arrive at physically accurate LED positions.
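The inversion can be undone by rotating each frame 180 degrees (reverse the rows, then reverse each row). A minimal sketch with plain Python lists, standing in for the numpy/OpenCV call you would use on real frames:

```python
def rotate_180(image):
    """Rotate a row-major image 180 degrees: reverse the row order,
    then reverse the pixels within each row."""
    return [row[::-1] for row in reversed(image)]

# A tiny 2x3 single-channel "image":
img = [[1, 2, 3],
       [4, 5, 6]]
print(rotate_180(img))  # [[6, 5, 4], [3, 2, 1]]
```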

[Image: the qual 1 console with an LED lit yellow]

The end of the task is marked by the center screen transitioning from white to black. At this point, no more LEDs will turn on, and it is safe to quit Gazebo and submit your results.

Quickstart

  1. Run the qualification task

    source /opt/nasa/indigo/setup.bash

    roslaunch srcsim qual1.launch extra_gazebo_args:="-r" init:="true"

  2. Test out signaling the start of the task

    rostopic pub /srcsim/qual1/start std_msgs/Empty

  3. Write a program to solve the task

  4. Rerun the qualification task and run your solution.

  5. Submit the two log files described in the Upload your log files section below.

2D Image Processing

Refer to the API for a topic that publishes camera data. Subscribe to this topic by registering a callback. The callback will receive camera image data when the data is available. Refer to the ROS Publisher Subscriber Tutorial for more details on nodes, publishers, and subscribers.

Make use of any image processing library, such as OpenCV, to determine if an LED is on and where in the image the LED is located.
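As a toy illustration of the detection step, the sketch below thresholds a single-channel frame held in plain Python lists and returns the centroid of the bright pixels. `find_led` and its threshold are illustrative, not part of srcsim; a real solution would run OpenCV on the actual camera images and handle color channels.

```python
def find_led(image, threshold=200):
    """Return the (row, col) centroid of pixels brighter than the
    threshold in a row-major single-channel image, or None if no
    pixel exceeds the threshold (i.e. no LED is lit)."""
    hits = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > threshold]
    if not hits:
        return None
    n = float(len(hits))
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# A dark frame with a 2x2 bright blob whose centroid is (1.5, 2.5):
frame = [[0, 0,   0,   0],
         [0, 0, 255, 255],
         [0, 0, 255, 255],
         [0, 0,   0,   0]]
print(find_led(frame))  # (1.5, 2.5)
```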


Depth data

R5 comes equipped with a stereo camera and a spinning Hokuyo laser. Use these sensors to determine where an LED is located.
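Once you know the LED's pixel location and its depth, one common approach is to back-project with the pinhole camera model. The intrinsics below (`fx`, `fy`, `cx`, `cy`) are made-up placeholder values for illustration; the real ones are published on the camera's `camera_info` topic.

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (meters along the optical
    axis) into a 3D point in the camera frame via the pinhole model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Hypothetical intrinsics -- read the real ones from camera_info.
fx = fy = 500.0
cx, cy = 320.0, 240.0
print(pixel_to_point(320.0, 240.0, 2.5, fx, fy, cx, cy))  # (0.0, 0.0, 2.5)
```

The resulting point is in the camera's optical frame; you still need to transform it into the head frame (and account for the inverted mounting) before reporting.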

Reporting LED color and location

Send the location, in meters relative to the head frame of R5, and color, using a range of [0, 1] for each RGB component, of each LED to the /srcsim/qual1/light topic. This topic expects a srcsim/Console message. Your answer will appear in a log file that you must submit in order to complete the qualification tasks. This log file is in your home directory with the name src_qual1_<timestamp>.log, where <timestamp> is an ISO string of the time that simulation started.
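A minimal reporting sketch is below. It assumes the srcsim/Console message carries `x, y, z, r, g, b` fields; check Console.msg in the srcsim package for the actual definition. The `make_report` helper is purely illustrative and just clamps the color components into [0, 1].

```python
def make_report(x, y, z, r, g, b):
    """Bundle one LED answer, clamping each color component to [0, 1]."""
    def clamp(v):
        return max(0.0, min(1.0, v))
    return {"x": x, "y": y, "z": z,
            "r": clamp(r), "g": clamp(g), "b": clamp(b)}

def publish_report(report):
    """Publish one answer; run this under ROS. Imports are kept local so
    the pure helper above stays usable without a ROS install."""
    import rospy
    from srcsim.msg import Console  # field names assumed; see Console.msg
    rospy.init_node("qual1_reporter")
    pub = rospy.Publisher("/srcsim/qual1/light", Console, queue_size=1)
    rospy.sleep(1.0)  # give the subscriber time to connect
    pub.publish(Console(**report))

print(make_report(2.59, -0.42, -0.60, 1.0, 0.0, 0.0))
```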


Upload your log files

A submission for qual task 1 requires two files. The first is a text file that contains information about the LEDs and your reported LED locations. The second file contains simulation state information.

1. LED answer log file

After each run of qual task 1, a new file named src_qual1_<timestamp>.log appears in your home directory. Make sure you submit the correct src_qual1_<timestamp>.log file along with your simulation state log file.

2. Simulation state log file

You need to add the parameter extra_gazebo_args:="-r" to roslaunch to enable Gazebo logging. When you're ready to start your SRC task with logging enabled, type:

roslaunch srcsim qual1.launch extra_gazebo_args:="-r" init:="true"

A log file is written to ~/.gazebo/log/<timestamp>/gzserver/state.log. You can verify that your log file was properly generated by loading it into Gazebo:

gazebo -p ~/.gazebo/log/<timestamp>/gzserver/state.log

We highly recommend playing back your log files before submission. You can do so as follows (the lights won't flash during playback):

roscore & rosrun gazebo_ros gazebo -p ~/.gazebo/log/<timestamp>/gzserver/state.log 

The size of a log file can be quite large, depending on the complexity of the world. For submission, we'll reduce the size of the log file by sampling at a lower rate and filtering out some of the information. Run the following command inside the folder where your log file was created.

gz log -e -f state.log --filter *.pose/*.pose -z 60 > qual_1.log

And then, compress your file:

gzip -c qual_1.log > qual_1.log.gz

Always save the original log and submit the filtered log (qual_1.log.gz).

Check your answers

A script to test answers is being made available to the teams. We encourage teams to check their answers before submission to make sure that the log file is correct and the answers look reasonable.

Note: The scoring files are only meant to aid in debugging. They do not reflect the real score.

Create a directory for storing the scoring scripts:

mkdir ~/srcsim_score && cd ~/srcsim_score

Download the following Ruby scripts:

wget https://bitbucket.org/osrf/srcsim/raw/score/scoring/common.rb
wget https://bitbucket.org/osrf/srcsim/raw/score/scoring/scoring_q1.rb

Install the following dependencies:

sudo apt-get install ruby-nokogiri

Make the scoring script executable:

chmod +x scoring_q1.rb

Check your score by passing the previously generated files to the script (note that you should use qual_1.log, not state.log):

./scoring_q1.rb <path_to>/src_qual1_<timestamp>.log <path_to>/qual_1.log

The script will compare each of your answers to the ground truth, for example:

Answer 1: Color:    answer                         [0.0000 1.0000 0.0000]
                    ground truth                   [1.0000 0.0000 0.0000]
                    euclidean error                [1.414214]
          Position: answer                         [2.5900 -0.4200 -0.6000]
                    ground truth (neck)            [2.6013 -0.4091 -0.6288]
                    ground truth (head)            [2.4890 0.4091 0.3825]
                    ground truth (head flipped)    [2.4890 -0.4091 -0.3825]
                    euclidean error (neck)         [0.032794]
                    euclidean error (head)         [1.289608]
                    euclidean error (head flipped) [0.240024]
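The euclidean errors above are plain straight-line distances between your answer and the ground-truth vector. For instance, the color error for reporting green when the LED was red works out to sqrt(2):

```python
import math

def euclidean_error(answer, truth):
    """Straight-line distance between two 3-vectors (color or position)."""
    return math.sqrt(sum((a - t) ** 2 for a, t in zip(answer, truth)))

# Color error from the sample output: answered green, ground truth red.
err = euclidean_error((0.0, 1.0, 0.0), (1.0, 0.0, 0.0))
print(round(err, 6))  # 1.414214
```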

The total for the whole task is shown in the end, for example:

Duration: 27.569000000
Total color euclidean error:                   7.171068
Total position euclidean error (neck):         15.028268
Total position euclidean error (head):         24.338433
Total position euclidean error (head flipped): 16.885048

Ground truth

The script compares your answer to a few different frames:

  1. The "neck" frame corresponds to the upperNeckPitchLink frame, which has its z axis up when the robot is standing.
  2. The "head" frame corresponds to the head frame, which has its z axis down when the robot is standing.
  3. The "head flipped" frame is the same as the head frame, but rolled 180 degrees so the image is upright.

[Images: head frame and flipped head frame, annotated]

  • Note that the head frame is not visible in your simulation; it was added to these images for demonstration purposes.
