
SubT API

Overview

The control of a SubT team encompasses access to each robot's sensors and actuators, the ability to navigate each robot to the right location, and, optionally, communication among team members to distribute tasks. Additionally, the locations of multiple artifacts need to be reported. SubT provides a combination of ROS and C++ interfaces to manage these tasks. This section explains these interfaces, grouped by type.

Navigation

All robots implement velocity controllers and accept geometry_msgs/Twist messages. With this interface, you can command linear and angular velocities to each robot. Robots accept velocity commands on the topic "/<robot_name>/cmd_vel". A minimal publisher example follows the table below.

Topic Description
/X1/cmd_vel (X1) Target velocity
/X2/cmd_vel (X2) Target velocity
/X3/cmd_vel (X3) Target velocity
/X4/cmd_vel (X4) Target velocity
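
For example, a minimal roscpp node that drives X1 forward while turning might look like the following sketch (the topic name is taken from the table above; adapt it to the robot you are commanding):

#include <ros/ros.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "x1_teleop_example");
  ros::NodeHandle nh;

  // Publisher on X1's velocity command topic.
  ros::Publisher velPub =
    nh.advertise<geometry_msgs::Twist>("/X1/cmd_vel", 1);

  // Drive forward at 0.5 m/s while turning at 0.2 rad/s.
  geometry_msgs::Twist cmd;
  cmd.linear.x = 0.5;
  cmd.angular.z = 0.2;

  // Velocity controllers typically expect a steady stream of commands,
  // so publish at 10 Hz until the node is shut down.
  ros::Rate rate(10);
  while (ros::ok())
  {
    velPub.publish(cmd);
    rate.sleep();
  }
  return 0;
}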

Lights

Each robot is equipped with an array of headlights and a few programmable colored LEDs. Each individual headlight can be turned on and off via a ROS service. Each LED is configured (via SDF) to blink in a certain pattern and can be enabled or disabled via a ROS service. An example service call follows the table below.

Service Description
/X1/center_left_headlight/enable (X1) Enable/disable headlight
/X1/center_right_headlight/enable (X1) Enable/disable headlight
/X1/left_headlight/enable (X1) Enable/disable headlight
/X1/left_lateral_front_led/enable (X1) Enable/disable LED
/X1/left_lateral_rear_led/enable (X1) Enable/disable LED
/X1/rear_center_led/enable (X1) Enable/disable LED
/X1/rear_left_led/enable (X1) Enable/disable LED
/X1/rear_right_led/enable (X1) Enable/disable LED
/X1/right_headlight/enable (X1) Enable/disable headlight
/X1/right_lateral_front_led/enable (X1) Enable/disable LED
/X1/right_lateral_rear_led/enable (X1) Enable/disable LED
/X2/center_left_headlight/enable (X2) Enable/disable headlight
/X2/center_right_headlight/enable (X2) Enable/disable headlight
/X2/left_headlight/enable (X2) Enable/disable headlight
/X2/left_lateral_led/enable (X2) Enable/disable LED
/X2/rear_center_led/enable (X2) Enable/disable LED
/X2/right_headlight/enable (X2) Enable/disable headlight
/X2/right_lateral_led/enable (X2) Enable/disable LED
/X3/center_led/enable (X3) Enable/disable LED
/X3/downward_flashlight/enable (X3) Enable/disable headlight
/X3/left_flashlight/enable (X3) Enable/disable headlight
/X3/left_led/enable (X3) Enable/disable LED
/X3/right_flashlight/enable (X3) Enable/disable headlight
/X3/right_led/enable (X3) Enable/disable LED
/X4/center_led/enable (X4) Enable/disable LED
/X4/downward_flashlight/enable (X4) Enable/disable headlight
/X4/left_flashlight/enable (X4) Enable/disable headlight
/X4/left_led/enable (X4) Enable/disable LED
/X4/right_flashlight/enable (X4) Enable/disable headlight
/X4/right_led/enable (X4) Enable/disable LED
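
The following is a minimal sketch of toggling one of these lights from a roscpp node. It assumes the enable services use the std_srvs/SetBool type; verify the actual type with rosservice type /X1/left_headlight/enable before relying on it.

#include <ros/ros.h>
#include <std_srvs/SetBool.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "x1_light_example");
  ros::NodeHandle nh;

  // Assumption: the enable services accept std_srvs/SetBool requests.
  ros::ServiceClient light =
    nh.serviceClient<std_srvs::SetBool>("/X1/left_headlight/enable");

  std_srvs::SetBool srv;
  srv.request.data = true;  // true = turn the headlight on, false = off.

  if (light.call(srv))
    ROS_INFO("Headlight enable succeeded: %s", srv.response.success ? "yes" : "no");
  else
    ROS_ERROR("Failed to call /X1/left_headlight/enable");

  return 0;
}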

Sensors

Each robot can be configured with one of four sensor suites. The table below summarizes the sensor options for each robot.

Robot Configuration Payload #1 Payload #2 Payload #3
X1 #1 5m 270-deg planar Lidar 320x240 2D camera IMU
X1 #2 30m 270-deg planar Lidar 1280x960 2D camera IMU
X1 #3 100m 360-deg +/-15 deg vertical Lidar 320x240 2D camera IMU
X1 #4 300m 360-deg +/-15 deg vertical Lidar 1280x960 2D camera IMU
X2 #1 5m 270-deg planar Lidar 320x240 2D camera IMU
X2 #2 5m 270-deg planar Lidar 1280x960 2D camera IMU
X2 #3 30m 270-deg planar Lidar 320x240 2D camera IMU
X2 #4 30m 270-deg planar Lidar 1280x960 2D camera IMU
X3 #1 --- 320x240 2D camera IMU
X3 #2 --- 1280x960 2D camera IMU
X3 #3 --- 320x240 RGBD camera, 5m range IMU
X3 #4 --- 640x480 RGBD camera, 5m range IMU
X4 #1 --- 1280x960 2D camera IMU
X4 #2 --- 640x480 RGBD camera, 5m range IMU
X4 #3 5m 270-deg planar Lidar 320x240 2D camera IMU
X4 #4 30m 270-deg planar Lidar 1280x960 2D camera IMU

The selection of each sensor suite is done via environment variables. The following environment variables are available:

Robot Environment variable Sensor configuration
X1 X1_SENSOR_CONFIG_1 #1
X1 X1_SENSOR_CONFIG_2 #2
X1 X1_SENSOR_CONFIG_3 #3
X1 X1_SENSOR_CONFIG_4 #4
X2 X2_SENSOR_CONFIG_1 #1
X2 X2_SENSOR_CONFIG_2 #2
X2 X2_SENSOR_CONFIG_3 #3
X2 X2_SENSOR_CONFIG_4 #4
X3 X3_SENSOR_CONFIG_1 #1
X3 X3_SENSOR_CONFIG_2 #2
X3 X3_SENSOR_CONFIG_3 #3
X3 X3_SENSOR_CONFIG_4 #4
X4 X4_SENSOR_CONFIG_1 #1
X4 X4_SENSOR_CONFIG_2 #2
X4 X4_SENSOR_CONFIG_3 #3
X4 X4_SENSOR_CONFIG_4 #4

The following example launches the team with X1, X2, X3, and X4 configured to use sensor configurations #1, #1, #3, and #4, respectively.

X1_SENSOR_CONFIG_1=1 X2_SENSOR_CONFIG_1=1 X3_SENSOR_CONFIG_3=1 X4_SENSOR_CONFIG_4=1 roslaunch subt_example team.launch

Each sensor publishes data over a ROS topic. Here's a summary of the topics used by each sensor type.

Sensor type ROS topic Message type
Planar Lidar /<ROBOT_NAME>/front_scan sensor_msgs/LaserScan
3D Lidar /<ROBOT_NAME>/points sensor_msgs/PointCloud2
2D camera /<ROBOT_NAME>/front/image_raw sensor_msgs/Image
RGBD camera /<ROBOT_NAME>/rgbd_camera/depth/points sensor_msgs/PointCloud2
IMU /<ROBOT_NAME>/imu/data sensor_msgs/Imu
Magnetometer /<ROBOT_NAME>/magnetic_field sensor_msgs/MagneticField
Pressure sensor /<ROBOT_NAME>/air_pressure sensor_msgs/FluidPressure
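
As a usage sketch with X1's topics from the table above, a node that consumes the planar lidar and IMU data could look like this:

#include <limits>
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <sensor_msgs/Imu.h>

// Print the closest valid range seen by the planar lidar.
void onScan(const sensor_msgs::LaserScan::ConstPtr &msg)
{
  float closest = std::numeric_limits<float>::infinity();
  for (float r : msg->ranges)
    if (r >= msg->range_min && r <= msg->range_max && r < closest)
      closest = r;
  ROS_INFO("Closest obstacle: %.2f m", closest);
}

// Print the robot's angular velocity around Z.
void onImu(const sensor_msgs::Imu::ConstPtr &msg)
{
  ROS_INFO("Yaw rate: %.2f rad/s", msg->angular_velocity.z);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "x1_sensor_example");
  ros::NodeHandle nh;

  ros::Subscriber scanSub = nh.subscribe("/X1/front_scan", 1, onScan);
  ros::Subscriber imuSub = nh.subscribe("/X1/imu/data", 1, onImu);

  ros::spin();
  return 0;
}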

Communications

A robot can send data to one or more robots in the team. The main communication features are:

  • Packets are sent over a single hop, similar to UDP. No MANET algorithm or other packet relay systems are provided. Teams can implement their own relay systems if they like.
  • Each robot can access a list of its current neighbors. This information is maintained and offered by the communication client library.
  • If two robots are neighbors, then one can try to send a packet to the other.
  • There is a maximum range beyond which two robots are not neighbors (and thus cannot communicate directly).
  • When a robot tries to send a packet to a neighbor, that packet is dropped with some probability.
  • Robots periodically enter brief periods (seconds) of complete communication outage, unable to communicate with other robots. Outages occur at random to individual robots.
  • There is a maximum size (e.g. 1500 octets) per message.
  • There is a maximum data rate allowed among robots communicating over the same network segment (e.g. 54000 kbps).

These features have been implemented in a C++ library (libCommsClient.so). This library exposes the CommsClient class, which lets you manage communication within the team. The CommsClient class contains the following relevant functions:

  • CommsClient(localAddress): This is the class constructor and accepts a local address. An address is a string representing a unique identifier.

    Important: In SubT, the address should match the name of the robot in Gazebo.

  • Bind(callback, address, port): This method binds an address and port to a virtual socket, and sends incoming messages to the specified callback.

  • SendTo(data, address, port): This method allows an agent to send data to another individual robot (unicast), to all robots (broadcast), or to a group of robots (multicast). The addresses for sending broadcast and multicast messages are "kBroadcast" and "kMulticast", respectively. A Bind()/SendTo() sketch follows this list.

  • SendToBaseStation(artifact): This method tries to report the location of an artifact. The artifact parameter is a Google Protobuf message (subt::msgs::Artifact) that you should populate before calling this method; see the Artifact message definition for the full field description. Here's an example:

// We only need the position.
ignition::msgs::Pose pose;
pose.mutable_position()->set_x(1);
pose.mutable_position()->set_y(2);
pose.mutable_position()->set_z(3);

// Fill the type and pose.
subt::msgs::Artifact artifact;
artifact.set_type(static_cast<uint32_t>(subt::ArtifactType::TYPE_BACKPACK));
artifact.mutable_pose()->CopyFrom(pose);

// Report it.
client.SendToBaseStation(artifact);

Important: Calling SendToBaseStation() does not guarantee that the base station receives your message. A call to SendToBaseStation() sends a regular message to the base station, subject to the same rules as any other message sent with SendTo().

  • Host(): This method returns the address of the robot.

  • Neighbors(): This method returns the addresses of other robots that are inside the communication range of this robot.
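
As a sketch of the Bind()/SendTo() workflow described above: the include path, the exact Bind() callback prototype, and the port number below are assumptions for illustration, so check CommsClient.hh in your workspace and adjust them.

#include <chrono>
#include <iostream>
#include <string>
#include <thread>

// Assumption: header location; use the CommsClient.hh shipped with
// libCommsClient.so in your workspace.
#include <subt_gazebo/CommsClient.hh>

// Callback for messages arriving on the bound address/port. The parameter
// list shown here is an assumption; use the prototype from CommsClient.hh.
void OnMessage(const std::string &srcAddress, const std::string &dstAddress,
               const uint32_t dstPort, const std::string &data)
{
  std::cout << "Message from " << srcAddress << ": " << data << std::endl;
}

int main(int argc, char **argv)
{
  // The local address must match the robot's name in Gazebo.
  subt::CommsClient client("X1");

  std::cout << "My address: " << client.Host() << std::endl;

  // Listen for data addressed to X1 on port 4100 (an arbitrary example port).
  client.Bind(&OnMessage, "X1", 4100);

  // Unicast a payload to X2. Delivery is only attempted if X2 is currently
  // a neighbor, and may still fail (see the communication features above).
  client.SendTo("hello from X1", "X2", 4100);

  // Keep the process alive so the callback can fire.
  while (true)
    std::this_thread::sleep_for(std::chrono::seconds(1));

  return 0;
}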

Simple mode

To speed up development or for debugging purposes, you can configure the communications in "simple mode". In this mode, all messages arrive at their destinations. You can enable or disable simple mode with the /subt/comms/simple_mode ROS parameter. E.g., using the CLI:

rosparam set /subt/comms/simple_mode true

Normal mode can be restored at any point. E.g., using the CLI:

rosparam set /subt/comms/simple_mode false
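
If you prefer to toggle simple mode from code rather than the CLI, the standard ROS parameter API can set the same parameter. A minimal sketch:

#include <ros/ros.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "simple_mode_toggle");
  ros::NodeHandle nh;

  // Equivalent to: rosparam set /subt/comms/simple_mode true
  nh.setParam("/subt/comms/simple_mode", true);

  return 0;
}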

Artifacts

One of the main tasks to complete during the competition is to identify and report the poses of multiple artifacts. Teams should use the SendToBaseStation() function from the CommsClient class.

Important: The scoring plugin will find the artifact of the requested type closest to the reported pose (e.g., if you report a backpack at pose [10, 20, 0, 0, 0, 0], the scoring plugin will compare your reported pose against the closest backpack). The score is based on the accuracy and timing of the report. Teams should not report the same artifact multiple times; doing so will negatively impact your score, as the extra reports may be incorrectly associated with other artifacts.

Score

The current score is periodically published on the /subt/score topic. You can check your task score at any time with:

rostopic echo /subt/score

Topic Description
/subt/score Current score
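
To monitor the score from a node instead of the CLI, you can subscribe to the topic. The sketch below assumes the score is published as a std_msgs/Int32; verify the actual type with rostopic type /subt/score and adjust if needed.

#include <ros/ros.h>
#include <std_msgs/Int32.h>

// Assumption: /subt/score publishes std_msgs/Int32.
void onScore(const std_msgs::Int32::ConstPtr &msg)
{
  ROS_INFO("Current score: %d", msg->data);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "score_monitor");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/subt/score", 1, onScore);
  ros::spin();
  return 0;
}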
