Add ability to set IMU initial Orientation on boot

Issue #1959 new
John Hsu
created an issue

The current implementation of ImuSensor::SetReferencePose() is inadequate for typical simulation use.

Users desire the ability to specify what ImuSensor::Orientation() returns after the sensor is created in the world.

The reason is that some existing IMU sensors on the market report their own frame orientation in the NED frame upon boot-up, and the ability to reproduce this in simulation is critical. With the existing API, the user would have to first rotate the IMU to align with the NED frame before calling ImuSensor::SetReferencePose().

Proposed solution here is to add

      /// \brief Set the current reported orientation of the IMU.
      /// Replaces SetReferencePose.
      /// \param[in] _orientation Current IMU reported orientation.
      public: void SetReferenceOrientation(
        const ignition::math::Quaterniond &_orientation =
            ignition::math::Quaterniond::Identity);
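A minimal sketch of the intended semantics, assuming SetReferenceOrientation(q) means "Orientation() returns q at the moment of the call, and tracks subsequent body motion from there." The `Quat` and `ImuSensorSketch` types below are hypothetical stand-ins (yaw-only, no Gazebo dependency), not the actual ignition::math::Quaterniond or ImuSensor implementation:

```cpp
#include <cassert>
#include <cmath>

// Minimal unit-quaternion stand-in for ignition::math::Quaterniond,
// restricted to rotations about +Z (yaw) for brevity.
struct Quat
{
  double w{1}, z{0};

  static Quat FromYaw(double _yaw)
  {
    return {std::cos(_yaw / 2), std::sin(_yaw / 2)};
  }

  // Hamilton product restricted to z-axis rotations.
  Quat operator*(const Quat &_q) const
  {
    return {w * _q.w - z * _q.z, w * _q.z + z * _q.w};
  }

  // Inverse of a unit quaternion is its conjugate.
  Quat Inverse() const { return {w, -z}; }

  double Yaw() const { return 2 * std::atan2(z, w); }
};

// Hypothetical sensor sketch: SetReferenceOrientation(_orientation)
// picks the internal reference frame so that Orientation() equals
// _orientation right now; afterwards Orientation() follows the body.
class ImuSensorSketch
{
  public: void SetWorldPose(const Quat &_world) { this->world = _world; }

  public: void SetReferenceOrientation(const Quat &_orientation)
  {
    // reference = world * reported^-1  =>  Orientation() == _orientation
    this->reference = this->world * _orientation.Inverse();
  }

  public: Quat Orientation() const
  {
    return this->reference.Inverse() * this->world;
  }

  private: Quat world, reference;
};
```

With this semantics the user no longer has to physically rotate the IMU into the NED frame first; the desired boot-up orientation is passed in directly and the reference frame is solved for.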

Comments (10)

  1. DouglasS

This should definitely be improved. The way IMUs are simulated in Gazebo right now does not match the way most IMUs work in the real world w.r.t. the frame in which they report their rates. While some IMU packages or robots may do some fancy filtering and cleanup before sending out data, it is not the IMU itself that does this but the robot's/kit's API. For example, Atlas has the ability to report an estimated pose for the pelvis that is gleaned from IMU data, but this is not the IMU itself, just something nice provided by the Atlas API.

  2. DouglasS

    To clarify on my previous point.

    Most real IMUs are set up in such a way that they use a combination of the Earth's gravity and, if a magnetometer is available, the Earth's magnetic field to orient their local reference frame (usually an NED-style frame) in some nominal "world" coordinates.

    Based on that, you would typically see this sort of hypothetical scenario: if you create a sim world with two copies of the same robot, one facing true north and one facing true south, the robot facing true south should report a yaw/heading that is 180º separated from the robot facing true north. If it then spins around to face the exact same direction, it should report the same yaw/orientation/heading, barring differences like simulated noise. In the current implementation of IMUs in Gazebo, this is not possible without doing some work in code or a plugin, so it doesn't accurately reflect how most IMUs actually work.
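The two-robot scenario above can be sketched with a yaw-only toy model (not Gazebo code; `NedImu` is a hypothetical stand-in): because both sensors latch the same world-fixed NED reference at boot, their reported headings are directly comparable.

```cpp
#include <cassert>
#include <cmath>

const double kPi = 3.14159265358979323846;

// Yaw-only toy model of an NED-referenced IMU: at boot the sensor
// latches a heading reference that is fixed in the world (shared by
// every sensor in that world), so two sensors report comparable yaws.
struct NedImu
{
  double referenceYaw{0};  // common NED reference, latched at boot
  double worldYaw{0};      // current body heading in the world

  double ReportedYaw() const
  {
    // Heading relative to the shared reference, wrapped to (-pi, pi].
    double y = worldYaw - referenceYaw;
    while (y > kPi) y -= 2 * kPi;
    while (y <= -kPi) y += 2 * kPi;
    return y;
  }
};
```

A robot facing true north (worldYaw = 0) and one facing true south (worldYaw = pi) report headings pi apart; once the second robot spins to face north, both report the same heading. This is the behavior the current per-sensor SetReferencePose() cannot provide without extra plugin work.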

  3. John Hsu reporter

    I'm stumped by the use case where the IMU uses the accelerometer reading to align +/-z with gravity but leaves the heading undetermined. Does the heading align with the IMU body X-Y frame? Is there a real-world example device that we can model after?

