HeadOrientationPacketMessage documentation

Issue #53 new
Maurice Fallon created an issue

Could you provide a little documentation on this message?

The msg documentation says: "This message gives the desired head orientation of the robot in world coordinates."

This is not what's going on. From testing through the ROS API on Valkyrie it would seem (see the sketch below):

- the Euler yaw component of the quaternion seems to be mapped to NeckYaw
- the Euler pitch component of the quaternion seems to be mapped to LowerNeckPitch
- it doesn't seem to support pitching up (i.e. it doesn't use UpperNeckPitch)
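To illustrate the observed mapping, here is a small self-contained sketch (illustrative only; these quaternion helpers are not part of the IHMC API), extracting yaw and pitch in the ZYX Euler convention, which appears to be what ends up driving NeckYaw and LowerNeckPitch:

    // Illustrative only: extracts the Euler yaw and pitch that the controller appears
    // to map onto NeckYaw and LowerNeckPitch; roll is apparently ignored.
    public class ObservedHeadOrientationMapping
    {
       // Quaternion (qx, qy, qz, qw) to yaw (about Z) and pitch (about Y), ZYX convention.
       public static double[] yawPitchFromQuaternion(double qx, double qy, double qz, double qw)
       {
          double yaw = Math.atan2(2.0 * (qw * qz + qx * qy), 1.0 - 2.0 * (qy * qy + qz * qz));
          double pitch = Math.asin(2.0 * (qw * qy - qz * qx));
          return new double[] {yaw, pitch};
       }

       public static void main(String[] args)
       {
          // Desired look direction: yaw = 0.3 rad, pitch = 0.2 rad; build q = Rz(yaw) * Ry(pitch).
          double yaw = 0.3, pitch = 0.2;
          double cy = Math.cos(yaw / 2.0), sy = Math.sin(yaw / 2.0);
          double cp = Math.cos(pitch / 2.0), sp = Math.sin(pitch / 2.0);
          double qw = cy * cp, qx = -sy * sp, qy = cy * sp, qz = sy * cp;
          double[] yp = yawPitchFromQuaternion(qx, qy, qz, qw);
          // Per the observations above, yp[0] would drive NeckYaw and yp[1] LowerNeckPitch.
          System.out.printf("NeckYaw = %.3f rad, LowerNeckPitch = %.3f rad%n", yp[0], yp[1]);
       }
    }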

Also, how long does this command have effect? E.g. is it supposed to track this orientation indefinitely, or only until the end of trajectory_time?

Comments (2)

  1. btshrewsbury

    It looks like the orientation is expressed in chest frame now. I will add a ticket to update the message documentation, and check over the others. We had originally specified a Look At Target option that was expressed in world coordinates. The command should only track during the trajectory time.
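    A minimal sketch of the frame conversion this implies: if the packet's quaternion is now interpreted in chest frame, a desired world-frame orientation would need to be rotated into chest frame before sending. Assuming quaternions stored as (w, x, y, z) and a chest-to-world rotation taken from the robot's state estimate (these helpers are illustrative, not the IHMC API):

    // Sketch: rotate a desired head orientation from world frame into chest frame,
    // assuming the packet's quaternion is now interpreted in chest frame.
    // Quaternions are (w, x, y, z); chestToWorld comes from the robot's state estimate.
    public static double[] worldToChest(double[] desiredInWorld, double[] chestToWorld)
    {
       // q_des_in_chest = conjugate(q_chest_in_world) * q_des_in_world
       double[] worldToChestQ = {chestToWorld[0], -chestToWorld[1], -chestToWorld[2], -chestToWorld[3]};
       return multiply(worldToChestQ, desiredInWorld);
    }

    // Hamilton product of two (w, x, y, z) quaternions.
    private static double[] multiply(double[] a, double[] b)
    {
       return new double[] {a[0] * b[0] - a[1] * b[1] - a[2] * b[2] - a[3] * b[3],
                            a[0] * b[1] + a[1] * b[0] + a[2] * b[3] - a[3] * b[2],
                            a[0] * b[2] - a[1] * b[3] + a[2] * b[0] + a[3] * b[1],
                            a[0] * b[3] + a[1] * b[2] - a[2] * b[1] + a[3] * b[0]};
    }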

    Our QP solver seems to dislike the lower and upper neck pitch combination. We are limiting the head orientation manager to the lower neck pitch and the neck yaw, which should allow you to pitch up and down, but with a limited range of motion. We will add a ticket to convert the three neck joints to use PD controllers and skip the QP solver, which should fix the sim. The gains may be low, which would cause the upward motion to lag. The neck actuators are position controlled on the real robot, so this is a sim-only problem. From ValkyrieWalkingControllerParameters:

    @Override
    public String[] getDefaultHeadOrientationControlJointNames()
    {
       if (controlHeadAndHandsWithSliders())
       {
          // For sliders, return none of the joints, to make sure that the QP whole body controller doesn't control the neck.
          return new String[] {};
       }
       else if (target == DRCRobotModel.RobotTarget.REAL_ROBOT)
       {
          // On the real robot, return all 3 neck joints so the controller knows to use them all. The real robot will use position control.
          return new String[] {jointMap.getNeckJointName(NeckJointName.UPPER_NECK_PITCH),
                               jointMap.getNeckJointName(NeckJointName.LOWER_NECK_PITCH),
                               jointMap.getNeckJointName(NeckJointName.NECK_YAW)};
       }
       else
       {
          // For sims using the QP and whole body controller, only allow one neck pitch joint for now since the QP and
          // inverse dynamics don't do well with redundant joints yet. We'll have to fix that somehow later.
          return new String[] {jointMap.getNeckJointName(NeckJointName.LOWER_NECK_PITCH),
                               jointMap.getNeckJointName(NeckJointName.NECK_YAW)};
       }
    }
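
    The per-joint PD control mentioned above would look roughly like this (a generic sketch with placeholder gains, not IHMC's implementation):

    // Generic per-joint PD law (sketch, not IHMC code): tau = kp * (qDesired - q) - kd * qd.
    // Desired velocity is assumed zero; on the real robot the neck actuators are
    // position controlled, so this would only matter in simulation.
    public static double neckPdTorque(double qDesired, double q, double qd, double kp, double kd)
    {
       return kp * (qDesired - q) - kd * qd;
    }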
    
  2. Maurice Fallon reporter

    Perfect, that's a great explanation. We don't really care about tracking, looking up, or using UpperNeckPitch; what exists currently is fine for us for now. Eventually we will implement the Look At Target capability ourselves outside of SCS.
