Capability to manually control joints similar to BDI "User" mode (esp. legs)

Issue #19 resolved
Stefan Kohlbrecher created an issue

While it is desirable to leverage the locomotion and WBC capabilities of the IHMC controller as much as possible, there are some specific scenarios (such as getting out of a car or getting up) for which the capability to manually control joints would be desirable. It seems this can already be achieved by clever use of the messages for arms and torso, but as far as I can see there is no existing capability for specifying leg motion at the joint level.

I'm not sure how easy it would be to integrate this with the QP solver based control approach, so throwing it out here for discussion.

Comments (6)

  1. Stefan Kohlbrecher reporter

    @nmertins @dljsjr I saw you added a message type that allows commanding joint positions in https://bitbucket.org/ihmcrobotics/ihmc_ros/commits/4d4ca6673fc23f771f568f2461fa5ce981eedc5f. This looks very useful. My minor quibble is that only a trajectory time can be provided, which I think will make it somewhat hard to generate reproducible/fluid complex motions. The controller presumably takes (message receive time + trajectory_time) as the target time, so from the outside (e.g. the "ROS world") there is no way to know when the target has been reached, or what target time the controller actually computed. This makes it harder to execute reproducible trajectories than it could be.

    There are two improvements I can think of:

    • A trajectory following interface such as for the arms would be very useful (if that code can be generalized somewhat easily)

    • Providing a header with a timestamp in the message would allow a deterministic computation of the target time (header.stamp + trajectory_time) that external nodes could rely on
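    The second point can be sketched in plain Python. This is a minimal illustration of the timing argument, not the actual IHMC message: the `Header`/`JointAnglesCommand` classes and their fields are hypothetical stand-ins, and the real message layout may differ.

    ```python
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Header:
        stamp: float  # send time in seconds, set by the commanding node

    @dataclass
    class JointAnglesCommand:
        # Hypothetical stand-in for a joint position message.
        header: Header
        trajectory_time: float       # seconds the motion should take
        joint_angles: List[float]    # target joint positions in radians

    def target_time(msg: JointAnglesCommand) -> float:
        """Deterministic target time: header.stamp + trajectory_time.

        Unlike (controller receive time + trajectory_time), this does not
        depend on when the controller happens to consume the message, so
        an external node can predict when the motion should finish.
        """
        return msg.header.stamp + msg.trajectory_time

    cmd = JointAnglesCommand(Header(stamp=100.0), trajectory_time=2.5,
                             joint_angles=[0.1, -0.4, 0.8])
    print(target_time(cmd))  # → 102.5
    ```

    With this convention, chaining motions becomes reproducible: the sender stamps the next command with the previous command's target time.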

  2. DouglasS

    @Stefan_Kohlbrecher Thanks for the feedback! The current position control and trajectory messages are stubs, as there are still some stability issues with the controller consuming these messages on the real robot that are preventing us from releasing this feature. So there's time for us to review this and figure out how easy it will be to incorporate it.

  3. Dave Kotfis

    We really would like to see this as well. Just to be clear, do you plan to provide this interface for the upper body even when HighLevelStatePacketMessage is set to WALKING? The current messages give me the impression that JointAnglesPacketMessage was intended for "Driving" mode that does not use the QP Solver based whole body control.

  4. Georg Wiedebach

    The JointAnglesPacketMessage is intended only for use with the position control HighLevelState.

    If the WALKING state is active you can send arm joint trajectories using the dedicated arm_joint_trajectory topic. These joint angle trajectories are not modified by the QP solver: they are forwarded directly to the robot (the solver is merely informed of them). So if you use that message, the arms will do exactly what they are told during manipulation tasks. There is no joint-level control for the pelvis and chest during walking, since they are essential for balancing.
