Getting Started

Check out the body-tracking-outline.pptx for an overview of the project.

The notebook neural_net_pipeline in the folder Notebooks contains our first attempt at predicting avatar positions from the positions of the headset and controllers, using a multi-layer perceptron regressor.

To run the notebook, you will need:

  • python 2.7
  • jupyter
  • numpy
  • scikit-learn
  • seaborn
  • matplotlib
  • pyglet

All of these are best installed using conda.
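
For orientation, here is a minimal sketch of the kind of pipeline the notebook builds; the data, array shapes, and hyperparameters below are placeholders, not the notebook's actual settings:

```python
# Minimal sketch of the kind of pipeline the notebook builds; the data
# here is random placeholder data and the hyperparameters are guesses,
# not the notebook's actual settings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# 3 tracked devices x (3 position + 4 quaternion) inputs = 21 features,
# 6 body points x 3 coordinates = 18 targets.
X = np.random.rand(1000, 21)
y = np.random.rand(1000, 18)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
model.fit(X_train, y_train)
print("Held-out R^2:", model.score(X_test, y_test))
```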

The Data

The data (in the aptly named data folder) consists of CSV files, each corresponding to a continuous period during which someone was being tracked in VR. Data collection ended either because the person got bored or because a controller turned off. A controller that is not being tracked is represented by the value NULL.
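
As an example of working with a session file, the sketch below loads one CSV and treats NULL as a missing value. It uses pandas, which is not in the dependency list above, and the file name is just a placeholder:

```python
# Sketch: load one session file, treating NULL as a missing value.
# Assumes pandas is available (not in the dependency list above); the
# file name is only an example.
import pandas as pd

df = pd.read_csv("data/example_session.csv", na_values=["NULL"])

# Rows recorded while a controller was not being tracked now contain
# NaN; the simplest option is to discard them.
df = df.dropna()
```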

In each CSV file, we have the following fields. All spatial coordinates are in nm:

  • DateTime - Timestamp with one-second precision.
  • MainPlayerID - IP address, can be ignored.
  • LControllerX
  • LControllerY
  • LControllerZ
  • RControllerX
  • RControllerY
  • RControllerZ
  • HeadsetX
  • HeadsetY
  • HeadsetZ
  • LControllerQuaternionX
  • LControllerQuaternionY
  • LControllerQuaternionZ
  • LControllerQuaternionW
  • RControllerQuaternionX
  • RControllerQuaternionY
  • RControllerQuaternionZ
  • RControllerQuaternionW
  • HeadsetQuaternionX
  • HeadsetQuaternionY
  • HeadsetQuaternionZ
  • HeadsetQuaternionW
  • LeftElbowX
  • RightElbowX
  • LeftElbowY
  • RightElbowY
  • LeftElbowZ
  • RightElbowZ
  • FrontX
  • BackX
  • FrontY
  • BackY
  • FrontZ
  • BackZ
  • LeftKneeX
  • RightKneeX
  • LeftKneeY
  • RightKneeY
  • LeftKneeZ
  • RightKneeZ
  • Scale - The scale of the box during recording. This is the inverse of the scale of the person (e.g. if the box was scaled by 0.5, the person's avatar was 2x as big in Simbox space).

We want to predict the positions of the left elbow, right elbow, front, back, left knee, and right knee.
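
Assuming a DataFrame `df` loaded as in the earlier sketch, the columns might be split into model inputs and prediction targets roughly as follows (the grouping mirrors the field list above):

```python
# Sketch: split the columns into model inputs (headset + controllers)
# and prediction targets (elbows, front, back, knees).
tracker_cols = [prefix + axis
                for prefix in ("LController", "RController", "Headset")
                for axis in ("X", "Y", "Z")]
tracker_cols += [prefix + "Quaternion" + axis
                 for prefix in ("LController", "RController", "Headset")
                 for axis in ("X", "Y", "Z", "W")]

target_cols = [prefix + axis
               for prefix in ("LeftElbow", "RightElbow", "Front", "Back",
                              "LeftKnee", "RightKnee")
               for axis in ("X", "Y", "Z")]

X = df[tracker_cols].values
y = df[target_cols].values

# Scale is the inverse of the person's scale: a box scale of 0.5
# corresponds to an avatar that is 1 / 0.5 = 2x as big in Simbox space.
avatar_scale = 1.0 / df["Scale"]
```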

The raw data is contained in steamVRLogs and was processed by the script processSteamVRLogs.py, which mapped the IP addresses and left/right labels recorded in JSON files to the body positions.
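
As a purely hypothetical illustration of that mapping step (neither the JSON layout nor the file names below are taken from the repository, and this is not the code in processSteamVRLogs.py):

```python
# Hypothetical illustration only: the JSON structure and file names are
# assumptions, not the actual contents of processSteamVRLogs.py.
import json
import pandas as pd

# Suppose a JSON file maps each player's IP address to its left/right labels.
with open("steamVRLogs/example_labels.json") as f:
    labels_by_ip = json.load(f)

log = pd.read_csv("steamVRLogs/example_log.csv", na_values=["NULL"])

# Attach each row's label record via the player's IP address.
log["Labels"] = log["MainPlayerID"].map(labels_by_ip.get)
```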