Movement Primitive - Transformer


This transformer architecture maps a sequence of pose joint angles, i.e. frames of movement, to a (shorter) sequence of latent primitives. These act as low-dimensional building blocks of movement and are encoded as distributions in latent space. The model samples from these distributions, as in a VAE, and feeds the samples to a sequence decoder. The decoded subsequences are weighted with Gaussian masks and averaged to form the final reconstructed movement sequence. Adapted from Marsot et al. (2022) [code].
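
As a rough sketch of the sampling and masking-and-averaging steps (hypothetical function and variable names, assuming PyTorch; the actual model lives in mp_transformer/):

```python
import torch

def reparameterize(mu, logvar):
    """VAE-style sample from a latent primitive distribution."""
    return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

def gaussian_masks(centers, widths, seq_len):
    """One Gaussian weighting curve over time per primitive, normalized per frame."""
    t = torch.linspace(0, 1, seq_len)                                   # (T,)
    masks = torch.exp(-0.5 * ((t[None, :] - centers[:, None]) / widths[:, None]) ** 2)
    return masks / masks.sum(dim=0, keepdim=True)                       # (P, T)

def reconstruct(subsequences, centers, widths):
    """Blend decoded per-primitive subsequences into one movement sequence.

    subsequences: (P, T, J) decoder outputs; P primitives, T frames, J joint angles.
    """
    masks = gaussian_masks(centers, widths, subsequences.shape[1])      # (P, T)
    return (masks[:, :, None] * subsequences).sum(dim=0)                # (T, J)

# Toy usage: 4 primitives, 50 frames, 6 joint angles.
P, T, J = 4, 50, 6
subseqs = torch.randn(P, T, J)            # stand-in for decoder outputs
centers = torch.linspace(0.1, 0.9, P)     # where each primitive is active in time
widths = torch.full((P,), 0.15)
movement = reconstruct(subseqs, centers, widths)
assert movement.shape == (T, J)
```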

The goal is to explore a new model for Movement Primitives trained on data from a simulated toy limb.

Potential use cases include modeling Movement Primitives in our VR lab or in the wild.

Setup

Install with mamba env create --file environment.yml (tested with mamba and mambaforge on Ubuntu 22.04; conda should work as well, albeit slowly). Make sure ffmpeg is installed on your system for rendering videos.

Activate the environment with mamba activate mp-transformer and install the package in editable mode with pip install -e .
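
In full, the setup steps are:

```bash
mamba env create --file environment.yml
mamba activate mp-transformer
pip install -e .
```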

Run

Activate the environment and run training without logging to Weights & Biases with python mp_transformer/train.py --no-log.
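
For example:

```bash
mamba activate mp-transformer
python mp_transformer/train.py --no-log
```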

Demo

Example videos and Jupyter notebooks can be found in demo/.
