Make the bot be able to move #10

@guillefix

Description

Reimplement a movement-generation neural net that can (hopefully) run in real time, so the bot can use non-verbal communication as well as navigate the world.

Making physical movement more natural, and more intelligent, is one of the long-term goals of MetaGen, for which we are gathering the "ImageNet of Human Behaviour". In the meantime we can begin implementing models that allow for limited functionality like:

  • simple speech-driven non-verbal communication (as done in StyleGestures)
  • simple trajectory-driven full-body movement (as done in MoGlow or the AIST++ paper)

The NN models I'm considering are:

  • MoGlow (can run real time)
  • AIST++ cross-modal transformer (I'm not sure yet if it can run in real time, but may produce better motion)
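To make the real-time question concrete: whichever model is used, generation would be an autoregressive loop that predicts the next pose from recent pose history, and each step has to fit inside the frame budget (about 33 ms at 30 fps). A minimal sketch of that loop is below; the pose dimension, context length, and the linear stand-in for the trained model are all placeholder assumptions, not MoGlow's actual architecture.

```python
import time
import numpy as np

POSE_DIM = 63              # assumed: e.g. 21 joints x 3 rotation channels
CONTEXT = 10               # assumed: frames of pose history the model sees
FRAME_BUDGET = 1.0 / 30.0  # seconds per frame for 30 fps real-time output

rng = np.random.default_rng(0)
# Stand-in for a trained model: a random linear map from the flattened
# pose history to the next pose. A real MoGlow-style model would sample
# the next pose from a normalizing flow conditioned on this history.
W = rng.standard_normal((CONTEXT * POSE_DIM, POSE_DIM)) * 0.01

def next_pose(history):
    """One autoregressive step: predict the next pose from recent poses."""
    return history.reshape(-1) @ W

def generate(n_frames):
    history = np.zeros((CONTEXT, POSE_DIM))
    poses, latencies = [], []
    for _ in range(n_frames):
        t0 = time.perf_counter()
        pose = next_pose(history)
        latencies.append(time.perf_counter() - t0)
        poses.append(pose)
        # Slide the history window forward by one frame.
        history = np.roll(history, -1, axis=0)
        history[-1] = pose
    return np.array(poses), latencies

poses, latencies = generate(120)
print(poses.shape)  # (120, 63)
print("within frame budget:", max(latencies) < FRAME_BUDGET)
```

The same loop structure would apply to the AIST++ cross-modal transformer; the open question is only whether its per-step latency stays under the budget.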

Labels: AI, Artificial Intelligence, Long term
