Use Kimodo in Blender.

NVIDIA's Kimodo model, packaged as a production-ready Blender workflow. Describe an action, generate motion, and bake editable keyframes onto your armature. Hosted by Animatica, or self-hosted on your own hardware.

Get the Blender add-on
View the source
91k+ frames generated in early access
Three ways to drive Kimodo

Keys, paths, prompts.

Mix and match every input. Block keyframes, sketch a path, describe the action — Kimodo respects each constraint and fills in the rest.
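As a rough illustration of mixing the three inputs, here is a hypothetical request shape — the field names (`prompt`, `keyframes`, `path`) are placeholders for this sketch, not the add-on's actual API:

```python
# Hypothetical generation request combining all three inputs.
# Field names are illustrative only; the real add-on UI/API may differ.
request = {
    # describe the action
    "prompt": "vault over the railing, land in a crouch",
    # blocked keyframes the model must hit
    "keyframes": [
        {"frame": 1, "pose": "standing_idle"},
        {"frame": 48, "pose": "crouch_landing"},
    ],
    # sketched root-motion path as (x, y) points
    "path": [(0.0, 0.0), (1.5, 0.2), (3.0, 1.0)],
}
```

Any subset of these constraints can be supplied; the model fills in whatever is left unconstrained.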

Hosted or local

Start fast, or stay local.

One Blender add-on, two ways to run Kimodo. Use Animatica's hosted service when you want motion quickly, or run the Apache-2.0 backend yourself when scenes need to stay on your network.

Free to start
Hosted Kimodo

All the features. None of the setup.

Free · 50 generations
$20/mo · Unlimited generations during early access

Animatica runs the GPU and the full feature stack — retargeting to your rig, text-to-pose, foot-skate cleanup. Sign up, install the add-on, generate.

  • Retarget generated motion to humanoid Blender rigs
  • Use text-to-pose, foot-skate cleanup, and constraints
  • Skip local GPU and driver setup
Start free
Apache-2.0
Self-hosted

Your GPU. Your scenes.

Free
Apache-2.0 · your hardware, your network

The Apache-2.0 backend runs Kimodo on your hardware over the open MMCP protocol. Best when scenes need to stay local — or when you want to fork the source and ship something custom.

  • Source on GitHub — inspect, fork, adapt
  • No telemetry, no cloud round-trip
  • Same MMCP protocol — switch to hosted any time
View source
Your characters, your timeline

Use Kimodo right inside Blender.

Install the add-on, choose hosted or local, and turn Kimodo output into editable Blender keyframes.
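To make "editable Blender keyframes" concrete, here is a minimal sketch of baking per-bone rotations onto an armature with Blender's standard `bpy` API. The motion format (a dict of per-bone `(frame, quaternion)` samples) is an assumption for illustration, not the add-on's actual output schema:

```python
# Sketch: baking generated motion onto an armature as editable keyframes.
# The input format below is assumed for illustration; Kimodo's real
# output schema may differ.

def motion_to_keyframes(motion):
    """Flatten {bone: [(frame, quat), ...]} into sorted (bone, frame, quat) keys."""
    keys = []
    for bone, samples in motion.items():
        for frame, quat in samples:
            keys.append((bone, frame, quat))
    return sorted(keys, key=lambda k: (k[1], k[0]))

def bake(armature_name, motion):
    """Insert the keys on pose bones (run this inside Blender)."""
    import bpy  # only available inside Blender's embedded Python
    arm = bpy.data.objects[armature_name]
    for bone, frame, quat in motion_to_keyframes(motion):
        pb = arm.pose.bones[bone]
        pb.rotation_mode = 'QUATERNION'
        pb.rotation_quaternion = quat
        pb.keyframe_insert(data_path="rotation_quaternion", frame=frame)
```

Because the result is ordinary keyframes, everything lands in the action editor where you can retime, delete, or hand-tweak it like any other animation.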

Get the Blender add-on
Run it yourself