AI Performance: An Act-One Experiment
In our latest experiment with AI tools, we explored Runway's newly released Act-One, pushing the boundaries of character animation through performance capture. The process began with Claude generating a surreal "day in the life" script, which I then performed on camera.
To create my digital double, we turned to Midjourney and Ideogram, generating character variations based on my appearance. The real magic happened when we fed these images into Act-One, using my recorded performance to drive the AI-generated character's movements and expressions.
What's fascinating about Act-One is how it bridges the gap between human performance and AI-generated imagery. The tool interpreted my movements and translated them into the digital character with surprising fidelity, though not without its share of uncanny moments.
The entire process, from script generation to final edit in Premiere Pro, took just one evening, highlighting the rapidly accelerating pace of AI-driven content creation. While the results weren't perfect, they offer a glimpse into a future where the line between human performance and digital animation continues to blur.
This experiment is part of our ongoing exploration of AI's creative potential and the new workflows it makes possible.