Chroma 0334
There's something thrilling about pushing boundaries, especially when those boundaries involve reimagining moving images through the lens of an iconic painting.
This week, I've been experimenting with style transfer models in Nuke. My canvas? A plate of a galloping horse. My muse? Van Gogh's Starry Night.
The idea was simple: could machine learning achieve a temporally stable style transfer on a moving image?
The result? 🤷
Raw style transfer models are far from feature-quality. They're like rough sketches, bursting with potential but demanding significant refinement to meet professional standards.
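For context, the naive starting point is just stylizing every frame independently. Outside of Nuke, that loop looks roughly like this (a sketch using the publicly available Magenta arbitrary-stylization model on TF Hub; the paths and frame naming are illustrative, not my actual setup):

```python
# A sketch of naive per-frame stylization with the Magenta model on TF Hub.
# Assumes the plate was rendered out as a PNG sequence; paths are made up.
import glob
import tensorflow as tf
import tensorflow_hub as hub

model = hub.load(
    "https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2"
)

def load_image(path):
    # Decode to float32 in [0, 1] with a leading batch dimension.
    img = tf.io.decode_png(tf.io.read_file(path), channels=3)
    img = tf.image.convert_image_dtype(img, tf.float32)
    return img[tf.newaxis, ...]

style = load_image("starry_night.png")

for i, frame in enumerate(sorted(glob.glob("plate/horse.*.png"))):
    # Each frame is stylized independently -- which is exactly
    # why the result jitters from frame to frame.
    stylized = model(load_image(frame), style)[0]
    out = tf.image.convert_image_dtype(stylized[0], tf.uint8)
    tf.io.write_file(f"out/stylized.{i:04d}.png", tf.io.encode_png(out))
```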
One of the biggest hurdles was maintaining temporal stability: ensuring the effect flows seamlessly across frames without jittering or breaking. Current models, as they stand, struggle with this. Without segmentation or clean texture generation, the illusion falters, and the magic breaks.
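One common band-aid is optical flow: warp the previous stylized frame into the current one and blend the two, so the brush strokes track the motion instead of boiling. A rough sketch (Farneback flow and the 50/50 blend are illustrative choices, not a production recipe):

```python
# A sketch of flow-based stabilization: warp the previous stylized frame
# with optical flow and blend it into the current stylized frame.
import cv2
import numpy as np

def warp_to_current(prev_stylized, backward_flow):
    # remap() pulls pixels: for each current pixel, sample where the
    # backward flow says it came from in the previous frame.
    h, w = backward_flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + backward_flow[..., 0]).astype(np.float32)
    map_y = (grid_y + backward_flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_stylized, map_x, map_y, cv2.INTER_LINEAR)

def stabilize(prev_plate, cur_plate, prev_stylized, cur_stylized, blend=0.5):
    # Estimate flow on the original plate (current -> previous), then
    # apply it to the stylized frames so the strokes follow the motion.
    prev_gray = cv2.cvtColor(prev_plate, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur_plate, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        cur_gray, prev_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0
    )
    warped = warp_to_current(prev_stylized, flow)
    return cv2.addWeighted(warped, blend, cur_stylized, 1.0 - blend, 0.0)
```

Blending like this helps the strokes stick, but it also smears detail over time, which is part of why the illusion still falls apart without the fundamentals.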
One idea driving this experiment is using animation for flashbacks in storytelling. Instead of traditional live action, a painterly style like this could create a dreamlike effect that signals to the audience: this is a memory, a moment removed from time.
Making this production-ready is the real challenge. Models need to be trained on motion data, not just still images, to achieve fluidity. And as always, fundamentals like segmentation, texture generation, and reprojection remain indispensable.
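For the curious, the core training-time idea is a temporal loss that punishes the stylized output for changing where the plate didn't (in the spirit of Ruder et al.'s video style transfer). The warped frame and occlusion mask here are assumed inputs, e.g. from a flow step like the one above:

```python
# A sketch of a temporal consistency loss for training on motion data.
# stylized_prev_warped and occlusion_mask are assumed to come from an
# optical-flow step; this is an illustration, not a specific paper's code.
import torch

def temporal_loss(stylized_cur, stylized_prev_warped, occlusion_mask):
    # Penalize the stylized output for changing in regions where the
    # plate itself didn't change; occluded pixels are masked out.
    diff = (stylized_cur - stylized_prev_warped) ** 2
    return (occlusion_mask * diff).mean()
```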
This attempt is raw, experimental, and imperfect, but it's a step I wanted to take. With time, effort, and a willingness to iterate, machine learning could open doors to creative possibilities we've only started to imagine.
P.S. If you're not experimenting, are you even creating? 😉