Using machine-learning to generate high frame rate (slow-mo) ActionVFX assets

Hello everyone!

I’m trying to be more open about my experimental processes for some of the side projects I spin up as I go along. I’m constantly evaluating new technologies and the implications they can have on the products we create at ActionVFX. So this post is me trying to let people in a little earlier in the exploratory process! :slight_smile:

In this test, I’m using an early-stage machine-learning model to generate a high frame rate video clip based on our existing assets.

(Watch in max resolution)

It analyzes the frames contained in the clips provided and generates completely new frames in the exported video clip. In its most basic form, it analyzes Frame A and Frame C and solves for a brand-new Frame B in between. That example barely scratches the surface of how it actually works, though. :laughing:
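To make the Frame A / Frame C idea concrete, here’s a deliberately naive toy sketch in Python (my own illustration, not how RIFE actually works): it just averages the two frames pixel by pixel, whereas the real model estimates the motion between them and warps along it.

```python
def naive_midpoint(frame_a, frame_c):
    # Average each pixel of Frame A and Frame C to fake a "Frame B".
    # Real interpolators like RIFE estimate per-pixel motion (optical flow)
    # and warp along it, which avoids the ghosting this naive blend causes
    # on anything that actually moves.
    return [[(a + c) // 2 for a, c in zip(row_a, row_c)]
            for row_a, row_c in zip(frame_a, frame_c)]

frame_a = [[0, 0], [0, 0]]          # tiny all-black 2x2 "frame"
frame_c = [[100, 100], [100, 100]]  # brighter 2x2 "frame"
frame_b = naive_midpoint(frame_a, frame_c)
print(frame_b)  # [[50, 50], [50, 50]] -- halfway between the two
```

On a static shot this blend looks fine; on moving smoke or fire it would just double-expose, which is exactly why the learned-motion approach matters.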

You’ll notice that when slowed to 10% speed, the un-interpolated clip begins dropping frames, while the interpolated clip, with its higher frame rate, keeps a much smoother motion.
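Some quick napkin math on why that happens (the frame rates below are just assumed for illustration; the actual rates of the clips in the video aren’t stated here):

```python
def effective_fps(source_fps, playback_percent):
    # Unique frames shown per second once a clip is slowed down:
    # at 10% speed, you only get 10% of the source frame rate.
    return source_fps * playback_percent // 100

print(effective_fps(30, 10))   # 3  -> visibly steppy at 10% speed
print(effective_fps(300, 10))  # 30 -> still smooth at 10% speed
```

So the interpolated clip isn’t “slowed down better”, it simply has enough real frames left over to stay smooth.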

One implication of this type of technology is that we could retroactively process the elements in our existing effects library to offer an experimental “High Frame Rate” download option: if someone needed a clip at, say, 500FPS, they could simply download it.
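For a sense of scale: RIFE-style interpolators typically double the frame count per pass, so hitting a 500FPS target from, say, a 30FPS source (an assumed number, just for illustration) takes several passes. A quick sketch:

```python
import math

def doublings_needed(source_fps, target_fps):
    # Number of 2x interpolation passes needed to reach at least target_fps,
    # assuming each pass doubles the frame count (as RIFE-style tools do).
    return max(0, math.ceil(math.log2(target_fps / source_fps)))

n = doublings_needed(30, 500)
print(n, 30 * 2 ** n)  # 5 960 -- five doublings overshoots 500FPS to 960
```

That overshoot is part of why this needs more investigation: you’d land on power-of-two multiples of the source rate, not an arbitrary target like exactly 500FPS.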

It would take quite a bit more investigative work before this would be feasible, but would this even be something you’d be interested in us doing? Because, well…


*(GIF: “yes yes yes” Moses, by Nonnahs Marketing)*


That’s awesome Luke! That is some really powerful AI that you are working with!


Oh yes! I heard about that for animation, but not for real video!

Oh, and can I try it for myself?

It’s a bit of a convoluted process to achieve even these experimental results right now. There’s still quite a lot I don’t understand just yet. But if something becomes available that makes this process more user-friendly I’ll definitely let you know! :slight_smile:


So it’s publicly available, but I just wouldn’t understand it? If so, I wanna try it anyway! :hugs:

The documentation on it isn’t as detailed as I’d like, but here’s a link to the code (GitHub - hzwer/arXiv2020-RIFE: RIFE: Real-Time Intermediate Flow Estimation for Video Frame Interpolation) where you can find more info! :slight_smile:

I’m hoping this type of model will eventually be integrated into everyday tools so things can be made much more accessible for artists.

Cool! I’ll have a look at it now!


Oh wait… My computer is weak!

*(GIF: Kristen Wiig “No”, by The Lonely Island)*


I know… I don’t even have a graphics card! Instead, I have Intel HD Graphics 4000, from 2013.
It’s even a Mini-ITX build, I think.

A few years old, but a cool approach.