Super Slo-Mo Video: Open Source Application Affiliated with Telestream
This open-source application was created in affiliation with the video software company Telestream; it interpolates standard-frame-rate footage into slow-motion video.

The interpolation pipeline consists of two steps. The first step uses NVIDIA's Optical Flow SDK to compute optical flow vectors between consecutive frames (an optical flow field is a vector field describing how to move the pixels of the first frame to produce the second). Running flow estimation on NVIDIA hardware greatly accelerates this stage, which is typically the bottleneck in similar optical-flow-based interpolation tools.
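As a rough illustration of what a dense flow field is and how it can be used to synthesize an in-between frame, the sketch below uses OpenCV's CPU Farneback flow as a stand-in for the NVIDIA Optical Flow SDK (the project itself uses the GPU SDK). The function name and the t = 0.5 warping approximation are illustrative, not taken from the project's code.

```python
import cv2
import numpy as np

def interpolate_midframe(frame_a, frame_b):
    """Warp frame_a halfway toward frame_b using a dense optical flow field.

    Farneback flow is a CPU stand-in for the NVIDIA Optical Flow SDK here;
    the idea of warping along the flow is the same.
    """
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # flow[y, x] = (dx, dy): how each pixel of frame_a moves to reach frame_b
    flow = cv2.calcOpticalFlowFarneback(
        gray_a, gray_b, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))

    # Backward-sample frame_a half a flow step back: a rough approximation
    # of the frame at time t = 0.5 between the two inputs.
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)

    return cv2.remap(frame_a, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```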

The second step involves a convolutional neural network. Optical flow alone can synthesize new frames, but parts of the image will contain artifacts, especially where an object enters or leaves the frame or moves quickly across it. The original frames and the interpolated frames are passed through the deep learning model to clean up those artifacts.
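A minimal sketch of what such a refinement network might look like, assuming PyTorch: it takes the two original frames plus the flow-warped intermediate frame (nine input channels) and predicts a corrected intermediate frame. The project's actual architecture is not described on this page, so the layer stack below is purely illustrative.

```python
import torch
import torch.nn as nn

class RefineNet(nn.Module):
    """Illustrative refinement CNN (not the project's actual architecture).

    Input: the two original frames and the flow-warped intermediate frame,
    concatenated along the channel axis (3 + 3 + 3 = 9 channels).
    Output: a residual added to the warped frame to remove warping
    artifacts (occlusions, fast motion, frame boundaries).
    """
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(9, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, frame_a, frame_b, warped_mid):
        x = torch.cat([frame_a, frame_b, warped_mid], dim=1)
        return warped_mid + self.body(x)  # residual correction of the warp
```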

The final product is an end-to-end application that takes any video input and quickly returns a slowed version of the file.
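The overall flow can be sketched as below, assuming OpenCV for video I/O and reusing the hypothetical helpers from the earlier snippets: insert one interpolated frame between each original pair and write the result at the original frame rate, so playback runs at half speed. File handling and naming are illustrative, not the application's actual interface.

```python
import cv2

def slow_motion(input_path, output_path, refine_net=None):
    """Sketch of the end-to-end pipeline: read frames, insert one
    interpolated frame between each original pair, and write the result
    at the source frame rate so the output plays back at half speed.

    interpolate_midframe() is the illustrative helper defined earlier;
    refine_net stands in for the trained cleanup CNN.
    """
    cap = cv2.VideoCapture(input_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(output_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    ok, prev = cap.read()
    while ok:
        ok, cur = cap.read()
        if not ok:
            out.write(prev)          # flush the final original frame
            break
        mid = interpolate_midframe(prev, cur)   # step 1: flow-based warp
        if refine_net is not None:
            mid = refine_net(prev, cur, mid)    # step 2: CNN cleanup (illustrative call)
        out.write(prev)
        out.write(mid)
        prev = cur

    cap.release()
    out.release()
```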
