This is the source code for Dain-App, a video interpolation application developed on top of the source code of the DAIN project (Depth-Aware Video Frame Interpolation).
- Introduction
- Citation
- Requirements and Dependencies
- Installation
- Running application with interface
- Running application with command line
- Slow-motion Generation
- Training New Models
- Google Colab Demo
Dain-App comes with a user interface and a command-line script so that new users can start using it with little to no change to the code. You can also get the Windows binary from the build section.
You can see a few results in these YouTube videos:
- Turning animations to 60FPS
- Turning Sprite Art to 60FPS
- Turning Stop Motion to 60FPS
- Turning ANIME P1 to 60FPS
- Turning ANIME P2 to 60FPS
If you find the code and datasets useful in your research, please cite:
@article{Dain-App,
title={Dain-App: Application for Video Interpolations},
author={Gabriel Poetsch},
year={2020}
}
@inproceedings{DAIN,
author = {Bao, Wenbo and Lai, Wei-Sheng and Ma, Chao and Zhang, Xiaoyun and Gao, Zhiyong and Yang, Ming-Hsuan},
title = {Depth-Aware Video Frame Interpolation},
booktitle = {IEEE Conference on Computer Vision and Pattern Recognition},
year = {2019}
}
@article{MEMC-Net,
title={MEMC-Net: Motion Estimation and Motion Compensation Driven Neural Network for Video Interpolation and Enhancement},
author={Bao, Wenbo and Lai, Wei-Sheng and Zhang, Xiaoyun and Gao, Zhiyong and Yang, Ming-Hsuan},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
doi={10.1109/TPAMI.2019.2941941},
year={2018}
}
- numba=0.51.2
- numpy=1.19.2
- opencv-python=4.4.0.46
- pillow=8.0.1
- pyqt5=5.15.1
- python=3.8.5
- scikit-learn=0.23.2
- scipy=1.5.4
- torch=1.7.0+cu110
- torchvision=0.8.1+cu110
- tqdm=4.51.0
- ffmpeg
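A minimal sketch of installing the Python dependencies with pip, assuming a CUDA 11.0 capable GPU; ffmpeg itself is installed separately through your system's package manager:
pip install numba==0.51.2 numpy==1.19.2 opencv-python==4.4.0.46 pillow==8.0.1 pyqt5==5.15.1 scikit-learn==0.23.2 scipy==1.5.4 tqdm==4.51.0
pip install torch==1.7.0+cu110 torchvision==0.8.1+cu110 -f https://download.pytorch.org/whl/torch_stable.html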
Remember that you need to build the CUDA extensions before the app can work.
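A sketch of building them, assuming the layout of the upstream DAIN repository where each CUDA package ships its own build script; the folder names may differ in this checkout:
cd my_package
./build.sh
cd ../PWCNet/correlation_package_pytorch1_0
./build.sh
cd ../..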
To launch the graphical interface, run:
python my_design.py
You can see all CLI options with:
python my_design.py -cli -h
An example of a working command:
python my_design.py -cli --input "gif/example.gif" -o "example_folder/" -on "interpolated.gif" -m "model_weights/best.pth" -fh 3 --interpolations 2 --depth_awarenes 0 --loop 0 -p 0 --alpha 0 --check_scene_change 10 --png_compress 0 --crf 1 --pixel_upscale_downscale_before 1 --pixel_downscale_upscale_after 1 --pixel_upscale_after 1 --mute_ffmpeg 1 --split_size_x -1 --split_size_y -1 --split_pad 150 --half 0 --step_extract 1 --step_interpolate 1 --batch_size 1 --use_benchmark 0 --force_flow 1 --smooth_flow 0 --downscale -1 --fast_mode 0
Currently the Dain-App training code is broken. To train new models, use the DAIN GitHub repository and import the resulting models into Dain-App.
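For example, a sketch of importing weights trained with the upstream DAIN code; the paths and file names here are purely illustrative:
cp ../DAIN/model_weights/best.pth model_weights/my_trained.pth
python my_design.py -cli --input "gif/example.gif" -o "example_folder/" -on "interpolated.gif" -m "model_weights/my_trained.pth" --interpolations 2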
See the MIT License.