LiteFlowNet

This repository is the release of LiteFlowNet for our CVPR 2018 (Spotlight) paper, LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation. The up-to-date version of the paper is available on arXiv.

For more details about LiteFlowNet, please visit my project page.

It is a fork of the modified Caffe master branches from DispFlowNet and FlowNet2, extended with our new layers, scripts, and trained models.

License and Citation 

All code and other materials (including but not limited to the paper, figures, and tables) are provided for research purposes only and without any warranty. Any commercial use requires our consent. When using any parts of the code package or the paper (LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation) in your work, please cite the following paper:

@InProceedings{hui18liteflownet,
  author    = {Tak-Wai Hui and Xiaoou Tang and Chen Change Loy},
  title     = {LiteFlowNet: A Lightweight Convolutional Neural Network for Optical Flow Estimation},
  booktitle = {Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2018},
  url       = {http://mmlab.ie.cuhk.edu.hk/projects/LiteFlowNet/}
}

Prerequisites

Installation was tested under Ubuntu 14.04.5 and 16.04.2 with CUDA 8.0 and cuDNN 5.1.

Edit Makefile.config (and Makefile) if necessary in order to fit your machine's settings.

For OpenCV 3+, you may need to change opencv2/gpu/gpu.hpp to opencv2/cudaarithm.hpp in /LiteFlowNet/src/caffe/layers/resample_layer.cu.

If a newer version of cuDNN is installed on your machine, you do not need to downgrade it. You can use the following trick instead:

  1. Download cudnn-8.0-linux-x64-v5.1.tgz and untar it to a temp folder, say cuda-8-cudnn-5.1

  2. Rename cudnn.h to cudnn-5.1.h in the folder /cuda-8-cudnn-5.1/include

  3. $ sudo cp cuda-8-cudnn-5.1/include/cudnn-5.1.h /usr/local/cuda/include/	
    $ sudo cp cuda-8-cudnn-5.1/lib64/lib* /usr/local/cuda/lib64/
  4. Replace #include <cudnn.h> with #include <cudnn-5.1.h> in LiteFlowNet/include/caffe/util/cudnn.hpp.

Compiling

$ cd LiteFlowNet
$ make -j 8 tools pycaffe
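
After the build finishes, you can quickly verify that pycaffe compiled correctly by importing it from Python. A minimal sanity check, assuming the standard Caffe layout where the Python bindings are built under LiteFlowNet/python (adjust the path to your checkout):

# check_pycaffe.py -- minimal pycaffe import test (the path below is an assumption)
import sys
sys.path.insert(0, '/path/to/LiteFlowNet/python')
import caffe
caffe.set_mode_gpu()          # or caffe.set_mode_cpu() if no GPU is available
print('pycaffe loaded from', caffe.__file__)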

Datasets

  1. FlyingChairs dataset (31GB) and train-validation split.
  2. RGB image pairs (clean pass) (37GB) and flow fields (311GB) for Things3D dataset.
  3. Sintel dataset (clean + final passes) (5.3GB).
  4. KITTI12 dataset (2GB) and KITTI15 dataset (2GB) (Simple registration is required).
The crop and batch sizes used for training on each dataset are listed below.

             FlyingChairs   FlyingThings3D   Sintel      KITTI
Crop size    448 x 320      768 x 384        768 x 384   896 x 320
Batch size   8              4                4           4

Training

(prototxt files will be available soon)

  1. Prepare the training set. In LiteFlowNet/data/make-lmdbs-train.sh, change YOUR_TRAINING_SET and YOUR_TESTING_SET to your favourite dataset.
$ cd LiteFlowNet/data
$ ./make-lmdbs-train.sh
  2. Copy files from LiteFlowNet/models/training_template to a new model folder (e.g. NEW). Edit all the files and make sure all settings are correct.
$ mkdir LiteFlowNet/models/NEW
$ cd LiteFlowNet/models/NEW
$ cp ../training_template/solver.prototxt.template solver.prototxt	
$ cp ../training_template/train.prototxt.template train.prototxt
$ cp ../training_template/train.py.template train.py
  3. Create a soft link in your new model folder.
$ ln -s ../../build/tools bin
  4. Run the training script.
$ ./train.py -gpu 0 2>&1 | tee ./log.txt
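
The training loss is written to log.txt by the tee command above. For a quick view of how the loss evolves, a small parser like the one below can extract it; this is only a sketch and assumes the standard Caffe solver log format with lines such as "Iteration 1000, loss = 0.123":

# parse_loss.py -- rough loss curve from a Caffe training log (log line format is an assumption)
import re

iters, losses = [], []
with open('log.txt') as f:
    for line in f:
        m = re.search(r'Iteration (\d+), loss = ([-+\d.eE]+)', line)
        if m:
            iters.append(int(m.group(1)))
            losses.append(float(m.group(2)))

for it, loss in zip(iters, losses):
    print(it, loss)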

Trained models

The trained models (liteflownet, liteflownet-ft-sintel, liteflownet-ft-kitti) are available in the folder LiteFlowNet/models/trained. Untar the files into the same folder before using them.

Testing

(prototxt files will be available soon)

  1. Open the testing folder
$ cd LiteFlowNet/models/testing
  2. Create a soft link in the folder /testing
$ ln -s ../../build/tools bin
  3. Replace MODE in ./test_MODE.py with batch if all the images have the same resolution (e.g. the Sintel dataset); otherwise replace it with iter (e.g. the KITTI dataset).

  4. Replace MODEL in line 10 (cnn_model = 'MODEL') of test_MODE.py with one of the trained models (e.g. liteflownet-ft-sintel).

  5. Run the testing script. Flow fields (MODEL-0000000.flo, MODEL-0000001.flo, etc.) are stored in the folder /testing/results in the same order as the image pair list.

$ test_MODE.py img1_pathList.txt img2_pathList.txt results
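
The generated .flo files can be loaded for inspection or further processing. Below is a minimal reader, assuming the outputs follow the standard Middlebury .flo format (a float32 magic number 202021.25, then the width and height as int32, then interleaved u/v values):

# read_flo.py -- read a Middlebury-format .flo file into an (H, W, 2) array
import numpy as np

def read_flo(path):
    with open(path, 'rb') as f:
        magic = np.fromfile(f, np.float32, count=1)[0]
        assert magic == 202021.25, 'invalid .flo file: %s' % path
        w = int(np.fromfile(f, np.int32, count=1)[0])
        h = int(np.fromfile(f, np.int32, count=1)[0])
        data = np.fromfile(f, np.float32, count=2 * w * h)
    return data.reshape(h, w, 2)      # [..., 0] = u (horizontal), [..., 1] = v (vertical)

flow = read_flo('results/MODEL-0000000.flo')
print(flow.shape)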

Evaluation

  1. The end-point error (EPE) per image can be calculated using the provided script LiteFlowNet/models/testing/util/endPointErr.m.

  2. The average end-point error (AEE) is simply computed by taking the average of the per-image EPE over all images.
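
If you prefer Python over the provided MATLAB script, both quantities can be computed directly from the flow fields. A minimal sketch following the definitions above (EPE is the Euclidean distance between estimated and ground-truth flow vectors, averaged over the pixels of an image; AEE is the mean of the per-image EPE); read_flo is the helper sketched in the Testing section:

# epe.py -- per-image end-point error (EPE) and average end-point error (AEE)
import numpy as np

def epe(flow_est, flow_gt):
    # per-pixel Euclidean distance between estimated and ground-truth flow, averaged over the image
    return float(np.mean(np.sqrt(np.sum((flow_est - flow_gt) ** 2, axis=2))))

def aee(pairs):
    # pairs: list of (estimated .flo path, ground-truth .flo path); read_flo is from the Testing section
    return float(np.mean([epe(read_flo(est), read_flo(gt)) for est, gt in pairs]))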
