
back2future.pytorch's People

Contributors

anuragranj


back2future.pytorch's Issues

Non-shared weights for network branches?

Hi,

thanks for sharing the code.
I have a general question about the model's code.
I see that you use three branches with non-shared weights for the three input images in these lines:
https://github.com/anuragranj/back2future.pytorch/blob/master/back2future.py#L147-L166

In your experience, do you suggest using non-shared weights for image encoding?
I have noticed that PWC-Net shares weights across these branches.

I understand the training was done in Lua Torch rather than PyTorch, so I am not sure whether the weights were shared during training.
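For reference, the shared-weight alternative mentioned above can be sketched as a Siamese-style encoder that embeds all three frames with a single set of weights. This is an illustrative toy module, not the repository's actual architecture:

```python
# Hypothetical sketch: one shared encoder applied to three frames
# (the PWC-Net-style design), as opposed to three separate branches.
# Layer sizes here are illustrative only.
import torch
import torch.nn as nn


class SharedEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        # a single set of weights, reused for every input frame
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.1),
        )

    def forward(self, past, curr, future):
        # the same weights encode all three frames
        return self.conv(past), self.conv(curr), self.conv(future)
```

A shared encoder maps all three frames into the same feature space, which is what makes correlating their features meaningful; non-shared branches would have to learn compatible embeddings independently.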

Questions regarding the multiplicative constants for flow and the correlation max_displacement across levels

I am trying to understand the code and the design decisions made in this model.
In this regard, I have two questions:

  1. The correlation max_displacement is kept constant at 4 for all the levels of feature maps.
    self.corr = Correlation(pad_size=4, kernel_size=1, max_displacement=4, stride1=1, stride2=1, corr_multiply=1)

In your opinion, do you think we have to have different max_displacement for different levels of the network?
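One way to look at this, assuming level-l features are downsampled by a factor of 2**l: a fixed max_displacement already corresponds to a growing search radius at full resolution as you go up the pyramid. A back-of-envelope sketch:

```python
# Back-of-envelope sketch: a fixed max_displacement of 4 at pyramid
# level l covers a search radius of 4 * 2**l pixels at full resolution,
# assuming level-l features are downsampled by 2**l.
def effective_search_radius(level, max_displacement=4):
    """Full-resolution pixel radius covered by the correlation at `level`."""
    return max_displacement * 2 ** level

for level in (6, 5, 4, 3, 2):
    print(level, effective_search_radius(level))
# 6 -> 256, 5 -> 128, 4 -> 64, 3 -> 32, 2 -> 16
```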

  2. I would like to get some intuition regarding the multiplicative factors used for flow:
    0.625 for the level-6 flow
    feat5b_warped = self.warp(feat5b, 0.625*flow6_fwd_up)
    feat5c_warped = self.warp(feat5c, -0.625*flow6_bwd_up)

1.25 for level-5 flow

feat4b_warped = self.warp(feat4b, 1.25*flow5_fwd_up)
feat4c_warped = self.warp(feat4c, -1.25*flow5_bwd_up)

2.5 for level-4 flow

feat3b_warped = self.warp(feat3b, 2.5*flow4_fwd_up)
feat3c_warped = self.warp(feat3c, -2.5*flow4_bwd_up)

5 for level-3 flow

feat2b_warped = self.warp(feat2b, 5.0*flow3_fwd_up)
feat2c_warped = self.warp(feat2c, -5.0*flow3_bwd_up)

Could you please let me know why these multiplicative factors are needed?
I am sorry if this is a basic question. Since the network is learnable, isn't it possible that it learns this multiplicative factor automatically as well?
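For what it's worth, these particular values match the PWC-Net convention, where predicted flows are expressed as full-resolution pixels divided by 20 during training. Converting such a flow into pixel units at pyramid level l (resolution H/2**l) then requires multiplying by 20/2**l. A minimal sketch, assuming that convention applies to this codebase as well:

```python
# Sketch under the PWC-Net convention (an assumption here): network flows
# are full-resolution pixel displacements divided by 20, so warping the
# features at pyramid level l requires rescaling by 20 / 2**l.
def warp_scale(target_level, train_scale=20.0):
    """Factor converting a network flow into pixel units at `target_level`."""
    return train_scale / 2 ** target_level

for level in (5, 4, 3, 2):
    print(level, warp_scale(level))
# 5 -> 0.625, 4 -> 1.25, 3 -> 2.5, 2 -> 5.0
```

This reproduces exactly the 0.625 / 1.25 / 2.5 / 5.0 factors above, since the level-6 flow is used to warp level-5 features, the level-5 flow warps level-4 features, and so on.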

Thanks in advance!

How can I fix 'back2future_kitti.pth.tar' error?

After completing the necessary setup, I ran demo.py. The error I encountered is: FileNotFoundError: [Errno 2] No such file or directory: 'pretrained/back2future_kitti.pth.tar'. When I try to open the downloaded file directly, I get the error "an error occurred while loading the archive". Could the error be caused by a corrupted file?

Secondly, this is the output I got when I ran the test command you provided, python3 test_back2future.py --pretrained-flow path/to/pretrained/model --kitti-dir path/to/kitti/2015/root:
Traceback (most recent call last):
   File "test_back2future.py", line 8, in
     from path import Path
ImportError: cannot import name 'Path'

What solution can I apply for this?
Thanks in advance. @anuragranj @JJanai
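Regarding the ImportError: the script expects the third-party `path` package (historically published as `path.py`), which exports `Path`; whether installing it resolves the issue in this environment is an assumption. One possible workaround is a guarded import with a standard-library fallback:

```python
# Hypothetical workaround sketch: prefer the third-party `path` package
# the script was written against, but fall back to the standard library's
# pathlib if it is missing. Note the two Path classes have similar but
# not identical APIs, so later calls in the script may still need changes.
try:
    from path import Path  # third-party `path` (path.py) package
except ImportError:
    from pathlib import Path  # stdlib fallback
```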

Question on the argument for backward warping

Hi,

In your PyTorch implementation, I don't understand why you use the forward upsampled flow for backward warping. Shouldn't it be the backward upsampled flow? Warping the past feature (feat5c) into the current frame should use the estimated past->curr flow.

For example, in your code in back2future.py:

        feat5b_warped = self.warp(feat5b, 0.625*flow6_fwd_up)
        feat5c_warped = self.warp(feat5c, -0.625*flow6_fwd_up) # Q: flow6_fwd_up-> flow6_bwd_up?
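For context, `self.warp` presumably performs dense backward warping, i.e. it samples the source feature map at positions displaced by the flow. A minimal standalone sketch of such a warp using `grid_sample` (an assumption about the repo's implementation, not a copy of it):

```python
# Minimal backward-warping sketch with torch.nn.functional.grid_sample,
# assuming `flow` is a dense (B, 2, H, W) displacement field in pixels
# with the x-component first. Illustrative only.
import torch
import torch.nn.functional as F


def warp(feat, flow):
    """Sample `feat` at each pixel's position displaced by `flow`."""
    b, _, h, w = feat.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    grid = torch.stack((xs, ys), dim=0).float()       # (2, H, W), x first
    coords = grid.unsqueeze(0) + flow                 # displaced positions
    # normalize coordinates to [-1, 1] as grid_sample expects
    coords_x = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0
    norm_grid = torch.stack((coords_x, coords_y), dim=3)  # (B, H, W, 2)
    return F.grid_sample(feat, norm_grid, align_corners=True)
```

Under this formulation, warping with `-flow` samples in the opposite direction, which is presumably why the code negates the flow for the past frame; whether the forward flow (rather than the backward one) is the intended argument is exactly the question above.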

It would be great if you could answer my question.

Thank you,
Suhong

Some problems with the occlusion prediction

Thanks for your great work!

I tried to binarize the occlusion map using nn.Sigmoid and np.round, but that doesn't seem to be the right way.

I want to know the right way to visualize the occlusion map.
Alternatively, could you send me the occlusion visualizations on KITTI, Sintel, and Middlebury if you have them?
I would be very grateful. :blush:
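For reference, the binarization attempt described above amounts to something like the following; whether the network's raw occlusion output should be treated as logits (and hence passed through a sigmoid at all) is an assumption:

```python
# Hypothetical sketch of the described attempt: squash the raw occlusion
# output through a sigmoid and threshold at 0.5. Treating the output as
# logits is an assumption about the network, not a fact from the repo.
import numpy as np


def binarize_occlusion(occ, threshold=0.5):
    """Map a raw occlusion estimate to a binary {0, 1} mask."""
    prob = 1.0 / (1.0 + np.exp(-occ))       # sigmoid, if occ holds logits
    return (prob > threshold).astype(np.uint8)
```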

Confusion about the occlusion map

Thanks for sharing!

After reading your paper, I noticed that the occlusion estimate is mapped to (0, 1) and used as a weight on the photometric loss.
However, in the paper the occlusion estimate in Fig. 3 looks like a binary map, and at the end of the second paragraph of Section 4.2 you say 'This clearly shows the benefit of ignoring misleading information in accordance with the occlusion estimates'. This is a little confusing to me. Is the occlusion estimate transformed into a binary map, or are the occlusion maps binarized only for visualization?

Any reply will be appreciated~
Thanks again.
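The weighting scheme asked about above can be sketched as follows; the paper's actual loss may use a different penalty and normalization, so this is only a minimal illustration of a soft occlusion weight down-weighting photometric residuals:

```python
# Minimal sketch of an occlusion-weighted photometric loss, assuming a
# soft occlusion estimate occ in (0, 1) scales each pixel's residual.
# The exact penalty and normalization in the paper may differ.
import numpy as np


def weighted_photometric_loss(target, warped, occ, eps=1e-8):
    """Average photometric error, with occ weighting each pixel."""
    residual = np.abs(target - warped)
    return (occ * residual).sum() / (occ.sum() + eps)
```

A weight near zero effectively removes a pixel from the loss, which matches the quoted idea of "ignoring misleading information" at occluded pixels without requiring a hard binary mask.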
