
Comments (3)

f90 commented on August 15, 2024

So am I correct that the forward pass applies a ReLU at the end of the Sequential and then another ReLU after the result is added to the input?

I am only a bit confused about this "outer ReLU" part, since it essentially means you apply ReLU twice almost directly in succession (once at the end of the Sequential, and then again after you add to the input). So I was wondering whether applying a non-linearity essentially "twice" makes anything worse. But good to know that all the variations basically work?

The reason I am asking in detail is that I was looking for simple residual variants that work well and do not need batch norm in these kinds of networks. It seems odd to me to always add something positive in each layer (since it has been through a ReLU) - don't the activations get too large with many layers? The same goes for always applying a ReLU to the shortcut connection; I thought the point is to leave the shortcut connection alone so that information always gets propagated forward.

So only having a Conv at the end, adding that to the input (more like ResNet), and then just outputting the sum seemed more intuitive to me.
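For concreteness, here is a minimal PyTorch sketch of the two block variants being compared. It is a simplification, assuming plain (non-causal, non-dilated) 1D convolutions and omitting the weight norm, chomping, dropout, and optional 1x1 downsample of the actual TemporalBlock in tcn.py; the class and attribute names are illustrative, not the repository's.

```python
import torch
import torch.nn as nn

class OuterReLUBlock(nn.Module):
    """Variant discussed above: the inner Sequential already ends with a ReLU,
    and another ReLU is applied after the shortcut is added."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.ReLU(),                      # ReLU at the end of the Sequential
        )
        self.outer_relu = nn.ReLU()

    def forward(self, x):
        return self.outer_relu(self.net(x) + x)  # second ReLU after the add


class ConvThenAddBlock(nn.Module):
    """ResNet-like variant suggested above: the block ends with a Conv, the
    input is added, and the sum is returned with no further non-linearity."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),  # no trailing ReLU
        )

    def forward(self, x):
        return self.net(x) + x              # shortcut path left untouched


# Quick shape check on dummy data: (batch, channels, time)
x = torch.randn(4, 32, 100)
print(OuterReLUBlock(32)(x).shape, ConvThenAddBlock(32)(x).shape)
```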


f90 commented on August 15, 2024

Also, how does it make sense to apply ReLU after the convolution, add the output to the identity connection, and then perform ReLU again?
Are you following any kind of standard ResNet architecture?


jerrybai1995 commented on August 15, 2024

Hi! Regarding your questions:

  1. It is not a mistake. The ReLU in the block corresponds to the ReLU in nn.Sequential (i.e., https://github.com/locuslab/TCN/blob/master/TCN/tcn.py#L31). There is an outer-ReLU, too.

  2. You can also certainly remove the self.relu2 (in practice I don't find much difference in performance); then it would be more like a "standard ResNet" (see the sketch below). But at a high level, the residual blocks are just there to make sure the gradient can flow properly even in a very deep network, so I don't think applying a ReLU after the identity connection is added is a big problem.
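For reference, a rough sketch of the two forward passes described above, assuming the block's convolution stack is stored in self.net, the optional 1x1 convolution in self.downsample, and the outer activation in self.relu (a simplification of the block's forward method, not the verbatim repository code):

```python
def forward_with_outer_relu(self, x):
    # Variant with the outer ReLU: apply a ReLU after the residual addition.
    out = self.net(x)
    res = x if self.downsample is None else self.downsample(x)
    return self.relu(out + res)

def forward_without_outer_relu(self, x):
    # "Standard ResNet"-style alternative: return the sum directly, leaving
    # the identity path untouched at the block output.
    out = self.net(x)
    res = x if self.downsample is None else self.downsample(x)
    return out + res
```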

Let me know if this helps!

