Comments (3)
So am I correct that the forward pass performs:
- the nn.Sequential stack, i.e. Conv-ReLU-Dropout, Conv-ReLU-Dropout,
- then adds the output of the above to the input,
- and then finally another ReLU (as seen here https://github.com/locuslab/TCN/blob/master/TCN/tcn.py#L45)?
I am only a bit confused by this "outer ReLU" part, since it means ReLU is applied twice in direct succession (once at the end of the Sequential, and again after adding to the input). So I was wondering whether applying a non-linearity essentially "twice" makes anything worse. But good to know that all the variations basically work?
The reason I am asking in detail is that I was looking for simple residual variants that work well without batch norm in these kinds of networks. It seems odd to me to add something positive (since it was ReLU-d) in every layer - do activations not grow too large with many layers? The same goes for always applying ReLU to the shortcut connection: I thought the point was to leave the shortcut alone so information always gets propagated forward unchanged.
So having only a Conv at the end, adding that to the input (more like ResNet), and just outputting the result seemed more intuitive to me.
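For concreteness, the forward pass described above can be sketched roughly as follows. This is a hypothetical, simplified version of the block (the dilation, causal padding/chomp, and weight norm from the real tcn.py are omitted, and the class/parameter names are my own):

```python
import torch
import torch.nn as nn

class SimpleTemporalBlock(nn.Module):
    """Simplified sketch: Conv-ReLU-Dropout twice, residual add, then an outer ReLU."""

    def __init__(self, n_in, n_out, kernel_size=3, dropout=0.2):
        super().__init__()
        pad = kernel_size // 2  # "same" padding for simplicity (the real code uses causal padding)
        self.net = nn.Sequential(
            nn.Conv1d(n_in, n_out, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Conv1d(n_out, n_out, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Dropout(dropout),
        )
        # 1x1 conv to match channel counts on the shortcut when they differ
        self.downsample = nn.Conv1d(n_in, n_out, 1) if n_in != n_out else None
        self.relu = nn.ReLU()  # the "outer ReLU" being asked about

    def forward(self, x):
        out = self.net(x)
        res = x if self.downsample is None else self.downsample(x)
        return self.relu(out + res)  # ReLU applied again, right after the add

x = torch.randn(4, 8, 50)  # (batch, channels, time)
y = SimpleTemporalBlock(8, 16)(x)
print(y.shape)  # torch.Size([4, 16, 50])
```

Note that the outer ReLU makes every block output non-negative, which is exactly the property the question is about.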
from tcn.
Also, how does it make sense to apply ReLU after the convolution, add the output to the identity connection, and then apply ReLU again?
Are you following any kind of standard Resnet architecture?
Hi! Regarding your questions:
- It is not a mistake. The ReLU in the block corresponds to the ReLU in nn.Sequential (i.e., https://github.com/locuslab/TCN/blob/master/TCN/tcn.py#L31). There is an outer ReLU, too.
- You can certainly remove the self.relu2 as well (in practice I don't find much difference in performance). Then it would be more like a "standard ResNet". But at a high level, the residual blocks are just there to make sure the gradient can flow properly even in a very deep network, so I don't think using a ReLU after the identity connection would be a big problem.
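The "standard ResNet"-style variant mentioned above might look like the following sketch: the Sequential ends in a plain Conv, and there is no ReLU after the residual add, so the shortcut path is left untouched. Names and the simplified structure (no dilation, chomp, or weight norm) are my own, not the repo's:

```python
import torch
import torch.nn as nn

class ResNetStyleTemporalBlock(nn.Module):
    """Sketch of the variant without the outer ReLU: output = net(x) + shortcut."""

    def __init__(self, n_in, n_out, kernel_size=3, dropout=0.2):
        super().__init__()
        pad = kernel_size // 2  # "same" padding for simplicity
        self.net = nn.Sequential(
            nn.Conv1d(n_in, n_out, kernel_size, padding=pad),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Conv1d(n_out, n_out, kernel_size, padding=pad),  # ends in conv, not ReLU
        )
        self.downsample = nn.Conv1d(n_in, n_out, 1) if n_in != n_out else None

    def forward(self, x):
        res = x if self.downsample is None else self.downsample(x)
        # No ReLU after the add: the identity path carries gradients unchanged,
        # and the block output may be negative.
        return self.net(x) + res

x = torch.randn(2, 4, 20)
y = ResNetStyleTemporalBlock(4, 4)(x)
print(y.shape)  # torch.Size([2, 4, 20])
```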
Let me know if this helps!
Related Issues (20)
- Function call issue
- LSTM and RNN used and issues of compatibility HOT 1
- issue about Input of TCN HOT 1
- ModuleNotFoundError: No module named 'tcn' HOT 2
- Clarification on figure 3(a) HOT 4
- Training on variable-length sequences HOT 1
- copy memory questions
- why raise AssertionError("Torch not compiled with CUDA enabled") AssertionError: Torch not compiled with CUDA enabled
- seq2seq
- How should I choose correct layers number?
- How to save model?
- Code Question about: input the final conv-layer output to the linear layer
- What is the accuracy supposed to be for the MNIST problem?
- Is TCN suitable for spatio-temporal data? HOT 7
- why?
- Correlate .mat files with songs in Nottingham dataset
- Zero padding - possibly incorrect behavior? HOT 1
- DDP training with TCN Model
- do you have code examples for multivariate time series
- loss=nan