Comments (10)
I have written a function for this:
def get_receptive(kernel_size, levels, dilation_exponential_base):
    # Receptive field = 1 + sum over all levels of dilation * (kernel_size - 1),
    # where the dilation at level l is dilation_exponential_base ** (l - 1).
    return sum(dilation_exponential_base ** (l - 1) * (kernel_size - 1) for l in range(levels, 0, -1)) + 1
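For example, with kernel size 2, 8 levels, and a dilation base of 2 (illustrative values only), the per-level contributions form a geometric series and the receptive field comes out to 2^8 = 256:

print(get_receptive(2, 8, 2))  # 1 + (2-1)*(2^0 + 2^1 + ... + 2^7) = 1 + 255 = 256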
Hi, David, thanks for your reply. It's right that the receptive area increases additively per layer, but I think that two stacked layers with the same dimension do increase the receptive area. In the trivial case, suppose there are two identical conv1d layers with kernel size = 3 and dilation = 1. The receptive area after the first conv layer is 3 (i.e. 1 + 1*(3-1), where the middle 1 is the dilation), and after the second it is 5 (i.e. 1 + 1*(3-1) + 1*(3-1) = 1 + 2*1*(3-1)). By the way, you missed the +1 in the equation.
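One way to confirm this empirically (a sketch of my own, not from the thread) is to backpropagate from a single output position of two stacked kernel-size-3, dilation-1 convolutions and count how many input positions receive a nonzero gradient; with randomly initialized (almost surely nonzero) weights this prints 5:

import torch
import torch.nn as nn

net = nn.Sequential(nn.Conv1d(1, 1, 3), nn.Conv1d(1, 1, 3))  # two k=3, d=1 convs
x = torch.randn(1, 1, 100, requires_grad=True)
net(x)[0, 0, 50].backward()            # gradient from one output position
print(int((x.grad[0, 0] != 0).sum()))  # 5 input positions = receptive field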
The receptive area grows roughly exponentially if you use the default setting, because 2^0 + 2^1 + ... + 2^{i-1} = 2^i - 1. You can use that as an estimate.
Thank you, get it :)
Hi! I was inspired by your wonderful work to use TCN for my project. I wrote a short Matlab script that is supposed to calculate the effective receptive field. Element j in the output vector RF shows the receptive field for a network with j hidden layers. Could you please evaluate the code below, so other people can hopefully use it?
@jerrybai1995
k = 6;  % kernel size
n = 7;  % number of hidden layers
d = 2;  % dilation factor
num_layers = 1:n+1;              % hidden layers + input layer
dilation = d.^(num_layers - 1);  % dilation at each layer
RF = zeros(1, length(num_layers));
RF(1) = k;  % first RF is the kernel size
for layer = 2:length(dilation)  % each further layer adds (k-1)*dilation
    RF(layer) = RF(layer - 1) + (k - 1)*dilation(layer);
end
That looks right to me. I believe the equation is as follows:

RF = 1 + (k - 1) * (d^0 + d^1 + ... + d^n) = 1 + (k - 1) * (d^(n+1) - 1) / (d - 1)

where
k = kernel size
n = number of hidden layers
d = dilation factor
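As a quick sanity check (a sketch of my own, reusing k = 6, n = 7, d = 2 from the script above), a direct Python translation of the loop agrees with that closed form:

k, n, d = 6, 7, 2
rf = [k]  # receptive field with one layer
for layer in range(1, n + 1):
    rf.append(rf[-1] + (k - 1) * d ** layer)
print(rf[-1])                                       # 1276
print(1 + (k - 1) * (d ** (n + 1) - 1) // (d - 1))  # 1276, closed form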
@david-waterworth, I notice that there are 2 consecutive dilated convolution layers in each residual block, so the real receptive field is 2 * the value calculated by your equation. Am I right?
I believe not. If you look at the tcn.py code, it shows that the number of residual blocks equals the length of hidden_layers.
You're correct that there are 2 convs per residual block. But I believe they have the same dimension, so they don't increase the receptive area. Also, you wouldn't double the value: the receptive area doesn't increase multiplicatively per layer, it's additive, i.e. each layer increases the receptive area by the length of one conv filter less one step.
Yes you're correct. I'm going to have to draw it up on the whiteboard again :)
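Summarizing where the thread lands (my own sketch; tcn_receptive_field is a hypothetical helper, not part of the repo): each residual block stacks two dilated convs with the same dilation, so each block contributes twice, giving RF = 1 + 2*(k - 1)*(d^0 + d^1 + ... + d^(n-1)) for n blocks:

def tcn_receptive_field(kernel_size, num_blocks, dilation_base=2):
    # two dilated convs per residual block, each adding (kernel_size - 1) * dilation
    return 1 + 2 * sum((kernel_size - 1) * dilation_base ** i for i in range(num_blocks))

print(tcn_receptive_field(kernel_size=2, num_blocks=8))  # 1 + 2*255 = 511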