
Comments (7)

philipperemy commented on May 24, 2024

@yanghui-wng here is a detailed breakdown of the weights contained in the TCN model:

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
matching_conv1D (Conv1D)     multiple                  800
_________________________________________________________________
Act_Res_Block (Activation)   multiple                  0
_________________________________________________________________
conv1D_0 (Conv1D)            multiple                  2200
_________________________________________________________________
Act_Conv1D_0 (Activation)    multiple                  0
_________________________________________________________________
SDropout_0 (SpatialDropout1D multiple                  0
_________________________________________________________________
conv1D_1 (Conv1D)            multiple                  30100
_________________________________________________________________
Act_Conv1D_1 (Activation)    multiple                  0
_________________________________________________________________
SDropout_1 (SpatialDropout1D multiple                  0
_________________________________________________________________
Act_Conv_Blocks (Activation) multiple                  0
_________________________________________________________________
matching_identity (Lambda)   (None, 1, 100)            0
_________________________________________________________________
Act_Res_Block (Activation)   multiple                  0
_________________________________________________________________
conv1D_0 (Conv1D)            multiple                  30100
_________________________________________________________________
Act_Conv1D_0 (Activation)    multiple                  0
_________________________________________________________________
SDropout_0 (SpatialDropout1D multiple                  0
_________________________________________________________________
conv1D_1 (Conv1D)            multiple                  30100
_________________________________________________________________
Act_Conv1D_1 (Activation)    multiple                  0
_________________________________________________________________
SDropout_1 (SpatialDropout1D multiple                  0
_________________________________________________________________
Act_Conv_Blocks (Activation) multiple                  0
_________________________________________________________________
matching_identity (Lambda)   (None, 1, 100)            0
_________________________________________________________________
Act_Res_Block (Activation)   multiple                  0
_________________________________________________________________
conv1D_0 (Conv1D)            multiple                  30100
_________________________________________________________________
Act_Conv1D_0 (Activation)    multiple                  0
_________________________________________________________________
SDropout_0 (SpatialDropout1D multiple                  0
_________________________________________________________________
conv1D_1 (Conv1D)            multiple                  30100
_________________________________________________________________
Act_Conv1D_1 (Activation)    multiple                  0
_________________________________________________________________
SDropout_1 (SpatialDropout1D multiple                  0
_________________________________________________________________
Act_Conv_Blocks (Activation) multiple                  0
_________________________________________________________________
Slice_Output (Lambda)        multiple                  0
_________________________________________________________________
dense (Dense)                (None, 64)                6464
_________________________________________________________________
leaky_re_lu (LeakyReLU)      (None, 64)                0
_________________________________________________________________
dense_1 (Dense)              (None, 32)                2080
_________________________________________________________________
leaky_re_lu_1 (LeakyReLU)    (None, 32)                0
_________________________________________________________________
dense_2 (Dense)              (None, 16)                528
_________________________________________________________________
leaky_re_lu_2 (LeakyReLU)    (None, 16)                0
_________________________________________________________________
dense_3 (Dense)              (None, 1)                 17
_________________________________________________________________
leaky_re_lu_3 (LeakyReLU)    (None, 1)                 0
=================================================================
Total params: 162,589
Trainable params: 162,589
Non-trainable params: 0
_________________________________________________________________
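
The per-layer counts follow the standard Keras formulas: a Conv1D layer has kernel_size * in_channels * filters weights plus one bias per filter, and a Dense layer has in_units * units weights plus one bias per unit. Here is a minimal sketch (assuming bias terms are enabled, which is the Keras default and what the counts above imply) that reproduces the numbers in the table:

def conv1d_params(kernel_size, in_channels, filters):
    # kernel weights plus one bias per filter
    return kernel_size * in_channels * filters + filters

def dense_params(in_units, units):
    # weight matrix plus one bias per unit
    return in_units * units + units

input_dim, nb_filters, kernel_size = 7, 100, 3

print(conv1d_params(1, input_dim, nb_filters))             # matching_conv1D: 800
print(conv1d_params(kernel_size, input_dim, nb_filters))   # first conv1D_0: 2200
print(conv1d_params(kernel_size, nb_filters, nb_filters))  # every later conv: 30100
print(dense_params(nb_filters, 64))                        # dense:   6464
print(dense_params(64, 32))                                # dense_1: 2080
print(dense_params(32, 16))                                # dense_2: 528
print(dense_params(16, 1))                                 # dense_3: 17

# 800 + 2200 + 5 * 30100 + 6464 + 2080 + 528 + 17 = 162,589 (Total params)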

philipperemy commented on May 24, 2024

And here are the TCN blocks (the breakdown by block):

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
residual_block_0 (ResidualBl multiple                  33100
_________________________________________________________________
residual_block_1 (ResidualBl multiple                  60200
_________________________________________________________________
residual_block_2 (ResidualBl multiple                  60200
_________________________________________________________________
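
Each block total is just the sum of its own convolutions: the first residual block needs the 1x1 matching convolution because the input has 7 channels while the block outputs 100, whereas the later blocks already have 100 channels in and out, so their skip connection is an identity (the matching_identity Lambda above) and only the two dilated convolutions carry weights. A quick sanity check:

block_0 = 800 + 2200 + 30100   # matching 1x1 conv + two dilated convs
block_1 = 30100 + 30100        # identity skip, two dilated convs
block_2 = 30100 + 30100

print(block_0, block_1, block_2)    # 33100 60200 60200
print(block_0 + block_1 + block_2)  # 153500 parameters inside the TCN
# The remaining 9,089 parameters (162,589 - 153,500) sit in the Dense head.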

philipperemy commented on May 24, 2024

Here is the graph of your model; I generated it with TensorBoard. You can generate it yourself and explore each node of the model.

[Image: TensorBoard graph of the model (train)]

philipperemy commented on May 24, 2024

To reproduce it, you can run this script:

import numpy as np
from tensorflow.keras import Input
from tensorflow.keras import Sequential
from tensorflow.keras.callbacks import TensorBoard
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import LeakyReLU

from tcn import TCN

input_dim = 7
timesteps = 1

print('Loading data...')
x_train = np.zeros(shape=(100, timesteps, input_dim))
y_train = np.zeros(shape=(100, 1))

batch_size = None
model = Sequential()
input_layer = Input(batch_shape=(batch_size, timesteps, input_dim))
model.add(input_layer)
model.add(TCN(nb_filters=100,
              # Integer. The number of filters to use in the convolutional layers. Would be similar to units for LSTM. Can be a list.
              kernel_size=3,  # Integer. The size of the kernel to use in each convolutional layer.
              nb_stacks=1,  # The number of stacks of residual blocks to use.
              dilations=(1, 2, 4),  # List/Tuple. A dilation list. Example is: [1, 2, 4, 8, 16, 32, 64].
              padding='causal',
              use_skip_connections=False,
              dropout_rate=0.1,
              return_sequences=False,
              activation='relu',
              kernel_initializer='he_normal',
              use_batch_norm=False,
              use_layer_norm=False,
              ))
model.add(Dense(64))
model.add(LeakyReLU(alpha=0.3))
model.add(Dense(32))
model.add(LeakyReLU(alpha=0.3))
model.add(Dense(16))
model.add(LeakyReLU(alpha=0.3))
model.add(Dense(1))
model.add(LeakyReLU(alpha=0.3))
model.compile(loss='mse', optimizer='adam')

# tensorboard --logdir logs_tcn
# Browse to http://localhost:6006/#graphs&run=train.
# and double click on TCN to expand the inner layers.
# It takes time to write the graph to tensorboard. Wait until the first epoch is completed.
tensorboard = TensorBoard(
    log_dir='logs_tcn',
    histogram_freq=1,
    write_images=True
)

print('Train...')
model.fit(
    x_train, y_train,
    batch_size=batch_size,
    callbacks=[tensorboard],
    epochs=10
)

Run it and a folder called logs_tcn should be generated. In the same directory run:

tensorboard --logdir logs_tcn

And go to http://localhost:6006/.

Select GRAPH and you will see it:

[Image: TensorBoard GRAPH view of the model]
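
If you also want to print the layer-by-layer breakdown from the earlier comments (rather than the collapsed one-line-per-block view), keras-tcn ships a tcn_full_summary helper; the exact signature may vary between versions, so treat this as a sketch and check the README of the version you have installed:

from tcn import tcn_full_summary

# Expanded view: every conv/activation/dropout inside each residual block.
tcn_full_summary(model, expand_residual_blocks=True)

# Collapsed view: one line per residual block, as in the block breakdown above.
tcn_full_summary(model, expand_residual_blocks=False)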

philipperemy commented on May 24, 2024

With all these tools, you should be able to find your answer.

yanghui-wng commented on May 24, 2024

Thank you for your help! With your explanation, I now understand how to calculate the parameters of the TCN.

philipperemy commented on May 24, 2024

Good to hear!
