Comments (3)
It's almost definitely networkx. I recreated some of them using the code below, but I can't make claims for its generality or accuracy since I just ripped it out of an old notebook and pasted it here. Hopefully it helps you out!
```python
import networkx as nx
from matplotlib import pyplot as plt

block_factor = 1  # assuming only one convolution in a block for simplicity
dilation_factor = 4
num_blocks = 3
num_stacks = 1
kernel_size = 4

sum_dilations = sum(dilation_factor**i for i in range(num_blocks))
receptive_field = 1 + block_factor * (kernel_size - 1) * num_stacks * sum_dilations

fill_inactive = False
graph_width = 3 * receptive_field if fill_inactive else receptive_field

# initialize the graphs with white nodes
active_graph = nx.create_empty_copy(nx.grid_2d_graph(num_blocks + 1, graph_width))
inactive_graph = nx.create_empty_copy(active_graph)
nx.set_node_attributes(active_graph, values="white", name="color")
nx.set_node_attributes(inactive_graph, values="white", name="color")

# color input nodes
input_nodes = [node for node in active_graph.nodes if node[0] == 0]
nx.set_node_attributes(active_graph, {node: "gold" for node in input_nodes}, name="color")

# color output nodes
output_nodes = [node for node in active_graph.nodes if node[0] == num_blocks]
nx.set_node_attributes(active_graph, {node: "red" for node in output_nodes}, name="color")

# highlight active nodes and draw the dilated connections feeding them
for block_num in range(num_blocks, 0, -1):
    dilation = dilation_factor ** block_num
    next_dilation = dilation_factor ** (block_num - 1)
    for i in range(graph_width - 1, (next_dilation * (kernel_size - 1)) - 1, -dilation):
        nx.set_node_attributes(active_graph, {(block_num, i): {"color": "deepskyblue"}})
        for j in range(kernel_size):
            active_graph.add_edge((block_num - 1, i - (j * next_dilation)), (block_num, i), style="solid")

# fill in inactive nodes
if fill_inactive:
    for block_num in range(num_blocks):
        dilation = dilation_factor ** block_num
        for i in range(graph_width):
            inactive_graph.add_edge((block_num, i), (block_num + 1, i), style="dashed")
            if i > dilation:
                inactive_graph.add_edge((block_num, i - dilation), (block_num + 1, i), style="dashed")

plt.figure(figsize=(receptive_field, 0.3 * receptive_field))
pos = {(x, y): (y, x) for x, y in active_graph.nodes()}
nx.draw(
    active_graph,
    pos=pos,
    arrows=True,
    arrowstyle="-|>",
    arrowsize=10,
    width=1.5,
    style="solid",
    node_color=list(nx.get_node_attributes(active_graph, "color").values()),
    edgecolors="black",
)
nx.draw_networkx_edges(
    inactive_graph,
    pos=pos,
    arrows=True,
    arrowstyle="-|>",
    arrowsize=8,
    width=0.8,
    style="dashed",
)
if fill_inactive:
    plt.xlim(left=receptive_field - 0.5, right=2 * receptive_field - 0.5)
plt.show()
```
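For the parameters above (kernel_size=4, dilation_factor=4, num_blocks=3, one stack) the receptive-field formula works out to 64 timesteps. A quick sanity check of that formula, a sketch of my own and not part of the original snippet, brute-forces the set of input offsets the final output can reach through the dilated connections:

```python
# Hedged sanity check: verify the closed-form receptive field against a
# brute-force walk over the dilated connections (block_factor = num_stacks = 1).
dilation_factor = 4
num_blocks = 3
kernel_size = 4

sum_dilations = sum(dilation_factor**i for i in range(num_blocks))
receptive_field = 1 + (kernel_size - 1) * sum_dilations  # closed form

# Propagate, layer by layer, the set of input offsets (timesteps back from
# the output) that the last output node can see.
reachable = {0}
for block_num in range(num_blocks):
    dilation = dilation_factor ** block_num
    reachable = {offset + j * dilation for offset in reachable for j in range(kernel_size)}

assert max(reachable) + 1 == receptive_field
print(receptive_field)  # 64 for these parameters
```

If the assertion holds for a few parameter combinations, the graph drawn above should span exactly the model's receptive field.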
from keras-tcn.
Indeed, my fault for the lack of acknowledgement! Thanks @krzim
Big +1 to @krzim for answering on that one!