Comments (6)
If I make a matrix whose upper-right part is zero and multiply it element-wise with the attention matrix, for example:
mask = tf.matrix_band_part(tf.ones((q.shape[1], k.shape[1])), -1, 0)
...
attn = Multiply()([attn, mask])
Would it have an equivalent effect?
from attention-is-all-you-need-keras.
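For reference, multiplying the scores by a lower-triangular mask is not equivalent to the additive -1e10 mask used in the repo: a multiplicative zero turns a score into 0, not -inf, so softmax still assigns the masked position positive weight. A minimal NumPy sketch (the score values are illustrative):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

scores = np.array([[2.0, 1.0, 3.0]])  # one query attending to 3 keys
mask = np.array([[1.0, 1.0, 0.0]])    # last key is a "future" position

# Multiplicative masking before softmax: the masked score becomes 0,
# not -inf, so it still gets positive probability -> future leakage.
mult = softmax(scores * mask)

# Additive masking (what the repo does): the masked score goes to
# ~-inf, so its softmax weight is effectively zero.
add = softmax(scores + (-1e10) * (1.0 - mask))

print(mult[0, 2])  # noticeably greater than 0
print(add[0, 2])   # effectively 0
```

Multiplying the attention matrix *after* softmax would also break the normalization, since the surviving weights no longer sum to 1.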
Hi, thank you for your response, but I still want to make sure that I understand correctly. Here is the attention block:
attn = Lambda(lambda x: K.batch_dot(x[0], x[1], axes=[2, 2]) / self.temper)([q, k])
if mask is not None:
    mmask = Lambda(lambda x: (-1e+10) * (1 - x))(mask)
    attn = Add()([attn, mmask])
attn = Activation('softmax')(attn)
attn = self.dropout(attn)
output = Lambda(lambda x: K.batch_dot(x[0], x[1]))([attn, v])
Does the Activation layer ensure that the mask is "column-based"? As I understand it, softmax is applied over the last dimension, i.e. across the columns of the attention matrix?
Sorry, my previous answer was wrong; I have found the right answer.
An experiment using `1 - mask + eye` shows the symptom: training and dev accuracy quickly go to near 100%, but the model cannot process any user input. This means the model is using future information.
The problem is that axis 1 is not the column axis, because there is a leading "Batch" axis.
>>> K.eval(GetSubMask(q)) # mask = K.cumsum(tf.eye(len_s, batch_shape=bs), 1)
array([[[1., 0., 0., 0., 0., 0., 0.],
[1., 1., 0., 0., 0., 0., 0.],
[1., 1., 1., 0., 0., 0., 0.],
[1., 1., 1., 1., 0., 0., 0.],
[1., 1., 1., 1., 1., 0., 0.],
[1., 1., 1., 1., 1., 1., 0.],
[1., 1., 1., 1., 1., 1., 1.]]], dtype=float32)
>>> np.cumsum(np.eye(5), 1) # Your question
array([[1., 1., 1., 1., 1.],
[0., 1., 1., 1., 1.],
[0., 0., 1., 1., 1.],
[0., 0., 0., 1., 1.],
[0., 0., 0., 0., 1.]])
>>> np.cumsum(np.eye(5), 0) # If no "Batch" axis, the cum axis is 0
array([[1., 0., 0., 0., 0.],
[1., 1., 0., 0., 0.],
[1., 1., 1., 0., 0.],
[1., 1., 1., 1., 0.],
[1., 1., 1., 1., 1.]])
We indeed need a lower-left triangular mask, as expected.
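The repo's `GetSubMask` builds this lower triangle by cumulatively summing the identity along axis 1, which is the row axis once the batch axis sits in front. A NumPy sketch of the same construction:

```python
import numpy as np

len_s, batch = 5, 2
# Identity matrix broadcast to a (batch, len_s, len_s) tensor, mirroring
# tf.eye(len_s, batch_shape=bs) in the repo.
eye_b = np.broadcast_to(np.eye(len_s), (batch, len_s, len_s))

# With the batch axis in front, axis 1 is the row axis: the cumulative
# sum runs down each column, filling in the lower triangle.
mask = np.cumsum(eye_b, axis=1)
print(mask[0])  # lower-triangular matrix of ones
```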
Thanks, that is clear.
Related Issues (20)
- Skip-connection in Transformer HOT 1
- after run your demo i get a error result like this.why?
- Transformer encoder layer instead of Bidirectional LSTM HOT 1
- K.mean() in computing loss doesn't make any sense.
- Why wasn't K and V weren't passed from the top encoder to bottom decoder model?
- Using the transformer instead of a simple LSTM layer HOT 2
- dimension in GetSubMask
- the test demo
- ScaledDotProductAttention
- seq2seq confused with shape HOT 1
- the mask of attention
- why get same output with different input? HOT 4
- Using the approach for video encoding.
- Difference between decode_sequence_fast and decode_sequence_readout?
- Time series forecasting?
- layer norm end of the encoder?
- startup error HOT 8
- reshape may not match
- after embedding layer HOT 1
- Licence