
Comments (10)

philipperemy commented on July 24, 2024

Try to downgrade your tensorflow version.

from keras-attention.

SaharaAli16 commented on July 24, 2024

I changed the way I was defining the model, without downgrading TensorFlow, and it started working again. New model definition:

from tensorflow.keras.layers import Input, LSTM, Dropout, Dense
from tensorflow.keras.models import Model
from attention import Attention

timestep = timesteps  # sequence length, defined elsewhere
features = 11

model_input = Input(shape=(timestep, features))
x = LSTM(64, return_sequences=True)(model_input)
x = Dropout(0.2)(x)
x = LSTM(32, return_sequences=True)(x)
x = LSTM(16, return_sequences=True)(x)
x = Attention(32)(x)
x = Dense(32)(x)
x = Dense(16)(x)
x = Dense(1)(x)
model = Model(model_input, x)
print(model.summary())


philipperemy commented on July 24, 2024

Great!


SaharaAli16 commented on July 24, 2024

Quick follow-up question: can you tell me how to downgrade TensorFlow to 2.3? The current version in Colab is 2.5, and I am hitting the reported issue again, even with the new model definition.
I know %tensorflow_version 2.x cannot downgrade TF to 2.3.


philipperemy commented on July 24, 2024

I think this should work:

!pip install tensorflow==2.3



SaharaAli16 commented on July 24, 2024

Alright, that worked. Next up: I cannot use multiple Attention layers in one ensembled model. I have model1 with one attention layer and model2 with another, and when I concatenate the two models I get this error:
ValueError: The name "last_hidden_state" is used 2 times in the model. All layer names should be unique.
I believe this is because the attention layer itself contains multiple inner/nested layers, and a model cannot have two layers with the same name. I tried renaming the attention layer, but since it is just a wrapper, the renaming didn't help and the error persists.
Any workaround for this?


philipperemy commented on July 24, 2024

@SaharaAli16 yes, you have to remove the hard-coded names inside the layer: https://github.com/philipperemy/keras-attention-mechanism/blob/0f8b440e8e74fb25309b2d391f7280bf4f13129a/attention/attention.py#L24. Otherwise Keras will complain that the names already exist when you instantiate a second Attention instance.
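To see why a second instance collides, here is a minimal, framework-free sketch of the kind of per-model name bookkeeping Keras does (NameRegistry is an illustrative stand-in, not the real Keras internals): a hard-coded name fails on its second use, while layers created without name= get auto-uniquified suffixes.

```python
import itertools
from collections import defaultdict

class NameRegistry:
    """Simplified stand-in for Keras's per-model unique-layer-name check."""
    def __init__(self):
        self._counts = defaultdict(itertools.count)

    def register(self, explicit_name=None, base="layer"):
        if explicit_name is not None:
            # A hard-coded name (e.g. "last_hidden_state") may only
            # appear once per model; a second use raises.
            uses = next(self._counts[explicit_name]) + 1
            if uses > 1:
                raise ValueError(
                    f'The name "{explicit_name}" is used {uses} times '
                    "in the model. All layer names should be unique."
                )
            return explicit_name
        # No explicit name: auto-uniquify as base, base_1, base_2, ...
        n = next(self._counts[base])
        return base if n == 0 else f"{base}_{n}"

reg = NameRegistry()
reg.register(explicit_name="last_hidden_state")      # first Attention: fine
try:
    reg.register(explicit_name="last_hidden_state")  # second Attention: fails
except ValueError as e:
    print(e)
# Dropping name= lets auto-naming disambiguate the two instances:
print(reg.register(base="score"), reg.register(base="score"))
```

So the workaround is exactly what the maintainer suggests: strip the name= arguments from the layers built inside Attention so Keras auto-generates unique names per instance.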


shlomi-schwartz commented on July 24, 2024

The suggested setup:

timestep = timesteps
features = 11

model_input = Input(shape=(timestep,features))
x = LSTM(64, return_sequences=True)(model_input)
x = Dropout(0.2)(x)
x = LSTM(32, return_sequences=True)(x)
x = LSTM(16, return_sequences=True)(x)
x = Attention(32)(x)
x = Dense(32)(x)
x = Dense(16)(x)
x = Dense(1)(x)
model = Model(model_input, x)
print(model.summary())

no longer works on TF 2.7.0.

Error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-6-88e2c30c5093> in <module>()
      7 x = LSTM(32, return_sequences=True)(x)
      8 x = LSTM(16, return_sequences=True)(x)
----> 9 x = Attention(32)(x)
     10 x = Dense(32)(x)
     11 x = Dense(16)(x)

1 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py in __init__(self, trainable, name, dtype, dynamic, **kwargs)
    339              trainable.dtype is tf.bool)):
    340       raise TypeError(
--> 341           'Expected `trainable` argument to be a boolean, '
    342           f'but got: {trainable}')
    343     self._trainable = trainable

TypeError: Expected `trainable` argument to be a boolean, but got: 32


SaharaAli16 commented on July 24, 2024

I would suggest copying the layer's source code into your own code and using it directly. That should work.


philipperemy commented on July 24, 2024

Yes, this issue was fixed in the latest release (4.1) of the attention mechanism.

pip install attention --upgrade

will solve it.
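For context on the TypeError above, here is a minimal sketch of the assumed cause and the kind of fix 4.1 ships (BaseLayer, OldAttention, and FixedAttention are illustrative stand-ins, not the actual library code): in older versions the positional 32 fell through into the base Layer's first parameter, trainable, which must be a boolean; the fix is for Attention to consume its units argument itself and forward only keywords.

```python
class BaseLayer:
    """Stand-in for keras.layers.Layer, whose first parameter is `trainable`."""
    def __init__(self, trainable=True, name=None, **kwargs):
        if not isinstance(trainable, bool):
            raise TypeError(
                "Expected `trainable` argument to be a boolean, "
                f"but got: {trainable}"
            )
        self.trainable = trainable
        self.name = name

class OldAttention(BaseLayer):
    # Assumed pre-4.1 shape: positional args fall through to the base
    # class, so OldAttention(32) binds 32 to `trainable` and raises.
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

class FixedAttention(BaseLayer):
    # Assumed 4.1-style fix: the layer consumes `units` itself and only
    # forwards keyword arguments to the base class.
    def __init__(self, units=32, **kwargs):
        super().__init__(**kwargs)
        self.units = units

try:
    OldAttention(32)          # reproduces the reported TypeError
except TypeError as e:
    print(e)
layer = FixedAttention(32)    # now fine: 32 becomes units, not trainable
```

Upgrading the package as above avoids the problem without changing your model code.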

