The Attention class implementation seems to have a problem. When I train with the TextBiRNN model, I get the following error:
Traceback (most recent call last):
File "train.py", line 86, in <module>
last_activation='softmax').get_model()
File "/media/gaoya/disk/Applications/keras/短文本分类/model.py", line 282, in get_model
x_word = Attention(self.maxlen_word)(x_word)
File "/media/gaoya/disk/Applications/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py", line 75, in symbolic_fn_wrapper
return func(*args, **kwargs)
File "/media/gaoya/disk/Applications/anaconda3/lib/python3.7/site-packages/keras/engine/base_layer.py", line 463, in __call__
self.build(unpack_singleton(input_shapes))
File "/media/gaoya/disk/Applications/keras/短文本分类/model.py", line 231, in build
constraint=self.b_constraint)
TypeError: add_weight() got multiple values for argument 'name'
How can I fix this?
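For context, this kind of TypeError typically happens when a custom layer passes the weight shape to add_weight() positionally while also passing name= as a keyword: in Keras 2.x, Layer.add_weight takes name as its first parameter, so the shape tuple binds to name and collides with the explicit keyword. Below is a minimal pure-Python sketch of that collision and the usual fix (passing shape= as a keyword); the function here is a stand-in with the same parameter order as Keras's add_weight, not the actual Keras code, and the names are illustrative:

```python
# Stand-in mimicking the Keras 2.x Layer.add_weight parameter order:
#   add_weight(name=None, shape=None, ...)
# This is an illustrative mock, not the real Keras implementation.
def add_weight(name=None, shape=None, initializer=None,
               regularizer=None, constraint=None):
    return (name, shape)

# Broken call, mirroring code like:
#   self.add_weight((input_shape[-1],), initializer=..., name=...)
# The shape tuple binds to `name` positionally, then the name= keyword
# collides with it.
try:
    add_weight((300,), initializer='glorot_uniform', name='att_W')
except TypeError as e:
    print(e)  # add_weight() got multiple values for argument 'name'

# Fixed call: pass the shape explicitly as a keyword argument.
w = add_weight(shape=(300,), initializer='glorot_uniform', name='att_W')
print(w)
```

If this is the cause, changing every add_weight call in Attention.build() from a positional shape to shape=(...) should resolve the error.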