seq2seq's Introduction

seq2seq

Keras example of seq2seq for automatic title generation (auto title).

https://kexue.fm/archives/5861

Community

QQ group: 67729435. For the WeChat group, add the bot's WeChat ID spaces_ac_cn.

seq2seq's People

Contributors

bojone, zh794390558

seq2seq's Issues

Runtime error

File "seq2seq_train.py", line 370, in
callbacks=[evaluator])
File "/data/zhu/.conda/envs/py27/lib/python2.7/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
return func(*args, **kwargs)
File "/data/zhu/.conda/envs/py27/lib/python2.7/site-packages/keras/engine/training.py", line 1418, in fit_generator
initial_epoch=initial_epoch)
File "/data/zhu/.conda/envs/py27/lib/python2.7/site-packages/keras/engine/training_generator.py", line 181, in fit_generator
generator_output = next(output_generator)
File "/data/zhu/.conda/envs/py27/lib/python2.7/site-packages/keras/utils/data_utils.py", line 709, in get
six.reraise(*sys.exc_info())
File "/data/zhu/.conda/envs/py27/lib/python2.7/site-packages/keras/utils/data_utils.py", line 685, in get
inputs = self.queue.get(block=True).get()
File "/data/zhu/.conda/envs/py27/lib/python2.7/multiprocessing/pool.py", line 567, in get
raise self._value
TypeError: 'encoding' is an invalid keyword argument for this function

Su, could you take a look at this error? (Python 2.7, Keras 2.2.4, TensorFlow 1.8)
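
For reference, this error usually means the data-loading code passes encoding= to the built-in open(), which Python 2.7 does not accept (the argument only exists on Python 3). A minimal sketch of the usual workaround, using io.open, which takes the same argument on both Python versions (the helper name read_lines is only an illustration, not the repo's code):

import io

def read_lines(path):
    # io.open accepts encoding= on Python 2.7 as well as Python 3
    with io.open(path, encoding='utf-8') as f:
        for line in f:
            yield line.strip()

Running the script under Python 3 avoids the problem as well.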

Handling large datasets

Hello, a beginner question: is there a way in Keras to load training data in batches (so that memory does not overflow)? For example, with 1,000,000 samples, what is the best way to handle this in Keras?
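
One common approach, shown as a minimal sketch below (the file name corpus.txt, the batch size, and the encode_batch helper are illustrative assumptions, not the repo's own code), is to stream the corpus from disk with a Python generator and train with fit_generator, so the 1,000,000 samples never have to sit in memory at once.

import io
import numpy as np

def encode_batch(lines, maxlen=128):
    # Hypothetical encoding: map characters to ids and right-pad with 0.
    # A real script would use its own vocabulary / tokenizer here.
    ids = [[ord(c) % 10000 for c in line[:maxlen]] for line in lines]
    x = np.zeros((len(ids), maxlen), dtype='int32')
    for i, row in enumerate(ids):
        x[i, :len(row)] = row
    return [x, x], None   # (inputs, targets); targets are None when the loss comes from add_loss

def data_generator(path, batch_size=64):
    # Stream the file line by line so only one batch is in memory at a time.
    while True:            # Keras generators are expected to loop forever
        batch = []
        with io.open(path, encoding='utf-8') as f:
            for line in f:
                batch.append(line.strip())
                if len(batch) == batch_size:
                    yield encode_batch(batch)
                    batch = []

# model.fit_generator(data_generator('corpus.txt', 64),
#                     steps_per_epoch=1000000 // 64,
#                     epochs=10)

keras.utils.Sequence works too and is safer with multiprocessing, but a plain generator like the one above is the simplest way to keep memory bounded.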

ValueError: Dimension must be 5 but is 4 for '{{node attention_11/transpose_4}} = Transpose[T=DT_FLOAT, Tperm=DT_INT32](attention_11/truediv, attention_11/transpose_4/perm)' with input shapes: [?,8,?,8,?], [4].

tf=2.6.0
keras=2.6.0

The error occurs when executing xy = Attention(8, 16)([y, x, x, x_mask]); stepping into the layer shows that it fails at a = K.permute_dimensions(a, (0, 3, 2, 1)).

ValueError: in user code:

<ipython-input-106-d9f1e622f23a>:50 call  *
    a = K.permute_dimensions(a, (0, 3, 2, 1))
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/util/dispatch.py:206 wrapper  **
    return target(*args, **kwargs)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/keras/backend.py:3133 permute_dimensions
    return tf.compat.v1.transpose(x, perm=pattern)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/util/dispatch.py:206 wrapper
    return target(*args, **kwargs)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py:2309 transpose
    return transpose_fn(a, perm, name=name)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/ops/gen_array_ops.py:11659 transpose
    "Transpose", x=x, perm=perm, name=name)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py:750 _apply_op_helper
    attrs=attr_protos, op_def=op_def)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/framework/ops.py:3569 _create_op_internal
    op_def=op_def)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/framework/ops.py:2042 __init__
    control_input_ops, op_def)
/root/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/framework/ops.py:1883 _create_c_op
    raise ValueError(str(e))

ValueError: Dimension must be 5 but is 4 for '{{node attention_11/transpose_4}} = Transpose[T=DT_FLOAT, Tperm=DT_INT32](attention_11/truediv, attention_11/transpose_4/perm)' with input shapes: [?,8,?,8,?], [4].
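
For what it's worth, this looks like the behavior change in K.batch_dot between Keras versions rather than a bug in the script: on Keras 2.2.x the contraction used for the attention scores came out 4-D, while tf.keras / Keras 2.3+ keeps the extra head axis of the second argument and returns a 5-D tensor, which then breaks the 4-axis permute_dimensions. A minimal sketch (assuming the layer arranges queries and keys as (batch, heads, seq_len, size_per_head), as the attention implementation accompanying the article does), with an einsum that pins the result to 4-D on any version:

import tensorflow as tf
from tensorflow.keras import backend as K

# Shapes as they would be inside Attention(8, 16): 8 heads, 16 dims per head
qw = tf.random.normal((2, 8, 10, 16))   # (batch, heads, q_len, size_per_head)
kw = tf.random.normal((2, 8, 12, 16))   # (batch, heads, k_len, size_per_head)

a = K.batch_dot(qw, kw, axes=[3, 3])
print(a.shape)   # (2, 8, 10, 8, 12) on tf/keras 2.6 -- the 5-D tensor in the error

# Contracting explicitly keeps the scores 4-D regardless of Keras version:
a = tf.einsum('bhjd,bhkd->bhjk', qw, kw) / 16 ** 0.5
print(a.shape)   # (2, 8, 10, 12)

Alternatively, running the original code under the Keras version it was written for (2.2.x) avoids the change altogether.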

Two questions after comparing the article with the code

1. The feature corresponding to the variable mv seems to be a feature shared by every decoder step, but the article does not seem to mention it.
2. The word-frequency-based prior probabilities do not seem to sum to 1, and when the frequency prior is later averaged with the model's probabilities there is no normalization either (illustrated with a small example below).

Those are my two questions; I hope they can be answered. Thanks 🙏
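
To illustrate the second point with made-up numbers (a sketch, not the repo's actual frequencies): if the frequency prior is normalized to sum to 1 before averaging, the 50/50 mixture with the model's softmax is automatically a proper distribution; without that normalization the mixed scores no longer sum to 1.

import numpy as np

counts = np.array([5.0, 3.0, 2.0])        # raw word frequencies (made up)
prior = counts / counts.sum()             # normalized prior: [0.5, 0.3, 0.2]
model_probs = np.array([0.1, 0.6, 0.3])   # softmax output, already sums to 1

mixed = 0.5 * prior + 0.5 * model_probs   # average of two proper distributions
print(mixed, mixed.sum())                 # [0.3  0.45 0.25] 1.0

unnormalized = 0.5 * counts + 0.5 * model_probs
print(unnormalized.sum())                 # 5.5 -- no longer a probability distribution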

model.add_loss(cross_entropy) raises "You must feed a value for placeholder tensor"

Building the seq2seq model:

x_in = Input(shape=(None,), dtype='int32', name="seq2seq_in_layer_1")
y_in = Input(shape=(None,), dtype='int32', name="seq2seq_in_layer_2")
x, y = x_in, y_in

...

model = Model([x_in, y_in], xy)
model.summary()

model.add_loss(cross_entropy)
model.compile(Adam(1e-3))

The error:

InvalidArgumentError Traceback (most recent call last)
in <module>
51 model = Model([x_in, y_in], xy)
52 # model.summary()
---> 53 model.add_loss(cross_entropy)
54 model.compile(Adam(1e-3))
55

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_v1.py in add_loss(self, losses, inputs)
1052 for symbolic_loss in symbolic_losses:
1053 if getattr(self, '_is_graph_network', False):
-> 1054 self._graph_network_add_loss(symbolic_loss)
1055 else:
1056 # Possible a loss was added in a Layer's build.

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/functional.py in _graph_network_add_loss(self, symbolic_loss)
841
842 def _graph_network_add_loss(self, symbolic_loss):
--> 843 new_nodes, new_layers = _map_subgraph_network(self.inputs, [symbolic_loss])
844 # Losses must be keyed on inputs no matter what in order to be supported in
845 # DistributionStrategy.

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/functional.py in _map_subgraph_network(inputs, outputs)
1071 """
1072 if not tf.compat.v1.executing_eagerly_outside_functions():
-> 1073 base_layer_utils.create_keras_history(outputs)
1074 # Keep only nodes and layers in the topology between inputs and outputs.
1075 _, nodes_by_depth, layers, _ = _map_graph_network(inputs, outputs)

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in create_keras_history(tensors)
173 the raw Tensorflow operations.
174 """
--> 175 _, created_layers = _create_keras_history_helper(tensors, set(), [])
176 return created_layers
177

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(
--> 254 layer_inputs, processed_ops, created_layers)
255 name = op.name
256 node_def = op.node_def.SerializeToString()

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/engine/base_layer_utils.py in _create_keras_history_helper(tensors, processed_ops, created_layers)
249 else:
250 with tf.init_scope():
--> 251 constants[i] = backend.function([], op_input)([])
252 layer_inputs = unnest_if_single_tensor(layer_inputs)
253 processed_ops, created_layers = _create_keras_history_helper(

~/anaconda3/envs/py364/lib/python3.6/site-packages/keras/backend.py in __call__(self, inputs)
4030
4031 fetched = self._callable_fn(*array_vals,
-> 4032 run_metadata=self.run_metadata)
4033 self._call_fetch_callbacks(fetched[-len(self._fetches):])
4034 output_structure = tf.nest.pack_sequence_as(

~/anaconda3/envs/py364/lib/python3.6/site-packages/tensorflow/python/client/session.py in __call__(self, *args, **kwargs)
1478 ret = tf_session.TF_SessionRunCallable(self._session._session,
1479 self._handle, args,
-> 1480 run_metadata_ptr)
1481 if run_metadata:
1482 proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

InvalidArgumentError: You must feed a value for placeholder tensor 'seq2seq_in_layer_2_9' with dtype int32 and shape [?,?]
[[{{node seq2seq_in_layer_2_9}}]]
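
One workaround, sketched below under the assumption that cross_entropy is built directly from the y_in placeholder and the model output xy (the names follow the snippet above; this is not the repo's own code), is to compute the loss inside a custom layer and call self.add_loss there. The loss tensor then belongs to a traced layer between the model's inputs and outputs, so Keras does not have to reconstruct history for a bare placeholder.

from keras.layers import Layer
from keras import backend as K

class CrossEntropy(Layer):
    """Computes the seq2seq loss in-graph and registers it via add_loss."""
    def call(self, inputs):
        y_true, y_pred = inputs            # y_true: (batch, len) int ids; y_pred: (batch, len, vocab) probs
        y_true = y_true[:, 1:]             # token at position i is predicted from position i-1
        y_pred = y_pred[:, :-1]
        mask = K.cast(K.greater(y_true, 0), K.floatx())   # assumes id 0 is padding
        loss = K.sparse_categorical_crossentropy(y_true, y_pred)
        loss = K.sum(loss * mask) / K.sum(mask)
        self.add_loss(loss)
        return y_pred

# Hypothetical wiring with the model from the snippet above:
# output = CrossEntropy()([y_in, xy])
# model = Model([x_in, y_in], output)
# model.compile(Adam(1e-3))   # no explicit loss argument: it was added inside the layer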

Data question

Hi, I'd like to ask: does the single script seq2seq.py take care of everything? Do I need to write the test/inference script myself? I haven't read the code yet.
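
Whatever the script itself includes, inference for a seq2seq model of this kind boils down to a small decoding loop around the trained model. A minimal sketch of what that could look like (every name here, generate_title, content_ids, start_id, end_id, is a hypothetical placeholder, not the repo's API; the loop decodes greedily, while a beam search over the top-k tokens usually gives better titles):

import numpy as np

def generate_title(model, content_ids, start_id, end_id, maxlen=32):
    # content_ids: 1-D array of token ids for the article body
    x = np.array([content_ids])
    y = np.array([[start_id]])
    for _ in range(maxlen):
        probs = model.predict([x, y])[0, -1]   # distribution over the next token
        next_id = int(probs.argmax())          # greedy choice
        if next_id == end_id:
            break
        y = np.concatenate([y, [[next_id]]], axis=1)
    return y[0, 1:]                            # title token ids, start token dropped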
