luka-group / Lattice
[NAACL 2022] Robust (Controlled) Table-to-Text Generation with Structure-Aware Equivariance Learning.
Home Page: https://arxiv.org/abs/2205.03972
License: Apache License 2.0
Hi, I know that in your code the best checkpoint is chosen for evaluation on ToTTo. Since you don't open-source the HiTab code, I reproduced the code from the HiTab GitHub repo, but its default setting chooses the last checkpoint for evaluation. I would like to know which checkpoint you used on the HiTab dataset. Thanks!
Hi!
First of all, thank you for the great work!
I have a quick question about type_edges that you added in structural attention.
Is type_edges fixed as ((1, 1), (2, 2), (1, 2), (2, 1), (3, 1), (3, 2), (1, 3), (2, 3))?
If it is, then why?
If it is not, then where can I find the type_edge values for each input?
Thank you!
Hope you have a great one!
Best,
Paul
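For context, here is a minimal sketch of how such a fixed edge-type table could be consumed in structure-aware attention. The type ids (1 = data cell, 2 = header, 3 = metadata) and the pairing scheme are my guesses for illustration, not necessarily Lattice's actual convention:

```python
# Hypothetical illustration: map (type_i, type_j) pairs to a relation id,
# which a model could use to pick a learned bias in structural attention.
# Assumed token types (a guess): 1 = data cell, 2 = header, 3 = metadata.
TYPE_EDGES = ((1, 1), (2, 2), (1, 2), (2, 1), (3, 1), (3, 2), (1, 3), (2, 3))
EDGE_ID = {pair: i for i, pair in enumerate(TYPE_EDGES)}

def edge_type_matrix(type_ids):
    """Return a matrix of relation ids for every token pair.

    Pairs not listed in TYPE_EDGES (e.g. (3, 3)) fall back to -1,
    which a model could treat as "no structural bias".
    """
    n = len(type_ids)
    return [[EDGE_ID.get((type_ids[i], type_ids[j]), -1) for j in range(n)]
            for i in range(n)]
```

Under this reading the tuple would be fixed by construction (it enumerates all meaningful type pairs once), which may be why no per-input values exist; whether that matches the actual code is the question above.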
Dear authors, I have only seen the code related to ToTTo. Where is the code related to HiTab? Could you provide it in the same form as the ToTTo code, or do I need to replace the pre-trained model structure in the HiTab GitHub repo myself? Looking forward to your reply, thank you.
Traceback (most recent call last):
  File "train.py", line 569, in <module>
    main()
  File "train.py", line 503, in main
    train_result = trainer.train(resume_from_checkpoint=checkpoint)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/transformers/trainer.py", line 1316, in train
    tr_loss_step = self.training_step(model, inputs)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/transformers/trainer.py", line 1849, in training_step
    loss = self.compute_loss(model, inputs)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/transformers/trainer.py", line 1881, in compute_loss
    outputs = model(**inputs)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/subtlbot/ToTTo/Lattice/model/modeling_t5.py", line 1557, in forward
    encoder_outputs = self.encoder(
  File "/home/subtlbot/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/subtlbot/ToTTo/Lattice/model/modeling_t5.py", line 919, in forward
    inputs_embeds = self.embed_tokens(input_ids)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/subtlbot/.local/lib/python3.8/site-packages/torch/nn/modules/sparse.py", line 158, in forward
    return F.embedding(
  File "/home/subtlbot/.local/lib/python3.8/site-packages/torch/nn/functional.py", line 2199, in embedding
    return torch.embedding(weight, input, padding_idx, scale_grad_by_freq, sparse)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument index in method wrapper__index_select)
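This error usually means the input batch stayed on CPU while the model was moved to cuda:0. A framework-agnostic sketch of the common fix, moving every tensor in the batch to the model's device before the forward call (the helper name is mine, not from the repo):

```python
def move_to_device(batch, device):
    """Recursively move anything with a .to(device) method; a common fix
    for "Expected all tensors to be on the same device" errors."""
    if hasattr(batch, "to"):
        return batch.to(device)
    if isinstance(batch, dict):
        return {k: move_to_device(v, device) for k, v in batch.items()}
    if isinstance(batch, (list, tuple)):
        return type(batch)(move_to_device(v, device) for v in batch)
    return batch  # ints, strings, etc. stay as-is

# With PyTorch this would be called as:
#   inputs = move_to_device(inputs, model.device)
# before outputs = model(**inputs)
```

If custom features (type_ids, row_ids, col_ids) are added outside the Trainer's usual `_prepare_inputs` path, they are a likely candidate for the tensor that was left on CPU.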
Hi, I tried to install this in a conda environment, and running with the current requirements.txt generates an error when importing transformers (due to a version mismatch among the installed packages). I resolved it by removing the pinned versions from all transformers-related packages and installing the currently latest versions:
transformers==4.25.1
datasets==2.7.1
sentencepiece==0.1.97
I suggest updating the requirements file so the project can be installed and run properly.
Cheers,
David
Excuse me, how does the BLEU evaluation metric in your article get upgraded to the full BLEURT version to reach the evaluation metric values reported in the paper?
Hi,
First of all, thank you for the great work!
I'm a newbie with transformers; while looking into tasks like the ones you tackled, I reached your paper and code.
I have a quick question about the custom seq2seq trainer (trainer_seq2seq.py) that you created in the model directory.
This might be a stupid question, but is prediction_step in trainer_seq2seq.py the function used during training to pass type_ids, row_ids, and col_ids into the model?
I mean, I followed your code line by line, and now I'm trying to pass an extra parameter like type_ids into my T5ForConditionalGeneration using the standard transformers.Trainer, but it gives me an error like the one below.
ValueError: Unable to create tensor, you should probably activate truncation and/or padding with 'padding=True' 'truncation=True' to have batched tensors with the same length. Perhaps your features (type_ids in this case) have excessive nesting (inputs type list where type int is expected).
It seems this error occurs when using the general Trainer library, but I want to confirm just in case.
Thanks, and hope you have a happy new year!
Paul
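For what it's worth, that ValueError typically appears when a custom feature like type_ids is a list whose length varies across examples, so the default collator cannot stack the batch into one rectangular tensor. A minimal sketch of the usual workaround, padding the extra id sequences to a common length before collation (function name, keys, and pad value are illustrative, not from the repo):

```python
def pad_id_features(features, keys=("type_ids", "row_ids", "col_ids"), pad_value=0):
    """Pad per-example integer id lists to the batch's max length so they
    can be stacked into rectangular tensors by a trainer/collator."""
    max_len = max(len(f[keys[0]]) for f in features)
    for f in features:
        for key in keys:
            f[key] = f[key] + [pad_value] * (max_len - len(f[key]))
    return features
```

A custom data collator would call something like this on each batch of features before converting them to tensors, mirroring how input_ids are padded.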
Hi, the input format in my code for ToTTo is the same as the one in your paper. I also shuffle the rows and columns of the table, but my baseline (T5-base) result only drops 0.2-1.0 in BLEU and PARENT, even if I exchange the order of the metadata and the table. In your paper, T5-base drops 4.5 BLEU. I don't know what's wrong with it. Thanks.
Here is one example:
before shuffle:
'page_title> List of Governors of South Carolina /page_title> section_title> Governors under the Constitution of 1868 /section_title> table> cell> 76 h> # /h> h> 75 /h> h> 74 /h> /cell> cell> Daniel Henry Chamberlain h> 76 /h> h> Governor /h> /cell> cell> December 1, 1874 h> 76 /h> h> Took Office /h> /cell> /table>'
after shuffle:
'section_title> Governors under the Constitution of 1868 /section_title> table> cell> Daniel Henry Chamberlain h> Governor /h> h> 76 /h> /cell> cell> December 1, 1874 h> Took Office /h> h> 76 /h> /cell> cell> 76 h> # /h> h> 74 /h> h> 75 /h> /cell> /table> page_title> List of Governors of South Carolina /page_title>'
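As a sanity check for the experiment described above, cell-level shuffling of such a linearized table can be sketched like this. This is a minimal illustration of the perturbation, not the paper's exact script, and the regex assumes the cell> ... /cell> format shown in the example:

```python
import random
import re

def shuffle_cells(linearized, seed=0):
    """Permute the cell> ... /cell> spans in a linearized table while
    leaving everything else (metadata, table markers) in place."""
    parts = re.split(r"(cell>.*?/cell>)", linearized)
    cells = [p for p in parts if p.startswith("cell>")]
    random.Random(seed).shuffle(cells)
    it = iter(cells)
    return "".join(next(it) if p.startswith("cell>") else p for p in parts)
```

One thing worth checking when a robustness drop fails to reproduce: whether the shuffle actually changes the order the model sees (a fixed seed or an in-place no-op shuffle can silently leave the input unchanged).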