fork-of-bilstm-attention-kfold-clr-extra-features(with myself finetune).ipynb
- Private Score: 0.69825
- Public Score: 0.69058
- I trust my local F1 score more than the LB score; the following tips only improved my local scores.
- Use the mean of three pretrained embeddings: glove, wiki-news, and paragram (first sketch after this list)
- Use `pos_weight=0.78` in the BCE loss to handle the class imbalance between sincere and insincere questions (second sketch below)
- Set the learning rate of the RNN layers to `0.25` times that of the fully connected layer (third sketch below)
- Fit the tokenizer on both the full training set and the test set (fourth sketch below)
- Use the model checkpoint with the highest validation score to make predictions (fifth sketch below)
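
A minimal sketch of the embedding averaging, assuming 300-dimensional vectors and the standard competition files (`glove.840B.300d.txt`, `wiki-news-300d-1M.vec`, `paragram_300_sl999.txt`); the `load_embedding_matrix` helper, the file paths, and `word_index` / `vocab_size` (from the fitted tokenizer, see the fourth sketch) are assumptions, not the kernel's actual code:

```python
import numpy as np

def load_embedding_matrix(path, word_index, vocab_size, dim=300):
    # Hypothetical helper: builds a (vocab_size, dim) matrix aligned to
    # the tokenizer's word_index; words missing from the file stay all-zero.
    matrix = np.zeros((vocab_size, dim), dtype=np.float32)
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            parts = line.rstrip().rsplit(" ", dim)
            word, vec = parts[0], parts[1:]
            if len(vec) == dim and word_index.get(word, vocab_size) < vocab_size:
                matrix[word_index[word]] = np.asarray(vec, dtype=np.float32)
    return matrix

paths = ["glove.840B.300d.txt", "wiki-news-300d-1M.vec", "paragram_300_sl999.txt"]
mats = [load_embedding_matrix(p, word_index, vocab_size) for p in paths]
embedding_matrix = np.mean(mats, axis=0)  # element-wise mean of the three
```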
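
In PyTorch the `pos_weight` argument lives on `BCEWithLogitsLoss` (which takes raw logits) rather than `BCELoss`; a value below 1 down-weights the positive (insincere) class. A minimal sketch with dummy data:

```python
import torch
import torch.nn as nn

# pos_weight=0.78 scales the loss contribution of positive (insincere)
# examples; BCEWithLogitsLoss expects raw logits, not sigmoid outputs.
criterion = nn.BCEWithLogitsLoss(pos_weight=torch.tensor([0.78]))

logits = torch.randn(8, 1)                     # dummy model outputs
targets = torch.randint(0, 2, (8, 1)).float()  # dummy 0/1 labels
loss = criterion(logits, targets)
```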
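
Per-layer learning rates can be expressed with optimizer parameter groups; the toy model below (module names `lstm` and `fc`, base rate `1e-3`) is an assumption for illustration, not the kernel's architecture:

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    # Toy stand-in for the kernel's BiLSTM-attention model.
    def __init__(self, embed_dim=300, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(embed_dim, hidden, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(2 * hidden, 1)

model = BiLSTMClassifier()
base_lr = 1e-3  # assumed base rate
optimizer = torch.optim.Adam([
    {"params": model.lstm.parameters(), "lr": 0.25 * base_lr},  # RNN layers
    {"params": model.fc.parameters(), "lr": base_lr},           # FC head
])
```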
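
Fitting the tokenizer on the concatenated train and test text, sketched with the Keras `Tokenizer`; the CSV paths and the `num_words` cap are assumptions:

```python
import pandas as pd
from keras.preprocessing.text import Tokenizer

train = pd.read_csv("../input/train.csv")  # assumed competition paths
test = pd.read_csv("../input/test.csv")

tokenizer = Tokenizer(num_words=95000)     # vocab cap is an assumption
tokenizer.fit_on_texts(
    list(train["question_text"]) + list(test["question_text"])
)
word_index = tokenizer.word_index                # aligns the embeddings above
vocab_size = min(len(word_index) + 1, 95000)
```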
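
Keeping the weights from the best validation epoch, sketched below; `train_one_epoch` and `evaluate` are hypothetical helpers, and the epoch count and 0.33 decision threshold are assumptions:

```python
import copy
from sklearn.metrics import f1_score

best_f1, best_state = -1.0, None
for epoch in range(5):                                    # epoch count assumed
    train_one_epoch(model, optimizer, train_loader)       # hypothetical helper
    val_probs, val_targets = evaluate(model, val_loader)  # hypothetical helper
    score = f1_score(val_targets, val_probs > 0.33)       # threshold assumed
    if score > best_f1:
        best_f1, best_state = score, copy.deepcopy(model.state_dict())

model.load_state_dict(best_state)  # predict with the best-epoch checkpoint
```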
fork-from-bilstm-attention-kfold-0115-81a8d9 (the public kernel score 0.700).ipynb
- Private Score: 0.70235
- Public Score: 0.69729
- I didn't change anything from the original 0.700 public kernel.