bert_classification

Introduction

This repository is an adaptation of the bert repository.

The purpose of this repository is to visualize BERT's self-attention weights after it has been fine-tuned on the IMDb dataset. However, it can be extended to any text classification dataset by creating an appropriate DataProcessor class. See run_classifier.py for details.
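
For reference, the sketch below shows what such a processor might look like, following the DataProcessor/InputExample pattern from run_classifier.py. The class name, TSV column order, and label strings are assumptions made for illustration; adapt them to your dataset.

```python
import os

from run_classifier import DataProcessor, InputExample


class MyBinaryProcessor(DataProcessor):
    """Example processor for a TSV dataset with one label<TAB>text row per example."""

    def get_train_examples(self, data_dir):
        return self._create_examples(
            self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")

    def get_dev_examples(self, data_dir):
        return self._create_examples(
            self._read_tsv(os.path.join(data_dir, "test.tsv")), "dev")

    def get_labels(self):
        return ["0", "1"]

    def _create_examples(self, lines, set_type):
        examples = []
        for i, line in enumerate(lines):
            guid = "%s-%d" % (set_type, i)
            # Assumed column order: label in column 0, review text in column 1.
            examples.append(
                InputExample(guid=guid, text_a=line[1], text_b=None, label=line[0]))
        return examples
```

To make the new processor usable, you would also register it under a task name in the processors dictionary in run_classifier.py's main function, as the existing processors are.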

Usage

  1. Create a TSV file for each of the IMDb training and test sets.

    Refer to the imdb_data repo for instructions.

  2. Fine-tune BERT on the IMDb training set.

    Refer to the official BERT repo for fine-tuning instructions.

    Alternatively, you can skip this step by downloading the fine-tuned model from here.

    The pre-trained model (BERT base uncased) used to perform the fine-tuning can also be downloaded from here.

  3. Visualize BERT's weights.

    Refer to the BERT_viz_attention_imdb notebook for more details.

How it works

The forward pass has been modified to return a list of {layer_i: layer_i_attention_weights} dictionaries. The shape of layer_i_attention_weights is (batch_size, num_multihead_attn, max_seq_length, max_seq_length), where num_multihead_attn is the number of attention heads.
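
As a toy illustration of that structure (random numbers stand in for real weights, and the key names and sizes below are assumptions), the returned list can be traversed like this:

```python
import numpy as np

# Toy sizes only; a real BERT-base run with the hyperparameters listed below
# uses 12 layers, 12 heads, and a maximum sequence length of 512.
batch_size, num_heads, max_seq_length, num_layers = 2, 12, 8, 12

attention = [
    {"layer_%d" % i: np.random.rand(batch_size, num_heads,
                                    max_seq_length, max_seq_length)}
    for i in range(num_layers)
]

for layer in attention:
    for name, weights in layer.items():
        print(name, weights.shape)  # e.g. layer_0 (2, 12, 8, 8)
```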

You can specify a function to process the above list by passing it as a parameter to the load_bert_model function in the explain.model module. The function's output is available in the result of the Estimator's predict call under the key 'attention'.
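
A hedged usage sketch is shown below. load_bert_model and the 'attention' result key come from the description above; the keyword argument name attention_fn, the keep_last_layer processor, and the input-file path are assumptions made for illustration, so check explain.model for the exact signature before use.

```python
from explain.model import load_bert_model
from run_classifier import file_based_input_fn_builder


def keep_last_layer(attention_list):
    """Hypothetical processor: return the last layer's attention weights unchanged."""
    return list(attention_list[-1].values())[0]


# Build the Estimator with an attention-processing function (keyword name assumed).
estimator = load_bert_model(attention_fn=keep_last_layer)

# Input function over pre-converted prediction examples (path assumed).
predict_input_fn = file_based_input_fn_builder(
    input_file="predict.tf_record",
    seq_length=512,
    is_training=False,
    drop_remainder=False)

for prediction in estimator.predict(input_fn=predict_input_fn):
    attention = prediction["attention"]  # output of keep_last_layer
```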

Currently, only two attention processor functions are defined: average_last_layer_by_head and average_first_layer_by_head. See explain.attention for implementation details.
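
For intuition, here is a numpy sketch of what average_last_layer_by_head might compute, assuming it receives the list of per-layer dictionaries described above; the actual implementation in explain.attention may differ (for example, by operating on TensorFlow tensors inside the model).

```python
import numpy as np


def average_last_layer_by_head(attention_list):
    """Average the final layer's attention weights across heads.

    Args:
        attention_list: list of {layer_name: weights} dicts, where weights has
            shape (batch_size, num_multihead_attn, max_seq_length, max_seq_length).

    Returns:
        Array of shape (batch_size, max_seq_length, max_seq_length).
    """
    last_layer_weights = list(attention_list[-1].values())[0]
    return last_layer_weights.mean(axis=1)  # axis 1 is the head axis
```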

Model Performance Metrics

The fine-tuned model achieved an accuracy of 0.9407 on the test set.

The fine-tuning process was done with the following hyperparameters:

  • maximum sequence length: 512
  • training batch size: 8
  • learning rate: 3e-5
  • number of epochs: 3
