
Knowledge-guided Network Pruning

Dependencies: Python, PyTorch, TorchEEG, MNE, SciPy, pandas

This is the implementation of our paper: Wenjie Rao and Sheng-hua Zhong, "Knowledge-guided Network Pruning for EEG-based Emotion Recognition", accepted as a regular paper at the IEEE International Conference on Bioinformatics and Biomedicine (BIBM).

With the development of deep learning for EEG-related tasks, the complexity of learning models has gradually increased. These complex models often suffer from long inference times, high energy consumption, and an increased risk of overfitting, so model compression has become an important consideration. Although some EEG models have adopted lightweight techniques such as separable convolution, no existing work has directly compressed EEG models to reduce their complexity. In this paper, we integrate neuroscience knowledge into the recovery stage of EEG model pruning and propose two novel loss functions: the knowledge-guided region-wise loss, which enforces classification evidence consistent with the importance of the prefrontal lobe, and the knowledge-guided sample-wise loss, which constrains the learning process by distinguishing the importance of different samples.
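The two losses can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's exact formulation: `evidence` stands in for guided Grad-CAM maps pooled per channel, and `prefrontal_mask` and `importance` are hypothetical inputs.

```python
import torch

def region_wise_loss(evidence, prefrontal_mask, eps=1e-8):
    """Penalize evidence falling outside the prefrontal region.

    evidence: (batch, channels) non-negative maps, e.g. guided Grad-CAM
              attributions pooled over the time axis.
    prefrontal_mask: (channels,) binary mask, 1 for prefrontal electrodes.
    Returns the mean fraction of evidence placed on non-prefrontal channels.
    """
    prefrontal = (evidence * prefrontal_mask).sum(dim=1)
    total = evidence.sum(dim=1) + eps
    return (1.0 - prefrontal / total).mean()

def sample_wise_loss(per_sample_ce, importance, eps=1e-8):
    """Cross-entropy re-weighted by per-sample importance scores."""
    w = importance / (importance.sum() + eps)
    return (w * per_sample_ce).sum()

# Toy usage: 4 samples, 32 EEG channels; channels 0-3 marked prefrontal.
evidence = torch.rand(4, 32)
mask = torch.zeros(32)
mask[:4] = 1.0
loss_r = region_wise_loss(evidence, mask)

per_sample_ce = torch.rand(4)                    # unreduced CE values
importance = torch.tensor([1.0, 2.0, 1.0, 0.5])  # assumed importance scores
loss_s = sample_wise_loss(per_sample_ce, importance)
```

In practice these two terms would be added to the standard classification loss with tunable weights during recovery fine-tuning.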

Flow Chart

Given an EEG sample, we first train and prune a convolutional neural network. We then use guided Grad-CAM to extract classification evidence and constrain model optimization by applying the knowledge-guided region-wise and sample-wise losses to this data-driven evidence, yielding better performance recovery.
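The prune-then-recover loop above can be sketched with PyTorch's built-in magnitude pruning. The tiny CNN, input shapes, and single training step below are illustrative placeholders, not the paper's actual architecture; the full objective would also add the two knowledge-guided terms to the cross-entropy.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Minimal stand-in for an EEG CNN (real models, e.g. from torcheeg, are larger).
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),  # binary emotion label (e.g. high/low valence)
)

# Prune 50% of the convolution weights by L1 magnitude.
conv = model[0]
prune.l1_unstructured(conv, name="weight", amount=0.5)

# One recovery fine-tuning step on dummy data:
# (batch, 1, channels, time) -> logits over 2 classes.
x = torch.randn(4, 1, 32, 128)
y = torch.randint(0, 2, (4,))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
logits = model(x)
loss = nn.functional.cross_entropy(logits, y)
loss.backward()
optimizer.step()

# Sparsity of the pruned layer (fraction of zeroed weights).
sparsity = (conv.weight == 0).float().mean().item()
```

`prune.l1_unstructured` keeps the pruning mask fixed during fine-tuning, so the zeroed weights stay zero while the surviving weights are updated to recover accuracy.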

Accuracy Comparison

Results of combining the two proposed loss functions to recover two EEG models on DEAP. Bold font indicates the higher accuracy between "Baseline" and "Proposed".

Conclusion

In this paper, we investigate network pruning methods for EEG models used in emotion recognition tasks. We propose two innovative constraints in the learning process that incorporate interpretability into the recovery of pruned models. The first is the knowledge-guided region-wise loss, which enforces classification evidence consistent with the importance of the prefrontal lobe. The second is the knowledge-guided sample-wise loss, which distinguishes the importance of different samples to constrain the learning process. Our results show that the proposed method significantly enhances the recovery accuracy of the pruned model while reducing the required computational resources.

