Awesome Human Activity Recognition (mainly related to SENSOR data)

Awesome MIT License PRs Welcome

An up-to-date, curated list of awesome IMU-based Human Activity Recognition (Ubiquitous Computing) papers, methods, and resources. Please note that most of the collected research is based on IMU data.

Acknowledgment

Many thanks to the following useful publications and repos: Jingdong Wang, Awesome-Deep-Vision, Awesome-Deep-Learning-Papers, Awesome-Self-Supervised-Learning, Awesome-Semi-Supervised-Learning and Awesome-Crowd-Counting.

Contributing

We Need You!

Please feel free to contribute to this list.

Conferences and Journals

  • IJCAI, ACM MultiMedia, AAAI, KDD, ICDM, TKDE, TIP, TNNLS, TPAMI, TMM, Pattern Recognition, AI, Nature Communications, ICPR, Sensors, UbiComp (IMWUT journal)

Datasets

Tools

Potential Research Direction

  • Large-Scale/Diverse Dataset Research
  • Multi-Modality: Sensor-Vision, Sensor-Skeleton, Sensor-3DPose, Sensor-Motion
  • Window Selection (see the sliding-window sketch after this list)
  • Generative Models: e.g., cross-modality data generation, IMU2Skeleton
  • Handling the NULL-Class Problem
  • Open-World, Real-World: complex/non-repetitive activities
  • Advanced Models
  • Data-Centric: active learning, unsupervised learning, semi-supervised learning, self-supervised learning
  • Action Segmentation
  • Are the existing settings/models reliable?
  • Graph Representation
  • Motion-Capture, Kinetic
  • Privacy-Related
  • Interpretability
  • Data Imbalance
  • Domain Adaptation
  • Fine-Grained Recognition
  • Multi-Label Recognition
  • Federated Learning
  • Ensembles
  • Knowledge Integration/Distillation
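
To make the "Window Selection" point above concrete, below is a minimal sketch of fixed-length sliding-window segmentation, the standard preprocessing step in most IMU-based HAR pipelines. It assumes NumPy arrays of shape (timesteps, channels) with per-timestep labels; the function name `sliding_windows`, the 50 Hz example, and the majority-vote labelling rule are illustrative choices, not taken from any specific paper in this list.

```python
import numpy as np

def sliding_windows(signal, labels, window_size=128, step=64):
    """Segment a continuous IMU stream into fixed-length windows.

    signal : array of shape (timesteps, channels), e.g. a 3-axis accelerometer
    labels : per-timestep activity labels, shape (timesteps,)
    Returns windows of shape (n_windows, window_size, channels) and one label
    per window chosen by majority vote.
    """
    X, y = [], []
    for start in range(0, len(signal) - window_size + 1, step):
        end = start + window_size
        window_labels = labels[start:end]
        # Majority vote; many pipelines instead discard mixed-label windows.
        values, counts = np.unique(window_labels, return_counts=True)
        X.append(signal[start:end])
        y.append(values[np.argmax(counts)])
    return np.stack(X), np.array(y)

# Synthetic example: 10 s of 50 Hz tri-axial accelerometer data, two activities.
acc = np.random.randn(500, 3)
lab = np.repeat([0, 1], 250)
X, y = sliding_windows(acc, lab, window_size=128, step=64)
print(X.shape, y.shape)  # (6, 128, 3) (6,)
```

The window length and overlap are themselves research questions (see the papers on window selection and segmentation below): short windows can miss slow activities, while long windows blur transitions between activities.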

Papers

Surveys & Overview

  • A Survey on Deep Learning for Human Activity Recognition (ACM Computing Surveys (CSUR)) [paper]

  • Applying Machine Learning for Sensor Data Analysis in Interactive Systems: Common Pitfalls of Pragmatic Use and Ways to Avoid Them (ACM Computing Surveys (CSUR)) [paper]

  • [DL4SAR] Deep Learning for Sensor-based Activity Recognition: A Survey (Pattern Recognition Letters) [paper][code]

  • Deep Learning for Sensor-based Human Activity Recognition: Overview, Challenges, and Opportunities (ACM Computing Surveys (CSUR)) [paper]

  • Human Action Recognition from Various Data Modalities: A Review (IEEE TPAMI 2022, top AI journal) [paper]

2022

  • Human Action Recognition from Various Data Modalities: A Review (IEEE TPAMI 2022, top AI journal)
  • Non-Bayesian Out-of-Distribution Detection Applied to CNN Architectures for Human Activity Recognition
  • Resource-Efficient Continual Learning for Sensor-Based Human Activity Recognition
  • Beyond the Gates of Euclidean Space: Temporal-Discrimination-Fusions and Attention-based Graph Neural Network for Human Activity Recognition
  • LiteHAR: Lightweight Human Activity Recognition from WiFi Signals with Random Convolution Kernels
  • A Review on Topological Data Analysis in Human Activity Recognition
  • Deep CNN-LSTM with Self-Attention Model for Human Activity Recognition using Wearable Sensor
  • Zero-Shot Learning for IMU-Based Activity Recognition Using Video Embeddings
  • Deep Transfer Learning with Graph Neural Network for Sensor-Based Human Activity Recognition
  • Meta-learning meets the Internet of Things: Graph prototypical models for sensor-based human activity recognition
  • Federated Multi-Task Learning
  • Unsupervised Human Activity Recognition Using the Clustering Approach: A Review
  • Hierarchical Self Attention Based Autoencoder for Open-Set Human Activity Recognition
  • Assessing the State of Self-Supervised Human Activity Recognition using Wearables
  • Robust and Efficient Uncertainty Aware Biosignal Classification via Early Exit Ensembles
  • Machine learning detects altered spatial navigation features in outdoor behaviour of Alzheimer’s disease patients
  • Evaluating Contrastive Learning on Wearable Timeseries for Downstream Clinical Outcomes
  • Segmentation-free Heart Pathology Detection Using Deep Learning
  • Anticipatory Detection of Compulsive Body-focused Repetitive Behaviors with Wearables
  • Method and system for automatic extraction of virtual on-body inertial measurement units
  • Enhancing the Security & Privacy of Wearable Brain-Computer Interfaces
  • What Makes Good Contrastive Learning on Small-Scale Wearable-based Tasks?
  • Detecting Smartwatch-Based Behavior Change in Response to a Multi-Domain Brain Health Intervention
  • ColloSSL: Collaborative Self-Supervised Learning for Human Activity Recognition
  • Multi-scale Deep Feature Learning for Human Activity Recognition Using Wearable Sensors
  • Improving Wearable-Based Activity Recognition Using Image Representations
  • Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges
  • A recurrent neural network architecture to model physical activity energy expenditure in older people
  • Application of artificial intelligence in wearable devices: Opportunities and challenges
  • A Close Look into Human Activity Recognition Models using Deep Learning
  • YONO: Modeling Multiple Heterogeneous Neural Networks on Microcontrollers
  • CogAx: Early Assessment of Cognitive and Functional Impairment from Accelerometry
  • Deep Temporal Conv-LSTM for Activity Recognition
  • Human Activity Recognition from Wearable Sensor Data Using Self-Attention
  • Combined deep centralized coordinate learning and hybrid loss for human activity recognition
  • Real-time human activity recognition using conditionally parametrized convolutions on mobile and wearable devices
  • Proposing a Fuzzy Soft-max-based classifier in a hybrid deep learning architecture for human activity recognition
  • HAR-GCNN: Deep Graph CNNs for Human Activity Recognition From Highly Unlabeled Mobile Sensor Data
  • Sensor-based human activity recognition using fuzzified deep CNN architecture with λmax method
  • WearRF-CLA: Continuous Location Authentication with Wrist Wearables and UHF RFID
  • Improving the Performance of Open-Set Classification in Human Activity Recognition by Applying a Residual Neural Network Architecture
  • UBIWEAR: An End-To-End Framework for Intelligent Physical Activity Prediction With Machine and Deep Learning
  • High-Precision and Personalized Wearable Sensing Systems for Healthcare Applications
  • DANA: Dimension-Adaptive Neural Architecture
  • DeXAR: Deep Explainable Sensor-Based Activity Recognition in Smart-Home Environments
  • Latent Independent Excitation for Generalizable Sensor-based Cross-Person Activity Recognition
  • The Severity Prediction of The Binary And Multi-Class Cardiovascular Disease -- A Machine Learning-Based Fusion Approach
  • An Unsupervised User Adaptation Model for Multiple Wearable Sensors Based Human Activity Recognition
  • Machine Learning on Clinical Time Series: Classification and Representation Learning
  • Learning Disentangled Behaviour Patterns for Wearable-based Human Activity Recognition
  • Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data
  • IMU2Doppler: Cross-Modal Domain Adaptation for Doppler-based Activity Recognition Using IMU Data
  • A CNN-based Human Activity Recognition System Combining a Laser Feedback Interferometry Eye Movement Sensor and an IMU for Context-aware Smart Glasses
  • Winect: 3D Human Pose Tracking for Free-form Activity Using Commodity WiFi
  • KATN: Key Activity Detection via Inexact Supervised Learning
  • Fusing Visual and Inertial Sensors with Semantics for 3D Human Pose Estimation
  • Multi-gat: A graphical attention-based hierarchical multimodal representation learning approach for human activity recognition
  • Semantics-aware adaptive knowledge distillation for sensor-to-vision action recognition
  • Eldersim: A synthetic data generation platform for human action recognition in eldercare applications
  • Home action genome: Cooperative compositional action understanding
  • Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition
  • Sensor-Augmented Egocentric-Video Captioning with Dynamic Modal Attention
  • Disentanglement Approach for Video Action Recognition
  • Fusion-GCN: Multimodal Action Recognition using Graph Convolutional Networks
  • Human Activity Recognition Based on Wearable Sensor Data: A Standardization of the State-of-the-Art

2021

  • Approaching the Real-World: Supporting Activity Recognition Training with Virtual IMU Data [paper]

  • Can You See It? Good, So We Can Sense It! [paper]

  • An Ensemble of ConvTransformer Networks for the Sussex-Huawei Locomotion-Transportation (SHL) Recognition Challenge [paper]

  • Fast Deep Neural Architecture Search for Wearable Activity Recognition by Early Prediction of Converged Performance [paper]

  • Human Activity Recognition Based on Acceleration Data From Smartphones Using HMMs [paper]

  • On the Role of Context Length for Feature Extraction and Sequence Modeling in Human Activity Recognition [paper]

  • ObscureNet: Learning Attribute-invariant Latent Representation for Anonymizing Sensor Data [paper]

  • SenseCollect: We Need Efficient Ways to Collect On-body Sensor-based Human Activity Data! [paper]

  • Self-supervised Learning for Reading Activity Classification [paper]

  • Reducing Muscle Activity when Playing Tremolo by Using Electrical Muscle Stimulation to Learn Efficient Motor Skills [paper]

  • Pushing the Limits of Long Range Wireless Sensing with LoRa [paper]

  • CardiacWave: A mmWave-based Scheme of Non-Contact and High-Definition Heart Activity Computing [paper]

  • Multimodal Federated Learning [paper]

  • A Deep Learning-Based Framework for Human Activity Recognition in Smart Homes [paper]

  • Interactive Hybrid Intelligence Systems for Human-Ai/Robot Collaboration: Improving the Practices of Physical Stroke Rehabilitation [paper]

  • Continual Activity Recognition with Generative Adversarial Networks [paper]

  • A multibranch CNN-BiLSTM model for human activity recognition using wearable sensor data [paper]

  • Unsupervised User Adaptation Model for Multiple Wearable Sensors Based Human Activity Recognition [paper]

  • ClusterFL: A Similarity-Aware Federated Learning System for Human Activity Recognition (MobiSys) [paper]

  • Improving Deep Learning for HAR with shallow LSTMs (ISWC/ubicomp) [paper]

  • Contrastive Predictive Coding for Human Activity Recognition (IMWUT/ubicomp) [paper]

  • Leveraging Activity Recognition to Enable Protective Behavior Detection in Continuous Data (IMWUT/ubicomp) [paper]

  • Watching Your Phone's Back: Gesture Recognition by Sensing Acoustical Structure-borne Propagation (IMWUT/ubicomp) [paper]

  • ApneaDetector: Detecting Sleep Apnea with Smartwatches (IMWUT/ubicomp) [paper]

  • NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables (IMWUT/ubicomp) [paper]

  • We Hear Your PACE: Passive Acoustic Localization of Multiple Walking Persons (IMWUT/ubicomp) [paper]

  • mTeeth: Identifying Brushing Teeth Surfaces Using Wrist-Worn Inertial Sensors (IMWUT/ubicomp) [paper]

  • Acoustic-based Upper Facial Action Recognition for Smart Eyewear (IMWUT/ubicomp) [paper]

  • Two-Stream Convolution Augmented Transformer for Human Activity Recognition (AAAI2021) [paper]

  • Unsupervised Human Activity Representation Learning with Multi-task Deep Clustering (IMWUT/ubicomp) [paper]

  • Attend and Discriminate: Beyond the State-of-the-Art for Human Activity Recognition Using Wearable Sensors (IMWUT/ubicomp) [paper]

  • SelfHAR: Improving Human Activity Recognition through Self-training with Unlabeled Data (IMWUT/ubicomp) [paper]

  • Latent Independent Excitation for Generalizable Sensor-based Cross-Person Activity Recognition (AAAI 2021) [paper]

  • Weakly-Supervised Sensor-based Activity Segmentation and Recognition via Learning from Distributions (Artificial Intelligence (AIJ)) [paper]

2020

  • GlobalFusion: A Global Attentional Deep Learning Framework for Multisensor Information Fusion (IMWUT/ubicomp) [paper]

  • METIER: A Deep Multi-Task Learning Based Activity and User Recognition Model Using Wearable Sensors (IMWUT/ubicomp) [paper]

  • Instance-Wise Dynamic Sensor Selection for Human Activity Recognition (AAAI 2020) [paper]

  • Cross-Dataset Activity Recognition via Adaptive Spatial-Temporal Transfer Learning (IMWUT/ubicomp) [paper]

  • MARS: Mixed Virtual and Real Wearable Sensors for Human Activity Recognition with Multi-Domain Deep Learning Model [arXiv]

  • Towards Deep Clustering of Human Activities from Wearables (ISWC/ubicomp) [paper]

  • [UDA4HAR] A Systematic Study of Unsupervised Domain Adaptation for Robust Human-Activity Recognition (IMWUT/ubicomp) [paper]

  • Adversarial Multi-view Networks for Activity Recognition (IMWUT/ubicomp) [paper]

  • Weakly Supervised Multi-Task Representation Learning for Human Activity Analysis Using Wearables (IMWUT/ubicomp) [paper]

  • [IMUTube] IMUTube: Automatic Extraction of Virtual on-body Accelerometry from Video for Human Activity Recognition (IMWUT/ubicomp) [paper]

  • Robust Unsupervised Factory Activity Recognition with Body-worn Accelerometer Using Temporal Structure of Multiple Sensor Data Motifs (IMWUT/ubicomp) [paper]

  • Masked reconstruction based self-supervision for human activity recognition (ISWC/ubicomp) [paper]

  • Digging deeper: towards a better understanding of transfer learning for human activity recognition (ISWC/ubicomp) [paper]

  • IndRNN based long-term temporal recognition in the spatial and frequency domain (ISWC/ubicomp) [paper]

  • Tackling the SHL challenge 2020 with person-specific classifiers and semi-supervised learning (ISWC/ubicomp) [paper]

  • DenseNetX and GRU for the sussex-huawei locomotion-transportation recognition challenge (ISWC/ubicomp) [paper]

2019

  • A Novel Distribution-Embedded Neural Network for Sensor-Based Activity Recognition (IJCAI) [paper][code]

  • Learning Bodily and Temporal Attention in Protective Movement Behavior Detection

  • [AttnSense] AttnSense: Multi-level Attention Mechanism For Multimodal Human Activity Recognition (IJCAI) [paper]

  • Multi-agent Attentional Activity Recognition (IJCAI) [paper][code]

  • Distribution-based Semi-Supervised Learning for Activity Recognition (AAAI) [paper][code]

  • On the Role of Features in Human Activity Recognition (ISWC/ubicomp) [paper]

  • Handling Annotation Uncertainty in Human Activity Recognition (ISWC/ubicomp) [paper]

  • Leveraging Active Learning and Conditional Mutual Information to Minimize Data Annotation in Human Activity Recognition (IMWUT/ubicomp) [paper]

  • [Vision2Sensor] Vision2Sensor: Knowledge Transfer Across Sensing Modalities for Human Activity Recognition (IMWUT/ubicomp) [paper]

  • How Does a Nation Walk? Interpreting Large-Scale Step Count Activity with Weekly Streak Patterns (IMWUT/ubicomp) [paper]

2018

  • Understanding and Improving Recurrent Networks for Human Activity Recognition by Continuous Attention (ISWC/ubicomp) [paper]

  • On specialized window lengths and detector based human activity recognition (ISWC/ubicomp) [paper]

  • Adding structural characteristics to distribution-based accelerometer representations for activity recognition using wearables (ISWC/ubicomp) [paper]

  • On Attention Models for Human Activity Recognition (ISWC/ubicomp) [paper]

  • [AROMA] AROMA: A Deep Multi-Task Learning Based Simple and Complex Human Activity Recognition Method Using Wearable Sensors (IMWUT/ubicomp) [paper]

2017

  • [EnsemblesLSTM] Ensembles of Deep LSTM Learners for Activity Recognition using Wearables (IMWUT/ubicomp) [paper] [Tensorflow]

  • Deep Learning for Sensor-based Activity Recognition: A Survey (Pattern Recognition Letters) [paper]

  • Activity Recognition for Quality Assessment of Batting Shots in Cricket using a Hierarchical Representation (IMWUT/ubicomp) [paper]

  • Label Propagation: An Unsupervised Similarity Based Method for Integrating New Sensors in Activity Recognition Systems (IMWUT/ubicomp) [paper]

  • CNN-based sensor fusion techniques for multimodal human activity recognition (ISWC/ubicomp) [paper]

2016

  • Learning from less for better: semi-supervised activity recognition via shared structure discovery (ubicomp) [paper]

  • Wearable sensor based multimodal human activity recognition exploiting the diversity of classifier ensemble (ubicomp) [paper]

2015

  • Beyond activity recognition: skill assessment from accelerometer data (ubicomp) [paper]

  • I did not smoke 100 cigarettes today!: avoiding false positives in real-world activity recognition (ubicomp) [paper]

  • Let's (not) stick together: pairwise similarity biases cross-validation in activity recognition (ubicomp) [paper]

  • Improved activity recognition by using enriched acceleration data (ubicomp) [paper]

  • A field study comparing approaches to collecting annotated activity data in real-world settings (ubicomp) [paper]

  • Personalization revisited: a reflective approach helps people better personalize health services and motivates them to increase physical activity (ubicomp) [paper]

2014

  • Monitoring Household Activities and User Location with a Cheap, Unobtrusive Thermal Sensor Array (ubicomp) [paper]

  • Connecting personal-scale sensing and networked community behavior to infer human activities (ubicomp) [paper]

  • Using electrodermal activity to recognize ease of engagement in children during social interactions (ubicomp) [paper]

2013

  • Fine-Grained Sharing of Sensed Physical Activity: A Value Sensitive Approach (ubicomp) [paper]

  • Towards zero-shot learning for human activity recognition using semantic attribute sequence model (ubicomp) [paper]

  • Personalized mobile physical activity recognition (ubicomp) [paper]

  • A Hybrid Unsupervised/Supervised Model for Group Activity Recognition (ubicomp) [paper]

  • Confidence-based Multiclass AdaBoost for Physical Activity Monitoring (ubicomp) [paper]

  • An exploration with online complex activity recognition using cellphone accelerometer (ubicomp) [paper]

  • [UniPad] UniPad: Orchestrating collaborative activities through shared tablets and an integrated wall display (ubicomp) [paper]

  • Human Activity Recognition Using Heterogeneous Sensors (ubicomp) [paper]

  • A probabilistic ontological framework for the recognition of multilevel human activities (ubicomp) [paper]

  • Ubiquitous support for midwives to leverage daily activities (ubicomp) [paper]

  • Combining Embedded Accelerometers with Computer Vision for Recognizing Food Preparation Activities (ubicomp) [paper]

2012

  • A Spark Of Activity: Exploring Information Art As Visualization For Physical Activity (ubicomp) [paper]

  • [BodyScope] BodyScope: A Wearable Acoustic Sensor for Activity Recognition (ubicomp) [paper]

  • An Integrated Framework for Human Activity Classification (ubicomp) [paper]

2011

  • The Place for Ubiquitous Computing in Schools: Lessons Learned from a School-Based Intervention for Youth Physical Activity (ubicomp) [paper]

  • [CSN] Enabling Large-scale Human Activity Inference on Smartphones using Community Similarity Networks (ubicomp) [paper]

2010

  • Using Wearable Activity Type Detection to Improve Physical Activity Energy Expenditure Estimation (ubicomp) [paper]
