A2J-Transformer

Introduction

This is the official implementation of the CVPR 2023 paper "A2J-Transformer: Anchor-to-Joint Transformer Network for 3D Interacting Hand Pose Estimation from a Single RGB Image".

Paper link: A2J-Transformer: Anchor-to-Joint Transformer Network for 3D Interacting Hand Pose Estimation from a Single RGB Image

About our code

Installation and Setup

Requirements

  • Our code is tested on Ubuntu 20.04 with NVIDIA 2080 Ti and NVIDIA 3090 GPUs; both PyTorch 1.7 and PyTorch 1.11 work.

  • Python>=3.7

    We recommend using Anaconda to create a conda environment:

    conda create --name a2j_trans python=3.7

    Then, activate the environment:

    conda activate a2j_trans
  • PyTorch>=1.7.1, torchvision>=0.8.2 (following the official PyTorch installation instructions)

    We recommend installing the following PyTorch and torchvision versions:

    conda install pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=11.0 -c pytorch
  • Other requirements

    conda install tqdm numpy matplotlib scipy
    pip install opencv-python pycocotools
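
    After installing, you can run a quick environment check. The following is a minimal sketch (a hypothetical helper script, not part of the repo) that only prints versions and CUDA availability:

    # check_env.py -- hypothetical sanity-check script
    import torch
    import torchvision
    import cv2                           # provided by opencv-python
    from pycocotools.coco import COCO    # import check only

    print("torch:", torch.__version__)
    print("torchvision:", torchvision.__version__)
    print("CUDA available:", torch.cuda.is_available())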

Compiling CUDA operators (following Deformable-DETR)

cd ./dab_deformable_detr/ops
sh make.sh
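
To verify that the operators compiled and load correctly, you can try importing the extension from Python. This is a minimal sketch; MultiScaleDeformableAttention is the extension name used by Deformable-DETR, which this build follows, so it is assumed to apply here as well:

import torch                                   # load torch before the compiled extension
import MultiScaleDeformableAttention as MSDA   # built by make.sh (assumed extension name)

print("deformable attention CUDA ops loaded:", MSDA.__name__)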

Usage

Dataset preparation

  • Please download the InterHand 2.6M dataset and organize it as follows (a quick layout check is sketched after the tree):

    your_dataset_path/
    └── Interhand2.6M_5fps/
        ├── annotations/
        └── images/
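
    As a sanity check of this layout, a sketch like the following can be used (your_dataset_path is the placeholder from the tree above; adjust for your machine):

    # check the expected InterHand 2.6M directory layout
    from pathlib import Path

    root = Path("your_dataset_path") / "Interhand2.6M_5fps"
    for sub in ("annotations", "images"):
        assert (root / sub).is_dir(), f"missing directory: {root / sub}"
    print("dataset layout looks OK")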
    

Testing on InterHand 2.6M Dataset

  • Please download our pre-trained model and organize the code as follows:

    a2j-transformer/
    ├── dab_deformable_detr/
    ├── nets/
    ├── utils/
    ├── ...py
    ├── datalist/
    |   └── ...pkl
    └── output/
        └── model_dump/
            └── snapshot.pth.tar
    

    The datalist folder and its pkl files are the dataset lists generated while running the code. Alternatively, you can download them here and put them under the datalist folder manually.

  • In config.py, set interhand_anno_dir and interhand_images_path to the absolute paths of the dataset annotations and images directories.

  • In config.py, set cur_dir to the a2j-transformer code directory. A sketch of these edits is shown below.
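
    For example, the relevant config.py entries might look like this sketch (the variable names come from the steps above; all paths are placeholders to replace with your own):

    # config.py (excerpt) -- placeholder paths, edit for your machine
    cur_dir = '/home/user/a2j-transformer'                       # code directory
    interhand_anno_dir = '/data/Interhand2.6M_5fps/annotations'  # absolute annotations path
    interhand_images_path = '/data/Interhand2.6M_5fps/images'    # absolute images path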

  • Run the following script:

    python test.py --gpu <your_gpu_ids>

    You can also change gpu_ids directly in test.py.
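
    For example, a single-GPU run might look like this (hypothetical ids; the exact multi-GPU id syntax depends on how test.py parses the argument):

    python test.py --gpu 0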
