
cpf's People

Contributors

kailinli, kelvin34501, lixiny


cpf's Issues

How to use CPF on both hands?

Thanks a lot for your great work!
I have a question:
Since you only use MANO_RIGHT.pkl, it seems that CPF can currently only construct the right-hand model, right?
What needs to be modified to use CPF on both hands?
Thanks!
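
For context, a minimal sketch (an assumption, not an official answer) of what loading a left-hand MANO could look like with the manopth package bundled in this repo, provided MANO_LEFT.pkl is placed next to MANO_RIGHT.pkl under assets/mano; the rest of the pipeline (anchors, contact regions) would presumably still need left-hand-aware handling:

from manopth.manolayer import ManoLayer

# assumes MANO_LEFT.pkl has been downloaded into assets/mano alongside MANO_RIGHT.pkl
left_mano_layer = ManoLayer(
    mano_root="assets/mano",
    side="left",
    use_pca=False,
    flat_hand_mean=True,
)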

What's the meaning of "adapt"?

I notice that there are both hand_pose_axisang_adapt_np and hand_pose_axisang_np in your code. Could you please explain the difference between them?

AttributeError: 'ParsedRequirement' object has no attribute 'req'

Could you tell me which version of Anaconda to use, please?
I am getting the error below:

neptune:/s/red/a/nobackup/vision/anju/CPF$ conda env create -f environment.yaml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
current version: 4.9.2
latest version: 4.10.1

Please update conda by running

$ conda update -n base -c defaults conda

Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: | Ran pip subprocess with arguments:
['/s/chopin/a/grad/anju/.conda/envs/cpf/bin/python', '-m', 'pip', 'install', '-U', '-r', '/s/red/a/nobackup/vision/anju/CPF/condaenv.agtpjn0v.requirements.txt']
Pip subprocess output:
Collecting git+https://github.com/utiasSTARS/liegroups.git (from -r /s/red/a/nobackup/vision/anju/CPF/condaenv.agtpjn0v.requirements.txt (line 1))
Cloning https://github.com/utiasSTARS/liegroups.git to /tmp/pip-req-build-ey_prxpa
Obtaining file:///s/red/a/nobackup/vision/anju/CPF/manopth (from -r /s/red/a/nobackup/vision/anju/CPF/condaenv.agtpjn0v.requirements.txt (line 12))
Obtaining file:///s/red/a/nobackup/vision/anju/CPF (from -r /s/red/a/nobackup/vision/anju/CPF/condaenv.agtpjn0v.requirements.txt (line 13))
Collecting trimesh==3.8.10
Using cached trimesh-3.8.10-py3-none-any.whl (625 kB)
Collecting open3d==0.10.0.0
Using cached open3d-0.10.0.0-cp38-cp38-manylinux1_x86_64.whl (4.7 MB)
Collecting pyrender==0.1.43
Using cached pyrender-0.1.43-py3-none-any.whl (1.2 MB)
Collecting scikit-learn==0.23.2
Using cached scikit_learn-0.23.2-cp38-cp38-manylinux1_x86_64.whl (6.8 MB)
Collecting chumpy==0.69
Using cached chumpy-0.69.tar.gz (50 kB)

Pip subprocess error:
Running command git clone -q https://github.com/utiasSTARS/liegroups.git /tmp/pip-req-build-ey_prxpa
ERROR: Command errored out with exit status 1:
command: /s/chopin/a/grad/anju/.conda/envs/cpf/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-hnf78qhk/chumpy/setup.py'"'"'; file='"'"'/tmp/pip-install-hnf78qhk/chumpy/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(file);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-k7bp5gq7
cwd: /tmp/pip-install-hnf78qhk/chumpy/
Complete output (7 lines):
Traceback (most recent call last):
File "", line 1, in
File "/tmp/pip-install-hnf78qhk/chumpy/setup.py", line 15, in
install_requires = [str(ir.req) for ir in install_reqs]
File "/tmp/pip-install-hnf78qhk/chumpy/setup.py", line 15, in
install_requires = [str(ir.req) for ir in install_reqs]
AttributeError: 'ParsedRequirement' object has no attribute 'req'
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

failed

CondaEnvException: Pip failed
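
For anyone hitting the same error: it comes from chumpy 0.69's setup.py calling pip's parse_requirements, whose return type changed in newer pip releases (the returned ParsedRequirement objects expose .requirement rather than .req). A hedged workaround is to pin an older pip inside the environment before the pip step, or to patch the failing line roughly as follows (install_reqs is the value returned by parse_requirements in that setup.py):

try:
    install_requires = [str(ir.req) for ir in install_reqs]          # older pip
except AttributeError:
    install_requires = [str(ir.requirement) for ir in install_reqs]  # newer pip (ParsedRequirement)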

Some questions about PiCR code

In contacthead.py, the three decoders have different input dimensions.
self.vertex_contact_decoder = PointNetDecodeModule(self._concat_feat_dim, 1)
self.contact_region_decoder = PointNetDecodeModule(self._concat_feat_dim + 1, self.n_region)
self.anchor_elasti_decoder = PointNetDecodeModule(self._concat_feat_dim + 17, self.n_anchor)

I am wondering whether this part is used to predict the selected anchor points within each subregion.

The classification of subregions is obtained by contact_region_decoder, and then the anchor points are predicted by anchor_elasti_decoder, is that right?

I am a little confused, because according to the paper, Anchor Elasticity (AE) represents the elasticities of the attractive springs, but in the code the output of anchor_elasti_decoder seems to have no relation to the elasticity parameter; I'm wondering whether there is some part I've missed.

Sorry for any trouble caused and thanks for your help!
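
For what it's worth, the input widths suggest a cascade in which each decoder is conditioned on the previous prediction: the +1 matches the single vertex-contact channel, and the +17 presumably matches the number of hand subregions. A self-contained sketch of that reading (the stub module, feature sizes, and tensor layout below are assumptions for illustration, not the actual CPF code):

import torch
import torch.nn as nn

class DecoderStub(nn.Module):
    """Stand-in for PointNetDecodeModule: a 1x1 conv over per-point features."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.conv = nn.Conv1d(in_dim, out_dim, 1)
    def forward(self, x):  # x: [B, C, N]
        return self.conv(x)

feat_dim, n_region, n_anchor = 64, 17, 4  # assumed values, for illustration only
vertex_contact_decoder = DecoderStub(feat_dim, 1)
contact_region_decoder = DecoderStub(feat_dim + 1, n_region)
anchor_elasti_decoder = DecoderStub(feat_dim + n_region, n_anchor)

feat = torch.randn(2, feat_dim, 100)                                           # per-point features
vertex_contact = vertex_contact_decoder(feat)                                  # [B, 1, N]
contact_region = contact_region_decoder(torch.cat([feat, vertex_contact], 1))  # [B, n_region, N]
anchor_elasti = anchor_elasti_decoder(torch.cat([feat, contact_region], 1))    # [B, n_anchor, N]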

RuntimeError: indices should be either on cpu or on the same device as the indexed tensor (cpu)

While running python3 scripts/run_demo.py --gpu 0 --init_ckpt CPF_checkpoints/picr/fhb/checkpoint_200.pth.tar --honet_mano_fhb_hand, the visualization window flashes and the error below occurs.

==============================  Options  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
anchor_root                     :  assets/anchor
batch_size                      :  4
center_idx                      :  9
data_root                       :  data
exp_keyword                     :  None
gpu                             :  0
hand_closed_path                :  assets/closed_hand/hand_mesh_close.obj
hg_blocks                       :  1
hg_classes                      :  64
hg_stacks                       :  2
honet_mano_fhb_hand             :  True
honet_mano_lambda_pose_reg      :  5e-06
honet_mano_lambda_recov_joints3d  :  0.5
honet_mano_lambda_recov_verts3d  :  0
honet_mano_lambda_shape         :  5e-07
honet_obj_lambda_recov_verts2d  :  0.0
honet_obj_lambda_recov_verts3d  :  0.5
honet_obj_trans_factor          :  100
honet_resnet_version            :  18
init_ckpt                       :  CPF_checkpoints/picr/fhb/checkpoint_200.pth.tar
lambda_contact_loss             :  10.0
lambda_repulsion_loss           :  1.6
mano_root                       :  assets/mano
manual_seed                     :  0
obj_scale_factor                :  0.0001
palm_path                       :  assets/hand_palm_full.txt
repulsion_query                 :  0.02
repulsion_threshold             :  0.05
vertex_contact_thresh           :  0.7
workers                         :  16
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<  Options  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Saving experiment logs, models, and training curves and images to checkpoints/picr_geo_example/example_example/2023_04_17_16
Got 10 samples for data_split example
==============================  example_dataset_queries  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
+------------------------------------+---------------------------------------+---------------------------------------+
|        BaseQueries.CAM_INTR        |         BaseQueries.HAND_FACES        |     BaseQueries.HAND_POSE_WRT_CAM     |
|     BaseQueries.HAND_VERTS_2D      |       BaseQueries.HAND_VERTS_3D       |           BaseQueries.IMAGE           |
|       BaseQueries.IMAGE_PATH       |         BaseQueries.JOINTS_2D         |         BaseQueries.JOINTS_3D         |
|       BaseQueries.JOINT_VIS        |       BaseQueries.OBJ_CAN_SCALE       |       BaseQueries.OBJ_CAN_TRANS       |
|     BaseQueries.OBJ_CAN_VERTS      |         BaseQueries.OBJ_FACES         |         BaseQueries.OBJ_TRANSF        |
|      BaseQueries.OBJ_VERTS_2D      |        BaseQueries.OBJ_VERTS_3D       |         BaseQueries.OBJ_VIS_2D        |
|          BaseQueries.SIDE          |     MetaQueries.SAMPLE_IDENTIFIER     |        TransQueries.AFFINETRANS       |
|       TransQueries.CAM_INTR        |         TransQueries.CENTER_3D        |       TransQueries.HAND_VERTS_2D      |
|     TransQueries.HAND_VERTS_3D     |           TransQueries.IMAGE          |         TransQueries.JOINTS_2D        |
|       TransQueries.JOINTS_3D       |        TransQueries.OBJ_TRANSF        |       TransQueries.OBJ_VERTS_2D       |
|     TransQueries.OBJ_VERTS_3D      |          TransQueries.ROTMAT          |                   -                   |
+------------------------------------+---------------------------------------+---------------------------------------+
<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<  example_dataset_queries  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
/home/kb/anaconda3/envs/cpf/lib/python3.8/site-packages/torch/utils/data/dataloader.py:561: UserWarning: This DataLoader will create 16 worker processes in total. Our suggested max number of worker in current system is 12, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
  warnings.warn(_create_warning_msg(
Loading resnet18 pretrained models !
=> loading checkpoint 'CPF_checkpoints/picr/fhb/checkpoint_200.pth.tar'
=> loaded checkpoint 'CPF_checkpoints/picr/fhb/checkpoint_200.pth.tar' (epoch 200)
Model total size == 93.72725296020508 MB
  |  HONet total size == 47.723751068115234 MB
  |  BaseNet total size == 25.7626953125 MB
  \  ContactHead total size == 20.240806579589844 MB
    |  EncodeModule total size == 4.4404296875 MB
    |  DecodeModule_VertexContact total size == 5.257816314697266 MB
    |  DecodeModule_ContactRegion total size == 5.266666412353516 MB
    |  DecodeModule_AnchorElasti total size == 5.2758941650390625 MB
Example Epoch 0
100%|██████████████████████████████████████████████████████████████████████████████████████| 10/10 [00:01<00:00,  6.48it/s]

PICR DONE!
Got 10 samples for data_split example
Traceback (most recent call last):
  File "scripts/run_demo.py", line 715, in <module>
    main(args)
  File "scripts/run_demo.py", line 617, in main
    geo_stage(intermediate, args)
  File "scripts/run_demo.py", line 599, in geo_stage
    print_msg = run_sample(
  File "scripts/run_demo.py", line 461, in run_sample
    hoptim.set_opt_val(**opt_val_kwargs)
  File "/home/kb/Projects/CPF/hocontact/postprocess/geo_optimizer.py", line 141, in set_opt_val
    self.const_val["indexed_vertex_id"] = vertex_id[anchor_padding_mask == 1]  # TENSOR[NVALID, ]
RuntimeError: indices should be either on cpu or on the same device as the indexed tensor (cpu)
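
A hedged note on the error itself: boolean indexing in PyTorch requires the mask and the indexed tensor to share a device, so here vertex_id (on CPU) is being indexed with a CUDA mask. A minimal reproduction and the one-line workaround that typically resolves it (moving the mask onto the tensor's device, e.g. in geo_optimizer.py around line 141); the names and shapes below are placeholders, not the actual CPF values:

import torch

vertex_id = torch.arange(778)                         # lives on CPU
device = "cuda" if torch.cuda.is_available() else "cpu"
anchor_padding_mask = torch.ones(778, device=device)  # may live on GPU

# vertex_id[anchor_padding_mask == 1]   # raises the RuntimeError when the devices differ
indexed_vertex_id = vertex_id[(anchor_padding_mask == 1).to(vertex_id.device)]
print(indexed_vertex_id.shape)  # torch.Size([778])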

Expected code release date?

Hi !

I just read through your paper, congratulations on the great work!
I love the fact that you provide an anatomically-constrained MANO, and the per-object-vertex hand part affinity.

I look forward to the code release :)

Do you have a planned date in mind?

All the best,

Yana

Augmented Data Available?

Since I want to reproduce the entire training process, I'm wondering whether the augmented data for HO3D-v1 can be released.

Training code

Thank you very much for your excellent work.
Could you please provide the training code, and clarify whether the three parts of the MIHO model are trained separately?
