
kinect2_tracker's Introduction

kinect2_tracker

A working ROS wrapper for the KinectOne (v2) using libfreenect2

Install

  • install libfreenect2
    • Make sure to install all the optional components, including OpenCL and OpenNI2
    • When you build the library, do not follow the instructions there; instead run:
    mkdir build && cd build
    cmake .. -DCMAKE_INSTALL_PREFIX=/usr/
    make
    sudo make install
  • Download NiTE2 and put it in ~/package_ws/NiTE-Linux-x64-2.2/
    • Or you can put it in another location, but then you need to modify CMakeLists.txt and setup_nite.bash accordingly
  • source setup_nite.bash

To run the program, use the launch file:

Run

roslaunch kinect2_tracker tracker.launch

API

Published

  • /people_skeleton: kinect2_tracker::user_IDs, array of IDs of the tracked people
  • /people_points: kinect2_tracker::user_points, center of mass for each person
  • /people_points_viz: visualization_msgs::Marker, people points to show in RViz
  • tf transforms for the human skeletons
  • Kinect RGB, depth and infrared images
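
To illustrate what a topic like /people_points carries, here is a rough, hypothetical sketch (not the package's actual code) of how a per-user center-of-mass point could be derived from NiTE joint positions, assuming NiTE's millimetre coordinates and ROS's metre convention:

```python
# Hypothetical sketch: deriving a per-user "center of mass" point, as
# published on /people_points, from NiTE joint positions.
# NiTE reports positions in millimetres; ROS uses metres.

def center_of_mass(joints_mm):
    """Average a list of (x, y, z) joint positions in millimetres,
    returning the result in metres."""
    n = len(joints_mm)
    sx = sum(p[0] for p in joints_mm)
    sy = sum(p[1] for p in joints_mm)
    sz = sum(p[2] for p in joints_mm)
    return (sx / n / 1000.0, sy / n / 1000.0, sz / n / 1000.0)

# Example: torso at 1 m depth, head 400 mm above it
joints = [(0.0, 0.0, 1000.0), (0.0, 400.0, 1000.0)]
print(center_of_mass(joints))  # (0.0, 0.2, 1.0)
```

The actual message layout is defined by kinect2_tracker::user_points; this only shows the unit conversion involved.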

Params

  • tf_prefix: the prefix used when publishing tf frames
  • relative_frame: the base frame of the Kinect observations
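
A minimal sketch of how these parameters might be set in a launch file. The parameter names come from the list above; the node type and the values are placeholders, so check tracker.launch in the package for the real ones:

```xml
<launch>
  <!-- node type and values below are placeholders; see tracker.launch -->
  <node pkg="kinect2_tracker" type="kinect2_tracker_node2" name="kinect2_tracker" output="screen">
    <param name="tf_prefix" value="kinect2" />
    <param name="relative_frame" value="camera_link" />
  </node>
</launch>
```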

kinect2_tracker's People

Contributors

andrewww2, jltrovil, mcgi5sr2, yodahuang


kinect2_tracker's Issues

Can't open the device issue

Hi,

I could compile the code normally using catkin_make...
However, when I try to "roslaunch" the project I get "[FATAL] [1467020828.259201654]: Can't Open Device", despite the fact that the kinect2 device is connected to the computer normally and runs fine with libfreenect2. Do you have any insight into how to solve this problem?

Thanks.

Kinect extrinsic calibration

Hi! Thanks for the great tool, I was able to use the package and track the user successfully. However, I have problems in calibrating the Kinect for my environment.

I used to do my extrinsic calibration by detecting a marker with both a camera on the robot and the Kinect, identifying the transformation between the two points of view.
I could calibrate my kinect2 with the driver available at https://github.com/code-iai/iai_kinect2.
Such a transformation works for the point cloud but not for the skeleton.

Do you know which is the reference frame of the camera?
Alternatively, do you know how I can see the point cloud used by the tracker (in order to identify its viewpoint), and if/how I can add a cloud transformation?

Running skeleton tracking while viewing point clouds

Hi!

Firstly, thanks a lot for your code. It seems to be working once I configured the OpenNI + NiTE + libfreenect setup. I was able to visualize the tf frames as visualization markers in RViz.

I've been trying to run the skeleton tracker while also viewing the point cloud libfreenect provides, but to no avail. It seems I cannot run kinect2_tracker.launch while also running another driver for the Kinect (for visualizing the point clouds). Do you have any idea how I can access RGBD data from your package, or how to make the freenect driver also publish point clouds simultaneously?

Thanks!

OpenNI2 initialization failed

Hi,
I just tried your package and set up the things you mentioned in README.md. Once I roslaunch it, the error shows "OpenNI initialize error". Is there anything I need to add to the ~/include dir?
BTW, you list two to-do items in README.md. How exactly should I do them? Sorry, I am just a beginner with Kinect2.

Many thanks!!

Not able to catkin_make on kinect2_tracker cloned repository

First of all, thanks for providing this ROS wrapper to the community. I was following the installation steps, came across a few issues, and solved them.

  1. Couldn't clone git clone http://amrcgithub/mep12sr/kinect2_tracker, so I cloned git clone https://github.com/mcgi5sr2/kinect2_tracker into my catkin workspace instead.
  2. While doing catkin_make, it throws an error saying fatal error: NiTE.h: No such file or directory for #include "NiTE.h". I copied all the header files from NiTE's include dir to the catkin_ws/src/kinect2_tracker/include dir. An easier fix is to edit CMakeLists.txt at lines 80 and 81, providing the path to the NiTE-Linux-x64-2.2 dir.

  3. After making the above changes and running catkin_make again, it still throws errors. I found out they come from this piece of code in kinect2_tracker.hpp, lines 218-233, where stray diff markers (leading - characters) were left in front of some lines:

```cpp
// Publish the calibration tf_frame as the cross product of the shoulder vectors
-// This function publishes the calibration_space opposite the shoulders of the user
void publishCalibrationOriginTF(nite::SkeletonJoint skelTorso, nite::SkeletonJoint skelRshoulder, nite::SkeletonJoint skelLshoulder, int uid)
{
  if (skelTorso.getPositionConfidence() > 0.0)
  {
    tf::Transform calibrationOriginTransform;
-   tf::Transform torsoTransform;
    tf::Vector3 torsoVec3 = tf::Vector3(skelTorso.getPosition().x / 1000.0, skelTorso.getPosition().y / 1000.0, skelTorso.getPosition().z / 1000.0);
    torsoTransform.setOrigin(torsoVec3);
-   torsoTransform.setRotation(tf::Quaternion(0, 0, 0, 1));
    tf::Vector3 RshoulderVec3 = tf::Vector3(skelRshoulder.getPosition().x / 1000.0, skelRshoulder.getPosition().y / 1000.0, skelRshoulder.getPosition().z / 1000.0); // create a vector for the right shoulder
-   RshoulderVec3 = (RshoulderVec3 - torsoVec3); // vector is the difference of the two
    tf::Vector3 LshoulderVec3 = tf::Vector3(skelLshoulder.getPosition().x / 1000.0, skelLshoulder.getPosition().y / 1000.0, skelLshoulder.getPosition().z / 1000.0); // create a vector for the left shoulder
```

Fix: remove all of the leading - (hyphen) marks from these lines and run catkin_make again.
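
For reference, the CMakeLists.txt edit mentioned in step 2 might look something like the following. The variable name is made up for illustration, and the path assumes the default location from the install instructions; adapt both to your setup:

```cmake
# Assumed NiTE location from the install instructions (adjust as needed)
set(NITE2_DIR "$ENV{HOME}/package_ws/NiTE-Linux-x64-2.2")
include_directories(${NITE2_DIR}/Include)
link_directories(${NITE2_DIR}/Redist)
```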

Can't launch kinect2_tracker

Hi,

I want to use kinect2_tracker for skeleton tracking with kinect v2.
I have installed libfreenect2 and iai_kinect2 - and kinect works fine.

I followed the Readme for setting kinect2_tracker but when I launch the tracker I get the error:
[tracker.launch] is neither a launch file in package [kinect2_tracker] nor is [kinect2_tracker] a launch file name.

I see there is a problem - I'm not able to catkin_make. There are a lot of errors:
Error catkin_make kinect2_tracker.pdf

Did I forget something?
Any help?

Couldn't create user tracker

When I roslaunch kinect2_tracker tracker.launch, there's an error:

Could not find data file ./NiTE2/s.dat
current working directory = /home/huangdan/.ros
[FATAL] [1555235316.166777268]: Couldn't create user tracker
[kinect2_tracker_node2-2] process has finished cleanly
log file: /home/huangdan/.ros/log/77e934e0-5e9a-11e9-951d-acd1b884bee7/kinect2_tracker_node2-2*.log

but running ./UserViewer from NiTE-x64-2.2 works fine.
How can I solve this problem?

Orientations are not updating

I'm not sure if it's just not implemented or a bug: only the positions and relative positions of the joints are updated and published to tf. The orientations of the joints stay the same for the entire runtime of the node.

I noticed from other issues that the project has been discontinued, but it would be nice to hear whether you had any experience getting the joint orientations (in my understanding these should essentially correspond to the angles between the joints) from NiTE, and what problems you tackled, so that I could eventually fork this project and develop the missing functionality.
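
As a starting point for such a fork: one common workaround when joint orientations are unavailable (not part of kinect2_tracker itself) is to derive an orientation frame from joint positions, e.g. via cross products of the shoulder and spine directions, much like the calibration-frame code elsewhere in this package hints at. A rough sketch, with hypothetical function names and millimetre joint positions assumed:

```python
# Hypothetical sketch: build an orthonormal torso-attached frame from three
# joint positions (torso, right shoulder, left shoulder), for cases where
# the tracker's own joint orientations never update.

def normalize(v):
    n = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / n, v[1] / n, v[2] / n)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def torso_frame(torso, r_shoulder, l_shoulder):
    """Return (x_axis, y_axis, z_axis) of a torso-attached frame."""
    # x: from the right shoulder toward the left shoulder
    x = normalize(tuple(l - r for l, r in zip(l_shoulder, r_shoulder)))
    # up: from the torso toward the shoulder midpoint
    mid = tuple((l + r) / 2.0 for l, r in zip(l_shoulder, r_shoulder))
    up = normalize(tuple(m - t for m, t in zip(mid, torso)))
    # z: perpendicular to the chest plane
    z = normalize(cross(x, up))
    # re-orthogonalize y so the frame is exactly orthonormal
    y = cross(z, x)
    return x, y, z
```

The resulting axes can be packed into a rotation matrix or quaternion before publishing to tf; that conversion is omitted here.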

Is tracking possible in a simulator?

Hi, is it possible to do tracking in a simulator?
I'm using the V-REP simulator because I don't have the actual device.
The Kinect camera is in a V-REP scene.
Is it possible to track a human mannequin's actions in the simulator?
If so, how do I modify the code?

Kinect2 + NiTE2 confused people tracking

NiTE2 uses OpenNI2 to start the depth stream; the depth stream mode is 640×480, but the Kinect2 outputs 512×424. This makes the tracker sometimes produce erroneous and inaccurate data.
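
One way to see why such a mismatch causes trouble: if the tracker assumes a 640×480 depth mode while the device actually delivers 512×424, pixel coordinates no longer line up. A minimal illustration of the rescaling involved (a pure nearest-pixel sketch, not a calibrated fix):

```python
# Illustration of the resolution mismatch described above: mapping a pixel
# coordinate from the mode NiTE expects (640x480) to what the Kinect v2
# actually delivers (512x424). Nearest-pixel rescale only; a real fix needs
# the proper camera intrinsics.

NITE_W, NITE_H = 640, 480      # depth mode NiTE requests
KINECT_W, KINECT_H = 512, 424  # resolution the Kinect v2 provides

def rescale_pixel(u, v):
    """Map a (u, v) pixel from the 640x480 grid onto the 512x424 grid."""
    return (round(u * KINECT_W / NITE_W), round(v * KINECT_H / NITE_H))

print(rescale_pixel(320, 240))  # center maps to (256, 212)
```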

Cannot access the published data...!!

Everything worked fine for me, but I am not able to access the data from the published topics. On the /tf topic I continuously get the same values even if I move the Kinect around.
And I am also not able to view the Kinect's image in RViz.

Can anyone please help me?
Thanks in advance.
