orbbec / ros_astra_camera
ROS wrapper for Astra camera
License: Apache License 2.0
Kind of like the Kobuki ftdi udev rules installer: https://github.com/yujinrobot/kobuki_core/blob/kinetic/kobuki_ftdi/scripts/create_udev_rules
There's a script right now, but it does not get installed into the Debian packages and is not aware of the ROS package path.
The Mini S's color modes differ from those listed in Astra.cfg, and its highest resolution is not among the listed values.
Astra.cfg however only offers these values:
output_mode_enum = gen.enum([ gen.const( "SXGA_30Hz", int_t, 1, "1280x1024@30Hz"),
gen.const( "SXGA_15Hz", int_t, 2, "1280x1024@15Hz"),
gen.const( "XGA_30Hz", int_t, 3, "1280x720@30Hz"),
gen.const( "XGA_15Hz", int_t, 4, "1280x720@15Hz"),
As a result, the higher-resolution color images cannot be used without modifying the values in astra_driver.cpp (after line 900).
Does the driver support exposing an RGB image at 1280x960 @ 10 FPS to ROS?
According to the Orbbec Astra specs (https://orbbec3d.com/product-astra/), this should be possible...
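To make the mismatch concrete, here is the quoted portion of the enum as a small lookup table (a sketch for illustration; only the modes quoted above are included):

```python
# Sketch: the output modes quoted above as a lookup table. Values for modes
# 1-4 come from the Astra.cfg excerpt; a 1280x960@10Hz mode is NOT among
# them, which is exactly the gap the question describes.
OUTPUT_MODES = {
    1: (1280, 1024, 30),  # SXGA_30Hz
    2: (1280, 1024, 15),  # SXGA_15Hz
    3: (1280, 720, 30),   # XGA_30Hz
    4: (1280, 720, 15),   # XGA_15Hz
}

def cfg_supports(width, height, fps):
    """True if the quoted cfg enum already offers this mode."""
    return (width, height, fps) in OUTPUT_MODES.values()

print(cfg_supports(1280, 1024, 30))  # True
print(cfg_supports(1280, 960, 10))   # False: adding it would need a new enum
                                     # entry plus matching driver code
```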
There are license headers in some of the individual files, but no top-level LICENSE file.
I tried to test a new embedded S camera (label AQAAC8, 30009) with this driver on Melodic / Ubuntu 18.04.2. If I connect the camera to a USB 3 port on my computer, I get this dmesg output:
usb 8-1: new SuperSpeed USB device number 4 using xhci_hcd
[113807.293831] usb 8-1: New USB device found, idVendor=05e3, idProduct=0620
[113807.293832] usb 8-1: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[113807.293833] usb 8-1: Product: USB3.1 Hub
[113807.293834] usb 8-1: Manufacturer: GenesysLogic
[113807.294522] hub 8-1:1.0: USB hub found
[113807.294837] hub 8-1:1.0: 2 ports detected
[113807.368571] usb 7-1.1: new high-speed USB device number 29 using xhci_hcd
[113807.471177] usb 7-1.1: New USB device found, idVendor=05e3, idProduct=0610
[113807.471179] usb 7-1.1: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[113807.471180] usb 7-1.1: Product: USB2.1 Hub
[113807.471180] usb 7-1.1: Manufacturer: GenesysLogic
[113807.471868] hub 7-1.1:1.0: USB hub found
[113807.472128] hub 7-1.1:1.0: 2 ports detected
[113807.581012] usb 8-1.2: new SuperSpeed USB device number 5 using xhci_hcd
[113807.601488] usb 8-1.2: New USB device found, idVendor=2bc5, idProduct=060b
[113807.601490] usb 8-1.2: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[113807.601491] usb 8-1.2: Product: ORBBEC Depth Sensor
[113807.601491] usb 8-1.2: Manufacturer: Orbbec(R)
[113807.784578] usb 7-1.1.1: new high-speed USB device number 30 using xhci_hcd
[113807.928807] usb 7-1.1.1: New USB device found, idVendor=2bc5, idProduct=050b
[113807.928809] usb 7-1.1.1: New USB device strings: Mfr=2, Product=1, SerialNumber=3
[113807.928819] usb 7-1.1.1: Product: USB 2.0 Camera
[113807.928820] usb 7-1.1.1: Manufacturer: Sonix Technology Co., Ltd.
[113807.928820] usb 7-1.1.1: SerialNumber: SN0001
[113807.930632] uvcvideo: Found UVC 1.00 device USB 2.0 Camera (2bc5:050b)
[113807.945564] uvcvideo 7-1.1.1:1.0: Entity type for entity Extension 4 was not initialized!
[113807.945567] uvcvideo 7-1.1.1:1.0: Entity type for entity Extension 3 was not initialized!
[113807.945568] uvcvideo 7-1.1.1:1.0: Entity type for entity Processing 2 was not initialized!
[113807.945570] uvcvideo 7-1.1.1:1.0: Entity type for entity Camera 1 was not initialized!
[113807.945750] input: USB 2.0 Camera: USB Camera as /devices/pci0000:00/0000:00:1c.3/0000:04:00.0/usb7/7-1/7-1.1/7-1.1.1/7-1.1.1:1.0/input/input25
The devices are created according to the udev-rules:
foo@barhostname:/dev$ ls
astrauvc deeyea (...)
I can open the rgb-camera as a webcam.
astra_list_devices however does not find the device.
What should I try next?
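One thing worth checking is whether the udev rule actually matched the depth endpoint. A rule of roughly this shape (a hypothetical sketch; compare it against the rules file actually shipped with the package) is what makes the 2bc5:060b depth device from the dmesg output accessible without root:

```
# Hypothetical udev rule sketch; verify against the package's installed rules.
# 2bc5:060b is the "ORBBEC Depth Sensor" seen in dmesg above.
SUBSYSTEM=="usb", ATTR{idVendor}=="2bc5", ATTR{idProduct}=="060b", MODE:="0666", GROUP:="video"
```

If the device node exists but astra_list_devices still finds nothing, permissions on the raw USB device are a common culprit.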
Hi,
I have just purchased the Orbbec Astra Mini and am testing it out using ROS. The default depth_mode is VGA at 30 Hz. In which part of the program should I change it to reduce it to 25 Hz?
Cheers
According to the datasheet, the Astra can produce 1280x960 color images; however, why is there no option in the ROS driver's dynamic reconfigure config for this resolution?
I tried this package with the Orbbec Astra Mini S and everything works so far. The only issue is that the depth cloud I receive from the camera is offset by about 5 cm in the X direction.
I noticed that the published frames don't match the camera's sensor positions.
Any suggestions?
The CMakeLists.txt wants to install libuvc_camera_nodelet.xml, which I can't find in the repository. The install command was added in efd14c7.
install(FILES libuvc_camera_nodelet.xml
DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
)
Is ROS Lunar supported? Can you publish a release for lunar if it is?
Hello All,
I was wondering if I could use the Astra Pro at a rate of 60 Hz. Currently RGB and depth are both running at 30 Hz. I'm using 640x480 frames at the moment, but I don't mind using 320x240 frames in order to get 60 Hz, as mentioned in Astra.cfg.
output_mode_enum = gen.enum([ gen.const( "SXGA_30Hz", int_t, 1, "1280x1024@30Hz"),
gen.const( "SXGA_15Hz", int_t, 2, "1280x1024@15Hz"),
gen.const( "XGA_30Hz", int_t, 3, "1280x720@30Hz"),
gen.const( "XGA_15Hz", int_t, 4, "1280x720@15Hz"),
gen.const( "VGA_30Hz", int_t, 5, "640x480@30Hz"),
gen.const( "VGA_25Hz", int_t, 6, "640x480@25Hz"),
gen.const( "QVGA_25Hz", int_t, 7, "320x240@25Hz"),
gen.const( "QVGA_30Hz", int_t, 8, "320x240@30Hz"),
gen.const( "QVGA_60Hz", int_t, 9, "320x240@60Hz"),
gen.const( "QQVGA_25Hz", int_t, 10, "160x120@25Hz"),
gen.const( "QQVGA_30Hz", int_t, 11, "160x120@30Hz"),
gen.const( "QQVGA_60Hz", int_t, 12, "160x120@60Hz")], "output mode")
I believe I need to change some configs, for example setting the default mode to 9 (QVGA_60Hz):
gen.add("ir_mode", int_t, 0, "Video mode for IR camera", 9, 1, 12, edit_method = output_mode_enum)
gen.add("color_mode", int_t, 0, "Video mode for color camera", 9, 1, 12, edit_method = output_mode_enum)
gen.add("depth_mode", int_t, 0, "Video mode for depth camera", 9, 1, 12, edit_method = output_mode_enum)
so that I can get a 60 Hz rate for the IR, RGB, and depth outputs.
My concern is: is it possible to do it like this, and what else do I need to change to achieve my goal of a 60 Hz rate for all outputs?
I'm using Ubuntu 14.04 and ROS Indigo.
Thanks,
Prasanna
Hi all,
I have tried the driver from the ROS Indigo repositories (sudo apt-get install ros-indigo-astra-launch) and also compiled it from source.
The driver works, but some topics that appear in rostopic list do not publish any data. In my case, these are:
/camera/depth_registered/points
/camera/depth_registered/sw_registered/*
/camera/rgb/*
This means there are no color images and no RGB-D point clouds.
Best,
Alberto
We are using Mini S cameras (with IR-Flood). So far, we couldn't find the command to toggle the projector and IR-Flood (is this even possible without the Extended API?)
Without this feature, the IR-Flood is quite useless.
Any way to throttle depth frame rate?
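Not directly through a driver parameter, as far as this thread shows; a common workaround (an assumption on my part, not a documented astra_camera feature) is to republish through topic_tools throttle or a small relay node. The decision such a relay makes is just a timestamp gate, sketched here with ROS-style nanosecond stamps:

```python
# Minimal frame-rate gate as a relay node would apply it; class and names
# are illustrative, not part of the astra_camera API.
class FrameThrottle:
    def __init__(self, max_hz):
        self.min_period_ns = int(1e9 / max_hz)
        self.last_pass_ns = None

    def allow(self, stamp_ns):
        """Return True if a frame with this timestamp may be republished."""
        if (self.last_pass_ns is None
                or stamp_ns - self.last_pass_ns >= self.min_period_ns):
            self.last_pass_ns = stamp_ns
            return True
        return False

gate = FrameThrottle(max_hz=5.0)
stamps = [i * 1_000_000_000 // 30 for i in range(30)]  # one second at 30 Hz
passed = [t for t in stamps if gate.allow(t)]
print(len(passed))  # 5 frames pass: the stream is throttled to 5 Hz
```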
Is there any possibility to stream both color and ir image simultaneously?
It is quite annoying that the package needs an internet connection at every build to download OpenNI2.
I am using Astra cameras on a mobile robot that is not always connected to the internet, so I cannot build my whole ROS workspace.
I want to subscribe to two topics at a time using ApproximateTimeSynchronizer for the Astra Pro camera.
This is what I'm doing to reach my goal.
#include <ros/ros.h>
#include <message_filters/subscriber.h>
#include <message_filters/synchronizer.h>
#include <message_filters/sync_policies/approximate_time.h>
#include <sensor_msgs/Image.h>
#include <boost/bind.hpp>
using namespace sensor_msgs;
using namespace message_filters;
void callback(const ImageConstPtr& image_rgb, const ImageConstPtr& image_depth);
int main(int argc, char** argv)
{
ros::init(argc, argv, "hit_detection");
ros::NodeHandle nh;
// topic subscription
message_filters::Subscriber<Image> RGB_sub(nh, "/camera/rgb/image", 1);
message_filters::Subscriber<Image> DEPTH_sub(nh, "/camera/depth/image", 1);
// synchronization policy
typedef sync_policies::ApproximateTime<Image, Image> MySyncPolicy;
// ApproximateTime takes a queue size as its constructor argument, hence MySyncPolicy(10)
Synchronizer<MySyncPolicy> sync(MySyncPolicy(10), RGB_sub, DEPTH_sub);
sync.registerCallback(boost::bind(&callback, _1, _2));
ros::spin();
return 0;
}
and then in callback function I'm doing this...
void callback(const ImageConstPtr& image_rgb, const ImageConstPtr& image_depth)
{
// do something here.
}
When I subscribe to one topic only it works, but when I subscribe through the time synchronizer it fails: the callback is never called, so the synchronization is failing somewhere.
It would be a great help if anyone could solve this issue.
Thanks,
Prasanna
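Two things are worth checking in the snippet above. First, MySyncPolicy(1) passes a queue size of 1 even though the comment says 10; a deeper queue gives the policy room to hold a message until its partner arrives. Second, ApproximateTime only fires when the two stamps are close enough together. This toy pairing model (not the real message_filters algorithm, just an illustration) shows how a stream offset interacts with the matching tolerance:

```python
# Toy approximate-time pairing: match each arriving stamp against the other
# topic's queue if within `slop` seconds. Illustrative only.
def pair_streams(rgb, depth, queue_size, slop):
    qa, qb, pairs = [], [], []
    events = sorted([(t, "rgb") for t in rgb] + [(t, "depth") for t in depth])
    for t, topic in events:
        q_self, q_other = (qa, qb) if topic == "rgb" else (qb, qa)
        match = next((u for u in q_other if abs(u - t) <= slop), None)
        if match is not None:
            q_other.remove(match)
            pairs.append((t, match) if topic == "rgb" else (match, t))
        else:
            q_self.append(t)
            if len(q_self) > queue_size:
                q_self.pop(0)  # queue full: oldest unmatched stamp is dropped
    return pairs

rgb   = [0.00, 0.10, 0.20, 0.30]
depth = [0.05, 0.15, 0.25, 0.35]   # depth stamps offset by 50 ms
print(len(pair_streams(rgb, depth, queue_size=10, slop=0.06)))  # 4: all matched
print(len(pair_streams(rgb, depth, queue_size=10, slop=0.01)))  # 0: never fires
```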
There is a reconfigurable option in the ROS driver to toggle "auto_exposure". However, toggling this option does not change anything on the Astra Mini S (the RGB image's exposure still adapts automatically to light conditions). How can I actually disable auto exposure on the Mini S?
Hi,
I'm trying to test the driver, it looks like the launch files/package are missing. Maybe those were not pushed to the git repo?
Thanks,
Guru
The files astra.launch and astrapro.launch from your README manual in point 3 are missing in the repository.
After calibrating the rgb and ir camera and estimating their relative pose, I wonder how to integrate this information into the camera driver. I found no configuration file for this data and also no TF-listener that would read this information from /tf (or rather /tf_static).
How can I pass this information to the node? Do I have to integrate it into the projection matrix?
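For what it's worth, the usual destination for such a calibrated rgb-to-ir pose is the 3x4 projection matrix P = K [R | t] of the registered stream's sensor_msgs/CameraInfo. A minimal sketch of that composition (the numbers are illustrative, not a real calibration):

```python
# Build P = K [R | t] from intrinsics K and an extrinsic (R, t).
# All matrices are plain row-major lists; values are illustrative.
def projection_matrix(K, R, t):
    Rt = [R[i] + [t[i]] for i in range(3)]  # 3x4 [R | t]
    return [[sum(K[i][k] * Rt[k][j] for k in range(3)) for j in range(4)]
            for i in range(3)]

K = [[570.3, 0.0, 319.5],   # fx, 0, cx (illustrative VGA-ish intrinsics)
     [0.0, 570.3, 239.5],
     [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0],       # identity rotation between the two sensors
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
t = [-0.025, 0.0, 0.0]      # e.g. a 25 mm baseline along x

P = projection_matrix(K, R, t)
# P keeps K in its left 3x3 block and places fx * tx in P[0][3]
```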
I'm getting "fatal error: libudev.h: No such file or directory #include <libudev.h>" when compiling the package. This is not probably an issue but I'm guessing that I'm missing a dependency that should be installed.
I would be grateful if you tell me what are the dependencies that should be installed to get the package work properly.
Thanks in advance.
I followed the instructions in the README and compiled the package successfully:
...
[100%] Built target astra_camera_node
[100%] Built target astra_camera_nodelet
However, no launch files are generated, and there is no astra_launch package in astra_camera.
How do you launch the camera with this package?
When using the Astra Mini, I can see RGB images but no depth. I only see depth for an instant after "Starting color stream" appears. It is accompanied by these warnings:
[ WARN] [1496352532.290814313]: Messages of type 2 arrived out of order (will print only once)
Messages of types 0-2 have been observed. Depth information is publishing, but it is not meaningful or viewable in RViz or image_view, except for those few frames when toggling color.
Hello,
We are using the Orbbec Astra with our TurtleBots, shipped from Clearpath. We get the RGB image using rosrun libuvc_camera camera_node,
but it seems to me that the RGB images are not quite good. You can see the RGB image below. Does anybody know how we can fix this?
Note: We have 4 turtlebots and 4 cameras and we get the same results for all of them.
Setting the rgb_preferred parameter in Astra.cfg to 1 results in an image with the correct size, but the image scrolls. I think this has something to do with the maximum vertical resolution of the camera being 960, but the image is pretty much unusable. I [understand](https://github.com//issues/32) that the intended behaviour would be for a fixed part of the image to contain invalid data.
Furthermore, I did some experimenting with the different colour modes and I cannot say I understand which modes are compatible. Is there some documentation on that?
At the moment only one IR camera stream is available in the driver, but the Stereo S has two IR cameras. Is it possible to enable that as well, or will it be available in the future?
When I launch the astrapro.launch file and do a 'rostopic list', it lists all of the depth, ir, and rgb topics. The depth and ir topics are being published to, and I can view the image feed. However, the rgb topics are all empty. This is the terminal I launch the launch file in:
roslaunch astrapro.launch
... logging to /home/e4e/.ros/log/ccfe7e6a-cdb6-11e8-af2b-9cb6d0dcfaeb/roslaunch-e4e-XPS-15-9560-24406.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.
started roslaunch server http://e4e-XPS-15-9560:42871/
PARAMETERS
NODES
/camera/
camera_nodelet_manager (nodelet/nodelet)
depth_metric (nodelet/nodelet)
depth_metric_rect (nodelet/nodelet)
depth_points (nodelet/nodelet)
depth_rectify_depth (nodelet/nodelet)
depth_registered_sw_metric_rect (nodelet/nodelet)
driver (nodelet/nodelet)
points_xyzrgb_sw_registered (nodelet/nodelet)
register_depth_rgb (nodelet/nodelet)
rgb_debayer (nodelet/nodelet)
rgb_rectify_color (nodelet/nodelet)
rgb_rectify_mono (nodelet/nodelet)
/
camera_base_link (tf/static_transform_publisher)
camera_base_link1 (tf/static_transform_publisher)
camera_base_link2 (tf/static_transform_publisher)
camera_base_link3 (tf/static_transform_publisher)
auto-starting new master
process[master]: started with pid [24416]
ROS_MASTER_URI=http://localhost:11311
setting /run_id to ccfe7e6a-cdb6-11e8-af2b-9cb6d0dcfaeb
process[rosout-1]: started with pid [24429]
started core service [/rosout]
process[camera/camera_nodelet_manager-2]: started with pid [24441]
process[camera/driver-3]: started with pid [24447]
process[camera/rgb_debayer-4]: started with pid [24448]
process[camera/rgb_rectify_mono-5]: started with pid [24456]
process[camera/rgb_rectify_color-6]: started with pid [24469]
process[camera/depth_rectify_depth-7]: started with pid [24475]
process[camera/depth_metric_rect-8]: started with pid [24487]
process[camera/depth_metric-9]: started with pid [24497]
[ INFO] [1539304564.836762739]: Initializing nodelet with 4 worker threads.
process[camera/depth_points-10]: started with pid [24511]
process[camera/register_depth_rgb-11]: started with pid [24528]
process[camera/points_xyzrgb_sw_registered-12]: started with pid [24553]
process[camera/depth_registered_sw_metric_rect-13]: started with pid [24564]
process[camera_base_link-14]: started with pid [24581]
process[camera_base_link1-15]: started with pid [24590]
process[camera_base_link2-16]: started with pid [24602]
process[camera_base_link3-17]: started with pid [24612]
[ INFO] [1539304564.936147051]: Device "2bc5/0403@1/8" found.
^C[camera_base_link3-17] killing on exit
[camera_base_link2-16] killing on exit
[camera_base_link1-15] killing on exit
[camera_base_link-14] killing on exit
[camera/depth_registered_sw_metric_rect-13] killing on exit
[camera/points_xyzrgb_sw_registered-12] killing on exit
[camera/register_depth_rgb-11] killing on exit
[camera/depth_points-10] killing on exit
[camera/depth_metric-9] killing on exit
[camera/depth_metric_rect-8] killing on exit
[camera/depth_rectify_depth-7] killing on exit
[camera/rgb_rectify_color-6] killing on exit
[camera/rgb_rectify_mono-5] killing on exit
[camera/rgb_debayer-4] killing on exit
[camera/driver-3] killing on exit
[camera/camera_nodelet_manager-2] killing on exit
[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done
Do you know what may be causing this?
I used the Orbbec camera with the new rtabmap version 0.17.6 and got an error, but the older rtabmap version was OK. Does the camera driver not fit rtabmap?
[ INFO] [1551257273.507102516]: rtabmap 0.17.6 started...
[ INFO] [1551257273.518923499]: Starting color stream.
[ INFO] [1551257274.112980126]: rtabmap (1): Rate=1.00s, Limit=0.700s, RTAB-Map=0.1196s, Maps update=0.0293s pub=0.0040s (local map=1, WM=1)
[ INFO] [1551257274.229862717]: Using plugin "static_layer"
[ INFO] [1551257274.261771882]: Requesting the map...
[ INFO] [1551257274.481063797]: Resizing costmap to 421 X 421 at 0.050000 m/pix
[ INFO] [1551257274.580789996]: Received a 421 X 421 map at 0.050000 m/pix
[ INFO] [1551257274.606411614]: Using plugin "obstacle_layer"
[ INFO] [1551257274.620584008]: Subscribed to Topics: scan
[ INFO] [1551257274.754579212]: Using plugin "inflation_layer"
[ INFO] [1551257275.014146790]: Loading from pre-hydro parameter style
[ INFO] [1551257275.057684199]: Using plugin "obstacle_layer"
[ INFO] [1551257275.066937460]: Subscribed to Topics: scan
[FATAL] (2019-02-27 16:47:55.101) Link.cpp:82::setInfMatrix() Condition (uIsFinite(infMatrix.at(1,1)) && infMatrix.at(1,1)>0) not met! [Linear information should not be null! Value=inf (set to 1 if unknown).]
terminate called after throwing an instance of 'UException'
what(): [FATAL] (2019-02-27 16:47:55.101) Link.cpp:82::setInfMatrix() Condition (uIsFinite(infMatrix.at(1,1)) && infMatrix.at(1,1)>0) not met! [Linear information should not be null! Value=inf (set to 1 if unknown).]
[ INFO] [1551257275.183047576]: Using plugin "inflation_layer"
There has been no release since 0.1.5, from the fork at tfoote/ros_astra_camera.
When using the driver there is huge noise in the point cloud at the edges of objects.
A video of it can be found here:
https://www.youtube.com/watch?v=p2L59TLhsg4
I guess some interpolation calculation is going wrong.
This was not the case with an older driver I got from the Orbbec website (the link is down now).
I get the following error messages when I try to compile ros_astra_camera :
[ 40%] Performing configure step for 'astra_openni2'
no need to configure
[ 43%] Performing build step for 'astra_openni2'
make[3]: warning: jobserver unavailable: using -j1. Add '+' to parent make rule.
ThirdParty/PSCommon/BuildSystem/CommonDefs.mak:40: HOST_PLATFORM is x64
../../BuildSystem/CommonDefs.mak:40: HOST_PLATFORM is x64
../../ThirdParty/PSCommon/BuildSystem/CommonDefs.mak:40: HOST_PLATFORM is x64
../../ThirdParty/PSCommon/BuildSystem/CommonDefs.mak:40: HOST_PLATFORM is x64
../../../ThirdParty/PSCommon/BuildSystem/CommonDefs.mak:40: HOST_PLATFORM is x64
Formats/XnFormatsMirror.cpp: In function 'XnStatus XnMirrorOneBytePixels(XnUChar*, XnUInt32, XnUInt32)':
Formats/XnFormatsMirror.cpp:46:11: error: array subscript is below array bounds [-Werror=array-bounds]
  XnUInt8* pDestEnd = &pLineBuffer[0] - 1;
Formats/XnFormatsMirror.cpp: In function 'XnStatus XnMirrorTwoBytePixels(XnUChar*, XnUInt32, XnUInt32)':
Formats/XnFormatsMirror.cpp:79:12: error: array subscript is below array bounds [-Werror=array-bounds]
  XnUInt16* pDestEnd = &pLineBuffer[0] - 1;
Formats/XnFormatsMirror.cpp: In function 'XnStatus XnMirrorThreeBytePixels(XnUChar*, XnUInt32, XnUInt32)':
Formats/XnFormatsMirror.cpp:115:11: error: array subscript is below array bounds [-Werror=array-bounds]
  XnUInt8* pDestEnd = &pLineBuffer[0] - 1;
cc1plus: all warnings being treated as errors
../../../ThirdParty/PSCommon/BuildSystem/CommonCppMakefile:141: recipe for target '../../../Bin/Intermediate/x64-Release/liborbbec.so/XnFormatsMirror.o' failed
make[4]: *** [../../../Bin/Intermediate/x64-Release/liborbbec.so/XnFormatsMirror.o] Error 1
Makefile:131: recipe for target 'Source/Drivers/orbbec' failed
make[3]: *** [Source/Drivers/orbbec] Error 2
ros_astra_camera/CMakeFiles/astra_openni2.dir/build.make:110: recipe for target 'ros_astra_camera/astra_openni2/src/astra_openni2-stamp/astra_openni2-build' failed
make[2]: *** [ros_astra_camera/astra_openni2/src/astra_openni2-stamp/astra_openni2-build] Error 2
CMakeFiles/Makefile2:1018: recipe for target 'ros_astra_camera/CMakeFiles/astra_openni2.dir/all' failed
make[1]: *** [ros_astra_camera/CMakeFiles/astra_openni2.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j4 -l4" failed
I have ros_astra_cam and usb_cam with the launch file from the other Astra git repo in my catkin_astra/src directory.
It's the same setup I have on my other Ubuntu system, except that the other one is Mint 18.3 with ROS Kinetic and this one is Mint 19 with ROS Melodic.
I have observed that when running the IR camera in VGA (IR mode 5), the resolution is 640x480. The default camera intrinsics at this resolution seem reasonable to me (the image center is at the center of the image, and the focal length calculation in astra_device looks correct).
When I switch to XVGA (IR mode 1), I observe that the resolution scales (by a factor of 2), plus there is an additional section of pixels (64 pixels high) at the bottom of the IR image containing extra information (i.e., these pixels are also captured by the CMOS sensor, but not included in the VGA image). This suggests that the image is being trimmed at VGA resolution. Do the default camera intrinsics account for this?
If they do account for this, then the default intrinsics for XVGA must be wrong, since the sensor and field of view do not change, yet the height of the image changes asymmetrically! This results in the focal length at XVGA not being twice the focal length at VGA (as it should be, provided that the default intrinsics are correct at VGA resolution).
Clearly, this is a bug in the default intrinsics calculations (for VGA, XVGA, or both).
Could some clarity be provided concerning the camera sensor? Specifically, I need to know the following:
Is the IR stream reporting the vertical field of view of the CMOS sensor, or is it reporting a modified vertical field of view that accounts for the trimmed VGA image?
Is the VGA image trimmed symmetrically? (i.e., the XVGA image indicates that there are at least 32 pixels of information not included at the bottom of the VGA image. Is there also a 32-pixel section removed from the top of the VGA image?) The image center reported in both VGA and XVGA modes indicates this to be the case, but I wanted to verify.
Once I know this information, I can issue a pull request for astra_device to correctly report the resolution at VGA/XVGA
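The consistency check described above can be written down directly: under pure resolution scaling, focal lengths and the principal point scale linearly, and rows appended only at the bottom of the taller image do not move a principal point measured from the top-left corner. (The intrinsic values below are illustrative, not the driver's actual defaults.)

```python
# Scaling intrinsics between resolutions: everything scales linearly.
def scale_intrinsics(fx, fy, cx, cy, factor):
    return fx * factor, fy * factor, cx * factor, cy * factor

fx, fy, cx, cy = 570.3, 570.3, 319.5, 239.5  # illustrative VGA intrinsics
fx2, fy2, cx2, cy2 = scale_intrinsics(fx, fy, cx, cy, 2.0)

# The relation the issue argues must hold between the VGA and 2x modes:
print(fx2 / fx)  # 2.0
# Extra rows at the *bottom* leave cy untouched; rows cropped from the *top*
# would instead shift cy by the number of cropped rows.
```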
Hello!
roslaunch astra_launch astra.launch
gives me the following error :
[ERROR] [1528466499.738426497]: Failed to load nodelet [/camera/driver] of type [astra_camera/AstraDriverNodelet] even after refreshing the cache: Could not find library corresponding to plugin astra_camera/AstraDriverNodelet. Make sure the plugin description XML file has the correct name of the library and that the library actually exists.
Can you help ? I have no idea what to do.
Thank you!
Hi,
Does anyone know how to get the color and depth images published on a single topic so that the images line up in time? As far as I can tell, the color and depth time stamps do not match up. I have turned on all the depth_registration flags. I get this when running the astra launch file:
PARAMETERS
The topic /camera/depth_registered/points has 8 channels per pixel, where the first 3 are the x, y, z coordinates, but the remaining 5 values are always 0 or NaN.
Has anyone managed to get corresponding frames?
I'm sorry, this might be a very novice question for most people, but how do I set up the folders to compile? The README doesn't specify anything; if I just open the terminal and type catkin_make --pkg astra_camera
it obviously won't work.
Do I just place everything inside ~/catkin_ws/src/ros_astra_camera and compile using catkin_make --pkg astra_camera
(or would I put it inside ~/catkin_ws/src/astra_camera? It seems the name of the project is ros_astra_camera, but the command to compile the package uses astra_camera). Do I also need to install any drivers? How?
Thank you
I'm not sure what steps I need to launch the package. The instructions say to issue the command roslaunch astra_launch astra.launch, but I don't see a launch file included with this package. Am I missing something?
When I did "catkin_make --pkg astra_camera", there was a error that "/usr/bin/ld: cannot find -lOpenNI2Orbbec". Could you please let me know how to fix it?
Thank you very much!
Hello everyone,
When I roslaunch astra_launch astra.launch on a TX2, the publishing frequency of /camera/depth/points is lower than 1 Hz, while on a PC it is 29 Hz. I have tried the NoFilter OpenNI version as well as the other one.
Do you have any ideas about it?
Thanks a lot!
Is it possible to adjust the frame rate through the parameters?
Hello,
I have been experiencing a strange issue with my point cloud: it has a ray (sometimes multiple rays) at the edge of the scene. After checking the depth image from the Astra camera, I found incorrect bright pixels, with gradually increasing depth values, at the edge of the depth image.
Does anyone have a clue where it could go wrong?
[ WARN] [1464806618.218487179]: TF2 exception:
Lookup would require extrapolation into the future. Requested time 1464806618.315158598 but the latest data is at time 1464806618.239545601, when looking up transform from frame [camera_depth_optical_frame] to frame [camera_rgb_optical_frame]
This is a static transform, so it should be easy to work around.
The data gets through occasionally.
I successfully finished the first two steps. When I did "roslaunch astra_launch astra.launch", there was a warning in the terminal:
Warning: USB events thread - failed to set priority. This might cause loss of data....
As a result, in RViz, I cannot receive data from the following topics:
/camera/rgb/camera/depth_registered/sw_registered/image_rect
/camera/rgb/camera/depth_registered/sw_registered/image_rect_raw
/camera/rgb/camera/rgb/image_raw
/camera/rgb/camera/rgb/image_rect_color
To create a simple GetSerialRequest client, you have to install the full package (including the OpenNI dependency). It would be better to have a (rather small) msgs package that only contains the service definition (similar to https://github.com/ensenso/ros_driver/tree/master/ensenso_camera_msgs).
There's some documentation here on how to update the firmware: https://3dclub.orbbec3d.com/t/rgb-d-sync-project/265
The package could provide a simple script that flashes the firmware onto the devices intelligently.
It should know about compatibility and refuse to flash incompatible firmware. It would also be great if it could query the firmware version, know which versions are available, and know how to fetch them.
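A sketch of the compatibility gate such a script would need; the model names, version strings, and the table itself are entirely made up for illustration:

```python
# Hypothetical firmware compatibility table and gate; nothing here reflects
# real Orbbec firmware versions.
COMPATIBLE = {
    "astra": {"1.3.0", "1.4.2"},
    "astra_mini_s": {"2.0.1"},
}

def can_flash(device_model, firmware_version):
    """Refuse to flash firmware not listed as compatible with this model."""
    return firmware_version in COMPATIBLE.get(device_model, set())

print(can_flash("astra", "1.4.2"))         # True
print(can_flash("astra", "2.0.1"))         # False: wrong model, refuse
print(can_flash("unknown_model", "1.4.2")) # False: unknown models refuse too
```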
How can I compile the package and use it in ROS 2?
@tfoote I am receiving the following error.
aswin@aswin-laptop:~$ cd camera/
aswin@aswin-laptop:~/camera$ catkin_make
Base path: /home/aswin/camera
Source space: /home/aswin/camera/src
Build space: /home/aswin/camera/build
Devel space: /home/aswin/camera/devel
Install space: /home/aswin/camera/install
####
#### Running command: "make cmake_check_build_system" in "/home/aswin/camera/build"
####
####
#### Running command: "make -j8 -l8" in "/home/aswin/camera/build"
####
[ 3%] [ 7%] Built target astra_camera_gencfg
[ 11%] Built target astra_usb_reset
Performing update step for 'astra_openni2'
[ 11%] Built target _astra_camera_generate_messages_check_deps_GetSerial
[ 22%] [ 22%] [ 25%] Built target astra_camera_generate_messages_lisp
Built target astra_camera_generate_messages_py
Built target astra_camera_generate_messages_cpp
[ 25%] Built target astra_camera_generate_messages
Already on 'orbbec_ros'
Your branch is up-to-date with 'origin/orbbec_ros'.
[ 29%] Performing configure step for 'astra_openni2'
no need to configure
[ 33%] Performing build step for 'astra_openni2'
make[3]: warning: jobserver unavailable: using -j1. Add `+' to parent make rule.
[ 37%] Performing install step for 'astra_openni2'
[ 40%] Completed 'astra_openni2'
[ 51%] Built target astra_openni2
Linking CXX shared library /home/aswin/camera/devel/lib/libastra_wrapper.so
/usr/bin/ld: cannot find -lOpenNI2Orbbec
collect2: error: ld returned 1 exit status
make[2]: *** [/home/aswin/camera/devel/lib/libastra_wrapper.so] Error 1
make[1]: *** [ros_astra_camera/CMakeFiles/astra_wrapper.dir/all] Error 2
make: *** [all] Error 2
Invoking "make -j8 -l8" failed
Hi,
I tried to test the Orbbec Astra camera on a machine running ROS Kinetic / Ubuntu 16.04.
Building failed because astra_camera/AstraConfig.h could not be found (it is referred to in astra_camera/astra_driver.h).
Packages have been installed using Roboware Studio Designer.
On another machine running ROS Indigo, AstraConfig.h is present and working.
Is it possible to use the Astra camera on a machine running ROS Kinetic?
This blocks the catkin_make command whenever no Internet connection is available. To avoid the problem, I deleted the lines in CMakeLists.txt that fetch and download the openni2 dependency from https://github.com/orbbec/OpenNI2.git, so it won't ask for it anymore once the driver has been installed and compiled for the first time.
Instead, a check could be added for whether openni2 is already installed: if it is, there is no need to install it again or even look for it; if it is not, it should be installed.
It would also be more convenient if offline installation were possible by simply copying the package into the workspace, so the openni2 dependency should somehow be bundled with the whole package.
It would be great if you fix this!
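One way the build could be made offline-friendly, sketched as a hypothetical CMakeLists.txt fragment (the variable names and fallback wiring are illustrative, not the package's actual build logic):

```cmake
# Hypothetical sketch: prefer a preinstalled OpenNI2 and only fall back to
# the ExternalProject download when none is found.
find_library(OPENNI2_LIBRARY NAMES OpenNI2 OpenNI2Orbbec)
if(OPENNI2_LIBRARY)
  message(STATUS "Using preinstalled OpenNI2: ${OPENNI2_LIBRARY}")
else()
  message(STATUS "OpenNI2 not found, falling back to network fetch")
  # ... existing ExternalProject_Add(...) download step goes here ...
endif()
```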
Hello,
I have an Orbbec Persee and I want to obtain a RGB point cloud of different objects.
My operating system is Ubuntu 16.04 LTS and I have ROS Kinetic installed.
I installed this driver and the camera publishes both depth, IR and RGB topics fine (I can see them all in rviz).
However, I cannot get the rgb point cloud!
Could you please suggest a solution to my problem?
Thank you.