
patchwork's Introduction

Patchwork

Official page of "Patchwork: Concentric Zone-based Region-wise Ground Segmentation with Ground Likelihood Estimation Using a 3D LiDAR Sensor", which was accepted to RA-L with the IROS'21 option.

IMPORTANT: (Aug. 18th, 2024) I now employ TBB, so the FPS has increased from 50 Hz to 100 Hz! If you want to use the paper version of Patchwork for SOTA comparison purposes, please use this ground segmentation benchmark code.
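
For intuition only, here is a minimal sketch of how TBB can parallelize the independent per-patch plane fits. The names Patch, fit_ground_plane, and segment_all_patches are made up for illustration; the actual parallelization lives in include/patchwork/patchwork.hpp and may differ.

    #include <cstddef>
    #include <vector>
    #include <tbb/parallel_for.h>

    // Hypothetical per-patch bin; the real code keeps points per zone/ring/sector.
    struct Patch { std::vector<float> z_values; float ground_height = 0.f; };

    // Placeholder for the region-wise ground plane fit done inside patchwork.hpp.
    void fit_ground_plane(Patch &p) {
        if (!p.z_values.empty()) p.ground_height = p.z_values.front();
    }

    void segment_all_patches(std::vector<Patch> &patches) {
        // Each patch is fitted independently, so the loop is embarrassingly parallel.
        tbb::parallel_for(std::size_t{0}, patches.size(), [&](std::size_t i) {
            fit_ground_plane(patches[i]);
        });
    }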


Demo: KITTI 00

Demo: Rough Terrain

Figure: Concept of our method (CZM & GLE)

It is an overall updated version of R-GPF from ERASOR [Code] [Paper].
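
To give a rough feel for the Concentric Zone Model (CZM), each point is binned by its range and azimuth into a ring/sector cell of a concentric zone. The following is only an illustrative sketch; the bin-index math and parameter names are assumptions, not the exact code of patchwork.hpp (see also the xy2theta function there).

    #include <cmath>

    struct BinIndex { int ring; int sector; };

    // Illustrative only: assign a point (x, y) to a ring/sector bin of one zone,
    // given that zone's radial extent and its number of rings and sectors.
    BinIndex czm_bin(float x, float y,
                     float min_range, float max_range,
                     int num_rings, int num_sectors) {
        const float r = std::hypot(x, y);                          // range from the sensor
        float theta = std::atan2(y, x);                            // [-pi, pi]
        if (theta < 0.f) theta += 2.f * static_cast<float>(M_PI);  // wrap to [0, 2*pi)

        BinIndex idx;
        idx.ring   = static_cast<int>((r - min_range) / (max_range - min_range) * num_rings);
        idx.sector = static_cast<int>(theta / (2.f * static_cast<float>(M_PI)) * num_sectors);
        // Clamp boundary points so they stay inside the bin arrays.
        if (idx.ring   >= num_rings)   idx.ring   = num_rings - 1;
        if (idx.sector >= num_sectors) idx.sector = num_sectors - 1;
        return idx;
    }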


Characteristics

  • Single hpp file (include/patchwork/patchwork.hpp)

  • Robust ground consistency

As shown in the demo videos, our method shows the most robust performance compared with other state-of-the-art methods; in particular, it exhibits only small perturbations in precision/recall, as shown in this figure.

Please kindly note that the concepts of traversable area and ground are quite different! Please refer to our paper.


Contents

  1. Test Env.
  2. Requirements
  3. How to Run Patchwork
  4. Citation

Test Env.

The code has been tested successfully on:

  • Ubuntu 18.04 LTS
  • ROS Melodic

Requirements

ROS Setting

    1. Install ROS on your machine.
    2. Then, jsk-visualization is required to visualize the Ground Likelihood Estimation status.
(if you use Ubuntu 20.04)
sudo apt-get install ros-noetic-jsk-recognition
sudo apt-get install ros-noetic-jsk-common-msgs
sudo apt-get install ros-noetic-jsk-rviz-plugins
(if you use Ubuntu 18.04)
sudo apt-get install ros-melodic-jsk-recognition
sudo apt-get install ros-melodic-jsk-common-msgs
sudo apt-get install ros-melodic-jsk-rviz-plugins
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/LimHyungTae/patchwork.git
cd .. && catkin build patchwork 

How to Run Patchwork

We provide four examples:

  • How to run Patchwork in SemanticKITTI dataset

    • Offline KITTI dataset
    • Online (ROS Callback) KITTI dataset
  • How to run Patchwork in your own dataset

    • Offline by loading pcd files
    • Online (ROS Callback) using your ROS bag file

Offline KITTI dataset

  1. Download the SemanticKITTI Odometry dataset (we also need the labels since we also open-source the evaluation code!) :)

  2. Set the data_path in launch/offline_kitti.launch for your machine.

The data_path consists of the velodyne folder and the labels folder as follows:

data_path (e.g. 00, 01, ..., or 10)
_____velodyne
     |___000000.bin
     |___000001.bin
     |___000002.bin
     |...
_____labels
     |___000000.label
     |___000001.label
     |___000002.label
     |...
_____...
   
  3. Run the launch file:
roslaunch patchwork offline_kitti.launch

You can directly feel the speed of Patchwork! 😉
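
If you want to inspect the input files yourself: each KITTI velodyne scan is a raw binary stream of float32 (x, y, z, intensity) quadruples, and each SemanticKITTI .label file stores one uint32 per point whose lower 16 bits are the semantic class. A minimal standalone reader (independent of the launch files; the file names are just examples) could look like this:

    #include <cstdint>
    #include <fstream>
    #include <iostream>
    #include <string>
    #include <vector>

    struct PointXYZI { float x, y, z, intensity; };

    // Read one KITTI scan (e.g. 000000.bin): consecutive float32 (x, y, z, intensity).
    std::vector<PointXYZI> read_kitti_bin(const std::string &path) {
        std::ifstream in(path, std::ios::binary);
        std::vector<PointXYZI> points;
        PointXYZI p;
        while (in.read(reinterpret_cast<char *>(&p), sizeof(PointXYZI))) points.push_back(p);
        return points;
    }

    // Read the matching SemanticKITTI label file: one uint32 per point.
    std::vector<uint16_t> read_kitti_labels(const std::string &path) {
        std::ifstream in(path, std::ios::binary);
        std::vector<uint16_t> labels;
        uint32_t raw;
        while (in.read(reinterpret_cast<char *>(&raw), sizeof(raw)))
            labels.push_back(static_cast<uint16_t>(raw & 0xFFFFu));  // lower 16 bits = class id
        return labels;
    }

    int main() {
        const auto points = read_kitti_bin("000000.bin");       // adjust to your data_path
        const auto labels = read_kitti_labels("000000.label");
        std::cout << points.size() << " points, " << labels.size() << " labels\n";
        return 0;
    }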

Online (ROS Callback) KITTI dataset

We also provide a rosbag example. If you run Patchwork via rosbag, please refer to this example.

  1. After building this package, run the roslaunch as follows:
roslaunch patchwork run_patchwork.launch is_kitti:=true

Then, you can see the initialization message in your terminal.

  2. Set the data_path in launch/kitti_publisher.launch for your machine; it is the same parameter as mentioned in the "Offline KITTI dataset" part above.

  3. Then, run the ROS player (please refer to nodes/ros_kitti_publisher.cpp) with the following command in another terminal window:

roslaunch patchwork kitti_publisher.launch

Own dataset using pcd files

Please refer to /nodes/offilne_own_data.cpp.

(Note that your own data may not have ground-truth labels!)

Be sure to set the right parameters. Otherwise, your results may be wrong, as shown below:

With wrong params | After setting right params

For a better understanding of the parameters of Patchwork, please read our wiki page, 4. IMPORTANT: Setting Parameters of Patchwork in Your Own Env..

Offline (Using *.pcd or *.bin file)

  1. Utilize /nodes/offilne_own_data.cpp.

  2. Check the output with the following command and the corresponding files:

  3. Set the appropriate absolute file path, i.e. file_dir, in offline_ouster128.launch.

roslaunch patchwork offline_ouster128.launch

Online (via your ROS bag file)

This is easy to do by reusing run_patchwork.launch.

  1. Remap the topic of the subscriber, e.g., modify the remap line as follows:
<remap from="/patchwork/cloud" to="$YOUR_LIDAR_TOPIC_NAME$"/>

Note that the type of the subscribed data is sensor_msgs::PointCloud2 (a minimal subscriber sketch is given at the end of this section).

  2. Next, launch the roslaunch file as follows:
roslaunch patchwork run_patchwork.launch is_kitti:=false

Note that is_kitti:=false is important because it decides which rviz config is opened. This rviz config shows only the estimated ground and non-ground points, because your own dataset may have no point-wise labels.

  3. Then, play your bag file!
rosbag play $YOUR_BAG_FILE_NAME$.bag
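
For reference, here is a minimal subscriber sketch of the online path described above. The PatchWork<PointT> class comes from include/patchwork/patchwork.hpp; the constructor form and the estimate_ground(...) call below are assumptions based on the offline examples, so please check nodes/offilne_own_data.cpp for the exact API before copying this.

    #include <memory>
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>
    #include <pcl_conversions/pcl_conversions.h>
    #include "patchwork/patchwork.hpp"

    using PointT = pcl::PointXYZ;

    std::unique_ptr<PatchWork<PointT>> estimator;  // class name per patchwork.hpp
    ros::Publisher pub_ground, pub_nonground;

    void callback(const sensor_msgs::PointCloud2::ConstPtr &msg) {
        pcl::PointCloud<PointT> cloud, ground, nonground;
        pcl::fromROSMsg(*msg, cloud);

        double time_taken = 0.0;
        // Assumed API: fills `ground` and `nonground`; verify against the offline node.
        estimator->estimate_ground(cloud, ground, nonground, time_taken);

        sensor_msgs::PointCloud2 ground_msg, nonground_msg;
        pcl::toROSMsg(ground, ground_msg);       ground_msg.header    = msg->header;
        pcl::toROSMsg(nonground, nonground_msg); nonground_msg.header = msg->header;
        pub_ground.publish(ground_msg);
        pub_nonground.publish(nonground_msg);
    }

    int main(int argc, char **argv) {
        ros::init(argc, argv, "patchwork_minimal_node");
        ros::NodeHandle nh;
        estimator.reset(new PatchWork<PointT>(&nh));  // constructor form is an assumption
        pub_ground    = nh.advertise<sensor_msgs::PointCloud2>("/patchwork/ground", 1);
        pub_nonground = nh.advertise<sensor_msgs::PointCloud2>("/patchwork/nonground", 1);
        // Subscribe to the topic you remapped above (default: /patchwork/cloud).
        ros::Subscriber sub = nh.subscribe("/patchwork/cloud", 1, callback);
        ros::spin();
        return 0;
    }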

Citation

If you use our code or method in your work, please consider citing the following:

@article{lim2021patchwork,
title={Patchwork: Concentric Zone-based Region-wise Ground Segmentation with Ground Likelihood Estimation Using a 3D LiDAR Sensor},
author={Lim, Hyungtae and Oh, Minho and Myung, Hyun},
journal={IEEE Robotics and Automation Letters},
year={2021}
}

Description

All explanations of the parameters and other experimental results will be uploaded to the wiki.

Contact

If you have any questions, please let me know:

  • Hyungtae Lim (shapelim at kaist dot ac dot kr)

Updates

NEWS (22.12.24)

  • Merry Christmas Eve XD! include/label_generator is added to generate .label files following the SemanticKITTI format.
  • The .label files can be directly used in the 3DUIS benchmark.
  • Thanks to Lucas Nunes and Xieyuanli Chen for providing code snippets to save a .label file.

NEWS (22.07.25)

  • A Python binding + a more advanced version is now available as Patchwork++, which can serve as a preprocessing step for deep learning users (i.e., Python users can also use our robust ground segmentation)!

NEWS (22.07.13)

  • To make the package more convenient to use, the examples and code have been extensively revised, reflecting issue #12.

NEWS (22.05.22)

  • The meaning of elevation_thresholds has been changed to improve usability. The new meaning is explained in the wiki.
  • A novel height estimator, called the All-Terrain Automatic heighT estimator (ATAT), has been added to the Patchwork code; it auto-calibrates the sensor height using the ground points in the vicinity of the vehicle/mobile robot.
    • Please refer to the function consensus_set_based_height_estimation().
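
For intuition, one plausible reading of a consensus-set height estimate, given candidate heights with per-candidate tolerance ranges and weights, is sketched below. This is only an assumption made for illustration, not the code of consensus_set_based_height_estimation(); please refer to the function itself in patchwork.hpp.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Illustrative sketch: pick the candidate whose "consensus set" (other candidates
    // lying within its tolerance range) has the largest total weight, and return the
    // weighted mean height of that set. Weights are assumed non-negative.
    double consensus_height(const std::vector<double> &values,
                            const std::vector<double> &ranges,
                            const std::vector<double> &weights) {
        double best_weight = -1.0, best_height = 0.0;
        for (std::size_t i = 0; i < values.size(); ++i) {
            double w_sum = 0.0, wv_sum = 0.0;
            for (std::size_t j = 0; j < values.size(); ++j) {
                if (std::fabs(values[j] - values[i]) <= ranges[i]) {  // j agrees with candidate i
                    w_sum  += weights[j];
                    wv_sum += weights[j] * values[j];
                }
            }
            if (w_sum > best_weight) { best_weight = w_sum; best_height = wv_sum / w_sum; }
        }
        return best_height;
    }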

NEWS (21.12.27)

  • The pub_for_legoloam node for the point cloud in the KITTI bag file has been added.

    • ground_estimate.msg has been added.
  • A bug in the xy2theta function has been fixed.

  • How to run

roslaunch patchwork pub_for_legoloam.launch
rosbag play {YOUR_FILE_PATH}/KITTI_BAG/kitti_sequence_00.bag --clock /kitti/velo/pointcloud:=/velodyne_points
  • This README about this LiDAR odometry is still incomplete. It will be updated soon!


patchwork's Issues

Difference between estimated non-ground points and the original KITTI point cloud

I used offline_kitti.launch to generate the classified point cloud; however, when I put the estimated point cloud and the corresponding KITTI odometry velodyne file into CloudCompare, I found that there is an offset between them, as in the attached screenshot.

I only changed pcd_savepath and save_flag in the source code. Did I do something wrong? I tried Patchwork++ and got the same results.

Why is elevation_threshold different for each zone?

Hi,

I read through your paper explaining the patchwork algorithm, but need a little more clarification on why different elevation thresholds are needed for each zone.

If I am understanding the parameter correctly, the elevation thresholds represent the adaptive midpoint function κ(r). Would I be correct to say that this parameter is simply the maximum height that the ground plane can be? In this case, why would the maximum ground height change depending on the distance of each zone?

Thanks!

It seems that the element of normal_ along z should be positive

Thanks for your excellent work. I used your work on a depth camera's point cloud. I found that sometimes the plane's normal vector may point below the ground, which causes wrong ground estimation. I added the line "normal_(2) = fabs(normal_(2));" and then everything was OK. I studied the function "extract_piecewiseground" and think the plane's normal vector should point above the ground. I have also used this algorithm on LiDAR but did not meet this problem.
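
For reference, a slightly safer variant of the fix described above is to flip the whole normal vector instead of taking the absolute value of its z component only, since that keeps it a valid normal of the same fitted plane. A sketch, assuming the normal is an Eigen 3-vector and the plane is stored as n·x + d = 0:

    #include <Eigen/Core>

    // Resolve the sign ambiguity of a fitted plane normal: make it point upward.
    inline void make_normal_point_up(Eigen::Vector3d &normal, double &d) {
        if (normal(2) < 0.0) {
            normal = -normal;
            d = -d;  // keep n·x + d = 0 consistent if the plane offset is stored too
        }
    }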


How to save the map created

Is there any way to save it as a map while it is going through all the pcd files? Right now it just shows the current file as it iterates through them all. I would like to save it as a map.

Probable code error

It seems that line 186 "concentric_idx_tmp++;" should be placed after line 187.

Under- and over-segmentation with Ouster 128

Hello Lim,
I was wondering if you could help me with an issue I'm having with my data: it is over- and under-segmenting. I tried adjusting the parameters but no luck so far. Thanks!

params for 128-beam LiDAR

When I test with 128-beam data, the ground is under-segmented. How should I modify the parameters?
Thanks very much!

How to show pointcloud frame by frame?

Hi, it's wonderful work! I'm a new user of ROS, and I wonder how I can show the result of Patchwork frame by frame, since your work shows the result in a video-like style.

Explanation needed of consensus_set_based_height_estimation() function

Hello @LimHyungTae,
Thank you for making this great work open-source. I have been going through the patchwork.hpp file and am having trouble understanding the consensus_set_based_height_estimation function. Could you explain what the function is doing when you pass the values, ranges, and weights? What is the physical significance of the linearities and planarities vectors and their relation to the ranges and weights?


0th node come

When I play the bag file, the output keeps showing '0th node come':

0th node come
Operating patchwork...
INITIALIZATION COMPLETE
0th node come
Operating patchwork...
[ATAT] The sensor height is auto-calibrated via the ground points in the vicinity of the vehicle
[ATAT] Elevation of the ground w.r.t. the origin is -1.76602 m
0th node come
Operating patchwork...
0th node come
Operating patchwork...
0th node come
Operating patchwork...

Is there a mistake in patchwork.hpp????

Line 439:

if (ground_z_elevation > elevation_thr_[ring_idx + 2 * k]) {

Is the index 'ring_idx + 2 * k' wrong?
The size of elevation_thr_ is 4, so 'ring_idx + 2 * k' may be larger than 4.
Can you explain it?

patchwork.hpp:373

include/patchwork/patchwork.hpp:373: double PatchWork<PointT>::consensus_set_based_height_estimation(const RowVectorXd&, const RowVectorXd&, const RowVectorXd&) [with PointT = pcl::PointXYZ; Eigen::RowVectorXd = Eigen::Matrix<double, 1, -1>]: Assertion `!only_one_element' failed.

Hi, when I tried to run my own data, some trouble occurred at this line.
I found a comment after this line, i.e. "TODO: admit a trivial solution".
Were there any bugs when you debugged it?

The difference between the code and the paper

Thank you for your excellent work. In your paper, if the probability of the Ground Likelihood Estimation is larger than 0.5, then Ĝn belongs to the actual ground. However, I noticed you don't do this step. Besides, the calculation of equations (10) and (11) does not happen. I want to know whether it's just some kind of simplification.

Adaptation for VLP-16

Hello, thanks for your great work. Are there any examples or advice on parameter settings for VLP-16 data?

md5sum mismatch : Connection drop

I am trying to run rosbag_kitti.launch on a rosbag which contains sensor_msgs::PointCloud2 messages. Using Ubuntu 20.04, ROS Noetic.

On running rosbag, I am getting --

[ERROR] [1645128512.247642039]: Client [/ros_kitti_bhooshan_Legion] wants topic /os_cloud_node/points to have 
datatype/md5sum [patchwork/node/8ffdb3dcfd475161209f2ce2c04a5bcc], but our version has 
[sensor_msgs/PointCloud2/1158d486dd51d683ce2f1be655c3c181]. Dropping connection.

According to the internet, there is a mismatch between what the subscriber is asking for and what my rosbag is publishing. The same rosbag works well with pub_for_legoloam.launch. I compared the two files and found that the difference is line 150 in rosbag_kitti.cpp:

    ros::Subscriber NodeSubscriber = nh.subscribe<patchwork::node>("/node", 5000, callbackNode);

and line 90 in pub_for_legoloam.cpp:

    ros::Subscriber NodeSubscriber = nh.subscribe<sensor_msgs::PointCloud2>("/node", 5000, callbackNode);

That's what I think on a first check. Can you help? You can try running any rosbag with PointCloud2 messages with rosbag_kitti.launch. Could it be due to ROS Noetic?

How to make it process online and output non-ground points

Hi~
I want to use it in my SLAM, but after I change the topic in run_patchwork.launch, it warns:

[ INFO] [1683712130.658571863]: Inititalizing PatchWork...
Global thr. is not in use
[ INFO] [1683712130.661795518]: Sensor Height: 1.723000
[ INFO] [1683712130.661803499]: Num of Iteration: 3
[ INFO] [1683712130.661807762]: Num of LPR: 20
[ INFO] [1683712130.661811605]: Num of min. points: 10
[ INFO] [1683712130.661816381]: Seeds Threshold: 0.500000
[ INFO] [1683712130.661820503]: Distance Threshold: 0.125000
[ INFO] [1683712130.661824912]: Max. range:: 80.000000
[ INFO] [1683712130.661830104]: Min. range:: 2.700000
[ INFO] [1683712130.661834419]: Num. rings: 16
[ INFO] [1683712130.661838921]: Num. sectors: 54
[ INFO] [1683712130.661842516]: adaptive_seed_selection_margin: -1.100000
[ INFO] [1683712130.662416103]: Uprightness threshold: 0.707000
[ INFO] [1683712130.662421158]: Elevation thresholds: 0.523000 0.746000 0.879000 1.125000
[ INFO] [1683712130.662425354]: Flatness thresholds: 0.000500 0.000725 0.001000 0.001000
[ INFO] [1683712130.662428703]: Num. zones: 4
INITIALIZATION COMPLETE
80859th node come
Failed to find match for field 'label'.
Failed to find match for field 'id'.
Operating patchwork...
[ros_kitti_lin_15646_1444545934531213044-2] process has died [pid 15663, exit code -11, cmd /home/oem/Projects/water_surface_detect_ws/devel/lib/patchwork/ros_kitti /patchwork/cloud:=/lslidar_point_cloud __name:=ros_kitti_lin_15646_1444545934531213044 __log:=/home/oem/.ros/log/2eda3d38-ef16-11ed-bcab-3108b3958f44/ros_kitti_lin_15646_1444545934531213044-2.log].
log file: /home/oem/.ros/log/2eda3d38-ef16-11ed-bcab-3108b3958f44/ros_kitti_lin_15646_1444545934531213044-2*.log

How could I just use your work to filter my point cloud?

Most Sincere Respect for You~~~

Magic numbers in code

Hi @LimHyungTae,
Thanks for releasing this great work! I found that there are magic numbers in the code that can affect the final extraction result, as follows.

  1. Could you please explain linearity?
    if (ground_z_vec > uprightness_thr_ && linearity < 0.9) {
  2. Should I use a value larger than the sensor's mounting height?
    if (laserCloudIn.points[i].z < -sensor_height_ - 2.0) {
  3. What is the meaning of the size of the point cloud?
    cloud.reserve(1000);
