
gh-icp's Introduction

GH-ICP: Iterative Closest Point algorithm with global optimal matching and hybrid metric

GH-ICP is a robust coarse-to-fine pairwise point cloud registration method.

Its two key innovations over classic ICP are:

  1. Global optimal matching (using a bipartite graph and the Kuhn-Munkres (KM) algorithm)

  2. A hybrid metric (using Euclidean distance and feature distance jointly)

The earlier conference version of GH-ICP is called Iterative Global Similarity Point (IGSP).

To highlight these two key innovations, we renamed IGSP to GH-ICP.

Demo

ETH TLS Dataset


WHU TLS Dataset



Principle


If you find our work useful in your research, please consider citing:

    @INPROCEEDINGS{yue2018igsp,
      author={Pan, Yue and Yang, Bisheng and Liang, Fuxun and Dong, Zhen},
      booktitle={2018 International Conference on 3D Vision (3DV)},
      title={Iterative Global Similarity Points: A robust coarse-to-fine integration solution for pairwise 3D point cloud registration},
      year={2018}
    }

The original version is available on Windows: it compiles with Visual Studio 12 2013 Win64 (Release and Debug both passed); see the former release.

It is now also available on Linux (tested on Ubuntu 16.04).

How to use

  1. Install third-party dependencies

PCL (>= 1.7), LibLAS (optional, for .las data IO)

  2. Compile

mkdir build
cd build
cmake ..
make
  3. Run

cd ..
# edit script/run.sh to configure the data path and key parameters
sh script/run.sh
  4. Parameter configuration
#./script/run.sh

#parameter settings example for large-scale (100 m+) TLS data
using_feature=B;              # Feature selection [ B: BSC, F: FPFH, R: RoPS, N: register without features ]
corres_estimation_method=K;   # Correspondence estimation [ K: bipartite-graph min-weight matching via KM, N: nearest neighbor, R: reciprocal NN ]

downsample_resolution=0.1;    # Voxel size for raw-data downsampling (one point kept per voxel)
neighborhood_radius=0.5;      # Radius for curvature estimation / feature encoding
curvature_non_max_radius=1.5; # Non-maximum-suppression radius for curvature-based keypoint extraction
weight_adjustment_ratio=1.1;  # Weights are adjusted if the IoU between the expected and calculated values exceeds this ratio
weight_adjustment_step=0.1;   # Weight adjustment applied per iteration
registration_dof=6;           # Degrees of freedom of the transformation [ 4: TLS with leveling, 6: arbitrary ]
appro_overlap_ratio=0.6;      # Estimated approximate overlap ratio of the two point clouds

launch_realtime_viewer=1;     # Launch the real-time registration viewer during registration (1: launch, 0: do not launch)

  5. Data preparation

You can test on publicly available point cloud registration datasets such as the WHU TLS Registration Dataset, ETH PRS TLS Registration Dataset, ETH ASL Robotics Registration Dataset, 3DMatch, and the Robotic 3D Scan Repository.

You may use the format conversion tool to get the data ready for registration.

You can also use your own data and edit the data path in the shell file. Four point cloud formats are supported for IO: *.pcd, *.las, *.ply, and *.txt.

#./script/run.sh

#data path
target_point_cloud_path=...
source_point_cloud_path=...
output_point_cloud_path=...

  6. Analysis

Some other well-known automatic registration algorithms are also provided in this repository; you may use them as references for comparison.

Other Reference

If you find the Binary Shape Context (BSC) feature used in this repository useful in your research, please consider citing:

@article{dong2017novel,
  title={A novel binary shape context for 3D local surface description},
  author={Dong, Zhen and Yang, Bisheng and Liu, Yuan and Liang, Fuxun and Li, Bijun and Zang, Yufu},
  journal={ISPRS Journal of Photogrammetry and Remote Sensing},
  volume={130},
  pages={431--452},
  year={2017},
  publisher={Elsevier}
}

gh-icp's People

Contributors

yuepanedward


gh-icp's Issues

Missing files

When I compiled with VS2015, I found that the definition files for many classes are missing, such as CloudBlock, and the Transaction and IMU_data types used in the HDmap_data_import function; all of these missing definitions belong to dataio.

fatal error: liblas/capi/las_version.h: No such file or directory

I tried to compile the code with LibLAS.
I installed LibLAS using the sudo apt install liblas-dev command,
then ran
mkdir build; cd build; cmake ..
without any error:

PCL [OK]
-- Reading /usr/share/cmake/libLAS-1.8.0/liblas-config.cmake
-- libLAS configuration, version
LibLAS is found:
SRC_LIST is /home/maxsense/3rdparty/GH-ICP/src/stereo_binary_feature.cpp/home/xxx/3rdparty/GH-ICP/src/ghicp_reg.cpp/home/xxx/3rdparty/GH-ICP/src/common_reg.cpp/home/xxx/3rdparty/GH-ICP/src/km.cpp
-- Configuring done
-- Generating done
-- Build files have been written to: /home/xxx/3rdparty/GH-ICP/build

make
/usr/include/liblas/version.hpp:45:37: fatal error: liblas/capi/las_version.h: No such file or directory

Can I run GHICP.exe directly? Or how should I use your code?

Hello @YuePanEdward, I am learning how to register and stitch point clouds together.
So can I run GHICP.exe directly? Or how should I use your code?

And when I run GHICP.exe, an error appears:
(error screenshot)
Should I install VTK to solve this problem?

I really need your help. I want to get a complete head model by stitching the multiple point clouds together!
Thanks a lot!!

Compilation error

Scanning dependencies of target ghicp
[ 16%] Building CXX object CMakeFiles/ghicp.dir/test/ghicp_main.cpp.o
[ 33%] Building CXX object CMakeFiles/ghicp.dir/src/common_reg.cpp.o
In file included from /mnt/Shared/Demania/repos/GH-ICP-master/src/common_reg.cpp:8:0:
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:92: error: ‘Vertices’ is not a member of ‘pcl’
     computeApproximateNormals(const pcl::PointCloud<PointT>& cloud, const std::vector<pcl::Vertices>& polygons, pcl::PointCloud<PointNT>& normals)
                                                                                            ^~~~~~~~
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:92: error: ‘Vertices’ is not a member of ‘pcl’
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:100: error: template argument 1 is invalid
     computeApproximateNormals(const pcl::PointCloud<PointT>& cloud, const std::vector<pcl::Vertices>& polygons, pcl::PointCloud<PointNT>& normals)
                                                                                                    ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:100: error: template argument 2 is invalid
In file included from /mnt/Shared/Demania/repos/GH-ICP-master/src/common_reg.cpp:8:0:
/usr/include/pcl-1.8/pcl/features/from_meshes.h: In function ‘void pcl::features::computeApproximateNormals(const pcl::PointCloud<PointT>&, const int&, pcl::PointCloud<PointNT>&)’:
/usr/include/pcl-1.8/pcl/features/from_meshes.h:20:51: error: request for member ‘size’ in ‘polygons’, which is of non-class type ‘const int’
       int nr_polygons = static_cast<int>(polygons.size());
                                                   ^~~~
/usr/include/pcl-1.8/pcl/features/from_meshes.h:35:54: error: invalid types ‘const int[int]’ for array subscript
         const int nr_points_polygon = (int)polygons[i].vertices.size();
                                                      ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:39:58: error: invalid types ‘const int[int]’ for array subscript
         Eigen::Vector3f vec_a_b = cloud.points[polygons[i].vertices[0]].getVector3fMap() - cloud.points[polygons[i].vertices[1]].getVector3fMap();
                                                          ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:39:115: error: invalid types ‘const int[int]’ for array subscript
         Eigen::Vector3f vec_a_b = cloud.points[polygons[i].vertices[0]].getVector3fMap() - cloud.points[polygons[i].vertices[1]].getVector3fMap();
                                                                                                                   ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:40:58: error: invalid types ‘const int[int]’ for array subscript
         Eigen::Vector3f vec_a_c = cloud.points[polygons[i].vertices[0]].getVector3fMap() - cloud.points[polygons[i].vertices[2]].getVector3fMap();
                                                          ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:40:115: error: invalid types ‘const int[int]’ for array subscript
         Eigen::Vector3f vec_a_c = cloud.points[polygons[i].vertices[0]].getVector3fMap() - cloud.points[polygons[i].vertices[2]].getVector3fMap();
                                                                                                                   ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:42:64: error: invalid types ‘const int[int]’ for array subscript
         pcl::flipNormalTowardsViewpoint(cloud.points[polygons[i].vertices[0]], 0.0f, 0.0f, 0.0f, normal(0), normal(1), normal(2));
                                                                ^
/usr/include/pcl-1.8/pcl/features/from_meshes.h:46:36: error: invalid types ‘const int[int]’ for array subscript
           normals.points[polygons[i].vertices[j]].getNormalVector3fMap() += normal;
                                    ^
CMakeFiles/ghicp.dir/build.make:95: recipe for target 'CMakeFiles/ghicp.dir/src/common_reg.cpp.o' failed
make[2]: *** [CMakeFiles/ghicp.dir/src/common_reg.cpp.o] Error 1
CMakeFiles/Makefile2:95: recipe for target 'CMakeFiles/ghicp.dir/all' failed
make[1]: *** [CMakeFiles/ghicp.dir/all] Error 2
Makefile:103: recipe for target 'all' failed
make: *** [all] Error 2

compile error

Hi, I tried to compile the code on Ubuntu 18.04 and got the following error:

/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:92: error: ‘Vertices’ is not a member of ‘pcl’
     computeApproximateNormals(const pcl::PointCloud<PointT>& cloud, const std::vector<pcl::Vertices>& polygons, pcl::PointCloud<PointNT>& normals)
                                                                                            ^~~~~~~~
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:92: error: ‘Vertices’ is not a member of ‘pcl’
/usr/include/pcl-1.8/pcl/features/from_meshes.h:17:100: error: template argument 1 is invalid
     computeApproximateNormals(const pcl::PointCloud<PointT>& cloud, const std::vector<pcl::Vertices>& polygons, pcl::PointCloud<PointNT>& normals)

Can you help me fix this? Thanks.

Compilation

Hi,

I'm trying to compile your project because I'm interested in testing it.
However, there are many errors about templated classes that are not defined as templates, missing includes, ...
Is it possible for you to push up-to-date source code?

Thank you a lot for your help

Compilation error

While compiling, I am getting the error
script/run.sh: line 21: ./bin/ghicp: No such file or directory

My configuration and build are done.

I have used
mkdir build
cd build
cmake ..

cd ..

and sh script/run.sh

If I use the make command after cmake .., then I get make: *** No targets specified and no makefile found.

I have removed LibLAS and am building without it.

PCL [OK]
SRC_LIST is E:/Thesis/Coding/Surface_matching_c++/src/common_reg.cppE:/Thesis/Coding/Surface_matching_c++/src/ghicp_reg.cppE:/Thesis/Coding/Surface_matching_c++/src/km.cppE:/Thesis/Coding/Surface_matching_c++/src/stereo_binary_feature.cpp
-- Configuring done
-- Generating done
-- Build files have been written to: E:/Thesis/Coding/Surface_matching_c++/build


lenovo@DESKTOP-OQJFVM8 MINGW64 /e/Thesis/Coding/Surface_matching_c++
$ sh script/run.sh
script/run.sh: line 21: ./bin/ghicp: No such file or directory

result is not good

I tried to use it. My config is:
using_feature=B; # Feature selection [ B: BSC, F: FPFH, R: RoPS, N: register without feature ]
corres_estimation_method=N; # Correspondence estimation by [ K: Bipartite graph min weight match using KM, N: Nearest Neighbor, R: Reciprocal NN ]

downsample_resolution=0.01; # Raw data downsampling voxel size, just keep one point in the voxel
neighborhood_radius=0.3; # Curvature estimation / feature encoding radius
curvature_non_max_radius=1.0; # Keypoint extraction based on curvature: non max suppression radius
weight_adjustment_ratio=1.1; # Weight would be adjusted if the IoU between expected value and calculated value is beyond this value
weight_adjustment_step=0.1; # Weight adjustment for one iteration
registration_dof=6; # Degree of freedom of the transformation [ 4: TLS with leveling, 6: arbitary ]
appro_overlap_ratio=0.5; # Estimated approximate overlapping ratio of two point cloud

launch_realtime_viewer=1; # Launch the realtime registration viewer during registration or not (1: Launch, 0: Not launch)

the result:
(result screenshots)

Errors

The Linux version produces many template-class errors. Is this project's code incomplete?

wrong result

Hi, this is nice work!
I ran this example using this script:

#data path
target_point_cloud_path=/home/arch/s1.ply;
source_point_cloud_path=/home/arch/s2.ply;
output_point_cloud_path=/home/arch/reg_s2.pcd;

#parameters
using_feature=B; # Feature selection [ B: BSC, F: FPFH, R: RoPS, N: register without feature ]
corres_estimation_method=N; # Correspondence estimation by [ K: Bipartite graph min weight match using KM, N: Nearest Neighbor, R: Reciprocal NN ]

downsample_resolution=0.1; # Raw data downsampling voxel size, just keep one point in the voxel
neighborhood_radius=0.5; # Curvature estimation / feature encoding radius
curvature_non_max_radius=1.0; # Keypoint extraction based on curvature: non max suppression radius
weight_adjustment_ratio=1.1; # Weight would be adjusted if the IoU between expected value and calculated value is beyond this value
weight_adjustment_step=0.1; # Weight adjustment for one iteration
registration_dof=6; # Degree of freedom of the transformation [ 4: TLS with leveling, 6: arbitary ]
appro_overlap_ratio=0.5; # Estimated approximate overlapping ratio of two point cloud

launch_realtime_viewer=1; # Launch the realtime registration viewer during registration or not (1: Launch, 0: Not launch)

#run
./bin/ghicp ${target_point_cloud_path} ${source_point_cloud_path} ${output_point_cloud_path}
${using_feature} ${corres_estimation_method}
${downsample_resolution} ${neighborhood_radius} ${curvature_non_max_radius}
${weight_adjustment_ratio} ${weight_adjustment_step} ${registration_dof} ${appro_overlap_ratio} ${launch_realtime_viewer}

Then I got a wrong result:
Final transformation matrix:
-0.0911522 -0.957878 0.272338 -2.22012
-0.995763 0.091092 -0.0128938 1.41576
-0.0124568 -0.272358 -0.962119 1.1656
0 0 0 1
A pcd file has been exported
Click X(close) to continue...

groundtruth is:
0.9317582977 0.3630790644 0.0002599753 -10.5909217645
-0.3630791504 0.9317581820 0.0004699465 -9.2871497734
-0.0000716063 -0.0005322681 0.9999998558 0.8096312993
0.0000000000 0.0000000000 0.0000000000 1.0000000000
What should I do next?
Thank you!

ASL Data set

I want to use this method on the ASL dataset (Apartment, Stairs, ETH), but I can't get a satisfactory result. Has anyone else tried this data? Can you tell me how the parameters of this method should be set? Thanks.

PCL and Visual Studio version

Well done!
I'm trying to test your code on my platform.
Which PCL and Visual Studio versions did you use?

Thanks
