
treeseg's People

Contributors

apburt, jgrn307, tpet93


treeseg's Issues

PCL incompatibility

Hi,

First of all, I'd like to thank you for developing and sharing your work! I'm very much looking forward to trying treeseg out. Unfortunately, I'm having some issues getting started. I followed the instructions on the main page, but there appears to be an incompatibility with the latest PCL library. When I run cmake, I get this error:

CMake Error at CMakeLists.txt:4 (find_package):
Could not find a configuration file for package "PCL" that is compatible
with requested version "1.9".

The following configuration files were considered but not accepted:

/usr/lib/x86_64-linux-gnu/cmake/pcl/PCLConfig.cmake, version: 1.8.1

Any ideas?

Thank you very much!
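The error message says the system provides PCLConfig.cmake version 1.8.1 while line 4 of CMakeLists.txt requests 1.9. One possible workaround, assuming treeseg's API usage is actually compatible with PCL 1.8 (which would need testing), is to relax the requested version:

```cmake
# CMakeLists.txt, line 4 — request the PCL version the system actually
# provides (1.8.1 here). Whether treeseg builds cleanly against 1.8 is
# an assumption; the alternative is building PCL 1.9 from source.
find_package(PCL 1.8 REQUIRED)
```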

Using the code in Python env

Hi, I'm a beginner Python programmer.

I wonder if it is possible to import this script into a Python environment, where I feel slightly more comfortable.
Thanks

error: 'split' is not a member of 'boost'

Hi guys, I'm trying to build the project and I'm getting an error with the Boost library.
It seems like my version of the Boost library is wrong.
Where can I check the correct version, please?

roberto@roberto-VirtualBox:~/treeseg/build$ make
Consolidate compiler generated dependencies of target treeseg
[ 3%] Building CXX object CMakeFiles/treeseg.dir/src/treeseg.cpp.o
/home/roberto/treeseg/src/treeseg.cpp: In function ‘std::vector<std::__cxx11::basic_string > getFileID(std::string)’:
/home/roberto/treeseg/src/treeseg.cpp:54:16: error: ‘split’ is not a member of ‘boost’
54 | boost::split(tmp1,filename,boost::is_any_of("/"));
| ^~~~~
/home/roberto/treeseg/src/treeseg.cpp:55:16: error: ‘split’ is not a member of ‘boost’
55 | boost::split(tmp2,tmp1[tmp1.size()-1],boost::is_any_of("."));
| ^~~~~
/home/roberto/treeseg/src/treeseg.cpp: In function ‘void readTiles(const std::vector<std::__cxx11::basic_string >&, pcl::PointCloud::Ptr&)’:
/home/roberto/treeseg/src/treeseg.cpp:83:24: error: ‘split’ is not a member of ‘boost’
83 | boost::split(tmp1,filename,boost::is_any_of("/"));
| ^~~~~
/home/roberto/treeseg/src/treeseg.cpp:84:24: error: ‘split’ is not a member of ‘boost’
84 | boost::split(tmp2,tmp1[tmp1.size()-1],boost::is_any_of("."));
| ^~~~~
/home/roberto/treeseg/src/treeseg.cpp: In function ‘int getTilesStartIdx(const std::vector<std::__cxx11::basic_string >&)’:
/home/roberto/treeseg/src/treeseg.cpp:104:24: error: ‘split’ is not a member of ‘boost’
104 | boost::split(tmp1,filename,boost::is_any_of("/"));
| ^~~~~
/home/roberto/treeseg/src/treeseg.cpp:105:24: error: ‘split’ is not a member of ‘boost’
105 | boost::split(tmp2,tmp1[tmp1.size()-1],boost::is_any_of("."));
| ^~~~~
make[2]: *** [CMakeFiles/treeseg.dir/build.make:76: CMakeFiles/treeseg.dir/src/treeseg.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:111: CMakeFiles/treeseg.dir/all] Error 2
make: *** [Makefile:91: all] Error 2


segmentstem "Correcting stem" errors

During our workflow we found an error in segmentstem that can occur during the "Correcting stem:" phase. The error is:
...
----------: cluster_20.pcd
RANSAC cylinder fit: 0.2
Segmenting extended cylinder: cylinder_20.pcd
Segmenting ground returns: cylinder_noground_20.pcd
Euclidean clustering: 0.184679, cylinder_noground_clusters_20.pcd
Region-based segmentation: cylinder_noground_clusters_regions_20.pcd
Correcting stem: stem_20.pcd
----------: cluster_21.pcd
RANSAC cylinder fit: 0.2
Segmenting extended cylinder: cylinder_21.pcd
Segmenting ground returns: cylinder_noground_21.pcd
Euclidean clustering: 0.0493233, cylinder_noground_clusters_21.pcd
Region-based segmentation: cylinder_noground_clusters_regions_21.pcd
Correcting stem: terminate called after throwing an instance of 'pcl::IOException'
what(): : [pcl::PCDWriter::writeBinary] Input point cloud has no data!
Aborted (core dumped)

Note cluster_20 completed fine but 21 failed.

I have included the files to replicate this error as follows:
https://nevada.box.com/s/kfjqnbkzjyox0eygtstamuf4o5a50tfb

To repeat the workflow to get to this stage (note the pcd is already downsampled):
getdemslice 2 3 6 P002_E1_clip_downsample.pcd
findstems 15 0.2 2 P002_E1_clip_coords.dat P002.slice.downsample.pcd
segmentstem 12.5 P002_E1_clip_downsample.pcd cluster_*.pcd

I've played around with other slice/findstem parameters and this error frequently pops up, with different clusters.

How long do your runs normally take with your own data?

I'm running it with the command "./findstems 16 0.2 2 cloud.pcd cloud.pcd".
The size of cloud.pcd is 369.1 kB, about 10,000 points.

The output so far is:

Reading slice: complete
Cluster extraction: cloud.intermediate.slice.clusters.pcd | 1
Region-based segmentation: cloud.intermediate.slice.cluster.regions.pcd | 5
RANSAC cylinder fits:

It has been running for about 5 hours and is still going.
Am I doing something wrong? Or is it slow because I'm running it on a Ryzen 7 5800X CPU?
I'll be really grateful if someone could answer me.

Jayla

writer.write error and fix -- use ASCII output instead of binary

We found that one of the issues we were having was with the writer.write statement. It looks like the binary format invokes mmap, which can be problematic on some systems. By changing the final parameter to "false" (ASCII) instead of "true" (binary) throughout the code, we were able to get some of the steps working properly, e.g.:

writer.write(ss.str(),*cloud,false);

This issue was reported elsewhere on the PCL sites.

Get "Segmentation violation (generated `core')" with comand getdtmslice

Hello,

When I try to get a DTM or a slice, with my own sample files, like this:

:~/treeseg_tutorial/data$ getdtmslice 2 2.5 3 6 L3.tile.downsample.*.pcd > L3.dtm.dat

I get the following message:

Segmentation fault (`core' dumped)

Another question:

With your data, what is the file: NOU11.dtm.dat?

Regards

ulimaps

can't download RiVLIB

Hi,
I couldn't download RiVLib because I don't have a serial number for a RIEGL product to register an account on the Members Area of the RIEGL website. Could anyone please help me with downloading RiVLib? Thanks a lot!

Output file naming...

Minor issue, but could you tweak the output files so they append to the end of the name minus the .pcd? Right now, given a file:
P002_E1_clip_downsample.pcd
getdemslice produces:
P002.slice.downsample.pcd

Swapped if statement in correctStem()

implemented in pull request #18
In treeseg.cpp at line 685:

The if statement is written in a way that it will never use the zstop variable.
This leads to stems being cut off too high.

if (cov > stepcovmax || radchange < radchangemin)
{
	zstop = z - zstep * 1.5;
	broken = true;
	std::cout << " Broke: " << zstop << std::endl;

	break;
}
if (broken == false)
	spatial1DFilter(stem, "z", min[2], zstop, corrected);
else
	spatial1DFilter(stem, "z", min[2], max[2] - zstep, corrected);

The false should be changed to true.


if (cov > stepcovmax || radchange < radchangemin)
{
	zstop = z - zstep * 1.5;
	broken = true;
	std::cout << " Broke: " << zstop << std::endl;

	break;
}
if (broken == true)
	spatial1DFilter(stem, "z", min[2], zstop, corrected);
else
	spatial1DFilter(stem, "z", min[2], max[2] - zstep, corrected);

Regards
Tony

Differentiating between trees and other objects

Hi,

This is more of a question than an issue, but we are exploring the use of this repository for one of our projects. One of our requirements is that we need to consider only trees in a LiDAR point cloud and eliminate all other objects.

I have followed the tutorials to extract individual trees from my own data. However, most other objects (cars, small buildings) seem to be treated the same as trees. I understand that the point of the repo is to segment individual trees in forests, but I was wondering if there is a way to use your code to differentiate between trees and other objects. Thanks!

Can the algorithm be run in ROS Melodic on Ubuntu 18.04? Here are some problems I have met

Hello, dear author!
I am a graduate student and I would like to learn and apply your algorithm in my tree diameter-at-breast-height estimation project. However, I keep encountering installation problems that prevent me from compiling your source code. For example:
'split' is not a member of 'boost'
I would like to know the versions of the dependencies you are using, such as the Boost library. Also, is it OK if some of my packages are newer than your project's dependencies?

Looking forward to your reply.
Much thanks and god bless us.

Unknown numbers

In getcrownvolume.cpp, I cannot understand the numbers in the functions maxheight and maxcrown:
float maxheight(float dbh)
{
//m -> 41.22 * dbh ^ 0.3406
//ci_u -> 42.30 * dbh ^ 0.3697
float height = 42.30 * pow(dbh,0.3697) + 5;
return height;
}

float maxcrown(float dbh)
{
//m -> 29.40 * dbh ^ 0.6524
//ci_u -> 30.36 * dbh ^ 0.6931
float extent = 30.36 * pow(dbh,0.6931) + 5;
return extent;
}

stem limitation - resolved (no limitation)

I get 25 stems with "findstems" but only 10 stems with "segmentstem". Why is there a limitation to only 10 stems ("the inliers of the 10 largest stems output from findstems")?
EDIT: There is no such limitation. The example command has a '?' wildcard and selects only clusters 0-9. Cluster file names are sorted by descending diameter.

rxp2pcd not found

I can go through the tutorial until this:
"rxp2pcd can then be called as:"

running this command:
"rxp2pcd ../data/ NOU11.coords.dat 25 15 NOU11"

I get this:
"rxp2pcd: command not found"

Should rxp2pcd be in the build folder after install? (It's not there)

Creation of coords file

I am using the tutorial workflow, with my own file, and have one issue.

What is the required format of the plot coords file ("...and their centre resides inside or on the bounding box specified by the plot boundaries"), so that I may create one for my own plot?
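Judging from the working example in another issue below, the coords file appears to be a single line of four whitespace-separated plot bounds — xmin xmax ymin ymax in the cloud's units. This is inferred, not documented, so verify it against the rxp2pcd/findstems source:

```
-100 100 -100 100
```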

Can you provide an example.pcd?

Because I failed to download the RIEGL library, I directly used my own pcd data, and the commands I tried were

../downsample 0.04 part_small.pcd
../getdemslice 2 3 0 3 part_small.tile.downsample.part_small.pcd
../findstems 15 0.2 2 nouragesH20_coords.dat part_small.slice.pcd

where part_small.pcd was my data, and in nouragesH20_coords.dat I set

-100 100 -100 100

to allow all points in the region.

But no cylinder was found, so I wonder if there is something wrong with my usage somewhere, or maybe the stems are too short in my data and RANSAC cannot recruit enough points belonging to the cylinders.

So could you please give an example pcd and the corresponding command, so I can try it and learn how to use the tools?

PCD format -- possible solution to our woes...

Ok, at long last we think we figured out what was happening to our analysis. The TLS points were in CA Teale Albers, a projected, metre-based system with VERY large coordinate values, which we think were leading to numerical precision errors.

We were able to properly create a working PCD file using pdal translate via the singularity container you can grab at:
singularity pull shub://gearslaboratory/gears-singularity:gears-pdal

This container is converted from the Docker image msmitherdc/pdal:1.7 (which you could also use). Note that PDAL is a giant pain to compile from scratch.

singularity exec ~/gearslaboratory-gears-singularity-master-gears-pdal.simg pdal translate
--writers.pcd.xyz=true
P002_E1_clip_fixed.las
P002_E1_clip_fixed.pcd
-v 4

There are a couple of keys here that I think allowed this to work. First, .xyz=true is necessary to avoid RGB errors (also, treeseg doesn't use RGB).

Second, an important default for this conversion (https://pdal.io/stages/writers.pcd.html) is the subtract_minimum=true, which calculates the point cloud min x,y,z and simply subtracts that from every point. Our plots are only 30m, so the true range is pretty small, but the actual numbers were gigantic.

This seems to work great. As a bonus, you can skip "downsample" and use this command to both convert and voxelize:

singularity exec ~/gearslaboratory-gears-singularity-master-gears-pdal.simg pdal translate
--writers.pcd.xyz=true
-i P002_E1_clip_fixed.las
-o P002_E1_clip_fixed_voxel04.pcd
voxelgrid
--filters.voxelgrid.leaf_x="0.04"
--filters.voxelgrid.leaf_y="0.04"
--filters.voxelgrid.leaf_z="0.04"

This is identical to downsample's voxel algorithm (both use PCL).

So, in the short term I think we have it working via PDAL translate. It might be worth some tweaks down the line to your rxp2pcd and txt2pcd for these weird projections though!

Enhancement: Faster buildTree()

In the current workflow the buildTree command seems to be one of the most time consuming steps.

This is mostly due to the large number of point cloud operations performed while comparing each set of outer clusters to every other unused cluster.
This can result in redoing the same calculations several thousand times.

I have re-ordered the steps inside buildTree to optimize the speed at which it runs.

My testing showed a reduction in processing time (for the buildTree function) from 430 seconds to 28 seconds for one particular volume and 13558s to 1135s on another.

The main changes include:

  • Transforming each cluster and calculating its parameters (length, centroid, etc.) once in a single loop instead of in the iterative double for-loop.
  • Checking whether the clusterlength is less than outerlength before checking min distance between clouds (the most intensive step).
  • Checking the bounding boxes are closer than the min distance, before checking min distance between clouds (the most intensive step).

New buildTree function:

void buildTree_n(std::vector<pcl::PointCloud<PointTreeseg>::Ptr> &clusters, pcl::PointCloud<PointTreeseg>::Ptr &tree)
{
	pcl::PointCloud<PointTreeseg>::Ptr tmpcloud(new pcl::PointCloud<PointTreeseg>);
	for (int a = 0; a < clusters.size(); a++)
		*tmpcloud += *clusters[a];
	std::vector<std::vector<float>> nndata = dNNz(tmpcloud, 50, 2); //careful here //list of nn distances with the corresponding z heights at a step size of 2
	std::cout << "Done DNNz" << std::endl;

	std::vector<float> clusterlengths;			   // all cluster lengths
	std::vector<Eigen::Vector4f> clustervectors;   // all cluster vectors
	std::vector<Eigen::Vector4f> clustercentroids; // all cluster centroids

	std::vector<Eigen::Vector4f> clustermins; // all cluster minima
	std::vector<Eigen::Vector4f> clustermaxs; // all cluster maxima

	std::vector<int> clusteridxs; // indices of unused clusters

	for (int i = 0; i < clusters.size(); i++) // calculate all cluster variables first rather than in iterative steps below
	{
		Eigen::Vector4f clustercentroid;
		Eigen::Matrix3f clustercovariancematrix;
		Eigen::Matrix3f clustereigenvectors;
		Eigen::Vector3f clustereigenvalues;
		pcl::PointCloud<PointTreeseg>::Ptr clustertransformed(new pcl::PointCloud<PointTreeseg>);
		Eigen::Vector4f clustermin, clustermax;
		float clusterlength;
		computePCA(clusters[i], clustercentroid, clustercovariancematrix, clustereigenvectors, clustereigenvalues);
		Eigen::Vector3f clusterpoint(clustercentroid[0], clustercentroid[1], clustercentroid[2]);
		Eigen::Vector3f clusterdirection(clustereigenvectors(0, 2), clustereigenvectors(1, 2), clustereigenvectors(2, 2));
		Eigen::Affine3f clustertransform;
		Eigen::Vector3f clusterworld(0, clusterdirection[2], -clusterdirection[1]);
		clusterdirection.normalize();
		pcl::getTransformationFromTwoUnitVectorsAndOrigin(clusterworld, clusterdirection, clusterpoint, clustertransform);
		pcl::transformPointCloud(*clusters[i], *clustertransformed, clustertransform);
		pcl::getMinMax3D(*clustertransformed, clustermin, clustermax);
		clusterlength = clustermax[2] - clustermin[2];

		pcl::getMinMax3D(*clusters[i], clustermin, clustermax); //get min max of untransformed

		Eigen::Vector4f clustervector(clustereigenvectors(0, 2), clustereigenvectors(1, 2), clustereigenvectors(2, 2), 0);

		clusterlengths.push_back(clusterlength);
		clustervectors.push_back(clustervector);
		clustercentroids.push_back(clustercentroid);
		clustermins.push_back(clustermin);
		clustermaxs.push_back(clustermax);

		clusteridxs.push_back(i);
	}

	std::cout << "Done transforming clusters" << std::endl;

	int idx = findPrincipalCloudIdx(clusters);
	std::vector<pcl::PointCloud<PointTreeseg>::Ptr> treeclusters;
	std::vector<int> outeridxs;// indexes of clusters in outer edge of tree

	treeclusters.push_back(clusters[idx]);
	outeridxs.push_back(idx);
	clusteridxs.erase(clusteridxs.begin() + idx); // remove primary cluster

	int count = 0;
	bool donesomething = true;
	while (donesomething == true)
	{
		std::vector<int> tmpidxs;

		for (int i = 0; i < outeridxs.size(); i++)
		{
			std::vector<int> member;
			for (int j = 0; j < clusteridxs.size(); j++)
			{
				if (clusterlengths[clusteridxs[j]] < clusterlengths[outeridxs[i]]) // compare cluster length along first PCA
				{
					float mind = interpolatedNNZ((clustercentroids[clusteridxs[j]][2] + clustercentroids[outeridxs[i]][2]) / 2, nndata, true);

					// perform an intersecting3dbbox check with an extra buffer distance of mind
					if (clustermins[outeridxs[i]][0] <= (clustermaxs[clusteridxs[j]][0] + mind) && (clustermaxs[outeridxs[i]][0] + mind) >= clustermins[clusteridxs[j]][0] &&
						clustermins[outeridxs[i]][1] <= (clustermaxs[clusteridxs[j]][1] + mind) && (clustermaxs[outeridxs[i]][1] + mind) >= clustermins[clusteridxs[j]][1] &&
						clustermins[outeridxs[i]][2] <= (clustermaxs[clusteridxs[j]][2] + mind) && (clustermaxs[outeridxs[i]][2] + mind) >= clustermins[clusteridxs[j]][2]) 
					{
						// if bounding boxes are closer than mind then do min distance check
						float d;
						if (clusters[outeridxs[i]]->points.size() >= clusters[clusteridxs[j]]->points.size())
							d = minDistBetweenClouds(clusters[outeridxs[i]], clusters[clusteridxs[j]]); // this takes the most time so we want to minimize this check by doing it last
						else
							d = minDistBetweenClouds(clusters[clusteridxs[j]], clusters[outeridxs[i]]);
						if (d <= mind)
						{
							member.push_back(j); // save j for adding
						}
					}
				}
			}

			std::sort(member.begin(), member.end(), std::greater<int>());
			for (int k = 0; k < member.size(); k++)
			{
				tmpidxs.push_back(clusteridxs[member[k]]);			// add the cluster idxs referenced by j
				clusteridxs.erase(clusteridxs.begin() + member[k]); // remove the clusters referenced by j from the valid idxs
			}
		}
		if (tmpidxs.size() != 0)
		{
			outeridxs.clear();
			for (int m = 0; m < tmpidxs.size(); m++)
			{
				treeclusters.push_back(clusters[tmpidxs[m]]); // add to tree
				outeridxs.push_back(tmpidxs[m]);			  // get next set of outers
			}
		}
		else
			donesomething = false;
		count++;
		std::cout << "." << std::flush;
	}

	for (int n = 0; n < treeclusters.size(); n++)
		*tree += *treeclusters[n];
}

Error occurred when running "make" command

Hi,

I followed the README instructions and everything worked fine until I ran "make", when an error occurred:

[ 21%] Linking CXX executable sepwoodleaf
/usr/bin/ld: warning: libhdf5.so.101, needed by /usr/lib/x86_64-linux-gnu/libarmadillo.so, not found (try using -rpath or -rpath-link)
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dread'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Eset_auto2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Fcreate'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_UCHAR_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Tinsert'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Tequal'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5check_version'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dget_space'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Tclose'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Fis_hdf5'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dopen2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dget_type'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5open'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_FLOAT_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Gopen2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Ovisit'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_USHORT_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Eget_auto2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_ULONG_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Screate_simple'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Sget_simple_extent_ndims'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_SCHAR_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_UINT_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Tcreate'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_LONG_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dwrite'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Fclose'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Gclose'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Lexists'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Gcreate2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_INT_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Tcopy'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_ULLONG_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Sclose'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_LLONG_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_SHORT_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5T_NATIVE_DOUBLE_g'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dclose'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Ldelete'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Sget_simple_extent_dims'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Dcreate2'
/usr/lib/x86_64-linux-gnu/libarmadillo.so: undefined reference to `H5Fopen'
collect2: error: ld returned 1 exit status
CMakeFiles/sepwoodleaf.dir/build.make:197: recipe for target 'sepwoodleaf' failed
make[2]: *** [sepwoodleaf] Error 1
CMakeFiles/Makefile2:102: recipe for target 'CMakeFiles/sepwoodleaf.dir/all' failed
make[1]: *** [CMakeFiles/sepwoodleaf.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2

I googled this error and found the suggestion that I should install the hdf5 package, but it doesn't help. What else could the problem be? Thanks a lot.

findstems RANSAC cylinder fits error

When running findstems after successfully running getdtmslice with a custom point cloud data set, I got this error:

RANSAC cylinder fits: libc++abi.dylib: terminating with uncaught exception of type pcl::IOException: : [pcl::PCDWriter::writeBinary] Input point cloud has no data!

Before crashing, findstems generates the <filename>.intermediate.slice.clusters.pcd and <filename>.intermediate.slice.clusters.regions.pcd files, and they look fine; it does find the stems, but it does not generate the corresponding <filename>..cluster.*.pcd files needed for the next step.

Process stopped issue

The process stops while loading xyz data in xyz2pcd.
Can I process xyz data directly?

Need for help with starting analysis

Hello,

I installed treeseg on Ubuntu and am trying to use it on a plot-based TLS point cloud. I have single-return LAS files for a one-hectare plot with 5 cm point spacing. The cloud was classified into ground and non-ground points and georeferenced. I converted it into a PCD file with CloudCompare to be able to use it in treeseg. I need help with starting the segmentation, please. The volume of my data is 324 MB.
findstems 15 0.2 2 /Documents/9a-4.pcd returns the error: Reading slice: terminate called after throwing an instance of std::logic_error Aborted (core dumped)

segmentcrown [14-16] Documents/19a-4.pcd gives: Reading volume cloud: Segmentation fault (core dumped)
I clipped a small part of the point cloud, which is only 31 MB, and got:
Reading volume cloud: [pcl::KdTreeFLANN::read] Could not find file ... complete Euclidean clustering: 0, [pcl::KdTreeFLANN::setInputCloud] Cannot create a KDTree with an empty input cloud! terminate called after throwing an instance of pcl::IOException what(): : [pcl::PCDWriter::writeBinary] Input point cloud has no data

Could someone please help with starting?

Thanks so much in advance.

GetDemSlice error: segmentation fault

Hi,

I'm trying to follow the example in the README with my own TLS data, but I ran into an issue with the getdemslice function. I've managed to downsample the input cloud with the downsample function, but when I try to apply getdemslice to its output, I get the following error: Segmentation fault.

Also, when I try to run the getdemslice function on the original point cloud (in .pcd), I get the error "[pcl::PCDWriter::writeBinary] Input point cloud has no data!" even though the file is valid (I'm able to visualize it in CloudCompare and R).

I know it's a bit vague, but have you run into this sort of issue before? I thought that it might be an incompatibility with the PCL library, so I downgraded to libpcl 1.9, but it didn't solve the issue. I also tried modifying the treeseg_pointtype.h file, but it didn't work.

Any ideas?
Thank you very much!

Trouble installing.

Hello!

Everything builds correctly (seemingly), but I just get "command not found" when trying the different commands.
Where can I find the relevant arguments for each command? I can run the executables directly and they work, but I don't know the arguments for stem segmentation (documentation not available), so I'm stuck on this part.

Thank you.

Small tweak to get working on Ubuntu

I was able to get this working on Ubuntu, but there is an issue noted at:
PointCloudLibrary/pcl#1594

I was able to modify your instructions for RHEL to get treeseg working as follows:

apt-get install -y git cmake
apt-get install -y libpcl-dev
apt-get install -y libproj-dev # Note that proj4 appears to be a requirement of treeseg
ln -s /usr/lib/x86_64-linux-gnu/libvtkCommonCore-6.2.so /usr/lib/libvtkproj4.so # this is the tweak
git clone https://github.com/apburt/treeseg.git;
cd treeseg;
mkdir build;
cd build;
cmake ../src;
make;

In a future release, could you consider searching for libvtkproj4.so as listed above? Otherwise the symlink trick will work fine. Cheers!

how to test my data

I have a PCD-format file and want to test it, but the program getdemslice fails to run. I want to know what the four parameters mean. How do I test my data?

Rxp2pcd generating corrupt PCDs

Running rxp2pcd on Red Hat Linux I was getting corrupt PCD errors:
[pcl::PCDReader::read] Corrupted PCD file. The file is smaller than expected!
I believe I found the cause of the error at line 82:

int tile_pointcount[tile_count];

Does not always initialize the point count for each tile to zero,
and may overflow with many points per tile.

This causes a mismatch between the reported point-cloud size in the header and the size of the binary data.

It should be replaced with:

unsigned long int tile_pointcount[tile_count] = { 0 };

rxp2pcd not building on Ubuntu

I'm using the latest rivlib (rivlib-2_5_4-x86_64-linux-gcc55) and have dropped it into the right place (I think) but I'm getting an error building rxp2pcd on Ubuntu. Any ideas?

Here's the install + errors:

Singularity gears-singularity.tls.img:~> cd ~
Singularity gears-singularity.tls.img:~> git clone https://github.com/apburt/treeseg.git;
Cloning into 'treeseg'...
remote: Counting objects: 51, done.
remote: Total 51 (delta 0), reused 0 (delta 0), pack-reused 51
Unpacking objects: 100% (51/51), done.
Checking connectivity... done.

Singularity gears-singularity.tls.img:~> # Copy riegl files in:
Singularity gears-singularity.tls.img:~> cp -r ~/rivlib-2_5_4-x86_64-linux-gcc55/include/riegl ~/treeseg/include/riegl
Singularity gears-singularity.tls.img:~> cp -r ~/rivlib-2_5_4-x86_64-linux-gcc55/lib/ ~/treeseg/lib
Singularity gears-singularity.tls.img:~> # Now build treeseg:
Singularity gears-singularity.tls.img:~> cd ~
Singularity gears-singularity.tls.img:~> cd treeseg;
Singularity gears-singularity.tls.img:~/treeseg> mkdir build;
Singularity gears-singularity.tls.img:~/treeseg> cd build;
Singularity gears-singularity.tls.img:~/treeseg/build> cmake ../src;
-- The C compiler identification is GNU 5.3.1
-- The CXX compiler identification is GNU 5.3.1
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Checking for module 'eigen3'
-- Found eigen3, version 3.2.92
-- Found eigen: /usr/include/eigen3
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE
-- Boost version: 1.58.0
-- Found the following Boost libraries:
-- system
-- filesystem
-- thread
-- date_time
-- iostreams
-- serialization
-- chrono
-- atomic
-- regex
-- Checking for module 'flann'
-- Found flann, version 1.8.4
-- Found Flann: /usr/lib/x86_64-linux-gnu/libflann_cpp_s.a
-- Checking for module 'libopenni'
-- Found libopenni, version 1.5.4.0
-- Found openni: /usr/lib/libOpenNI.so
-- Checking for module 'libopenni2'
-- Found libopenni2, version 2.2.0.3
-- Found OpenNI2: /usr/lib/libOpenNI2.so
** WARNING ** io features related to pcap will be disabled
** WARNING ** io features related to png will be disabled
-- The imported target "vtkRenderingPythonTkWidgets" references the file
"/usr/lib/x86_64-linux-gnu/libvtkRenderingPythonTkWidgets.so"
but this file does not exist. Possible reasons include:

  • The file was deleted, renamed, or moved to another location.
  • An install or uninstall procedure did not complete successfully.
  • The installation package was faulty and contained
    "/usr/lib/cmake/vtk-6.2/VTKTargets.cmake"
    but not all the files it references.

-- Found libusb-1.0: /usr/include
-- Found qhull: /usr/lib/x86_64-linux-gnu/libqhull.so
-- looking for PCL_COMMON
-- Found PCL_COMMON: /usr/lib/x86_64-linux-gnu/libpcl_common.so
-- looking for PCL_KDTREE
-- Found PCL_KDTREE: /usr/lib/x86_64-linux-gnu/libpcl_kdtree.so
-- looking for PCL_OCTREE
-- Found PCL_OCTREE: /usr/lib/x86_64-linux-gnu/libpcl_octree.so
-- looking for PCL_SEARCH
-- Found PCL_SEARCH: /usr/lib/x86_64-linux-gnu/libpcl_search.so
-- looking for PCL_IO
-- Found PCL_IO: /usr/lib/x86_64-linux-gnu/libpcl_io.so
-- looking for PCL_SAMPLE_CONSENSUS
-- Found PCL_SAMPLE_CONSENSUS: /usr/lib/x86_64-linux-gnu/libpcl_sample_consensus.so
-- looking for PCL_FILTERS
-- Found PCL_FILTERS: /usr/lib/x86_64-linux-gnu/libpcl_filters.so
-- looking for PCL_GEOMETRY
-- Found PCL_GEOMETRY: /usr/include/pcl-1.7
-- looking for PCL_FEATURES
-- Found PCL_FEATURES: /usr/lib/x86_64-linux-gnu/libpcl_features.so
-- looking for PCL_SEGMENTATION
-- Found PCL_SEGMENTATION: /usr/lib/x86_64-linux-gnu/libpcl_segmentation.so
-- looking for PCL_SURFACE
-- Found PCL_SURFACE: /usr/lib/x86_64-linux-gnu/libpcl_surface.so
-- looking for PCL_REGISTRATION
-- Found PCL_REGISTRATION: /usr/lib/x86_64-linux-gnu/libpcl_registration.so
-- looking for PCL_RECOGNITION
-- Found PCL_RECOGNITION: /usr/lib/x86_64-linux-gnu/libpcl_recognition.so
-- looking for PCL_KEYPOINTS
-- Found PCL_KEYPOINTS: /usr/lib/x86_64-linux-gnu/libpcl_keypoints.so
-- looking for PCL_VISUALIZATION
-- Found PCL_VISUALIZATION: /usr/lib/x86_64-linux-gnu/libpcl_visualization.so
-- looking for PCL_PEOPLE
-- Found PCL_PEOPLE: /usr/lib/x86_64-linux-gnu/libpcl_people.so
-- looking for PCL_OUTOFCORE
-- Found PCL_OUTOFCORE: /usr/lib/x86_64-linux-gnu/libpcl_outofcore.so
-- looking for PCL_TRACKING
-- Found PCL_TRACKING: /usr/lib/x86_64-linux-gnu/libpcl_tracking.so
-- looking for PCL_APPS
-- Could NOT find PCL_APPS (missing: PCL_APPS_LIBRARY)
-- looking for PCL_MODELER
-- Found PCL_MODELER: /usr/include/pcl-1.7
-- looking for PCL_IN_HAND_SCANNER
-- Found PCL_IN_HAND_SCANNER: /usr/include/pcl-1.7
-- looking for PCL_POINT_CLOUD_EDITOR
-- Found PCL_POINT_CLOUD_EDITOR: /usr/include/pcl-1.7
-- Found PCL: /usr/lib/x86_64-linux-gnu/libboost_system.so;/usr/lib/x86_64-linux-gnu/libboost_filesystem.so;/usr/lib/x86_64-linux-gnu/libboost_thread.so;/usr/lib/x86_64-linux-gnu/libboost_date_time.so;/usr/lib/x86_64-linux-gnu/libboost_iostreams.so;/usr/lib/x86_64-linux-gnu/libboost_serialization.so;/usr/lib/x86_64-linux-gnu/libboost_chrono.so;/usr/lib/x86_64-linux-gnu/libboost_atomic.so;/usr/lib/x86_64-linux-gnu/libboost_regex.so;/usr/lib/x86_64-linux-gnu/libpthread.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_common.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_common.so;optimized;/usr/lib/x86_64-linux-gnu/libflann_cpp_s.a;debug;/usr/lib/x86_64-linux-gnu/libflann_cpp_s.a;optimized;/usr/lib/x86_64-linux-gnu/libpcl_kdtree.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_kdtree.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_octree.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_octree.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_search.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_search.so;/usr/lib/libOpenNI.so;/usr/lib/libOpenNI2.so;vtkImagingStencil;vtkCommonComputationalGeometry;vtkCommonDataModel;vtkCommonMath;vtkCommonCore;vtksys;vtkCommonMisc;vtkCommonSystem;vtkCommonTransforms;vtkImagingCore;vtkCommonExecutionModel;vtkFiltersAMR;vtkFiltersGeneral;vtkFiltersCore;vtkParallelCore;vtkIOLegacy;vtkIOCore;/usr/lib/x86_64-linux-gnu/libz.so;vtkInteractionWidgets;vtkFiltersHybrid;vtkImagingSources;vtkRenderingCore;vtkCommonColor;vtkFiltersExtraction;vtkFiltersStatistics;vtkImagingFourier;vtkalglib;vtkFiltersGeometry;vtkFiltersSources;vtkFiltersModeling;vtkImagingGeneral;vtkImagingHybrid;vtkIOImage;vtkDICOMParser;vtkmetaio;/usr/lib/x86_64-linux-gnu/libjpeg.so;/usr/lib/x86_64-linux-gnu/libpng.so;/usr/lib/x86_64-linux-gnu/libtiff.so;vtkInteractionStyle;vtkRenderingAnnotation;vtkImagingColor;vtkRenderingFreeType;/usr/lib/x86_64-linux-gnu/libfreetype.so;vtkftgl;vtkRenderingVolume;vtkIOParallelNetCDF;vtkParallelMPI;/usr/lib/x86_64-linux-gnu/libnetcdf_c++.so;/usr/lib/x86_64-linux-gnu/libnetcdf.s
o;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so;/usr/lib/x86_64-linux-gnu/libsz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5_hl.so;vtkRenderingOpenGL;vtkIOLSDyna;vtkIOXML;vtkIOGeometry;/usr/lib/x86_64-linux-gnu/libjsoncpp.so;vtkIOXMLParser;/usr/lib/x86_64-linux-gnu/libexpat.so;vtkLocalExample;vtkInfovisCore;vtkGeovisCore;vtkInfovisLayout;vtkViewsCore;vtkproj4;/usr/lib/x86_64-linux-gnu/libpython2.7.so;vtkTestingGenericBridge;/usr/lib/libgl2ps.so;verdict;vtkIOMovie;/usr/lib/x86_64-linux-gnu/libtheoraenc.so;/usr/lib/x86_64-linux-gnu/libtheoradec.so;/usr/lib/x86_64-linux-gnu/libogg.so;vtkFiltersImaging;vtkIOMINC;vtkRenderingLOD;vtkViewsQt;vtkGUISupportQt;vtkViewsInfovis;vtkChartsCore;vtkRenderingContext2D;vtkRenderingLabel;vtkRenderingImage;vtkFiltersFlowPaths;vtkxdmf2;/usr/lib/x86_64-linux-gnu/libxml2.so;vtkFiltersReebGraph;vtkViewsContext2D;vtkIOXdmf2;vtkIOAMR;vtkRenderingContextOpenGL;vtkImagingStatistics;vtkIOParallel;vtkFiltersParallel;vtkIONetCDF;vtkexoIIc;vtkGUISupportQtOpenGL;vtkIOParallelLSDyna;vtkFiltersParallelGeometry;vtkGUISupportQtWebkit;vtkIOPLY;vtkWrappingTools;vtkFiltersHyperTree;vtkRenderingVolumeOpenGL;vtkIOExodus;vtkIOPostgreSQL;vtkIOSQL;sqlite3;vtkWrappingJava;vtkFiltersParallelFlowPaths;vtkFiltersParallelStatistics;vtkFiltersProgrammable;vtkFiltersParallelImaging;vtkRenderingParallelLIC;vtkRenderingLIC;vtkInteractionImage;vtkFiltersPython;vtkWrappingPythonCore;vtkIOParallelExodus;vtkFiltersGeneric;vtkIOVideo;vtkRenderingQt;vtkFiltersTexture;vtkIOInfovis;vtkGUISupportQtSQL;vtkRenderingFreeTypeOpenGL;vtkInfovisBoostGraphAlgorithms;vtkRenderingGL2PS;vtkIOGeoJSON;vtkFiltersVerdict;vtkViewsGeovis;vtkIOImport;vtkTestingIOSQL;vtkPythonInterpreter;vtkIOODBC;vtkIOEnSight;vtkIOMySQL;vtkRenderingMatplotlib;vtkDomainsChemistry;vtkIOExport;vtkFiltersParallelMPI;vtkIOParallelXML;vtkTestingRendering;vtkIOMPIParallel;vtkParallelMPI4Py;vtkFiltersSMP;vtkFiltersSelection;vtkIOVPIC;VPI
C;vtkImagingMath;vtkImagingMorphological;vtkRenderingParallel;vtkRenderingFreeTypeFontConfig;vtkIOFFMPEG;vtkIOMPIImage;vtkIOGDAL;optimized;/usr/lib/x86_64-linux-gnu/libpcl_io.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_io.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_sample_consensus.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_sample_consensus.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_filters.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_filters.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_features.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_features.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_segmentation.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_segmentation.so;optimized;/usr/lib/x86_64-linux-gnu/libqhull.so;debug;/usr/lib/x86_64-linux-gnu/libqhull.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_surface.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_surface.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_registration.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_registration.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_recognition.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_recognition.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_keypoints.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_keypoints.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_visualization.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_visualization.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_people.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_people.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_outofcore.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_outofcore.so;optimized;/usr/lib/x86_64-linux-gnu/libpcl_tracking.so;debug;/usr/lib/x86_64-linux-gnu/libpcl_tracking.so;/usr/lib/x86_64-linux-gnu/libboost_system.so;/usr/lib/x86_64-linux-gnu/libboost_filesystem.so;/usr/lib/x86_64-linux-gnu/libboost_thread.so;/usr/lib/x86_64-linux-gnu/libboost_date_time.so;/usr/lib/x86_64-linux-gnu/libboost_iostreams.so;/usr/lib/x86_64-linux-gnu/libboost_serialization.so;/usr/lib/x86_64-linux-gnu/libboost_chrono.so;/usr/lib/x86_64-linux-gnu/libboost_at
omic.so;/usr/lib/x86_64-linux-gnu/libboost_regex.so;/usr/lib/x86_64-linux-gnu/libpthread.so;optimized;/usr/lib/x86_64-linux-gnu/libqhull.so;debug;/usr/lib/x86_64-linux-gnu/libqhull.so;/usr/lib/libOpenNI.so;/usr/lib/libOpenNI2.so;optimized;/usr/lib/x86_64-linux-gnu/libflann_cpp_s.a;debug;/usr/lib/x86_64-linux-gnu/libflann_cpp_s.a;vtkImagingStencil;vtkCommonComputationalGeometry;vtkCommonDataModel;vtkCommonMath;vtkCommonCore;vtksys;vtkCommonMisc;vtkCommonSystem;vtkCommonTransforms;vtkImagingCore;vtkCommonExecutionModel;vtkFiltersAMR;vtkFiltersGeneral;vtkFiltersCore;vtkParallelCore;vtkIOLegacy;vtkIOCore;/usr/lib/x86_64-linux-gnu/libz.so;vtkInteractionWidgets;vtkFiltersHybrid;vtkImagingSources;vtkRenderingCore;vtkCommonColor;vtkFiltersExtraction;vtkFiltersStatistics;vtkImagingFourier;vtkalglib;vtkFiltersGeometry;vtkFiltersSources;vtkFiltersModeling;vtkImagingGeneral;vtkImagingHybrid;vtkIOImage;vtkDICOMParser;vtkmetaio;/usr/lib/x86_64-linux-gnu/libjpeg.so;/usr/lib/x86_64-linux-gnu/libpng.so;/usr/lib/x86_64-linux-gnu/libtiff.so;vtkInteractionStyle;vtkRenderingAnnotation;vtkImagingColor;vtkRenderingFreeType;/usr/lib/x86_64-linux-gnu/libfreetype.so;vtkftgl;vtkRenderingVolume;vtkIOParallelNetCDF;vtkParallelMPI;/usr/lib/x86_64-linux-gnu/libnetcdf_c++.so;/usr/lib/x86_64-linux-gnu/libnetcdf.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libsz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/hdf5/serial/lib/libhdf5_hl.so;vtkRenderingOpenGL;vtkIOLSDyna;vtkIOXML;vtkIOGeometry;/usr/lib/x86_64-linux-gnu/libjsoncpp.so;vtkIOXMLParser;/usr/lib/x86_64-linux-gnu/libexpat.so;vtkLocalExample;vtkInfovisCore;vtkGeovisCore;vtkInfovisLayout;vtkViewsCore;vtkproj4;/usr/lib/x86_64-linux-gnu/libpython2.7.so;vtkTestingGenericBridge;/usr/lib/libgl2ps.so;verdict;vtkIOMovie;/usr/lib/x86_64-linux-gnu/libtheoraenc.so;/usr/lib/x86_64-linux-gnu/libtheoradec.so;/usr/lib/x86_64-linux-g
nu/libogg.so;vtkFiltersImaging;vtkIOMINC;vtkRenderingLOD;vtkViewsQt;vtkGUISupportQt;vtkViewsInfovis;vtkChartsCore;vtkRenderingContext2D;vtkRenderingLabel;vtkRenderingImage;vtkFiltersFlowPaths;vtkxdmf2;/usr/lib/x86_64-linux-gnu/libxml2.so;vtkFiltersReebGraph;vtkViewsContext2D;vtkIOXdmf2;vtkIOAMR;vtkRenderingContextOpenGL;vtkImagingStatistics;vtkIOParallel;vtkFiltersParallel;vtkIONetCDF;vtkexoIIc;vtkGUISupportQtOpenGL;vtkIOParallelLSDyna;vtkFiltersParallelGeometry;vtkGUISupportQtWebkit;vtkIOPLY;vtkWrappingTools;vtkFiltersHyperTree;vtkRenderingVolumeOpenGL;vtkIOExodus;vtkIOPostgreSQL;vtkIOSQL;sqlite3;vtkWrappingJava;vtkFiltersParallelFlowPaths;vtkFiltersParallelStatistics;vtkFiltersProgrammable;vtkFiltersParallelImaging;vtkRenderingParallelLIC;vtkRenderingLIC;vtkInteractionImage;vtkFiltersPython;vtkWrappingPythonCore;vtkIOParallelExodus;vtkFiltersGeneric;vtkIOVideo;vtkRenderingQt;vtkFiltersTexture;vtkIOInfovis;vtkGUISupportQtSQL;vtkRenderingFreeTypeOpenGL;vtkInfovisBoostGraphAlgorithms;vtkRenderingGL2PS;vtkIOGeoJSON;vtkFiltersVerdict;vtkViewsGeovis;vtkIOImport;vtkTestingIOSQL;vtkPythonInterpreter;vtkIOODBC;vtkIOEnSight;vtkIOMySQL;vtkRenderingMatplotlib;vtkDomainsChemistry;vtkIOExport;vtkFiltersParallelMPI;vtkIOParallelXML;vtkTestingRendering;vtkIOMPIParallel;vtkParallelMPI4Py;vtkFiltersSMP;vtkFiltersSelection;vtkIOVPIC;VPIC;vtkImagingMath;vtkImagingMorphological;vtkRenderingParallel;vtkRenderingFreeTypeFontConfig;vtkIOFFMPEG;vtkIOMPIImage;vtkIOGDAL (Required is at least version "1.7")
CMake Warning (dev) at CMakeLists.txt:9 (link_directories):
This command specifies the relative path

../lib

as a link directory.

Policy CMP0015 is not set: link_directories() treats paths relative to the
source dir. Run "cmake --help-policy CMP0015" for policy details. Use the
cmake_policy command to set the policy and suppress this warning.
This warning is for project developers. Use -Wno-dev to suppress it.

-- Configuring done
-- Generating done
-- Build files have been written to: /root/treeseg/build
Singularity gears-singularity.tls.img:/treeseg/build> make;
Scanning dependencies of target pcd2xyz
[ 4%] Building CXX object CMakeFiles/pcd2xyz.dir/pcd2xyz.cpp.o
[ 8%] Linking CXX executable pcd2xyz
[ 8%] Built target pcd2xyz
Scanning dependencies of target treeseg
[ 12%] Building CXX object CMakeFiles/treeseg.dir/treeseg.cpp.o
[ 16%] Linking CXX shared library libtreeseg.so
[ 16%] Built target treeseg
Scanning dependencies of target segmentcrown
[ 20%] Building CXX object CMakeFiles/segmentcrown.dir/segmentcrown.cpp.o
[ 25%] Linking CXX executable segmentcrown
[ 25%] Built target segmentcrown
Scanning dependencies of target getdemslice
[ 29%] Building CXX object CMakeFiles/getdemslice.dir/getdemslice.cpp.o
[ 33%] Linking CXX executable getdemslice
[ 33%] Built target getdemslice
Scanning dependencies of target xyz2pcd
[ 37%] Building CXX object CMakeFiles/xyz2pcd.dir/xyz2pcd.cpp.o
[ 41%] Linking CXX executable xyz2pcd
[ 41%] Built target xyz2pcd
Scanning dependencies of target downsample
[ 45%] Building CXX object CMakeFiles/downsample.dir/downsample.cpp.o
[ 50%] Linking CXX executable downsample
[ 50%] Built target downsample
Scanning dependencies of target findstems
[ 54%] Building CXX object CMakeFiles/findstems.dir/findstems.cpp.o
[ 58%] Linking CXX executable findstems
[ 58%] Built target findstems
Scanning dependencies of target segmentstem
[ 62%] Building CXX object CMakeFiles/segmentstem.dir/segmentstem.cpp.o
[ 66%] Linking CXX executable segmentstem
[ 66%] Built target segmentstem
Scanning dependencies of target nearestneighbour
[ 70%] Building CXX object CMakeFiles/nearestneighbour.dir/nearestneighbour.cpp.o
[ 75%] Linking CXX executable nearestneighbour
[ 75%] Built target nearestneighbour
Scanning dependencies of target plotcoords
[ 79%] Building CXX object CMakeFiles/plotcoords.dir/plotcoords.cpp.o
[ 83%] Linking CXX executable plotcoords
[ 83%] Built target plotcoords
Scanning dependencies of target rxp2pcd
[ 87%] Building CXX object CMakeFiles/rxp2pcd.dir/rxp2pcd.cpp.o
[ 91%] Linking CXX executable rxp2pcd
../lib/libscanlib-mt.a(fileconn.o): In function `scanlib::file_connection_imbue(std::locale const&)':
fileconn.cpp:(.text+0x435): undefined reference to `riboost::filesystem::path::imbue(std::locale const&)'
../lib/libscanlib-mt.a(rdtpconn.o): In function `scanlib::rdtp_rconnection::impl::resolve(riboost::asio::ip::basic_resolver_query<riboost::asio::ip::tcp> const&)':
rdtpconn.cpp:(.text+0x5201): undefined reference to `pthread_create'
rdtpconn.cpp:(.text+0x5278): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rdtpconn.o): In function `riboost::asio::detail::resolver_service_base::shutdown_service()':
rdtpconn.cpp:(.text._ZN7riboost4asio6detail21resolver_service_base16shutdown_serviceEv[_ZN7riboost4asio6detail21resolver_service_base16shutdown_serviceEv]+0x1d6): undefined reference to `pthread_join'
rdtpconn.cpp:(.text._ZN7riboost4asio6detail21resolver_service_base16shutdown_serviceEv[_ZN7riboost4asio6detail21resolver_service_base16shutdown_serviceEv]+0x1f9): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rdtpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::tcp>::~resolver_service()':
rdtpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEED0Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEED5Ev]+0x174): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rdtpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::tcp>::~resolver_service()':
rdtpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEED2Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEED5Ev]+0x164): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rdtpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::tcp>::fork_service(riboost::asio::io_service::fork_event)':
rdtpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE]+0x117): undefined reference to `pthread_create'
rdtpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE]+0x176): undefined reference to `pthread_join'
rdtpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3tcpEE12fork_serviceENS0_10io_service10fork_eventE]+0x18d): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rddpconn.o): In function `scanlib::rddp_rconnection::impl::resolve(riboost::asio::ip::basic_resolver_query<riboost::asio::ip::udp> const&)':
rddpconn.cpp:(.text+0x2d51): undefined reference to `pthread_create'
rddpconn.cpp:(.text+0x2df0): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rddpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::udp>::~resolver_service()':
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED0Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x314): undefined reference to `pthread_detach'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED0Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x336): undefined reference to `pthread_join'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED0Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x359): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rddpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::udp>::shutdown_service()':
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE16shutdown_serviceEv[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE16shutdown_serviceEv]+0x1d6): undefined reference to `pthread_join'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE16shutdown_serviceEv[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE16shutdown_serviceEv]+0x1f9): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rddpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::udp>::~resolver_service()':
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED2Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x314): undefined reference to `pthread_detach'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED2Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x336): undefined reference to `pthread_join'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED2Ev[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEED5Ev]+0x359): undefined reference to `pthread_detach'
../lib/libscanlib-mt.a(rddpconn.o): In function `riboost::asio::ip::resolver_service<riboost::asio::ip::udp>::fork_service(riboost::asio::io_service::fork_event)':
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE]+0x117): undefined reference to `pthread_create'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE]+0x176): undefined reference to `pthread_join'
rddpconn.cpp:(.text._ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE[_ZN7riboost4asio2ip16resolver_serviceINS1_3udpEE12fork_serviceENS0_10io_service10fork_eventE]+0x18d): undefined reference to `pthread_detach'
collect2: error: ld returned 1 exit status
CMakeFiles/rxp2pcd.dir/build.make:94: recipe for target 'rxp2pcd' failed
make[2]: *** [rxp2pcd] Error 1
CMakeFiles/Makefile2:437: recipe for target 'CMakeFiles/rxp2pcd.dir/all' failed
make[1]: *** [CMakeFiles/rxp2pcd.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2
Singularity gears-singularity.tls.img:/treeseg/build>

How to modify "stepcovmax" and "radratiomin" in findstems?

I run findstems on my .pcd data with
./findstems 15 0.1 1 cloud.slice.pcd cloud.slice.pcd

As I don't have a .dat file, I calculate xmin, xmax, ymin, and ymax from my .pcd file using another cpp file, so part of my findstems is:

std::cout << "RANSAC cylinder fits: " << std::flush;
std::vector<pcl::PointCloud<PointTreeseg>::Ptr> cyls;
nnearest = 60;
float dmin = atof(argv[2]);
float dmax = atof(argv[3]);
/*
std::ifstream coordfile;
coordfile.open(argv[4]);
float coords[4];
int n = 0;
if(coordfile.is_open())
{
	while(!coordfile.eof())
	{
		coordfile >> coords[n];
		n++;
	}
}
coordfile.close();
//float xmin = coords[0];
//float xmax = coords[1];
//float ymin = coords[2];
//float ymax = coords[3];
*/
float xmin = -0.998983;
float xmax = 0.716053;
float ymin = -0.749072;
float ymax = 0.968809;

float lmin = 2; //assuming 3m slice
float stepcovmax = 0.2;
float radratiomin = 0.8;
for(int i=0;i<regions.size();i++)
{
	std::cout << "first "<< i <<"start" << std::endl;
	cylinder cyl;
	fitCylinder(regions[i],nnearest,true,true,cyl);
	if(cyl.ismodel == true)
	{
		std::cout << "cyl is model "<< i << std::endl;
		if(cyl.rad*2 >= dmin && cyl.rad*2 <= dmax && cyl.len >= lmin)
		{
			std::cout << "second if true then"<< i << std::endl;
			if(cyl.stepcov <= stepcovmax && cyl.radratio > radratiomin)
			{
				std::cout << "third if true"<< i << std::endl;
				if(cyl.x >= xmin && cyl.x <= xmax)
				{
					if(cyl.y >= ymin && cyl.y <= ymax)
					{
						cyls.push_back(cyl.inliers);
					}
				}
			}
		}
	}
	std::cout << "done: "<< i << std::endl;
}
ss.str("");

the output is:

Reading slice: complete
Cluster extraction: cloud.intermediate.slice.clusters.pcd | 71
Region-based segmentation: cloud.intermediate.slice.clusters.regions.pcd | 48
RANSAC cylinder fits: first 0start
cyl is model 0
second if true then0
done: 0
first 1start
cyl is model 1
second if true then1
done: 1
first 2start
cyl is model 2
second if true then2
done: 2
first 3start
cyl is model 3
second if true then3
done: 3
first 4start
cyl is model 4
done: 4
first 5start
done: 5
first 6start
done: 6
first 7start
cyl is model 7
second if true then7
done: 7
first 8start
cyl is model 8
done: 8
first 9start
cyl is model 9
second if true then9
done: 9
first 10start
done: 10
first 11start
cyl is model 11
second if true then11
done: 11
first 12start
done: 12
first 13start
cyl is model 13
second if true then13
done: 13
first 14start
cyl is model 14
done: 14
first 15start
cyl is model 15
done: 15
first 16start
cyl is model 16
second if true then16
done: 16
first 17start
cyl is model 17
second if true then17
done: 17
first 18start
cyl is model 18
done: 18
first 19start
done: 19
first 20start
cyl is model 20
done: 20
first 21start
cyl is model 21
second if true then21
done: 21
first 22start
cyl is model 22
done: 22
first 23start
done: 23
first 24start
done: 24
first 25start
cyl is model 25
second if true then25
done: 25
first 26start
cyl is model 26
second if true then26
third if true26
done: 26
first 27start
cyl is model 27
done: 27
first 28start
cyl is model 28
second if true then28
third if true28
done: 28
first 29start
cyl is model 29
second if true then29
done: 29
first 30start
cyl is model 30
second if true then30
done: 30
first 31start
cyl is model 31
done: 31
first 32start
cyl is model 32
second if true then32
done: 32
first 33start
done: 33
first 34start
cyl is model 34
second if true then34
done: 34
first 35start
cyl is model 35
second if true then35
done: 35
first 36start
cyl is model 36
second if true then36
done: 36
first 37start
cyl is model 37
second if true then37
done: 37
first 38start
cyl is model 38
second if true then38
done: 38
first 39start
cyl is model 39
second if true then39
third if true39
done: 39
first 40start
cyl is model 40
done: 40
first 41start
cyl is model 41
done: 41
first 42start
cyl is model 42
second if true then42
done: 42
first 43start
done: 43
first 44start
cyl is model 44
second if true then44
done: 44
first 45start
cyl is model 45
second if true then45
done: 45
first 46start
cyl is model 46
second if true then46
done: 46
first 47start
done: 47
terminate called after throwing an instance of 'pcl::IOException'
  what():  : [pcl::PCDWriter::writeBinary] Input point cloud has no data!
Aborted (core dumped)

As the output shows, the third if-statement inside the for loop,

cyl.stepcov <= stepcovmax && cyl.radratio > radratiomin

is almost never true. I suspect the stepcovmax and radratiomin thresholds are too strict for my data to meet, so the vector 'cyls' ends up empty and the PCDWriter aborts on an empty cloud. Does anyone know how I can modify 'stepcovmax' and 'radratiomin'? Thanks!
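For reference, stepcovmax and radratiomin are hard-coded locals in findstems.cpp, so the supported way to change them is to edit those two lines and rebuild. If you would rather tune them without recompiling each time, one option (a sketch only, not part of treeseg; the positional indices 6 and 7 are assumptions based on the five arguments used above) is to read them as optional trailing command-line arguments that fall back to the released defaults:

```cpp
#include <cstdlib>
#include <string>
#include <vector>

// Hypothetical helper (not part of treeseg): return args[idx] parsed as a
// float, or `fallback` when that argument was not supplied on the command line.
float argOr(const std::vector<std::string> &args, std::size_t idx, float fallback)
{
	if (idx < args.size()) return static_cast<float>(std::atof(args[idx].c_str()));
	return fallback;
}

// Inside main(), after building std::vector<std::string> args(argv, argv + argc):
//   float stepcovmax  = argOr(args, 6, 0.2f);  // defaults match the released source
//   float radratiomin = argOr(args, 7, 0.8f);
// e.g. ./findstems 15 0.1 1 coords.dat cloud.slice.pcd 0.3 0.7
```

Omitting the two extra arguments keeps the original behaviour; supplying them (e.g. a larger stepcovmax and a smaller radratiomin) relaxes the filter so more candidate cylinders survive.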

Reduce the amount of point cloud test data

Hi, I am looking forward to testing and working with your treeseg processing pipeline, and it is very impressive to me that it works with such an amount of point cloud data generated from a 100 x 100 m investigation area. If I may make a wish, I would suggest offering a much smaller data set for testing the installation and all steps of the tree segmentation and estimations. Cheers!

DTM generation > Segmentation fault (core dumped)

Hi,
first of all: instead of the application getdtmslice, I found getdemslice in the builds. I tried processing the whole set of downsampled point cloud tiles from the tutorial test data, and also just a single tile ...
I always receive the error message: Segmentation fault (core dumped)
Workstation: Ubuntu 20.04 + the required libpcl-dev and libarmadillo-dev packages.
I am grateful for hints or solutions!

Formal request: LAS "starting point"

Andy (sorry for the barrage of stuff, I'm trying to document our issues using treeseg right now) -- I'd like to formally request .LAS support at least as a starting point. We aren't starting with the raw RIEGL data (as we discussed, the 400i seems to have a different format than you guys had). I think a more "stable" starting point would be some standard point cloud format like LAS, which then can get converted to PCD (and probably needs tiling support built in). We have a starting point from:

https://github.com/murtiad/las2pcd

And I've made an Ubuntu-ready version (which also accepts the input file as a parameter rather than requiring interactive input) at:

https://github.com/gearslaboratory/las2pcd

Parameters to tweak for crown segmentation

Thank you for publishing this nice piece of software. I am trying to segment a forest patch with beech trees, scanned leaf-off with a RIEGL VZ-400. Most parts of the algorithm work well until the point of crown segmentation: segmentcrown [14 - 16] 0 ../beech.volume.*.pcd. The results show strongly undersegmented crowns for all trees. Do you have any hint how to obtain a stronger segmentation? I already found out that the second parameter of the function turns on wood-leaf segmentation, but what is the first one '[14 - 16]' doing? Thanks for your help!
(attached image: beech21)

Getting started

Hi,
I've read your paper, which was fascinating, and managed to install treeseg. I have a .pcd file generated by Google Cartographer from lidar data collected in a tree orchard.
Please forgive my ignorance, but I'm a bit stuck on how to use your algorithms on my point cloud. Do I execute commands from the ~/treeseg directory where I installed the package? Do I need to run them in another environment? I'm running Ubuntu 18.04.
Thanks heaps
Ben

findstems.cpp

At line 55:
float coords[4];
the length of coords should be at least 364, as NOU11.dtm.dat has 365 lines; otherwise a segmentation fault (core dumped) will occur.

Input point cloud has no data

Hi, first of all, thank you for making your code public.

I have an issue with segmentstem. I have a point cloud in PLY format that I convert to PCD in ASCII format:
object_639143_nosky.txt

After running segmentstem with the argument 12.5 and this point cloud, I get the following error:

(screenshot of the error: "Input point cloud has no data")
