Hello,
I'm currently working on a project that's turning into quite a challenge. I'm new when it comes to point clouds, so I'm looking for a push in the right direction.
Our client provided us with a massive point cloud, and I need to make this behemoth VR-ready (Unity).
The point cloud:
The point cloud is a "SZF5" scan, separated into 6 sections; each section is split into ~250 MB FLS files (about 10-100 million points each).
The total size is 101 GB.
Besides this, there is also a 27 GB .e57 file called "noisehood full".
In order to get this working in VR, some serious optimization is required.
An octree point visualization isn't what I'm looking for; meshes are required for other features (raycasting, lighting, collision, ...).
What I am looking for is a way to do the following, and to automate it for future clouds:
- I need to turn the entire cloud into a chunked grid, N meters per chunk.
- Each chunk needs to become a mesh (triangulated, max vertex count per chunk is 65534), and a texture needs to be baked from the points that are discarded.
- All of this needs to happen without loading the entire cloud into memory, e.g. each chunk reads every point cloud file and looks for points in its range; after this, it can do the triangulation and the baking, save the mesh, and start the process again for the next chunk.
- I know that this might be a time-consuming process, but that isn't really an issue (a dedicated machine can be set up).
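The per-chunk streaming pass described above can be sketched roughly as follows. This is a minimal illustrative sketch in Python with NumPy (not actual FLS/e57 reading code); the function names and the `read_block` iterator are hypothetical stand-ins for whatever reader you end up using, the point being that only one block of points is ever resident in memory at a time:

```python
import numpy as np

def points_in_chunk(points, chunk_min, chunk_size):
    """Return the subset of points whose XYZ falls inside one grid chunk.

    points: (N, 3) array; chunk_min: (3,) lower corner; chunk_size: edge length.
    """
    lo = np.asarray(chunk_min, dtype=float)
    hi = lo + chunk_size
    mask = np.all((points >= lo) & (points < hi), axis=1)
    return points[mask]

def process_cloud_in_blocks(read_block, chunk_min, chunk_size):
    """Stream a cloud block by block, keeping only points inside the chunk.

    read_block: any iterator yielding (M, 3) arrays, so the full 101 GB
    cloud never has to be loaded at once.
    """
    kept = [points_in_chunk(block, chunk_min, chunk_size) for block in read_block]
    return np.vstack(kept) if kept else np.empty((0, 3))
```

Once a chunk's points are collected this way, triangulation and baking can run on that subset alone before moving to the next chunk.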
Before I dive into CloudCompare, I would like to know if it's capable of doing something like this (on a smaller scale)?
I have experience in C++ and I'm willing to get my hands dirty in order to automate the process.
Thanks in advance!
Processing a massive pointcloud
Re: Processing a massive pointcloud
omg \o/
Not at all, sorry :D (well especially the meshing part).
Daniel, CloudCompare admin
Re: Processing a massive pointcloud
We're working on setting up a similar pipeline based primarily on PDAL and Houdini.
The first thing I would decide on is exactly how you're going to build the mesh as that will determine how you need to process the original point clouds, e.g. if you're going to use something like Poisson Surface Reconstruction you'll need to estimate normals for your points.
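The normal-estimation step mentioned above (a prerequisite for Poisson Surface Reconstruction) is commonly done per point by PCA over a local neighborhood: the eigenvector of the neighborhood's covariance matrix with the smallest eigenvalue points along the direction of least spread, i.e. the local surface normal. A minimal NumPy sketch of that standard approach (neighbor search itself, e.g. via a k-d tree, is omitted):

```python
import numpy as np

def estimate_normal(neighbors):
    """Estimate a surface normal from a point's k-nearest-neighbor set.

    The eigenvector of the covariance matrix with the smallest eigenvalue
    is the direction of least spread, i.e. the local surface normal.
    Note: the sign is ambiguous and must be oriented separately, e.g.
    toward the scanner position.
    """
    pts = np.asarray(neighbors, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # smallest-eigenvalue eigenvector
```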
Jed
Re: Processing a massive pointcloud
jedfrechette wrote: ↑Wed Jun 12, 2019 11:55 pm
The first thing I would decide on is exactly how you're going to build the mesh as that will determine how you need to process the original point clouds, e.g. if you're going to use something like Poisson Surface Reconstruction you'll need to estimate normals for your points.

If I had to write it from scratch, I would do the following:
-> parse all files, but build the octree on the hard drive
-> I would use the marching cubes algorithm to generate the meshes
-> the texturing would be the hard part; this would require some form of projection mapping from the point cloud onto the generated mesh
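In its very simplest form, that projection step amounts to giving each mesh vertex the color of its nearest cloud point. A toy NumPy sketch of that idea (brute-force distances, so only viable for small chunks; real data would need a k-d tree, and baking to an actual texture rather than vertex colors is a further step):

```python
import numpy as np

def bake_vertex_colors(mesh_vertices, cloud_points, cloud_colors):
    """Copy each mesh vertex's color from the nearest point in the cloud.

    Brute-force O(V * P) distance computation; illustrative only.
    """
    verts = np.asarray(mesh_vertices, dtype=float)
    pts = np.asarray(cloud_points, dtype=float)
    colors = np.asarray(cloud_colors)
    # squared distance matrix between every vertex and every cloud point
    d2 = ((verts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)  # index of closest cloud point per vertex
    return colors[nearest]
```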
However, I would rather use some existing software for this and save some time.
I did manage to load in a small section, and Poisson is capable of turning it into a mesh, though with a lot of noise since I can't load the entire thing.
I'm thinking about parsing all the point cloud files before I throw them into CloudCompare, separating each of them into grid-like clusters that I can work with.
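That pre-separation pass boils down to binning each point by its grid cell index (floor of coordinate / cell size). A small illustrative NumPy sketch of the binning; in a real out-of-core pass each bucket would be appended to its own file on disk rather than kept in a dict:

```python
import numpy as np
from collections import defaultdict

def bin_points_to_cells(points, cell_size):
    """Group points by the integer grid cell they fall into.

    Cell key = floor(coordinate / cell_size) per axis. In an out-of-core
    version, each key would map to an append-only file instead of a list.
    """
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / cell_size).astype(int)
    buckets = defaultdict(list)
    for key, p in zip(map(tuple, keys), pts):
        buckets[key].append(p)
    return {k: np.array(v) for k, v in buckets.items()}
```

After one pass like this over every FLS file, each chunk can be meshed from its own cluster file without touching the rest of the cloud.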