In general, I find CC to be a great way to display and manipulate my point cloud data. However, I have some really large point cloud files, and many operations end up crashing the program because of the cloud's size.
Is there a way to optimize a large point cloud for faster, more efficient display, and to keep the data from crashing the system?
My computer and graphics card are top-notch.
Display of really large point clouds
Re: Display of really large point clouds
Up to a billion points, I know it's still possible to load clouds (if you increase the maximum virtual memory of Windows to something quite big). Of course it will be awfully slow. The best idea is generally to subsample the cloud beforehand (you can do it with the command line).
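For reference, that subsampling can be done headlessly with CloudCompare's command-line mode. This is just a sketch of a typical invocation - the file names are placeholders, and the option names should be double-checked against the command-line page of the wiki for your version:

```shell
# Load a big cloud, keep at most one point per 5 cm cell (spatial subsampling),
# and save the result without opening the GUI.
CloudCompare -SILENT -O big_cloud.las -SS SPATIAL 0.05 -C_EXPORT_FMT LAS -SAVE_CLOUDS

# Alternative: random subsampling down to a fixed point count (here 10 million).
CloudCompare -SILENT -O big_cloud.las -SS RANDOM 10000000 -C_EXPORT_FMT LAS -SAVE_CLOUDS
```

For display purposes the spatial method is usually the better choice, since it keeps the point density roughly uniform instead of thinning dense and sparse areas by the same ratio.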
Something else that might make CC crash is the safeguard in some drivers (I believe it's mainly for Quadro cards) that don't like too-long rendering times (this can be disabled - I just don't remember the name of this mechanism right now ;).
And anyway there's an absolute upper limit that lies around 2 or 4 billion points (this will indeed make CC crash).
I think that next year we should have the time to look at real solutions to this problem, as clouds are getting bigger and bigger.
Daniel, CloudCompare admin