CANUPO Crash

Questions related to plugin development
AndrewRoberts
Posts: 14
Joined: Mon Aug 11, 2014 5:08 pm

CANUPO Crash

Post by AndrewRoberts »

In CC 2.7, when I classify a point cloud using the CANUPO plugin, the software crashes if the point cloud has more than ~3 million points.

Specifically, it processes right up to 99-100% and then crashes.
daniel
Site Admin
Posts: 7713
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: CANUPO Crash

Post by daniel »

I've never made CANUPO crash myself (though maybe Dimitri has observed this, as he works with big clouds?). But I bet it's possible, as it can require a lot of memory depending on your parameters.

Could you provide me with your cloud and your classifier file (.prm)? Or at least give me the number of scales and the other parameters you used?
Daniel, CloudCompare admin
AndrewRoberts
Posts: 14
Joined: Mon Aug 11, 2014 5:08 pm

Re: CANUPO Crash

Post by AndrewRoberts »

Apparently I cannot upload the .prm as an attachment. I could email the files to you if you would like to check them out.

I have tried many scales/parameters. The default settings (start 0.1, step 0.1, max 10) caused a crash, while something like (start 1, step 10, max 501) seems to produce a better classification, although it appears to be slower.
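
For context, here is a quick sketch (hypothetical Python, not part of CANUPO or CloudCompare) of how many scales each of those settings generates; this count multiplies both the runtime and the memory footprint:

```python
def scale_count(start, step, maximum):
    """Number of scales in the series start, start+step, ..., up to maximum."""
    # round() avoids floating-point truncation, e.g. (10 - 0.1) / 0.1 = 98.999...
    return round((maximum - start) / step) + 1

print(scale_count(0.1, 0.1, 10))  # default settings -> 100 scales
print(scale_count(1, 10, 501))    # coarser series   -> 51 scales
```
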
daniel
Site Admin
Posts: 7713
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: CANUPO Crash

Post by daniel »

Indeed, if you use bigger scales, there are many more points to process in each neighborhood, so it's (much) slower.
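
As a rough illustration (a sketch assuming a roughly planar cloud with uniform point density, neither of which is stated in the thread): the number of neighbors inside a neighborhood of diameter s grows roughly with s squared, so the work per point grows quadratically with the scale.

```python
import math

# Assumption: a roughly planar cloud with uniform density (points per m^2).
# Neighbors inside a neighborhood of diameter s then grow ~ s^2.
def neighbors_at_scale(density_per_m2, scale_m):
    return density_per_m2 * math.pi * (scale_m / 2) ** 2

for s in (0.1, 1.0, 10.0):
    print(f"scale {s:5.1f} m -> ~{neighbors_at_scale(1000, s):,.0f} neighbors")
# scale 0.1 m -> ~8, scale 1.0 m -> ~785, scale 10.0 m -> ~78,540
```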

I received the prm file but not the cloud. Can you send it to me as well?
Daniel, CloudCompare admin
daniel
Site Admin
Posts: 7713
Joined: Wed Oct 13, 2010 7:34 am
Location: Grenoble, France

Re: CANUPO Crash

Post by daniel »

OK, in the end I think the issue was due to the number of scales (100!). It didn't crash on my side (even with that many scales), but it could definitely crash if you don't have enough memory (or maybe if you are using a 32-bit version?).
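
For a rough sense of the numbers (a back-of-the-envelope sketch; the two 4-byte descriptor values per point and per scale are an assumed storage layout, not a confirmed detail of the plugin):

```python
n_points = 3_000_000
n_scales = 100
values_per_scale = 2   # assumed: 2 descriptor components per scale
bytes_per_value = 4    # 32-bit float

total = n_points * n_scales * values_per_scale * bytes_per_value
print(f"{total / 2**30:.1f} GiB")  # ~2.2 GiB, beyond a 32-bit process limit
```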

I'll use this opportunity as a reminder that 10 to 20 scales should be more than sufficient in all cases. The aim is to pick scales that each correspond to a noticeably different shape (what Lague et al. call 'dimensionality'), i.e. whether the object is globally '1D' (e.g. a stem or a cable), '2D' (e.g. big leaves or a rock surface), or possibly '3D' (e.g. a bush, or a rock considered as a whole). If the scales are too similar, they won't add any information to the descriptor and will therefore be redundant.
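
To illustrate the dimensionality idea (a simplified sketch of the multi-scale descriptor from Brodu & Lague 2012; the barycentric mapping below is one common formulation and may differ in detail from CANUPO's actual implementation):

```python
import numpy as np

def dimensionality(neighbors):
    """Barycentric 1D/2D/3D proportions from the PCA of a local neighborhood.

    A pure line gives ~(1, 0, 0), a pure plane ~(0, 1, 0),
    an isotropic volume ~(0, 0, 1).
    """
    centered = neighbors - neighbors.mean(axis=0)
    # Eigenvalues of the 3x3 covariance matrix, sorted in descending order.
    lam = np.sort(np.linalg.eigvalsh(np.cov(centered.T)))[::-1]
    p1, p2, p3 = lam / lam.sum()           # normalized variance proportions
    return p1 - p2, 2 * (p2 - p3), 3 * p3  # sums to 1 by construction

rng = np.random.default_rng(0)
line  = np.outer(rng.normal(size=500), [1, 0, 0])  # ~1D cloud
plane = rng.normal(size=(500, 3)) * [1, 1, 0]      # ~2D cloud
ball  = rng.normal(size=(500, 3))                  # ~3D cloud
for cloud in (line, plane, ball):
    print([f"{v:.2f}" for v in dimensionality(cloud)])
```
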
Daniel, CloudCompare admin