issue with C2C on large airborne lidar lines
Posted: Fri May 19, 2017 3:40 pm
Hi Daniel,
I'm running C2C through the command line (though the same issue occurs with exactly the same settings in the GUI version) and I have the following issue:
. I'm comparing airborne lidar flight lines (typically 30-80 million points) to a reference dataset which does not have the same extent (and, in some cases, may not have any overlap). 90% of the time it works perfectly, but for some lines the calculation basically never ends (I've waited a few hours).
. In these particular cases, it seems that the approximate distance calculation ends up with very large values (e.g. average population 27523776 +/- 27523773), when the typical distance between the two point clouds should be much smaller. This corresponds to cases where there is not a lot of overlap between the two clouds, so by randomly picking a few samples it is quite possible to end up with very large distances.
. As a consequence, CC picks an octree level of 2, which (I suppose) has two consequences: (1) it does not make full use of the CPUs, because there are generally only a couple of cells at that level; (2) the neighbour search is extremely slow at such a low octree level.
. Now, I thought that by imposing a max distance I would get rid of the automatic low octree level selection, but it seems that this parameter is not factored into the selection of the octree level: I always end up with level 2, even when I choose 10 m for instance. Am I right in thinking there's something off here?
. In the end, I can impose the octree level manually and get the job done very quickly, given that I'm not interested in distances greater than 10 m (see the command below); but because strip lines do not have the same extent, my choice is not necessarily optimal with respect to the max distance I'm imposing.
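For reference, this is roughly the workaround call I'm using (file names are placeholders, and the forced level is just my own guess for these lines):

    CloudCompare -SILENT -O flight_line.las -O reference.las -C2C_DIST -MAX_DIST 10 -OCTREE_LEVEL 9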
Would it be possible to have the automatic selection of the octree level account for the chosen max distance, to get around these pathological cases?
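To illustrate what I mean, here is a rough sketch of the kind of heuristic I have in mind (the function and its signature are mine, not CC's actual code): pick the deepest octree level whose cells are still at least as wide as the max distance, so a search bounded by that distance only ever has to visit the immediate neighbour cells.

    // Hypothetical sketch, not CloudCompare's actual code.
    // Pick the deepest octree level whose cells remain at least maxDist wide,
    // so a search bounded by maxDist only visits the immediate neighbour cells.
    int suggestedOctreeLevel(double maxBoxDim, double maxDist, int deepestLevel = 10)
    {
        int level = 1;
        double cellSize = maxBoxDim / 2.0; // cell size at level 1
        while (level < deepestLevel && cellSize / 2.0 >= maxDist)
        {
            cellSize /= 2.0;
            ++level;
        }
        return level;
    }

For a 5 km wide line and a 10 m max distance this gives level 8 (cells of about 19.5 m), instead of the level 2 currently derived from the approximate distances.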
Thanks
Dimitri