Hey guys. I'm very new to CloudCompare. I have data in the format X, Y, Z, angle, Unix time. Here's a sample value:
1748.724 9619.178 -61.299 -2.733 1418669100.748
I would like to somehow carry the angle and Unix time through as custom, unmodified fields. I have tried this with CC v2.6.1 and found that CC exports the angle correctly, but it writes the same Unix timestamp for every entry in the exported text file.
I thought that perhaps the Unix timestamp was too large for the scalar value I was forcing it into, so I did a little 'magic' to make it smaller: I multiplied the decimal Unix time by 1000, found the value of the first entry, and then subtracted that starting value from all the times to generate an 'indexed' decimal time. CC carried this value fine for the test file I used, but I'm not sure whether I'll run into problems once my time reaches seven digits.
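For reference, the re-indexing trick described above can be sketched like this (a minimal illustration with made-up timestamps; the file reading and writing around it is omitted, and `as_float32` just simulates 32-bit scalar storage):

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a value through 32-bit storage, mimicking a 'float' scalar field."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Hypothetical Unix timestamps (seconds) from a short capture
times = [1418669100.748, 1418669101.052, 1418669101.356]

start = times[0]
# Subtract the first entry and scale to milliseconds -> small 'indexed' times
indexed = [round((t - start) * 1000, 3) for t in times]

# A 32-bit float stores integers exactly up to 2**24 (16,777,216), so
# indexed millisecond values stay intact for roughly 4.6 hours of data,
# then start losing sub-millisecond precision.
survives = all(as_float32(i) == i for i in indexed)
```

On the "seven digits" worry: a 32-bit float keeps about 7 significant decimal digits, so the indexed values hold up well past seven digits of integer milliseconds, but the fractional part erodes first.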
Any thoughts on how best to do this? I'm using ASCII format for both input and output. Thanks!
Is it possible to carry a custom field?
Re: Is it possible to carry a custom field?
Indeed, scalar values are stored in 'float' format (32-bit floating-point values). But you should even be able to divide your values by 1e6 instead of 1e3 without losing much precision. This will let you last longer before running out of precision...
Otherwise, you could maybe split your time across two scalar fields: one for the number of millions, for instance, and the other for the remainder?
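The two-field split suggested above can be illustrated with a quick sketch (Python here only to simulate 32-bit scalar storage; the `sf_` names are made up):

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a value through 32-bit storage, like a CC scalar field."""
    return struct.unpack('f', struct.pack('f', x))[0]

t = 1418669100.748  # sample timestamp from this thread

# Stored directly in one 32-bit field, precision collapses to ~128 s steps,
# which is why many points end up with the same exported timestamp:
direct = as_float32(t)

# Split across two scalar fields instead:
sf_millions = as_float32(t // 1e6)   # whole millions of seconds (stored exactly)
sf_remainder = as_float32(t % 1e6)   # < 1e6, keeps better than 0.1 s precision

recombined = sf_millions * 1e6 + sf_remainder
```

The remainder field stays below 1e6, so a float's ~7 significant digits are spent on the part of the timestamp that actually changes between points.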
Daniel, CloudCompare admin
Re: Is it possible to carry a custom field?
Ah yes, that does seem like a viable option. I may have to try this method. Thank you.
One other question, regarding the SOR 'Remove Statistical Outliers' button. First, it does a remarkable job of cleaning up my data in a very short time. As stated above, I am trying to carry my additional fields through. Perhaps I'm wrong, but in my initial testing it appeared that all the data was affected, including my time values. Is this the case? Are the outliers simply removed, or are new values created to best represent the data in that area? Please forgive my ignorance; hopefully these questions make some sense.
Re: Is it possible to carry a custom field?
The SOR algorithm only removes points / outliers. No point will be created.
You should prefer the equivalent 'Tools > Noise > Clean' filter, which should better preserve the cloud features: it's a built-in algorithm, while the other one is a port from the PCL library.
Daniel, CloudCompare admin
Re: Is it possible to carry a custom field?
Thanks for the advice. I tried the 'Clean' filter and it took considerably longer than the SOR filter (12+ hours vs. 10 minutes) and did not do as good a job. This is more than likely because I don't know how to set up the 'Clean' filter; I was using the default values. Is there a way to replicate the SOR filter's default settings (number of points set to 10 and stddev set to 1) in the 'Clean' filter? Would that bring the processing time down to something similar to the SOR filter?
I'd really like to use the SOR filter but have it output the coordinates in the same space as the input data file. I did manage to set the Global Shift to match the input data set and then export. Can you confirm that this will indeed keep the surviving points and their associated scalar values untouched? Perhaps I did something wrong, but when I tried to find XYZ points from the input file in the exported shifted data set, I was unable to find exact matches. The data appeared to be in the correct space, but I had trouble matching points. I really need the output to match the exact values of the input file, since I'm using CC's output in conjunction with some additional data and need the positions and times to remain intact.
Thank you again for the help and for the program. It is a wonderful tool and I'm sure with a bit of time I'll be able to tweak settings to do exactly what I'm trying to do.
*EDIT* Apologies. I did manage to get things working correctly using the global shift/scale on the SOR-filtered data. I was using the wrong input file (one with unaltered times), which is why I was getting inaccurate times on output. Everything appears correct when using the right input file.
Re: Is it possible to carry a custom field?
First let me thank you guys again for the great tool. I have been using it a bit lately and it is truly remarkable!
I am continuing to shrink my Unix timestamps to fit within the scalar field limitations, and this works fine. I'm now refining my process and have made great strides in efficiency by using the command-line options. Unfortunately, dealing with these Unix timestamps in my large ASCII files takes a considerable amount of time.
I started thinking that maybe I could convert my ASCII files into one of the other formats and import using that instead. Can someone tell me whether any of the available formats can carry a text field that is simply passed through? I need some way to pass my Unix times through to the final cloud unedited. The times look like this, if you're unfamiliar: 1440028800.
I'm really just brainstorming for a way to improve my process and am curious if changing formats would offer a solution. Many thanks (again)!
Re: Is it possible to carry a custom field?
I'm dodging the question, but you could also compile CC with 64-bit 'double' scalar values instead of 32-bit 'float' ones, so as to keep your long timestamps unmodified. Simply change the 'ScalarValue' type in "CCTypes.h" and recompile CC.
Otherwise, PLY files can handle scalar values properly. And LAS files already have a 64-bit 'timestamp' field.
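As a quick sanity check on the 64-bit route: a 'double' has about 15 to 16 significant decimal digits, comfortably enough for a Unix timestamp with sub-second precision. A sketch (Python floats are IEEE-754 doubles, the same type CC would use after such a recompile; the timestamp is made up):

```python
import struct

t = 1440028800.748  # hypothetical millisecond-precision Unix time

# 64-bit 'double' storage round-trips the value exactly...
as_double = struct.unpack('d', struct.pack('d', t))[0]

# ...while 32-bit 'float' storage does not (only ~7 significant digits,
# which is why whole blocks of points share one exported timestamp).
as_float = struct.unpack('f', struct.pack('f', t))[0]
```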
Daniel, CloudCompare admin
Re: Is it possible to carry a custom field?
Thanks again for all the help. I've managed to get this working in a Linux build, and it will be a big time saver.