I have a large E57 file (110 GB) that I can't open in CloudCompare. I have a dedicated 6 TB SSD I'm trying to use as virtual memory, but I can't get CC to see and use the additional space. Any suggestions, tips, or approaches for working with a large data set with virtual memory?
here's the computer specs:
HP Z6 G4 Workstation
Intel Xeon Gold 6248R CPU @ 3.00 GHz, 24 cores
256 GB Physical RAM
Thanks,
TJGRO
Large Data Set
Re: Large Data Set
Is the error "out of memory" or is it something else?
Opening such a monster with CloudCompare is challenging anyway: even if it fits in memory, the interaction might be a little bit slow ;)
One option is to subsample the cloud via the command line first.
Daniel, CloudCompare admin
Re: Large Data Set
Runs out of memory.
How do you subsample via command line?
Re: Large Data Set
See https://www.cloudcompare.org/doc/wiki/i ... _line_mode (the -SS command)
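As a rough illustration, a spatial subsample in silent mode might look like the command below. The file name and the 0.01 m spacing are placeholders, and the exact flags should be checked against the command-line wiki page:

```shell
# Hypothetical invocation: open the big E57, spatially subsample to at most
# one point per 1 cm, and save the result back out as E57.
CloudCompare -SILENT \
  -O bigscan.e57 \
  -SS SPATIAL 0.01 \
  -C_EXPORT_FMT E57 \
  -SAVE_CLOUDS
```

Running in -SILENT mode avoids loading the GUI, which helps when the data set is too big to display anyway.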
Daniel, CloudCompare admin
Re: Large Data Set
Is there a way to only partially load a structured E57 file? Such as see a tree view of the scans and just load selected ones?
Kevin
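Outside of CloudCompare, one way to inspect a structured E57 scan-by-scan is the third-party pye57 library. This is just a sketch under that assumption (pye57 is not part of CloudCompare, and the file name is a placeholder):

```python
# Hypothetical sketch using the third-party pye57 library to list the
# individual scans in a structured E57 file and load only one of them.
import pye57

e57 = pye57.E57("bigscan.e57")  # placeholder path to the structured E57

# Number of individual scans stored in the file
print(e57.scan_count)

# Read just the first scan as a dict of numpy arrays, instead of
# loading the whole 110 GB file at once.
data = e57.read_scan(0, ignore_missing_fields=True)
print(data["cartesianX"].shape)
```

Each selected scan could then be saved to its own smaller file and opened in CC individually.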