How to estimate the file size of output data

Discuss Trimble Realworks software here.
Pajosh2
I have made 10-20 posts
Posts: 15
Joined: Thu Nov 08, 2018 6:12 pm
Full Name: Pavel Grée
Company Details: XGEO
Company Position Title: executive one
Country: Czech Republic
Linkedin Profile: No
Has thanked: 5 times
Been thanked: 1 time

How to estimate the file size of output data

Post by Pajosh2 »

Hi, is there a way to estimate the file size of the output data in RW for an RCP export (in registration or production mode) based on the number of points?
The data contain color, intensity, and a coordinate system.
I know how many points I have in the cloud.
Many thanks. PG.
VXGrid
V.I.P Member
Posts: 544
Joined: Fri Feb 24, 2017 10:47 am
Full Name: Martin Graner
Company Details: PointCab GmbH
Company Position Title: Research and Development
Country: Germany
Linkedin Profile: No
Has thanked: 160 times
Been thanked: 175 times

Re: How to estimate the file size of output data

Post by VXGrid »

I don't know about Realworks, but since you want to export to RCP and want to know the resulting file size, I can give you some hints:

Short answer: calculating an exact, true byte value in advance (e.g. "you have 1000 points -> the RCP will be 253,789 bytes") is not possible.

Even a rough estimate is hard, since the RCP format is compressed and the compression rate depends on the data entropy.

More points can be compressed better, and if the points are denser, the compression should be better as well.

The raw data size can be translated to:
GigaByte = 29 bytes * point number / 1024 / 1024 / 1024
-> 29 bytes = 24 (XYZ) + 3 (Color) + 1 (Intensity/reflectivity) + 1 (validity if structured).

The resulting size then depends on the compression rate, so in the worst case it's identical to the raw data size.
I'd assume the resulting size for structured scans (mirrorballs) is somewhere around 50% of the raw size for inside scans and around 90% for outside scans.
For unified RCPs I'd expect around half of what you get with the structured RCP (because the structured one contains two point clouds: the visualized one and the mirrorballs).
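For what it's worth, here is a minimal Python sketch of that back-of-the-envelope estimate. The 29-byte layout and the 50%/90% compression factors are just the assumptions above, not values published for the RCP format, and the function/parameter names are illustrative only:

```python
# Back-of-the-envelope estimate of RCP output size from a point count.
# Assumptions from the post above: 29 bytes per raw point
# (24 XYZ + 3 color + 1 intensity + 1 validity) and a compression
# factor somewhere between ~0.5 (dense indoor, structured) and 1.0
# (worst case, no compression gain).

BYTES_PER_POINT = 24 + 3 + 1 + 1  # XYZ + color + intensity/reflectivity + validity

def estimate_rcp_size_gib(num_points: int, compression_factor: float = 0.9):
    """Return (raw_gib, estimated_gib) for the given number of points."""
    raw_gib = BYTES_PER_POINT * num_points / 1024 ** 3
    return raw_gib, raw_gib * compression_factor

if __name__ == "__main__":
    raw, est = estimate_rcp_size_gib(1_000_000_000, compression_factor=0.9)
    print(f"raw: {raw:.1f} GiB, estimated RCP (outdoor guess): {est:.1f} GiB")
```

For a billion points that gives roughly 27 GiB raw, and the estimate then scales with whichever compression factor you assume.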
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: How to estimate the file size of output data

Post by smacl »

VXGrid wrote: Thu Jun 10, 2021 10:38 am
The raw data size can be translated to:
GigaByte = 29 bytes * point number / 1024 / 1024 / 1024
-> 29 bytes = 24 (XYZ) + 3 (Color) + 1 (Intensity/reflectivity) + 1 (validity if structured).
That 24 bytes per XYZ coordinate tends to be the size used for a single precision accuracy floating point coordinate going to the GPU. There are much more efficient structures for storing very large sets of coordinates to higher accuracy from scanners and photogrammetry. A quick check here in SCC shows an average of 9-10 bytes per RGBI point for unstructured points and 11-12 for structured. PointTools POD format has similar sizing; I'd need to check with other systems.

It would make for an interesting benchmark to take a 1 billion point point cloud and check its size on disk in various open and closed formats. In SCC I allow roughly 10GB per billion points for RGBI data, more if extra dimensions are stored, which is often the case (e.g. height, clash distance, temperature), and less where RGB is not used (e.g. non-colorized scans) or where intensity is not used (e.g. photogrammetry).
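As a quick sanity check of those ballpark figures (a sketch only, using the rough averages quoted in this thread rather than any format specification):

```python
# Bytes per point implied by "roughly 10GB per billion points",
# compared with the 29 bytes/point raw layout mentioned earlier
# in the thread. Ballpark numbers only.

POINTS = 1_000_000_000

implied_bytes_per_point = 10 * 1024 ** 3 / POINTS   # ~10.7 bytes/point
raw_layout_gib = 29 * POINTS / 1024 ** 3             # ~27 GiB at 29 bytes/point

print(f"~{implied_bytes_per_point:.1f} bytes/point implied by 10 GiB per 10^9 points")
print(f"~{raw_layout_gib:.1f} GiB at the 29 bytes/point raw layout")
```

So the ~10GB-per-billion-points figure sits right in the 9-12 bytes/point range, roughly a third of the 29 bytes/point raw layout.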
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
VXGrid
V.I.P Member
Posts: 544
Joined: Fri Feb 24, 2017 10:47 am
Full Name: Martin Graner
Company Details: PointCab GmbH
Company Position Title: Research and Development
Country: Germany
Linkedin Profile: No
Has thanked: 160 times
Been thanked: 175 times

Re: How to estimate the file size of output data

Post by VXGrid »

smacl wrote: Fri Jun 11, 2021 10:11 am
VXGrid wrote: Thu Jun 10, 2021 10:38 am
The raw data size can be translated to:
GigaByte = 29 bytes * point number / 1024 / 1024 / 1024
-> 29 bytes = 24 (XYZ) + 3 (Color) + 1 (Intensity/reflectivity) + 1 (validity if structured).
That 24 bytes per XYZ coordinate tends to be the size used for a single precision accuracy floating point coordinate going to the GPU. There are much more efficient structures for storing very large sets of coordinates to higher accuracy from scanners and photogrammetry. A quick check here in SCC shows an average of 9-10 bytes per RGBI point for unstructured points and 11-12 for structured. PointTools POD format has similar sizing; I'd need to check with other systems. It would make for an interesting benchmark to take a 1 billion point point cloud and check its size on disk in various open and closed formats. In SCC I allow roughly 10GB per billion points for RGBI data, more if extra dimensions are stored, which is often the case (e.g. height, clash distance, temperature), and less where RGB is not used (e.g. non-colorized scans) or where intensity is not used (e.g. photogrammetry).
I'd love to see the comparison between the different file formats; however, I think it is not easy to pull that off and get comparable results.
Or at least we won't get them from the hardware manufacturers' side, since I think there is no way to convert a Z+F or Riegl native cloud into a FARO or Leica raw file format.

But perhaps somebody has a native FARO LSPROJ (static scans with the mentioned point count), and then perhaps another project as a LAZ.
The FARO project could be exported from Scene into an E57 and from there we can compare everything :D

A nice task for a Bachelor thesis :D
