System Benchmarking

gsisman
V.I.P Member
Posts: 898
Joined: Fri Oct 07, 2016 1:51 pm
Full Name: Steve Long
Company Details: Montgomery County DOT _ MD
Company Position Title: Land Survey Supervisor
Country: United States
Skype Name: gsisman1
Linkedin Profile: Yes
Has thanked: 767 times
Been thanked: 148 times

Re: System Benchmarking

Post by gsisman »

MikeDailey wrote: Wed Feb 19, 2020 12:20 pm
Carbix wrote: Wed Feb 19, 2020 2:45 am I think each scan environment should be relative to what the hardware was designed for. No small room scans with a Pxx series. BLK360 should be small and close. RTC360 could be a mix of mid range to inside scans (I'm curious to see how foliage will affect this kind of stuff).

I have BLK360 and RTC360 Data. Who has C10, Pxx and FARO?
I think you may be overthinking this a bit. I would suggest we look at the file size of the project created from the scanner and the number of points contained in said project. Then we would also learn which manufacturers are doing the best job of condensing the data with their own file types.

We will learn from the test, but I feel safe in assuming that the only difference interior or exterior scans would make for benchmarking purposes is file size. Full-dome scans outside will not get a return for the sky, while interior scans would get returns on the majority of measurements, making those setups have larger file sizes.
Speaking from a Leica-only background....
I would disagree about the difference between inside and outside scans. Inside scans usually provide MANY planar surfaces on all 6 sides with any range scanner and optimize very well with C2C. Sizes would also vary greatly in total number of points captured. Since Reg360 projects don't truly DELETE any points in the project DB, this would make a difference in the DB size. Sites outside with few true planes, no overhead planes, and lots of foliage cause problems with C2C registration times. Cleaning as much foliage as possible out of your scan stations or out of your bundled scans can tighten things up and reduce registration time by eliminating noise.
Sites that require a lot of cleaning will also be larger because of the extra bytes required to store the flags for visible vs. non-visible points and C2C vs. non-C2C included points in the SQL DBs used in Reg 360.
I set up a spreadsheet over two years ago to compare our scan sizes, download times, import times, size of the Reg 360 DB, size of the IMP DB, size of the RCP project, size of the JetStream/LGS file, export times, etc. I can share it as a starting point.
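For anyone who wants to log the same kinds of metrics without a spreadsheet, here is a minimal Python sketch. The directory names and metric labels are hypothetical placeholders, not the actual project layout; it just totals folder sizes and appends one row per run to a CSV.

```python
import csv
import time
from pathlib import Path

def dir_size_gb(path):
    """Total size of a directory tree in GB (0 if it doesn't exist)."""
    p = Path(path)
    if not p.exists():
        return 0.0
    return sum(f.stat().st_size for f in p.rglob("*") if f.is_file()) / 1e9

def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed seconds) for import/export timing."""
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

# Hypothetical project locations -- substitute your own paths and metrics
# (raw scans, Reg 360 DB, IMP DB, RCP project, JetStream/LGS file, ...).
metrics = {
    "raw_scan_gb": dir_size_gb("raw_scans"),
    "reg360_db_gb": dir_size_gb("reg360_project"),
    "rcp_gb": dir_size_gb("rcp_export"),
}

# Append one row per benchmark run so results accumulate over time.
with open("benchmark_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(metrics))
    if f.tell() == 0:
        writer.writeheader()
    writer.writerow(metrics)
```

Wrapping each import/export call in `timed()` gives the per-step durations to go alongside the sizes.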
Carbix
V.I.P Member
Posts: 236
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 33 times
Been thanked: 50 times

Re: System Benchmarking

Post by Carbix »

gsisman wrote: Sat Feb 22, 2020 8:51 pm
MikeDailey wrote: Wed Feb 19, 2020 12:20 pm
Carbix wrote: Wed Feb 19, 2020 2:45 am I think each scan environment should be relative to what the hardware was designed for. No small room scans with a Pxx series. BLK360 should be small and close. RTC360 could be a mix of mid range to inside scans (I'm curious to see how foliage will affect this kind of stuff).

I have BLK360 and RTC360 Data. Who has C10, Pxx and FARO?
I think you may be overthinking this a bit. I would suggest we look at the file size of the project created from the scanner and the number of points contained in said project. Then we would also learn which manufacturers are doing the best job of condensing the data with their own file types.

We will learn from the test, but I feel safe in assuming that the only difference interior or exterior scans would make for benchmarking purposes is file size. Full-dome scans outside will not get a return for the sky, while interior scans would get returns on the majority of measurements, making those setups have larger file sizes.
Speaking from a Leica-only background....
I would disagree about the difference between inside and outside scans. Inside scans usually provide MANY planar surfaces on all 6 sides with any range scanner and optimize very well with C2C. Sizes would also vary greatly in total number of points captured. Since Reg360 projects don't truly DELETE any points in the project DB, this would make a difference in the DB size. Sites outside with few true planes, no overhead planes, and lots of foliage cause problems with C2C registration times. Cleaning as much foliage as possible out of your scan stations or out of your bundled scans can tighten things up and reduce registration time by eliminating noise.
Sites that require a lot of cleaning will also be larger because of the extra bytes required to store the flags for visible vs. non-visible points and C2C vs. non-C2C included points in the SQL DBs used in Reg 360.
I set up a spreadsheet over two years ago to compare our scan sizes, download times, import times, size of the Reg 360 DB, size of the IMP DB, size of the RCP project, size of the JetStream/LGS file, export times, etc. I can share it as a starting point.
I think you just proved my point: outside scans are harder to run. My next test file is going to be 64-100 setups, all on low with no images, in a 🌳. This will test C2C and setup cleaning on export, since that is done per thread.

I want to try to keep the file size at 10-20 GB so it's not crazy for all of us to download.
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com
Carbix
V.I.P Member
Posts: 236
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 33 times
Been thanked: 50 times

Re: System Benchmarking

Post by Carbix »

At this point I was hoping to have seen more data from people. Is anyone else working on this?
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com
Dirkie2710
I have made 60-70 posts
Posts: 61
Joined: Sun Jan 14, 2018 2:06 pm
Full Name: Dirkie Uys
Company Details: Landpartners
Company Position Title: Surveyor
Country: Australia
Skype Name: duaus
Linkedin Profile: No
Has thanked: 3 times
Been thanked: 10 times

Re: System Benchmarking

Post by Dirkie2710 »

I ran your data. I couldn't find the store file, so I just used one of my own.

Below is what my data looks like.

Important notes:
Cyclone Core 9.4.2
3 separate drives (all 2 TB NVMe M.2):
- 1 for the Cyclone install
- 1 for the database
- 1 for raw data

I did some other things while this ran, but nothing major (browsing the internet, etc.), so that might affect the times.

Below are the system specs, an image of the results, and the Excel sheet for the calcs.
Carbix
V.I.P Member
Posts: 236
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 33 times
Been thanked: 50 times

Re: System Benchmarking

Post by Carbix »

Awesome data. I love the breakdown. I forgot about my import speed and image sizes; they made a big difference.
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com
gsisman
V.I.P Member
Posts: 898
Joined: Fri Oct 07, 2016 1:51 pm
Full Name: Steve Long
Company Details: Montgomery County DOT _ MD
Company Position Title: Land Survey Supervisor
Country: United States
Skype Name: gsisman1
Linkedin Profile: Yes
Has thanked: 767 times
Been thanked: 148 times

Re: System Benchmarking

Post by gsisman »

Here are 5 exterior scans:
BLK360 Hi-Res HDR imagery
Reg 360 installed on SSD, Xeon w/ 48 GB RAM, HDD as second drive for the Reg 360 TEMP dir
Imported from a Synology 8x6 HDD NAS on a 1 Gb network
16 min for import and Auto-Cloud registration (Smart Align by time) on my workstation

https://mcgov-my.sharepoint.com/:f:/g/p ... g?e=M903h4


Little_Falls_Bridge_2020-02-24_1623.png
MAchine_2020-02-24_1627.png
GPavlidis
I have made <0 posts
Posts: 5
Joined: Mon Dec 02, 2019 9:41 am
Full Name: Georgios
Company Details: NavVis GmbH
Company Position Title: Solution Manager
Country: Munich
Been thanked: 2 times

Re: System Benchmarking

Post by GPavlidis »

That's a really interesting topic to discuss. As many have already mentioned, there are many perspectives from which to approach it.

Personally, I would agree with Daniel's and smacl's answers:
Firstly, I think this is a great area of discussion, but as Daniel points out, you first need to figure out a list of results that you're trying to obtain, and why.
Regarding SLAM-based mobile scanners, NavVis has spent some time investigating this complex topic and recently published a guide to evaluating the accuracy of an indoor mobile scanner, which I thought would be of interest to you.

Feel free to take a look here: https://www.navvis.com/accuracy-handbook

Of course, the independent metrics used to verify the accuracy of mobile scanning can be applied to any SLAM-based mobile scanner of your choice (and yes, nowadays there are many out there).

Cheers!
Dirkie2710
I have made 60-70 posts
Posts: 61
Joined: Sun Jan 14, 2018 2:06 pm
Full Name: Dirkie Uys
Company Details: Landpartners
Company Position Title: Surveyor
Country: Australia
Skype Name: duaus
Linkedin Profile: No
Has thanked: 3 times
Been thanked: 10 times

Re: System Benchmarking

Post by Dirkie2710 »

Benchmark2.PNG
Ok, so I re-ran the numbers.

I had a look at the data that Daniel uploaded.

I might also be missing the point of this exercise, but based on what I assumed, this is what I came up with.
That data is non-uniform, in the sense that it is outdoors (so not always getting a return), and the scans are not the same resolution (3 high-res vs 1 medium-res).

It's also a small dataset, so it's difficult to get a good average. I decided to use some data I had lying around and ran the test. At first I ran the import 6 different ways at each cube image size (1024, 2048, 4096 & 5120), so 6 results at each cube size. The parameters that were changed are:
1. Create thumbnails on/off
2. Map colours on/off
3. Pixel filter Medium/High

I found that there is a slight difference, but not much (2-3 sec per scan), between mapping colours on import and not doing it.
Very odd results.
From the data I ran, I found a few interesting things:

1. As mentioned above, mapping colours on import vs. not mapping them (although the scan has images) makes no significant difference to import time. So not mapping them just wastes time: you would still have to apply the colours later, when it might as well have been done during import.
2. To check that the above was due to images, I ran a very similar dataset (medium res, no images taken). As expected, import times came down significantly, so much of the import time is spent on mapping colours.
3. High vs. medium pixel filtering does have an impact on import time, but a very slight one, similar to the impact of mapping colours. Maybe 1-2 seconds in total.
4. To double-check the above, a third dataset was run (low resolution, no images taken) and import times were measured again.


So, in conclusion:

None of the imports maxed out the RAM (128 GB). I was running them all on Fast, and although usage went high (avg 40 GB), I never used more than around 80 GB of RAM. I'm guessing that's just the limit of Cyclone only doing 4 at a time. Previous machines with less RAM crashed on Fast, so more RAM is good. It would be good if Cyclone could use more RAM.

The GPU did barely anything. A bit of a waste: a 24 GB RTX 6000 sitting idle the whole time.

The CPU is a standard i9-9900X, running 10 cores at 3.5 GHz. I'm trying to get IT to enable turbo boost to reach around 4.2 GHz and see if there's any change. During import it would run at around 60% and then jump to full load for a few seconds. Hoping that makes a difference as well.

I think the speed of the SSDs (3 × 2 TB M.2) and the layout of the folders (Cyclone install on C:\, DB on D:\, and RAW on E:\) does make a difference. At the very least, having an SSD capable of high speeds makes a difference.

Trying to map colours to scans that don't have any images affects the import time (a few seconds per scan), which is odd.

On the most usual import type (medium res with images, importing 5120-size cubes) I had an average import time of 59 sec per scan. The previous machine was doing over 2 min 20 sec on the same type of scan with 2048-size cube images. So not too bad.

See below for the comparison chart.
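For anyone repeating this kind of sweep, the test matrix above can be driven from a short script. This is a sketch, not Cyclone automation: `run_import` is a hypothetical stand-in for one timed import, and it enumerates the full factorial of the three toggles (8 combinations per cube size, slightly more than the 6 runs described):

```python
import itertools
import statistics
import time

CUBE_SIZES = [1024, 2048, 4096, 5120]
PARAMS = {
    "thumbnails": [True, False],
    "map_colours": [True, False],
    "pixel_filter": ["Medium", "High"],
}

def run_import(cube_size, thumbnails, map_colours, pixel_filter):
    """Stand-in for one timed import run -- replace with the real import."""
    time.sleep(0.001)  # placeholder for the actual work

# Time every combination of settings at every cube size.
results = {}
for cube in CUBE_SIZES:
    for combo in itertools.product(*PARAMS.values()):
        settings = dict(zip(PARAMS, combo))
        t0 = time.perf_counter()
        run_import(cube, **settings)
        results[(cube, combo)] = time.perf_counter() - t0

# Average import time per cube size, like the comparison chart.
for cube in CUBE_SIZES:
    times = [t for (c, _), t in results.items() if c == cube]
    print(cube, f"{statistics.mean(times):.4f}s")
```

Logging every `(cube, combo)` pair rather than just the averages also makes it easy to spot which single toggle drives the difference.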
James Hall
V.I.P Member
Posts: 220
Joined: Tue Feb 02, 2010 5:13 pm
Full Name: James E Hall
Company Details: Dewberry Engineering Inc
Company Position Title: Survey Technician - Cyclone Modeler
Country: USA
Location: Frederick, MD
Has thanked: 5 times
Been thanked: 37 times

Re: System Benchmarking

Post by James Hall »

Carbix wrote: Sun Feb 23, 2020 1:51 am At this point I was hoping to have seen more data from people. Is anyone else working on this?
I'll be putting up times; I have some deadlines to get out first.

James,
Carbix
V.I.P Member
Posts: 236
Joined: Sat Mar 16, 2019 1:36 am
Full Name: Daniel Loney
Company Details: Excelsior Measuring
Company Position Title: Owner
Country: Canada
Linkedin Profile: Yes
Location: Canada
Has thanked: 33 times
Been thanked: 50 times

Re: System Benchmarking

Post by Carbix »

Dirkie2710 wrote: Thu Feb 27, 2020 4:37 am Benchmark2.PNG Ok, so I re-ran the numbers.

I had a look at the data that Daniel uploaded.

I might also be missing the point of this exercise, but based on what I assumed, this is what I came up with.
That data is non-uniform, in the sense that it is outdoors (so not always getting a return), and the scans are not the same resolution (3 high-res vs 1 medium-res).

It's also a small dataset, so it's difficult to get a good average. I decided to use some data I had lying around and ran the test. At first I ran the import 6 different ways at each cube image size (1024, 2048, 4096 & 5120), so 6 results at each cube size. The parameters that were changed are:
1. Create thumbnails on/off
2. Map colours on/off
3. Pixel filter Medium/High

I found that there is a slight difference, but not much (2-3 sec per scan), between mapping colours on import and not doing it.
Very odd results.
From the data I ran, I found a few interesting things:

1. As mentioned above, mapping colours on import vs. not mapping them (although the scan has images) makes no significant difference to import time. So not mapping them just wastes time: you would still have to apply the colours later, when it might as well have been done during import.
2. To check that the above was due to images, I ran a very similar dataset (medium res, no images taken). As expected, import times came down significantly, so much of the import time is spent on mapping colours.
3. High vs. medium pixel filtering does have an impact on import time, but a very slight one, similar to the impact of mapping colours. Maybe 1-2 seconds in total.
4. To double-check the above, a third dataset was run (low resolution, no images taken) and import times were measured again.


So, in conclusion:

None of the imports maxed out the RAM (128 GB). I was running them all on Fast, and although usage went high (avg 40 GB), I never used more than around 80 GB of RAM. I'm guessing that's just the limit of Cyclone only doing 4 at a time. Previous machines with less RAM crashed on Fast, so more RAM is good. It would be good if Cyclone could use more RAM.

The GPU did barely anything. A bit of a waste: a 24 GB RTX 6000 sitting idle the whole time.

The CPU is a standard i9-9900X, running 10 cores at 3.5 GHz. I'm trying to get IT to enable turbo boost to reach around 4.2 GHz and see if there's any change. During import it would run at around 60% and then jump to full load for a few seconds. Hoping that makes a difference as well.

I think the speed of the SSDs (3 × 2 TB M.2) and the layout of the folders (Cyclone install on C:\, DB on D:\, and RAW on E:\) does make a difference. At the very least, having an SSD capable of high speeds makes a difference.

Trying to map colours to scans that don't have any images affects the import time (a few seconds per scan), which is odd.

On the most usual import type (medium res with images, importing 5120-size cubes) I had an average import time of 59 sec per scan. The previous machine was doing over 2 min 20 sec on the same type of scan with 2048-size cube images. So not too bad.

See below for the comparison chart.
This is great data. My goal is to see which systems run this stuff best. I wanted the file size to be large enough to tax a system but small enough that we could do multiple runs without them taking forever (we all have real work to do).

I'm very interested in how fast your system did the import. I have been using Cyclone Register 360; it looks like you're using Cyclone "original" (it needs a better naming system).

One thing that's turning out to be very true is that the number of cores does not change the import time. Core speed is the big one.
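That observation is consistent with a largely serial import pipeline: extra cores only help work that can fan out. A toy Python sketch (not Cyclone itself; `crunch` is a hypothetical CPU-bound stand-in) contrasts the two cases:

```python
import time
from concurrent.futures import ProcessPoolExecutor

def crunch(n):
    """CPU-bound stand-in for one serial import stage."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    N = 1_000_000

    # Serial: four stages on one core; wall time tracks clock speed, not core count.
    t0 = time.perf_counter()
    for _ in range(4):
        crunch(N)
    serial = time.perf_counter() - t0

    # Parallel: four worker processes; only helps if the pipeline can fan out.
    t0 = time.perf_counter()
    with ProcessPoolExecutor(max_workers=4) as ex:
        list(ex.map(crunch, [N] * 4))
    parallel = time.perf_counter() - t0

    print(f"serial: {serial:.2f}s  4-way parallel: {parallel:.2f}s")
```

If the real workload behaved like the serial loop, a faster clock would shorten it and more cores would not, which matches what the import timings here suggest.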
Daniel Loney - Owner
Excelsior Measuring inc.
Vancouver - Okanagan - Calgary
www.ExcelsiorLevel.com