Laser Scanning Pain Points


What's your biggest bugbear with your current scanning setup?

Poll ended at Mon Sep 27, 2021 12:36 pm

Calibration cost - time & money: 10 votes (37%)
Gaps in data/missing data: 2 votes (7%)
Handling very large datasets: 11 votes (41%)
Having to level the scanner: 0 votes
Other (please comment below): 4 votes (15%)

Total votes: 27

stuartcadge83
Posts: 3
Joined: Mon Sep 20, 2021 11:14 am
Full Name: Stuart Guthrie-Cadge
Company Details: KOREC Group
Company Position Title: Marketing Manager
Country: United Kingdom
Linkedin Profile: Yes
Been thanked: 1 time

Re: Laser Scanning Pain Points

Post by stuartcadge83 »

Morning all, thanks to everyone who took part and voted/commented in our poll. The results are here:

https://www.korecgroup.com/our-laser-sc ... ts-are-in/

We've collated the results from this LSF poll and our LinkedIn poll; we had 88 responses altogether.

I find it interesting that the number one bugbear is the size of datasets - not a surprise as such, but it's striking that it's the top reason for most people. It's not something we talk about much as a distributor, so maybe it's something we need to focus on a little more in our discussions.

Anyway, thanks again to all who took part :)

Stuart
Least
I have made 80-90 posts
Posts: 82
Joined: Wed Jan 23, 2013 5:15 pm
Full Name: Patrick
Company Details: 360G
Company Position Title: Director
Country: UK
Has thanked: 10 times
Been thanked: 13 times

Re: Laser Scanning Pain Points

Post by Least »

1. Handling very large data sets (45%)

Additionally, the Trimble X7 captures four times less data (500,000 points per second compared to the 2 million or 1 million of other manufacturer scanners) which of course means less data to store and handle. :lol: A bit of spin there..

I was thinking more along the lines of 'smart decimation', i.e. flat planar areas can take more decimation than areas with curves or angle changes etc. without losing geometric detail.
Along with that, of course, an open point cloud format so that exporting to different file types is no longer required.
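
For what it's worth, a rough sketch of that idea in Python/NumPy: estimate a per-point "curvature" proxy from local PCA, then thin the flat regions much harder than the detailed ones. The neighbourhood size, cell sizes and thresholds are illustrative assumptions, not any vendor's shipping algorithm.

```python
# Sketch of 'smart decimation': thin flat areas hard, keep detail where the
# surface curves. All thresholds below are illustrative only.
import numpy as np
from scipy.spatial import cKDTree

def surface_variation(points, k=16):
    """Per-point curvature proxy: smallest PCA eigenvalue over the eigenvalue sum."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    var = np.empty(len(points))
    for i, nbrs in enumerate(idx):
        eig = np.linalg.eigvalsh(np.cov(points[nbrs].T))  # ascending eigenvalues
        var[i] = eig[0] / eig.sum()                       # ~0 on planes, larger on edges
    return var

def grid_thin(points, cell):
    """Indices keeping one point per cubic cell of size `cell` (metres)."""
    keys = np.floor(points / cell).astype(np.int64)
    _, keep = np.unique(keys, axis=0, return_index=True)
    return keep

def smart_decimate(points, flat_cell=0.02, detail_cell=0.005, curve_thresh=0.01):
    var = surface_variation(points)
    flat = var < curve_thresh
    keep_flat = np.flatnonzero(flat)[grid_thin(points[flat], flat_cell)]        # 2 cm on planes
    keep_detail = np.flatnonzero(~flat)[grid_thin(points[~flat], detail_cell)]  # 5 mm on detail
    return points[np.sort(np.concatenate([keep_flat, keep_detail]))]

# pts = np.loadtxt("scan.xyz")[:, :3]   # placeholder input
# thinned = smart_decimate(pts)
```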
landmeterbeuckx
V.I.P Member
Posts: 1615
Joined: Tue May 01, 2012 5:19 pm
Full Name: Lieven Beuckx
Company Details: Studiebureau Beuckx
Company Position Title: Owner
Country: Belgium
Linkedin Profile: Yes
Has thanked: 183 times
Been thanked: 548 times

Re: Laser Scanning Pain Points

Post by landmeterbeuckx »

Least wrote: Tue Sep 28, 2021 12:21 pm 1. Handling very large data sets (45%)

Additionally, the Trimble X7 captures four times less data (500,000 points per second compared to the 2 million or 1 million of other manufacturer scanners) which of course means less data to store and handle. :lol: A bit of spin there..

I was thinking more along the line of 'smart decimation', ie. flat planar areas can handle more decimation than areas with curves or angle changes etc. without losing geometric detail.
Along of course with an open point cloud format so that exporting to different file types is no longer required..
I tend to make decimated datasets at 5mm, 1cm and 2cm. Even a 5cm one has enough data for certain workflows.
LSBbvba
Surveying services - 3D Laserscanning
Tel : +32477753126
www.lsbbvba.be
[email protected]
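
A minimal sketch of producing fixed-spacing tiers like the 5 mm / 1 cm / 2 cm sets mentioned above, using Open3D's voxel downsampling; the file names are placeholders and the cloud is assumed to already be in a format Open3D reads (PLY, PCD, etc.).

```python
# Write one decimated copy of the cloud per target spacing.
import open3d as o3d

cloud = o3d.io.read_point_cloud("project_full.ply")   # placeholder file name

for spacing in (0.005, 0.01, 0.02):                    # metres: 5 mm, 1 cm, 2 cm
    tier = cloud.voxel_down_sample(voxel_size=spacing)
    name = f"project_{int(spacing * 1000)}mm.ply"
    o3d.io.write_point_cloud(name, tier)
    print(name, len(tier.points), "points")
```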
badam
V.I.P Member
Posts: 916
Joined: Tue May 11, 2021 5:36 pm
Full Name: Adam Berta
Company Details: InnoScan 3D Hungary Kft
Company Position Title: unknown
Country: Hungary
Linkedin Profile: No
Has thanked: 51 times
Been thanked: 297 times

Re: Laser Scanning Pain Points

Post by badam »

Least wrote: Tue Sep 28, 2021 12:21 pm 1. Handling very large data sets (45%)

Additionally, the Trimble X7 captures four times less data (500,000 points per second compared to the 2 million or 1 million of other manufacturer scanners) which of course means less data to store and handle. :lol: A bit of spin there..

I was thinking more along the line of 'smart decimation', ie. flat planar areas can handle more decimation than areas with curves or angle changes etc. without losing geometric detail.
Along of course with an open point cloud format so that exporting to different file types is no longer required..
That is a dream which never comes true. Export will always be necessary.

As long as you don't need to use the point clouds inside architectural applications such as ArchiCAD, Revit, etc., you can decimate the cloud "smartly", but if you need to block the view, to get a better understanding in 3D, or to cut a section, you need as many points as you can get. Maybe if the industry changes to meshes instead of points this could change; until then, load as many points into your downstream app as you can without performance issues. What BricsCAD does looks like a very good solution based on the marketing videos: it handles large amounts of data as easily as the registration software, or Potree.

You can calculate curvatures and use those in the CloudCompare subsample tool (it's in the wiki), but that is only good for clouds intended for meshing or other non-scan-to-BIM workflows.

P.S.: the 2 million pps figures and such are just sales numbers... And it is "up to ..."; it does not include colorisation time, just the raw point capture, and only in special cases.
[Attachment: rtc360_pps.PNG]
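
Whatever the real sustained rate, the quantity that bites downstream is delivered points multiplied by bytes per point. A purely illustrative back-of-envelope (assumed scan duration and storage per point, not any manufacturer's figures):

```python
# Back-of-envelope only: all numbers are assumptions for illustration.
BYTES_PER_POINT = 28          # e.g. XYZ as 3 x 8-byte floats + RGB + intensity, uncompressed
SCAN_SECONDS = 120            # assumed time spent actually capturing points per setup

def setup_size_gb(points_per_second):
    return points_per_second * SCAN_SECONDS * BYTES_PER_POINT / 1e9

for pps in (500_000, 1_000_000, 2_000_000):
    print(f"{pps:>9,} pts/s -> ~{setup_size_gb(pps):.1f} GB per setup, uncompressed")
```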
Jamesrye
V.I.P Member
Posts: 643
Joined: Mon Aug 11, 2008 4:13 pm
Full Name: James Rye
Company Details: Merrett Survey Partnership
Company Position Title: Spatial Analyst
Has thanked: 28 times
Been thanked: 69 times

Re: Laser Scanning Pain Points

Post by Jamesrye »

Least wrote: Tue Sep 28, 2021 12:21 pm 1. Handling very large data sets (45%)

Additionally, the Trimble X7 captures four times less data (500,000 points per second compared to the 2 million or 1 million of other manufacturer scanners) which of course means less data to store and handle. :lol: A bit of spin there..

I was thinking more along the line of 'smart decimation', ie. flat planar areas can handle more decimation than areas with curves or angle changes etc. without losing geometric detail.
Along of course with an open point cloud format so that exporting to different file types is no longer required..
It is a bit of spin, but there's some truth to it. I used the Trimble X7 on a large building survey (circa 500 scans). Each floor was perfectly registered on site and imported into Cyclone for checking (just auto-added cloud constraints and made the slices), then the targets were located and checked/used to coordinate the data. The whole process was simplicity itself. The smaller file size meant that I could process the data extremely quickly: 3 days on site and less than 1 day to export/import/register/export! The BIM modelling team got the point cloud in record time.
[Attachment: Clipboard01.jpg]

The quality of the data was similar to that of the old C10 scanner.
Scott
V.I.P Member
Posts: 1037
Joined: Tue Mar 29, 2011 7:39 pm
Full Name: Scott Page
Company Details: Scott Page Design- Architectural service
Company Position Title: Owner
Country: USA
Linkedin Profile: No
Location: Berkeley, CA USA
Has thanked: 205 times
Been thanked: 78 times

Re: Laser Scanning Pain Points

Post by Scott »

Jamesrye wrote: Tue Sep 28, 2021 4:55 pm
Least wrote: Tue Sep 28, 2021 12:21 pm 1. Handling very large data sets (45%)

Additionally, the Trimble X7 captures four times less data (500,000 points per second compared to the 2 million or 1 million of other manufacturer scanners) which of course means less data to store and handle. :lol: A bit of spin there..

I was thinking more along the line of 'smart decimation', ie. flat planar areas can handle more decimation than areas with curves or angle changes etc. without losing geometric detail.
Along of course with an open point cloud format so that exporting to different file types is no longer required..
It is a bit of spin, but there's some truth to it. I used the Trimble X7 on a large building survey (circa 500 scans or so). Each floor was perfectly registered on site and imported in to Cyclone for checking (just auto-added cloud constraints and made the slices) then the targets were located and checked/ used to coordinate the data. The whole process was simplicity itself. The smaller file size meant that I could process the data extremely quickly. 3 days on site - and less than 1 day to export/import/register/export!!! The BIM modelling team got the point cloud in record time.
The quality of the data was similar to that of the old C10 scanner.
Is it possible to show us a 'slice' or two (plan and section/elevation)?
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Laser Scanning Pain Points

Post by smacl »

Least wrote: Tue Sep 28, 2021 12:21 pm I was thinking more along the line of 'smart decimation', ie. flat planar areas can handle more decimation than areas with curves or angle changes etc. without losing geometric detail.
Agreed entirely, decimation is a very blunt tool. Far better to identify planes and other geometric objects and reduce on that basis. Joon's thread on this subject is well worth a read. Where distance based decimation does well is in high density static scans where there are a huge number of redundant points near each setup station. I found decimating a rail tunnel to 2mm removed about half the points on a P40 scan where you had to look very closely to see any differences.
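
As a rough illustration of reducing on identified geometry rather than blanket decimation, a sketch using Open3D's RANSAC plane segmentation; the thresholds and file names are illustrative only, not anyone's production pipeline.

```python
# Peel off dominant planes, thin them hard, keep everything else dense.
import open3d as o3d

cloud = o3d.io.read_point_cloud("tunnel.ply")          # placeholder file name
remaining, thinned_planes = cloud, []

for _ in range(5):                                     # peel off up to 5 dominant planes
    _, inliers = remaining.segment_plane(distance_threshold=0.01,
                                         ransac_n=3, num_iterations=1000)
    plane = remaining.select_by_index(inliers)
    thinned_planes.append(plane.voxel_down_sample(voxel_size=0.02))   # 2 cm on planes
    remaining = remaining.select_by_index(inliers, invert=True)

reduced = remaining.voxel_down_sample(voxel_size=0.005)               # 5 mm elsewhere
for plane in thinned_planes:
    reduced += plane
o3d.io.write_point_cloud("tunnel_reduced.ply", reduced)
```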
Least wrote: Along of course with an open point cloud format so that exporting to different file types is no longer required.
As per Daniel's recent thread, a fast open SDK to access native formats is a more efficient solution here. It removes one more translation stage in the processing workflow and makes it safe to archive your point clouds in the native format of whatever software you happen to be using. Open point cloud formats tend to suffer from technical shortcomings and patchy manufacturer support. It could well be we see a transition from point clouds as we understand them now to hybrids of point cloud, images, meshes and other geometric and attribute data. Trying to get an open format to second guess how this will all pan out is trying to read too many tea leaves in my humble opinion.
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module
jedfrechette
V.I.P Member
Posts: 1236
Joined: Mon Jan 04, 2010 7:51 pm
Full Name: Jed Frechette
Company Details: Lidar Guys
Company Position Title: CEO and Lidar Supervisor
Country: USA
Linkedin Profile: Yes
Location: Albuquerque, NM
Has thanked: 62 times
Been thanked: 219 times

Re: Laser Scanning Pain Points

Post by jedfrechette »

smacl wrote: Tue Sep 28, 2021 7:42 pm It could well be we see a transition from point clouds as we understand them now to hybrids of point cloud, images, meshes and other geometric and attribute data. Trying to get an open format to second guess how this will all pan out is trying to read too many tea leaves in my humble opinion.
Some of us are already working heavily with hybrid data models like you describe and are rapidly moving towards that being our primary way of working. The ONLY scalable way I can see to reach that goal without coding everything from scratch, which is not practical, is with Open Source tools and formats, namely by leveraging USD. Proprietary 3D software vendors, and scanning focused ones in particular, may do some things better, but they're way behind in this particular area.
Jed
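
To make the USD suggestion concrete for anyone who hasn't met it, a toy sketch of a hybrid stage holding a decimated point cloud and a mesh side by side, each carrying its own attributes; the paths, attribute names and data are placeholders rather than an established scanning schema.

```python
# Toy USD stage: points and a mesh in one hierarchy, plus a custom per-point attribute.
from pxr import Usd, UsdGeom, Vt, Gf, Sdf

stage = Usd.Stage.CreateNew("site.usda")
UsdGeom.Xform.Define(stage, "/Site")

points = UsdGeom.Points.Define(stage, "/Site/ScanPoints")
points.CreatePointsAttr(Vt.Vec3fArray([Gf.Vec3f(0, 0, 0), Gf.Vec3f(1, 0, 0)]))
points.CreateWidthsAttr(Vt.FloatArray([0.01, 0.01]))
intensity = points.GetPrim().CreateAttribute("intensity", Sdf.ValueTypeNames.FloatArray)
intensity.Set(Vt.FloatArray([0.8, 0.4]))               # arbitrary example values

mesh = UsdGeom.Mesh.Define(stage, "/Site/GroundMesh")
mesh.CreatePointsAttr(Vt.Vec3fArray([Gf.Vec3f(0, 0, 0), Gf.Vec3f(1, 0, 0), Gf.Vec3f(0, 1, 0)]))
mesh.CreateFaceVertexCountsAttr(Vt.IntArray([3]))
mesh.CreateFaceVertexIndicesAttr(Vt.IntArray([0, 1, 2]))

stage.GetRootLayer().Save()
```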
smacl
Global Moderator
Posts: 1409
Joined: Tue Jan 25, 2011 5:12 pm
Full Name: Shane MacLaughlin
Company Details: Atlas Computers Ltd
Company Position Title: Managing Director
Country: Ireland
Linkedin Profile: Yes
Location: Ireland
Has thanked: 627 times
Been thanked: 657 times

Re: Laser Scanning Pain Points

Post by smacl »

jedfrechette wrote: Tue Sep 28, 2021 10:48 pm
smacl wrote: Tue Sep 28, 2021 7:42 pm It could well be we see a transition from point clouds as we understand them now to hybrids of point cloud, images, meshes and other geometric and attribute data. Trying to get an open format to second guess how this will all pan out is trying to read too many tea leaves in my humble opinion.
Some of us are already working heavily with hybrid data models like you describe and are rapidly moving towards that being our primary way of working. The ONLY scalable way I can see to reach that goal without coding everything from scratch, which is not practical, is with Open Source tools and formats, namely by leveraging USD. Proprietary 3D software vendors, and scanning focused ones in particular, may do some things better, but they're way behind in this particular area.
The problem here is that, as with proprietary tools, there are many similar, competing open source tools and formats, each of which is great at what it is primarily intended for and not quite fit for purpose for other applications. I hadn't come across USD, which appears to be a natural progression from previous scene-graph-based tools such as OSG and VSG. They are great tools for rendering and interaction, though very time-consuming to learn and difficult to use with point clouds. Compare these to the domain-specific BIM tools, such as IFC Rail, and you see there is far more needed for data storage standards than a scene graph. There are many such BIM domains covered by IFC, all with big open standards and APIs, and while they can be front-ended with scene graph tools, you regularly hit schema clashes in doing so. Still well worth doing, but no one tool is a panacea.

The point here is that the geospatial industry is branching in a lot of different directions at great speed. We can leverage many open source and proprietary tools to create solutions for a given context, but I'm very much of the opinion that we won't ever see a single solution that is fit for purpose for all contexts, as they are too diverse. Even within the very narrow domain of open source point cloud formats, we've failed to arrive at a solution that is a good fit across all contexts. Scope creep has been the undoing of very many software development endeavours and is something to be very wary of. The thinking behind looking for an open interface for point cloud exchange is that an interface is considerably simpler than either a full-blown API or a new data format, while remaining extensible and complementary to both open source and proprietary solutions.

FWIW, I'm a huge fan of open source software but not to the exclusion of proprietary solutions.
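
To make the interface-versus-API/format distinction concrete, a purely hypothetical sketch (Python used as pseudocode; a real exchange interface would more likely be a small C SDK, and none of these names exist in any current product):

```python
# Hypothetical exchange interface: the consumer only ever sees chunks of points
# and a list of attribute names, regardless of the underlying native format.
from typing import Iterator, Protocol
import numpy as np

class PointCloudSource(Protocol):
    def attributes(self) -> list:
        """Names of per-point attributes available, e.g. ['intensity', 'rgb']."""
        ...

    def chunks(self, max_points: int = 1_000_000) -> Iterator:
        """Yield dicts of equal-length arrays, always including 'xyz' of shape (n, 3)."""
        ...

def bounding_box(source: PointCloudSource):
    """Example consumer: streams chunks without ever touching the native format."""
    lo, hi = np.full(3, np.inf), np.full(3, -np.inf)
    for chunk in source.chunks():
        xyz = chunk["xyz"]
        lo = np.minimum(lo, xyz.min(axis=0))
        hi = np.maximum(hi, xyz.max(axis=0))
    return lo, hi
```

Any vendor could wrap its native store behind something of this shape, and any consumer could stream from it without an intermediate conversion step.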
Shane MacLaughlin
Atlas Computers Ltd
www.atlascomputers.ie

SCC Point Cloud module