TIL-post: point cloud processing

We’re working on a small archaeo-acoustics project whose starting point is a point cloud in LAS format, obtained from a LiDAR scan. This is the first time we’ve worked with this kind of data, so I’m writing down the data processing steps.
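Before anything else, it helps to peek at the LAS file programmatically just to confirm what is in it. Here is a minimal sketch using laspy (2.x); the file name is only a placeholder, not the actual scan:

```python
import numpy as np
import laspy

# Placeholder file name: substitute the actual LAS file from the scan
las = laspy.read("cave_scan.las")
print("Number of points:", las.header.point_count)

# Scaled x/y/z coordinates, converted to a plain NumPy array
xyz = np.vstack((las.x, las.y, las.z)).T
print("Bounding box min:", xyz.min(axis=0))
print("Bounding box max:", xyz.max(axis=0))
```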

There is a complete tutorial on handling point clouds in Blender by Florent Poux: https://youtu.be/DCkFhHNeSc0

My first attempt at visualizing the data used Blender, with this plugin: https://github.com/nittanygeek/LiDAR-Importer

You only need to pay a bit of attention to the Python path while installing it, especially if you have multiple Python versions installed, but otherwise the provided instructions are perfect. We were able to see the points in Blender and get a general idea of the shape of the cave, but getting from there to an actual mesh is non-trivial.
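If you are unsure which Python your Blender build is actually using, a quick check from Blender's scripting console looks something like this (paths will differ on your machine):

```python
import sys

# Run inside Blender's Python console: shows which interpreter
# and site-packages directory an add-on will end up using
print(sys.version)
print(sys.exec_prefix)
print([p for p in sys.path if "site-packages" in p])
```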

Instead, we found a short tutorial for doing the same thing using CloudCompare.

As a super short summary: open your point cloud in CloudCompare, select it, and convert it to a mesh via Plugins > PoissonRecon. For the mesh color you can choose between the color actually captured for each point and a density heatmap, where red marks the maximum information density and blue the minimum.
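If you prefer a scriptable route, the same Poisson-reconstruction idea can be sketched with Open3D instead of CloudCompare. This is only an illustrative sketch: the file names, normal-estimation radius and octree depth are placeholder values that would need tuning for a real scan.

```python
import numpy as np
import laspy
import open3d as o3d

# Load the LAS points (placeholder file name)
las = laspy.read("cave_scan.las")
xyz = np.vstack((las.x, las.y, las.z)).T

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(xyz)

# Poisson reconstruction needs oriented normals; radius/max_nn are guesses
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.5, max_nn=30)
)
pcd.orient_normals_consistent_tangent_plane(30)

# depth controls the octree resolution; densities is a per-vertex density
# estimate, roughly the quantity that the heatmap colors
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9
)
o3d.io.write_triangle_mesh("cave_mesh.ply", mesh)
```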

In the next steps we’ll need to convert the mesh to quads (using Blender), import the model into Ramsete, assign materials to each face, and run the simulation. To properly calibrate the model we’ll need to know the specifics of each material, or ideally perform actual acoustic measurements, like we did for the Tindari paper.
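The quads step on the Blender side can also be scripted rather than done by hand. A minimal sketch, assuming the Poisson mesh has already been imported and is the active object:

```python
import bpy

# Assumes the reconstructed mesh is the active object in the scene
obj = bpy.context.active_object
assert obj is not None and obj.type == 'MESH'

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
# Merge pairs of triangles into quads where the geometry allows it
bpy.ops.mesh.tris_convert_to_quads()
bpy.ops.object.mode_set(mode='OBJECT')
```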

