3D scanning technology is emerging as a crucial aspect of engineering design and simulation, but how can a simple sensor produce an accurate 3D model?
3D scanners can be used to generate models of rooms, parts, components, and even people, and for many engineering companies they have become essential to the business. Any device with an image or light sensor and some positioning technology can serve as a 3D scanner. These devices, often phones or tablets, measure the objects around them using lasers or images to generate highly dense point clouds or polygon meshes that can be transformed into a CAD-compatible file. In theory it sounds simple – just point your camera or sensor around the room and the 3D file is generated – but there is a reason this technology is only starting to grow within the engineering and technology industry, so let’s get into the technical aspects of what makes it possible.
Processing power is key to what makes modern 3D scanners possible. For most of the modern technology age we have had the knowledge to build 3D scanners; the problem has always been that generating highly accurate, dense point clouds of the physical world demanded more processing power than was feasible. We are seeing this technology emerge now because you hold all of that processing power right in your pocket. There are currently many mobile apps that can transform your device into a 3D scanner, and a quick Google search will yield plenty of options. More complex engineering applications typically require dedicated machines that use lasers and precise positioning. Within these, there are different types of 3D scanners for different applications: short range, mid range, and long range.
Short-range laser scanners typically cover a depth of field of less than 1 meter. These normally use laser triangulation systems, which involve a source and a sensor, each placed at a known location. The source fires a laser at the observed object, and the sensor receives the reflected light at a known point; with some simple geometry, a point in 3D space can be computed. Repeat this process and a dense point cloud can be generated. Another short-range laser system that relies on triangulation is known as a structured light scanner. Instead of firing one laser pulse after another at the object and observing each reflection location, these scanners project a series of linear light patterns onto the object. By observing how the lines deform around the object, the software can triangulate a point cloud scan.
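The "simple geometry" behind triangulation can be sketched in a few lines. This is a simplified 2D illustration of the principle, not any particular scanner's algorithm: the source and sensor sit at either end of a baseline of known length, each measures the angle between the baseline and its line of sight to the laser spot, and the law of sines recovers the spot's position.

```python
import math

def triangulate(baseline_m, source_angle_rad, sensor_angle_rad):
    """Locate the laser spot from a known baseline and two measured angles.

    The source is at the origin and the sensor at (baseline_m, 0). Each
    measures the angle between the baseline and its line of sight to the
    spot. The triangle's angles sum to pi, so the angle at the spot is
    known, and the law of sines gives the range from the source.
    """
    spot_angle = math.pi - source_angle_rad - sensor_angle_rad
    range_from_source = baseline_m * math.sin(sensor_angle_rad) / math.sin(spot_angle)
    # Convert range and emission angle into coordinates:
    # x runs along the baseline, z is the depth away from it.
    x = range_from_source * math.cos(source_angle_rad)
    z = range_from_source * math.sin(source_angle_rad)
    return x, z
```

With a 1 m baseline and both angles at 45°, the spot sits 0.5 m along the baseline and 0.5 m deep. Real scanners repeat this measurement thousands of times per second to accumulate a point cloud.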
Mid- and long-range scanning systems need slightly different laser imaging technology to function. They normally utilize a laser-pulse-based approach known as time-of-flight scanning. These systems rely on extremely precise timing, down to the picosecond, to record how long a laser pulse takes to hit an object and return. Through the use of 360˚ rotating mirrors, they can quickly and easily develop highly accurate models of the object. A slight variant of these time-of-flight systems uses phase-shift technology. Without getting into too much of the nitty-gritty physics, these systems modulate the power and amplitude of the laser wave and monitor the phase change of the returning signal to develop more accurate 3D scans.
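Both range calculations reduce to short formulas. As a rough sketch of the underlying physics (assumed from the standard relations, not taken from any scanner's firmware): a time-of-flight pulse travels out and back, so the one-way distance is half the round trip times the speed of light, while a phase-shift system converts the measured phase offset of an amplitude-modulated beam into distance.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s):
    """Time-of-flight range: the pulse covers the distance twice,
    so the one-way range is c * t / 2."""
    return C * round_trip_s / 2.0

def phase_shift_distance(phase_shift_rad, mod_freq_hz):
    """Phase-shift range: the returning beam lags by a phase offset
    proportional to distance. Note this is ambiguous beyond half the
    modulation wavelength, which is why real systems combine several
    modulation frequencies."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)
```

A 1 ns round trip corresponds to roughly 15 cm of range, which makes the need for picosecond-level timing concrete: each picosecond of timing error is about 0.15 mm of range error.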
Laser scanners will likely always be more accurate than the image-sensor scanners currently available on mobile platforms. However, for many applications like building surveying and architectural modeling, image sensors can accomplish the scanning job to the necessary degree of precision. Integrated with simulation software, 3D scanning makes it possible to simulate the actual as-built component rather than the idealized CAD design. As these scanning technologies continue to mature, we will likely see their deeper integration into engineering operations, potentially feeding into IoT technologies and real-time dimensional feedback.