As technology continues to improve and costs decrease, the applications of LiDAR are likely to increase dramatically. At the Sensors Converge 2022 conference, LiDAR companies are demonstrating amazing technology for applications that include automobiles, robots, drones, smart cities, logistics, and even road repairs.
In this article, let’s dive into some of the technology being demonstrated at this year’s conference.
XenomatiX True Solid-State LiDAR
The first company we’ll dig into is XenomatiX. “It’s not a solid-state LiDAR; it’s a true solid-state LiDAR,” emphasized Jacopo Alaimo, XenomatiX’s Sales and Business Development Manager, when describing the company’s “XenoLidar-X” products. The difference from competitors is that XenomatiX uses neither mechanical scanning nor micromirrors to create a 3D point cloud.
Instead, XenomatiX uses thousands of VCSELs (vertical-cavity surface-emitting lasers) to map a scene in a single flash. A global shutter allows the system to collect a full frame of data with a single snapshot of the CMOS image sensor.
XenomatiX award-winning true solid-state LiDAR sensor system. Image used courtesy of XenomatiX
The CMOS image sensor not only receives the LiDAR photons to generate a 3D topographic point cloud but is also used to capture a 2D visual background image. The 3D point cloud and 2D image can be overlaid to provide additional scene information or combined with sensor fusion for improved object detection.
The 2D visual image information is also used for noise reduction. The ambient light collected by the standard image capture provides a background lighting baseline that can be subtracted during the 3D LiDAR photon collection.
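The ambient-subtraction idea can be sketched in a few lines. This is an illustrative simulation only, not XenomatiX’s actual pipeline; the frame size and photon counts are made-up assumptions, with shot noise modeled as Poisson arrivals.

```python
import numpy as np

# Hypothetical frame dimensions and photon rates (not XenomatiX specs).
H, W = 480, 640
rng = np.random.default_rng(seed=0)

# Baseline frame: ambient light only, captured by the standard 2D image path.
ambient_baseline = rng.poisson(lam=50.0, size=(H, W)).astype(np.float64)

# Active frame: independent ambient shot noise plus the returned laser photons.
laser_signal = rng.poisson(lam=20.0, size=(H, W)).astype(np.float64)
active_frame = rng.poisson(lam=50.0, size=(H, W)).astype(np.float64) + laser_signal

# Subtracting the baseline removes the ambient background, leaving mostly
# the LiDAR return plus residual shot noise; clip away negative residue.
lidar_return = np.clip(active_frame - ambient_baseline, 0.0, None)

print(round(lidar_return.mean(), 1))  # close to the 20-photon laser signal
```

In a real sensor the two exposures are taken back-to-back so the ambient baseline is still valid when the laser frame is captured.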
In the conference display shown below, an older generation of the XenomatiX LiDAR system was perched above its booth, providing a real-time point cloud display on the wall monitor. Overall, the features of the wall about 50 m away were clearly visible.
The XenomatiX LiDAR mounted above the booth provides a real-time point cloud of the Sensors Converge conference.
Initially, XenomatiX was focused on the far-field application of LiDAR for ADAS systems with a range of 200 m. However, it has recently expanded its efforts for near-field applications, including “automotive, industrial, robotics, and smart cities,” said Alaimo.
Two of the most interesting uses of the XenomatiX LiDAR that Alaimo described to us at the Sensors Converge conference involved the mapping of roadway surfaces with millimeter accuracy. First, looking ahead of a moving car, it can generate a detailed terrain map that can be fed to the suspension system to provide a smoother ride.
XenomatiX’s Jacopo Alaimo explains road surface mapping with LiDAR.
Secondly, road surface topology can be used to improve the paving process.
As Alaimo explains:
“There are very strict thickness requirements when paving. If the final surface is not flat enough, the construction company has to repave everything. When this happens, they may spend nearly billions.”
With precision mapping before and during the repaving process, less paving material can be applied to smooth the surface, and better corrections can be made during the blading process. Both have the potential to reduce costs and produce better roads.
PreAct’s Software-defined Near-field Flash LiDAR
While most of the work in ADAS (advanced driver-assistance systems) is focused on far-field collision prevention, PreAct Technologies is bringing its continuous-wave near-field flash LiDAR technology to imminent collision detection. With low cost, high frame rate, and high accuracy, it hopes one day to provide valuable milliseconds of warning to the car safety systems just before an unavoidable collision occurs.
Near-field LiDAR could provide imminent collision detection out to 20 meters. Image used courtesy of PreAct Technologies
At this conference, we met with Paul Drysch, Founder and CEO of PreAct Technologies, shortly before its T30P Flash LiDAR won a Best of Sensors Award in the Automotive/Autonomous Technologies segment at the Sensors Converge conference on Tuesday. Naturally, we asked him why those extra milliseconds are important. Some of the answers were surprising.
First, you can launch much larger airbags that inflate more slowly. Larger and even multi-stage airbags could provide additional protection to the occupants, while the slower inflation rates would reduce the injuries that are currently caused by airbag inflation.
Another use of those precious milliseconds would be to lower the windows a bit. “People blow their eardrums, believe it or not, from the pressure caused in certain crashes,” explained Drysch.
If the system detected a side-impact collision was imminent, an electronic suspension could tilt the vehicle a few degrees to allow the frame to take more of the impact instead of the door. You transfer that impact energy from one of the weakest parts of the car to one of the strongest.
After nearly four years of hardware and software development, PreAct aims to release its first product in August, claiming that:
“Our whole goal in the automotive space was to replace traditional near-field sensors—ultrasonic sensors, short-range radar, and, in some cases, even RGB cameras.”
Images from continuous-wave near-field LiDAR. Image used courtesy of PreAct Technologies
Its unique approach is to use standard components including off-the-shelf time-of-flight sensors and LED emitters instead of lasers to detect objects from 1 cm to 20 m. The company wants the system to be inexpensive but provide high performance.
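The distance math behind a continuous-wave time-of-flight sensor helps explain that 20 m figure. The sketch below is back-of-the-envelope CW ToF math; the modulation frequency is a made-up example chosen to yield a ~20 m unambiguous range, not a published PreAct specification.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def cw_tof_distance(phase_shift_rad: float, f_mod_hz: float) -> float:
    """Distance recovered from the phase shift of a CW-modulated signal.
    The light travels out and back (2d), hence the factor of 2 folded
    into the 4*pi denominator."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

def max_unambiguous_range(f_mod_hz: float) -> float:
    """Phase wraps at 2*pi, so range is unambiguous only up to c / (2f)."""
    return C / (2.0 * f_mod_hz)

# A ~7.5 MHz modulation gives roughly a 20 m unambiguous range,
# matching the near-field envelope described in the article.
f_mod = 7.5e6
print(f"{max_unambiguous_range(f_mod):.2f} m")         # ~19.99 m
print(f"{cw_tof_distance(math.pi / 2, f_mod):.2f} m")  # ~5.00 m
```

Lower modulation frequencies extend the unambiguous range but reduce depth resolution, which is one reason near-field and far-field LiDAR are engineered differently.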
“At the end of the day, there’s a processor, an imaging chip, some LED emitters, a lens with housing, and a PCB that connects it all together.”
Drysch admits that others have tried this approach without much success. Bright outdoor settings are the “nemesis” of the 940 nm light detection system; however, the company claims to have been able to overcome this obstacle. It uses a lower-end FPGA (field-programmable gate array) for the image processing to create the point cloud depth map and to provide a perception layer with algorithms for object classification.
The design Drysch showed at the conference had 10 LED emitters that operate in pairs to provide zone illumination. This supports use cases where you want to highlight different objects in the field of view. Users can configure how they want to illuminate these LEDs to best fit their application.
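A software-defined illumination scheme like this might be modeled as a small configuration table. The structure below is purely hypothetical; PreAct’s actual API, zone names, and pairing scheme are assumptions used only to illustrate pairing ten emitters into configurable zones.

```python
from dataclasses import dataclass

# Hypothetical configuration model for zone illumination; PreAct's real
# interface is not public, so all names and fields here are illustrative.
@dataclass
class Zone:
    name: str
    led_pair: tuple  # indices of the two LEDs driven together
    duty_cycle: float  # 0.0-1.0, relative illumination intensity

# Ten emitters (0-9) grouped into five zones across the field of view.
zones = [
    Zone("far-left",  (0, 1), 1.0),
    Zone("left",      (2, 3), 0.6),
    Zone("center",    (4, 5), 1.0),
    Zone("right",     (6, 7), 0.6),
    Zone("far-right", (8, 9), 1.0),
]

def active_leds(config):
    """LEDs that will fire on the next frame (duty cycle > 0)."""
    return sorted(i for z in config if z.duty_cycle > 0 for i in z.led_pair)

print(active_leds(zones))  # all ten emitters active in this configuration
```

Dimming or disabling individual zones would let an application concentrate illumination (and therefore signal-to-noise) on the regions it cares about most.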
While PreAct’s primary focus is the automotive industry, it is also seeing opportunities in autonomous mobile robots (AMRs). By providing better object detection, higher resolution, and increased range, the systems can move faster and operate more safely.
Many robots in factories today include a LiDAR, a couple of cameras, and an ultrasonic sensor to provide environment and obstacle detection. PreAct believes it can replace all of those with one or two of its software-defined LiDAR systems, each of which provides a field of view spanning 108 degrees horizontally and 81 degrees vertically.
Mirrorcle’s 3D LiDAR and Robot-to-Human Intention Communication
In a collaboration with Microchip, Mirrorcle was demonstrating a robot that combined its LiDAR imaging products with its vector graphics laser projection (VGLP) technology. The LiDAR uses a large-angle MEMS mirror to scan the laser beam and build up scene information.
While the robot’s movement was limited by the small booth stand surface, the VGLP was projecting a vibrant, multi-color message and direction track information on the surface. This was an example of robot-to-human communication. The idea is that a moving robot can project its “intent” onto the floor ahead of it as it moves in and around people.
Mirrorcle’s RGB vector graphics laser projector provides robot intent messaging.
In the demonstration, red lines traced a curved track on the surface indicating where the robot would be moving next, while green text provided status information. This was an interesting take on how robots and humans might work more efficiently and safely in proximity to each other.
LiDAR Makes Inroads Beyond Vehicles
While LiDAR for long-range sensing in autonomous driving applications will continue to get a lot of investment and attention, the presentations at Sensors Converge suggest that there are many more industries and applications that will benefit from this set of technologies.