Lumotive and Hokuyo Automatic have announced a combined effort to release the world’s first solid-state beam steering LiDAR sensor module. The LM10 Light Control Metasurface (LCM) chip by Lumotive is the heart of the Hokuyo Automatic YLM-10LX 3D LiDAR sensor.
Solid-state 3D LiDAR YLM-10LX. Image used courtesy of Hokuyo Automatic
Lumotive is a 2021 startup built around the technology used in the LM10 chip, its first commercial product. Hokuyo Automatic has been in the automation industry since its founding in 1946 and has an extensive portfolio of optical sensing and ranging products.
LM10 solid-state beam steering. Image used courtesy of Lumotive
Lumotive’s LCM manipulates and directs laser light with a metasurface rather than traditional mechanical components.
Solid-State Beam Steering
The aim of solid-state beam steering is to significantly reduce the size, weight, and cost of reliable LiDAR. Vision LiDAR typically uses moving mirrors or prisms to sweep a laser beam horizontally and vertically in a raster pattern, much like the electron beam in an old cathode ray tube (CRT). Building these electro-mechanical systems robust enough to survive an automotive environment is expensive, and the result is bulky.
The LM10 is a solid-state chip manufactured on conventional semiconductor wafer fab equipment. Each chip replaces a significant set of electro-optical moving mechanical parts.
Software adjustable scanning with solid-state beam steering. Image used courtesy of Hokuyo Automatic
An additional advantage of the Lumotive LM10’s solid-state beam steering is sensing flexibility. With conventional beam steering, range, angle of view, and other parameters are fixed by the mechanical and optical configuration built at the factory. Solid-state beam steering allows many of these parameters to be adjusted in real time. For example, the system can maintain a generalized scan and then focus on an anomalous object for greater detail.
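That coarse-scan-then-zoom behavior can be sketched in a few lines of Python. The `ScanConfig` class and its `angles` method below are hypothetical illustrations, not Lumotive’s actual API; they only show how a software-defined scanner might re-point a fixed budget of beam positions from a full-field sweep into a dense region-of-interest scan.

```python
from dataclasses import dataclass

@dataclass
class ScanConfig:
    """Hypothetical software-defined scan parameters (not Lumotive's real API)."""
    fov_deg: float = 120.0   # horizontal field of view
    n_columns: int = 60      # beam positions available per sweep

    def angles(self, start_deg=None, stop_deg=None):
        """Steering angles (degrees) the chip would visit.

        With no arguments, sweep the full field of view; with a
        sub-range, concentrate every beam position on that region
        of interest -- there is no inertia penalty for re-pointing.
        """
        lo = -self.fov_deg / 2 if start_deg is None else start_deg
        hi = self.fov_deg / 2 if stop_deg is None else stop_deg
        step = (hi - lo) / (self.n_columns - 1)
        return [lo + i * step for i in range(self.n_columns)]

# Coarse sweep: 60 beam positions spread across the full 120-degree FOV
coarse = ScanConfig().angles()

# Anomaly spotted near +30 degrees: the same 60 positions packed into
# a 10-degree window give roughly 12x finer angular sampling there
fine = ScanConfig().angles(start_deg=25.0, stop_deg=35.0)
```

Because the scan pattern is random access, the switch between the two modes is purely a software decision; a mechanical scanner would need its optics rebuilt to change coverage like this.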
LCM Technology and the YLM-10LX Module
With appropriate optics, the Lumotive sensor can scan up to 180 degrees, and its resolution can be changed dynamically in software. Beam steering is true zero-inertia solid state, with software-defined, random-access scan patterns. The chip’s 11 mm × 9 mm aperture enables compact end designs. As implemented in the YLM-10LX, it delivers a maximum field of view of 120 degrees (H) × 90 degrees (V) at distances from 0.5 to 10 meters.
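To get a feel for those numbers, a quick geometric check (my arithmetic from the published specs, not a vendor figure) converts the 120° × 90° field of view into the linear area covered at the sensor’s 10-meter maximum range:

```python
import math

def coverage(fov_deg: float, range_m: float) -> float:
    """Linear extent subtended by a field of view at a given range."""
    return 2 * range_m * math.tan(math.radians(fov_deg / 2))

h = coverage(120.0, 10.0)  # horizontal extent at 10 m
v = coverage(90.0, 10.0)   # vertical extent at 10 m
print(f"{h:.1f} m wide x {v:.1f} m tall")  # → 34.6 m wide x 20.0 m tall
```

A cross-section roughly 35 m wide and 20 m tall at full range is ample for indoor service robotics and most industrial automation cells.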
World’s first solid-state beam steering LiDAR sensor. Image used courtesy of Lumotive
Lumotive also offers a complete reference design built around the sensor that is about half the size of a credit card. The design comes with a complete set of manufacturing and support files. Lumotive has shared libraries in C/C++ and sample code written in Python to drive the sensor and display the results. Hokuyo provides instructions, sample code (in Python, C#, and Java), and libraries for using the YLM-10LX with the Robot Operating System (ROS) and ROS2.
LiDAR in Action
LiDAR, or Light Detection and Ranging, is one of the three primary vision technologies used in automated driver assistance systems (ADAS) and robot vision systems, along with radar and binocular image processing. LiDAR emits a laser pulse and measures its time of flight, computing distance with the formula d = (c × t)/2, where d is the distance, c the speed of light, and t the round-trip time. A simple ranging system may look only at whatever point is directly in front of the sensor. For practical 3D vision systems, either the sensor has to move or the laser beam does.
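The formula is simple to apply in practice. A minimal sketch, using the speed of light in vacuum (the difference in air is well under 0.1% and negligible at these ranges):

```python
C = 299_792_458  # speed of light, m/s

def tof_distance(t_seconds: float) -> float:
    """Distance from a round-trip time of flight: d = (c * t) / 2."""
    return C * t_seconds / 2

# A target 10 m away returns the pulse in about 66.7 ns
d = tof_distance(66.7e-9)
print(f"{d:.2f} m")  # → 10.00 m
```

The nanosecond scale of the round trip is why LiDAR timing electronics, rather than the optics, often set the achievable range resolution.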
Lumotive and Hokuyo’s partnership presents the first LiDAR sensor of its kind to apply programmable, solid-state optics to 3D sensing for service robotics and industrial automation. According to the partners, the sensor offers a wider field of view (FOV) and longer range than other solid-state products on the market, thanks to the LM10 chip.
The LM10 chip. Image used courtesy of Lumotive
This solid-state beam steering component provides accuracy and stability in 3D object detection and distance measurement while also effectively managing multi-path interference. With its digital, software-defined scanning, users can modify the sensor’s detection range, resolution, and frame rate. It can also support multiple FOVs at once and adapt to an application’s changing requirements both indoors and outdoors.