The new sensor uses a pulsed infrared laser operating at a wavelength at which water vapor absorbs most of the corresponding sunlight before it reaches ground level, reducing glare. The laser light reflected back to the sensor is detected by a film of quantum dots engineered to react to specific wavelengths of light. These quantum dots are five times as sensitive as the silicon-based detectors used in other sensors, enabling them to pick up reflections from a lower-powered laser.
According to program manager Jennifer Lillie, the sensor detects obstacles 20 meters ahead at 30 frames per second. This is enough for commercial quadrotors moving at their top speed of 70 kilometers per hour to change course or decelerate to avoid a collision. Lillie describes the sensor as "biscotti-sized," making it suitable for small drones.
“In order to perform autonomously at a high flight speed of 20 meters per second, drones and other unmanned vehicles require at least half a second to recognize an upcoming obstacle and another half a second to change trajectory or decelerate in order to avoid it. This means accurate ranging at 20 meters is crucial,” said Jess Lee, InVisage President and CEO. “SML20 is the only solution enabling obstacle avoidance at that distance without being weighed down by a traditional bulky LiDAR.”
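The figures Lee cites can be checked with simple arithmetic: the required ranging distance is the distance covered during the recognition window plus the maneuver window. A minimal sketch (the function and variable names are illustrative, not part of any InVisage API):

```python
def min_ranging_distance(speed_m_s: float, recognize_s: float, maneuver_s: float) -> float:
    """Distance flown while recognizing an obstacle and then maneuvering around it."""
    return speed_m_s * (recognize_s + maneuver_s)

# Numbers from the quote: 20 m/s flight speed, 0.5 s to recognize,
# 0.5 s to change trajectory or decelerate.
print(min_ranging_distance(20.0, 0.5, 0.5))  # 20.0
```

At 20 m/s, the full second of recognition plus reaction consumes exactly the 20 meters the SML20 is specified to range, which is why Lee calls accurate ranging at that distance crucial.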
Many obstacle avoidance systems have turned to mechanical and solid-state LiDAR for depth sensing, but conventional LiDARs place high demands on weight, size, and power budgets, making them unsuitable for drones. The cost of LiDAR is also significant, ranging from hundreds to tens of thousands of dollars per unit. Ultrasonic sensors and stereo cameras do offer more compact form factors than LiDAR, but ultrasonic systems are limited to sub-five-meter range, and stereo cameras have high CPU demands and ranging capabilities constrained by camera disparity. The SML20 eliminates the need to compromise, delivering effective collision avoidance with small size, minimal weight, and all-inclusive power consumption averaging between 200 and 500 mW, depending on the range requirements of the application.
Single-camera obstacle avoidance systems use structured light to map their environments in 3D. By pairing Spark NIR sensors with lasers that emit a specific pattern of light, the system captures depth maps by detecting distortions in that pattern. SML20 delivers QuantumFilm's increased sensitivity to 940-nanometer NIR light (five times that of silicon) at a 1.1-micron pixel size. This allows autonomous devices to perceive their surroundings through an accurate depth map fused with the sharpness of 4K 30fps video previously reserved for cinema cameras, in contrast to the limited information in the series of dotted outlines offered by LiDAR.
Conventional structured light cameras have struggled to perform accurately outdoors or in bright sunlight because more than half of sunlight is in the infrared spectrum. In the resulting wash of infrared, silicon-based camera sensors easily saturate and fail to detect the structured light patterns their devices emit. Optimized for the invisible NIR 940-nanometer wavelength, SML20 takes advantage of the fact that water in the atmosphere absorbs most of the 940-nanometer infrared light in sunlight, minimizing solar interference with structured light systems.
In combination with this wavelength optimization, SML20’s 1.1-micron pixels have a global electronic shutter—the only one of its kind at this pixel size. Because global shutter allows all parts of an image to be captured simultaneously, it eliminates the distortion of fast-moving objects caused by conventional rolling shutter. With global shutter, a structured light source can be pulsed in sync with an ultra-fast exposure, allowing for 20-meter ranging with high solar irradiance rejection while remaining eye-safe and low power.
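The power argument behind pulsed operation is straightforward: the laser emits at high peak power only during the brief global-shutter exposure, so the average power that governs eye safety and the battery budget scales with the duty cycle. A rough sketch with illustrative numbers (not SML20's published specifications):

```python
def average_power_mw(peak_mw: float, exposure_s: float, frame_rate_hz: float) -> float:
    """Average emitted power when the laser fires only during each exposure."""
    duty_cycle = exposure_s * frame_rate_hz  # fraction of time the laser is on
    return peak_mw * duty_cycle

# e.g. 2 W peak pulses, 100-microsecond exposures at 30 fps:
# duty cycle of 0.003 brings the average down to roughly 6 mW.
print(average_power_mw(2000, 100e-6, 30))
```

A bright pulse concentrated in a short exposure also out-competes ambient light integrated over that window, which is what gives the sensor its high solar irradiance rejection at range.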
The SML20 is only the beginning—stay tuned for extended range options at 100 meters and beyond in the coming quarters.