Depth Sensing Takes Machine Vision into Another Dimension
are more inclined to assemble the individual components themselves
or have them assembled by third parties into a complete solution.

DEPTH- AND SIDE-SENSING FOR BLIND-SPOT DETECTION
Depth perception is the ability to see things in three dimensions and
to measure how far away an object is. LiDAR indeed acts as an eye to a
self-driving car, and many car manufacturers use it to build a
three-dimensional map of the environment around the vehicle. Nonetheless,
developments have focused predominantly on front-facing LiDAR
systems that have a long detection range (beyond 200 meters) but a
relatively small field of view (about 20° to 30°).
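Those two specifications trade off directly against coverage geometry. As a rough
back-of-the-envelope sketch (not from the article; the 25° value below is an assumed
midpoint of the quoted 20° to 30° range), the swath width covered at a given range
follows from simple trigonometry:

```python
import math

def swath_width_m(fov_deg: float, range_m: float) -> float:
    """Lateral width covered at a given range by a sensor whose
    field of view is centered straight ahead (flat-target model)."""
    return 2.0 * range_m * math.tan(math.radians(fov_deg / 2.0))

# Front-facing LiDAR: long range but a narrow field of view.
print(f"{swath_width_m(25, 200):.0f} m swath at 200 m")  # ~89 m

# A 180° side-scanning unit has no finite swath: tan(90°) diverges,
# meaning it covers the entire half-plane beside the vehicle.
```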
OQmented, a 2019 spinoff from Fraunhofer Institute for Silicon
Technology (ISIT) in Germany, is working to change that. The company
says it has developed a MEMS mirror technology that enables
side-scanning LiDAR with a 180° field of view.

"The side-looking LiDAR systems are more targeting the short range"
to enable blind-spot detection, said Ulrich Hofmann, founder and
managing director of OQmented. Blind-spot detection is an important
safety feature that makes the side-scanning systems "even more relevant
than far-looking systems," he added. For example, "you need those
observing LiDAR systems for the short range when entering intersections,
because there is a lot of traffic from pedestrians, cyclists, cars, etc.,
which can easily lead to confusion and accidents. For that reason, it is
important to have a clear overview over a wide angle but also high
lateral resolution to discriminate between different objects — static
and moving."
OQmented has placed a curved glass lid on top of its MEMS mirror
device, in contrast to a plane-parallel glass lid, to transfer the laser
beam into and out of the package and enable 180° laser scanning. The
patented Bubble MEMS technology not only offers "hermetic vacuum
packaging and protection" from environmental contaminants but
also ensures that the laser beam successfully transfers into and out of
the package, because it always hits the glass in a perpendicular way,
Hofmann said. That is not always the case when a planar parallel glass
lid is used; for large scan angles, part of the light is reflected back
into the package at the lid. That is unacceptable for any kind of LiDAR
solution, said Hofmann.

The curved glass lid atop OQmented's MEMS mirror device
inspired the technology's name: Bubble MEMS. (Source: OQmented)
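Hofmann's point about perpendicular incidence can be made quantitative with the
standard Fresnel equations. The sketch below is illustrative only (it assumes an
uncoated lid of refractive index 1.5, a value not given in the article): with a flat
lid, the beam's incidence angle equals the scan angle and the fraction of light
reflected back into the package climbs steeply, whereas a dome struck at normal
incidence holds the loss at the usual ~4% for every scan angle.

```python
import math

N_GLASS = 1.5  # assumed refractive index of an uncoated glass lid

def fresnel_reflectance(theta_i_deg: float, n1: float = 1.0,
                        n2: float = N_GLASS) -> float:
    """Unpolarized power reflectance at a single air/glass interface."""
    ti = math.radians(theta_i_deg)
    s = n1 * math.sin(ti) / n2          # Snell's law
    if abs(s) >= 1.0:
        return 1.0                      # total internal reflection
    tt = math.asin(s)
    r_s = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
           (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    r_p = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
           (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return 0.5 * (r_s + r_p)

# Flat lid: incidence angle tracks the scan angle, so losses grow fast.
for scan_angle in (0, 30, 60, 80):
    print(f"{scan_angle:2d}°: {fresnel_reflectance(scan_angle):.1%}")
# 0°: 4.0%   30°: 4.2%   60°: 8.9%   80°: 38.8%
# A curved lid keeps the beam at ~0° incidence, i.e., ~4% at all angles.
```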
CLOSER TO THE DATA SOURCE
Image sensors generate an enormous quantity of data. While most of
the processing currently resides in the cloud or in the central
processing unit, the trend is to take computing closer to the source
of data and embed intelligence near or within the sensor.

For viewing purposes, the data is usually compressed with H.264,
which means it can be funneled through bandwidths in the 100-Mbps
range, said Yole's Cambou. "In the context of sensing, data streams
are typically 10× to 100× larger — 1 Gbps for machine vision is very
typical — and if 10 cameras are being used simultaneously, then you
very quickly reach 10 Gbps and beyond. The necessity to manage the
data close to the sensor arises from the burden at the CPU level. All
pre-processing, cleaning, and AI enhancement, if needed, must be done
closer to the sensor in order to lower the burden on the CPU."
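Those figures are easy to sanity-check. The short sketch below (the
1080p/60-fps/12-bit raw-stream parameters are illustrative assumptions, not values
from the article) shows how quickly uncompressed machine-vision data outruns a
viewing-grade H.264 stream:

```python
# Back-of-the-envelope check of the bandwidth figures quoted above.
# The raw-stream parameters are illustrative assumptions.
width, height = 1920, 1080   # pixels per frame
fps = 60                     # frames per second
bits_per_pixel = 12          # raw sensor output depth

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"raw stream:  {raw_gbps:.1f} Gbps per camera")   # ~1.5 Gbps
print(f"10 cameras:  {10 * raw_gbps:.0f} Gbps total")   # ~15 Gbps
# H.264 squeezes a viewing stream into the ~100-Mbps class, but
# sensing pipelines work on the 10x-to-100x-larger raw data.
```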
Today, however, there is little computation done on the sensor itself
because it generates heat, said Cambou.

FORWARD-LOOKING VISION
Image sensors are a key autonomy enabler, but they cannot be added
indefinitely; the required computing power would explode. One solution
is improving data quality, Yole's analyst said. "If you really want to
solve autonomy, you will need more diversity quickly."

New technologies are emerging to add a level of sensitivity and build
machines that can see better. Cambou identifies two directions:
neuromorphic sensing, in which each pixel acts as a neuron and embeds
some level of intelligence, and quantum imaging, which detects each
photon individually.
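To make "each pixel acts as a neuron" concrete, here is a toy model of a single
event-based pixel (a generic illustration of the principle, not Prophesee's
implementation; the 0.2 contrast threshold is an arbitrary choice). The pixel stays
silent while the scene is static and fires only when its log-brightness moves past
the threshold:

```python
import math

THRESHOLD = 0.2  # log-intensity contrast needed to fire (assumed)

def pixel_events(samples):
    """Yield (sample_index, polarity) events whenever the
    log-brightness at one pixel changes by more than THRESHOLD
    since the last event."""
    ref = math.log(samples[0])
    for i, lum in enumerate(samples[1:], start=1):
        delta = math.log(lum) - ref
        if abs(delta) >= THRESHOLD:
            yield i, +1 if delta > 0 else -1
            ref = math.log(lum)

# A static scene generates no data; only the jump at sample 3 fires.
print(list(pixel_events([100, 100, 101, 160, 160, 160])))  # [(3, 1)]
```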
France-based neuromorphic startup Prophesee has rolled out what
it says is the first event-based vision sensor in an industry package:
its third-generation Metavision sensor. "If you couple this Metavision
sensor with a VCSEL projector or another kind of projector that can
project a suitable pattern, you can realize an event-based structured
light sensor," said Simone Lavizzari, product marketing and innovation
director at Prophesee. Why, exactly? Today's state-of-the-art
depth-sensing techniques impose a tradeoff between exposure time,
accuracy, and robustness.

Coupling an IR projector with Prophesee's Metavision sensor yields
a fast response time for each independent pixel, in turn allowing for
temporal pattern identification and extraction directly inside the
sensor, said Lavizzari. "If you use an event-based sensor to do
structured light, the response is very fast. We can have a 50× [scanning
time] improvement, so [you need] 1 millisecond to get the full 3D
scanning versus the conventional 10 to 33 milliseconds with frame-based
approaches." The accuracy is state-of-the-art, but the "software
complexity is reduced to the minimum, because we don't need to do
matching in post-processing."
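One way to picture the "temporal pattern identification" Lavizzari describes is a
time-coded structured-light decoder. The sketch below uses a generic Gray-code
scheme purely for illustration (Prophesee's actual pattern and pipeline are not
described in the article): each projected frame lights a known subset of projector
columns, so the set of frames in which a pixel fires spells out which column hit
it, and no frame-to-frame correspondence search is needed afterward.

```python
def gray_to_binary(g: int) -> int:
    """Convert a Gray-coded integer to plain binary."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def decode_column(fired_frames, n_bits: int = 10) -> int:
    """fired_frames: indices of the projected pattern frames in which
    this pixel saw a pulse. The on/off pattern across the sequence is
    a Gray code naming the projector column illuminating the pixel."""
    gray = 0
    for k in fired_frames:
        gray |= 1 << (n_bits - 1 - k)
    return gray_to_binary(gray)

# A pixel that fired during frames 0 and 2 of a 10-frame sequence:
print(decode_column([0, 2]))  # 768 -- the projector column; knowing it,
# depth follows by triangulating the column against the pixel position.
```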
Matching is not done on frames after the event but pixel by pixel,
at the sensor level. Among other benefits, "there is no motion blur,
because we can capture the point cloud very fast, and we are compatible
with outdoor applications," said Lavizzari. Ultra-fast pulse detection
indeed enables a power increase while maintaining the technology's
eye-safe rating.

On the quantum imaging side, Cambou mentioned Gigajot Technology's
Quanta Image Sensors (QIS), which are single-photon image sensors with
photon-counting capabilities. Gigajot, a California-based startup,
claims dynamic scenes can be reconstructed from a burst of frames at a
photon level of 1 photon per pixel per frame. ■
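Gigajot's reconstruction claim can likewise be sketched in a few lines (a generic
photon-counting illustration, not Gigajot's algorithm). At roughly 1 photon per
pixel per frame, any single binary frame is mostly shot noise, but averaging an
aligned burst converges on the true brightness:

```python
import random

def reconstruct(burst):
    """Average an aligned burst of binary single-photon frames into a
    per-pixel intensity estimate."""
    n = len(burst)
    return [sum(frame[i] for frame in burst) / n
            for i in range(len(burst[0]))]

# Simulate a ~1-photon-per-pixel-per-frame regime for a 4-pixel scene.
true_brightness = [0.1, 0.4, 0.7, 0.95]  # mean photon rate per frame
burst = [[1 if random.random() < b else 0 for b in true_brightness]
         for _ in range(200)]

print([round(v, 2) for v in reconstruct(burst)])  # ~ true_brightness
```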
Anne-Françoise Pelé is editor-in-chief of EE Times Europe.