Page 43 - EE Times Europe Magazine - June 2025
AUTONOMOUS DRIVING | SAFETY AND SECURITY
AV Safety Is Multifaceted—Even at the Hardware Level
By Saumitra Jagdale
The fundamental safety of any vehicle, autonomous or otherwise, hinges on the robustness and reliability of its hardware. Any error at the hardware level has a domino effect on autonomous decision-making. In critical autonomous vehicle applications, safe and reliable operation of individual and interconnected physical components—sensors, actuators, and communication systems—is paramount.

IT STARTS WITH SENSOR RESILIENCE
AVs, whatever their level of autonomy, rely heavily on sensors to navigate the roads in all situations and weather conditions. “Availability—meaning the system’s ability to function reliably under all conditions—is critical,” said Elad Hofstetter, CBO at Innoviz Technologies, an Israel-based developer of solid-state LiDAR sensors and perception software. “The raw data of the sensors must help the vehicle navigate through rain, dirt, spray, and extreme temperatures.”

Innoviz has embedded features in its InnovizTwo LiDAR sensors for resilience under real-world operating conditions, Hofstetter said. “You can spray a water droplet or mud on our LiDAR sensors and occlude a lot of it, and the LiDAR will still function well. We see no gaps in the point cloud, and that’s ultimately quite important for vehicle safety.”

RECALIBRATING PERCEPTION MODELS
That said, no single sensor type can provide all the information needed for safe autonomous driving. Each has its own strengths and weaknesses: LiDAR sensors offer accurate 3D depth perception but lack visual details such as color or signage; cameras provide rich colors and textures but struggle in low light and poor weather conditions. “The industry is shifting toward new EE architectures—particularly zonal architectures—where sensor data from across the vehicle is transmitted via an in-vehicle network to a centralized compute module and is fused,” Ron DiGiuseppe, automotive IP segment manager at Synopsys, said in his technical presentation at the recent Mobility Tech Forum.

The most recent approach to sensor data fusion is at the object level: Different sensors independently detect objects in the vehicle’s environment, and an AI perception model merges the sensor data to create a simulation of the vehicle’s surroundings. This is sensor fusion at a high level, and it’s a power-intensive task for processors, especially in real time.

“That’s why we see value in low-level sensor fusion, which enables you to detect all kinds of things that might be hidden,” Hofstetter said. “We embed inside our LiDARs features that allow for easier low-level fusion, such as accurate time synchronization and data output formatting. This helps reduce compute load. You’re also enabling the perception software to work more effectively, without needing to duplicate preprocessing steps for every sensor stream. It’s not just more accurate detection but also a more efficient pipeline overall.”

Innoviz Technologies’ InnovizTwo in Volkswagen’s L4 autonomous shuttle (Source: Innoviz Technologies)

SAFETY AT THE PROCESSOR LEVEL
Reliability must extend to processors in AVs. The rule of thumb for industry players has been to follow safety standards not just for processors but for all ADAS/AV system hardware. Indeed, ISO 26262 compliance is now legally binding in many markets. For processors, that means the design and production processes must incorporate error-correction code, parity checks, and dual-core lockstep architectures (wherein two processor cores are run in parallel and their outputs compared cycle by cycle; if there’s a mismatch, the system knows there’s a fault). These aren’t just optional features; they’re essential for achieving ASIL certifications (part of the ISO 26262 safety requirements), particularly in safety-critical domains such as braking, steering, and ADAS decision-making.

The design focus diverges from there, depending on the specific use case. For Hailo, an Israeli startup developing automotive-grade AI accelerators, reliability in runtime AI computations is a cornerstone of system safety.

“One of the key things we focus on is ensuring reliability in how neural networks operate on our chips,” Hailo VP Yaniv Sulkes told EE Times Europe. “These networks involve massive models and constant data movement, so it’s crucial to verify—at runtime—that computations are completed correctly and that the results are as expected.”
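The dual-core lockstep behavior described in the processor section can be sketched in a few lines of Python. This is a conceptual illustration only: in real silicon the comparison happens in hardware on every clock cycle, and the names here (`core_compute`, `lockstep_run`, the injected bit flip) are hypothetical stand-ins, not any vendor's API.

```python
# Conceptual sketch of dual-core lockstep: the same computation runs on
# two redundant "cores" and a comparator checks their outputs after
# every cycle; any mismatch signals a hardware fault.

def core_compute(state: int, inp: int) -> int:
    """Stand-in for one core's per-cycle computation."""
    return (state * 31 + inp) & 0xFFFFFFFF

def lockstep_run(inputs, fault_at=None):
    """Run two redundant cores over `inputs`; return (result, fault_detected).
    `fault_at` optionally injects a single-bit flip into core B at that cycle."""
    state_a = state_b = 0
    for cycle, inp in enumerate(inputs):
        state_a = core_compute(state_a, inp)
        state_b = core_compute(state_b, inp)
        if fault_at == cycle:
            state_b ^= 0x1          # simulated transient fault in core B
        if state_a != state_b:      # cycle-by-cycle comparison
            return state_a, True    # mismatch -> fault detected
    return state_a, False

result, fault = lockstep_run([1, 2, 3])
assert fault is False               # fault-free run: outputs agree every cycle
_, fault = lockstep_run([1, 2, 3], fault_at=1)
assert fault is True                # injected flip is caught immediately
```

The key property the sketch shows is detection latency: because outputs are compared every cycle, a transient fault is flagged at the cycle it occurs rather than after it has propagated into a decision.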
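Parity checks, mentioned above alongside error-correction code, are the simplest of those mechanisms: a stored parity bit lets hardware detect any single-bit flip in a word, while full ECC can additionally correct it. A minimal illustration, with hypothetical helper names:

```python
def even_parity_bit(word: int) -> int:
    """Parity bit chosen so word plus parity has an even number of 1 bits."""
    return bin(word).count("1") & 1

def parity_ok(word: int, parity: int) -> bool:
    """True if the stored parity still matches the word's contents."""
    return even_parity_bit(word) == parity

w = 0b1011_0010            # four 1 bits -> parity 0
p = even_parity_bit(w)
assert parity_ok(w, p)
# a single-bit flip changes the 1-count's parity and is detected
assert not parity_ok(w ^ 0b0000_1000, p)
```

Parity alone cannot say which bit flipped (and misses double-bit errors), which is why safety-critical memories pair it with, or replace it by, correcting codes.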
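The accurate time synchronization Hofstetter cites as an enabler of low-level fusion can also be illustrated in software. The sketch below is an assumption-laden simplification: it pairs each LiDAR scan timestamp with the nearest camera frame within a hypothetical 5 ms tolerance, whereas real in-vehicle synchronization is done at the hardware and driver level (e.g., via time-sync protocols on the in-vehicle network).

```python
from bisect import bisect_left

def align_streams(lidar_ts, camera_ts, tol=0.005):
    """Pair each LiDAR timestamp with the nearest camera timestamp (both in
    seconds, sorted ascending) within `tol`; unmatched scans are dropped."""
    pairs = []
    for t in lidar_ts:
        i = bisect_left(camera_ts, t)             # first camera frame >= t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(camera_ts)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(camera_ts[k] - t))
        if abs(camera_ts[j] - t) <= tol:
            pairs.append((t, camera_ts[j]))
    return pairs

lidar = [0.000, 0.100, 0.200]
camera = [0.001, 0.099, 0.150, 0.201]
print(align_streams(lidar, camera))
# -> [(0.0, 0.001), (0.1, 0.099), (0.2, 0.201)]
```

When raw points and pixels carry trustworthy common timestamps, the fusion stage can associate them directly, which is what removes the duplicated per-sensor preprocessing the quote refers to.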