
Face Value: AI That Knows When You’re Too Tired to Drive


“Be it your gaze patterns, your blink rates, your smile—there’s so much information in those paths,” he added. “People don’t really notice it, but our algorithms do.”

For automotive OEMs and Tier 1 suppliers, the end result of this vigorous expression analysis development is the B-Automotive software development kit, which will help their automotive platforms track driver behavior so their vehicles will meet future Euro NCAP safety guidelines. The software can be used with near-infrared and color cameras and be deployed on standard automotive chipsets and operating systems.

Blueskeye AI recently joined forces with Japanese tech distributor Cornes Technologies to market its software to Japanese OEMs and Tier 1 suppliers and, at the time of writing, had just signed a contract with an unidentified automotive OEM.

“The latest [Euro NCAP] test protocol includes intoxication, which probably takes [behavior analysis] the furthest so far, but 2029 and beyond will require much more detailed analysis,” Valstar said. “And that’s where we will really shine.

“The way people behave is a combination of social signals, emotional signals and expressions of their biophysical state,” he added. “The complexity lies in what element is conscious and what element is autonomous, and that’s where a lot of our IP sits.”
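To make the low-level signal concrete, the sketch below shows one conventional way such cues are turned into a fatigue indicator: a rolling PERCLOS-style eye-closure ratio plus a blink rate. It is an illustration only, not Blueskeye AI’s B-Automotive code; it assumes a hypothetical upstream face tracker that already supplies a per-frame eye-openness value, and the thresholds and window length are arbitrary.

from collections import deque

class DrowsinessEstimator:
    """Rolling PERCLOS-style eye-closure ratio plus a blink rate (illustrative only)."""

    def __init__(self, fps=30.0, window_s=60.0, closed_thresh=0.2):
        self.fps = fps
        self.window_s = window_s
        self.closed_thresh = closed_thresh                      # eye counts as closed below this openness
        self.closed_flags = deque(maxlen=int(fps * window_s))   # per-frame closed/open history
        self.blink_times = deque()                              # timestamps of blink onsets
        self._was_closed = False
        self._t = 0.0

    def update(self, eye_openness):
        """Feed one frame's eye-openness (1.0 = fully open, 0.0 = closed)."""
        self._t += 1.0 / self.fps
        closed = eye_openness < self.closed_thresh
        self.closed_flags.append(closed)

        # A blink onset is an open-to-closed transition.
        if closed and not self._was_closed:
            self.blink_times.append(self._t)
        self._was_closed = closed

        # Keep only blinks inside the rolling window.
        while self.blink_times and self.blink_times[0] < self._t - self.window_s:
            self.blink_times.popleft()

        perclos = sum(self.closed_flags) / len(self.closed_flags)   # fraction of frames with eyes closed
        observed_min = min(self._t, self.window_s) / 60.0
        blink_rate = len(self.blink_times) / max(observed_min, 1e-6)
        # Crude flag: sustained eye closure is the classic fatigue cue.
        return {"perclos": perclos, "blink_rate": blink_rate, "drowsy": perclos > 0.15}

if __name__ == "__main__":
    est = DrowsinessEstimator(fps=30)
    # Synthetic stream: mostly open eyes with one long, slow blink near the end.
    stream = [1.0] * 200 + [0.1] * 40 + [1.0] * 60
    for openness in stream:
        state = est.update(openness)
    print(state)

In a production system, cues like these would feed a learned model alongside gaze, head pose, and expression features rather than a fixed threshold.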

        A WIDER VIEW
Blueskeye AI is hardly alone in developing AI to provide insight into human behavior; a mix of deep-tech startups, vision AI firms, and automotive Tier 1 suppliers have joined this growing field. Examples include Valeo (France), Cipia (Israel), and Seeing Machines (Australia).

One of the industry’s front-runners, Sweden-based Smart Eye, was founded back in 1999 and has been developing software for many years. The company acquired human-behavior-related AI software developers Affectiva and iMotions in 2021, and it has developed driver monitoring systems that include infrared sensors, computer vision, and AI to bring insight into a driver’s state and behavior. Thus far, the software has been installed in more than 2 million cars globally, according to the company.

“In automotive, the starting point for us was high-precision eye gaze tracking and head posture tracking, but we realized there’s so much more to human insight than just the eye and gaze,” said Matthew Remijn, product VP for automotive solutions at Smart Eye. “That’s when [our acquisitions] came into the picture, and we began to look at the face to extract emotions and more.

“We’re now also really digging into intoxication and impairment,” Remijn added. “The automotive OEMs and platform makers are rapidly implementing distraction and drowsiness technology.”

Like Blueskeye AI’s technology, Smart Eye’s driver monitoring system uses facial expression analysis and tracks eye, head, and face movements to detect the earliest signs of fatigue as well as distraction from behaviors such as drinking, smoking, and mobile phone use. Remijn pointed to additional requirements, such as seat belt detection. “We need to know if [the belt] is routed correctly across the driver,” he said.

Smart Eye has also been busy developing its Automotive Interior Sensing system, which combines driver and cabin monitoring to track eye gaze, body key points, and other status data for all vehicle occupants. First released in 2021, the system relies on an OmniVision RGB-IR image sensor to track driver and passenger movements. Smart Eye has since developed multi-camera systems and has been investigating the use of 3D cameras, including time-of-flight sensors as well as radar, as part of a multimodal sensor setup.

“The NCAP assessments have been pushing us toward this—for example, radar can detect a child in the footwell,” Remijn said. “And we already have tools built for multimodal [sensor] fusion.”
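As a rough picture of what late fusion across such a sensor set can look like (not Smart Eye’s actual tooling), the sketch below combines per-zone occupancy probabilities from a camera pipeline and a radar pipeline, falling back to radar alone when the camera view is occluded, which is the child-in-the-footwell case Remijn describes. The zone names, prior, and calibrated sensor scores are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class ZoneReading:
    zone: str          # e.g. "driver_seat", "rear_left", "front_footwell" (hypothetical labels)
    camera_p: float    # P(occupied) from the vision pipeline, 0..1
    radar_p: float     # P(occupied) from the radar pipeline, 0..1
    occluded: bool     # camera view blocked (blanket, seat back, darkness)

def fuse_occupancy(r: ZoneReading, prior: float = 0.3) -> float:
    """Naive-Bayes-style fusion of two occupancy cues, assuming each sensor
    score was produced independently under a uniform prior."""
    # If the camera cannot see the zone, rely on radar alone: the
    # "child in the footwell" case that radar is meant to cover.
    sources = [r.radar_p] if r.occluded else [r.camera_p, r.radar_p]

    odds = prior / (1.0 - prior)
    for p in sources:
        p = min(max(p, 1e-6), 1.0 - 1e-6)   # clamp to avoid division by zero
        odds *= p / (1.0 - p)               # multiply in each sensor's likelihood ratio
    return odds / (1.0 + odds)

if __name__ == "__main__":
    footwell = ZoneReading("front_footwell", camera_p=0.05, radar_p=0.9, occluded=True)
    print(f"{footwell.zone}: P(occupied) = {fuse_occupancy(footwell):.2f}")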
Unsurprisingly, the rising safety demands are inextricably linked with the steady push toward vehicle autonomy. And the shift toward all-occupant tracking is already helping to bridge the gap between manual driving and semi-autonomous vehicles (Levels 2 to 3).

Blueskeye AI’s Valstar noted that the tracking technologies will be essential for managing smooth handovers from human drivers to autopilot systems and vice versa. Looking further, Remijn noted that eventually, occupants will be the sole focus of in-cabin monitoring systems.

[Image: Blueskeye AI is partnering with automotive manufacturers on the integration of its technology into in-cabin monitoring systems. (Source: Blueskeye AI)]

“When we get to Level 5 [the highest level of autonomous driving], the driver will become an occupant,” he asserted. “We’re already seeing customers using our driver monitoring features and applying emotion [analysis] on the occupants. We believe this will evolve so that driver features eventually just become a subset of the occupant features.”

What happens next? Fully autonomous vehicles are still many years away, and in the interim, the necessary AI software will have reached new heights in sophistication. “The holy grail is to view the person like you know them, so you can read facial expression, understand stress, and make good suggestions to the end user” based on that understanding, Remijn said.

Blueskeye AI’s Valstar said the field is now moving into “the hard stuff” aspects of behavior analysis. “As part of our long-term research and development, we’re looking at how hormones influence your expressive behavior,” he said.

Such research could have a profound impact on the cars of the future. “[Vehicles] are going to be a great place to measure health, as the driver tends to be in the same place, driving the same route, day in and day out,” Valstar said. “You’ll be able to start looking at some very small changes over time, which is great for, say, detecting degenerative disease. I expect that in 10 years’ time, you’ll be getting a free assessment of your health, just as a bonus of driving your car.” ■
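As a toy illustration of that longitudinal idea, and not a method either company has detailed, the sketch below compares a driver’s recent average for some per-drive metric against their own long-term baseline and flags slow drift. The metric, window lengths, and threshold are all assumptions.

from statistics import mean, stdev

def detect_drift(daily_values, baseline_days=60, recent_days=14, z_thresh=2.0):
    """Flag when the recent average drifts well outside the personal baseline."""
    if len(daily_values) < baseline_days + recent_days:
        return None  # not enough history to say anything yet

    baseline = daily_values[:baseline_days]   # earliest days define the personal norm
    recent = daily_values[-recent_days:]      # most recent days under test
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return None

    z = (mean(recent) - mu) / sigma
    return {"z_score": z, "drifting": abs(z) > z_thresh}

if __name__ == "__main__":
    # Synthetic history: stable for ~80 days, then a slow downward trend.
    history = [0.30 + 0.01 * (i % 3) for i in range(80)]
    history += [0.30 - 0.004 * i for i in range(30)]
    print(detect_drift(history))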
