Wednesday, December 26, 2018
The human mind has a tendency to wander off, unbidden, to somewhere irrelevant to the immediate, often urgent, matter at hand, without its owner necessarily noticing the detour.
Cognitive scientists have long recognized this phenomenon and have been studying it for decades. They call it “mind wandering.” They talk about people getting hit by a train of thought that mentally disengages them from an attention-demanding task like reading, a face-to-face meeting, or, most important of all, driving.
Understanding the cognitive state of a human brain, or the mental status of a human driver at any given moment, is believed to be the next frontier for driver-monitoring systems (DMS).
Think of Level 1, Level 2, or Level 3 cars with partially automated driver-assistance systems. Even when your hands are on the wheel (as Tesla reminds you to do), your head is erect, and your eyes are open and fixed on the road ahead, your brain could be picking daisies in La La Land.
When this happens, how can a machine — your car — recognize your impaired condition? What actions should it be programmed to take?
To some people, DMS is an unglamorous, old-school, “boring” technology. They even ask, “Why DMS, with Level 4 autonomous vehicles hitting the home stretch?” Such notions, however, couldn’t be further from the current state of the art, according to Colin Barnden, lead analyst at Semicast Research.
Ever since Euro NCAP, the voluntary vehicle safety rating program, made the driver-monitoring system a primary safety standard in its roadmap for 2020, car OEMs and Tier Ones have been mobilizing to put DMS in every Level 2 car by that date, explained Barnden. He believes that “the moment automation comes into a vehicle,” regardless of autonomy level, “it’s time to introduce DMS.”
The market already has a host of DMS technologies from companies such as Seeing Machines (Canberra, Australia), Smart Eye AB (Gothenburg, Sweden), Affectiva (Boston), and FotoNation (San Jose, California).
An Israel-based startup called ADAM is poised to join the club, claiming that it is the first to add a “cognition” layer to traditional driver-monitoring systems. The startup is at the “proof-of-concept” stage for many of the features to be embedded into its adaptive driver attention management (ADAM) platform.
However, Seeing Machines, for example, has amassed voluminous data on driver behavior through 18 years of “human factors” R&D, so Barnden is skeptical that ADAM is truly the first with a “cognitive layer.” But he acknowledged that leading DMS technologies will no longer be about measuring just one thing, such as head position, eye gaze, or eyelid closure. Instead, “human factors” work needs to consider a variety of parameters.
Why ADAM?
EE Times caught up with Carl Pickering, former global lead of Jaguar Land Rover Automated Driving HMI and head of Research and Technology Strategy, who was announced this week as ADAM’s CEO.
In an exclusive phone interview with EE Times, Pickering explained that he decided to join ADAM after 20 years with JLR because it is the first company he has found with the critical technology that he had long sought at his previous employer.
In developing HMI for cars, the critical question for a scientist or engineer is how best to measure the driver’s workload. A carmaker could decide to adopt hand gestures, handwriting, or voice recognition for its man-machine interface. But in evaluating each, “we must be able to measure the driver’s manual workload as the driver tinkers with switches and knobs in a car and visual workload as the driver looks at the road ahead,” said Pickering. The third element that comes into play is the “cognitive workload” that a driver must bear. “But really, how could we measure that? I had not been able to figure it out until I met a team at ADAM in Israel.”
AI algorithms
ADAM, established barely a year ago, consists of 10 people — mostly cognitive scientists and AI experts. Combined, “ADAM has roughly 15 years of experience in both fields,” Pickering told us.
The team has examined a complex set of parameters, including eye gaze, pupil dilation, eyelid closure, blink rate, and others, to discern a driver’s cognitive level. Because that baseline cognition level varies from one person to another, the team used AI to develop algorithms that learn each driver’s individual norm. With this, ADAM can measure the deviation from a driver’s baseline cognition level on an individual basis.
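ADAM has not published its algorithms, so the following is only a minimal, hypothetical sketch of what per-driver baseline modeling could look like: a running mean and variance per feature (Welford’s online method), with deviation reported as an average z-score across features. The feature names and values are illustrative, not ADAM’s.

```python
# Hypothetical sketch of per-driver baseline deviation scoring.
# Feature names and numbers are illustrative only.
from dataclasses import dataclass, field
import math

@dataclass
class BaselineModel:
    """Tracks a running mean/variance of each cognitive-load feature for one driver."""
    means: dict = field(default_factory=dict)
    m2: dict = field(default_factory=dict)   # sum of squared deviations (Welford)
    n: int = 0

    def update(self, features: dict) -> None:
        """Welford's online update, so the baseline adapts to the individual driver."""
        self.n += 1
        for name, x in features.items():
            mean = self.means.get(name, 0.0)
            delta = x - mean
            self.means[name] = mean + delta / self.n
            self.m2[name] = self.m2.get(name, 0.0) + delta * (x - self.means[name])

    def deviation(self, features: dict) -> float:
        """Mean absolute z-score across features: ~0 at baseline, larger = more drift."""
        if self.n < 2:
            return 0.0
        scores = []
        for name, x in features.items():
            var = self.m2.get(name, 0.0) / (self.n - 1)
            if var > 0:
                scores.append(abs(x - self.means[name]) / math.sqrt(var))
        return sum(scores) / len(scores) if scores else 0.0

# Example frame-by-frame use with the kinds of parameters the article mentions.
model = BaselineModel()
frame = {"pupil_diameter_mm": 3.8, "blink_rate_hz": 0.3,
         "eyelid_closure_pct": 12.0, "gaze_dispersion_deg": 4.1}
model.update(frame)
print(model.deviation(frame))
```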
While acknowledging that all of these algorithms are still in the lab, Pickering said, “When I first saw how they are measuring the cognitive workload, I was blown away.”
Pickering sees most of the conventional DMS technology as “a point solution.” It serves to determine if a driver is falling asleep or drunk, often by measuring a single parameter such as head position or eye gaze.
More importantly, Pickering stressed, “we call ADAM an attention management ‘platform’ because we are taking a holistic approach to integrate a ‘cognitive layer’ both in DMS and ADAS.”
The startup’s claim to fame is that “ADAM does not require infrastructure and will work with existing vehicles.”
Asked about the basic building blocks of the ADAM platform, Pickering said that all it needs is an existing driver-facing camera. “Our IP is in software algorithms, so we plan to run them on existing image processors inside a vehicle.” The idea is that a software approach makes it easy to retrofit existing vehicles with ADAM.
However, if ADAM is, indeed, a holistic approach that integrates the measured deviation of a driver’s cognitive level into ADAS features, ADAM will surely need access to an existing in-vehicle ECU, designed around an ADAS SoC, and to its memory. Asked about the hardware requirements, Pickering said that the company plans to unveil details of the technology specifications in mid-January.
A machine-to-driver handover
Clearly, if ADAM works as claimed, the platform could offer a solution to what are deemed the eternal “handover problems” of L3 cars.
Currently, L3 is defined as “conditional automation.” A driver is necessary but is not required to monitor the environment. The driver, however, must be ready to take control of the vehicle at any time.
Seriously, though, when a car decides to hand the wheel over to its resident human, the car has no idea what the driver is doing. The system needs to determine whether the driver is sitting in the driver’s seat, whether his hands are on the wheel, where his head is facing, what he’s looking at and, most important, what his cognitive state is. Is he here or in La La Land? Only then can the machine decide whether it is safe to let this guy drive.
For a car to simply tell a driver “Here, take over” is irresponsible. The system must be able to assess the severity of a single point of failure inside the vehicle, which comes down to that human driver, his or her state of mind, and how long, given the driver’s cognitive wherewithal, the handover might take.
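To make that reasoning concrete, here is a minimal, hypothetical sketch of how a handover gate might weigh those checks. The states, time estimates, and fallback action are invented for illustration; they do not describe ADAM’s or any carmaker’s actual criteria.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    in_seat: bool
    hands_on_wheel: bool
    eyes_on_road: bool
    cognitive_deviation: float  # e.g., output of a baseline model like the sketch above

def handover_decision(state: DriverState, seconds_to_hazard: float) -> str:
    """Decide whether a machine-to-driver handover is safe right now."""
    # Hard precondition: no handover without a seated driver.
    if not state.in_seat:
        return "execute minimal-risk maneuver"  # e.g., slow down and pull over

    # Estimate take-over time; a disengaged or "wandering" driver needs longer.
    estimated_takeover_s = 4.0  # placeholder base value
    if not state.hands_on_wheel:
        estimated_takeover_s += 2.0
    if not state.eyes_on_road:
        estimated_takeover_s += 3.0
    estimated_takeover_s += 2.0 * state.cognitive_deviation  # mind-wandering penalty

    # Hand over only if the driver can plausibly regain control in time.
    if estimated_takeover_s < seconds_to_hazard:
        return "hand over with escalating alerts"
    return "execute minimal-risk maneuver"

# Example: hands off the wheel, mild cognitive drift, hazard 12 seconds away.
print(handover_decision(DriverState(True, False, True, 1.5), 12.0))
```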
Future ADAM roadmap
Over time, ADAM plans to add a forward-facing camera to the system, according to Pickering. Knowing exactly what’s up ahead on the road is critical for ADAM’s attention management platform because the platform will be providing visual or audible cues to the driver.
When a driver’s hands are not on the wheel or his head nods, a warning might come as a voice, a visual marker, or a physical nudge delivered through the driver’s seat. Perhaps more critical is warning the driver of a specific hazard on the road ahead as he “wakes up” and his cognition returns to baseline.
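As a rough illustration of that escalation idea, the sketch below maps a deviation score to the alert modalities the article mentions. The thresholds are arbitrary placeholders, not ADAM’s design.

```python
def choose_alerts(deviation: float, hazard_ahead: bool) -> list[str]:
    """Map a cognitive-deviation score to escalating alert modalities."""
    alerts = []
    if deviation > 0.5:
        alerts.append("visual marker")   # subtle cue first
    if deviation > 1.5:
        alerts.append("voice prompt")    # stronger, attention-grabbing
    if deviation > 2.5:
        alerts.append("seat nudge")      # physical, last resort
    # As the driver re-engages, point him at the specific hazard ahead.
    if hazard_ahead and alerts:
        alerts.append("highlight hazard ahead")
    return alerts

print(choose_alerts(1.8, hazard_ahead=True))
# ['visual marker', 'voice prompt', 'highlight hazard ahead']
```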
Of course, the last thing that a driver wants is a machine acting like a backseat driver constantly kibitzing and second-guessing routine road decisions.
Pickering explained, “ADAM’s goal is to seamlessly manipulate a driver’s attention through cues by directing his attention to things relevant to hazards, risk zones, or anything else we sensed.”
Again, ADAM is still in its early phase, with its technology still in the lab.
Nevertheless, the startup claims that its early lab-based simulations show significant improvement in both driver response times and situational awareness. “Tests indicate that ADAM improves driver attention and response time by up to 60%.”
If leading DMS tech companies, including a startup like ADAM, continue to push the limits of cognitive science and fold it into ADAS and DMS, the critics who called these technologies “boring” might end up eating their words.
Semicast’s Barnden predicts that the United States (and Japan and Korea) will “follow the EC and put camera-based DMS in all four-plus-wheeled road vehicles (truck, coach, and bus, too).” In his opinion, the DMS should be installed in every car as soon as is practically possible.
Backup driver-monitoring systems should be mandated for all L2–L5 test vehicles, he added. “Humans are the last line of defense in AV test vehicles to avoid collisions with other road users when the machines get it wrong. Backup drivers must be paying attention at all times.” He continued, “Tired, bored, or distracted? Then detect that and just end the test session. It is basic health and safety protocols. Elaine Herzberg would still be alive had Uber been fitted with proper camera DMS.”
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved