Wednesday, October 16, 2019
In the upcoming era of autonomous vehicles, mandating that a self-driving robo-car pass a driver’s test before it’s allowed to roam free on public roads strikes most people as a sensible precaution.
Indeed, the idea isn’t new. I remember reading an op-ed essay, “A 16-Year-Old Needs a License. Shouldn’t a Self-Driving Car?” three years ago in the New York Times.
The authors wrote:
We have tests for driver’s licenses because people differ in their skills and abilities. Systems for self-driving vehicles are no different in this respect: When we share the road, we need to know who, or what, is behind the wheel.
Well put.
But some safety experts argue that giving an autonomous vehicle (AV) a driver skills test would not necessarily prove that the AV is safe.
Imagine, for example, that you are driving on a highway and you notice that the mattress strapped to the roof of the vehicle ahead is coming loose. You, a human driver, are expected to recognize the impending peril and have enough common sense to take precautions: perhaps steering into another lane, braking to create more distance between you and the car ahead, or, faced with airborne bedding and no other options, running over the obstruction.
Or picture this. As you’re driving along Maple Street, a bouncing ball suddenly appears in the road. As an experienced driver, you allow for the possibility that a kid chasing the ball might pop up next, even though no kid is yet in view. That almost instinctive anticipation prompts elevated caution, and you slow down.
Humans are not always right in predicting or intuiting consequences, but human drivers are expected to be mature enough to think ahead, consider their options and take the necessary actions when unexpected developments occur on the road.
But how does this work if the driver’s not a person? Do AVs have the innate savvy to anticipate the unlikely and devise a safety strategy that minimizes the potential harm? And how would we tell?
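What might encoding even one such piece of common sense look like? Here is a minimal, purely illustrative sketch, assuming a perception stack that simply reports object labels; the labels, speeds and slowdown factor are assumptions for illustration, not anything drawn from a real AV system.

# Purely illustrative: a hand-written anticipation rule of the kind a human
# driver applies instinctively. Object labels, speeds and the slowdown
# factor are assumptions, not taken from any real AV stack.
from typing import Iterable

def target_speed(current_speed_mph: float, detected_objects: Iterable[str]) -> float:
    """Return a reduced target speed when a hazard cue is seen,
    even though the hazard itself (a child) is not yet in view."""
    objects = set(detected_objects)
    if "child" in objects:
        return 0.0                      # the hazard is real: stop
    if "ball" in objects:
        return current_speed_mph * 0.5  # cue only: anticipate and slow down
    return current_speed_mph            # nothing unusual: keep going

# Example: a ball bounces into Maple Street, no child visible yet.
print(target_speed(25.0, ["ball", "parked_car"]))  # prints 12.5

The point is not this particular rule, but how many such rules a human driver carries around without ever writing them down.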
Last week, during a webinar on UL 4600, that question popped up: Do we need road testing standards for autonomous vehicles?
Underwriters Laboratories, which has been developing a safety standard for autonomous products (called UL 4600), released its first draft last week. Phil Koopman, CTO of Edge Case Research and one of the standard’s principal writers, is quick to acknowledge that UL 4600 “is not a road test.”
Responding to the question of road-test standards, Koopman said:
There’s been a lot of sentiment to have more or less a driver’s test. Have the car do these maneuvers… and there’s some thought that if you do that, the vehicle will be ready.
But Koopman added, passing such a road test — checking the driver’s skills — is only one piece of what’s required today.
He noted, “In my mind, the driver’s test has three big pieces.” One is a written test to see if you know all the rules of the road. Another is the road test. “Certainly, it makes sense to see if these vehicles can do the basic things.”
Koopman added, “But the problem is that it [a road test] doesn’t prove it’s safe. Because the third piece [required in obtaining a driver’s license] is a birth certificate to prove you are 16 years old and human, or whatever the local age is.”
Why being 16 is required
Koopman made the same point at an automotive safety workshop in Brussels last month. But it didn’t hit me until later why “being 16 years old” is such a vital ingredient. The question that must be asked is: Does an AV model have the same basic “common sense” that a 16-year-old is expected to possess when he or she gets behind the wheel alone and hits the road?
Some 16-year-olds don’t, but we let them drive anyway — provided they can demonstrate they possess the basic skills. For good or ill, society has established 16 years as the rock-bottom minimum of experience, knowledge, discretion and trust.
But the question remains: are AV designers confident in declaring that their robo-cars are at least as intellectually and emotionally mature as a high-school kid in a used Camaro?
Koopman stressed during the webinar that UL 4600 does not specify AV road tests. Rather, he said, UL 4600 “defines the standard of care. If you do the following things in your safety case, we believe you’ve made a reasonable effort in trying to engineer your system [to be] acceptably safe.”
In short, UL 4600, consisting of extensive lists of best practices (and pitfalls), is a guide for AV designers, meant to instill in their machines roughly the level of prior knowledge and common sense a competent 16-year-old driver is expected to exercise.
As a kid growing up, when I got into a jam, my parents often asked me, “Junko, #DidYouThinkofThat?” Of course, as this was the pre-Twitter era, my parents used no hashtag. Still, every time Koopman cites examples of “#DidYouThinkofThat?” in describing UL 4600, I can’t help but cringe. I ask myself, “Damn, why didn’t I think of that before?”
Why no road tests?
Koopman made it clear during the webinar that UL 4600 requires no AV road test. “Personally, I think it’s premature to have a road test [for AVs]” for now, he said.
Today, every AV is built differently. Some models are good at doing certain things but bad at other things.
Moreover, let’s not forget how rapidly technologies are advancing.
Making a one-size-fits-all road test today means developing a “one-size-fits-none,” Koopman noted.
Instead of mandating a road test, Koopman believes it is better for AV developers to specify their own metrics and to argue why they believe that set of tests is enough to prove the safety of each AV.
Safety case
UL 4600 puts the “safety case front and center,” as Koopman notes. Unlike existing safety standards that prescribe “how to do safety” by following step 1, step 2 and step 3, UL 4600 is about “how you’ve done it [safety] enough,” he explained.
But what exactly does a “safety case” mean? “It’s a methodical way to show the use of the best practices,” Koopman explained during the webinar. “Here are all the things we did, all the things we think could be problems and here’s how we mitigate them.” In making a safety case, safety system designers should be able to say, “Yep, we’ve thought of all that.” Moreover, they must be able to back up their claims with specific evidence and test results derived through engineering rigor.
For example, assume designers have developed robo-cars that will not hit pedestrians. It’s not enough for AV designers to claim this achievement. They should be ready to argue that the AV “will detect pedestrians of all types” (small, tall, short, showing bare legs, wearing Day-Glo jackets, etc.) and that “it will stop or avoid… pedestrians.” Further, designers should also demonstrate that they have thought through instances in which AVs might encounter “difficult-to-detect pedestrians,” and can still identify and mitigate the risks such circumstances create.
Of course, these arguments must be backed by evidence. AV designers should be ready to show not only the “results of detect and avoid tests,” but also, “analysis of coverage of different types of pedestrians” and how the data “shows high pedestrian coverage.”
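To make that claim-and-evidence structure concrete, here is a minimal sketch of how a developer might record such a safety-case claim alongside the coverage evidence behind it. The class names, pedestrian categories, test counts and the 99 percent target are illustrative assumptions; UL 4600 specifies none of them.

# A minimal, illustrative sketch of a safety-case claim recorded together
# with the evidence that backs it. This is not an implementation of UL 4600;
# the pedestrian categories, counts and 99% target are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Evidence:
    description: str   # which pedestrian category and test campaign
    detections: int    # pedestrians correctly detected in this test set
    total: int         # pedestrians present in this test set

@dataclass
class Claim:
    statement: str     # the safety claim being argued
    evidence: List[Evidence] = field(default_factory=list)

    def coverage(self) -> float:
        """Fraction of all tested pedestrians that were detected."""
        detected = sum(e.detections for e in self.evidence)
        total = sum(e.total for e in self.evidence)
        return detected / total if total else 0.0

    def is_supported(self, required_coverage: float = 0.99) -> bool:
        """Does the accumulated evidence meet the declared coverage target?"""
        return bool(self.evidence) and self.coverage() >= required_coverage

# Hypothetical usage: one claim, with evidence gathered per pedestrian category.
claim = Claim("The AV detects, then stops for or avoids, pedestrians of all types.")
claim.evidence.append(Evidence("adults in high-visibility jackets, track test", 500, 500))
claim.evidence.append(Evidence("children, partially occluded, simulation", 480, 500))

print(f"coverage = {claim.coverage():.1%}, supported = {claim.is_supported()}")

In this toy example the accumulated evidence falls just short of the declared target, which is exactly the kind of gap a safety case is meant to surface before the vehicle ever reaches a public road.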
Beyond requiring AV designers to maintain engineering rigor and discipline in developing safety systems, UL 4600 is unique in the sense that it incorporates [safety] lessons learned from multiple industries and companies over time.
In other words, it embraces Koopman’s persistent question, #DidYouThinkofThat?