When you took your driving test and got your driver's license, somebody gave you a lecture about responsibility. Maybe it was a family member, a driving instructor, or a DMV examiner. You were responsible for the car, and it was your responsibility not to crash it.
You probably haven't gotten that lecture in a while, but it still holds. Someone must always be responsible for any moving vehicle. This is a basic doctrine of driving, of flying a plane, of operating a forklift, of cycling, hell, even of walking. If you run into someone else, you're the one at fault.
There is a lot of hemming and hawing out there about how autonomous vehicles make this question of responsibility more complicated. They don't. Or at least they shouldn't. Some people like to present a far more complicated responsibility matrix about who is to blame when something goes wrong, invoking philosophical thought experiments to make everything seem like a harder problem than it is. But whether the operator of the vehicle is a person or a computer, someone is still responsible.
Nevertheless, it would have been news to anyone listening to Tuesday's National Transportation Safety Board hearing on the death of Elaine Herzberg, the woman struck and killed by an Uber self-driving test vehicle in March 2018. The hearing's upshot: everyone is responsible.
The NTSB identified Uber's “inadequate safety culture,” whose problems have been well documented, as just one of many contributing factors. The board also blamed the crash on Uber's failure to adequately monitor its safety drivers (an issue separate from how the self-driving software was programmed, and more about how management designed the testing program to prioritize metrics like miles driven over safety in order to impress the company's new boss), as well as on government agencies like the Arizona Department of Transportation and the National Highway Traffic Safety Administration for declining to impose strict regulations or mandatory safety reporting.
Most strikingly, the NTSB found that Herzberg herself was partly responsible, because she crossed the street outside a crosswalk (never mind that the path she was walking spit her out into the middle of the block) and had drugs in her system at the time of the crash, which, in the most charitable interpretation, means Herzberg was less alert than she might have been to dodge the Uber SUV before impact.
The NTSB's “probable cause” finding, in its entirety:
The National Transportation Safety Board determines that the probable cause of the crash in Tempe, Arizona, was the failure of the vehicle operator to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal cell phone. Contributing to the crash were the Uber Advanced Technologies Group's (1) inadequate safety risk assessment procedures, (2) ineffective oversight of vehicle operators, and (3) lack of adequate mechanisms for addressing operators' automation complacency, all a consequence of its inadequate safety culture. Further factors contributing to the crash were (1) the impaired pedestrian's crossing of N. Mill Avenue outside a crosswalk, and (2) the Arizona Department of Transportation's insufficient oversight of automated vehicle testing.
In sum, the NTSB concluded that the blame lay primarily with the safety driver, a contractor Uber hired to monitor the software driving the car. Uber's inadequate safety culture was deemed merely a “contributing factor.” Herzberg's own behavior and the Arizona Department of Transportation's lax oversight of AV testing also got name-checked.
This diffusion of blame was consistent with the hearing's general tone, which was deferential and at times even complimentary toward Uber. That may seem strange, given that the NTSB's investigation concluded the company's lapses led to the crash. But time and again, board members not only excused Uber but praised its actions after the crash, as if the company that existed before Herzberg's death were entirely separate from the one that existed after. It was all very sporting.
NTSB Chairman Robert Sumwalt's closing remarks typified the approach. “Uber ATG has truly embraced the lessons of this tragic event,” Sumwalt said. “They've really embraced those lessons, and we want to encourage them to continue on this journey, and we want others to learn from it.”
You could almost hear the unspoken sentiment, the thing he clearly wanted to say: hey, we all make mistakes.
Meanwhile, the NTSB was never quite clear on what, exactly, Uber's journey consisted of, beyond vague cultural changes the company says it has made and its decision to require two safety drivers per car instead of one (would that have made a difference in the 5.6 seconds the distracted driver had to save Herzberg's life?). And although Sumwalt said at the opening of the hearing that he hoped other AV companies would learn from this, it was never clear what he hoped they would learn.
The hearing's summary of findings offers no clues, either. It includes the following recommendation to the state of Arizona, NHTSA, the American Association of Motor Vehicle Administrators, and, finally, to Uber itself:
Complete the implementation of a safety management system for automated driving system testing that, at a minimum, includes safety policy, safety risk management, safety assurance, and safety promotion.
These “recommendations” are Homer Simpson-esque in their lack of substance, as if it were enough to simply repeat the word “safety.”
To be clear, the NTSB doesn't have the power to punish anyone. It conducts investigations and makes recommendations. But it is well within its rights to do better than this.
But the NTSB's chumminess with Uber (in a thinly veiled dig at Elon Musk, Sumwalt praised Uber's CEO for not attacking the board) would be merely a curiosity if it weren't emblematic of the fact that no one has been held responsible for Herzberg's death. Prosecutors declined to charge Uber with any criminal wrongdoing. Herzberg's family reached an undisclosed settlement with the company less than two weeks after the crash, before many of the facts underlying the case had come to light.
In the end, the safety driver is the one most likely to bear the consequences for all of this. Prosecutors have not declined to charge her, and she was the only individual named in the NTSB's probable cause statement.
To be sure, the safety driver is far from blameless. She was watching clips of The Voice on her smartphone, tucked below the steering wheel, in the minutes before the crash. She wasn't watching the road. Had she been, she might have saved a life.
But that's precisely the point: the safety driver's job was not to drive the car, but to save lives when the computer failed. Sitting for hours in a car driven by a bad computer program is a weird and perhaps impossible job. Calling the safety driver the “operator” of the vehicle, as the NTSB does, is a category error. She wasn't driving; the computer was. That was not a mistake; it was the whole point.
The fundamental problem here is the series of decisions Uber made: to put computer programs with unacceptably high failure rates in charge of cars on public roads, and to ignore, or dismiss, decades of research showing that humans don't share responsibility well with computers, research that its rival Waymo studied and internalized years ago. Armed with the same information and much the same technology, Waymo decided it wouldn't take the risk Uber took. Uber took it anyway.
Someone must always be in control of the car. When a computer program is driving the car, the maker of that program is in control. When the program is this bad at driving, no one is in control. And a human backup driver cannot compensate for that.
The most cynical reading of the safety driver's role in all of this is that she was a fall guy: if anything went wrong, she would take the blame. I don't know that I'm willing to go that far, but if that was the plan all along, then the Uber crash suggests it worked. After all, Uber is legally in the clear, facing no criminal charges for this disaster. It even got a pat on the back from the crash investigators for saying all the right things. The safety driver may still go to jail.