Uber: Fatal crash entirely avoidable, say police
Reuters today obtained a report from the Tempe, AZ police through a Freedom of Information Act request. According to the report, the crash that occurred on March 18, 2018, killing a female pedestrian, was "entirely avoidable".
The police report confirms that the operator of the Uber vehicle involved in the crash was not looking at the street at, or immediately before, the time of the accident. Instead, she was looking down, pausing only momentarily to glance at the road ahead. The operator had told the police that neither her business nor her personal phone had been in use until she called 911 after the accident. The local authorities, however, obtained records from Hulu confirming that the account linked to the vehicle's operator was streaming the reality talent show "The Voice" at the time of the crash. The stream ended at 9:59 p.m., which matches the approximate time of the collision, according to the police.
It should be noted that the vehicle's detection systems (which include front- and side-facing cameras, LIDAR, etc.) detected an unknown object approximately 6 seconds before the collision occurred. The system classified the victim first as an unknown object, then as a vehicle, a bicycle, and eventually a pedestrian. Although the system was designed to engage emergency braking to avoid a collision, the emergency braking function had been deactivated, and the vehicle operator was expected to apply the brakes instead. Sadly, Uber does not seem to have built in a system that would alert the operator to the need for braking.
The police report confirms that the detection systems of the vehicle operated as they should have - to some extent: they detected the pedestrian, predicted the impending collision, and calculated the braking or change of path necessary to avoid it. The fact that an innocent woman was killed despite that, though, highlights three issues with respect to the pursuit of autonomy.
Firstly, the accident shows that the competitive race to a Level 4 / Level 5 vehicle encourages AV testing that is sometimes irresponsible. This was the case with Uber: irrespective of the operator's obvious mistakes, the system was not ready for prime-time testing on public streets.
Secondly, the reaction of the victim highlights another shortcoming of the currently available "autonomous" vehicles. Pedestrians are uncomfortable interacting with them: they can hardly tell them apart from manually driven vehicles, and even when they do, they react as they would to a manually driven vehicle (although in this case, the victim tested positive for methamphetamine and marijuana, which could explain her apparent lack of awareness that a collision was about to happen).
Thirdly, and perhaps most importantly, the behaviour of the operator is probably indicative of the behaviour of a large group of people with access to vehicles offering some level of autonomous driving. Although Level 2/3 autonomy is not meant to be used as full autonomy, it is often perceived as safe or good enough, giving people the impression that taking their eyes off the road is acceptable. Unfortunately, only a week after the Uber crash, a Tesla Model X crashed in Mountain View, killing its driver, a 38-year-old software engineer working for Apple who, according to Tesla, had removed his hands from the steering wheel and had ignored multiple warnings from the car to regain control.