Some experts question their safety
Uber has been available in Pittsburgh for two years, and this week the company is unveiling its driverless taxis to journalists, with a public rollout to follow soon. Some safety experts are concerned about the sophistication, or lack thereof, of the technology and about the use of ordinary passengers to test it.
Early testers as “guinea pigs”
A major issue safety experts raise with the driverless taxis is that Uber is using ordinary riders to test them. The company currently has 12 taxis in Pittsburgh and hopes to have 100 in service by the end of the year. However, Pennsylvania has not yet passed any laws that permit or regulate self-driving cars, covering how they may be tested, how they may be driven, or what happens when there is an accident.
Joan Claybrook, former administrator of the National Highway Traffic Safety Administration, warns that “of course” there will be accidents with the new cars, and argues that testing them without putting ordinary members of the public in the vehicles would be the better course of action.
Pittsburgh’s unique driving conditions
The driverless taxi project was launched in Pittsburgh because Carnegie Mellon University’s prestigious robotics research center is nearby. However, an oft-mentioned shortcoming of the cars is that they have trouble dealing with bridges – a problem for Pittsburgh, which has more bridges than any other major city in the United States.
Pittsburgh also poses other unique challenges: narrow old roads, intersections that meet at odd angles, winter weather and aging signage. Proponents argue that the technology can recognize these conditions and that a machine may navigate them more safely than an average human driver. The rollout also gives the public a chance to try the new technology and get acquainted with it – new technologies can’t take off if people don’t start using them.
What happens in an accident?
Last May, the first reported death involving a self-driving car occurred in Florida, when a Tesla Model S operating in Autopilot mode failed to detect a tractor-trailer crossing its path and drove into it. So far, fault has not been assigned entirely to either the driver or the car’s technology – Tesla’s beta Autopilot system warns drivers to stay alert and keep their hands on the wheel at all times.
Because the technology is being pushed to the public as quickly as companies can release it, many worry that it isn’t up to par and will cause more accidents than it prevents. So who is to blame in an accident, and who pays for damages? As the Tesla case shows, investigating an accident involving a self-driving car will be far more complicated.