In a tragic and unfortunate case, a man lost his life while behind the wheel of a Tesla Model S in autonomous mode. Without meaning to sound harsh, it was only a matter of time before this happened. Google has been testing its autonomous cars for years now and numerous small incidents have taken place, but, to be fair, most were not the car's fault. Unlike Google, which kept its testing internal, Tesla released its Autopilot technology to all Model S and Model X owners in October 2015. That means real people are out there in cars that are driving themselves down the highway.
Both the Model X pictured above and the Model S feature Tesla's Autopilot beta technology
In the past month alone, reports have popped up of people reading newspapers while at the wheel, and one person even appeared to be asleep while his Tesla drove itself down the freeway. This kind of behaviour is generally frowned upon, as the driver has a responsibility to remain alert even when relinquishing control to the car. Nevertheless, these reports suggested the system worked seemingly perfectly. Until yesterday, when Tesla revealed the sad news that someone had died behind the wheel of one of its cars in Autopilot mode.
On May 7th 2016, a Tesla Model S in Autopilot mode was going down a Florida freeway when it ran into the side of a trailer truck that was cutting across the highway to get to the other side. Unfortunately, the side of the truck was white, and neither the car nor the driver saw it against the bright blue sky. The car simply ran into it without touching the brakes, turning off Autopilot or warning the driver. Calling it a tragic loss, Tesla released a statement saying the extremely rare incident happened because the high ride height of the trailer meant that the car struck it at windscreen level. Had the car hit the front or rear of the truck, Tesla says, the chances of survival would have been high, considering that the Model S has an immensely safe crash structure. This claim was recently borne out when a Tesla Model S withstood a massive accident in Germany, where the car went over an 80-foot drop and rolled over numerous times. Incredibly, all five teenage occupants survived.
Autopilot beta can automatically accelerate, brake and change lanes at highway speeds, but Tesla insists drivers must have their hands on the wheel and be alert at all times
So what went wrong here? The key point to note is that Tesla does not call Autopilot an autonomous driving technology, but refers to it as more of a driving aid. Further, the technology itself has been called 'Autopilot beta', and Tesla makes no bones about the fact that it is still in the development phase. No wonder, then, that customers have been instructed to always keep their hands on the wheel and a sharp eye on the road.
Without wishing to speak ill of the dead, it would appear that this might not have been the case in this accident. The reason Tesla insists on this is that Autopilot comes under what the industry terms Level 3 autonomous driving. In simple terms, it means the car can accelerate, brake and even change lanes at freeway speeds, but requires the human to regain control in serious situations where the computers cannot take the necessary decisions. In an emergency situation, the car alerts the driver and deactivates Autopilot mode. If the driver fails to take over, the car will bring itself to a halt. Unfortunately, the Model S in Florida never sensed the impending danger and remained in Autopilot.
The solution to this weakness in the technology still lies a few years down the road. A Level 4 system, which is the ultimate goal for many companies including Tesla, will see the car carry out all driving functions, even in emergency situations. Level 4 systems from manufacturers like Volvo and Tesla are still deep in the development phase and are only expected by 2020.
Volvo is working on a fully autonomous Level 4 system that will allow the car to drive itself even in emergency conditions. This system is expected in 2020, the same time Tesla plans to debut its own Level 4 technology
As of now, self-driving technologies are in a dangerous transition phase; they are getting seriously good, good enough to let the 'driver' believe the car has full control when that isn't really the case. The National Highway Traffic Safety Administration (NHTSA) is currently investigating the crash, and it remains to be seen if such technologies will continue to be offered to the public while they are in the 'beta' test phase. Even if Tesla plugs this gap in Autopilot, there's no way to be certain new weaknesses won't crop up. After all, no one knows better than us Indians how unpredictable road conditions can get. Until then, it is imperative for the driver to remain the driver.