A deadly 2019 crash involving a Tesla Model S prompted new scrutiny over who should be held accountable in such cases.
Tesla driver Kevin George Aziz Riad, 27, has been charged with two counts of vehicular manslaughter in the deaths of Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez.
An April 2020 civil lawsuit filed by the victims' families in Los Angeles County Superior Court alleges that at the time of the December 2019 crash, Riad was traveling at "excessive speed" while using Tesla's Autopilot, semi-autonomous driving technology that can steer, brake and accelerate the vehicle on its own.
The National Highway Traffic Safety Administration confirmed that Autopilot was active at the time of the crash after dispatching officials to investigate, The Associated Press reported.
Prosecutors said they could not immediately release more details about the case.
The charges against Riad appear to mark the first time a driver in the US has been charged with a felony in a fatal crash involving semi-autonomous driving technology. Not only will the case renew discussion about the dangers of over-relying on such technology, but it also has the potential to set a standard for holding motorists accountable in similar incidents.
In an interview with NBC News, Aron Solomon, Esquire Digital's head of legal analysis, said: "Autopilot or not, from the second they get in the vehicle, the driver is responsible for the vehicle and everything that happens." Referring to the negligence charges Riad is facing, Solomon said the legal concept of negligence is "situational."
"The question is always what makes sense in that situation. That is always what the law will consider."
Michael Brooks, executive director at the Center for Automotive Safety, a nonprofit advocacy group focused on the US auto industry, said he hopes Tesla drivers and owners see the case and understand that Autopilot has limitations. "It's not going to get them from any point A to any point B safely every time, and they need to be held accountable for the car's actions," Brooks said.
Jonathan Handel, a law professor at the University of Southern California's Gould School of Law and an expert on autonomous vehicles, said he hopes the case shows that semi-autonomous systems, like Autopilot, are not a replacement for the driver.
"I think it's going to have an impact on how drivers approach the technology, and so it's hopefully going to have an impact on the way the industry operates," he said, adding that he believes Tesla bears some of the responsibility for the deaths.
Tesla did not respond to multiple email requests for comment. In a court filing, the company argued that "the Model S meets or exceeds all of Tesla's internal standards as well as applicable industry standards, including but not limited to those issued by US national standards bodies."
Lopez and Nieves-Lopez were driving through the intersection of Artesia Avenue and Vermont Avenue in suburban Gardena, Southern California, when their car was hit by Riad’s vehicle, according to the lawsuit.
Riad's car had just exited a freeway when it ran a red light. Lopez and Nieves-Lopez died at the scene. Riad, a limousine service driver, and his passenger were hospitalized with non-life-threatening injuries.
On its website, Tesla says the Autopilot features are "designed to assist" the driver, require active driver supervision and "do not make the vehicle autonomous."
However, Brooks says the term Autopilot can mislead people into thinking the vehicle is more capable than it is, an assumption that can have deadly consequences.
Over the years, Tesla has made headlines after accidents involving its semi-autonomous technology. In one crash reported by The New York Times, a Tesla sped past a stop sign and flashing red lights and crashed into another vehicle, killing a 22-year-old university student. The Tesla driver was not charged in that incident.
Last month, Param Sharma was arrested in California and charged with two counts of reckless driving and disobeying a peace officer after police found him in the back seat of his Tesla as it sped down a highway with Autopilot activated. No one was injured.
"People think of it as an autonomous vehicle or a self-driving car when in reality all Teslas have is an advanced driver assistance system that uses things like lane keeping, adaptive cruise control, braking and other functions to give the driver a less burdensome experience," explains Brooks. "And so we tend to see people overestimate the capabilities of the vehicles."
Bryant Walker Smith, a law professor at the University of South Carolina who studies autonomous vehicles, said he hopes the case alerts all motorists to pay more attention to the road, regardless of how capable their vehicle is.
"I don't want people to hear this and think, 'Oh, that's not my problem. I don't have an expensive car.' Because distraction and reckless driving are a problem for the millions of Teslas on the road, the tens of millions of other vehicles that have driver assistance systems, and the hundreds of millions of vehicles that don't," he said.
“This should be a wake-up call to people that they are operating dangerous machines.”
Source: "Tesla driver charged with manslaughter in deadly Autopilot crash raises new legal questions about autonomous driving tech," NBC News: https://www.nbcnews.com/news/us-news/tesla-driver-charged-manslaughter-deadly-autopilot-crash-raises-new-le-rcna12987