Tesla knew Autopilot caused death, but didn’t fix it
Software’s alleged inability to handle cross traffic central to court battle after two road deaths
Yet Phoney Stark keeps on whinging about the risks of AI, but at the same time slags off humans who actually know their stuff, especially regarding safety.
Tesla Autopilot has nothing to do with AI. It is a lane keep assist system with cruise control.
Which uses computer vision, and computer vision is a form of AI. It doesn’t have to be complex, or even work well, to be considered AI. All you need is a computer that makes decisions based on dynamic inputs and a set of rules for how to handle them.
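To make that concrete, here is a minimal sketch of a rules-on-dynamic-inputs decision. It is a hypothetical lane-offset-to-steering rule with made-up thresholds, not anything from Tesla's actual software:

```python
# Hypothetical, minimal rule-based decision from a dynamic input.
# Illustrates the point above: a simple rule applied to sensor data
# already fits a broad definition of AI. Not Tesla's implementation.

def steering_correction(lane_offset_m: float, dead_band_m: float = 0.1,
                        gain: float = 2.0, max_deg: float = 5.0) -> float:
    """Map a measured lateral offset from lane centre to a steering angle."""
    if abs(lane_offset_m) < dead_band_m:   # close enough to centre: do nothing
        return 0.0
    correction = -gain * lane_offset_m     # steer back toward the centre line
    return max(-max_deg, min(max_deg, correction))

# Dynamic input (e.g. from a camera-based lane detector) -> rule-based decision
print(steering_correction(0.35))   # -0.7 degrees: nudge back toward centre
```

By that broad definition even this toy controller counts as AI; learned computer-vision perception just sits further up the same pipeline.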
Which is why the name Autopilot is (very dangerous) false advertising.
Also, let’s not forget the “your Tesla will be an autonomous Robotaxi” bullshit.
Autopilot is not Full Self-Driving. FSD is an additional $15k option on top of Autopilot. The article posted here was about an accident in 2019, before FSD was available to anyone. My Tesla fully drives itself every single day, regardless of what you might think.
https://www.oxfordlearnersdictionaries.com/definition/american_english/autopilot#:~:text=Definition of autopilot noun from the Oxford Advanced,the need for a person to control it
It’s false advertising, or at least a poor naming choice. Should have called it “Megadrive Exo Giga Assistent” or something.
It has a lot to do with AI. Their systems use deep learning and related techniques to recognize agents/obstacles on the road (perception), to infer how those agents will move in the future (prediction), and to generate trajectories for their car (motion planning). It definitely isn’t Artificial General Intelligence, but it is most certainly AI.
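As a rough illustration of that perception → prediction → planning structure, here is a hypothetical sketch of one tick of such a loop. All names and numbers are made up for illustration; this is not Tesla's code, and the learned models are stubbed out with trivial stand-ins:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical perception -> prediction -> planning pipeline, mirroring the
# three stages described above. None of this reflects Tesla's actual software.

@dataclass
class Agent:
    position: Tuple[float, float]   # metres, ego-relative
    velocity: Tuple[float, float]   # metres per second

def perceive(camera_frame) -> List[Agent]:
    """Perception: a learned detector would turn pixels into tracked agents.
    Stubbed here with one fixed detection for illustration."""
    return [Agent(position=(20.0, 1.5), velocity=(-2.0, 0.0))]

def predict(agents: List[Agent], horizon_s: float = 2.0) -> List[Tuple[float, float]]:
    """Prediction: estimate where each agent will be at the horizon
    (constant-velocity assumption as a stand-in for a learned model)."""
    return [(a.position[0] + a.velocity[0] * horizon_s,
             a.position[1] + a.velocity[1] * horizon_s) for a in agents]

def plan(predicted_positions: List[Tuple[float, float]],
         cruise_speed: float = 25.0) -> float:
    """Motion planning: pick a target speed that keeps clear of predicted agents."""
    closest_ahead = min((p[0] for p in predicted_positions if p[0] > 0), default=None)
    if closest_ahead is not None and closest_ahead < 30.0:
        return cruise_speed * closest_ahead / 30.0   # slow down proportionally
    return cruise_speed

# One tick: sensor input -> detected agents -> future positions -> speed command
agents = perceive(camera_frame=None)
futures = predict(agents)
print(plan(futures))   # ~13.3 m/s instead of 25 m/s, because an agent is closing in
```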
You are referring to FSD, not Autopilot.