Tesla knew Autopilot caused death, but didn't fix it

L4sBot@lemmy.world (mod) to Technology@lemmy.world – 588 points –
Claim: Tesla knew Autopilot caused death, but didn't fix it
theregister.com

Tesla knew Autopilot caused death, but didn't fix it: Software's alleged inability to handle cross traffic central to court battle after two road deaths



"A bit misleading" is, I think, a bit of a misleading way to describe their marketing. It's literally called Autopilot, and their marketing material has very aggressively pitched it as a 'full self driving' feature since the beginning, even without mentioning Musk's own constant and ridiculous hyperbole when advertising it. It's software that should never have been tested outside of vehicles run by company employees under controlled conditions, but Tesla chose to push it to the public as a paid feature and significantly downplay the fact that it is a poorly tested, unreliable beta, specifically to profit from the data generated by its widespread use, not to mention the price they charge for it as if it were a normal, ready to use consumer feature. Everything about their deployment of the system has been reckless, careless, and actively disdainful of their customers' safety.

You don't even seem to get the terms right, which makes me question how well informed you really are on the subject.

Autopilot is the most basic driver assist version, which comes free with every Tesla. Then there's Enhanced Autopilot, which costs extra and is more advanced, and lastly there's Full Self Driving BETA. Even the name indicates you probably shouldn't trust your life with it.

Everybody who has even a rough idea of what an autopilot in a plane actually does is not misled. Do people really think that commercial airline pilots just hit the "autopilot" button in their cockpit after disengaging the boarding ramp and then lean back until the boarding ramp at the destination is attached?

So I need to understand the autopilot of a plane first before I buy a car?

I would be misled then, as I have no idea how such autopilots work. I also suspect that those two systems don't really work the same. One flies, the other drives. One has traffic lights, the other doesn't. One is operated by well-paid professionals, the other, well, by me. Call me simple, but there seem to be some major differences.

I would have thought people would read "autopilot" and think "automatic". At least that's what I do. I guess "pilot" is closely associated with planes, but it certainly isn't what I think of.

This is a pretty absurd argument. You could apply this to literally any facet of driving.

"I have to learn what each color of a traffic light means before driving?"

"I have to learn what white and yellow paint means and dashes versus lines? This is too confusing"

God help you when you get to 4-way stops and roundabouts.

Not absurd, but reality. We do that in driving school.

I don't know where you are from and which teaching laws apply, of course, but I definitely learned all those lessons you mentioned.

That's precisely my argument and why "learning my new car's features is too confusing" is an absurd argument.

Yeah, there are some major differences in the vehicles, but both disengage when there's anything out of the ordinary going on. Maybe people base their understanding of autopilots on the movie "Airplane!", where the inflatable puppet groped the stewardess afterwards.


They're not buying a plane though. They're buying a car with an autopilot that is labeled as "full self driving". That term does imply it will handle a complete route from A to B.

People are wrongly buying into the marketing hype and that is causing crashes.

I'm very concerned about some of the things I've seen regarding FSD on Teslas, such as sudden hard braking on highways, failing to avoid an accident (but it's OK, it disengaged seconds before impact, so the human was in control), and of course the viral video of FSD trying to kill a cyclist.

They should not be allowed to market the feature this way and I don't think it should be openly available to normal users as it is now. It's just too dangerous to put in the hands (or not) of normal drivers.

Autopilot has never been "Full Self Driving". FSD is an additional $15,000 package on top of the car. Autopilot is the free system providing lane keeping with adaptive cruise, the same as "ProPILOT Assist" or "Honda Sensing" or any of the other packages from other car companies. The only difference is that whenever someone gets in an accident using one of those other technologies, we never see headlines about it.

I've never sat in a Tesla, so I'm not really sure, but based on the things I've read online, autopilot and FSD are two different systems on Tesla cars you can engage separately. There shouldn't be any confusion about this.

I've never sat in a Tesla, so I'm not really sure

There shouldn't be any confusion about this.

U wot m8?

Well, if it's just the lane-assistance Autopilot that is causing this kind of crash, I'd agree it's likely user error. The reason I say "if" is that I don't trust journalists to know or report on the difference.

I am still concerned the FSD beta is "out there", though. I do not trust normal users to understand what "beta" means, and of course no one is going to read the agreement before clicking agree. They just want to see their car drive itself.

If it were about the FSD implementation, things would be very different. I'm pretty sure that the FSD is designed to handle cross traffic, though.

I do not trust normal users to understand what beta means

Yeah, Google kinda destroyed that word in the public consciousness when they ran their search with the beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That's like testing a new fire truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

Yeah, I don't trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who's really good at driving though? Humans. Perfect track record, those humans.

Why do you think companies need to print warnings like "Caution: Contents are hot" on paper coffee cups? People are stupid.

Those labels are there because people made a quick buck suing the companies when they messed up, not to protect the stupid customers.

If the courts applied a reasonable level of common sense, those labels wouldn't exist.
