Tesla Under Investigation After Fatal Crash May Have Involved Autopilot System, Report Says

L4sBot@lemmy.world (mod) to Technology@lemmy.world – 200 points
forbes.com

I don't trust the tech at this point, and I certainly don't trust the people who are using full self-driving mode.

It's endangering everyone on the roads right now.

This is my opinion, too. Their "autopilot" feature is a glorified driving aid. It's not self-driving. It's supposed to help with driver fatigue, and you're supposed to keep both hands on the wheel. If it makes a mistake, that's okay, because you're driving the car, right?

Traditional cruise control without radar will hold whatever speed you set, and it won't stop for emergency vehicles, but we don't blame it for that. Even though the "autopilot" feature automates more, you're supposed to drive the car with exactly the same attention you'd give it with traditional cruise control.

But safety still comes first. If you're sending a freeway-speed land missile into motorcyclists and police cars, I don't care whether you were driving a '90s Civic or a car with automated driving features. The car hit someone. Fix that problem first, then figure out who to blame later.

In my opinion, until cars are guaranteed to be completely autonomous and the manufacturer no longer tells you to keep your hands on the wheel, you're still driving. It's your responsibility. You can still steer, brake, change lanes, evade, etc. That's on you. As far as I'm concerned, anyone who thinks otherwise might as well blame their heated seats or radio station.

I understand that Tesla needs to keep improving their software, and I agree with that, too. It's not great that they're fudging things quite a bit by pushing the self-driving rhetoric. They should focus on the software, and it should be improved. But I still think that negligent drivers are at fault.

Statistically they're still safer than human drivers, but nobody writes articles about your car dodging the woman drifting into your lane while eating a bowl of cereal and applying mascara on her morning commute.

I think your statement and the fear of self-driving can both be true at the same time.

Self-driving is safer than humans most of the time… but not all the time. Nothing is perfect.

Self-driving currently assumes that a human can intervene when it fails. It assumes that a human is present and not eating a bowl of cereal and applying mascara. It assumes that the human is actually paying attention, in a situation where they usually don't have to because self-driving is usually safer.

Yes, self-driving is statistically safer. Yes, self-driving will one day be perfect.

But I don't think we can fault anyone for being worried about self-driving, especially with companies like Tesla, who sell the promise that you don't really have to pay attention… even though you kinda do have to right now.

I don't fault anyone for being uncomfortable with new things that they aren't familiar with, but I absolutely do fault them for making wild accusations or fear mongering from a place of ignorance.

If we listened to half the comments on this post, the tech would be completely banned, the developers in prison, the companies bankrupt, and the number of avoidable collisions and deaths on the road would increase. People want to cut off their nose to spite their face here, while the consequences fall on the rest of us who share the road. This is what I take issue with.