New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times

L4sBot@lemmy.worldmod to Technology@lemmy.world – 497 points –
carscoops.com

Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired.


In fact, by the time the crash happens, it’s alerted the driver to pay more attention no less than 150 times over the course of about 45 minutes. Nevertheless, the system didn’t recognize a lack of engagement to the point that it shut down Autopilot

I blame the driver, but if the above is true there was a problem with the Tesla as well. The Tesla is intended to disengage and disable autopilot for the remainder of the drive after a small number of ignored alerts. If the car didn’t do that, there’s a bug in the Tesla software.

I think it’s more likely the driver used a trick to make the car think he was engaged when he was not. You can do things like put a water bottle wedged in the steering wheel to make the car think you have tugged on the steering wheel to prove you are engaged. (Don’t ask me how I know)

The Tesla is intended to disengage and disable autopilot

What about: slow down, pull up to the right, stop the car, THEN disengage?

I think that's what it was supposed to do. I remember seeing a few videos about this.

IIRC it doesn't pull over to the side, but it does slow down gradually and safely until it comes to a full stop.

Then the autopilot disengages.

This is what it does already: https://youtu.be/oBIKikBmdN8

That wasn't what it did here.

Like all the poor delusional fanbois here, you are going on the wrong assumption that some warning has been ignored. Just watch the initial video again and listen better this time.

I’m not even replying to the article or the original commenter. I’m replying to the person that said “why doesn’t the car slow down and stop when the warnings are ignored?” which is precisely what it does.

I’m far from a Tesla fanboy, and there is no shortage of valid criticisms against Tesla. However, misrepresenting what autopilot does in the event of a forced disengagement isn’t right either.

After 3 alerts, it's off until you park. There are visual cues that precede the alert, though, and these do not count. I don't recall how many there are or for how long, but you start by seeing a message asking you to put your hands on the wheel, then a blue line at the top, then the line starts pulsing, then you get an audio alert, which is the first strike. Three strikes during the same drive and you need to park before using Autopilot again.
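The escalation sequence described above can be sketched as a small state machine. The stages and the three-strike rule come from the comment; the time thresholds and names are invented for illustration and are NOT Tesla's actual values.

```python
# Sketch of the warning escalation described above. Stage boundaries
# are illustrative assumptions, not Tesla's real timings.

class AttentionMonitor:
    MAX_STRIKES = 3  # three audio-alert strikes per drive, per the comment

    def __init__(self) -> None:
        self.strikes = 0
        self.locked_out = False  # stays True until the car is parked

    def stage(self, seconds_hands_off: float) -> str:
        """Map a hands-off interval to the warning stage.
        The visual cues do not count as strikes."""
        if seconds_hands_off < 10:
            return "ok"
        if seconds_hands_off < 20:
            return "visual message"
        if seconds_hands_off < 30:
            return "pulsing blue line"
        return "audio alert"

    def register_audio_alert(self) -> None:
        """Each audio alert is one strike; three strikes lock
        Autopilot out for the rest of the drive."""
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            self.locked_out = True
```

Note that a driver who nudges the wheel just before each audio alert resets the escalation without ever accruing a strike, which would be consistent with 150 warnings and no lockout.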

And those alerts don't come if you've overridden the system by putting a weight on the wheel or something.

Like an orange?

Balancing an orange on the steering wheel?

Pressing it between the spokes of the steering wheel, jamming it in place.

These days it'll detect that and shut down anyway.

I've had my hand misdetected as a 'defeat device' once.

I have a lot of trouble understanding how the NTSB (or whoever's ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn't seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

It's just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies who are carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as "full self-driving." It's damaging the real industry.

That's similar to cruise control. Cruise control can be dangerous because someone could fall asleep (not having to manage your speed can bring on sleepiness) and the car wouldn't slow down.

In my opinion, with those options it's the driver's responsibility to know their own limits and understand that the tool is just a tool; you are responsible for making sure your driving is safe for others. Tesla Autopilot adds a ton of safety features that avoid a lot of collisions caused by lapses in attention, sleepiness, and other drivers' faults. But it's still just a tool, and the driver is responsible for their own car and driving.

The difference is that cruise control will maintain your speed, but 'autopilot' may avoid or slow down for obstacles. Maybe it avoids obstacles 90% of the time or 99% of the time. It apparently avoids obstacles enough that people can get lulled into a false sense of security, but once in a while it slams into the back of a stationary vehicle at highway speed.

It's easy to say it's the driver's responsibility, and ultimately it is, of course, but in practice, a system that works almost all of the time but occasionally kills somebody is very dangerous indeed, and saying it's all the driver's fault isn't really realistic or fair.

A lot of modern cruise control systems will match the speed of the car in front of you and stop if they stop. They'll also keep the car in the current lane. And even without cruise control, most modern cars will stop if a pedestrian steps onto the road.

It's frustrating that Tesla's system can't detect a stationary police car in the middle of the road... but at the same time apparently that's quite a difficult thing to do and it's not unique to Tesla.

It's honestly not too much to ask a driver to step on the brakes if there's a cop car stopped on the road.

It's actually not that hard to do, but Tesla is not willing to spend the necessary time and resources to solve the hard problems.

Maybe it avoids obstacles 90% of the time or 99% of the time.

99 is not enough!

99 means many many more dead people.

You need to go for 99.99%

Actually it's absolutely realistic and fair. I don't like Musk, or Tesla for that matter. But they make it pretty damn clear that you're 100% responsible for the vehicle when using that feature. Anyone who assumes they don't need to pay attention is a moron and should be held responsible. If a 747 autopilot system starts telling the pilot to take control of the plane and they don't... we wouldn't blame the manufacturer, we'd blame the shitty pilot that didn't do their job.

I can’t wait to get smacked by a Tesla beta tester and have everyone debate whether the car or the driver is responsible for my innards being spread across 4 lanes. Progress!

If the driver gets lulled into a false sense of security by a convenience system like this and the automation fails, it's one thing to blame the driver, and that may or may not be fair depending on how much trust you place in the average driver's competence, but the (hypothetical) victim is still dead, and who we decide to blame won't make one iota of difference to that.

The problem with Tesla is that their entire marketing is based on "our cars drive themselves".

It's not "not-quite-self-driving" though, it's literal garbage. It's cruise control, lane assist and brake assist. The robot vision in use is horrible.

There are Tesla engineers bad mouthing the system openly.

Musk is a scammer and they need to issue an apology for all of the claims around autopilot, probably pay a great deal of money, and then change the name and advertising around it.

Oh, and also this guy should never drive again.

This is stupid. Teslas can park themselves, they're not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

That being said, the driver knew this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver's fault and they should be held responsible for their actions. It's not the court's job to legislate.

It's actually the NTSB's job to regulate car safety so if they don't already have it congress needs to grant them the authority to regulate what AI behavior is acceptable/define safeguards against misbehaving AI.

There's no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I've experienced it first hand.

Evidently, he was aware enough to respond to the alerts, per the logs (as stated in the WSJ video that's in the article). It shows a good bit of the footage, too.

Seems like they need something better for awareness checking than just gripping the wheel and checking where your eyes are pointed. And obviously better sensors for object recognition.

The headline doesn't state that the warnings were consecutive.

Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

I'll grant you, though, 150 warnings is still a little tough to believe...

The driver is responsible for this accident, but Tesla should still be liable imo for all the shady and outright misleading advertising around their so-called "self driving". Compare Tesla's marketing to, say, GM's or Hyundai's, both of which essentially have parity with Tesla's system in terms of actual features, and you'll see a big difference.

It should be pulling over and putting the flashers on if a driver is unresponsive.

Yes. Actually, just stopping in the middle of the road with hazard lights would be sufficient.

Sounds like the injured officers are suing. It's a civil case not criminal, so I'm not sure how much the court would actually be asked to legislate. I'd be interested to hear their arguments, though I'm sure part of their reasoning for suing Tesla over the driver is they have more money.


150 more warnings than a regular car would give, ultimately it's the driver's fault.

Tesla actively markets their cars as ''the car drives itself''.

Do we have any evidence from the driver stating that he didn't realize he was using a glorified cruise control similar to autopilot on an airplane?

Source or stfu

Where I live you can right now go to Tesla's website and buy a car with "Full Self-Driving Capability" with a small print that includes the disclaimer that it doesn't make the vehicle autonomous, for whatever that's worth...

FSD is a paid feature that I assume was not being used during the accident; Autopilot was being used.

Ah I see, now that you've been proven wrong you're pretending you asked a different question.

You admit that Tesla advertises a "Full Self-Driving Capability" feature, which is basically what the person you said "source or stfu" to was claiming.

Whether or not the feature was used in this instance is not what we're discussing here.

We can have this discussion if you feel you're up for it in good faith. I think both things are true: people are overall terrible at the activity of driving, so more driver aids are overall better; but current driver aids are very limited, and drivers are not necessarily great at understanding and working within those limits.

They're not the only ones, but Tesla is really the worst offender at overstating their cars' capabilities and setting people up for failure - like in this case.

Yes you're right

What was used in this accident had nothing to do with my question, and yes, it looks like Tesla's advertising is misleading.

There's this sweet new website called Google. Try using that before you throat Musk.

Oh I tried, since I am so bad at googling please provide the source

Yes, even in self-driving cars the driver is expected to pay attention in case they need to take control in unexpected events.

If the driver was unresponsive in a normal car, it would stop.

The driver was responding. If he didn't respond the car would have stopped.

If this was a normal car he probably would have just crashed earlier.

TIL cruise control doesn't exist

So if the guy behind the wheel died and couldn't react to the alerts, the car can't make the decision to just stop instead of crashing into a police car?

He was reacting to the alerts, complying with them by simply touching the steering wheel. He did that 150 times during that 45-minute trip (not all of the trip was on Autopilot).

So if the guy died, the car would have disengaged Autopilot (I'm not sure exactly how this works).

You can check the video in the article. It's quite informative.

Edit

I saw another video and it takes ~60 seconds after taking off your hand from the steering wheel for the car to safely come to a full stop.

So the headline should be "drunk driver hits police car."

Was he drunk? The article seems to use the fact that the car nagged him 150 times as evidence that he was impaired.

TBH if you're not used to it, the steering wheel check can warn frequently. It's checking for a small amount of torque on the wheel rather than you actually holding it (there are no pressure sensors), and that catches people out, but the prompt says to put your hands on the wheel. I could believe 150 times on a long journey.


Driver is definitely the one ultimately at fault here, but how is it that Tesla doesn’t perform an emergency stop in this situation - but just barrels into an obstacle?

Even my relatively ‘dumb’ car with adaptive cruise control handles this type of situation better than Tesla?!

Your relatively 'dumb' car probably doesn't try to gauge distance exclusively by interpreting visual data from cameras.

Wait, the Model X doesn’t have RADAR/LIDAR to supplement the cameras?

Nope. For whatever reason, Musk decided to just use cameras

"Whatever reason" is obviously just trying to cut corners and improve the bottom line with no regard for the consequences.

A few years ago, they were experimenting with LIDAR (most other car makers had it already then).

Then they abandoned it, even though everyone in the world thought they needed it badly.

Now we see one of the results.

No, they were too expensive for Musk

Holy shit.... This is worse than I thought

Wanna know what's even worse? My M3 is equipped with radar, but the functionality has been patched away because they don't want to develop for it since all their new cars only have cameras... So even though I have what's in practice a way better system equipped, my lane assist (won't call it autopilot) is still 100% dependent on the fucking cameras...


Even my relatively ‘dumb’ car [...] handles [...] better than Tesla?!

Not going to be the last time when you experience that :-)

I believe this is caused by the fog combined with flashing lights and upward/curved road. The Tesla autopilot system is super impressive in almost all situations but you can clearly see the limits in extreme situations. Here, the drunk driver is definitely at fault, I don't understand why they'd sue Tesla.

I don't understand why they'd sue Tesla.

Money. Tesla has much more money than the drunk.


Setting aside the driver issue, isn't this another case that could've been prevented with LIDAR?

You know what might work: program the car so that after the second unanswered alert, the autopilot pulls the car over, or reduces speed and turns on the hazards. On the third violation, autopilot is disabled for that car for a period of time.

I drive a Ford Maverick that is equipped with adaptive cruise control, and if I were to get 3 "keep your hands on the wheel" notifications, it deactivates adaptive cruise until the vehicle is completely turned off and on again. It blew my mind to learn that Tesla doesn't do something similar.

It does and did... He kept driving anyway. Drunk drivers FTW.

I presume AEB kicked in, but all that can do is reduce the speed of impact... if you're determined to kill yourself, there's not much the car can do.


This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

Ah, so it's just people defeating the system.

The premise with cars is that you don't distract the driver from driving; a system that takes over driving is exactly that kind of distraction, so the idea of the system is flawed to begin with.

I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

It’s frustrating to see

This is why we can’t have nice things

It is also frustrating to see people whining about the technology when they should rather think about the injured policemen and rescuers.

You should get your priorities straight if you ever hope to be taken seriously

The system will warn you to pay attention

... and if we have learned anything from that incident, it is that the warnings have been worthless.

The system can be tricked even by the worst drunkards! 150 times in a row.

for a few seconds before shutting down.

Few seconds are not enough. The crash was already unavoidable.

You’re misinterpreting what I said and conflating two separate scenarios in your 2nd statement. I didn’t say anything about the system warning “for a few seconds before shutting down” in the event of an imminent collision. It warns the driver before shutting down if the driver fails to hold the steering wheel during normal driving conditions.

The warnings were worthless because the driver kept responding to them just before they timed out and shut autopilot down. It would be even worse if the car immediately pulled off the road and stopped in traffic without warning the driver first.

They aren’t subtle either, after failing to touch the wheel for about 5-10 seconds it starts beeping loudly and flashing an icon on the screen.

This is not a case of autopilot causing an accident, this is a case of an impaired driver operating a vehicle when they should not have been. If the driver was using standard cruise control, would we be blaming the vehicle because their foot wasn’t touching the accelerator when the accident happened? No, we wouldn’t.

This is not a case of autopilot causing an accident, this is a case of an impaired driver

It is both, of course. The drunkard and the autopilot, both have added their share to create such danger, that ended deadly.

Driving drunk is already forbidden.

What Tesla has brought on the road here should be forbidden as well: lane assist combined with adaptive cruise control AND such a bunch of blind sensors.

The driver was on Autopilot. Autopilot is cruise control plus lane assist. It's not FSD. Tesla didn't bring that "to the road". The driver was drunk, and as with most Autopilot or FSD accidents... it's user error.

I'm still unaware of any proven FSD accident.

They didn't say he didn't respond to the alerts. If you don't respond, autopilot turns off.


Poor drunk impaired driver falling victim to autonomous driving... Hopefully that driver lost their license.

That doesn't solve the problem of the autopilot not making the right choices. What if the driver wasn't drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a drop in blood glucose? What if they had a stroke?

Furthermore, what if the driver got drunk BECAUSE the car's AI was advertised as being able to drive for you? Think of the false publicity.

If your AI can't handle one simple case of a driver being unresponsive, that's negligence on the company's part.

How could the company be negligent if someone gets drunk or has a heart attack and crashes their car? No company has a Level 5 autonomous vehicle where no human intervention is needed. Tesla is only Level 2. Mercedes has a Level 3 option (in extremely limited conditions). Waymo claims Level 4 but is geofenced.

Don't see how that's a Tesla problem... Drunk/high driver operating their car incorrectly.

By driving it

Tesla wasn't driving it, the drunk/high owner was.

It was on autopilot, so technically the drunk wasn’t driving it. But he is the one responsible.

Autopilot doesn't work that way, the drunk should have known that when he wasn't drunk and not tried to use it that way.

It's like the old shaggy dog story about the guy driving a camper, setting the cruise control, then going into the back to make lunch.

That's not the fault of the cruise control.

Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash.

Is the video slowed down? In the video, if you pause 2.5s before the crash, the stopped police car already seems very close. An (awake) human driver would've recognized the stopped police car from much farther away than that.

I find that it can be hard to tell when a car ahead is stopped; maybe the visual system on the Tesla has similar limitations. I think Autopilot is controlled by the cameras alone, but I'm not super up to date on Tesla stuff. I would assume even a basic radar setup could tell something was stationary from quite far away.

Officers injured at the scene are blaming and suing Tesla over the incident.

...

And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

I hope those officers got one of those "you don't pay if we don't win" lawyers. The responsibility ultimately resides with the driver and I'm not seeing them getting any money from Tesla.

Well, in the end it's up to whether Tesla's ADAS is compliant with laws and regulations. If there really were 150 warnings by the ADAS without it disengaging, this might be an indicator of faulty software and therefore Tesla being at least partially at fault. It goes without saying that the driver is mostly to blame but an ADAS shouldn't just keep on driving when it senses that the driver is incapacitated.

Also from the article:

Data from the Autopilot system shows that it recognized the stopped car 37 yards or 2.5 seconds before the crash. Autopilot also slows the car down before disengaging altogether.

This source keeps pushing tesla propaganda. There's always an angle trying to sell that it wasn't the tesla's fault

I'm not so sure disengaging autopilot because the driver's hands were not on the wheel while on a highway is the best option. Engage hazard lights, remain in lane (or if able, move to the slowest lane), and come to a stop. Surely that's the better way?

Just disengaging the autopilot seems like such a copout to me. Also the fact it disengaged right at the end "The driver was in control at the moment of the crash" just again feels like bad "self" driving. Especially when the so-called self-driving is able to come to a stop as part of its software in other situations.

Also if you cannot recognize an emergency vehicle (I wonder if this was a combination of the haze and the usually bright emergency lights saturating the image it was trying to analyse) it's again a sign you shouldn't be releasing this to the public. It's clearly just not ready.

Not taking any responsibility away from the human driver here. I just don't think the behaviour was good enough for software controlling a car used by the public.

Not to mention, of course, the reason for suing Tesla isn't because they think they're more liable. It's because they can actually get some money from them.

The video is very thorough and goes into the hazy video caused by the flashing lights being one of the issues.

That's not the main problem. It is more like an excuse. The main problem has been explained in the video right before that:

Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

The emergency vehicles just happen to be your most frequent kind of obstacles.

The fallback to the camera is a bad excuse anyway, because radar is needed first to detect any obstacles. The cam will usually be later (=at closer distance) than the radar.

The even better solution (trigger warning: nerdy stuff incoming) is to always mix the results of all kinds of sensors at an early stage in the processing software. That's what European car makers have done right from the beginning, but Tesla is way behind in their engineering. Their sensors still work independently, and each does its own processing. So every shortcoming of one sensor creates a faulty detection result that has to be covered later (read: seconds later, not milliseconds) by other kinds of sensors.

Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

Teslas don't use radar, just cameras. That's why Teslas crash at way higher rates than real self driving cars like Waymo.

Their radar is bad at recognizing immobile cars on the road. This means all objects. All obstacles on your road!

I feel like this is bad tech understanding in journalism (which is hardly new). There's no reason radar couldn't see stationary vehicles. In fact, very specifically, they're NOT stationary relative to the radar transceiver. Radar would see them no problem.

My actual suspicion here is that Tesla actively ignores stationary vehicles (it can know they're stationary by adding its known speed to the relative speed) not in front of the vehicle. Now, in normal streets this makes sense (or at least those on the non-driver's side). Do you pay attention to every car parked by the side of the road when driving? You're maybe looking for signs of movement, or lights on, etc. But you're not tracking them all, and neither will the autopilot. However, on a highway if you have more than 1 vehicle on the shoulder every now and then it should be making you wonder what else is ahead (and I'd argue a single car on the shoulder is a risk to keep watch on). A long line of them should definitely make you slow down.

I think human drivers would do this, and I think an autopilot should consider what kind of road it is on and whether it should treat these scenarios differently.
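If that suspicion is right, the filter would look something like the following. This is purely speculative, not Tesla's code; the function name and the threshold parameter are invented. The key fact is real: radar reports speed relative to the ego vehicle, so adding the ego speed recovers the object's absolute speed, and near-stationary returns could then be dropped unless they lie in the car's own path.

```python
def should_track(relative_speed_mps: float,
                 ego_speed_mps: float,
                 in_ego_path: bool,
                 stationary_threshold_mps: float = 0.5) -> bool:
    """Speculative radar-return filter: keep moving objects, and keep
    stationary ones only when they lie in the ego vehicle's path."""
    # Radar measures speed relative to us; add our own speed to get
    # the object's speed over the ground.
    absolute_speed = relative_speed_mps + ego_speed_mps
    if abs(absolute_speed) > stationary_threshold_mps:
        return True  # moving object: always track
    return in_ego_path  # stationary clutter unless it's in our lane
```

At 30 m/s, a stopped car ahead shows a closing speed of -30 m/s, so its absolute speed is zero; whether it gets tracked then rests entirely on the path check, which is exactly where a misclassified shoulder or in-lane vehicle could fall through.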

I also have another suspicion, but it's just a thought. If this Tesla was really using radar as well as cameras, haze or not, it should have seen that stationary vehicle further ahead than it did. Since newer Tesla cars don't have radar, and coming from a software development background, I can actually see a logical (in terms of corporate thinking) reason to remove the code for radar. They would do this simply because they will not want to maintain it if they have no plans to return to radar. Think of it like this. After a few versions of augmenting the camera detection logic, it is unlikely to work with the existing radar logic. Do they spend the time to make them work together for the older vehicles, or only allow camera based AI on newer software versions? I would suspect the latter would be the business decision.

The question here is, could you see there was a reason to stop the car significantly (more than 3 seconds) before the autopilot did? If we can recognize it through the haze the autopilot must too.

Moreover, it needs to now be extra good at spotting vehicles in bad lighting conditions because other sensors are removed on newer Teslas. It only has cameras to go on.

Hard to argue Tesla is at fault when the driver was clearly impaired and at fault here.

It's also so misleading that Tesla uses the word Autopilot for what is basically adaptive cruise control and lane assist.

I still think Tesla did a poor job of conveying the limitations at the larger scale. They piggybacked on Waymo's capability and practice without matching it, which is probably why so many are over-reliant. I've always been against mass-producing semi-autonomous vehicles for the general public. This is why.

And then this garbage is used to attack the general concept of autonomous vehicles, which may become a fantastic life-saver, because then it can safely drive these assholes around.

So self driving cars, are not so self driving... Huh, whodathunk it lol /s

Tesla's are not safe.

A Tesla on Autopilot/FSD is almost 4 times less likely to be involved in a crash than a human-driven Tesla, which even then is half as likely to end up in an accident compared to the average car. You not liking Musk fortunately doesn't change these facts.

In the 2nd quarter, we recorded one crash for every 4.41 million miles driven in which drivers were using Autopilot technology (Autosteer and active safety features). For drivers who were not using Autopilot technology (no Autosteer and active safety features), we recorded one crash for every 1.2 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 484,000 miles.

Source
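Taking the quoted figures at face value (they are Tesla's own self-reported numbers and don't control for road type or driver demographics), the ratios work out like this:

```python
# Crash-rate ratios implied by Tesla's self-reported Q2 figures.
miles_per_crash_autopilot = 4_410_000     # with Autopilot engaged
miles_per_crash_no_autopilot = 1_200_000  # Tesla, no Autopilot
miles_per_crash_us_average = 484_000      # NHTSA overall US average

ap_vs_no_ap = miles_per_crash_autopilot / miles_per_crash_no_autopilot
no_ap_vs_avg = miles_per_crash_no_autopilot / miles_per_crash_us_average

print(f"Autopilot vs non-Autopilot Tesla, miles per crash: {ap_vs_no_ap:.1f}x")
print(f"Non-Autopilot Tesla vs US average: {no_ap_vs_avg:.1f}x")
```

That yields roughly 3.7x and 2.5x, matching the "almost 4 times" and "half as likely" claims, but note that Autopilot miles skew toward highways, where crashes per mile are lower for every car.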

There is a bias in these numbers. Teslas are expensive and not everyone is buying them. The lower accident rate can be explained by the different demographic driving the vehicle rather than by Teslas being better. For example, younger people might be more likely to cause accidents because of various factors, and they are also less likely to buy a Tesla because Teslas are so expensive. I don't have the numbers for this, but we should all be careful with Tesla's claims on safety when they compare themselves to the global average.

The biggest bias is that the data comes from Tesla. Do you think they are going to release something that makes them look bad?

Sure. There are always multiple factors in play. However I'd still be willing to bet that there's nothing in Teslas that makes them inherently unsafe compared to other cars.

So Tesla says. There is no independent verification of this data. It could all be bullshit.

Perhaps. I'm sure you'll provide me with the independent data you're basing that "Teslas are not safe" claim on

So you take Tesla's word and believe it, but ask for proof for the contrary?

You're just a hypocrite.

You made the first comment: "Teslas aren't safe", without providing proof.

And now you're calling someone a hypocrite because he asks for data of exactly what you claimed, while you're redefining your first argument as "the contrary".

So, do you have proof that Tesla's aren't safe in comparison to other cars, or is it just your opinion?

We're literally having this discussion under a video where automatic braking should have kicked in, but didn't.

But you can't base a fact on one accident. Or even multiple. What if newspapers like to write especially about Tesla accidents to generate clicks?

Teslas seemingly have a lot of accidents, but without checking the statistics and comparing it to other manufacturers you wouldn't really know if the perceived truth is a fact or not.

You're right, but the automobile industry is too opaque to do this kind of study. Maybe the insurance companies have this data, but they are not sharing it.

In the absence of such data we have to make do with what we have. What we are seeing are a lot of examples where Teslas are clearly doing irrational things and failing miserably. YouTube is not full of Mercedes cars failing miserably, even though there are a lot more Mercedes cars on the road.

The data is widely available and has been provided to you several times in this thread. You just refuse to accept it because it goes against your prior beliefs.

A person clearly can't be reasoned out of a belief they weren't reasoned into.

There is no independently available data on which car has the most crashes per km driven.

Stop spreading misinformation.

Tesla model Y scored the highest possible score on IIHS crash test as well as 5 stars on Euro NCAP

Their other models have similar results. I believe Model X is the safest SUV ever made.

EDIT:

"More than just resulting in a 5-star rating, the data from NHTSA's testing shows that Model X has the lowest probability of injury of any SUV it has ever tested," Tesla said in a statement. "In fact, of all the cars NHTSA has ever tested, Model X's overall probability of injury was second only to Model S."

Source

Also might want to check this

EDIT2: Imagine downvoting the guy providing hard evidence and upvoting the fanatic making baseless claims backed by nothing

It's not hard to game benchmarks.

Or maybe you're so blinded by the hatred towards Musk that you can't even think straight and no evidence in the world could convince you otherwise?

You really should've checked the last link.


almost 4 times less likely to be involved in a crash than a human driven

Not relevant at all here, when we are discussing occurences that seem so easily and obviously avoidable.

(But it's nice to see that the Fanboi team is awake now)

Tesla fails at basic safety in the most obvious and simple accidents (like this one or the car pileup at San Francisco tunnel).

We're talking about overall safety here. Even with 99.99% safety rate you're still getting 33000 accidents a year in the US alone. There's always going to be individual incidents to talk about


I think Mercedes is the only car company that will accept blame for a self-driving or self-parking failure. That should tell you something.

So few people will pay the value add-on that they may not need to.

It's what you get when you design places that require cars for everything

I hope the cops win. Autopilot allows for a driver to completely disengage their attention from the car in a way that's not possible with just cruise control.

There's no way you can drop a human in a life threatening critical situation with 2.5 seconds to make a decision and expect them to make reasonable decisions. Even stone cold sober, that's a lot to ask of a person when the car makes a critical mistake like this.

On cruise, the driver would still have to be aware that they were driving. With Autopilot, the driver had likely passed out and the car carried on its merry way.

Because people can't pass out with just cruise control? He didn't have 2.5 seconds. According to the article he had 45 minutes of multiple warnings.

No, autopilot does not allow for a driver to completely disengage their attention. That's what they want to get to but that's not where we are. You still have to drive the car, you still have to be in control.

This is the driver's fault.

I hope Tesla wins this one. We need to set a precedent here: you can't be out there driving high, drunk, or with any other type of impairment and just assume your car's technology will cover for you.

Granted I agree that the car should slow down, put hazards on and pull you off if you don't respond by a certain timeframe. There are a few brands that do that and it should be standard.

It’s a douche bag trifecta

Tesla owner, driving drunk
Cops, being cops
Tesla, overselling their shitty car

I just hope that innocent bystander gets something from all three of them