California bill to have human drivers ride in autonomous trucks is vetoed by governor

Madison_rogue@kbin.social to Technology@lemmy.ml – 246 points
apnews.com

SACRAMENTO, Calif. (AP) — California Gov. Gavin Newsom has vetoed a bill to require human drivers on board self-driving trucks, a measure that union leaders and truck drivers said would save hundreds of thousands of jobs in the state.


That's fair, but I was more concerned about an accident where the "driver" has only seconds to react to a mistake the car is making. After sitting and doing nothing for hours, there's no way they'd be attentive enough to catch it before it's too late.

They would be more likely to stop the accident if they were there than if they weren't.

"More likely" is not likely. Autonomous vehicles shouldn't be allowed on the road at all.

At the current level of autonomous vehicle capability, I agree with you in a broad sense. Vehicles still need to be able to differentiate between shapes even in bad weather: blizzards, sudden downpours, heavy fog, dust storms, and the like. You still have to be able to see in order to pull off the road safely.

Until we can guarantee with 100% certainty that they can truly drive without aid, I completely agree that these vehicles would not be safe on their own. Weather is notoriously unpredictable, and so is life in general.

What happens if the sensors are unknowingly damaged? What happens if someone is wearing a costume that makes them look like a giant cereal box instead of human-shaped? What happens if there's a software glitch at a bad time? What protections are there to guarantee that it doesn't happen? Are those protections temporary? How often should they be reviewed?

It should be OK to acknowledge that we aren't quite there yet. Yes, it seems cool and all, but it's silly to risk lives over impatience. If it will happen, it will happen. Forcing it to happen sooner than it should could very well lead to it being banned altogether, especially if enough people die or get injured as a result.

IMO, anyone who causes serious crashes from using these things in "fully autonomous" mode should be charged as if the vehicle wasn't autonomous. As if the accident was caused by sleeping behind the wheel or texting while driving. The company should be charged similarly in that scenario, as their programming and marketing would also play a part in the crash.

Hey, if they're truly safe, none of these charges would actually happen. If there isn't an "oops" death in the first place, there won't be an "oops" death to investigate.

We could just not allow autonomous vehicles.

Ever? What kinda conservative bullshit lol

Not until they're safe. The tech isn't there yet.

There's no reason to bother with autonomous vehicles if we're just going to have human drivers anyway.

Testing?

I don't want companies to test drive on public roads, I did not sign up to be one of their test subjects.

There will be testing on public roads whether you like it or not, it's inherent to how any new thing works.

We have ways to test new technologies before unleashing them on the public, so what are you talking about? Even if testing on public roads is necessary, it could be limited to certain roads so that people who don't want to be part of the experiment don't have to risk their lives.

No matter how much testing they get beforehand, at some point they'll be on public roads. And when they first get access to public roads, that will be a test. That's just the only possible way for any new technology to come into being.

Okay, think about medication.

It gets tested before it receives regulatory approval. Once the testing is done, it goes to the broad public because it has been found safe when used as directed. At no point do they experiment on people without their consent.

Obviously data is still collected after that, but that's not the same as testing - are you conflating those two things? Because by your definition, testing never ends.


Anyone who uses FSD on their Tesla would happily tell you it’s not even close to being safe yet. Hell if anything I’m MORE attentive when using the autopilot because it can be so sketch sometimes.

Hell if anything I’m MORE attentive when using the autopilot because it can be so sketch sometimes.

I doubt you're more attentive than someone who is literally driving lol

I drive 150-200 miles/day. I’m definitely zoned out for most of it lol

And zoning out would be much worse with computer assistance!

Actually cars should be abolished for this very reason - humans can never be truly safe drivers, they always get bored and zone out.
