Formula E team fires its AI-generated female motorsports reporter, after backlash: “What a slap in the face for human women that you’d rather make one up than work with us.”

L4sBot@lemmy.world (mod) to Technology@lemmy.world – 380 points
caranddriver.com


you’d rather make one up than work with us

Yes? It's nothing personal, human women, but once "having a pleasant feminine voice" is something that machines can do more efficiently than humans, why shouldn't those machines be given the job?

You've got bigger problems than labour relations when "having a pleasant feminine voice" is the success criterion you use to measure the performance of a reporter.

I dunno, this sounds exactly like the fucked-up logic that went on in the conference room that dreamed up this shitty idea, only to have it face reality and get pulled on day one.

What else does a race car reporter have to do? There are only so many ways you can say the cars are going round in circles.

So it should be pretty easy for a human to do then, right? A lot easier than training an AI model to spontaneously describe what's happening on the race track at any given moment.

Cheaper too I bet.

Sure, you just have to hire a team of AI engineers whose job it is to train the AI on thousands of races and test it and test it and test it. Definitely cheaper than just hiring one human to be an announcer.

Not really. The real power of these LLMs is their ability to understand the written word, context, and emotion, then generate text based on it.

Bing AI uses search to get its sources and its training to summarise them. It doesn't need to be trained on the specific things it's generating from. It just needs to understand them.
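(For what it's worth, that "search for sources, then have the model summarise them" pattern is usually called retrieval-augmented generation. Here's a minimal sketch of the idea; `search()` and `generate()` are hypothetical stand-ins for a real search backend and a real LLM call, not Bing's actual internals.)

```python
# Minimal retrieval-augmented generation sketch.
# search() and generate() are hypothetical placeholders for a real
# search backend and a hosted LLM; this is not Bing's actual pipeline.

def search(query: str, limit: int = 3) -> list[str]:
    """Stand-in web search: return the text of the top results."""
    return [f"Stub result {i} for: {query}" for i in range(limit)]

def generate(prompt: str) -> str:
    """Stand-in LLM call: in practice this would hit a hosted model."""
    return f"[model output for a prompt of {len(prompt)} characters]"

def answer(question: str) -> str:
    # 1. Retrieve sources at query time -- no task-specific training needed.
    sources = search(question)
    # 2. Ask the model to summarise the retrieved text, not its own memory.
    prompt = (
        "Answer the question using only the sources below.\n\n"
        + "\n\n".join(sources)
        + f"\n\nQuestion: {question}"
    )
    return generate(prompt)

if __name__ == "__main__":
    print(answer("Who won the last Formula E race in Berlin?"))
```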

Anyone who used ChatGPT to get information rather than to generate text was using it wrong. This is a very common misconception.

Not on TV. Female AI is for house and assistant chores only, like Siri and Alexa. At least no one's ever complained about that before...

Worst part is, if they'd made it a guy, they'd get more flak. Damned if you do, damned if you don't.

It's clear they need to make it a golden retriever with subtitles and the project can keep going.

It's not reasonable to expect regular people to all have executive assistants. I'm not sure what point you're trying to make here. We're talking about a job that a real person could perform, working for a multi-billion dollar company, not an AI that can mark stuff in your calendar for you.

You do realize those voice assistants have male voices too? Just switch it over.

As for why they're female by default: IIRC, they did some studies on it, and it turned out people subconsciously trusted male-voiced assistants less, and men especially were likely to disregard their advice.

As an F1 fan who casually watches FE, I'd much rather have human commentators, thanks.

What about all the dudes that don't get a shot either way because they're not an attractive woman? Is it a slap in the face to them?

I feel like AI haters really struggle to grasp the concept of an actually competent AI that can do something better than a human can. The counter-arguments always seem to come from the assumption that this will never be the case, but that's changing the subject.

If there were an AI doctor with a proven track record of diagnosing illnesses better than any human doctor, I'd rather consult the AI. I'm fully aware how "unfair" that is to the human doctor, but I don't want to deal with a misdiagnosis just to show my support for human doctors by knowingly going with the inferior option.

The flip side is that the company that owns the doctor AI doesn't want you to use it, because a 95% diagnostic success rate means that 1 in every 20 cases is an opportunity to get sued.

Well, presumably they'd be using it to replace a doctor with an even worse success rate, so I'm not sure why they wouldn't want me to use it instead.

Legislation is always 2+ decades behind technology. Legal protections are in place for doctors making wrong decisions with the information they have on hand as long as it's to the best of their ability. The same protection doesn't extend to someone's brand new AI doctor.

Indeed, it's an absurd argument (a feeling, rather) that should have no place in such a discussion. But there was no discussion: it was feelings that made them cancel it, because they want zero potential for bad press, regardless of how right they would have been.

Imagine if translators argued the same about the various translation apps. Laughable.