Google Gemini ad controversy: Where should we draw the line between AI and human involvement in content creation?

ModerateImprovement@sh.itjust.works to Technology@lemmy.world – 70 points
theconversation.com

I think LLMs are incredibly useful in limited circumstances. But they need to be on-demand, NOT omnipresent.

For example, I'm learning a bit of C#. A few weeks ago I was following a tutorial that had some code I didn't understand. I spent an hour googling and reading documentation but found nothing. So I asked ChatGPT what it meant, and it gave a clear, easy-to-understand explanation.
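To be clear, that's not the exact snippet (I don't remember it), but it was something in this vein: a LINQ chain with lambdas, which reads nothing like the loops the tutorial had covered and is awkward to search for:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        int[] scores = { 97, 62, 88, 45, 73 };

        // Where() filters with a lambda predicate; OrderByDescending() sorts what's left.
        var passing = scores.Where(s => s >= 60).OrderByDescending(s => s);

        Console.WriteLine(string.Join(", ", passing)); // prints: 97, 88, 73
    }
}
```

That's the kind of thing you can paste into an LLM and have unpacked piece by piece, on demand, which is exactly the use case I mean.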

Unfortunately, the industry is going about this all wrong. Google wants to force Gemini on my Pixel. So I nuked Android. Microsoft silently installed Copilot on my laptop without consent. That laptop is now happily running Linux.

Forcing these tools on users will just alienate people like me. In a way, I'm glad it happened. I'm much more satisfied now that these companies have a minimal presence in my life.

Google wants to force Gemini on my Pixel. So I nuked Android.

And what do you use now?

The line will be drawn wherever these companies stop making money.