Apple wants AI to run directly on its hardware instead of in the cloud

arstechnica.com

iPhone maker wants to catch up to its rivals when it comes to AI.

Remember, this probably isn't an either/or thing. Both Apple and Google have been offloading certain AI tasks to devices to speed up response times and process certain requests offline.

Yep, though Google is happy to constantly process your data in the cloud, while Apple consistently tries to find ways to do it locally, which is generally better for privacy and security, and cheaper for them too.

Yeah, that's why they look through your images for CSAM.

Who looks at images?

OK, I have to correct myself: they walked this back two weeks ago due to backlash, but I doubt they won't still do it in a similar or hidden way, like they did when quietly reducing performance on older devices to "save battery": https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

All phones throttle performance as their batteries age. Would you rather your phone be a little slower, or have it just cut out at 15%? The problem with Apple is that they weren't upfront about it.
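
The trade-off being described can be sketched roughly like this; the function name and thresholds below are hypothetical, purely to illustrate the idea of capping peak performance once an aged battery can no longer supply full-speed bursts without browning out:

```python
# Hypothetical sketch of battery-aware performance capping.
# Names and thresholds are illustrative, not any vendor's actual implementation.

def peak_cpu_frequency_mhz(battery_health_pct: float, charge_pct: float,
                           max_freq_mhz: int = 3000) -> int:
    """Cap peak CPU frequency when the battery likely cannot deliver
    enough current for full-speed bursts without a voltage sag that
    would trigger an unexpected shutdown."""
    if battery_health_pct >= 80 and charge_pct >= 30:
        return max_freq_mhz  # healthy battery with enough charge: no cap
    # Scale the cap down as health and remaining charge drop.
    scale = 0.5 + 0.5 * min(battery_health_pct, charge_pct) / 100
    return int(max_freq_mhz * scale)

if __name__ == "__main__":
    print(peak_cpu_frequency_mhz(battery_health_pct=70, charge_pct=15))  # capped
    print(peak_cpu_frequency_mhz(battery_health_pct=95, charge_pct=80))  # full speed
```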

"Google have been offloading certain AI tasks to devices"

No, Google has simply been blatantly lying about this to convince you to buy new phones. It's very easy to prove: as soon as you disable all network connections, these functions cease to work.

Just because certain requests don't work offline doesn't mean that Google isn't actually running models locally for many requests.

My Pixel isn't new enough to run Gemini Nano. What are some examples of offline processing not working?

I wouldn't be surprised if the handshake between Gemini Pro and Nano was intermingled for certain requests: some stuff done in the cloud and some done locally for speed. But if the internet is off, they kill the request entirely because half of the required platform isn't available.
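
If that's the design, the routing logic might look roughly like the sketch below; the model names and helpers are hypothetical, not Google's actual API, but it shows how a request that touches both halves gets refused entirely when the network is gone, even though the local half could still run:

```python
# Hypothetical sketch of hybrid cloud/on-device request routing.
# Model names and helpers are illustrative; this is not Google's real API.

from dataclasses import dataclass

@dataclass
class Request:
    text: str
    needs_cloud: bool      # e.g. long-context or server-only steps
    needs_on_device: bool  # e.g. fast local rewriting/summarizing steps

def handle(request: Request, online: bool) -> str:
    # If any part of the pipeline needs the cloud model and we're offline,
    # the whole request is refused, even the part a local model could serve.
    if request.needs_cloud and not online:
        return "error: feature unavailable offline"
    steps = []
    if request.needs_on_device:
        steps.append("local-model")  # placeholder for an on-device model call
    if request.needs_cloud:
        steps.append("cloud-model")  # placeholder for a server-side model call
    return " + ".join(steps) if steps else "no-op"

if __name__ == "__main__":
    mixed = Request("summarize and expand this note",
                    needs_cloud=True, needs_on_device=True)
    print(handle(mixed, online=True))   # local-model + cloud-model
    print(handle(mixed, online=False))  # error: feature unavailable offline
```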

"Just because certain requests don't work offline doesn't mean that Google isn't actually running models locally for many requests."

Yeah? It does.