I understood it like this: Apple provides a pre-trained LLM, which is then trained further on device with user data, resulting in new weights and configuration for each person's personal AppleLLM.
To me that seems more reasonable, because the data is far less random: it's strictly orchestrated by the limitations Apple defines through the API that apps must use to integrate with the user's personal AppleLLM.
And I still agree that the weights and configuration of the AppleLLM are as critical as 100 GB of screenshots of your Windows machine, but definitely harder to make sense of if extracted.
I just don't think that's plausible at all. I mean, they can "train" it further by doing stuff like storing certain things somewhere, and I imagine there's a fair amount of "dumb" algorithmic and programming work going on under the whole thing...
...but I don't think there's any model training happening on device. Training takes orders of magnitude more processing power than just running this stuff; your phone would be constantly draining for months. It's just not how these things work.
Ahh, lol, sorry for taking so long to understand 😅 I guess many misunderstood Apple like I did (or maybe not). At least I think I get it now.
So the only difference between Copilot and Apple is that Apple's AI gets its data through an API where app developers decide what is visible to the AI, versus access to everything you've seen on screen except DRM-protected content?
With Apple, as an attacker, you'd need to get access to that API to get all the data, while with Copilot you'd need access to the stored screenshots.
So the reason anybody prefers Apple's solution is that their LLM gets butter-clean data, perfectly structured by devs, whereas on Windows the LLM has to work with pretty much chaotic data?
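The contrast described in this exchange (a developer-curated exposure API vs. indiscriminate screen capture) can be sketched roughly like this. This is a generic illustration of the idea only, with made-up names; it does not reflect any real Apple or Microsoft API.

```python
# Hypothetical sketch: in the "Apple-style" model as described above, the app
# developer decides which structured records the system assistant may index.
# In the "Recall-style" model, everything rendered on screen gets captured.
# All function and field names here are illustrative, not real APIs.

def expose_to_assistant(records, allowed_kinds):
    """Return only the record kinds the developer explicitly opted in."""
    return [r for r in records if r["kind"] in allowed_kinds]

app_data = [
    {"kind": "note", "title": "Groceries"},
    {"kind": "password", "value": "hunter2"},  # never opted in below
]

# Only explicitly allowed kinds reach the assistant's index; the password
# record never leaves the app, even though it appeared on screen.
indexed = expose_to_assistant(app_data, {"note"})
print(indexed)  # [{'kind': 'note', 'title': 'Groceries'}]
```

The point the commenter is making is exactly this filter step: the assistant only ever sees pre-structured, opted-in data rather than raw pixels of whatever was visible.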
Where exactly is Apple's solution spyware? It's only a process that runs while you're interacting with it and processing data.
Or is it enough to be proprietary and have access to this data? Well then, Spotlight is spyware too.
It's spyware in that both applications build a centralized, searchable repository that knows exactly what you did, when, and how. And no, the supposed ability to exclude specific applications is not a difference: MS also said you can block specific apps, and devs can block specific screens within an app. They're presumably the same on that front.
What I'm saying is that the reason people are reacting differently comes down to branding and UX.