If so-called AI is basically just Large Language Models, how come predictive text on my phone is bollock-useless?
![](https://lemmy.world/pictrs/image/0943eca5-c4c2-4d65-acc2-7e220598f99e.png)
Like if I type "I have two appl..." for example, it will often suggest "apple" (singular) instead of "apples". Just a small example, but it is really bad at predicting which form of a word should follow the previous one
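Worth noting that classic phone keyboards generally use small frequency/n-gram models rather than LLMs. A toy sketch (the corpus and counts here are made up purely for illustration) of why pure frequency-based completion suggests "apple" even right after "two":

```python
from collections import Counter

# Toy frequency-based word completion, roughly the pre-LLM keyboard approach:
# complete the partial word with the most common match in the vocabulary,
# ignoring grammatical agreement with earlier words like "two".
corpus = ("i ate an apple . an apple a day . "
          "i have two apples . apple pie .").split()

word_counts = Counter(corpus)

def complete(prefix):
    # Pick the most frequent vocabulary word starting with the prefix.
    candidates = [w for w in word_counts if w.startswith(prefix)]
    return max(candidates, key=word_counts.__getitem__, default=None)

# "apple" occurs 3 times in this toy corpus, "apples" only once,
# so the singular wins even though "two" demands the plural.
print(complete("appl"))  # → apple
```

An LLM, by contrast, conditions on the whole preceding context, which is exactly what this kind of lookup doesn't do.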
You can run an LLM on a phone (tried it myself once, with llama.cpp), but even with the simplest model I could find it managed maybe one word every few seconds while pegging the CPU at 100%. The quality was terrible, and your battery wouldn't last an hour.
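A rough back-of-envelope for why decoding crawls on a phone (all figures below are illustrative assumptions, not measurements): token generation is typically memory-bandwidth bound, so every generated token has to stream roughly all the model weights through RAM.

```python
# Back-of-envelope estimate of on-device LLM decode speed.
# All numbers are rough assumptions for illustration, not measurements.

model_params = 1e9           # a "small" 1-billion-parameter model
bytes_per_param = 2          # fp16 weights (4-bit quantization would be 0.5)
model_bytes = model_params * bytes_per_param   # ~2 GB of weights

phone_bandwidth = 5e9        # assume ~5 GB/s effective memory bandwidth

# Decoding is typically memory-bound: each token read touches
# (roughly) every weight once.
seconds_per_token = model_bytes / phone_bandwidth
print(f"~{seconds_per_token:.1f} s per token")
```

That lands at a handful of tokens per second at best before thermal throttling, which squares with the "one word every few seconds" experience, and also explains the battery drain: the memory bus and CPU are saturated the whole time.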
Does the AI processing have to be performed locally or constantly active?
No, but you open up a can of worms from a security standpoint if you send it out to be processed.
I'm sure every phone having a keylogger won't end badly
And latency.