I used an Android phone with ChatGPT built-in — and I loved it | Digital Trends

ijeff@lemdro.id (mod) to Android@lemdro.id – 18 points
digitaltrends.com

crossposted from !chatgpt@lemdro.id


The developer of Tasker has released so many videos on how to incorporate ChatGPT into an Android phone, even replacing Google Assistant without buying a new phone. Tasker is what, $5 and ChatGPT $20 a month?
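
For anyone curious what those Tasker setups actually do under the hood: as far as I understand, it's essentially an HTTP Request action posting to OpenAI's chat completions endpoint with your API key. A minimal sketch of the same call in plain Kotlin (the key comes from an environment variable you set yourself, and the request/response shapes follow OpenAI's public API docs):

```kotlin
// Minimal sketch of the HTTPS call a Tasker "HTTP Request" action would make
// to OpenAI's chat completions API. Assumes OPENAI_API_KEY is set by you.
import java.net.HttpURLConnection
import java.net.URL

fun main() {
    val apiKey = System.getenv("OPENAI_API_KEY") ?: error("set OPENAI_API_KEY first")
    val body = """
        {"model": "gpt-3.5-turbo",
         "messages": [{"role": "user", "content": "Summarise: meet at 6pm, bring the projector"}]}
    """.trimIndent()

    val conn = URL("https://api.openai.com/v1/chat/completions").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Authorization", "Bearer $apiKey")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.doOutput = true
    conn.outputStream.use { it.write(body.toByteArray()) }

    // The reply text sits at choices[0].message.content in the returned JSON.
    println(conn.inputStream.bufferedReader().readText())
}
```

In Tasker itself you'd do the same thing with an HTTP Request action and a couple of variables rather than code, then read the response out with something like the Say action.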

Also, if you use Firefox Beta, some extensions that use LLMs like ChatGPT already work. I use one to summarise text and articles.

Seems like more of an ad rather than an article.

I don't think you need to pay to access GPT through Tasker

It does cost money, as it uses the API. New sign-ups come with some free credit though.

Right, but I read something from the dev saying that if you had used a card before, you could use GPT-4 at no additional cost.

It just means you have the option of using GPT-4 via the API without waiting for access like before. You still need to pay for it, and it's significantly more expensive to use GPT-4 than GPT-3.5-Turbo.

The comment they made is about not having to subscribe to ChatGPT Plus and instead being able to use the API for less if you're not planning to reach $20 USD worth of API calls.
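
Rough numbers, using the per-token rates OpenAI published around mid-2023 (treat these as illustrative assumptions and check the current pricing page, since they change):

```kotlin
// Back-of-the-envelope: how many ~1K-token assistant queries $20 buys on each model.
// Rates below are the published mid-2023 per-1K-token output prices; treat them as
// illustrative placeholders, not current pricing.
fun main() {
    val budgetUsd = 20.0                 // the ChatGPT Plus price point
    val tokensPerQuery = 1_000.0         // assume roughly 1K tokens per request + reply
    val rates = mapOf(
        "gpt-3.5-turbo" to 0.002,        // USD per 1K tokens
        "gpt-4 (8K)" to 0.06             // USD per 1K tokens
    )
    for ((model, perK) in rates) {
        val costPerQuery = perK * tokensPerQuery / 1000
        println("$model: ~${(budgetUsd / costPerQuery).toInt()} queries per \$20")
    }
}
```

So unless you're hammering it all day (especially with GPT-4), pay-as-you-go can come out well under the $20 subscription.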

So you can use GPT without spending $20 a month, which is my point.

Yep, just not for free as it costs money per request (and can be quite expensive for GPT-4).

You might be rate limited with the free tier and won't have access to version 4, IIRC. But yeah, you can use it for free last I checked.


Can you get those extensions on non-beta Firefox? I only have some of them available.

Quote:

incorporate ChatGPT into an Android phone, even replacing Google Assistant without buying a new phone. Tasker is what, $5 and ChatGPT $20 a month?

Seems like more of an ad rather than an article.

I get the same feeling about your comment.

Thanks. Glad to help out. Like and subscribe, and don't forget to hit that bell button so you don't miss out on my new comments.

Peace!


Quote:

Still prone to hallucinations

Since Folax is inherently drawing responses from ChatGPT, it can often hallucinate and present incorrect answers — often very confidently. Once again, the only way to remedy this is to upgrade to newer models, such as GPT-4 (or equivalent), which have fewer hallucinations and more accurate responses.

Counterpoint: Moving to GPT-4 makes it harder to realise when the reply is complete bullshit.

This is why ChatGPT needs to provide sources and references, but since it scraped things indiscriminately, that'll lead them into legal trouble. There are services like perplexity[.]ai that use an internet search plugin for ChatGPT (I think) and list sources. Much better if you want to check the validity of the things it spits out.
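
The "search plus citations" approach those services use is, at its core, retrieval stuffed into the prompt: pull a few snippets from a search backend, number them, and instruct the model to cite them. A rough sketch of that prompt assembly (fetchSnippets here is a hypothetical stand-in for whatever search API you'd actually use):

```kotlin
// Sketch of a "search + cite" prompt, the pattern behind citation-style answers.
// fetchSnippets() is a hypothetical placeholder; a real version would call a search API.
data class Snippet(val url: String, val text: String)

fun fetchSnippets(query: String): List<Snippet> = listOf(
    Snippet("https://example.com/one", "placeholder snippet text"),
    Snippet("https://example.com/two", "more placeholder snippet text")
)

fun buildGroundedPrompt(question: String): String {
    val numbered = fetchSnippets(question)
        .mapIndexed { i, s -> "[${i + 1}] ${s.url}\n${s.text}" }
        .joinToString("\n\n")
    return "Answer using ONLY the numbered sources below. " +
        "Cite them inline as [1], [2], ... and say so if the answer isn't in the sources.\n\n" +
        "Sources:\n$numbered\n\n" +
        "Question: $question"
}

fun main() = println(buildGroundedPrompt("Who develops Tasker?"))
```

It doesn't stop hallucinations entirely, but it at least gives you links to check the answer against.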

Yeah, this seems like a really tough problem with LLMs. From memory, OpenAI have said they're hoping to see a big improvement next year, which is a pretty long time given the rapid pace of everything else in the AI space.

I really hope they or others can make some big strides here because it really limits the usefulness of these models.

Quote:

really limits the usefulness of these models.

The whole problem I have is that the models are rewarded/refined for believability and not for accuracy.

Once there is enough LLM-generated shit on the web, it will be used (most likely inadvertently) to train newer LLMs, and we will be in a garbage-in, garbage-out deluge of accurate-sounding bullshit, to the point that much of the web will become useless.

Yeah, 100% with you on that. I think the folks building these things are also aware of this issue, and maybe that's one of the reasons why ChatGPT's training set still ends in 2021. We'll have to wait and see what new solutions and techniques come along, but for now I think we're going to be stuck with this problem for a while.

Okay but can it set a timer, tell me the weather, and change my songs? Because that’s what I really use an assistant for.

Can we just have both? I'm the opposite, I never use the current assistants but would probably use ChatGPT in this manner.

I want to like assistants like this, but I can't see myself using one regularly until I can run it locally on my device.

Yep. My next phone is going to have at least 16GB of RAM so I can run a modestly capable LLM on it.
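
The 16GB figure follows from the rough weight-size arithmetic for quantized models; ballpark only, since you also need headroom for the KV cache and the OS itself:

```kotlin
// Ballpark RAM needed just to hold a quantized model's weights:
// params * bits-per-weight / 8 bytes, converted to GiB. Rough estimate only;
// real usage is higher once you add the KV cache and runtime overhead.
fun weightGiB(paramsBillions: Double, bitsPerWeight: Double): Double =
    paramsBillions * 1e9 * bitsPerWeight / 8 / (1024.0 * 1024 * 1024)

fun main() {
    val configs = listOf(7.0 to 4.0, 13.0 to 4.0, 7.0 to 8.0)   // (billions of params, bits per weight)
    for ((params, bits) in configs) {
        println("%.0fB params at %.0f-bit ~ %.1f GiB of weights".format(params, bits, weightGiB(params, bits)))
    }
}
```

In theory a 4-bit 7B model fits alongside Android in 16GB with room to spare, and 13B is tight but plausible.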

I hope LLMs encourage vendors to stop being so skimpy with storage and RAM. 8/128 has been the norm for like 4 years now. Why is it not advancing?!?

And it's even worse on the iPhone side. 6GB...

Mycroft can run on your own Linux device, including a Raspberry Pi if you have one, but the native Android version is still a work in progress. You could configure a remote connection to your own self-hosted instance of it, though.