d416

@d416@lemmy.world
0 Post – 9 Comments
Joined 9 months ago

messenger on mbasic used to work for me for years on my mobile browser but then they stopped that a few months ago for me (redirects to ‘get messenger’ splash). . Can anyone confirm mbasic messenger still works on mobile?

wait what how am I hearing about this Firefox docker for the first time. Got a kink to the dockerhub?

Hopefully this will work remotely on a smartphone because I’m looking for all ways to defeat FB messenger and access it through a desktop browser which they enforce. Thanks for sharing

7 more...

Ah you managed to hit the copilot guardrails. Copilot is sterile for sure, and a microsoft exec talks about it in this podcast http://twimlai.com/go/657

Try asking copilot to describe its constraints in a poem in abcb rhyme scheme which bypasses the guardrails somewhat. “No political subjects” is first on the list.

The limited context lengths for local LLMs will be a barrier to write 10k words in a single prompt. Approaches to this is to have the LLM have a conversation with itself or other LLMs. There are prompts out there that can simulate this, but you will need to intervene every few hundred words or so. Check out ‘AutoGen’ frameworks that can orchestrate this for you. CrewAI is one of the better ones. hope this helps

Without knowing anything about your specific setup I’d guess the issue is with docker not playing nice with your OS or vice versa. Can you execute the standard docker hello-world app? https://docker-handbook.farhan.dev/en/hello-world-in-docker/
If not then my money’s on this being an issue the OS. How did you install docker on mint, using sudo with a package install?
Fyi don’t feel bad - I installed docker on 3 different Linux distros last month and each had their quirks that I had to work my way through. Docker virtualization is some crafty kernel-level magic which can go wrong very fast if the environment is not just right.

2 more...

10-year vegan here , 20-year veg. My answer is no no no.

Other than the taste and what it represents, there is far better food to eat which is grown outside than animal flesh.. grown inside a lab no less.

2 more...

The easiest way to run local LLMs on older hardware is Llamafile https://github.com/Mozilla-Ocho/llamafile

For non-nvidia GPUs, webgpu is the way to go https://github.com/abi/secret-llama

Here is the consummate thread on whether to use microsoft copilot. some good tips in there… https://lemmy.world/post/14230502

I for one upvoted this post. I am tired of nanny state OSs restricting what we want to do thinking they are protecting the poor user who surely they must classify as noobs. We need a free open LIBERTARIAN OS now.