steph

@steph@lemmy.clueware.org
0 Posts – 19 Comments
Joined 1 year ago

On behalf of garbage, I loudly protest against this attempt to assimilate it to PowerShell.

Given this trend, GPT 5 or 6 will be trained on a majority of content generated by its previous versions, modeling them instead of the expected full range of language. Researchers have already tested the outcome of a model-in-the-loop setup with pictures, and it was not pretty.


Unplug your mouse. Seriously. Do it. It might sound like the "kicking and screaming" method, but you'll learn to rely on your keyboard even for GUI tools and you'll vastly improve how fast you navigate your computer. You should find yourself more and more in the terminal, obviously, but you may also learn some nice tricks with everything else.


We went from a computational tool serving a wide range of tasks to an entertainment widget barely more interactive than a TV, where work is an afterthought.

The former was expected not to get in the way of its users; the latter is designed to retain attention as long as possible to maximize consumption.

That's a lot of words for "I'm too lazy to master the most essential tool of my professional life and keep it updated to my requirements".

If you don't want to do it, feel free to pay for an Ubuntu support subscription, open a ticket, and get back to work. As you said: you should be working on your problem instead of whining. Or maybe you earn more by whining?

There's a saying that goes like this: "To a bad workman, there are always bad tools."

Others have answered the specific cases where TTM is paramount.

When time is less of an issue, in my experience it's a mix of the following, in no particular order:

  • product owners or similar roles wanting "everything, right now" for no reason whatsoever, except maybe some bonus;
  • bosses bossing people around to try and justify their existence instead of easing progress;
  • developers who are not much more than code jockeys with a tendency to develop by StackOverflow copy/paste;
  • operations lacking the time, resources, or knowledge to build a proper CI/CD pipeline - when it's not an issue of operations by ServerFault copy/paste;
  • experts (DBAs, virtualization, middleware) being kept out of the project and only asked for advice when things go terribly wrong later.

All in all, it's less about short-term profit than about a lack of not-so-long-term vision and engagement from everyone involved. They just don't care.

Yeah, I'm the one in charge of fixing the mess. Why do you ask?


Given the state of this world, there are better things to do than add such gimmicks to EVs. Enough energy and matter is already wasted on gadgets in vehicles; at least leave the new generation free of such stupidity. I could get behind a new kind of recycled ICE vehicle, running on captured-carbon fuel and sold at a premium to those who need the rumble of a well-tuned engine, but that should stay a fringe hobby.

The time for compromising on the length of the fuse is over; we, as a whole, should be focusing on preventing the climate bomb from doing too much damage to humans.

Or maybe we should double down: extract and burn even more fuel, produce and discard even more plastic, without forgetting to have it circle the Earth five times before it lands in the customers' hands. It wouldn't be the first mass extinction, and the planet will get through. Us humans, though...

Yet another example of (for some, not-so-)old control-freak farts who just don't understand the world they live in. The proposed law is entitled "Regulation of digital space". As if a single country could regulate an international network.

Sometimes I'm really ashamed of our politicians.

You can add support-contract requirements for some pieces of software coming from vendors with so little confidence in their product that they'd rather have it run in an outdated dependency environment. A side effect of the logic you talked about, applied to software vendors.

And *nix shells are a perfect example of the KISS principle.

EOL of version 7 is in June next year; you've got a nice pile of work here!

Subjective take: there's worse than FreeCAD - sure, it's a bit "old school" but it's bearable. On the other hand, the solver has crashed on me so many times... The workbench way of doing things requires some time to get used to, sure, but a crashing solver is far worse.

Your question is a bit vague, but it looks to me like what you want is some sort of expert system or inference engine.

There might be some open source solutions, and there's always the GNU Prolog language that might suit your needs.

I suspect that you won't get a graphviz structure out of it though.

Modular's Mojo might interest you - it just popped up in my news feed, entirely by coincidence.

You mean Microsoft will recoup the cost of unbundling by charging more per product compared to the previous bundle, given that they're now different products?

'cause at work the powers that be have gone all-in on MS, and this decision won't change their "strategy" one bit.

A process owned by any user will be able to exploit a userspace vulnerability, whoever that user is. SELinux, chroot, and cgroups/containerization add a layer of protection on top of this, but any vulnerability that bypasses them will be as exploitable from nobody as from any other local user. It will protect a user's files from some access attempts but will fail to prevent any serious attack. And as usual when it comes to security, a false sense of security is worse than no security at all.

Remember that some exploits exist that can climb out of a full-blown virtual machine to the virtualisation host; finding a user escalation vulnerability is even more likely.

The only real protection is an up-to-date system, sane user behavior and maybe a little bit of paranoia.

On a side note, w.r.t. keeping the dependencies up to date, have a look at renovatebot. It creates a merge request for each and every dependency update, thus triggering a build to check that everything is OK.
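
As a rough sketch - the schema URL and preset name are from memory and may vary between Renovate versions - the kind of minimal renovate.json I have in mind:

    {
      "$schema": "https://docs.renovatebot.com/renovate-schema.json",
      "extends": ["config:recommended"]
    }

With just that in the repository, the bot opens one merge request per outdated dependency and your existing pipeline does the rest.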

Each and every line of code you write is a liability. Even more so when you wrote it for someone else. You must always be able to rebuild it from source, at least as long as your client expects the software to work. If you feel it's not worth it, you probably low-balled the contract. If you don't want to maintain code, have the client pay a yearly maintenance fee, hand the code and the responsibility to maintain it over to your client at the end of development, or add a time limit to its support.

There's no such thing as "maintenance mode" software: either it's in use and must be kept up to date with regard to its execution environment, or it's not used anymore and can be erased and forgotten. Doing otherwise opens up too many security issues, which shouldn't be acceptable to us as a trade.


Simple: because it goes against the KISS principle. The GNU tools that constitute the user interface to the system come from a philosophy that started with Unix: simple tools, doing one thing well, communicating through "pipes" - i.e. the output of one tool is supposed to be used as the input of the next one.

This philosophy lets you assemble complex logic and workflows with a few commands, automating a lot of mundane tasks, but also working at a large scale the same way you would work on a few files or tasks.
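
For instance, a one-liner of the kind this composability enables (the path and pattern are just made-up examples): find the ten largest *.log files under /var/log by chaining four small tools through pipes.

    # each tool does one thing; the pipes glue them together
    find /var/log -type f -name '*.log' -print0 \
      | xargs -0 du -h \
      | sort -rh \
      | head -n 10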

Graphical tools don't have such advantages:

  • UIs are rarely uniform in their presentation or logic, as there are so many ways to present options and choices;
  • Apple did something nice in the way of automation with AppleScript, but I've not encountered it anywhere else. GUIs are rarely automatable, which means you'll need some clicking and button-pushing if a task has to be repeated - or the GUI has to be altered to be able to replay a set of commands on multiple items;
  • interconnecting different GUIs so that they can exchange data is just impossible. You usually end up with files in dedicated formats, and the need to massage data from one format to another to be able to chain tasks across different GUIs;
  • more importantly, the command line works with minimal bandwidth and tooling on the client side. tmux, Mosh, and similar tools let you work over an intermittent connection and have a very low impact on the managed system (see the sketch after this list);
  • in some specific fields - notably embedded and industrial systems - you just can't justify allocating resources to a graphical environment. On these systems, the CLI is as powerful as on a full-fledged server and doesn't require stealing precious resources from the main purpose of the system.
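
To make the low-bandwidth point concrete, here's the kind of combination I had in mind (the host and session names are placeholders):

    # Mosh survives roaming and packet loss; tmux keeps the session alive
    # server-side, so a dropped connection doesn't kill the work in progress.
    mosh admin@example.org -- tmux new-session -A -s ops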

Beware though: as time passes, Unix's founding principles seem to get forgotten, and some CLI tools show a lack of user-experience design, diverging from the usual philosophy and making the life of system administrators difficult. I've often observed this in tools coming from recent languages - Python, Go, Rust - where the "interface" of the tool is closer to the language it's written in than to the uniform CLI interface.