hallettj

@hallettj@beehaw.org
10 Posts – 164 Comments
Joined 1 year ago

Programmer in California

I'm also on https://leminal.space/u/hallettj

I'm finding this mess interesting: the MAGAs vote and debate like a third party, which kinda gives us a House with no majority party - something we usually don't get to see in America. And we're getting the deadlocks that come from a chamber that isn't willing to form a coalition - or at least not a reliable one.

I just hope the next speaker candidate doesn't try for the same Republican-MAGA coalition. Although I'm prepared to be disappointed. Do you think there's any chance a Republican would offer to sideline the MAGAs to get support from Democrats?

Under this analysis the Democrats have a plurality. How does that tend to work out in governments with more than two parties?


That advice does not literally refer to interface, the programming language feature. It means: test the observable behavior of a component, not its internal implementation details.

In your example, write tests for both Rectangle and Triangle that call area, and assert that the result is correct. But do not test, for example, the order of mathematical operations used to calculate the result. The math is an implementation detail, not part of the "interface".
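
For instance, sketching your example in Python (these Rectangle and Triangle classes are hypothetical stand-ins, not from any particular library):

```python
import math

# Hypothetical stand-ins for the shapes in the example.
class Rectangle:
    def __init__(self, width, height):
        self.width = width
        self.height = height

    def area(self):
        return self.width * self.height

class Triangle:
    def __init__(self, base, height):
        self.base = base
        self.height = height

    def area(self):
        # The formula itself is an implementation detail.
        return 0.5 * self.base * self.height

# Tests target the interface - what area() returns - not the math inside.
def test_rectangle_area():
    assert Rectangle(3, 4).area() == 12

def test_triangle_area():
    assert math.isclose(Triangle(3, 4).area(), 6.0)

test_rectangle_area()
test_triangle_area()
```

If you later change Triangle to compute its area some other way (say, Heron's formula), these tests still pass unchanged - that's the point.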


Wow, this is one of the most complicated Snopes analyses I've seen. But it seems like the statement is accurate, with caveats. If the brightest component of Polaris is only about 50 million years old, then whatever was there before wasn't really Polaris. And in that case it doesn't matter whether sharks have been around for 450 million years or 195 million.

Wayland replaces the older X protocol. It doesn't have to operate with older protocols. You might be thinking of XWayland which is a proxy that receives X API calls from apps written for X, and translates those to the Wayland API so that those apps can run under Wayland implementations. Window managers can optionally run XWayland, and many do. But as more apps are updated to work natively with Wayland, XWayland becomes less important, and might fade away someday.

PipeWire replaces PulseAudio (the most popular sound server before PipeWire). Systems running PipeWire often run pipewire-pulse which does basically the same thing that XWayland does - it translates from the PulseAudio API to the PipeWire API. It's a technically optional, but realistically necessary compatibility layer that may become less relevant over time if apps tend to update to work with PipeWire natively.

So no, both Wayland and PipeWire are capable of operating independently of other protocols.

I remember finding this Practical Engineering video on Roman concrete to be informative: https://youtu.be/qL0BB2PRY7k?si=5exDGyEK_LTfGNOy

Veritasium also has a chapter on ancient concrete in this video: https://youtu.be/rWVAzS5duAs?si=EJ8rPDTPHlq90kgW

My memory is fuzzy, but I think some of the details are:

  • We know how to make Roman concrete, but it's not necessarily the best choice, and it might be more expensive than is appropriate for a given project.
  • Ancient structures don't have rebar, so they don't degrade due to rust causing expansion. But rebar is so useful that it's often a worthwhile trade-off.

Definitely see the other comments here about survivorship bias, and higher demands on modern structures.

The justification for invading Iraq was a claim that they were developing nuclear weapons. It was well known at the time that the evidence was flimsy, and that even if true it was a flimsy excuse for an invasion. The main piece of evidence was an intercepted shipment of aluminum tubes that were soon shown to have nothing to do with a nuclear program. (See https://en.m.wikipedia.org/wiki/Iraqi_aluminum_tubes). That one is not a conspiracy theory.

Apple controls what may be installed on iPhones with an iron fist. Did you know there is only one option for a web browser? Chrome, Firefox, and other apparent alternatives are actually re-skinned Safari. They don't want to allow real competition to their own browser. This is certainly not the only case where they use app store approval powers to block competition.

Plus Apple takes 15-30% of every transaction on iPhones. That includes payments in the app store, and also in-app purchases. Sure they have to fund the store, but given that Apple has an absolute monopoly over iPhone app distribution this seems predatory to me.

Apple is anticompetitive, and seems to have little regard for their responsibility as a platform provider to allow application diversity to flourish.

So Google has a similar app store approval process, and takes basically the same percentage from transactions. But they are much more generous in what they allow in their store in terms of competing apps. And most importantly, Google does not have a monopoly on Android app distribution. You don't need to do any jailbreaking to set up F-Droid, or to install apps from the web.

It's true that the vast majority of Android users use Google's app store. And I think that Google taking a cut of in-app purchases is also predatory. Apps should be able to not use Google Pay, and to not pay Google a cut. But the fact that there are other options puts a limit on how much Google can block competition, and gives some option for publishers to avoid that 15-30% cut.

Debian unstable is not really unstable, but it's also not as stable as Ubuntu. I'm told that when bugs appear they are fixed fast.

I ran Debian testing for years. That is a rolling release where package updates are a few weeks behind unstable. The delay gives unstable users time to hit bugs before they get into testing.

When I wanted certain packages to be really up-to-date I would pin those select packages to unstable or to experimental. But I never tried running full unstable myself so I didn't get the experience to know whether that would be less trouble overall.

I think the takeaway from that episode is that many carbon offsets are scams, not necessarily all. So don't take corporate claims that they offset their emissions at face value, and consider carefully before you buy offsets.

Take a look at my other comment about Wren and Wendover Productions. (This John Oliver episode happens to include an excerpt from the Wendover piece I mentioned.)

I believe your last Linux experience in 2015 predates DXVK which has been transformative for Linux gaming. Wine used to have to implement its own DirectX replacement which necessarily lagged behind Microsoft's implementation, and IIUC didn't get the same level of hardware acceleration due to missing out on DirectX acceleration built into graphics cards.

Now DXVK acts as a compatibility bridge between DirectX and Vulkan. Vulkan is cross-platform, does generally the same stuff that DirectX does, and graphics cards have hardware acceleration for Vulkan calls the same way they do for DirectX calls. So game performance on Linux typically meets or exceeds performance on Windows, and you can play games using the latest DirectX version without waiting for some poor dev to reimplement it.

If you are using Steam with Proton, Lutris, or really any Wine gaming these days you are using DXVK. It's easy to take for granted. But I remember the night-and-day difference it made.

Oh this is just the thing for playing bard, and casting "vicious mockery" several times per combat


Maybe a better case study would be figs since people actually eat those. From what I'm seeing in search results there is some difference of opinion, but maybe the prevailing opinion is that figs are fine for vegans because they are not intentionally exploitative or cruel to animals.


I've seen NixPak which I think would be just what you want, except that it's for Nix instead of Gentoo. But Nix has the same features that you say you like in Gentoo.


Yeah, I've had similar anxiety recently choosing a new place for my family to live. I think the thing to keep in mind is that if both choices seem like good options, you're likely to get some good outcomes either way. My wife put it like this,

What's nice is the way the human brain works, down the road we'll be thinking, "I'm glad we made this choice because then X happened."


I did some digging around in the manual, and I tested this option which seems to work:

security.pam.services.doas.fprintAuth = true;

On my machine that adds this line to /etc/pam.d/doas:

auth sufficient /nix/store/fq4vbhdk8dqywxirg3wb99zidfss7sbi-fprintd-1.94.2/lib/security/pam_fprintd.so # fprintd (order 11400)

Edit: Note that the NixOS option puts in the full path to pam_fprintd.so. That's necessary because NixOS doesn't put .so files in standard search paths.

Without doing more research I don't know how to add arbitrary options to pam files in case you run into something that isn't mapped to a NixOS option yet. The implementation for the pam options is here; there might be something in there that would work.


This points to an interesting feature that appears in English: phrasal verbs. This is where a verb is made up of a verb word used in combination with one or more prepositions or "particles". For example in the phrase "put cheese on the pizza" the verb word "put" combines with the preposition "on". (There is no particle in this example.) Even though the words "put" and "on" are not consecutive, and even though "on" has its own function as a preposition, "put on" together form a verb that is lexically distinct (has different meaning and rules) from "put" used with a different preposition or particle.

IIUC you even get a different meaning if you use the same words with a different function. With "on" as a preposition you get, "put cheese on the pizza". But with the particle form of "on" you get a different verb with a different meaning: "put on a coat".

The use you posted, "put cheese", looks like a transitive form of "put" which would be distinct from both of the phrasal verbs I described. My guess is that this is dialect-specific: maybe some English speakers perceive transitive "put" as valid, while others only use "put" as part of a phrasal verb.

Language is messy, and there is no authoritative set of rules for English so you'll find lots of cases where people disagree about correct grammar. One of the classics is whether "where" substitutes for a prepositional or a noun phrase. Lots of people feel it is correct to say, "Where is that at?" while others think that sounds wrong, like saying, "It's at by the corner." (I think this might be the basis for the made-up rule, "don't end sentences with a preposition".)

Try sudo apt update before running the install command. The ISO might not be preloaded with a full package index, or it might be out of date.

If that doesn't work take a look at /etc/apt/sources.list to see if maybe the ISO uses some minimal repo that doesn't have the full set of packages.

This is exactly why we have Reversed Polish Notation. When will people learn?
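
For anyone who hasn't seen it, a minimal sketch of why RPN sidesteps the whole argument: evaluation is a single left-to-right pass over a stack, so there are no precedence rules to disagree about - every expression has exactly one reading.

```python
# Minimal RPN evaluator sketch. The ambiguous infix "8 / 2 * (2 + 2)"
# becomes the unambiguous postfix "8 2 / 2 2 + *".
def eval_rpn(expression):
    stack = []
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    for token in expression.split():
        if token in ops:
            b = stack.pop()  # operands come off in reverse order
            a = stack.pop()
            stack.append(ops[token](a, b))
        else:
            stack.append(float(token))
    (result,) = stack  # a well-formed expression leaves exactly one value
    return result

print(eval_rpn("8 2 / 2 2 + *"))  # 16.0
```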


I think the best way to get an idea is to look at the feature lists for fancy shells like zsh or fish. But in short there are a number of things a good shell can do to help you execute commands faster and more easily: autocompletions, which make you faster and also make things more discoverable; fuzzy searching/matching; navigating command history; and syntax highlighting, which helps you spot errors and understand the syntax of the command you're writing.

"Atomic" is a catchy descriptor! Atomic distros for the Atomic Age! It could be an umbrella term since NixOS and Guix are atomic, but instead of images and partitions they use symlinks, and patch binaries to use full paths for libraries and programs that they reference. So there are image-based distros, and I guess expression-derived distros which are both atomic.

I haven't tried image-based distros. This post fills in some gaps for me. Thanks for the write-up!


Yeah, that makes a lot of sense. If the thinking is that AI learning from others' works is analogous to humans learning from others' works then the logical conclusion is that AI is an independent creative, non-human entity. And there is precedent that works created by non-humans cannot be copyrighted. (I'm guessing this is what you are thinking, I just wanted to think it out for myself.)

I've been thinking about this issue as two opposing viewpoints:

The logic-in-a-vacuum viewpoint says that AI learning from others' works is analogous to humans learning from others' works. If one is not restricted by copyright, neither should the other be.

The pragmatic viewpoint says that AI imperils human creators, and it's beneficial to society to put restrictions on its use.

I think historically that kind of pragmatic viewpoint has been steamrolled by the utility of a new technology. But maybe if AI work is not copyrightable that could help somewhat to mitigate screwing people over.


The packages are defined in a Github repo, https://github.com/NixOS/nixpkgs. That contains the sources for all of the Nix expressions. Usually when you install packages you get pre-built binaries that are produced from the expressions in the repo through an automated system.

There is a group of "committers" who have the authority to merge PRs (pull requests) to the nixpkgs repo. There is a tracking issue for nominating new committers. That issue also describes criteria that new committers should meet. I found a comment claiming that there are 139 committers - but that comment is a few years old.

Packages are maintained by a larger group of authors who submit new packages or updates via PRs. Committers review these PRs before they can be merged. A key criterion for becoming a committer is to author a sizable number of PRs that go on to be approved through this process.

I didn't see descriptions of any measures that would prevent committers from making whatever changes to nixpkgs they choose to. Also, for most packages the hashes pin the build inputs, not the built binaries, so a hash match is not by itself proof that a binary faithfully reflects its source - you're trusting the builders and the binary cache's signing keys. So your trust in nixpkgs is based on,

  • vetting of committers
  • committers being sufficiently diligent in PR reviews
  • security of the build system
  • enough eyes on the project to catch a problem quickly if some malicious change does get through

As a system it looks good enough to me. People have to demonstrate a commitment to the project, and an ability to do the work to get the keys to the system. Personal reputations are at stake which I think is a solid motivator to act in good faith. I think if a malicious change did get in it would probably be caught quickly.
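
As a footnote on what the hashes do cover: fetched sources are verified against a pinned SHA-256 digest, roughly along these lines (the artifact bytes and names here are made up for illustration):

```python
import hashlib

def verify_artifact(data: bytes, pinned_sha256: str) -> bool:
    """Return True if the artifact's SHA-256 digest matches the pinned hash."""
    return hashlib.sha256(data).hexdigest() == pinned_sha256

# Hypothetical fetched source and the digest recorded in the expression:
artifact = b"example tarball contents"
pinned = hashlib.sha256(artifact).hexdigest()

assert verify_artifact(artifact, pinned)
assert not verify_artifact(artifact + b"tampered", pinned)
```

A mismatch aborts the build, so that check covers fetched sources; trust in pre-built binaries rests on the build system and cache signing rather than on the hash alone.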


Oh no - I didn't realize my preference for the Oxford comma might lead to trouble! I am a fan. When that Vampire Weekend song comes on I always whisper, "me…"

The link lists 78 CVEs of varying severity levels opened over a period of 11 years. Many of them are patched. (I don't know how to easily check how many are patched. The NIST listings provide issue tracker links and severity levels, and the handful of CVEs I looked at had fixes released.) I'm not convinced this is evidence that systemd is unacceptably insecure.

I get that it's frustrating that systemd has such a broad scope, and that it's not portable. But these are trade-offs. In exchange we get power that we wouldn't get otherwise. For example tying device management and scheduled tasks into systemd lets us use the same declarative dependency management in those domains as in the init system. The system is able to bring up services only when needed, boot faster, use fewer resources. The non-portability allows use of, for example, Linux cgroups to cleanly shut down forked processes. Even if we were using an alternative like Upstart I'm gonna guess we would end up relying on cgroups.

Red Hat's role is certainly something to keep an eye on. But systemd is open source, and it can be forked if necessary.

Radium produces the most radiation by miles. The plutonium gives off some alpha radiation that won't hurt you if you don't eat it. (Eye protection would be a good idea I suppose.) I don't remember what U-235 emits but I don't think it's a huge amount.


git rebase --onto is great for stacked branches when you are merging each branch using squash & merge or rebase & merge.

By "stacked branches" I mean creating a branch off of another branch, as opposed to starting all branches from main.

For example imagine you create branch A with multiple commits, and submit a pull request for it. While you are waiting for reviews and CI checks you get onto the next piece of work - but the next task builds on changes from branch A so you create branch B off of A. Eventually branch A is merged to main via squash and merge. Now main has the changes from A, but from git's perspective main has diverged from B. The squash & merge created a new commit so git doesn't see it as the same history as the original commits from A that you still have in B's branch history. You want to bring B up to date with main so you can open a PR for B.

The simplest option is to git merge main into B. But you might end up resolving merge conflicts that you don't have to. (Edit: This happens if B changes some of the same lines that were previously changed in A.)

Since the files in main are now in the same state as they were at the start of B's history, you can replay only the commits from B onto main and get a conflict-free rebase (assuming there are no conflicting changes in main from some other merge). Like this:

$ git rebase --onto main A B

The range A B specifies which commits to replay: not everything after the common ancestor between B and main, only the commits in B that come after A.


It looks like it's made by the same team that made Journey


I was curious if this was real, and it is! http://news.bbc.co.uk/2/hi/europe/7966641.stm

I haven't really gotten into Twitter or Mastodon because those systems are organized around relationships with people. Apart from close friends and family that doesn't match how my mind is organized. OTOH the Reddit / forum model is organized around interests and ideas. That clicks for me. I engage more with ideas than with personalities. Because of that I've been a regular Reddit user for many years while rarely checking in on Twitter. Since Lemmy follows a similar model I think it's likely to engage my interest the same way.


Lots of articles on fusion deserve to be summarized because they have paragraph after paragraph explaining what fusion is, that it produces electricity, why electricity is useful, and the problem of climate change.

But this article gets straight to the point, and manages to limit the-protons-and-the-electrons talk to a single paragraph further down.

And there is also Nushell and similar projects. Nushell has a concept with the same purpose as jc: you can install Nushell frontend functions for familiar commands that parse their output into a structured format, and you get Nushell auto-completions as part of the package. Some of those frontends are included by default.

As an example if you run ps you get output as a Nushell table where you can select columns, filter rows, etc. Or you can run ^ps to bypass the Nushell frontend and get the old output format.

Of course the trade-off is that Nushell wants to be your whole shell while jc drops into an existing shell.

There are specs for that!

For system-wide installation the Filesystem Hierarchy Standard essentially says:

  • put it in /usr/local/bin/ if you want to drop a script somewhere - that's the spot for software installed outside the package manager
  • /usr/bin/ works too, but it's generally reserved for software managed by the distribution

Both should be in the default $PATH for most systems.

For single-user installs the XDG Base Directory Specification says,

User-specific executable files may be stored in $HOME/.local/bin. Distributions should ensure this directory shows up in the UNIX $PATH environment variable, at an appropriate place.

Those locations will work in 99% of cases.

Nothing will work for every case because Linux systems are many and varied. For example I'm on NixOS which doesn't adhere to that particular provision of XDG, and doesn't adhere to any of FHS.

Maybe I'm the only one who is happy with a single, ultrawide monitor. I used to have two monitors, but with one big screen I don't have to deal with keeping track of which screen has focus, or with the gap between them.

I did hold out for an ultrawide with the same vertical pixel count as a 4k which it turns out is expensive. With more pixels I can make the code smaller and still read it comfortably.

It helps to have a window manager that is good at laying out windows side-by-side. I'm a big fan of PaperWM which is an extension for Gnome.


I second kitty. I switched from urxvt to konsole to get support for ligatures. Then I switched to kitty because it also has ligatures, it's faster than konsole, and it's easier to configure with version-controlled files.

I don't do very much customization: font, line spacing, color scheme, and a couple of custom key bindings.

Allow me to share, Federated Wiki. I don't think it uses ActivityPub, but otherwise I think it's close to what you described. Instead of letting anyone edit articles it uses more of a fork & pull request model.

I totally agree.

Right now I'm on a new project with a teammate who likes to rebase PR branches, and merge with merge commits to "record a clean history of development". It's not quite compatible with the atomic-change philosophy of conventional commits. I'm thinking about making a case to change style, but I've already failed to argue the problem of disruption when rebasing PR branches.

I'm a fan! I don't necessarily learn more than I would watching and reading at home. The main value for me is socializing and networking. Also I usually learn about some things I wouldn't have sought out myself, but which are often interesting.

Yes, I met someone who was allergic to basically everything, and said she had good results with hookworms

Fungi are more closely related to animals than plants.

I bring this up too. When my kid asks, "what is vegan?", and my wife says, "someone who eats plants", then I shout from across the room, "and fungi!" Tbh no one is amused but me.

There's nothing hypocritical about eating fungi! I just want recognition for the fungal contribution.

I like to use Obsidian for this kind of thing. It has tagging, and you can link notes and see the network of links in a visualizer. There's also a "canvas" feature that lets you lay out notes spatially in whatever way makes sense to you. I assume there is a web clipping plugin which could make it easy to grab the comment content and link at the same time.