What's the difference between package managers and why are there so many?

GravitySpoiled@lemmy.ml to Linux@lemmy.ml – 92 points –

Are they so different that it's justified to have so many different distributions? So far I guess that different package managers are the reason the Linux community is divided. One person may be on KDE and another on GNOME and they can still use each other's packages, but usually you are bound to one package manager.


You could also ask: "Why are there so many audio formats? Why are there so many video formats?" And so on.

The reason is different people have different ideas on what is the best way to do things.

And different goals. A large part of why apt is the way it is comes down to the way the Debian project is structured: a project relying on a large number of volunteers, versus something like Red Hat where most of the changes come from employees, and so it has different rules and standards for how packages should be constructed.

As another user mentioned, package managers are specific to distributions rather than DEs. The main difference between them is simply that they're developed by the respective distribution teams, but there are some practical differences too. For example, apt supports versioned dependencies while pacman doesn't, because of the different distribution models of Debian and Arch (fixed point releases vs. rolling release). This affects their dependency resolution strategies, with each being better suited to its respective distribution.
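
To make that concrete, here's a rough Python sketch of the two resolution styles. This is not apt's or pacman's actual algorithm, just an illustration of why a repo that carries several versions of a package needs version constraints at all, while a rolling repo mostly doesn't:

```python
def ver(s):
    """Turn '1.2' into a comparable tuple (1, 2)."""
    return tuple(int(part) for part in s.split("."))

# A snapshot-style repo may carry several versions of a package,
# so a dependency has to state which versions it accepts.
snapshot_repo = {"libfoo": ["1.0", "1.2", "1.4"]}

def resolve_versioned(name, minimum):
    candidates = [v for v in snapshot_repo[name] if ver(v) >= ver(minimum)]
    return max(candidates, key=ver)  # newest version that satisfies the constraint

# A rolling repo only ever carries one version of each package,
# so "depends on libfoo" is usually all the information needed.
rolling_repo = {"libfoo": "1.4"}

def resolve_rolling(name):
    return rolling_repo[name]

print(resolve_versioned("libfoo", "1.2"))  # -> 1.4
print(resolve_rolling("libfoo"))           # -> 1.4
```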

To address your point about package managers being the main difference between distros, this isn't quite true. As mentioned, different distros have different distribution models, priorities, and overall biases/opinions that affect the user experience in a variety of ways and make them better suited to different use cases. I would never dream of putting Arch on one of my servers in the same way that you'd probably never catch me installing Debian on my gaming machine.

never dream of putting Arch on one of my servers in the same way that

All my devices, including servers and a Pi, are on Arch testing (only Nvidia has fucked me twice).

I wouldn't worry too much about the package manager, just worry about whether the distro has a good package repository. If it has all the software you want to use, then use it. In my opinion, most package managers (dnf, apt, pacman, xbps) are basically the same, and you would only notice a big difference if you ever tried to make your own package for your own software.

That said, a few package managers are very different from all the rest:

  • CRUX "prt-get": simple and stupid: it just downloads and installs tar archives.
  • Gentoo "emerge": builds all software from source code when you install it. This provides some guarantee that the source code was not tampered with by the distro maintainers, which is great if you need to review all of the source code running on your system, but terrible for most people, who don't want to spend so much computing power on compiling stuff every time they do a software update.
  • Nix and Guix: create their own blockchain-like database of isolated package dependency chains on your system, allowing you to instantly roll back to the previous set of installed packages if you ever install something that breaks your system. It also guarantees that the software can be verified bit-for-bit (using SHA hashes) and traced back to the exact version of the source code and dependencies that built it (see the sketch after this list). Nix and Guix packages also live peacefully side by side with any other package manager, since all Nix/Guix apps are completely self-contained within their own database. In a way, it is sort of like one big AppImage or Docker container, except you can keep adding or removing stuff as often as you want.
  • Silverblue, SteamOS, VanillaOS, BlendOS, CarbonOS: distribute "immutable images," so it is impossible to modify the operating system at all. Updates ship an entirely new operating system image with all packages built in. However, you are allowed to install software into your home directory, and you can install Flatpaks and AppImages. This provides a great deal of security in exchange for a tiny bit of inconvenience.
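
To make the Nix/Guix point concrete, here is a rough Python sketch of the idea behind a content-addressed store: the output path is derived from a hash over everything that can influence the build, so changing any input (or any dependency) yields a different path, old package trees keep existing, and a rollback just points back at them. The names and layout are made up for illustration; the real Nix/Guix derivation hashing is more involved.

```python
import hashlib

def store_path(name, version, inputs, dep_paths, store_root="/store"):
    """Derive a content-addressed path from everything that can influence the build.

    Illustrative only -- not the real Nix/Guix store format.
    """
    h = hashlib.sha256()
    for item in [name, version, *sorted(inputs), *sorted(dep_paths)]:
        h.update(item.encode())
    return f"{store_root}/{h.hexdigest()[:32]}-{name}-{version}"

# Two builds of the same app against different libfoo versions land in
# different paths, so both generations coexist on disk and rolling back
# is just switching your profile to the old set of paths.
libfoo_old = store_path("libfoo", "1.0", ["src=sha256:aaaa"], [])
libfoo_new = store_path("libfoo", "1.1", ["src=sha256:bbbb"], [])
print(store_path("app", "2.3", ["src=sha256:cccc"], [libfoo_old]))
print(store_path("app", "2.3", ["src=sha256:cccc"], [libfoo_new]))
```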

My personal preference: I use ordinary Debian or Ubuntu to install the critical software that needs to be stable and reliable, and I use Guix OS on the side to install the bleeding-edge things that might break a lot.

I couldn't disagree more! The package manager is actually the one thing that differentiates distributions by a large margin. The syntax should be intuitive, and downloads/updates fast and reliable. Also, when watching git repositories for new software alternatives, you often see packages provided for the good package managers, whereas you need to go the extra mile for the "stable" ones.

I wouldn’t worry too much about the package manager, just worry about whether the distro has a good package repository.

downloads/updates fast and reliable. Also, when watching git repositories for new software alternatives, you often see packages provided for the good package managers, whereas you need to go the extra mile for the "stable" ones.

But I would say these are not features of the package manager software; rather, they are features of the package repository, that is, the online service that provides the packages. It doesn't matter whether you use apt, DNF, or pacman: if the package repo is slow or full of packages that haven't been built right, the package manager software won't do much to make it better.

But like I said, a few package managers are really unique, like Gentoo's emerge, CRUX's prt-get, and Nix and Guix.

Can you decouple a package manager from its repository like that? And even if you can, is there a real-world example?

Yes.

Ubuntu and Debian both use apt, but with different repos. Different versions of Ubuntu/Debian use different repos, with newer/older software.

Ubuntu and Debian differences... I don't see your point here. Nobody on Arch uses apt, and nobody on Ubuntu uses pacman. If you use pacman you are using Arch repositories.

If you use pacman you are using Arch repositories.

Incorrect. There is Manjaro, but there is also MSYS2, a Windows project with the goal of making Linux tools available on Windows by recompiling all of them. That's very far from the Arch philosophy and repos.

And Ubuntu and Debian have massively different repositories. One of them gives you the actual Firefox package, and the other installs Firefox via Snap, an app store with a closed-source backend, when you attempt to install Firefox using apt.

And then there are also the version differences: Debian stable is going to have much older software than Ubuntu.

Thanks for pulling corner cases from dark places... I'm not sure if we're misunderstanding each other, but my point was as written: you use the package manager/repository that ships with your distro. So the original quote was:

I wouldn’t worry too much about the package manager, just worry about whether the distro has a good package repository.

Which, in my opinion, is misleading at best.

Also, a big part of Portage (Gentoo's "emerge") is being able to flag parts of a package out of (or into) the compilation.

Let's say you don't want to have telemetry in your packages. So you set '-telemetry' globally, and each package that has known telemetry parts will be compiled locally without them, so telemetry cannot be turned on (unless it's hidden really well).

Or you want to use PulseAudio? You can flag it in globally, or for specific packages. That way you can influence the software you install without knowing much about anything build-related; the work is done by the repository maintainers.
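
If it helps to picture the mechanics, here's a tiny Python sketch of the flag idea: global flags plus per-package overrides decide which optional features a package gets built with. This is not Portage's actual implementation, and the flag names are made up.

```python
# Not Portage's real logic -- just an illustration of how global USE flags
# and per-package overrides select optional features at build time.
global_flags = {"-telemetry", "pulseaudio"}        # e.g. set globally in make.conf
package_flags = {"mediaplayer": {"-pulseaudio"}}   # e.g. per-package overrides

def enabled_features(package, optional_features):
    overrides = package_flags.get(package, set())
    enabled = set()
    for feature in optional_features:
        on = feature in global_flags               # globally enabled?
        if f"-{feature}" in global_flags:          # globally disabled?
            on = False
        if feature in overrides:                   # per-package enable wins
            on = True
        if f"-{feature}" in overrides:             # per-package disable wins
            on = False
        if on:
            enabled.add(feature)
    return enabled

# Only the parts of the package behind the enabled features get compiled in.
print(enabled_features("browser", ["telemetry", "pulseaudio"]))      # {'pulseaudio'}
print(enabled_features("mediaplayer", ["telemetry", "pulseaudio"]))  # set()
```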

They won't be able to pry Gentoo from my cold dead hands. Arch, Nix/Guix can suck it; all my money goes to Gentoo.

From what I've heard, compiling locally also allows for hardware optimizations specific to your system, though that may be false, as I've never used Gentoo.

Sure, but such optimizations won't usually matter a lot. I have no hard data on that, but would still prefer having smaller binaries from removing unnecessary BS to having CPU optimizations.

Fortunately I got both ^^ And the system feels a lot more responsive than Ubuntu did, but I never ran any benchmarks to prove that.

Yeah the smaller binaries is a big part too. I bet it feels like having your system hand-crafted just for you

It does. Unfortunately I don't spend as much time on my PC lately.

Also I still use binary blobs like Steam.

Oh no, there are 5 package managers out there and they're all wildly different

I know! I'll make a standard, universal package manager that'll be better than all the others that everyone will use!

There are now 6 different package managers

The package manager is really only a small part of the story.

A distro, at the end of the day, is an API/ABI platform. What makes Debian what it is, is that it has a specific set of old, unmoving packages. What makes Arch Arch is that it always has the latest APIs. And everything in between, like Fedora.

So even if Fedora used dpkg it wouldn't change anything; you still couldn't use its packages on Debian.

As to why so many exist… well a lot of them suck in their own unique way.

As to why so many exist… well a lot of them suck in their own unique way.

lmao, true

Except NixOS, which is literally a distro built around a package manager. But it doesn't force you to choose: you can have both unstable and stable packages.

Yeah modern usage in general involves silo’d ABIs, be it Flatpak, Nix, Docker/Podman. Modern languages even try to move away from any ABI.

Of course there are upsides and downsides to the traditional approach.

Modern languages even try to move away from any ABI.

I wouldn't put it that way. In the case of Rust, it seems everyone wants to have a stable ABI for a number of reasons (e.g. making dynamic linking possible without FFI), but the core developers feel like the ABI is still too unstable to commit to anything.

In my experience a lot of Rust developers love the lack of shared libraries and bundling everything, viewing it as a huge win. Maybe someday it will support it but I feel it will be less commonly relied on.

I've seen that sentiment but I think it's more a matter of people making excuses for Rust and not wanting to admit that it has any shortcomings compared to C++.

It's the same mentality that leads C++ developers to defend things like header files.

I think a lot of what drives the creation of redundant open source tools is that the urge to address a matter of personal taste meets the urge to start a new project, so people create new things that are different in key ways from older ones, but not necessarily better, and not necessarily even different enough to justify the amount of work that goes into them.

In some ways it feels a lot easier to start a new project than to build off an existing one:

  • You don't have to familiarize yourself with the old code, which may be in a language you don't know or don't like

  • You don't have to deal with the existing maintainers, who may or may not be supportive of the changes you want to make

  • You don't have to support use cases that don't matter to you personally

Some differences can be explained. Pacman was created after the Debian package manager (I guess, because Debian is older than Arch). It is justified because pacman is faster than apt. But it's too much work to replace apt with pacman compared to the benefits.

But in some cases I don't know why. For instance, I wonder why a distro such as Void created its own package manager instead of using Alpine's (if Alpine is younger than Void, invert the sentence, of course).

Void was created just for testing xbps. Without xbps there would be no Void.

I've yet to play with void. Is there anything cool/special about xbps?

It is extremely fast and simple. Also, it has its own "AUR", called xbps-src. But nowadays Void is not just xbps; it is also defined by runit (which is also extremely fast and simple) and minimal dependencies (you will have to manually install many things that other distributions ship preinstalled, in case you need them). By the way, if you prefer a GUI package manager, there is OctoXBPS (not an advantage of xbps, but you might want it when you try Void Linux).

Nice! I'm coming from arch with systemd so it'll be interesting to play around with runit too. Definitely going to give it a tinker.

Thanks for the info!

Some package managers do have differences that justify a separate project (nix, gentoo's portage, etc).

For others, sometimes package managers are very similar feature-wise. But some developers would rather remake the thing because they would understand their code a lot better than someone else's. Or because it would be far easier for them to customize rather than extend another project.

Imo it is developer laziness. Being able to use other people's work is a valuable skill. But then again, this is open source, and people are free to develop the software they want the way they want.

Because Linux runs on an incredible number of platforms, with an incredible range of hardware targets and platform goals.

It runs on every supercomputer, every Raspberry Pi, and every Android phone, as well as most web servers and almost every Steam Deck.

Use cases are so dramatically different there will always be multiple distributions with their own needs for package managers.

Because people will never agree on a single one, and it's FOSS so nothing is forced. I for one am glad I don't have to use apt because I prefer pacman, just as I am glad someone who doesn't want to use an Arch-derivative has Debian and apt to fall back on.

Hi, that's me! I've been using apt and Debian derivatives for 17 years. Bookworm is fantastic!

Same here, but lately I've also been pushed towards Snap and Flatpak. I miss the old visual Synaptic tool though...

I like Flatpak and Appimage. I won't touch Snaps.

There are many different "X" because people have different tastes in their choice of "X". I like KDE, the next guy likes GNOME. I like apt, but I might like whatever NixOS uses; others like yum or DNF. I kinda like the idea behind GoboLinux, probably because I was a Mac OS X user for a long while.

Any "why are there too many X's on Linux" (where X is package manager, desktop environment, init system etc) appear to stem from the silly assumption that there happens to be an already built operating system called Linux and all these people are forking it and putting in their own stuff for the sake of their own egos and nothing else.

When really, the answer fundamentally boils down to one of two things: either it doesn't exist yet, or the existing solution fails to meet a need. Linux itself is merely a kernel; it didn't come with a package manager or desktop environment. Those things all had to be made by separate parties, and there isn't always agreement on how best to do them.

As a Guix user, I believe the Guix package manager has advantages over "traditional" GNU/Linux package managers, as well as other so-called "universal package managers" such as Flatpak.

Diversity generates competition that drives innovation. Also, people mostly work on what they like and use what they like. Although it certainly spreads resources across duplicate projects, you can't force people to work on or use your personal favourite. You'll get used to it.

Are they so different that it’s justified to have so many different distributions?

Linux isn't a project; it's a source-compatible ecosystem: a parts bin out of which different people assemble different things. The parts being open source means you don't need anyone's permission or justification to make something different out of them.

From these many and varied efforts come life, vitality, interest, and intellectual investment. You can't just take the current things you like best and say "well, what if we all worked on THOSE?" when many of them wouldn't even have existed save for a vital ecosystem that supported experimentation and differentiation.

If we really believed in only pulling together, maybe you would be developing in COBOL on your DOS workstation.

People have different opinions on how packages should be managed. Of course, there are some package managers which are very similar to each other (DNF and zypper have the same backend), but they can also get really different (Nix/Guix and pacman are basically completely opposite in philosophy). It comes down to preference, and you can't force anything.

I don't think package managers are tied to DEs; rather, they depend on the distro.

Imagine having only one option, and that option is dnf. I'm out. I don't want Linux anymore.

Ok, I'll bite. What's so bad about dnf? I would take it any day over apt.

Not much, really. It is great, but it's slow as shit and makes me want to toss my computer across the room. I just want to install one tiny 5 kB package; I don't want you to take 10 minutes checking all the RPM Fusion repos and going to the moon and back before installing my package. No, just install the damn thing. I'll ask you when I need you to check that long list of repos. 😂

Thanks for the response. I anticipated "it's slow" but I guess that just doesn't bother me because otherwise I dig dnf/rpm over any other combo I've tried.

For example, a one-liner to identify all packages from a particular repo is trivial with dnf and a huge pain with apt.

I'm a pacman/yay guy myself. Haven't used apt for too long. Even when I run something with apt, I install nala

Dnf-5 was/is supposed to be a big speed improvement, no?

Not sure. I have fedora 38 in a VM and it's still very slow.

It's not targeted to be the default until Fedora 41. So if you wanted to try it out, you would have to install it yourself.

Ok then, I'll search it up and install it.

Sweet I hope it really is better for you then!

I actually did install it. I hate to judge it in a virtual machine, but I've noticed a small difference. It's still slow, but not as slow as the other one. Fun fact: the only reason I don't use Fedora is that I hate their installer. I have 3 drives in my PC, and I'm so scared that I'd mess things up and lose my photos/videos/games etc. 😂 The installer is so confusing. I remember figuring it out once, then just forgetting it again.

I custom install every time, partially to preserve my user data partition, partially because I don't like the defaults (I like mirroring my disks and leaving space to grow into later if I want)

Same here, that's why I have 3 different drives in my PC: a 512 GB NVMe SSD for root, a 1 TB SATA SSD for home, and a 2 TB SATA SSD just for games/emulation and Steam.

Some of them are more or less historic (for example rpm/dnf and dpkg/apt), where they were developed at a similar time by unrelated projects and have just carried on separately ever since.

Others have been developed to represent very different approaches (such as portage, which is based on the traditional BSD way of managing software by building from source, or snap and flatpak, which containerise applications).

The multitude of systems doesn't really cause as many problems as you'd think. As a rule, non-containerised packages need to be custom-built for each distro anyway, so it doesn't really make any difference which packaging tool is chosen by that distro. That is, you can't really take a Debian .deb package and expect it to work properly on Fedora, even if you install dpkg/apt first.

KDE and Gnome are just desktop environments. I use Gnome on Arch which uses pacman as the package manager. I could use KDE or XFCE or i3 or anything. The package manager is tied to the distro, and you can use almost any software with them.

“Are they… justified”?

  1. Somebody thought the need for a new package manager was great enough to spend time creating one. That person at least must think it is justified.

  2. We, the users, have not chosen just one of the options to be the standard. Does that “justify” that they all exist?

In the short term, the popularity of Linux is certainly hurt by the complexity of the ecosystem and the lack of standardization. As a product, it would see better adoption if it were more standardized. Without writing a book about why, there is no doubt about this. The short version is that, today, Linux is many products, none of which can compete as effectively as a single one would, and all of them are impaired by the confusion this causes.

In the longer run though, it is almost certainly one of the great strengths of Linux. Linux is many products and, as a result, it can target and effectively fill almost every niche. That is going to make it very hard for alternatives to compete at some point. Once Linux knowledge and Linux applications (yes, I know) become more mainstream, this compatibility between options becomes a strength. I can have my own operating system that is just the way I want it, but it still runs Docker and Steam (as examples).

Think of the cereal aisle at the grocery store. If I want to introduce a new cereal (or pasta sauce or whatever), coming up with one that has 10 flavours is not going to work (without immense marketing muscle). None of them will sell well enough and probably all of them will get pulled from store shelves. I would be better off launching just one. However, once I have a mature market position, I can have not just the regular version but the whole wheat version, the honey nut version, the cinnamon version, the holiday version, etc. They will collectively make each other stronger and all potentially sell well (again, think pasta sauce flavours if that makes more sense to you).

This is why there was The Tesla Roadster at first and now there are the Model S, Model 3, Model X, Model Y, and maybe the Cyber Truck.

Linux is not a “product” though. It is an open source program. While any given Linux distributor (distribution) may think like I outline above, collectively the Linux market is fragmented. Linux is a mix of commercial, community, and individual interests, all scratching their own itch.

I am super interested in Chimera Linux right now and fairly negative towards Ubuntu. This makes me part of your problem though. Chimera Linux makes “Linux” less predictable, more confusing, and more frustrating for new and potential users. Pushing everybody to Ubuntu would be a better market strategy. That said, I personally want to use Chimera Linux and, while I say that I want Linux to succeed, I also secretly hope that Ubuntu will fail. Chimera Linux uses a package manager used by only one other Linux distribution (and in fact they use different, incompatible versions of it, so really they are unique). Clearly, my priorities are misaligned with the premise of your question.

So, what does “justified” mean in the Linux space?

Convenience/ease of use. There's a little caveat that only "true Linux users" are able to notice, though:

... compiling is, and always will be, the best option.