LeFantome

@LeFantome@programming.dev
0 Post – 743 Comments
Joined 12 months ago

Wait what? Telnet? I am guessing cybersecurity is not one of the classes available at your school.

I teach. I use Arch for my school laptop.

I actually believe that “GNU / Linux” creates the confusion, even the Android problem you cite.

If we all just said “Linux” to mean Linux distribution and the software ecosystem that implies, almost everybody would agree what that meant. All this “actually what you are calling Linux is actually” and “Linux is just the kernel” stuff confuses people. If Linux is just the kernel then Android and Ubuntu are equally Linux. Most people do not even know what a kernel is until you start “educating” people that “Linux” is not Linux.

An Operating System is defined by the applications that it runs natively. Alpine Linux and Ubuntu run the same software and services. Chimera Linux runs all the same stuff even though it comes without any GNU software by default ( BSD utils, Clang compiler, MUSL ). They are all “Linux”. None of them are Android or ChromeOS. They are not the embedded OS in my thermostat or body worn camera. Of course, all these things use the Linux kernel but they are not all “Linux” operating systems.

There are many examples of the kernel not defining the Operating System. iOS and macOS are not the same thing. Windows and Xbox are not the same thing. Yes, us geeks know the common infrastructure they share.

And if an operating system is defined by its applications, is “GNU” a good label? My distro of choice offers 80,000 packages of which maybe 200 are managed by the GNU Project. Go to gnu.org and look at the list of packages that are actually GNU for real. It may shock you how short the list is.

There are other single sources that contribute more software. In terms of code and base architecture, Red Hat is probably the largest contributor ( and no, I do not use Red Hat — RHEL has fewer than 3000 packages for one thing ). I do not want to call my distribution “Red Hat” Linux but frankly it makes more sense than GNU.

Some of the GNU / Linux folks say that the reason for the label is the C library ( Glibc ). But not all Linux distros use Glibc. For a mainstream Linux user, does it make sense to say that Alpine, Void, and Chimera are not the same kind of OS as Ubuntu or Fedora? A regular user could sit down at any of them and not only use them mostly the same but perhaps not even notice the difference. I could write a Linux app without knowing about Alpine and it could be built for it easily. They all use the same apps and desktop environments. They all run Docker natively. Even fairly deep Linux knowledge applies equally to them all. As pointed out, freedesktop.org applies to them equally. They have the same driver and hardware support ( including the full graphics and audio stacks ). Most people would agree that all these “Linux” systems are pretty alike and quite different from macOS, Windows, and Android. They are all much more like each other than they are even to FreeBSD.

The GNU name pays homage to the historical contribution of the GNU Project that, while important, is pretty historical at this point. If the goal is to promote Free Software or even the GPL, the right branding would be the FSF. So, even that is confusing.

Clearly, in my view, GNU is a terrible brand to try to glom on to Linux. It is not explanatory. It is not even accurate. It is mostly political and frankly overstates the current contribution of the project. I talked code above. There is more code in Wayland or X11 and Mesa than in all of GNU probably. There are more lines of code licensed MIT than GPL in most distros. Most GPL software available is NOT provided by the GNU project.

Again, GNU is a hugely important project to free software and the history of Linux. That history should be celebrated and acknowledged. Distorting it and contorting it is not the way to do that. Enough with “GNU / Linux” already.

They did not really “talk him into it”. They simply labelled it that when they uploaded it. Linus just went with the flow and it stuck.

No. I did not. And you can. Happy to conclude things here. Good luck.

I would say that you make a decent argument that the ALU has the strongest claim to the “bitness” of a CPU. In that way, we are already beyond 64 bit.

For me though, what really defines a CPU is the software that runs natively. The Zen4 runs software written for the AMD64 family of processors. That is, it runs 64 bit software. This software will not run on the “32 bit” x86 processors that came before it ( like the K5, K6, and original Athlon ). If AMD released the AMD128 instruction set, it would not run on the Zen4 even though the hardware might technically be capable of it.

The Motorola 68000 only had a 16 bit ALU but was able to run the same 32 bit software that ran on later Motorola processors that were truly 32 bit. Software written for the 68000 was essentially still native on processors sold as late as 2014 ( 35 years after the 68000 was released ). This was not some kind of compatibility mode; these processors were still using the same 32 bit ISA.

The Linux kernel that runs on the Zen4 will also run on 64 bit machines made 20 years ago as they also support the amd64 / x86-64 ISA.

Where the article is correct is that there does not seem to be much push to move on from 64 bit software. The Zen4 supports instructions to perform higher-bit operations but they are optional. Most applications do not rely on them, including the operating system. For the most part, the Zen4 runs the same software as the Opteron ( released in 2003 ). The same pre-compiled Linux distro will run on both.
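The idea that "bitness" is about what software a CPU targets, not raw hardware width, can be poked at with a small Python sketch of my own ( not something from the thread above ). The running interpreter was compiled for exactly one ISA, and it can report which one, along with the pointer width that software actually assumes:

```python
import platform
import struct

# The interpreter binary itself was compiled for exactly one ISA.
# platform.machine() reports it (e.g. "x86_64", "aarch64", "riscv64"),
# and the C pointer size shows the "bitness" the software was built for.
isa = platform.machine()
bits = struct.calcsize("P") * 8  # size of a C pointer, in bits

print(f"ISA: {isa}")
print(f"pointer width: {bits} bits")
```

The same source runs unchanged on amd64, ARM, or RISC-V, but the interpreter binary underneath it only runs on the one ISA it was compiled for.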

I think Fedora is a solid choice. I will tell you why I do not recommend it to new users myself.

1 - Fedora is very focused on being non-commercial ( see my other comments on its history ). This leads them to avoid useful software like codecs that I think new users will expect out of the box.

2a - The support cycle is fairly short and whole release upgrades are required.

2b - Fedora is typically an early adopter of new tech. It is not “bleeding edge” but it may be more so than new users need.

3 - It does not really target new users the way, say, Mint does, though it does target GUI use.

4 - I do not use it myself anymore and I do not like to recommend what I do not use. What I do use has a reputation for not being new user appropriate ( not sure I agree ).

Nothing wrong with Fedora though in my view. I would never discourage anybody from trying it.

For anybody that does not know, Fedora was founded by Red Hat to be their “community” distro. Before Fedora, there was only Red Hat Linux and it was trying to be both commercial and community. Red Hat founded Fedora to be an explicitly community distribution and then released the first version of Red Hat Enterprise Linux ( RHEL ). This resolved their commercial / community conflict.

Fedora is explicitly NOT an enterprise distribution. They are annoyingly committed to only free software. They release often and have short release cycles. Fedora is certainly not aimed at enterprises.

Rocky and Alma are RHEL alternatives and are absolutely aimed at the enterprise. Fedora merging with either of these projects would be super surprising indeed. It would make no sense whatsoever.

The “community” enterprise option from Red Hat is not Fedora, it is CentOS Stream. Alma has rebased onto CentOS Stream ( which is what RHEL is also derived from ). That makes sense.

I have fewer comments on the health or future of RHEL or Red Hat itself or how much IBM cares about it. I guess I will say that I have never seen so many ads for it. I think revenues are at record levels. It does not feel like it is dying.

I don’t use Fedora or RHEL but Red Hat is one of the biggest contributors to Open Source. So, I hope this cynical poster is wrong. GCC, Glibc, Systemd, Xorg, Wayland, Mesa, SELinux, Podman, and the kernel would all be massively impacted by less Red Hat funding.

Not the original commenter but Red Hat took steps a few months ago to make it harder to make complete bug-for-bug clones of their Enterprise product ( RHEL ). Basically, they stopped providing the exact build instructions and exact patch sets ( SRPMS ) to their competitors. You now have to jump through more hoops to do it ( like Rocky does ) or you have to fork your own Enterprise distribution from CentOS Stream ( like Alma now does ).

You still get everything you always did as a Red Hat subscriber ( even if you do not pay them — they have a free tier ). All the actual software is still Open Source for everybody ( subscriber or not ) and available free in CentOS Stream and Fedora. Red Hat is still one of the biggest contributors across the Linux ecosystem and, ironically, one of the biggest proponents and providers of GPL software in particular.

However, if you are a Red Hat subscriber and you share the RHEL SRPMS, Red Hat may not renew your subscription. That is their big evil move.

Many people did not like this change and the most extreme detractors have accused Red Hat of betraying Open Source or of even trying to take Linux proprietary. In my view, this is totally wrong. Read my second paragraph.

What many people do not seem to understand is that Red Hat founded the Fedora Project and, much later, the CentOS Stream Project explicitly to be open, community distributions so that they ( Red Hat ) could pursue their commercial interests with RHEL without friction from the community. I say people do not understand because some people now say they do not trust Fedora to stay Open when the entire reason it exists is to be that ( as an explicit strategy of Red Hat ).

One of the things that is annoying ( to me ) about Fedora is that it insists on being completely anti-commercial ( avoiding patented codecs for example ). The idea that Fedora is for businesses or will be “taken over” by IBM is silly. Red Hat employees have always been the biggest contributors to Fedora. It has always been Free ( as in freedom ).

The most extreme damage Red Hat may eventually do to Fedora is to stop paying so many people to work on it and the important packages it relies on. That has not happened and probably will not anytime soon ( in my view ).

“Fedora is Red Hat, Red Hat is mostly aimed at companies”.

I said this in another comment but Red Hat Linux used to target both the community and commercial interests. Fedora was founded to be an explicitly community distribution that was NOT aimed at companies. Red Hat then created Red Hat Enterprise Linux ( RHEL ) which absolutely targets companies ( for money ). The whole point of founding the Fedora project was for it not to target companies.

Fedora releases often, has short support cycles, and is hostile to commercial software. It would be a terrible choice for a business in my view. It is a leading community distribution though.

The top foundational distros that all the others are based on are Ubuntu, Fedora, Debian, and Arch ( and maybe SUSE — I am not European ).

In my view, Ubuntu’s best days are behind it. Fedora has never looked so good.

I use one of the other distros above but I used Fedora long ago and it treated me well. I think it is a solid choice. My impression has been that it is gaining in popularity again.

Largely true but Red Hat also wanted to control their own cloud / container stack.

Cloud is a big part of why IBM paid so many billions for Red Hat to begin with.

That is a risk on the Windows side for sure. Also, once an ISA becomes popular ( like ARM via Apple Silicon ), it will be hard to displace.

Repurposing Linux software for RISC-V should be easy though and I would expect even proprietary software that targets Linux to support it ( if they support anything beyond x86-64 ).

Itanium was a weird architecture and you either bet on it or you did not. RISC-V and ARM are not so different.

The other factor is that there is a lot less assembly language being used and, if you port away from x64, you are probably going to get rid of any that remains as part of that ( making the app more portable ).

Being cynical about Red Hat is fine as long as we keep it factual. I enjoy their contributions but otherwise have no skin in their game.

I am not as enthusiastic about Rocky. I cannot see at all how you can compare them to Debian. It seems unfair even to Alma to lump them in with Rocky as Alma is taking the high road. Best of luck with Rocky though. Truly.

You make a good case that “community” means “cannot be shut down by a corporation”. Thank you for that. Can a “bug-for-bug RHEL clone” be community though? If Red Hat cancels RHEL ( unlikely ), is there still a Rocky Linux?

Full disclosure - I do not use any of these enterprise distros anymore although the stance taken by Alma makes them attractive to me. I am looking for ways to use them.

If we had more time and maybe more beer, I would be interested to get into a discussion about what “community” is.

CentOS pre-Stream was not a “community” distro in my view as I do not see “downloads that cost no money” as the backbone of what makes a community.

CentOS ( pre-Stream ) could not innovate their own distro. They could not even fix a bug without breaking their “bug-for-bug” RHEL compatibility promise. All they did was recompile and redistribute RHEL packages with the trademarks removed. What kind of community do you have if you do not produce anything? Everything from CentOS was actually provided by Red Hat. It was just literally “RHEL without paying”. There was no diversity.

CentOS Stream is managed by Red Hat for sure as its primary purpose is to become the base for a future version of RHEL. However, it is Open Source and developed fully out in the open. Contributions are possible.

Unlike CentOS of old, the “community” can contribute to and debate the future of CentOS Stream. Alma has contributed bug fixes for example. It has been a bit painful as Red Hat is used to being the only one in the sandbox but the process is evolving. CentOS Stream has multiple contributors ( not just Red Hat ). This means that others have some influence on what RHEL looks like in the future. “The community” can build on that.

In my view, CentOS Stream is already a lot more of a “community” distro than the original CentOS was. You do not have to agree of course. Anyway, I hope other projects join with Alma and Red Hat in contributing to CentOS Stream.

For all their flag waving about “the community”, distros like Rocky and Oracle have shown no interest in contributing to CentOS Stream. They continue to clone the distro that Red Hat forks from CentOS Stream. They don’t get involved until all the work has been done. Then they make money off it ( the only reason they are there ).

Once a chip architecture gets popular on Windows, it will be hard to displace. ARM has already become popular on macOS ( via Apple Silicon ) so we know that is not going anywhere. If ARM becomes popular on Windows ( perhaps via X Elite ), it will be hard to displace as the more popular option. That makes RISC-V on Windows a more difficult proposition.

I do not think that RISC-V on Linux has the same obstacles other than that most hardware will be manufactured for Windows or Mac and will use the chips popular with those operating systems.

Things should get a lot better with NVIDIA soon in distros like Fedora that want to stick with only open software.

IBM has to file public financials. Tracking rough Red Hat revenues is not all that hard. They have done very well.

https://ca.finance.yahoo.com/news/red-hat-high-growth-streak-151126295.html

I won’t get back into the Fedora thing. I bet the whole dollar though that, if RHEL disappeared, Rocky and Oracle would take no interest in Fedora.

We agree.

My point is that “porting” is not such a big deal if it is just a recompile. If you already target Linux with a portable code base ( to support both ARM and amd64 for example ) then the burden of RISC-V is pretty low. Most of the support will be the same between RISC-V and ARM if they target the same Linux distros.

The Linux distros themselves are just a recompile as well and so the entire Open Source ecosystem will be available to RISC-V right away.

It is a very different world from x86 vs Itanium with amd64 added to the mix.

Look at Apple Silicon. Fedora already has a full distribution targeting Apple Silicon Macs. The biggest challenges have been drivers, not the ISA. The more complete the Linux ecosystem is on ARM, the easier it will be to create distros for RISC-V as well.

Porting Windows games to Linux is not a small step. It is massive and introduces a huge support burden. That is much different than just recompiling your already portable and already Linux hosted applications to a new arch.

With games, I actually hope the Win32 API becomes the standard on Linux as well because it is more stable and reduces the support burden on game studios. It may even be ok if they stay x86-64. Games leverage the GPU more than the CPU and so are not as greatly impacted running the CPU under emulation.

Microsoft must make 40% of their revenue off of Azure at this point. I would not be surprised if more than 50% of that is on Linux. Windows is probably down to 10% ( around the same as gaming ).

https://www.kamilfranek.com/microsoft-revenue-breakdown/

Sure there are people in the Windows division who want to kill Linux and some dev folks will still prefer Windows. At this point though, a huge chunk of Microsoft could not care less about Windows and may actually prefer Linux. Linux is certainly a better place for K8S and OCI stuff. All the GPT and Cognitive Services stuff is likely more Linux than not.

Do people not know that Microsoft has their own Linux distro? I mean an installation guide is not exactly their biggest move in Linux?

Wayland is the future. It has already surpassed X11 in many ways. My favourite comment on Phoronix was “When is X11 getting HDR? I mean, it was released 40 years ago now.”

That said, the fact that this pull request came from Valve should carry some weight. Perhaps Wayland really is not ready for SDL.

I do not see why we need to break things unnecessarily as we transition. This is on the app side. Sticking with X11 for SDL ( for now ) does not harm the Wayland transition in any way. These applications will still work fine via Xwayland.

Sure, a major release like 3.0 seems like a good place to make the switch. In the end though, it is either ready or it is not. If the best path for SDL is to keep the default at X11 then so be it ( for now ).

If we are marking the birth of Linux and trying to call it GNU / Linux, we should remember our history.

Linux was not created with the intention of being part of the GNU project. In this very announcement, it says “not big and professional like GNU”. Taking away the adjectives, the important bit is “not GNU”. Parts of GNU turned out to be “big and professional”. Look at who contributes to GCC and Glibc for example. I would argue that the GNU kernel ( HURD ) is essentially a hobby project though ( not very “professional” ). The rest of GNU never really got that “big” either. My Linux distro offers me something like 80,000 packages and only a few hundred of them are associated with the GNU project.

What I wanted to point out here though is the license. Today, the Linux kernel is distributed via the GPL. This is the Free Software Foundation’s ( FSF ) General Public License—arguably the most important copyleft software license. Linux did not start out GPL though.

In fact, the early goals of the FSF and Linus were not totally aligned.

The FSF started the GNU project to create a POSIX system that provides Richard Stallman’s four freedoms and the GPL was conceived to enforce this. The “free” in FSF stands for freedom. In the early days, GNU was not free as in money as Richard Stallman did not care about that. Richard Stallman made money for the FSF by charging for distribution of GNU on tapes.

While Linus Torvalds has always been a proponent of Open Source, he has not always been a great advocate of “free software” in the FSF sense. The reason that Linus wrote Linux is because MINIX ( and UNIX of course ) cost money. When he says “free” in this announcement, he means money. When he started shipping Linux, he did not use the GPL. Perhaps the most important provision of the original Linux license was that you could NOT charge money for it. So we can see that Linus and RMS ( Richard Stallman ) had different goals.

In the early days, a “working” Linux system was certainly Linux + GNU ( see my reply elsewhere ). As there was no other “free” ( legally unencumbered ) UNIX-a-like, Linux became popular quickly. People started handing out Linux CDs at conferences and in universities ( this was pre-WWW remember ). The Linux license meant that you could not charge for these though and, back then, distributing CDs was not cheap. So being an enthusiastic Linux promoter was a financial commitment ( the opposite of “free” ).

People complained to Linus about this. Imposing financial hardship was the opposite of what he was trying to do. So, to resolve the situation, Linus switched the Linux kernel license to GPL.

The Linux kernel uses a modified GPL though. It is one that makes it more “open” ( as in Open Source ) but less “free” ( as in RMS / FSF ).

Switching to the GPL was certainly a great move for Linux. It exploded in popularity. When the web became a thing in the mid-90’s, Linux grew like wildfire and it dragged parts of the GNU project into the limelight with it.

As a footnote, when Linus sent this announcement that he was working on Linux, BSD was already a thing. BSD was popular in academia and a version for the 386 ( the hardware Linus had ) had just been created. As BSD was more mature and more advanced, arguably it should have been BSD and not Linux that took over the world. BSD was free both in terms of money and freedom. It used the BSD license of course which is either more or less free than the GPL depending on which freedoms you value. Sadly, AT&T sued Berkeley ( the B in BSD ) to stop the “free” distribution of BSD. Linux emerged as an alternative to BSD right at the moment that BSD was seen as legally risky. Soon, Linux was reaching audiences that had never heard of BSD. By the time the BSD lawsuit was settled, Linux was well on its way and had the momentum. BSD is still with us ( most purely as FreeBSD ) but it never caught up in terms of community size and / or commercial involvement.

If not for that AT&T lawsuit, there may have never been a Linux as we know it now and GNU would probably be much less popular as well.

Ironically, at the time that Linus wrote this announcement, BSD required GCC as well. Modern FreeBSD uses Clang / LLVM instead but this did not come around until many, many years later. The GNU project deserves its place in history and not just on Linux.

Manjaro because it is a bait and switch trap. Seems really polished and user friendly. You will find out eventually it is a system destroying time-bomb and a poorly managed project.

Ubuntu because snaps.

The rest are all pros and cons that are different strokes for different folks.

An amazing accomplishment and a real feather in the cap for Rust.

When are we going to see this incorporated into the mainline kernel?

I am a pretty big fan of Open Source and have used Linux myself since the early 90’s. Most governments are not going to save money switching to Open Source. At least not within say the term of a politician or an election cycle. Probably the opposite.

This kind of significant shift costs money. Training costs. Consultants. Perhaps hardware. It would not be at all surprising if there are custom software solutions in place that need to be replaced. The dependencies and complexities may be significant.

There are quite likely to be savings over the longer term. The payback may take longer than you think though.

I DO believe governments should adopt Open Source. Not just for cost though. One reason is control and a reduction of influence ( corruption ). Another is so that public investment results in a public good. Custom solutions could more often be community contributions.

The greatest savings over time may actually be a reduction in forced upgrades on vendor driven timelines. Open Source solutions that are working do not always need investment. The investment could be in keeping it compatible longer. At the same time, it is also more economical to keep Open Source up to date. Again, it is more about control.

Where there may be significant cost savings is a reduction in the high costs of “everything as a service” product models.

Much more important than Open Source ( for government ) are open formats. First, if the government uses proprietary software, they expect the public to use it as well and that should not be a requirement. Closed formats can lead to restrictions on what can be built on top of these formats and these restrictions need to be eliminated as well. Finally, open formats are much, much more likely to be usable in the future. There is no guarantee that anything held in any closed format can be retrieved in the future, even if the companies that produced them still exist. Can even Microsoft read MultiPlan documents these days? How far back can PageMaker files be read? Some government somewhere is sitting on multimedia CD projects that can no longer be decoded.

What about in-house systems that were written in proprietary languages or on top of proprietary databases? What about audio or video in a proprietary format? Even if the original software is available, it may not run on a modern OS. Perhaps the OS needed is no longer available. Maybe you have the OS too but licenses cannot be purchased.

Content and information in the public record has to remain available to the public.

The most important step is demanding LibreOffice ( or other open ) formats, AV1, Opus, and AVIF. For any custom software, it needs to be possible to build it with open compilers and tools. Web pages need to follow open standards. Archival and compression formats need to be open.
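One reason open formats age so well is that their containers are built on public, well-documented structures. As a small sketch of my own ( not from the comment above ): an OpenDocument file is just a ZIP archive whose first, uncompressed entry is named mimetype and holds the format identifier, so even a generic tool can identify and unpack it decades later.

```python
import io
import zipfile

# Build a minimal OpenDocument-style container: a ZIP archive whose
# first entry is an uncompressed "mimetype" member holding the
# format identifier, as the ODF packaging rules require.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr(zipfile.ZipInfo("mimetype"),
               "application/vnd.oasis.opendocument.text",
               compress_type=zipfile.ZIP_STORED)

# Any ZIP-aware tool, now or in 30 years, can read the identifier back.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    mimetype = z.read("mimetype").decode()

print(mimetype)  # application/vnd.oasis.opendocument.text
```

Contrast that with a closed binary format: without the spec, there is nothing a future tool can grab onto.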

After all that, Open Source software ( including the OS ) would be nice. It bothers me less though. At that point, it is more about ROI and Total Cost of Ownership. Sometimes, proprietary software will still make sense.

Most proprietary suppliers actually do stuff for the fees they charge. Are governments going to be able to support their Open Source solutions? Do they have the expertise? Can they manage the risks? Consultants and integrators may be more available, better skilled, and less expensive on proprietary systems. Even the hiring process can be more difficult as local colleges and other employers are producing employees with expertise in proprietary solutions but maybe not the Open Source alternatives. There is a cost for governments to take a different path from private enterprise. How do you quantify those costs?

Anyway, the path to Open Source may not be as obvious, easy, or inexpensive as you think. It is a good longer term goal though and we should be making progress towards it.

Except that they are not expecting to merge this into RHEL. They are sending it to CentOS Stream.

The reason this is a first is because AlmaLinux recently decided to stop being a bug for bug clone of RHEL. Good for them. As this shows, they are now actually free to add value.

Rocky Linux cannot ship this patch unless RHEL does. Rocky cannot contribute anything by definition.

The article says this will be “based on Ubuntu” but it will probably actually just be Ubuntu with custom defaults, pre-installed software, and maybe repositories.

This just makes sense in my view. The cost relative to the number of machines they must deploy will be minuscule. If they do not mess with the core system too much, they can outsource almost all the admin and expertise to Canonical in terms of security and packaging. People are saying this will blow up. Why? It does not sound like they are really creating a full distro from scratch. Is Ubuntu not viable?

As for why they would create a custom version instead of just using actual Ubuntu: again, the cost of customizing a distro can be dramatically less than making even simple configurations on every system after the fact. They can standardize what the desktop will look like and set key defaults. They can choose what applications are installed by default. They can remove applications from the repository that they do not want to be installed. They can ensure that localization is done well, etc.

The creator here buys hardware designed to run Linux ( from Tuxedo ) and uses the distro designed to run on that hardware. So no big surprise he thought all hardware problems were solved.

Probably a similar issue with office apps. He is not trying to share Outlook calendars with coworkers or deal with complex spreadsheets sent from colleagues. He is probably not sending many PowerPoints to his boss.

Good on him for reaching out to his community to ensure his opinions are backed by evidence.

Linux does this all the time.

ALSA -> Pulse -> Pipewire

Xorg -> Wayland

GNOME 2 -> GNOME 3

Every window manager, compositor, and DE

GIMP 2 -> GIMP 3

SysV init -> SystemD

OpenSSL -> BoringSSL

Twenty different kinds of package manager

Many shifts in popular software

I may actually give this a go.

With the addition of non-free firmware in Debian ( so better hardware compatibility ) and the rising popularity of Flatpak and Distrobox ( so access to newer software ), the advantages of Ubuntu are narrowing and the problems with Ubuntu continue to mount. Basing something like Mint directly on Debian makes sense to me.

I have been considering trying Debian with Distrobox / Arch to fill any application gaps. LMDE might fill that void instead.

If you are a Linux user and like commercial games, you probably would prefer them to work on Linux.

“Market share” on Linux aligns the vested interest of game makers and Linux game players. If the company thinks it can make money, it will do more to allow games to run, or at least do less to stop them.

In my view, the “community” reaction was terrible. Regardless of whether you agree with them or not, the response should be honest and rational. I found the reaction emotional, political, and frankly dishonest. The response was that Red Hat was suddenly going proprietary, that they were violating the GPL, and / or that they were “taking” the work of untold legions of free software volunteers without giving back. They were accused of naked corporate greed by companies whose whole business is based on using Red Hat’s work without paying ( peak hypocrisy ).

Let’s start with what they actually did. Red Hat builds RHEL first by contributing all their code and collecting all the Open Source packages they use into a distribution called CentOS Stream. Once in a while, they fork that and begin building a new release of RHEL. That requires lots of testing, packaging, configuration, documentation, and other work required to make RHEL above and beyond the source code. Previously, they made the output of all this work publicly available. What they did was stop that. So, what does it look like now?

Red Hat now only distributes the RHEL SRPM packages to their subscribers ( which may be paying customers or getting it free ). The support agreement with Red Hat says that, if you distribute those to others, they will cancel your subscription. That is the big controversy.

What you cannot do now is “easily” build a RHEL clone that is guaranteed “bug for bug” compatible with RHEL and use it to compete with Red Hat. You will notice that those making the most noise, like Rocky Linux, want to do that.

So, are Red Hat violating the GPL? No.

First, Red Hat distributes all the code to make RHEL to the actual people they “distribute to” ( to their subscribers ) including everything required to configure and build it. This is everything required by the GPL and more.

Second, less than half of the code in RHEL is even GPL licensed. The text of the GPL itself says that its requirements do not extend to such an “aggregate” ( the term the GPL itself uses ). So, Red Hat is going quite above and beyond the licensing by providing their subscribers the code to the entire distribution. Yes, beyond.

Third, CentOS Stream remains open to everybody. You can build a Linux distribution from it that is ABI compatible with RHEL. That is what Alma Linux is doing now. Red Hat contributes mountains of free software to the world, both original packages and contributions to some of the most important packages in the free software world. Red Hat is not required to license packages they author under the GPL but they do. They are not required to make all of CentOS Stream available to the public but they do. They are certainly not freeloaders.

But what about this business of cancelling subscriptions? Isn’t that a restriction in violation of the GPL? Not in my view.

The GPL says that you are free to distribute code you receive under the GPL without fear of being accused of copyright violation. It says you can modify the code and distribute your changes. It says you can start a business on top of that code and nobody can stop you. Do RHEL subscribers enjoy all these freedoms? Yes. Yes they do.

What happens ( after the change ) when a RHEL subscriber violates the terms of their subscriber agreement? Well, they cease to be a subscriber. Does this mean they lose access to the source they got from RHEL? No. Does it mean they can be sued for distributing the code? No. I mean, you could risk trademark violation if you sell it I guess.

So, what does it mean when Red Hat cancels your subscription? Well, it means they will no longer support you. I hope people see that as fair. It also means they will no longer distribute their software to you IN THE FUTURE.

That is it. That is the outrage.

If you give away the results of Red Hat’s hard work to productize CentOS Stream into RHEL, they stop sending you future releases.

Again, that is it.

You can do whatever you want with what they already sent you. You have all the rights the GPL provides, even for software licensed as MIT, BSD, Apache, or otherwise. Nothing has been taken from you except access to FUTURE Red Hat product ( other than totally for free via CentOS Stream of course ).

Anyway, as you can see, they are the devil and we should hope their business fails. Because, why would we want a commercially successful company to keep contributing as much to Free Software and Open Source as they do?

I have defended Red Hat a fair bit over the past year. Their level of contribution to the community is a big reason why.

It is clear though that their prominence comes with a downside in the paternal and authoritative way that their employees present themselves. Design choices and priorities are made with an emphasis on what works for and what is required for Red Hat and the software they are going to ship. The impact on the wider community is not always considered and too often actively dismissed.

Even some of the Linux centrism perceived in Open Source may really be more about Red Hat. For example, GNOME insists on systemd. Both projects are dominated by Red Hat. There have been problems with their stewardship of other projects.

To me, this is a much bigger problem than all the license hand-waving we saw before.

I used to be a huge Manjaro fan. There were many ways it let me down, some of which were just bad governance.

The biggest problem though is the AUR. Manjaro ships packages that are older than Arch’s. AUR packages assume current Arch versions. Thus, if you use the AUR with Manjaro, your system will break.

It is not a question of if Manjaro will break but when. Every ex-Manjaro user has the same story.
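The failure mode can be sketched in plain shell — the version numbers here are made up purely for illustration, but the pattern is real: an AUR package builds against whatever Arch currently ships, while Manjaro’s delayed repos still carry something older.

```shell
# Hypothetical versions: what the AUR package was built against vs.
# what Manjaro's delayed repos actually provide
arch_glibc="2.39"
manjaro_glibc="2.38"

# sort -V does a version-aware comparison; the last line is the newer one
newer=$(printf '%s\n%s\n' "$arch_glibc" "$manjaro_glibc" | sort -V | tail -n1)

if [ "$newer" != "$manjaro_glibc" ]; then
    echo "mismatch: AUR build expects glibc $arch_glibc, system has $manjaro_glibc"
fi
```

When that mismatch involves a core library like glibc, the freshly built AUR binary references symbols the older system library does not have, and things fall over at link or run time.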

For me, EndeavourOS is everything that Manjaro should be.

Well, this is better news than him being completely gone from driver dev which has been the situation for months now. He formally resigned.

Of course, this may have already been in the works and the reason he left to begin with. Either way, good to see him back.

Things seem about to be in a pretty good spot NVIDIA-wise. I do not use any of their recent gear so I do not care directly. That said, it will be good to have NVIDIA working well with Wayland just to remove the substantial amount of noise NVIDIA issues add to that project.

People are completely missing the point here. “Who made Red Hat the arbiter of when Xorg should end?”

I would say nobody but perhaps a better answer is all of us that have left the work of maintaining Xorg to Red Hat. All that Red Hat is deciding is when they are going to stop contributing. So little is done by others that, if Red Hat stops, Xorg is effectively done.

Others are of course free to step up. In fact, it may not be much work. Red Hat will still be doing most of the work as they will still be supporting Xwayland ( mostly the same code as Xorg ), libdrm, libinput, KMS, and other stuff that both Xorg and Wayland share. They just won’t be bundling it up, testing it, and releasing it as Xorg anymore.

We will see if anybody steps up.

I am not saying “This is the Year of the Linux Desktop”. That said, desktop share languished below 2% for decades and now it has doubled in just over a year. With the state of Linux Gaming, I could see that happening again.

Also, if ChromeOS continues to converge, you could consider it a Linux distro at some point and it also has about 4% share.

Linux could exceed 10% share this year and be a clear second after Windows.

That leaves me wondering, what percentage do we have to hit before it really is “The Year of the Linux Desktop”? I have never had to wonder that before ( I mean, it obviously was not 3% ). Having to ask is a milestone in itself.

I cannot wait until this lands in most distros. So much of the Wayland noise will go away.

If you game on Linux, you are almost certainly using it.

I love your comment. Without thinking, I instinctively agreed with you. Then I was like, “wait, why wouldn’t you just run that software on Windows?”. So now I am wondering if you are predicting such a mass migration off Windows that all these Windows apps become abandonware.

Probably though, part of what you are saying is that it may be easier to maintain compatibility with older software via WINE than on Windows itself.

64 years from now, when ReactOS comes out of beta, it will be great for that as well.

What an odd article. First, the author goes to great lengths to assert that “Linux IS UNIX” with circumstantial evidence at best. Then, I guess to hide the fact that his point has not been proved, he goes through the history of UNIX, perhaps to re-enforce that Linux is just a small piece of the UNIX universe? Then, he chastises people working on Linux for not staying true to the UNIX philosophy and original design principles.

Questions like “are you sure this is a UNIX tool?” do not land with the weight he hopes, as the answer is almost certainly “No. This is not a ‘UNIX’ tool. It is not trying to be. Linux is not UNIX.”

The article seems to be mostly a complaint that Linux is not staying true enough to UNIX. The author does not really establish why that is a problem though.

There is an implication, I guess, that the point of POSIX and then the UNIX certification was to bring compatibility to the universe of diverging and incompatible Unices. While I agree that fragmentation works against commercial success, this is not a very strong point. Not only was the UNIX universe ( with its coherent design philosophy and open specifications ) completely dominated by Windows in the market, it was also completely displaced by Linux ( without the UNIX certification ).

Big companies found in Linux a platform that they could collaborate on. In practice, Linux is less fragmented and more ubiquitous than UNIX ever was before Linux. Critically, Linux has been able to evolve beyond the UNIX certification.

Linux does follow standards. There is POSIX of course. There is the LSB. There is freedesktop.org. There are others. There is also only one kernel.
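As a quick illustration, the POSIX conformance a system claims is queryable with standard utilities — this should run on any reasonably conforming system, Linux included:

```shell
# The POSIX revision this system reports (200809 = POSIX.1-2008)
getconf _POSIX_VERSION

# The kernel behind it
uname -s
```

`getconf` and `uname` are themselves specified by POSIX, which is rather the point: Linux did not need the UNIX trademark to speak the standard.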

Linux remains too fragmented on the desktop to displace Windows. To address that, a set of Linux standards is emerging, including Wayland, PipeWire, and Flatpak.

Wayland is an evolution of the Linux desktop. It is a standard. There is a specification. There is a lot of collaboration around its evolution.

As for “other” systems, I would argue that compatibility with Linux will be more useful to them than compatibility with “UNIX”. I would expect other systems to adopt Wayland in time. It is already supported on systems like Haiku. FreeBSD is working on it as well.