Out of interest, which programs do you need to install via terminal that concern the average user?
For example installing the GPU driver for an older GPU. Or installing the driver for an obscure printer, touchpad or other weird hardware.
Average user doesn't mean total noob. Installing Windows and the relevant drivers is something many users in the "Gamer class" can do. These guys usually don't touch the command line (except for maybe pinging something), but they are comfortable with installing and configuring stuff in a GUI.
They understand how to google the driver for their weird hardware, download the .exe or .msi, run it and click through the install wizard.
On Linux I've had it a few times that you e.g. have to unload/load kernel modules and such to get a driver working. Once, the Linux driver for a device was only supplied as source code to be compiled with an ancient version of GCC that wasn't available through the package manager. So I spent an hour or two fixing compiler errors to get that old source code to build with a current GCC.
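For the curious, the "build it yourself, then load it" dance typically looks something like this. This is a hedged sketch: "vendor_driver" and the source directory are made-up placeholders, not real names.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of building and loading an out-of-tree driver module.
# "vendor_driver" and the source directory are placeholders, not real names.
build_and_load_driver() {
    cd "$HOME/Downloads/vendor-driver-src" || return 1
    # build the module against the headers of the currently running kernel
    make -C "/lib/modules/$(uname -r)/build" M="$PWD" modules || return 1
    sudo insmod ./vendor_driver.ko      # load the freshly built module
    lsmod | grep vendor_driver          # confirm the module is loaded
}
# run manually with: build_and_load_driver
```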
Getting the same hardware to run under Windows meant downloading the .exe and running it.
And yeah, that's not something you'll do on a daily basis, but it is a huge roadblock for someone afraid of white text in a black window.
I feel like we can cherry pick situations on other operating systems where you might have to open a terminal window to solve an issue, but I agree that there are roadblocks that many won't even try to get past. There has been a load of progress around usability and informational resources for less advanced users over just the last few years. I feel the main barrier to entry is the willingness to learn something new.
Judging by the downvotes, I guess many people here can't take it when someone talks about the issues holding back Linux.
I think if you like something, it is really important to talk about the issues it has so that they can be improved. Blaming the users is not going to make more people switch.
These instances I posted weren't cherry picked. They were just what I encountered when setting up a single laptop.
I could tell you about the issues I had with my work laptop, where it was pretty difficult to get the VPN solution we use at work running. We use Teams and Outlook for work; neither has an official Linux app, and the unofficial ones are really buggy.
Getting simple stuff like screen sharing to work under Wayland was basically impossible, which forced me to revert to X11.
And sure, you can say that it's all edge cases, and people shouldn't be running Linux with a GPU, on an old device, with Microsoft tools or do screen sharing.
But if you say all of these common use cases are rare edge cases that shouldn't really be done on Linux, then you aren't talking about a general purpose desktop OS any more.
Older hardware, and software made by companies with hostile or ignorant stances towards FOSS, are major contributors to at least some of the issues you mentioned. I can tell you that there is active development around solving some GPU/Wayland issues, but the limitations on what Linux can or can't do aren't fully the fault of Linux.
There is definitely room for improvement in Linux. The improvements in just the last three years show that it has progressed at a pretty brisk pace. Free and open, community-driven operating systems work toward the active needs of the community, so hopefully any issues or bugs you had got reported and you've actively checked up on them. I am making an assumption here, but if all of the issues you have had were extremely common, there would be every incentive to develop solutions for them.
There is some level of compromise that is needed when using proprietary software or hardware from hostile vendors or using some older hardware with Linux. This also goes for company supplied or required hardware/software. Linux might not be for everyone on every piece of hardware right now. The tradeoffs for having control over your hardware and software can sometimes be frustrating.
As for "blaming the users", I don't think I did that at all. I just feel like some folks prefer appliances over heavy machinery. That's personal preference. Sure, Linux should make onboarding as easy as possible, but in my opinion, the active pursuit of being #1 or #2 in desktop OS use is going down the wrong path. There is a certain type of person who chooses to go down the desktop Linux path and catering to their needs seems much more important for the long-term health of the OS.
I honestly don't see how Ubuntu installing the wrong driver is Nvidia's fault. Ubuntu has both the current and the legacy driver in their repos. When installing the OS it asks you whether you want to install proprietary drivers, which I agreed to, so it installed the non-legacy driver that does not work with the old GPU.
Also, when installing the correct driver on apt, there is no text prompt or CLI wizard (like in many other tools installed over apt) to actually load that driver into the kernel. I don't see how this is Nvidia's fault, as they haven't created the Ubuntu package.
That's squarely on the Ubuntu guys. That has literally nothing to do with hostile vendors.
I can't speak for Ubuntu or your situation as I don't have your issues and I don't use Ubuntu, but I would advise you reach out to the Ubuntu community with your issues and if you can't find a solution, file a bug report. They are a large community with a lot of engagement, so I would think that you might have luck either solving your problem or pointing their devs toward fixing the issue on their end. The squeaky wheel gets the grease.
Tbh, I spent enough time fixing it. I don't want to file bug reports and battle them through a bug tracker in my spare time. I do enough of that at work.
In my spare time I tend to use a computer to accomplish a task, not to fix the computer.
And that's where you hit the "average user" thing again. If an average user encounters a bug, they either live with the bug or give up. If you ask any of your non-dev friends how many of them have ever filed a bug report, I'd venture to say it's not going to be many.
And in the few cases where you actually see non-techy people filing a bug report, it's usually going to be on the level of "HELP! My PC is not working!" and the response will be "Closed for not following the template".
I don't know what to tell you. If you want to blame Ubuntu for your issues but you aren't willing to go through the standard process of troubleshooting or filing a bug report, maybe Linux isn't for you.
Dude, I am a dev, and I have been using Linux for over a decade. I spent hours fixing these issues. But we are talking about "The year of the Linux desktop" and Linux being used by people who aren't hardcore tech nerds or devs.
To quote from my edit that you probably didn't read yet:
And that's where you hit the "average user" thing again. If an average user encounters a bug, they either live with the bug or give up. If you ask any of your non-dev friends how many of them have ever filed a bug report, I'd venture to say it's not going to be many.
And in the few cases where you actually see non-techy people filing a bug report, it’s usually going to be on the level of “HELP! My PC is not working!” and the response will be “Closed for not following the template”.
The point I was trying to make is "Linux currently totally is not for everyone. There are a lot of issues that stop regular users from using Linux." I was downvoted for that, and people kept arguing about whose fault it is. And now you are coming to the conclusion that I was making in the beginning:
Maybe Linux isn't for everyone.
Because it currently totally isn't.
What really annoys me with this discussion (and I've had the exact same discussions hundreds of times) is that it always goes through the same cycle:
A: "Linux is great, everyone should use Linux right now, the only reason not everyone is using Linux is because they are too dumb to switch."
B: "But Linux has issues that are difficult for regular users, e.g. that you have to use CLI every once in a while."
A: "No, you never have to."
B: lists some specific instances where they had to
A: "But you are dumb for even doing that. Linux is not made for XYZ use cases"
B: "But XYZ use cases are pretty basic and most users run into random ones of these"
A: "If you think there are problems, maybe you are too dumb to use Linux"
B: "I fixed these issues, it was just a demonstration of where regular users get stuck"
Combine that with the really misguided notion that anyone who isn't a tech whiz must be dumb, because otherwise they would be a tech whiz, and it gets really absurd. People completely forget that there are lots of topics that even most tech whizzes have no clue about (e.g. medicine). I have ~20 years of experience as a dev. I know my way around the things I'm working with. I am the one everyone calls when they have issues with their computers. But I have no clue about fixing cars, medicine, or making clothes, just to name a few. And I am really grateful that when I go to my doctor, he treats me and doesn't tell me I'm dumb because I don't have a medical degree.
Okay buddy. You obviously didn't come here to do anything but grind your axe. 3% of desktop use is pretty cool, even if it's likely just a ton of Steam Decks. Anyway, have a nice time developing.
Well, fanboy, have fun pushing people out of Linux. People like you are a major part of the reason that it's still only 3%.
Just imagine that: There's a free OS that doesn't track you, doesn't serve you ads, you don't need the newest hardware to run it, it does almost anything that Windows does, and yet, it's got only 3 meagre percent of market share after 30 years.
You think that is because it's so easy to get into if you don't have a tech background?
Sadly, next to the technical hurdles and the bad UX, there is a really toxic community that is happy to shit on anyone who dares to say that Linux isn't perfect.
For example installing the GPU driver for an older GPU. Or installing the driver for an obscure printer, touchpad or other weird hardware.
That's not quite my definition of "common".
Average user doesn’t mean total noob. Installing Windows and the relevant drivers is something many users in the “Gamer class” can do.
The "Gamer class" is far from the average user; the average user doesn't even know what a GPU or a driver is and doesn't care. As long as the OS installs all drivers by default, or the OEM has preinstalled them, all is good.
Getting the same hardware to run under Windows meant downloading the .exe and running it.
Until there are no more drivers for that generation of GPU. The Windows 11 drivers for AMD only go down to the Vega 64; if you have a Fury X or a 7970, you're out of luck. Not that Windows 11 even lets you install it on a machine that old.
AMDGPU goes down all the way to GCN 1.2, which means you can even run a 7970 on a modern Linux OS. Even out of the box if your distro has the legacy flags enabled.
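The "legacy flags" mentioned here are kernel parameters that hand the older GCN cards from the old radeon driver over to amdgpu. On a GRUB-based distro, enabling them looks roughly like this (a sketch; verify the exact flags against your kernel's documentation before using them):

```shell
# Add to GRUB_CMDLINE_LINUX_DEFAULT in /etc/default/grub, then run update-grub:
#   radeon.si_support=0 amdgpu.si_support=1      # Southern Islands (GCN 1.0, e.g. HD 7970)
#   radeon.cik_support=0 amdgpu.cik_support=1    # Sea Islands (GCN 1.1)
```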
It would be fantastic if more hardware worked out of the box on Linux, but that's up to the manufacturers. Until more people switch to Linux, they won't bother, and until they bother, everybody complains that this or that doesn't work on Linux.
As of right now, the biggest hurdle is Nvidia, whose drivers aren't included in Linux. Without a distro that takes care of installing them, users are essentially out of luck.
That’s not quite my definition of “common”.
Using a GPU under Linux is not common? And installing Linux on old laptops isn't either?
As of right now, the biggest hurdle is Nvidia, whose drivers aren't included in Linux. Without a distro that takes care of installing them, users are essentially out of luck.
I can't say anything about AMD, since the last time I had an AMD GPU was ~15 years ago.
When I installed an Ubuntu variant on my G580, which has a Geforce 635M, it automatically installed the current driver for Geforce GPUs when I set up the OS, but that driver doesn't support the 635M. That one needs a legacy driver. And getting that to work was a major pain.
I first installed the legacy driver over apt, but it didn't do anything, because apparently installing the driver doesn't actually load its kernel module. So I loaded it manually, and it still didn't do anything. Turns out, uninstalling the original driver didn't unload its module either. So I had to re-install the old driver, unload its module, uninstall the old driver, install the legacy driver and load the legacy module. It took me a few hours to figure all of that out.
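Reconstructed from memory, the sequence was roughly the following. The package names are guesses and differ by Ubuntu release, so treat this as a sketch, not a recipe:

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction of the driver swap; package names are guesses.
swap_to_legacy_nvidia() {
    sudo modprobe -r nvidia                    # unload the active (wrong) module
    sudo apt-get purge -y nvidia-driver-470    # remove the non-legacy driver
    sudo apt-get install -y nvidia-340         # install the legacy driver package
    sudo modprobe nvidia                       # load the legacy module
    sudo update-initramfs -u                   # persist the change across reboots
}
# run manually with: swap_to_legacy_nvidia
```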
No way someone without CLI experience will be able to do that.
Using a GPU under Linux is not common? And installing Linux on old laptops isn’t either?
Installing drivers for an older GPU, obscure printer, touchpad or other weird hardware is not common.
When I installed an Ubuntu variant on my G580, which has a Geforce 635M, it automatically installed the current driver for Geforce GPUs when I set up the OS, but that driver doesn't support the 635M. That one needs a legacy driver. And getting that to work was a major pain.
Which is an issue with Nvidia; they have no drivers for that GPU for Windows 11 either. Not saying that this is not an issue, but there is absolutely nothing Linux can do to make every legacy GPU work without help from Nvidia. It uses the open source driver out of the box, which works sometimes but not for everything and definitely not for gaming.
Which is an issue with Nvidia; they have no drivers for that GPU for Windows 11 either
Not saying that this is not an issue, but there is absolutely nothing Linux can do to make every legacy GPU work without help from Nvidia.
Yes, they can. They literally have the correct (legacy) driver in the Ubuntu repo. But the autoinstaller installs the wrong driver during OS installation. And if you try to manually install the right one, there is not even a text prompt in the CLI saying "You just installed that driver, do you want to actually use it too? (Y/n)".
They could have even gone so far as to make a CLI wizard (like many other packages do) or even a GUI wizard. But no, the package just installs and does nothing by default.
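For illustration, even a ten-line post-install hook could ask. Here is a hypothetical sketch of such a prompt; nothing like this exists in the actual package, and the module name is an assumption:

```shell
#!/usr/bin/env bash
# Hypothetical post-install prompt; the real Ubuntu package does nothing like this.
prompt_to_load_driver() {
    read -r -p "Driver installed. Load the kernel module now? (Y/n) " answer
    case "${answer:-Y}" in
        [Yy]*) sudo modprobe nvidia ;;                            # load it right away
        *)     echo "Skipped. Load it later with: sudo modprobe nvidia" ;;
    esac
}
# a package could run something like this from its postinst script
```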
It uses the open source driver out of the box, which works sometimes but not for everything and definitely not for gaming.
Also that is not correct. All the *buntu installers ask you when you install the OS whether you also want to have closed source drivers installed, and then it installs the closed source Nvidia drivers. Just the wrong ones.
Yes, they do: https://www.nvidia.com/en-us/drivers/results/180339/
That driver does not list the 635M, only the desktop version. Which is still impressive, but I think they have separate drivers for their mobile chips? The latest driver listed for the 635M is only available for Windows 10 on their website.
Yes, they can. They literally have the correct (legacy) driver in the Ubuntu repo. But the autoinstaller installs the wrong driver during OS installation. And if you try to manually install the right one, there is not even a text prompt in the CLI saying "You just installed that driver, do you want to actually use it too? (Y/n)".
They could have even gone so far as to make a CLI wizard (like many other packages do) or even a GUI wizard. But no, the package just installs and does nothing by default.
Does ubuntu-drivers devices list the correct driver, or is the recommended one too new? The driver packages in Ubuntu should install and activate themselves unless you have multiple installed; it sounds like you ran into a bug.
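For anyone debugging the same thing, the check is quick. Guarded here so it is safe to paste on non-Ubuntu systems too:

```shell
#!/usr/bin/env bash
# Show what Ubuntu's driver tool detects and which driver package it recommends.
if command -v ubuntu-drivers >/dev/null 2>&1; then
    ubuntu-drivers devices    # lists detected hardware and the recommended packages
else
    echo "ubuntu-drivers is not available on this system"
fi
```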
Also that is not correct. All the *buntu installers ask you when you install the OS whether you also want to have closed source drivers installed, and then it installs the closed source Nvidia drivers. Just the wrong ones.
That does not change the fact that Nouveau is used by default by the installer itself, and by default for the OS if you don't select anything.