Do folks managing servers mainly do so via command-line interfaces?

ALostInquirer@lemm.ee to No Stupid Questions@lemmy.world – 128 points –

Or is that more of a stereotype, and there are some (maybe more?) out there using some form of graphical interfaces/web dashboards/etc.?

It's struck me as interesting how, when you look up info about managing servers, the guides primarily go through command-line interfaces/terminals/etc. It's made me wonder how much of that's preference and how much of it's an absence of graphical interfaces.


Software engineer here who works on web services. Most production-critical things in our workplace aren't managed by GUIs or command lines, but by code. There are usually some infrastructure-as-code tools involved, like Terraform, CDK, or Pulumi.

GUIs are often reserved for quick fixes and trying things out on staging servers (a practice derisively called "click-ops").

Chef, Puppet, Ansible, SaltStack, even Otter (for Windows).

For smaller servers/services, there are plenty of admins still getting their hands dirty in a shell (for instance, my home lab is a bare-metal hypervisor with a few hosts and a whole load of Docker containers).

But in business environments, especially those using cloud providers, infrastructure as code is king.

I'm looking to rise up from the hell desk. I have an enterprise-grade server sitting collecting dust at the moment (a heat issue: not with the server itself, it's just that the ambient temperature becomes uncomfortable with it running), but it's running Unraid at the moment and not much else.

Any suggestions on where to start with infrastructure as code?

Google “Terraform homelab” and read a few guides on how to use Proxmox, Terraform, Ansible, Puppet, cloud-init, Packer, etc.

A great starting point is being able to write some code that will consistently build a homelab setup, perhaps running a few useful services like Snapdrop, Pi-hole, OpenVAS, Etherpad Lite, etc. The goal is to be able to stand everything up and tear it down using Terraform and Proxmox (Terraform instructing Proxmox to create VMs, and Ansible configuring those VMs with what you need).

There are loads of similar solutions (such as Ansible and Puppet) so don’t be scared of trying a few different guides and wiping the server a few times along the way. It’ll give you a strong understanding of the various tools and, once you’ve done it a few times, you can land on your preferred setup and start building your own use cases for it all.

Hope this helps!

Much appreciated. Just needed some phrases to throw in a search engine to get started.

It is far from a stereotype, and most times it isn't personal preference either.

It is just about using the best tool for the job.

Many tasks can be done either way, but one of the ways is usually much faster, or repeatable/scriptable, or less prone to mistakes, etc.

GUIs are very limiting. You’re only able to do what the designer wants you to be able to. By using the terminal it’s much simpler to do more complicated tasks (once you’ve gotten past the learning curve).

Also since so many servers are headless (no display outputs) they’ll be remotely logged into, meaning there’s only a terminal to interface with the machine.
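As a small sketch of that flexibility: composing small tools in a pipeline is something few GUIs expose. The directory and file names below are invented for the demo, and `du -ab` assumes GNU coreutils.

```shell
# List the largest files under a directory by piping small tools together.
# /tmp/demo_pipe and its contents are made up for this sketch.
mkdir -p /tmp/demo_pipe
printf 'aaaa' > /tmp/demo_pipe/big.txt
printf 'a'    > /tmp/demo_pipe/small.txt
# du -ab prints the apparent size in bytes per entry (GNU coreutils);
# sort -rn orders numerically, largest first
du -ab /tmp/demo_pipe/*.txt | sort -rn
# big.txt (4 bytes) is listed before small.txt (1 byte)
```

Swapping any stage of the pipeline changes the whole task, which is exactly the composability a fixed GUI can't offer.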

This can be true. Part of the reason I ask is that as more data becomes visual in nature, it seems like it might be more difficult to manage strictly via CLI, especially since metadata is likely to be lacking in description, and even with a descriptive filename and details, it's a picture/video for a reason.

I'm sure there are existing arrangements to handle that though, like web GUIs for any visual media review as needed.

Can you give me an example? Sure, graphs are quick for spotting spikes and such, but outside a web UI like you mentioned, servers also usually have warning triggers. You know what's better than staring at a graph looking for a spike? Getting paged once a spike happens, with information on possible causes and the state of the server. That's very difficult to set up using GUIs, but almost trivial to do if you're okay with CLIs.
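A minimal sketch of that kind of trigger, assuming a made-up 90% threshold and a faked `df` output line so the demo is deterministic (a real setup would feed live `df` output into a monitoring tool that pages you):

```shell
# Faked df output line so the example is self-contained
sample="/dev/sda1 100G 95G 5G 95% /"
threshold=90
# Field 5 is the use percentage; strip the trailing %
usage=$(echo "$sample" | awk '{gsub("%", "", $5); print $5}')
if [ "$usage" -gt "$threshold" ]; then
  echo "ALERT: disk at ${usage}% (threshold ${threshold}%)"
fi
# prints: ALERT: disk at 95% (threshold 90%)
```

Drop that in cron or a monitoring agent and you get paged instead of watching a graph.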

It's more on the hobbyist end of things, but as an example I was thinking like if you had a server you're using to back up or store photos on, trying to parse it strictly via CLI doesn't seem like it'd be terribly useful.

You'd also want to view the images directly, I'd think, but I'm guessing in that situation you'd just use whatever web UI the software you're using might provide.

I have a server I use, among other things, to back up my photos. I don't understand what you mean by "parse", but I administer my photos through my file explorer as if they were on my computer, because I configured the server to expose a Samba share on the folder that has the pictures.

You got the gist enough despite the term I used & answered what I was wondering about (as did the other person replying), so I appreciate it! Parse was just another way of trying to say see the file in full, filename, additional metadata, & content. With visual media I'd think you'd have to do like you (& they) said, configure it to be opened via something else for a comprehensive review.

I tend to work with visual media more, so for me a CLI feels like more of a backwards approach to navigation & data management.

Yeah, makes sense. However, to set up a good structure for being able to do that, you'll need the CLI. For example, if you want programs that administer photos and allow you to create albums and set tags (I personally don't need that level of organisation, but if this is your main use), you might want to invest the time to set up something like Lychee or Piwigo, which are easily set up through Docker (which, if you're into self-hosting, you should learn).

Viewing the images directly sounds to me like a different context. Browsing the images is more akin to end user activity, i.e. using the server for its intended purpose. Managing the server is more like making sure it's running, that there is enough space allocated, security holes are plugged, software is up-to-date, etc. Administrative tasks. When wearing the admin hat, there wouldn't usually be much of a need to actually look at the photos - you'd be more concerned with file names and metadata, not contents. In that context, the GUI becomes less important. And if you ever do need to see them, you can always fire up the GUI software for that occasional situation.

It used to be 100% command line. Now it's 80% shell and 20% web portals.

No it's legit. Most servers nowadays are Linux so if you're working on a specific server you are using the command line.

It's way more efficient generally.

GUI can be great for quick specific tasks but you are limited by the features added by the software.

Speaking from ~20 years experience: Yes, mainly. But both GUIs and web dashboards are common and widespread. It varies wildly based on what type of server you're maintaining and what type of organization you're in.

If you run a custom Minecraft server via some online service, you'll be going through a web dashboard.

Typical corporate or government IT tends to be Windows/GUI based, but of course with as much automation as possible in the form of global policy settings.

Typical small web development shop tends to be via code configuration and web dashboards on rented hosts.

Typical SaaS-provider type company tends to be strictly command-line, but as little of that as possible, and tries to have everything (configuration, rules, deployment scripts, etc.) checked into a version control system.

It's extremely varied, but to my understanding it is correct that command line is most common.

At my work it's all command line or inside the code itself. No need to be scared of the CLI.

For a single new problem that hasn't yet been automated, I use CLI utilities to collect information that I then use to write code for a new automation.

I use web UIs to monitor metrics (Grafana), and I write custom exporters to collect metrics and logs that can reveal performance or potential issues.

Depends on the kind of server - Linux, yes, command line all the way. Windows (and Active Directory and other Microsoft stuff) you use GUI mostly in combination with some PowerShell scripts (often running on the "command line").

There are still people running Windows Servers? An Active Directory Domain Controller can easily be set up on Linux.

Yes. I personally find it much quicker running a few commands via SSH and editing Docker Compose files in a text editor than clicking around in some kind of web interface. It's also much easier asking for help, or helping someone else, if you can just send commands to execute instead of explaining different menus and buttons they need to go into.

You can't really do configuration management with a GUI. Or version control. Everything I do I manage with Ansible as much as possible. YAML is self-documenting as well. How much effort is 'run command with parameter' documentation vs explaining how to navigate a GUI?

You can't really do configuration management with a GUI.

Cries in Windows Server

How much effort is ‘run command with parameter’ documentation [...]

Tbh it's less the effort and more a question of how understandable the documentation is. A good GUI has the benefit of visual design explaining without words what might take a lot of documentation, and that documentation may or may not be easily understood depending on the writer, who in many technical situations is someone so deep in the jargon that they've forgotten the way back to more accessible language.

Once you're familiar with the commands, no doubt as many others have said it's more efficient (especially once you're knowledgeable enough to write scripts for frequent sets of commands), but there is a learning curve at play as one muddles through documentation in a similar way to an unfamiliar GUI.

If you understand that the terminal is better in the long run, then you've answered your own question: most people who fiddle with servers do so for a long time, so the time investment is worth it. A similar analogy is learning knife skills. If you just cook for yourself, being able to chop an onion in seconds saves you a minute a day from the one onion you use; not really worth it outside of being a neat party trick. But if you work in a kitchen, it's mandatory: chopping an onion in seconds saves you an hour across the 60 onions you chop in preparation for service. Same idea for GUIs/terminal, it has a higher learning curve but if you try to avoid the curve you'll never be able to do it fast, so the time investment is worth it if you're going to be doing this daily (like most server admins do)

Same idea for GUIs/terminal, it has a higher learning curve but if you try to avoid the curve you’ll never be able to do it fast, so the time investment is worth it if you’re going to be doing this daily (like most server admins do)

Yeah, part of the thinking behind this question was with those doing this more as a hobby in mind (e.g. self-hosters) where it's sort of a limbo. You may be doing it daily as part of your hobby, but also never on the level that really demands the degree of proficiency or efficiency you describe as you're not going to be casually handling a large network of servers (probably), so on one hand learning the CLI may simply be part of the fun, but on the other, it may also lean into overkill depending on what you're aiming to do.

I think it's still worth learning as a hobbyist, same as knife skills for someone who likes to cook. You don't need to be super proficient with a knife, but the basics will up your cooking by a lot; at some point you'll hit diminishing returns and stop learning, but the basics are almost essential. Same thing for the CLI: you don't need to become a master of the command line, but being comfortable around it will help you a lot. In other words, trying to run a server without a CLI is like trying to cook without a knife. Is it possible? Depending on what you're trying to do, yes, but in general you're shooting yourself in the foot. Just because a blender can replace a knife in some instances doesn't mean you can use it for all of the same things you would use a knife for. A GUI is the blender of servers: it makes some things easier, but it is not as versatile.

No, any documentation >>> GUI. A GUI relies on your previous experience with similar environments. Just jump into the project configuration GUI of Visual Studio (not Code) and see for yourself.

As others mentioned, it depends on the details. But anything routine should be done with CLI or code, because then it can be scripted/automated. The time savings for that adds up surprisingly quickly, especially when you consider the human errors that are avoided.

Personally, it's the power of PowerShell that I use for the hundreds of Windows servers, and the power of Linux Bash shell scripts for the dozens of Linux servers. None of the Linux servers run a GUI, so there are no options there. Tbh, for me a self-documenting GUI is the slowest way to do work. Configuring hundreds at once with peer-reviewed scripts and change control is much more effective, since the peer review and change control will be needed either way.

Oh, though I do use FortiManager a lot for configuring dozens of FortiGates. Only have a few scripts on it though.

Yes, I manage my Raspberry Pi Plex/Pi-Hole/OpenVPN server using command line because, well, it's Linux.

Also for work I manage Windows Server based systems and although this is a largely GUI based operating system, Command Prompt/Powershell are still tools I use daily to provide more control and depth over what I can do (bulk admin actions, for example).

I don't really use command line out of preference - moreso because it's the necessary tool for the job.

You can accomplish a lot through scripting - light programming. Anything you can type into a command line can also be executed by a script. This is the power of them. So when managing large arrays of resources, the potential for automation and scaling effort is big.
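To illustrate: anything you type interactively can be dropped into a file and rerun as a unit. The file name and service names below are invented for the sketch; a real script might call `systemctl is-active` instead of `echo`.

```shell
# Save an interactive command sequence as a script, then run it like any command.
cat > /tmp/check_services.sh <<'EOF'
#!/bin/sh
# Hypothetical service list for the demo
for svc in web db cache; do
  echo "checking $svc"
done
EOF
chmod +x /tmp/check_services.sh
/tmp/check_services.sh
# prints:
# checking web
# checking db
# checking cache
```

Once it's a file, it can be scheduled, version-controlled, and reused across machines.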

I don’t think all of that can be replaced with GUIs. When you consider that sysadmins are working with tons of different software from many different providers and trying to make it all work together, the command-line interface is really the common language they all share, even the bullshit legacy products that are still hanging around after 15 years of being outdated. You’d need extremely sophisticated GUI tools to be able to do everything, across all these products, at scale.

There are of course GUI tools for specific tasks, but the command line is the foundational tool.

Disclaimer: I know fuck all about Microsoft things. I only have some experience with *nix.

I work with embedded devices, and to some degree also servers. Graphical user interfaces are usually ignored as they just take up space and resources, and you can't even do half of what you can in the terminal (assuming you know your way around a terminal, but being a server admin, you really should).

Web interfaces are usually used for status pages, or for when anyone who isn't you or a fellow admin needs to do anything, as, you guessed it, they are very restrictive, so the other party hopefully can't do much damage.

I manage my Lemmy server via SSH/terminal so yes, command line.

Good exercise for my command line skills!

cd /

sudo rm -rf *

I only use CLI access over SSH to manage servers. Using a GUI is painful, especially working remotely. Even when I had to deal with Windows servers, I'd set up an SSH server on them and use PowerShell.

Ideally you don't even access the server manually over that, but use a management system like Ansible or Puppet to configure and manage it.

If you are managing linux servers and you need to go into them for whatever reason, then using ssh is common and you will use a terminal to do so. There are some web interfaces for other things like managing a virtual machine cluster for creating virtual machines, set up network configurations for the cluster, etc.

Any decent Linux admin will manage with the CLI, and any decent Windows admin can do it via PowerShell, but some functions still need the GUI.

There are some applications which only get managed via webUI

Infrastructure as Code has enough of a footprint now that you can manage many servers and applications via code. Ansible, Terraform, Chef, and Puppet are the big names in the IaC space.

Command line all the way! I have a few web UIs I use to make some tasks easier for myself and a few other users, but all my setup and maintenance is done in the command line; often the things I need to use aren't even easily available in any other form unless I have the desktop version of the OS running.

Both, actually. 95% web-GUI, and if something goes wrong, a bit of ssh.

What? no. Automation automation automation. Have your ansible, chef, or puppet ready. Where possible, have your terraform or equiv. ready. Python and bash scripts. Only use web portals where no automation solution exists.

This I'd do if I had a room full of servers and a team that would tend them 24/7.

I have a real job to do, and take care of one server on the side if necessary.

It depends.

My personal servers are a mix of the two. I have a Synology NAS that I manage through a web-based GUI. Sometimes I’ll dip into command line via SSH, but not very often.

I have two more lower-power Linux servers that I manage through command-line primarily. They don’t have many system resources, so I want them to have as much available as possible to serve things.

For Windows servers I use GUI management most of the time.

It's mostly about using every last one of the CPU cycles efficiently, which is old-school thinking from the days when 640K was supposed to be enough for anyone. When I was a wee tyke in the 8-bit era we had machines that did graphics, but they were for launching games from a terminal/console.

There are many servers with GUIs, primarily Windows servers, and there's probably a web or GUI interface available for every useful service you can run, which will be handy in some contexts. But there's a kind of speed and simplicity you can get with a good console that no GUI can touch. Hard to explain unless you've done some of the work.

I've seen both. But typing in a command, or writing it into a script or configuration file and then activating it, for 80 servers is way faster and easier than logging into every one and clicking through menus and buttons with the mouse. Or you have maintenance every 2 weeks and it's super easy to launch a prepared script instead of dragging and dropping files with the mouse for hours.
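The fan-out described above can be sketched as a plain shell loop. The host names are invented, and the real `ssh` call is left commented out so the sketch is a harmless dry run:

```shell
# Push one change to many hosts from a single loop.
for h in web01 web02 web03; do
  # ssh "$h" 'sudo systemctl restart nginx'   # what a real run would do
  echo "restart nginx on $h"                  # dry-run placeholder
done
# prints:
# restart nginx on web01
# restart nginx on web02
# restart nginx on web03
```

Scaling from 3 hosts to 80 is just a longer list (or a file of host names), which is the whole point.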

Also, people vastly overestimate how difficult it is to use the command line. It needs a certain amount of knowledge, true. But it is not a big deal compared to knowing the concepts behind your server, how a web server or the network works, etc.

Linux servers usually don't have the graphical interface installed, because nobody bothers and it'd be a waste of space and time.

For my Debian server in the basement of my house, yes. I rarely log into it directly, and just ssh in from my desktop or my laptop.

Same goes with the server I rent for my website. I always use command line for that.

You definitely need cli for some stuff at least. Contrary to popular belief, cli is actually much easier for accessing and managing stuff. So most sysadmins and devops use cli at least to some extent.

Most servers and server providers only provide SSH access to manage stuff. You can get some GUI in more advanced panels to, for example, set up the firewall, add SSH keys, or open and close ports, and you might expect a Docker manager of sorts in some places. But since almost anything you can do with a GUI you can do with a CLI, providing a GUI is considered an extra benefit.

Some tools used are GUI-only though. They certainly use some sort of CLI stuff behind the scenes, but you can't interface with their functions without the GUI. You certainly could do the same stuff with coding and running commands, but why bother when the tool is decent and gets the job done.

All in all, it comes down to preference and, more important than that, necessity. If you are an expert with CLI usage and have a good memory or a cheatsheet, the CLI is mostly preferable to a GUI. The CLI is much more standardized: there are no design changes, and while commands might change, most of the time they don't. In a GUI you mostly get less data, but you can get charts, so in analysis mode a GUI would be preferable.

There is no rule to follow, but since most stuff is only done using the CLI, you see it being used more often. Some applications are implementing better GUIs, and some GUIs interface with a lot of application CLI output, making it much easier to understand what's happening, so you might get to see GUIs in action more often. You might have seen Grafana, for example, in a bunch of movies. But I guess it doesn't give the same hacker vibe as a dude with 50 terminals of fast-scrolling text. Which is useless, but there are CLI apps to do that as well.

I support a large number of Windows servers, and the answer is it largely depends on what I'm doing. If I'm doing a file restore, I find navigating folder structures a pain in the ass from a CLI, so I'll just use the GUI.

If I'm doing something a bit more complicated, like running AD reports, it's easier to just use PowerShell.

Depends. I like to use a network management system to give an overview and mail notifications.

There are performance tools that are more detailed and have web interfaces.

For Linux there is a limited amount of software that has web interfaces, and even when it does, most documentation is written with the CLI in mind.

GUIs are a useful tool to give common users an accessible way to understand how they're interacting with technology. They're best utilized, I think, when the point of the technology is to be viewed. Imagine you had to browse and launch Netflix shows in a terminal; that would be incredibly annoying.

In contrast, it's really goddamn slow to perform routine server maintenance and commands through a GUI. I save so much time managing my servers through CLI, scripting, and automated tasks. Almost everything I'm dealing with is text-based information.

What I miss in the CLI, is proper structure in the text.

For example, a good IDE will list all problems found with your code, ordered in a sensible way. If I then click in a problem, it expands, and I get to see the full text description of that error. I can then click on a file/line combo, and be directed immediately there.
If I run CMake directly, I get kilobytes of error message dumped into a single blob of text.
Colors, line prefixes, and separator lines attempt to bring some structure into this mess, but it still remains a big wall of text.
It takes less effort for me to process the data presented to me by a good IDE, because it is organized and structured.
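Even without an IDE, some of that structure can be recovered by filtering the wall of text. The sample lines below are invented stand-ins for real CMake/compiler output; a real run would pipe the build command's output through the same filter:

```shell
# Keep only the error lines from a noisy build log (sample text invented).
printf 'note: expanded from macro\nerror: x undeclared\nwarning: unused y\nerror: missing semicolon\n' \
  | grep '^error'
# prints:
# error: x undeclared
# error: missing semicolon
```

It's not a clickable problem list, but it turns kilobytes of blob into the two lines that matter.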

Same thing with git commits/branches/tags.

Same thing with diffs/merges.

Almost ALL text data can be organized in some way. But most text data is not big or common enough for it to be worth our time and effort to structure it that well. I therefore see GUIs as tools for when you are doing something so commonly done that the effort of structuring the data was worth it.

Using a command-line interface is much easier most of the time compared to a GUI. One of the reasons I can think of is that it's easier to cram a lot of options into commands than it is to make an interface for them; have a look at nmap, for instance, where the menus inside the GUI are clunky and fairly hard to read. Another advantage of the CLI is that you can use it for scripting, which is part of the basic skill set of many sysadmins. For a lot of processing (like CSV files) it's also much faster to pipe commands together than to go through Excel, for instance. It can be intimidating at first, but the man pages and sometimes a bit of internet research can go a long way.
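A sketch of the CSV point, with invented sample data: counting rows per value of the second column takes one pipeline instead of a spreadsheet session.

```shell
# Count how many rows share each value of the 2nd CSV column.
# cut extracts the column, sort groups the values, uniq -c counts them,
# and the final sort -rn puts the biggest count first.
printf 'alice,admin\nbob,user\ncarol,admin\n' \
  | cut -d, -f2 | sort | uniq -c | sort -rn
# "admin" (count 2) is listed first, then "user" (count 1)
```

Each stage is a standard coreutils tool, so the same pattern works on log files, inventory exports, and so on.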

I'm a mix of each. If I can do it faster via a GUI then I use the GUI; if it's faster via the command line then I use that. Those hardcore command-line guys can enjoy typing 20 commands in when 4 button clicks can achieve the same thing 🤷

Yeah but usually those 4 buttons are running 20 command lines in the background so it takes just as long; may as well customize any of the commands to your liking while you're taking the same amount of time.

I think you kind of missed the forest for the trees here. Those 20 command lines that run in the background will take the same amount of time to complete each task as manually entering them, sure. But the benefit is that you take less time overall, because I don't care how fast you are at typing, you still have to enter 20 commands instead of just the 4 button clicks.

The main reason why I prefer to take the time to write the commands is so I can save a note of what commands I ran to get it working. Being able to say "these are the commands" is much easier when I'm either helping someone repeat my work or auditing to figure out what I broke.

If you're typing them all out by hand every time then sure, but if it's a repeatable task then just make an alias for it, and make liberal use of &&.
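For example, a repeatable `&&` chain stops at the first failure, and naming it makes it a one-word command. It's shown here as a shell function rather than an alias, since aliases only expand in interactive shells; the paths and the task itself are made up:

```shell
# Name a repeatable && chain so it can be rerun with one word.
deploy_demo() {
  mkdir -p /tmp/demo_site \
    && echo ok > /tmp/demo_site/status \
    && cat /tmp/demo_site/status
}
deploy_demo
# prints: ok
```

If any step fails (say `mkdir` hits a permissions error), the chain halts instead of blindly running the rest.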

I have one home server I manage only through VNC.

It can depend on the server OS, if you run a barebones Linux server then you’ll be using command line a lot but Windows servers look similar to windows desktop (GUI wise)

Linux servers can also have GUIs (unRAID, TrueNAS, Proxmox).

Depends on what I'm doing with it. I usually don't really need to do anything particularly specific or complicated, so running everything off of a command line works for me. It's more lightweight, and is less fiddly than having to set up a graphical remote desktop.

If it's something more complicated, or more difficult to automate, then not using the command line would make more sense.

I'm running Debian on my home server and I do a lot via SSH from my desktop. I used to host a Minecraft server back when it was in alpha, so I learned a lot about the CLI. I do have Xfce set up in case of an "oh shit" moment though, because I tend to do stupid shit.