Thinking of getting into self-hosting, but I'm a complete noob
I think of myself as technically inclined. I have installed Linux multiple times and have basic command line knowledge, and I've programmed in many languages, with the most experience making a static website game using HTML/CSS/JS.
Additionally, I own the superspruce.org domain (my registrar is Dynadot), but I don't really know how to wield the power of owning a domain. I also have some spare computers to use for hosting: a 2009 laptop running Lubuntu and a 3900X + 32 GB RAM desktop running KDE Neon. I'm also open to experimenting with cloud hosting (I know, sacrilege here).
However, I don't know much about TCP/IP or other networking protocols. I'm happy to learn, but the curve would need to start gently.
I would want to try hosting my websites, and also a personal non-federated Lemmy instance to serve as an archivable forum for my games. Even if it's not very useful, it's great experience.
Seems people are already making great recommendations.
Personally, I also use docker-compose, a WireGuard VPN, and an Nginx server to proxy and SSL-terminate all my services, exposing them either over the VPN or to the Internet.
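If it helps to see what that looks like, here's a rough sketch of the kind of Nginx server block I mean; the subdomain, upstream port, and certificate paths are placeholders you'd swap for your own:

```nginx
# Sketch of one reverse-proxied service; names, ports, and paths are examples only
server {
    listen 443 ssl;
    server_name cloud.superspruce.org;   # a subdomain you point at your server

    ssl_certificate     /etc/letsencrypt/live/cloud.superspruce.org/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/cloud.superspruce.org/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:8080;   # whatever port the container publishes
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```

Nginx terminates TLS and forwards plain HTTP to the container, so each service only ever listens on localhost.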
Just host whatever you like/need. Something like Nextcloud is probably a solid start.
Btw, I haven't seen it mentioned here, but "awesome" lists are a thing on GitHub where people collect various gems for certain categories. Here's the one for self-hosting, which I've used extensively and really like: https://github.com/awesome-selfhosted/awesome-selfhosted
Also, when you start hosting more than 2-3 services, keeping them up to date can become a hassle that's easy to forget. For docker-based hosting I'd recommend you set up Watchtower, which can keep your services up to date for you.
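A minimal compose entry for it looks roughly like this; the schedule is just an example, not a recommendation:

```yaml
# docker-compose.yml snippet; the schedule below is an arbitrary example
services:
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # lets it see your other containers
    command: --cleanup --schedule "0 0 4 * * *"     # check daily at 04:00, prune old images
```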
I would caution against automatic updates! Notifications, yes absolutely. But automatically updating things is a great way to have things break suddenly when you're not in a good place to troubleshoot.
Probably depends. There are some services which I know are rock solid and have never failed me when auto-updating in 2+ years now.
However, you are right that it can cause issues. I totally forgot to mention monitoring, my bad. A service like Uptime Kuma is really worth having for that reason; for a few services, that really saved me a few times, tbh. I set it up to broadcast status changes to a Telegram channel and a dedicated mail inbox, but it can seemingly notify through any and all services under the sun.
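If you want to try it, it's a single container; something along these lines (the host port and volume name are up to you):

```sh
# One-off run; volume name and host port are arbitrary choices
docker run -d --restart=always \
  -p 3001:3001 \
  -v uptime-kuma:/app/data \
  --name uptime-kuma louislam/uptime-kuma:1
```

The Telegram and mail notification targets are then configured in its web UI.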
If you only have a handful of services, manual updating is good for learning and can prevent problems because you see them immediately. But once you have so many services that you can't possibly update them all, or can't find the motivation to, it's better to let them auto-update and have a service like Uptime Kuma notify you if something goes wrong, imo.
As with all things in the world, it's a matter of striking the right balance of tradeoffs.
Don’t make anything accessible via the internet if you’re new and starting out. The last thing you want is to accidentally leave a port open, leave an admin page with a default guessable password, or a piece of vulnerable software running and have someone gain access to your local network.
Start locally and learn the basics following the excellent advice of others here, and slowly build your knowledge until you understand the various moving and connecting pieces.
If you want to expose some service to the Internet, first learn how to install and correctly use a VPN server (I use WireGuard, which I find pretty easy); otherwise, keep everything on the LAN until you're confident enough to expose it.
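To give you an idea of the learning curve, the server side of WireGuard is a single short config file. A minimal sketch, where the keys and addresses are placeholders you generate yourself:

```ini
# /etc/wireguard/wg0.conf on the server; bring it up with `wg-quick up wg0`
[Interface]
Address    = 10.0.0.1/24            # the VPN's internal subnet
ListenPort = 51820                  # the only port you forward, and it's UDP
PrivateKey = <server-private-key>

[Peer]                              # one [Peer] block per client device
PublicKey  = <client-public-key>
AllowedIPs = 10.0.0.2/32
```

Only that single UDP port is exposed to the outside, which is why it's a much safer front door than opening each service directly.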
Some things, or points, to consider.
good luck have fun!
I see a number of comments to use a virtual server host, but I have not seen any mention of the main reason WHY this is advisable... If you want to host something from your home, people need a way to reach you. There are two options for this -- use a DDNS service (generally frowned upon for permanent installations), or get a static IP address from your provider.
DDNS means you have to monitor your public IP address for changes, send out updated records, and wait for those changes to propagate across the internet. That generally means several minutes or more of downtime where nobody can reach your server, and it can happen at completely random times.
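In practice that monitoring is usually a small cron script along these lines; the update URL here is a made-up placeholder, since every registrar and DDNS provider has its own API:

```sh
#!/bin/sh
# Sketch of a DDNS updater run from cron; the update endpoint is hypothetical
CURRENT=$(curl -s https://ifconfig.me)        # ask an external service for our public IP
CACHED=$(cat /var/cache/last-ip 2>/dev/null)
if [ "$CURRENT" != "$CACHED" ]; then
    # substitute your provider's real update URL and credentials here
    curl -s "https://ddns.example.com/update?hostname=superspruce.org&ip=$CURRENT"
    echo "$CURRENT" > /var/cache/last-ip
fi
```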
A static IP is reliable, but it costs money, and some providers won't even give you the option unless you get a business-class connection, which costs even more. However, this cost is usually already rolled into the price of a virtual machine.
Keep in mind also that when hosting at home, simply using a laptop to stay online 24/7 is not enough; you also need a battery backup for your network equipment. You will want to learn about setting up a firewall and some kind of IDS to protect the front end of your services, though for starting out you can host these on the same machine as your other services. And if you really want to be safe, set up a second internal machine that you can perform regular backups to, so that if your machine gets hacked you have a way to restore the information.
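As a concrete starting point for the firewall part, something like ufw keeps the rules readable; the LAN range below is an assumption, so use your own:

```sh
# Default-deny inbound, allow outbound, SSH only from the local network
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow from 192.168.1.0/24 to any port 22 proto tcp
sudo ufw enable
```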
My first server was online for two whole weeks before someone blew it up. Learn security first, everything after that will be easy.
I actually use a dynamic IP and it works pretty well for me; I don't remember having any issues because of it. Also, what happened to your server after those two weeks, and how? I've been running my things for over 3 years and I haven't done anything special in terms of security.
This was back in '99 and I didn't know much about linux (or servers) at the time, so I'm not exactly sure what they did... but one morning I woke up and noticed my web service wasn't working. I had an active login on the terminal but was just getting garbage from it, and I couldn't log in remotely at all. My guess was that someone hacked in, but hacked the system so badly that they basically trashed it. I was able to recover a little data straight from the drive, but I didn't know anything about analyzing the damage to figure out what happened, so I finally ended up wiping the drive and starting over.
At that point I did a speed-run of learning how to set up a firewall, and noticed right away all kinds of attempts to hit my IP. It took time to learn more about IDS and to not be too reckless in setting up my web pages, but apparently it was enough to thwart however that first attacker got in. Eventually I moved to a dedicated firewall in front of multiple servers.
Since then I've had a couple instances where someone cracked a user password and started sending spam through, but fail2ban stopped that. And boy are there a LOT of attempts at trying to get into the servers. I should probably bump up fail2ban to block IPs faster and for longer when they use invalid user names, since attacks these days come from such a wide range of IPs.
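For anyone wanting to do the same, that tweak lives in /etc/fail2ban/jail.local and is only a few lines; these thresholds are just an illustration of the direction, not a recommendation:

```ini
# jail.local overrides for the standard sshd jail; values are illustrative
[sshd]
enabled  = true
maxretry = 3      # ban after 3 failed attempts instead of the default 5
findtime = 10m    # ...within a 10-minute window
bantime  = 1w     # and keep the ban for a week
```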
Interesting. I guess security wasn't that good by default back then; firewalls are now set up by default on pretty much every server distro.
There was no such thing as a default firewall, but even now when I set up a new Debian machine there are no firewall rules, just the base iptables installed so you CAN add rules. Back then we also had insecure things like telnet installed by default and exposed to the world, so there's really no telling exactly how they managed to get into my machine. It's still good to learn about network security up front rather than relying on any default settings if someone is planning on self-hosting.
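To make that concrete: with bare iptables, a minimal default-deny inbound policy is only a few rules. A sketch; run it from a console rather than over SSH, in case you lock yourself out:

```sh
# Allow established traffic, loopback, and SSH; drop everything else inbound
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j ACCEPT
iptables -P INPUT DROP
```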