2026-03-09
Last time I talked about my attempt at building a home server and the motivation behind it, I did a pretty bad job of it. I have now had the server for almost two months, so I would like to expand on that.
I am not someone paranoid about the control cloud service providers exert over us. I have always enjoyed the convenience that comes with them. As the schedules of the 21st century get busier, there is less and less time to manage every detail of our daily lives. Even if you are very tech-savvy, I would not expect you to self-host every service you depend on just because you can. Most of the time, the cost is simply too high to be worth the loss of convenience, and it is perfectly understandable when someone chooses to delegate that technical overhead.

I have paid for subscriptions and managed tools all my life. I am still not averse to that idea, as long as the option lets me actually use the service I paid for without disrupting my user experience or my digital rights. It is a very simple ask from a consumer's point of view, but unfortunately, it is becoming less and less common. I have previously talked about the enshittification of Notepad, Windows, and dozens of other cloud services that I have depended on for a good part of my life. The reasons vary from product to product, but no policy should allow a company to treat its customers as a product, a digital hostage, or a cash cow.
I am hardly the first person to talk about this, or even part of a minority. An ever-increasing number of people are joining this new wave. The more control we hand over to entities whose only goal is to generate revenue, the greater the need to spread awareness of how to become independent of them whenever that becomes necessary.
That said, don't just jump ship because everyone else is doing so. You might have perfectly valid reasons to care less about privacy and ownership; everyone has different priorities. I started out by self-hosting Overleaf. I needed to compile large documents without hitting timeouts and to collaborate with people who have limited knowledge of version control, and I did not consider that convenience worth the price of the premium plan. It took me less than an hour to set up the service and send the link to my instance to others. That hour, plus the electricity bill from keeping my old desktop running 24/7, ended up being the only costs, and those I am happy to pay.
So what is actually in the server? Why am I yapping so hard?
This is the most important one of all to me. I do not take a lot of photos. Yet, over the years, my photo collection has grown so large that I would need to either a) pay for a cloud solution (e.g., Google Photos; the free tier does not cut it anymore) or b) slum it out by making multiple accounts so that I can keep using Google's service. Paying is not a bad option. However, the costs accumulate over time, and it does not give me the absolute privacy that I want. The only way to make my photos truly private is to keep them with me. Which brings me to Immich, a near drop-in replacement for Google Photos. My instance has the machine learning and facial recognition services disabled, as I don't need them, and it runs pretty lightweight without those.
It has been a while since Pocket was discontinued, and Readeck serves as an almost perfect alternative. I can always save a page and come back to it whenever I want, from any device. With how disorganized and forgetful I am about my readings and content consumption, it helps to have a tool like it do the work for me.
I have had very weird experiences with feed readers. I switched away from a bookmarks bar because it was ugly; I cannot stand that little ribbon under my address bar. I then moved to Feedly and used it for a few years, but the experience was never smooth, and something about it always irked me. I even built an RSS reader myself, but got kinda tired of maintaining it because I had to keep adding tidbits here and there to deal with quirky edge cases. FreshRSS has been working well for now. I am not fully pleased with it yet, and the UI bugs me sometimes, but it makes up for that with a good chunk of customization options, so I'm willing to stick with it.
I have already talked about Overleaf. It does not see consistent usage like the services above, but it's always good to have in my pocket. I have been out of academia for a good while, but it will come in handy if I do decide to go back.
Jellyfin is my media server. It lets me stream all the movies and TV shows that I have collected to all my devices. I have always had strong opinions about the acquisition and collection of digital media, and Jellyfin lets me keep it that way. Nothing more to see here.
Another one like Overleaf: it does not see much usage, but it comes in handy whenever I want to work with PDF files, and it makes handling confidential PDFs easier since I don't have to upload them to some random cloud server.
Serves as my local copy of the git repos I work with. I admit it does not really add anything useful to how I work, but it's kinda fun to have it running; I don't know why.
I have been very on and off with n8n, dating all the way back to my last job. I used to have workflows that I depended on regularly, and it was also part of the tech stack our management used at my old office. My usage has definitely dropped since then, but I still reach for it whenever I run into a problem I already have a workflow for. I have decided to get into it again, because the boilerplate built into it and the premade templates it provides make life so much easier. Also, having my own instance means I do not have to pay for it after the 14-day trial.
My friends and I used to play a lot of Valheim during the Covid era. I was not into self-hosting enough back then to get another machine and host a dedicated server on it. Instead, one of us would host a server from their game client, and the rest would connect to it. That came with the inconvenience of the session being unavailable whenever that one person was offline, and the server's performance depended on their hardware and network. Recently, I have been trying to get more people into Valheim, so it helps to have a dedicated server already set up. People can log in even when I am out of the house, the network latency is consistent, and the performance is good for everyone.
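For anyone wanting the same setup, a dedicated server can be sketched with the community lloesche/valheim-server Docker image. This is a sketch under assumptions, not my exact invocation: the server name, world name, password, and host paths below are placeholders, and you should check that image's documentation before trusting my flags.

```shell
#!/bin/sh
# Sketch of a containerized Valheim dedicated server using the community
# lloesche/valheim-server image. SERVER_NAME/WORLD_NAME/SERVER_PASS and the
# host paths are placeholders; Valheim itself talks over UDP 2456-2457.
set -eu

# Bind-mount directories for the server config and world saves.
mkdir -p "$HOME/valheim/config" "$HOME/valheim/data"

# Guarded so the sketch is a no-op on machines without Docker.
if command -v docker >/dev/null 2>&1; then
  docker run -d --name valheim --restart=unless-stopped \
    -p 2456-2457:2456-2457/udp \
    -v "$HOME/valheim/config:/config" \
    -v "$HOME/valheim/data:/opt/valheim" \
    -e SERVER_NAME="my-server" \
    -e WORLD_NAME="my-world" \
    -e SERVER_PASS="change-me" \
    lloesche/valheim-server || true
fi
```

With `--restart=unless-stopped`, the world stays reachable whenever the server box is on, regardless of who is playing.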
I have previously talked about the necessity of archival in some depth. A part of my storage is dedicated to serving that purpose; I have been seeding a few hundred gigabytes of Linux ISOs since the birth of this server. It might not amount to much compared to the total size of the data that needs archiving, but as I have said before, I believe every part counts.
I use Portainer for managing my containers, which includes restarting, debugging, and updating them. It beats what I used to do: managing all of them manually like a caveman. For DNS and for accessing my server from outside my home, I set up Tailscale on the server and my devices, so I am not out of reach even when I am not physically in range of my router.
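For reference, getting those two admin pieces running looks roughly like this. The Portainer invocation mirrors its documented quick-start; the guards, the log file, and its path are my additions, so treat this as a sketch rather than my exact setup.

```shell
#!/bin/sh
# Admin-layer sketch: Portainer to manage containers, Tailscale to reach the
# server from outside the LAN. Each step is guarded and logged so the script
# degrades gracefully on machines without the tools and is safe to re-run.
set -eu
LOG="${ADMIN_LOG:-$HOME/admin-setup.log}"

if command -v docker >/dev/null 2>&1; then
  docker volume create portainer_data >/dev/null 2>&1 || true
  # Port and volume choices follow Portainer CE's quick-start defaults.
  docker run -d --name portainer --restart=always \
    -p 9443:9443 \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v portainer_data:/data \
    portainer/portainer-ce >/dev/null 2>&1 || true
  echo "portainer: attempted" >> "$LOG"
else
  echo "portainer: skipped" >> "$LOG"
fi

if command -v tailscale >/dev/null 2>&1; then
  tailscale up || true  # first run prints an auth URL to approve the device
  echo "tailscale: attempted" >> "$LOG"
else
  echo "tailscale: skipped" >> "$LOG"
fi
```

Once the device is approved on the tailnet, the server is reachable by its Tailscale name from any of my devices, no port forwarding needed.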
What about backups, you say?
Well, up until yesterday, there was nothing to talk about. I was showing my setup to my brother-in-law, whose job is maintaining infrastructure at a large software company. He naturally brought up the DR mechanism I had implemented, and I had to disappoint him a bit by admitting there was none; the server was just running by itself. That is no longer the case, though. This morning, Claude Code and I cooked up a poor man's DR system.
The first layer is a backup of the compose and configuration files. The container directory is now a Git repo that tracks my example configuration files and the YAML files.
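A minimal sketch of that first layer, assuming the services each live in their own folder under `~/containers` (the path and the ignore patterns are my guesses, not the exact setup):

```shell
#!/bin/sh
# First-layer sketch: version the container directory, tracking compose files
# and example configs while keeping live secrets out of the repo.
# CONTAINER_DIR and the ignore patterns are assumptions.
set -eu
DIR="${CONTAINER_DIR:-$HOME/containers}"
mkdir -p "$DIR"
cd "$DIR"
[ -d .git ] || git init -q

# Live .env files hold secrets; only committed *.example copies are shared.
printf '%s\n' '.env' '*.secret' > .gitignore

git add -A
# Inline identity so the sketch works on a fresh box; the commit may be a
# no-op when nothing changed, hence the || true.
git -c user.email=server@example.com -c user.name=server \
  commit -qm "track compose files and example configs" || true
```

Run nightly or on change, this gives a full history of every config tweak, which is most of what is needed to rebuild the stack from scratch.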
The second layer holds the database dumps and the Docker volume backups for each service. Each night, my backup script takes a snapshot of each service, puts it in a tarball, and saves it. I set up rclone to push one copy to my laptop and another to a Google Drive folder.
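The nightly job looks roughly like the sketch below. The container name, the rclone remote names (`laptop:`, `gdrive:`), and the paths are placeholders for whatever the real script uses, and each external tool is guarded so the sketch degrades gracefully.

```shell
#!/bin/sh
# Second-layer sketch: per-service database dump + one dated tarball of the
# service data, then two rclone pushes off the box. Names are placeholders.
set -eu
STAMP=$(date +%F)
DEST="${BACKUP_DIR:-$HOME/backups}/$STAMP"
SRC="${DATA_DIR:-$HOME/containers}"
mkdir -p "$DEST" "$SRC"

# 1) Database dump (container name is hypothetical; skipped or empty if
#    Docker or the container is unavailable).
if command -v docker >/dev/null 2>&1; then
  docker exec immich_postgres pg_dumpall -U postgres \
    > "$DEST/immich.sql" 2>/dev/null || true
fi

# 2) Snapshot the service data directories into one dated tarball.
tar -czf "$DEST/volumes.tar.gz" -C "$SRC" .

# 3) Push copies off the box (remote names are assumptions).
if command -v rclone >/dev/null 2>&1; then
  rclone copy "$DEST" "laptop:backups/$STAMP" || true
  rclone copy "$DEST" "gdrive:server-backups/$STAMP" || true
fi
```

Keeping the date in the destination path means each night lands in its own folder, so yesterday's backup is never overwritten by today's.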
The third layer, which I have not brought to fruition yet, is supposed to cover the actual media I am hosting, i.e., my photos, movies, and music. I plan to periodically (every week or two, maybe?) back them up to an offsite drive. With the rising price of storage, I have put the thought of setting up a RAID array on hold for now.
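Since that layer is still only a plan, here is one way it could look, with all paths as placeholders: a periodic one-way mirror of the media libraries to a mounted offsite or external drive, with a log so missed runs are visible.

```shell
#!/bin/sh
# Third-layer sketch (planned, not live): mirror the media libraries to an
# offsite/external drive every week or two. All paths are placeholders, and
# the sync only runs when the tool is installed and the drive is mounted.
set -eu
SRC="${MEDIA_DIR:-/srv/media}"
DEST="${OFFSITE_DIR:-/mnt/offsite/media}"
LOG="${LOG_FILE:-$HOME/media-backup.log}"

echo "$(date +%F) start" >> "$LOG"
if command -v rclone >/dev/null 2>&1 && [ -d "$SRC" ] && [ -d "$DEST" ]; then
  # One-way mirror; --backup-dir parks deleted files instead of losing them.
  rclone sync "$SRC" "$DEST" --backup-dir "$DEST-trash/$(date +%F)" || true
fi
echo "$(date +%F) done" >> "$LOG"
```

Wired to a cron entry such as `0 3 */14 * *`, this lands close enough to the "every week or two" cadence.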
So that one is on the list of immediate to-dos for the server. The other is to get a new device into the cluster and offload some of the burden onto it. I have been eyeing an old ThinkPad that has been sitting on Facebook Marketplace for pretty cheap. If I get it, I will probably look into setting up Pi-hole (network-wide ad blocking), Nextcloud (primarily for its office suite), and LanguageTool (self-hosted proofreading) on it. Currently, LanguageTool only runs on my local machine, and I have been thinking of making it available to all the devices on the network.
Multiple people have told me to mess around with Proxmox on the server, and I have wanted to experiment with NixOS for some time. So, if I can get my hands on a new machine, one that I can treat as disposable, I will try out one of those two.