Building a Little Home Server

Why would you…

The first question I have to answer is why it’s worth spending money on another computer.

Well, it’s actually part of a long-standing issue I’ve been having with my desktop computer. I need multiple operating systems, and I can’t use VirtualBox or similar Type 2 (“hosted”) hypervisors due to their performance limitations.

In addition, I needed a storage solution that works with me and not against me. For the longest time I’ve been using internal hard drives, and getting data off of them means copying to a USB drive or uploading to something like Google Drive. And once the internal drives are full, you can’t simply grow them.

The Inception

One day, while searching YouTube, I started looking at videos on NAS and RAID, two terms I often hear brought up when people talk about expanding their PC storage. I didn’t know much about either at the time.

NAS & RAID

  • NAS: Short for Network Attached Storage. Think of it as a computer that stores data on your local network, like a giant USB drive over Ethernet.
  • RAID: A “storage virtualization technology” that makes multiple disks look like just one. It comes in many configurations. RAID-Z is cool because if a drive fails, its data can be rebuilt from the parity stored in the same VDEV (see the quick capacity sketch after the diagram below).
RAID-Z diagram (ZFS components). This setup has 2 VDEVs; one of the three drives in each VDEV is a parity drive.
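
To make the parity idea concrete, here’s some rough back-of-the-envelope math. It ignores ZFS metadata overhead, and the 2TB drive size is just an example number:

```python
def vdev_usable_tb(drive_sizes_tb, parity_drives=1):
    """Rough usable space of one RAID-Z VDEV, ignoring ZFS overhead."""
    # The VDEV is limited by its smallest member disk, and roughly one
    # drive's worth of space (per parity level) is given up to parity.
    smallest = min(drive_sizes_tb)
    data_drives = len(drive_sizes_tb) - parity_drives
    return smallest * data_drives

# Two VDEVs of three 2TB drives each, like the diagram above:
pool_tb = vdev_usable_tb([2, 2, 2]) + vdev_usable_tb([2, 2, 2])
print(pool_tb)  # 8 TB usable; 4TB of raw space goes to parity
```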

Once I realized that this was exactly what I was after, my first instinct was to check Amazon. If I could just buy a NAS, that’d be easy. Turns out they’re incredibly expensive ($300 up to nearly $1,000 for a NAS that can hold at least 4 drives, and the drives aren’t included).

Digging some more, I learned that pre-built NAS servers are just really small computers with fancy drive bays. There’s nothing too special about them other than the form factor. We could probably build one for about the same cost with way more power if we sacrifice on form factor. Let’s see…

System Requirements

Overview

I know it needs a decent amount of storage. It needs a decent (but not overkill) CPU and RAM if I intend to use this for more than just a NAS. I prefer an Intel CPU for compatibility.

I don’t want it to rely on a GPU, so I’ll opt for integrated graphics in the CPU to start.

Motherboard

The motherboard needs to have at least 6 SATA ports. 8 would be ideal for the storage. I’d also like a motherboard that’s known to work with Hackintosh (Z690 is known to work).

While doing my research, I learned that people who use their home server as a NAS prefer Intel NICs (network interface cards) on their motherboard because they’re much more reliable than Realtek ones.

Computer Case

The case needs to fit the ATX-sized motherboard and at least 6 HDDs/SSDs. I wanted something super small, but I couldn’t find a case that was compatible with the rest of my build. The only other thing I was looking for in a case is easy-to-install drive bays.

Fractal Design Define R5 drive bays

Power Supply

I also need a PSU that can power it. A 750W unit should be plenty. I determined this by putting the components into pcpartpicker.com and making an educated guess based on the estimated wattage. PSUs also have an efficiency rating; I decided Gold would be good enough.
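
For the curious, the sanity check was roughly this. The wattage figures are ballpark guesses rather than measurements:

```python
# Ballpark component wattages (guesses, not measurements) to size the PSU.
parts_watts = {
    "Intel Core i5-12400 (peak)": 120,
    "Motherboard + RAM": 60,
    "6x HDDs/SSDs": 60,
    "Fans and misc.": 20,
}

estimated_load = sum(parts_watts.values())  # ~260W
headroom = 1.5                              # margin for spikes and a future GPU

print(f"Estimated load: {estimated_load}W")
print(f"Comfortable PSU size: {estimated_load * headroom:.0f}W or more")
```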

Extras

I needed some fans and thermal compound. They are, in fact, sold separately.

Final List

| Minimum Requirements | Purchased |
| --- | --- |
| Z690 motherboard: 6+ SATA, Intel NIC | ASRock Z690 Pro RS Socket LGA1700 ATX Motherboard |
| Intel CPU: 4 cores minimum, integrated graphics | Intel Core i5-12400 |
| 32GB DDR4 RAM | Corsair Vengeance LPX DDR4 32GB (2x16GB) |
| Case intended for NAS | Fractal Design Define R5 |
| 750W PSU, Gold+ | EVGA 850W Gold PSU (Modular) |
| Fans & thermal compound | ARCTIC P12 and Arctic Silver |
| Storage | See TrueNAS Scale section below |

The total cost for me was only $560! Another bonus to building the PC is that all of these parts can be resold, replaced and upgraded.

Cutting Costs

I planned on spending as little as possible on this machine. Here are 3 ways I’m planning to cut down on costs for this build:

  1. Buy everything used and/or open-box for a discount
  2. Cross-check online stores and similar components for the best value
  3. Sell things I’m no longer using (like my MacBook Pro with the broken display…)

Assuming everything I listed sells at fair value, I think the new server is practically free.

Building It

Putting the hardware together was the easy part. It took only a few hours. Once I realized it was working, I decided to try installing Proxmox on a drive just to see what it could do. Everyone seems to recommend it as the base operating system for a home server.

Proxmox

Once I accessed the Proxmox interface, I started poking around and watched a few videos to set up my first Ubuntu VM. After it was running, I just wished I had known about Proxmox sooner!

During my testing phase, there were a few things I made note of. For one, you upload your ISOs to Proxmox itself, so when you create another VM with the same OS, the ISO is already there. This means I needed more storage for Proxmox than I thought, maybe even another drive.

The VMs can be installed on a separate drive, but before they can, the target disk needs to be wiped and set up through Proxmox as storage. You can then store a bunch of operating systems on that drive without having to partition it. Each VM can be given a set number of cores and a set amount of RAM and disk space.
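
For reference, creating a VM from the Proxmox shell looks roughly like this. I did most of it through the web interface, and the VM ID, the “local-lvm” storage name, and the ISO filename below are placeholders rather than my exact setup:

```python
import subprocess

def qm(*args):
    """Run a Proxmox `qm` command on the host and echo it first."""
    cmd = ["qm", *args]
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Create VM 101 with 4 cores, 8GB of RAM, and the default network bridge.
qm("create", "101",
   "--name", "ubuntu-test",
   "--cores", "4",
   "--memory", "8192",
   "--net0", "virtio,bridge=vmbr0")

# Give it a 64GB virtual disk on the storage set aside for VMs, attach the
# ISO that was uploaded to Proxmox earlier, and boot from the disk first.
qm("set", "101", "--scsi0", "local-lvm:64")
qm("set", "101", "--ide2", "local:iso/ubuntu-22.04-live-server-amd64.iso,media=cdrom")
qm("set", "101", "--boot", "order=scsi0;ide2")
```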

After installing Proxmox a second time with a slightly adjusted hardware configuration, I felt confident that it would work for my needs. The next step for me was looking into TrueNAS.

TrueNAS Scale

At the start, I accidentally installed TrueNAS Core. Luckily I hadn’t fully committed yet, so switching to Scale was not difficult. The advantage of TrueNAS Scale is that it comes with a catalog of apps that can connect to your NAS storage. The one I was most interested in is Nextcloud, which acts like a self-hosted Google Drive. Deploying Nextcloud within TrueNAS Scale took just a few clicks.

One thing I wish I had known years ago is that RAID works best with drives of the same capacity, brand, and model. If they don’t match in capacity, the array is limited by its smallest disk. Another thing I wish I had known is that you need at least 3 disks to create your first RAID-Z VDEV.

As a result, I had to go out and buy 3 new 2TB SSDs because none of mine matched in capacity, brand or model!

But with that, I was able to create my first VDEV (4TB of usable SSD capacity, 2TB of parity data), and it has been working really well. I’ve configured a ZVOL for my Steam games, and for everything else I use datasets. For example, I have 300GB allocated to “ai-models”, a dataset formatted as NTFS that I can access from Windows and Linux. This is already a huge improvement over what I was dealing with before.

It’s like partitioning, but way better.
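
Under the hood, that setup boils down to a couple of ZFS commands. I did all of it through the TrueNAS web UI, but a rough equivalent looks like this, where the pool name “tank” and the sizes are placeholders:

```python
import subprocess

def zfs(*args):
    """Run a `zfs` command (TrueNAS Scale normally does this via the web UI)."""
    subprocess.run(["zfs", *args], check=True)

# A dataset for AI models, capped at 300GB, to be shared out from the UI.
zfs("create", "tank/ai-models")
zfs("set", "quota=300G", "tank/ai-models")

# A sparse ZVOL (a virtual block device) for the Steam library, which gets
# exported over iSCSI and formatted on the client side.
zfs("create", "-s", "-V", "500G", "tank/steam")
```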

What Now?

Now that my OS and storage problems were solved, what else could this thing be good for?

Well, I created a dedicated Ubuntu VM for Dokku, which acts as my self-hosted Heroku. No more paying cloud providers to run my apps and bots.
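
The Dokku flow is pleasantly simple. A quick sketch, where the app name “my-bot” and the domain are just examples:

```python
import subprocess

# On the server (inside the Ubuntu VM): create the app and point a domain at it.
subprocess.run(["dokku", "apps:create", "my-bot"], check=True)
subprocess.run(["dokku", "domains:add", "my-bot", "bot.example.com"], check=True)

# Deploying then happens from my laptop with a plain git push:
#   git remote add dokku dokku@<server-ip>:my-bot
#   git push dokku main
```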

After that I set up Windows, and I even got a Hackintosh up and running.

Installing a GPU

What I noticed in Windows and macOS, though, is that the integrated graphics are kinda bad. If I plan to use this home server for any AI, video editing, or gaming, the iGPU is not going to cut it. That’s when I decided to put my main GPU (an NVIDIA RTX 3090) into the server.

I learned that simply plugging in a GPU sounds easy, but it’s actually quite a time-consuming process to configure Proxmox for GPU passthrough.
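
If you follow the same guides, one useful sanity check is listing the IOMMU groups on the Proxmox host (after enabling IOMMU, e.g. intel_iommu=on in the kernel command line) to confirm the GPU sits in its own group. Something like this works:

```python
from pathlib import Path

# List every IOMMU group and the PCI devices inside it. Run on the Proxmox
# host; the GPU should ideally be isolated in its own group before passthrough.
groups = sorted(Path("/sys/kernel/iommu_groups").iterdir(), key=lambda p: int(p.name))
for group in groups:
    devices = [d.name for d in (group / "devices").iterdir()]
    print(f"Group {group.name}: {', '.join(devices)}")
```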

As a result, I’m only planning to give Windows access to the GPU for now. Once passed through, it can run any game and load 6GB+ Stable Diffusion models without an issue. It was after installing this new GPU that I decided to go out and get another 32GB of RAM (2x16GB), because now this server is doing more than I expected.

Tinkering

Tinkering and figuring out what else I can do with it has taken the better part of a month. It’s not for nothing, though: I’ve now got the home server I always wanted… and I learned a ton along the way.

In less than a month, I’ve learned to set up RAID-Z with TrueNAS Scale. I’ve learned to set up and use iSCSI, Samba, Bind9, Cloudflare Tunnels, Nextcloud, and my favorite: Proxmox. All of these help me, in one way or another, do things I couldn’t do before. I’ve also learned to use ComfyUI and StableSwarmUI, because now I can store Stable Diffusion models without worrying about running out of disk space.

Some Screenshots

These screenshots are not mine; I just wanted to add some visuals.

  1. I’ve got StableSwarmUI running so that I can generate images from anywhere using my ComfyUI workflows.
     StableSwarmUI by StabilityAI. It's a user-friendly frontend for ComfyUI.
  2. Nextcloud is going to replace my Google Drive. It has every feature I could ask for and I get much more than 15GB. I can even let friends/family use it.
     File sharing server with link sharing, in-browser previews, and much more.
  3. Proxmox is managing everything. If I’ve got an idea, or want a new service, I just create a new VM. Proxmox is pretty easy to use and very performant.
     It's like VirtualBox but 100x better.

In Conclusion

That pretty much wraps it up. Would I recommend you do this? If you’re just looking for a storage solution, I would recommend buying a pre-built NAS from Amazon. It comes with a warranty, it’s easier to use, and there’s tech support to assist.

But if you work in tech and you’ve ever thought to yourself, “having two computers would solve this,” then I think it’s almost inevitable that you’ll at least try this.

The parts I picked have been great for what I needed. The only comments I would make:

  • The case is larger than I expected.
  • Having a CPU with more cores (and no iGPU) would be better for how I’m currently using it.
  • I should’ve gone with 2x32GB RAM sticks instead of 2x16GB sticks.
  • An aftermarket CPU cooler would be quieter than the stock cooler.