NK4 Build Log - Migrating from a Pi Cluster to a bona fide NAS

Hi everyone,

Welcome to my first time ever building a PC, let alone a server/NAS-type box. I’ve been chatting with a bunch of you on Discord and asking question after question that I could probably have solved with Google-fu, but you’ve been amazing and so helpful. Based on my server’s needs, I decided to build a NAS Killer 4.0, and almost all of the parts are here. In follow-up posts, I’ll talk a bit about what I need from this box + provide updates as I put it together. So without further ado, let’s get started.

1. What I need this box to do (Server Wish List/Needs):

This box primarily needs to be my own personal NAS, but it’s also a server for a varied list of applications that need a ton of uptime. Almost all of these were running on my old Pi cluster and will now live here.

Since I’m finally building my own server, I want to make sure I have:

  • Near 100% uptime (My wife can only tolerate “sorry hun, the lights don’t work because the server is broken” so many times - and I’m out of times)
  • Solid kernel support (x86 solves this mostly)
  • Data integrity (I’ve been using ZFS on Ubuntu)
  • Ability to upgrade as we go (both in terms of drives and PCIe)
  • Minimum 1Gb Ethernet
  • Local network storage for my Surface Pro 7 (Samba access)
  • Ability to play nice with the Pis (for fun)

So here’s the plan:

OS - Ubuntu

  • Prefer Ubuntu because of previous experience
  • Supports ZFS-on-Root
  • Want to keep compatibility with my existing architecture (detailed later)
  • Must be Linux-based

Data Storage Backend - ZFS

  • Going to use ZFS (pools already created, and the data integrity features are a huge plus) - see the import/scrub sketch after this list
  • Finally have access to ECC RAM, which complements ZFS’s checksumming by also protecting data while it’s in memory
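
Since the pools already exist, the migration is mostly a matter of importing them on the new box and letting a scrub verify everything. Here’s a minimal sketch of what that looks like - the pool name “tank” is just a placeholder for whatever your pools are actually called:

```
# Placeholder pool name "tank" - substitute your real pool(s).
sudo zpool import tank            # add -f if the pool wasn't cleanly exported from the old box
sudo zpool status -v tank         # confirm every vdev is ONLINE with no read/write/checksum errors
sudo zpool scrub tank             # walk every block and verify checksums against the stored copies
sudo zfs list -o name,used,avail,mountpoint   # sanity-check datasets and mountpoints
```

ZFS’s checksums cover what’s on disk; the ECC RAM covers the data while it sits in memory, which is why the two pair so nicely.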

File Sharing / Backup

  • Samba shares for direct access to files from Windows/Mac client machines - rough config sketches for the sharing/backup pieces follow this list
  • NFS shares for direct access to files from other nodes in Docker Swarm
  • TFTP for direct access to files from PXE Boot Devices
  • rclone for remote backup
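
For reference, here’s roughly what the Samba/NFS/rclone side will probably look like on Ubuntu. Treat this as a sketch: the dataset paths, share name, username, subnet, and the rclone remote name (“backup”) are all placeholders, not my final config.

```
# --- Samba share for the Surface Pro / other Windows & Mac clients ---
sudo tee -a /etc/samba/smb.conf > /dev/null <<'EOF'
[media]
   path = /tank/media
   browseable = yes
   read only = no
   valid users = myuser
EOF
sudo smbpasswd -a myuser          # Samba keeps its own password database
sudo systemctl restart smbd

# --- NFS export for the other Docker Swarm nodes on the LAN ---
echo '/tank/media 192.168.0.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
sudo exportfs -ra                 # re-read /etc/exports without a restart

# --- rclone remote backup (remote "backup" configured beforehand with `rclone config`) ---
rclone sync /tank/documents backup:nk4-docs --transfers 4 --checksum
```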

Apps

  • Docker (Already using Docker Swarm)
  • Pi-Hole
  • Wireguard

Media Apps (Docker)

  • Plex - 2-5 streams
  • The Rs (Sonarr/Radarr/Lidarr)
  • Transmission
  • Tautulli
  • Ombi

“Cloud”-like Apps (Docker)

  • Nextcloud
  • Wordpress
  • BitwardenRS

Home Automation Apps (Docker)

  • Home Assistant

System/Cluster Monitoring Apps (Docker)

  • Prometheus
  • Grafana
  • Alertmanager
  • cAdvisor
  • Node Exporter
  • Docker Daemon Exporter

Docker Swarm “Backend”

  • Traefik (Reverse Proxy)
  • MariaDB (DB Backend for Bitwarden, Home Assistant, Nextcloud and Wordpress) - a sketch of joining the NK4 to the existing swarm follows below
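
Folding the NK4 into the existing swarm should just be the usual join-and-label dance. A sketch of the idea - the node name, manager IP, label, and stack file name below are made up, not the real ones:

```
# On an existing manager node (one of the Pis/ODROID), print the worker join token:
docker swarm join-token worker

# On the NK4, join using the token and manager address printed above:
docker swarm join --token <worker-token> 192.168.0.10:2377

# Label the new node so storage-heavy services (MariaDB, Plex, Nextcloud) can be pinned to it
# via `deploy.placement.constraints: [node.labels.storage == true]` in the stack file:
docker node update --label-add storage=true nk4

# Redeploy the backend stack (Traefik + MariaDB) from a manager:
docker stack deploy -c backend-stack.yml backend
```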

2. My existing setup (and general architecture):

Enjoy my lovely drawing skills. Here’s my home office setup. The blocks on the left are this IKEA shelf if you need an idea of scale.

Currently, my NK4 (finally built so it makes the drawing :slight_smile:) is the black PC on the bottom.

Working backwards from the network, here’s what we’re working with and how it’s set up:

  • My TP-Link Archer C7 v5 router is not pictured; it’s on the other side of the apartment in the living room. It’s an AC1750 router, so Wi-Fi speeds are pretty solid, but we’re limited by our 330M down/30M up internet - the plan is cheap, but the upload side is the bottleneck
  • First in line is a TP-Link range extender underneath my desk, which provides an AC1750 Wi-Fi uplink to the cluster of nonsense on the left. The extender has a single 1GbE port
  • That 1GbE port is connected to a set of 2 TP-Link TL-SG105 5-Port GbE switches and 1 TP-Link TL-SG1005P 5-Port PoE GbE switch. The switches are patched together with short runs of Ethernet cable, giving me 4 PoE + 6 regular GbE ports

So that’s the network. Now for the cluster. This project originally started as just a BeagleBone Black, but eventually grew into a small SBC cluster. Here’s what we’ve got:

  • ODROID-N2 (the “master” node)
  • Raspberry Pi 4s and a Pi 3B+
  • A couple of Raspberry Pi Zeros (one with a temp sensor HAT)
  • The original BeagleBone Black

That’s a long list of little computers, but between them they hosted all of the Docker services above + a pair of Pi-Holes and WireGuard. My main issue was storage, which is exactly what this build needs to solve - a bona fide NAS will finally let Plex and Nextcloud run continuously.

The way these boxes ran before was as follows:

  • The ODROID-N2 was my “master” node and ran ZFS with a separate set of USB 3.0 disk enclosures - it ran my MariaDB instance + Plex and anything that couldn’t run over NFS exports
  • The BeagleBone and Pi Zeros barely run anything anymore, but they originally started it all. Lately they ran at most a Wordpress container + maybe a Tautulli container - anything that’s basically just a webserver runs fine on these little guys. One of the Pi Zeros has a temp sensor HAT that I use to gather ambient temp data around the cluster.
  • The Pi 3B+ and 4s combined to run most of the heavy stuff like Nextcloud, Home Assistant, Bitwarden, the R’s, etc. One Pi 4 ran a Pi-Hole and the other ran a WireGuard VPN.

The cluster’s focus was high availability of these services, so I used glusterfs + NFS shares to spread the data around: if a node failed, the services would degrade but at least keep running. This worked after a fashion, but the central MariaDB instance + file storage meant most services died whenever the ODROID went down or the USB enclosures acted up (which happened frequently).

3. The parts list:

EDIT: Updated parts list to include new, additional parts

| Type | Part | Quantity | Total Cost (incl. tax) | Comments |
|---|---|---|---|---|
| Case | Cooler Master Elite 350 | 1 | $81.65 | 7x 3.5" slots + 4x 5.25" slots |
| PSU | Cooler Master Elite V3 500W | 1 | $0 | 75% Efficiency, Included w/ case |
| 5.25" Adapter | Icy Dock 5.25" to 1x 3.5" + 2x 2.5" ExpressCage | 2 | $80.54 | 2x 2.5" slots are External & Hot-Swappable |
| Motherboard | Supermicro X9SCM-F | 1 | $54.38 | Used, missing I/O plate |
| CPU | Intel Xeon E3-1230v2 | 1 | $58.79 | Used |
| RAM | 8GB Kingston PC3-12800E DDR3-1600MHz ECC UDIMM | 2 | $97.98 | - |
| CPU Cooler | Thermaltake Gravity i2 | 1 | $13.05 | - |
| Case Fans | Arctic P12 120mm PWM Fan | 2 | $28.76 | Fans are positioned at opposite ends of case |
| HBA | LSI SAS9211-4i + Full Height Bracket | 1 | $31.49 | Used, x4 PCIe 2.0 so 4Gbps per drive |
| SATA Cables | Relper-Lineso 18" Straight SATA3 Cable | 6 | $8.70 | Yellow |
| SATA Cables | Relper-Lineso 18" Right-Angle SATA3 Cable | 6 | $8.70 | Blue |
| SATA Cables | Cable Matters 15-pin SATA to 4 SATA Power Splitter Cable, 18" | 2 | $13.05 | - |
| SAS-to-SATA Cables | CableCreation Mini SAS 36-pin Male to 4x SATA 7-pin Female Cable | 1 | $8.70 | - |
| Initial Build Excluding Drives | - | - | $485.79 | - |
| PSU | EVGA 600 BQ 600W | 1 | $39.18 | 80+ Bronze, 600W, Semi-Modular, B-Stock Sale. Replaces original PSU |
| Case Fans | Arctic F9 92mm PWM Fan | 1 | $7.61 | Added to improve airflow and try to create negative pressure (so the top drive slot cools better) |
| HBA | LSI SAS9210-8i + Full Height Bracket | 1 | $43.42 | Used, x8 PCIe 2.0 so 4Gbps per drive; 2 ports, better orientation |
| VGA Dummy Plug | VGA Dummy Plug | 1 | $9.30 | - |
| SAS-to-SATA Cables | Cable Matters Internal Mini SAS to SATA Cable | 1 | $13.05 | 4th drive cable was flaky on the original, so I ended up replacing it with this one |
| SATA Cables | Cable Matters 15-pin SATA to 4 SATA Power Splitter Cable, 18" | 2 | $13.05 | Needed more for more drives |
| SAS Cables | CABLEDECONN Mini SAS 36-pin SFF-8087 to 4x SFF-8482 with SATA power | 2 | $28.28 | SAS drives need power/signals too |
| Motherboard I/O Shield | SuperMicro I/O Shield MCP-260-00027-0N | 1 | $6.51 | Bought to improve airflow; it did have an effect of a couple degrees |
| 5.25" Adapter | Icy Dock 5.25" to 1x 3.5" + 2x 2.5" ExpressCage | 1 | $40.27 | 2x 2.5" slots are External & Hot-Swappable |
| Additional Parts Excluding Drives | - | - | $200.67 | - |
| Total Excluding Drives | - | - | $686.46 | - |
| 2.5" SATA SSD | Kingston 240GB A400 SATA3 | 4 | $127.32 | $132.63/TB - 500MB/s Read / 350MB/s Write, 7mm Height |
| 2.5" SATA HDD | Seagate Barracuda 1TB | 2 | $90.09 | $45.05/TB - Had from previous cluster; later junked to add more SSD capacity. Up to 140MB/s, SMR, 7mm Height |
| 3.5" SATA HDD | Seagate IronWolf 4TB | 4 | $440.90 | $27.56/TB - Had from previous cluster. Up to 180MB/s |
| 3.5" SAS HDD | Seagate Constellation ES.2 3TB | 6 | $96.00 | $5.33/TB - Bitdeals! Up to 155MB/s |
| Drives | - | - | $754.31 | - |
| Total Including Drives | - | - | $1440.77 | - |

You should change your HDDs to price/TB, that’s what we use here. :slight_smile:

Subscribed, waiting for more updates!

Just changed it, and cool! I’m taking a break for dinner but should have some more stuff up later tonight :slight_smile:

Ok time for…

4. Starting the Build!

So most of the parts are here, time to start this build for real.

The first thing to show up was the case, which I immediately started plugging drives into:

My first batch of SATA cables were right-angle, though, so I decided to rearrange things a bit.

I’ll be updating this topic with more pictures as I go, but depending on when this mobo gets here, I’ll either have pictures up today or on Monday night

I’m curious, was your previous experience Unraid or FreeNAS? I use Unraid a lot and it’s really great.

Actually, believe it or not, I have no experience with either. Most of my Linux experience has been with my little Pi/ODROID cluster (which I just remembered I still need to fill in info on up above). For that I’ve been running mostly Raspbian Buster, some Debian Buster, and Ubuntu Focal/Groovy.

I don’t really have anything against Unraid or FreeNAS specifically; like you said, I’ve heard from many people that they’re great. I just really like the plain Linux experience, having spent 2-3 years learning how to build a small compute/Docker cluster on it.

I’ve also been hesitant about being locked into a proprietary platform (even if it’s based on Debian Linux). My biggest issue with the ODROID/Pi cluster was the mix of 32-bit and 64-bit custom ARM architectures: if the device makers/community didn’t feel like supporting a package (like ZFS), I was forced to compile it on my own and deal with the headaches of maintaining it myself.

You should really give Unraid a shot then. You can have dual parity, along with the ability to use various drive sizes and add drives one at a time.

Ok, the first thing to go was the stock case fan; I replaced it and the other one with the Arctic P12s. I decided to go without power sharing on these since they’re at opposite ends of the case

Next, we add the Icy Dock to hold those 2.5" SSDs (second one gets here tomorrow)


So far so good, pre-running the cabling is coming up next

Oh and here’s a picture of all the parts here minus that pesky X9SCM-F + the second Icy Dock

Oh and bonus puppy, his name is Whiskey and he’s very tired

Pre-running cables now. I have 8 drives total, so I’m planning to use the LSI card + 4 SATA cables. Until that second Icy Dock arrives, though, it’s best to hold 2 cables back.

Ok, back to building! I’m finally home and that pesky motherboard got here! I decided to go with the X9SCM-F for the port combos + the fact that Supermicro still seems to support this board all these years later. Here’s the board on its own, awaiting that CPU

For the CPU we’re going with an Intel Xeon E3-1230v2 for some good performance at a slightly lower cost than the E3-1270v2. That’s going in next.

Now that the CPU is in, my next job is the CPU cooler. On price, I decided to go with the Thermaltake Gravity i2; since this chip doesn’t even come close to the cooler’s rated 95W TDP, we should be fine. Also, this was my first time installing a CPU cooler, so thanks @JDM_WAAAT for that NK4 build video, which I used for reference!

Ok, after way too long, finally got memory populated, BIOS updated and LSI HBA flashed!

The process took longer because I had to figure out that the Supermicro X9 motherboards have a limited option ROM, so sas2flsh.exe fails under FreeDOS. The solution was to flash over UEFI using the board’s built-in EFI shell. That worked like a charm, and we’re now fully flashed and back to building!
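
For anyone who hits the same wall, this is roughly the sequence I ran from the EFI shell. The firmware/BIOS file names below are the usual ones from the LSI/Broadcom package, so double-check them against the files for your exact card:

```
fs0:                              # switch to the FAT32 USB stick holding sas2flash.efi + the firmware files
sas2flash.efi -listall            # confirm the controller is detected
sas2flash.efi -o -e 6             # erase the existing flash (do NOT reboot until you've reflashed)
sas2flash.efi -o -f 2118it.bin -b mptsas2.rom   # flash IT-mode firmware + boot ROM (file names vary by card)
sas2flash.efi -listall            # verify the new firmware/BIOS versions took
```

If you don’t need to boot from the HBA, you can leave off the `-b mptsas2.rom` part and skip the boot ROM entirely.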

Ok, now that we’re back on track: installed the second Icy Dock ExpressCage and reinstalled the front cover. Now to start cabling things up!

Woohoo! Took a bit, but now all cables are plugged and we’re ready to go!

Cable management isn’t my strong suit, but I did my best

Do you have plans for the 2 open 5.25 bays?

Right now? Cable storage :joy: but I do have some hazy plans for the future

Right now I’ve got 4x4TB (3.5") + 2x1TB (2.5") + 2x240GB (2.5" SSD) with capacity for 5 more 3.5" drives. Until I can get enough 3.5" drives to fill those spaces, I don’t really see any immediate need for the 5.25" bays

I have thought about getting a 6x2.5" adapter for one of them though and setting up a bank of SSDs. Otherwise I’d likely just add more of the same 1x3.5+2x2.5 adapters.