Currently, my NK4 (finally built, so it makes the drawing) is the black PC on the bottom.
Working backwards from the network, here’s what we’re working with and how it’s set up:
My TP-Link Archer C7 v5 router isn’t pictured; it’s on the other side of the apartment in the living room. It’s an AC1750 router, so we get pretty solid speeds, but we’re limited by our 330M down / 30M up internet. It’s cheap, but we’re bottlenecked on the upload side.
Next in line is a TP-Link Range Extender underneath my desk, which provides an AC1750 Wi-Fi uplink to the cluster of nonsense on the left. The extender has a single 1GbE port.
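To put that upload bottleneck in numbers, here’s the plain unit conversion (nothing assumed beyond the speeds above; the Plex bitrate in the comment is just a typical 1080p remote-stream setting):

```python
# Convert the ISP's megabits/s into megabytes/s to see the asymmetry.
down_mbit, up_mbit = 330, 30

down_mbyte = down_mbit / 8  # 41.25 MB/s down
up_mbyte = up_mbit / 8      # 3.75 MB/s up

print(f"down: {down_mbyte} MB/s, up: {up_mbyte} MB/s")
# A single remote Plex stream transcoded at ~8 Mbit/s would eat over a
# quarter of that 30 Mbit upload on its own.
```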
So that’s the network. Now for the cluster. This project originally started as just a BeagleBone Black, but grew to eventually be a small SBC cluster. Here’s what we got:
An ODROID-N2 w/ 4GB of RAM (pictured underneath the Ethernet switches on the left, with a single blue Ethernet port and 4x blue USB 3.0 ports)
A BeagleBone Black Rev C (pictured on the right side, middle of the stack of Pis, with a single blue Ethernet port)
3x Raspberry Pi 3B+ with the official PoE HAT (pictured on the right at the bottom of the stack of Pis, with a single orange PoE Ethernet port + 4x grey USB 2.0 ports)
1x Raspberry Pi 4 w/ 4GB of RAM with the official PoE HAT (pictured on the right in the stack of Pis, with a single orange PoE Ethernet port + 2x blue USB 3.0 ports + 2x grey USB 2.0 ports)
1x Raspberry Pi 4 w/ 4GB of RAM with a ClusterHAT v2.3 (pictured on the right in the stack of Pis, with a single blue Ethernet port + 2x blue USB 3.0 ports + 2x grey USB 2.0 ports)
4x Raspberry Pi Zero connected to the ClusterHAT on the top Pi 4 (pictured on top of the stack of Pis, arranged vertically)
That’s a long list. This cluster was capable of hosting all of the Docker services above, plus a pair of Pi-holes and WireGuard. My main issues were with storage, which is what this build needs to solve: a bona fide NAS will finally allow Plex and Nextcloud to run continuously.
The way these boxes ran before was as follows:
The ODROID-N2 was my “master” node and ran ZFS with a separate set of USB 3.0 disk enclosures; it ran my MariaDB instance plus Plex and anything that couldn’t run over NFS exports.
The BeagleBone and Pi Zeros barely run anything anymore, but they originally started it all. These days they’d run a WordPress container at most, plus maybe a Tautulli container. Anything that’s basically just a webserver runs fine on these little guys. One of the Pi Zeros has a temp sensor HAT that I use to gather ambient temperature data around the cluster.
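I haven’t named the exact sensor HAT above, so assuming a 1-wire DS18B20-style sensor (a common choice on Pi temp boards), the kernel exposes readings as a small text file under /sys/bus/w1/devices/, and parsing it is only a few lines. The sample dump below is illustrative, not a real reading from my cluster:

```python
# Parse the w1_slave file a DS18B20 1-wire sensor exposes on a Pi.
# (Hypothetical sketch: the actual HAT on my Zero may use a different sensor.)

def parse_w1_slave(contents: str) -> float:
    """Return temperature in Celsius from a w1_slave dump, or raise ValueError."""
    lines = contents.strip().splitlines()
    if len(lines) < 2 or not lines[0].endswith("YES"):
        raise ValueError("CRC check failed or malformed reading")
    # The second line ends with e.g. "t=23125" (millidegrees Celsius)
    _, _, milli = lines[1].partition("t=")
    return int(milli) / 1000.0

# Example dump in the format the kernel driver writes:
sample = (
    "72 01 4b 46 7f ff 0e 10 57 : crc=57 YES\n"
    "72 01 4b 46 7f ff 0e 10 57 t=23125\n"
)
print(parse_w1_slave(sample))  # 23.125
```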
The Pi 3B+s and 4s combined to run most of the heavy stuff like Nextcloud, Home Assistant, Bitwarden, the R’s, etc. One Pi ran a Pi-hole and the other Pi 4 ran a WireGuard VPN.
The cluster’s focus was high availability of these services, so I used GlusterFS + NFS shares to spread out data; if a node failed, services would degrade but at least keep running. This worked, after a fashion, but the centralized MariaDB instance and file storage killed most services whenever the ODROID went down or its USB acted up (which happened frequently).
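For reference, the degrade-but-keep-running behavior came from how the GlusterFS volumes were mounted versus the plain NFS export. A sketch of the relevant /etc/fstab lines (hostnames, volume names, and paths here are made up for illustration):

```
# GlusterFS replicated volume: backup-volfile-servers lets the mount
# survive the primary server going down (hostnames are examples)
node1:/gv-appdata  /mnt/appdata  glusterfs  defaults,_netdev,backup-volfile-servers=node2:node3  0  0

# Plain NFS export from the ODROID - single point of failure, which is
# exactly why everything depending on it died when the ODROID did
odroid:/tank/media  /mnt/media  nfs  defaults,_netdev,soft,timeo=50  0  0
```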
My first batch of SATA cables were right-angle, though, so I decided to rearrange them a bit.
I’ll be updating this topic with more pictures as I go, but depending on when this mobo gets here, I’ll either have pictures up today or on Monday night.
Actually, believe it or not, I have no experience with either. Most of my experience with Linux has been with my little Pi/ODROID cluster (which I just remembered I need to fill in info on up above). For that I’ve been running mostly Raspbian Buster, some Debian Buster, and Ubuntu Focal/Groovy.
I don’t really have anything against Unraid or FreeNAS specifically; like you said, I’ve heard from many people that they’re great. I just really like the regular Linux experience, having spent 2-3 years learning how to build a small compute/Docker cluster on it.
I’ve also been hesitant about being locked into a proprietary platform (even if it’s based on Debian Linux). My biggest issue with the ODROID/Pi cluster was that it mixed custom 32-bit and 64-bit ARM architectures, so if the device makers/community didn’t feel like supporting a package (like ZFS), I was forced to compile it on my own and deal with the headaches of supporting it myself.
Ok, first thing to go was the stock case fan; I replaced it and the other one with Arctic P12s. I decided to go without Power Sharing on these since they’re on opposite ends of the case.
Pre-running cables now: I have 8 drives total, so I’m planning to use the LSI card + 4 SATA cables. Until that second Icy Dock arrives, though, it’s best to just hold 2 cables back.
Ok, back to building! I’m finally back home and that pesky motherboard got here! I decided to go with the X9SCM-F for the port combos, plus Supermicro still seems to be supporting this board all these years later. Here’s the board on its own, finally awaiting that CPU.
Now that our CPU is in, my next job is the CPU fan. For the price, I decided to go with the Thermaltake Gravity i2; since this chip doesn’t even reach its rated 95W TDP, we should be fine. Also, this was my first time installing a CPU fan, so thanks @JDM_WAAAT for that NK4 build video, which I used for reference!
The process took longer because I had to figure out that Supermicro X9 motherboards have a limited option ROM, so sas2flsh.exe fails on FreeDOS. The solution was to flash over UEFI using the built-in EFI shell. That worked like a charm, and we’re now fully flashed and back to building!
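For anyone hitting the same wall: the sequence below is roughly what the EFI shell method looks like for a SAS2008-based card like the 9210-8i. These commands run in the board’s built-in EFI shell (not Linux, so they’re not runnable from a normal terminal), and the firmware filenames are examples from the LSI/Broadcom download package, so check what’s in your own bundle:

```
Shell> fs0:                          # switch to the USB stick
FS0:\> sas2flash.efi -listall        # confirm the card is visible; note its SAS address
FS0:\> sas2flash.efi -o -e 6         # erase the flash (don't reboot before reflashing!)
FS0:\> sas2flash.efi -o -f 2118it.bin -b mptsas2.rom   # IT-mode firmware + boot ROM
FS0:\> sas2flash.efi -listall        # verify the new firmware version took
```

If the SAS address comes up blank afterwards, sas2flash can set it back with `-o -sasadd`, which is why it’s worth writing it down before erasing.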
Right now? Cable storage, but I do have some hazy plans for the future.
Right now I’ve got 4x4TB (3.5") + 2x1TB (2.5") + 2x240GB (2.5" SSD), with capacity for 5 more 3.5" drives. Until I can get enough 3.5s to fill those spaces, I don’t really see any immediate need for the 5.25s.
I have thought about getting a 6x2.5" adapter for one of them, though, and setting up a bank of SSDs. Otherwise I’d likely just add more of the same 1x3.5" + 2x2.5" adapters.
Updated the build! Ok, so it’s been a while, but I wanted to post a quick update since this build has grown since my last post.
Since this post, I took @JDM_WAAAT’s advice and bought 6x3TB Bitdeals SAS drives. I also added a 92mm fan (which apparently does not fit this case - they expect 90mm…because of course they do), a QuickSync Box, a UPS, and just yesterday a new PSU to add some room to grow power-wise and declutter all those cables! Oh, and I added an LSI SAS9210-8i for more drive support too.
Pictures will follow, since the build looks a bit different now.