this post was submitted on 30 Aug 2025
206 points (87.9% liked)

Selfhosted

My rack is finished for now (because I'm out of money).

Last time I posted I had some jank cables running through the rack; now we're using patch panels with color-coordinated cables!

But as is tradition, I'm already thinking about upgrades, and I'm eyeing that 1U filler panel. A mini PC with a 5060 Ti 16 GB, or maybe a 5070 12 GB, would be pretty sick for moving my AI slop generation into my tiny rack.

I'm also thinking about the Pi cluster at the top. Currently that's running a Kubernetes cluster that I'm trying to learn on. They're all Pi 4 4 GB boards, so I was going to start replacing them with Pi 5 8/16 GB models. Would those be better price/performance for mostly coding tasks? Or maybe a Discord bot for shitposting.

Thoughts? Mini PC recs? Wanna bully me for using AI? Please do!

[–] muppeth@scribe.disroot.org 1 points 3 days ago (1 children)

Wow! Very cool rack you've got there. I also started using mini PCs for local test servers and general home servers, but unlike yours, mine are just dumped behind the screen on my desk (three in total). For LLM stuff, at the moment I use a 16 GB Radeon, but that's connected to my desktop. In the future I'd love to build a proper rack like yours and perhaps move the GPU to a dedicated mini PC.

As for the upgrades: like others have already said, I'd just go for more PCs rather than Raspberry Pis.

[–] nagaram@startrek.website 1 points 2 days ago (1 children)

The Pis were honestly just because I already had them.

I think I'd rather use them for something else, like robotics or a BirdNET-Pi.

But the pi rack was like $20 and hilarious.

The objectively correct answer for more compute is more mini PCs, though. And I'm really considering the Mac mini option for AI.

[–] muppeth@scribe.disroot.org 1 points 2 days ago (2 children)

Is the Mac mini really that good? Running 12-14B models on my Radeon RX 7600 XT is okay-ish, but I do "feel it," and running 7-8B models sometimes just doesn't feel like enough. I wonder where the Mac mini lands here.
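As a rough yardstick for those model sizes: a quantized model needs about parameter count × bytes per weight to load, plus some headroom for the KV cache and activations. A back-of-the-envelope sketch (the 20% overhead factor is a loose assumption, not a measured number):

```python
def model_mem_gb(params_b: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM to run a model: weights plus a rough
    overhead factor for KV cache and activations (assumption)."""
    weight_bytes = params_b * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 14B model at 4-bit quantization:
print(f"~{model_mem_gb(14, 4):.1f} GB")  # ~8.4 GB, tight on a 16 GB card once context grows
```

That's why 12-14B quantized models fit on a 16 GB card but leave little room, and why unified-memory machines with bigger pools look attractive.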

[–] nagaram@startrek.website 1 points 2 days ago

From what I understand, it's not as fast as a consumer Nvidia card, but it's close.

And you can have much more "VRAM" because they use unified memory. I think the max is 75% of total system memory going to the GPU, so a top-spec Mac mini M4 Pro with 48 GB of RAM would have around 36 GB available for GPU/NPU tasks, for $2,000.
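The unified-memory arithmetic works out like this (assuming the commonly cited ~75% cap on how much system RAM macOS lets the GPU use; the exact limit varies by configuration):

```python
GPU_FRACTION = 0.75  # assumed default cap on GPU-usable unified memory

def gpu_budget_gb(total_ram_gb: float, fraction: float = GPU_FRACTION) -> float:
    """Approximate unified memory available to GPU/NPU tasks."""
    return total_ram_gb * fraction

for ram in (24, 48, 64):
    print(f"{ram} GB system RAM -> ~{gpu_budget_gb(ram):.0f} GB for GPU tasks")
```

So 48 GB of system RAM gives roughly 36 GB of GPU-usable memory, already more than any consumer Nvidia card.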

Compare that to JUST a 5090 32 GB at $2,000 MSRP and it's pretty compelling.

$200 more and it's the 64 GB model, with two 4090s' worth of VRAM.

It's certainly better than the AMD AI experience, and it's the best price for getting into AI stuff, or so say nerds with more money and experience than me.

[–] muppeth@scribe.disroot.org 1 points 6 hours ago

Interesting. Is there a non-Apple solution like this?