this post was submitted on 03 Feb 2024
49 points (93.0% liked)

Linux

A university near me must be going through a hardware refresh, because they've recently been auctioning off a bunch of ~5 year old desktops at extremely low prices. The only problem is that you can't buy just one or two. All the auction lots are batches of 10-30 units.

It got me wondering if I could buy a bunch of machines and set them up as a distributed computing cluster, sort of a poor man's version of the way modern supercomputers are built. A little research revealed that this is far from a new idea. The first really successful distributed computing cluster (called Beowulf) was built by a team at NASA in 1994 using off-the-shelf PCs instead of the expensive custom hardware used by other supercomputing projects at the time. It was also a watershed moment for Linux, then only a few years old, which was used to run Beowulf.

Unfortunately, a cluster like this seems less practical for a homelab than I had hoped. I initially imagined that there would be some kind of abstraction layer allowing any application to run across all computers on the cluster in the same way that it might scale to consume as many threads and cores as are available on a CPU. After some more research I've concluded that this is not the case. The only programs that can really take advantage of distributed computing seem to be ones specifically designed for it. Most of these fall broadly into two categories: expensive enterprise software licensed to large companies, and bespoke programs written by academics for their own research.

So I'm curious what everyone else thinks about this. Have any of you built or administered a Beowulf cluster? Are there any useful applications that would make it worth building for the average user?

[–] notabot@lemm.ee 7 points 10 months ago (5 children)

It really depends on what sort of workload you want to run. Most programs have no concept of horizontal scaling like that, and those that do usually deal with it by just running an instance on each machine.

That said, if you want to run lots of different workloads at the same time, you might want to have a look at something like Kubernetes. I'm not sure what you'd want to run in a homelab that would use even 10 machines, but it could be fun to find out.

[–] plenipotentprotogod@lemmy.world 3 points 10 months ago (4 children)

> I'm not sure what you'd want to run in a homelab that would use even 10 machines, but it could be fun to find out.

Oh yeah, this is absolutely a solution in search of a problem. It all started with the discovery that these old (but not ancient; most of them are Intel 7th gen) computers were being auctioned off for like $20 apiece. From there I started trying to work backwards towards something I could do with them.

[–] notabot@lemm.ee 3 points 10 months ago (1 children)

They sound usable enough. If you're interested in it, have you considered running an LLM or similar? I think they cluster. If they've got GPUs you could try Stable Diffusion too.

Mind you, at that price point I think we're past the point of just thinking of them as compute resources. Use them as blocks, build a fort and refuse to come out unless someone comes up with a better idea.

[–] plenipotentprotogod@lemmy.world 2 points 10 months ago (1 children)

I'll have to look a little more into the AI stuff. It was actually my first thought, but I wasn't sure how far I'd get without GPUs. I think they're pretty much required for Stable Diffusion. I'm pretty sure even LLMs are trained on GPUs, but maybe response generation (inference) can be done without one.

[–] agressivelyPassive@feddit.de 2 points 10 months ago

Not really, at least not in a useful way. I have an i5-6500 in an old Dell desktop, and even with 16GB of RAM you can't really do that much without waiting forever. My M1 Air is way faster.
