
Hi all, I'm quite an old fart, so I only recently got excited about self-hosting an AI, some LLM...

What I want to do is:

  • chat with it
  • eventually integrate it into other services, where needed

I read about Ollama, but it's all still unclear to me.

Where do I start, preferably with containers (but bare metal is also fine)?
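
From what I've gathered so far, the usual starting point looks something like this; a rough sketch based on the official ollama/ollama Docker image (the "llama3" model name is just an example, not a recommendation):

```
# Run the official Ollama container (CPU-only by default).
# The named volume keeps downloaded models across container restarts.
docker run -d \
  --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull a model and chat with it interactively
# ("llama3" is just an example; pick one that fits your hardware).
docker exec -it ollama ollama run llama3
```

Apparently adding --gpus=all passes an NVIDIA card through (that needs the NVIDIA Container Toolkit); AMD uses a different image, from what I can tell.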

(I already have a Linux server rig with all the good stuff on it, from Immich to Forgejo to the *arrs and more, plus a reverse proxy, WireGuard and the works. I'm looking for input on AI/LLMs, what to self-host and such, not general self-hosting hints.)
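
For the "integrate it into other services" part: Ollama apparently exposes a plain HTTP API on port 11434, so a hedged sketch of what another service could call (model and prompt are placeholders):

```
# One-shot completion via Ollama's HTTP API.
# "stream": false returns a single JSON response instead of a token stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Summarize why self-hosting an LLM is useful.",
  "stream": false
}'
```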

eleitl@lemmy.zip:

Is a Radeon V with 8 GB of HBM worth using today?

ragingHungryPanda@piefed.keyboardvagabond.com:

Not for LLMs. I have a 16 GB card, and even what I can fit in there isn't really enough to be useful. It can still do things, and quickly enough, but I can't fit models large enough to be genuinely useful (for reference, even a 4-bit quantized 13B model needs roughly 8 GB of VRAM).

I also don't know whether your GPU is compatible with ROCm.
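
If it does turn out to be supported, the AMD route is a separate image tag; a sketch as I understand the Ollama docs (untested on my end):

```
# AMD/ROCm variant: use the :rocm image tag and pass the kernel
# GPU device nodes through instead of --gpus.
docker run -d \
  --name ollama \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:rocm
```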

eleitl@lemmy.zip:

The GPU used to be, but they dropped ROCm support for the Radeon V and VII some time ago. I'll have to look at that Strix Halo/AI Max thing, I guess.