this post was submitted on 07 Mar 2026
56 points (98.3% liked)

Selfhosted

[–] Shimitar@downonthestreet.eu 7 points 9 hours ago (2 children)

I plugged an NVIDIA GPU into my server, enabled Ollama to use it, diligently updated my public wiki about it, and am now enjoying real-time gpt-oss model responses!

I was amazed: response time cut from 3–8 minutes down to seconds. I have an Intel Core 7 with 48 GB of RAM, but even an oldish GPU beats the crap out of it.
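(Not from the post, but for anyone wanting to reproduce this: a minimal sketch of handing an NVIDIA GPU to Ollama under Docker Compose. It assumes the NVIDIA driver and NVIDIA Container Toolkit are already installed on the host; the service name, volume name, and port are just the common defaults, not something the commenter specified.)

```yaml
# docker-compose.yml — hypothetical sketch, assumes NVIDIA Container Toolkit is installed
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # default Ollama API port
    volumes:
      - ollama:/root/.ollama # persist downloaded models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all     # expose all GPUs to the container
              capabilities: [gpu]
volumes:
  ollama:
```

After `docker compose up -d`, something like `docker exec -it <container> ollama run gpt-oss:20b` should run on the GPU; `nvidia-smi` on the host will show the VRAM usage.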

[–] sharkaccident@lemmy.world 1 points 2 hours ago (1 children)
[–] Shimitar@downonthestreet.eu 1 points 1 hour ago

NVIDIA Corporation GA104GL [RTX A4000] (rev a1)

From lspci

It has 16 GB of VRAM; not a lot, but enough to run gpt-oss 20B and a few other models pretty nicely.

I noticed it's better to stick to a single model; I imagine unloading and reloading a model in VRAM takes time.
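(For anyone hitting the same reload delay: by default Ollama unloads an idle model after about five minutes, so the next request pays the full load cost again. The `OLLAMA_KEEP_ALIVE` and `OLLAMA_MAX_LOADED_MODELS` environment variables can pin a single model in VRAM. A sketch as a systemd drop-in; the path assumes the standard Linux install, adjust for Docker or other setups.)

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Keep the loaded model resident in VRAM indefinitely
# instead of unloading it after the default idle timeout.
Environment="OLLAMA_KEEP_ALIVE=-1"
# Only ever keep one model loaded, so nothing gets swapped out.
Environment="OLLAMA_MAX_LOADED_MODELS=1"
```

Apply with `systemctl daemon-reload && systemctl restart ollama`. The same effect per-request is possible via the API's `keep_alive` field.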

[–] mierdabird@lemmy.dbzer0.com 2 points 8 hours ago

In that same vein, I got an AMD Pro V620 32GB off eBay and have been struggling to get it to POST on my X570 motherboard, but I finally tried it on my old ASUS B450-I with a Ryzen 5 2400GE, and with a few BIOS setting changes it fired right up.

Now I need to figure out what I'm doing wrong on the X570 board so I can run the V620 alongside my 9060 XT for bigger models.