russjr08

joined 11 months ago
[–] russjr08@bitforged.space 2 points 1 day ago

All I wanted them to change was the fact that the installer for the 2020 version of the game would download and then decompress one file at a time, so it took forever for the game to install (on top of the fact that it uses an in-game installer in the first place).

I don't have the new version, but based on what I've been reading they sure curled the monkey's paw this time.

[–] russjr08@bitforged.space 1 point 1 week ago

Primarily I use Arch on my desktop (and, by proxy, my Steam Deck, which runs SteamOS), which is what I've landed on after a ton of distro hopping. The idea of atomic distros catches my eye, but in their present state there are too many steps needed to make deeper changes (installing a kernel module, for example). That said, I quite like SteamOS on my Deck since I know it will always be in a "consistent" state.

On servers I run a mix of Rocky Linux and Debian.

[–] russjr08@bitforged.space 5 points 1 month ago (2 children)

(inb4 ethernet over HDMI: There is no implementation of the spec in the wild).

How about Thunderbolt? This looks like macOS, and while I'm not 100% sure whether they still include HDMI ports, they certainly use Thunderbolt.

[–] russjr08@bitforged.space 7 points 2 months ago

Oh wow, I didn't expect another release so quickly! Props to the COSMIC team! I can't recall where the roadmap for the features and their targeted releases went, but I hope we can get Night Light/Blue Light filtering soon.

I also did not know they had a Mastodon account, thanks for the shout so that I could give 'em a follow.

[–] russjr08@bitforged.space 6 points 2 months ago

It depends on who you're referring to as a casual user. My mother, for example, would certainly have a hard time with it: first figuring out the key to bring up the boot menu (and being faced with a scary dialog she's never seen), then selecting the right device, then likely being greeted by GRUB, which would also look scary to her. By then she'd be overwhelmed before even getting to the install portion.

[–] russjr08@bitforged.space 7 points 2 months ago (1 child)

I'd recommend using ROCm through a Distrobox container. Personally, I use this Distrobox container file, and it has suited all of my needs with Stable Diffusion so far.

That is, if you're still interested in it - I could totally understand writing it off after what happened 😅

[–] russjr08@bitforged.space 5 points 2 months ago

I usually just get by with Alacritty and Zellij; they pair pretty well together.

[–] russjr08@bitforged.space 13 points 2 months ago

They're only just now cancelling that ridiculous fee? I swear I thought they cancelled that dumb idea a bit ago.

You've opened a door that you cannot close, Unity.

[–] russjr08@bitforged.space 1 point 2 months ago (1 child)

Hmm, gotcha. I just tried out a fresh copy of text-gen-webui, and it seems like the latest version is borked with ROCm (I get a "CUDA error: invalid device function" error).

My next recommendation then would be LM Studio, which to my knowledge can still expose an OpenAI-compatible API endpoint to be used in SillyTavern. I've used it in the past, and I didn't even need to run it within Distrobox (I have all of the ROCm stuff installed locally, but I generally run most of the AI stuff in Distrobox since it tends to require an older version of Python than Arch currently ships). It seems they've recently started supporting running GGUF models via Vulkan, which I assume doesn't require the ROCm stuff to be installed at all.
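In case it helps, hooking SillyTavern (or anything else that speaks the OpenAI API) up to a local server like that is pretty minimal. Here's a rough Python sketch - the port and model name are assumptions on my part, since they depend on whatever your server actually reports:

```python
# Minimal sketch of talking to a local OpenAI-compatible endpoint
# (like the one LM Studio can expose). The base URL and model name are
# assumptions - swap in whatever your server shows.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # assumed local port, adjust as needed
    api_key="not-needed-locally",         # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; use the model id your server lists
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```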

Might be worth a shot. I just downloaded the latest version (the UI has definitely changed a bit since I last used it), grabbed a copy of the Gemma model, and ran it, and it seemed to work without an issue for me directly on the host.

The advanced configuration settings no longer seem to directly mention GPU acceleration like they used to; however, I can see it utilizing GPU resources in nvtop, and the generation speed (83 tokens a second in my screenshot) couldn't possibly have come from the CPU, so it seems to be fine on my side.

[–] russjr08@bitforged.space 1 point 2 months ago

Yeah, I'm definitely not a fan of how AMD handles ROCm - there are so many weird cases of "well, this card should work with ROCm, but... [insert some weird quirk you have to work around, like the one I mentioned, or what you've run into]".

On the userspace/consumer side I enjoy AMD, but I fully understand why a lot of devs don't make use of ROCm and why Nvidia has such a tight hold on the GPU compute world with CUDA.

[–] russjr08@bitforged.space 1 point 2 months ago (3 children)

Ah, strange. I don't suppose you specifically need a Fedora container? If not, I've been using this Ubuntu-based Distrobox container recipe for anything that requires ROCm, and it has worked flawlessly for me.

If that still doesn't work (I haven't actually tried out koboldcpp yet), and you're willing to try something other than koboldcpp, then I'd recommend the text-generation-webui project, which supports a wide array of model types, including the GGUF type that koboldcpp uses. Then if you really want to get deep into it, you can even pair it with SillyTavern (it's purely a frontend for a bunch of different LLM backends; text-generation-webui is one of the supported ones)!

[–] russjr08@bitforged.space 1 point 2 months ago (7 children)

What card do you use? I have a 6700 XT, and getting anything with ROCm running requires that I pass the HSA_OVERRIDE_GFX_VERSION=10.3.0 environment variable to the related process; otherwise it just refuses to run properly. I wonder if it might be something similar for you too?
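For reference, here's roughly what I mean as a quick Python sketch (just illustrative - exporting the variable in your shell before launching whatever tool you're using does the same thing, and the torch check is only there as an easy way to see whether ROCm picks up the card):

```python
# Quick sanity check that ROCm sees the card once the override is set.
# Setting it from Python is purely illustrative; an `export` in the shell
# before launching the process works the same way. 10.3.0 is the value
# my 6700 XT needs - other cards may need a different value (or none).
import os
os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"  # must be set before HIP initializes

import torch  # ROCm builds of PyTorch expose the GPU through the torch.cuda API

if torch.cuda.is_available():
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    print("ROCm still doesn't see the card - the override may not apply here")
```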
