BaroqueInMind

joined 1 year ago
[–] BaroqueInMind@lemmy.one 6 points 3 months ago

Yeah everything about this software is cool af

[–] BaroqueInMind@lemmy.one 7 points 3 months ago

Because I am dumb and don't know what I am talking about.

[–] BaroqueInMind@lemmy.one 4 points 3 months ago (1 children)

Please elaborate on your comment, because it makes no sense to me.

[–] BaroqueInMind@lemmy.one 6 points 3 months ago

OPNsense > pfsense

[–] BaroqueInMind@lemmy.one 2 points 4 months ago

This is a great idea that will never work because it's too expensive to maintain.

[–] BaroqueInMind@lemmy.one 2 points 4 months ago* (last edited 4 months ago)

With 144 GB of total RAM, you should be able to run just about any CPU-intensive software.

LLMs load into GPU VRAM, though, so it doesn't matter how much system RAM you have: VRAM is what xformers and the tensor libraries prioritize and are optimized to use over the CPU and system RAM.
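If you want to sanity-check that before loading a model, here's a minimal sketch (assuming PyTorch and psutil are installed; the threshold logic is just illustrative) that compares system RAM against the VRAM on the first CUDA device:

```python
# Rough check of system RAM vs. GPU VRAM, since VRAM is usually the
# real limit for LLM inference (assumes PyTorch and psutil are installed).
import torch
import psutil

system_ram_gb = psutil.virtual_memory().total / 1024**3
print(f"System RAM: {system_ram_gb:.1f} GB")

if torch.cuda.is_available():
    vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    print(f"GPU VRAM:  {vram_gb:.1f} GB  <- this is what actually constrains the model size")
else:
    print("No CUDA GPU detected; inference would fall back to CPU and system RAM (much slower).")
```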

[–] BaroqueInMind@lemmy.one 2 points 4 months ago* (last edited 4 months ago) (2 children)

What's the bus speed of the RAM? You might run it just fine but still be bottlenecked there.

[–] BaroqueInMind@lemmy.one 1 points 4 months ago (1 children)

They caught on to that, and now you gotta set up a routine that automatically disables it every 24 hours, because it re-enables itself after a while.

[–] BaroqueInMind@lemmy.one 0 points 4 months ago* (last edited 4 months ago)

How am I the "internet police" when I never demanded anything from you and simply pointed out a hypocrisy and shared the observation with you?

You sound like you have deeper issues that need a professional to help you resolve. Or don't, and just stagnate. I'm not the fucking internet police telling you what to do and what not to do.

[–] BaroqueInMind@lemmy.one 7 points 4 months ago* (last edited 4 months ago) (5 children)

Ironic since the word "dork" in your name is slang for a penis. I hate hypocrites like you.
