admin

joined 1 year ago
[–] admin@lemmy.my-box.dev 2 points 4 months ago

In a similar vein, I tried out a Garmin smartwatch for a while, and at some point it warned me I was getting stressed.

I wasn't, though - I was excited about a project I had been working on coming together. But apparently the watch could only interpret an elevated heart rate as a negative mood.

For that, and other privacy- and usability-related reasons, I decided to return it and go back to my Pebble Time Steel, which doesn't track heart rate at all.

[–] admin@lemmy.my-box.dev -2 points 4 months ago (1 children)

I suppose that's one way to generalize an entire country.

[–] admin@lemmy.my-box.dev 10 points 4 months ago

Oh, sure. For the 405B model it's absolutely infeasible to host it yourself. But for the smaller models (70B and 8B), it can work.

I was mostly replying to the part where they claimed Meta can take it away from you at any point - which is simply not true.

[–] admin@lemmy.my-box.dev 1 points 4 months ago (1 children)

Oof - not on my 12 GB 3060 it doesn't :/ Even at 48k context and the Q4_K quantization, ollama is doing a lot of offloading to the CPU. What kind of hardware are you running it on?
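For reference, this is roughly how I'm invoking it (a sketch using the ollama Python client; the model tag and context value below are placeholders for whatever quant you actually pulled):

```python
# Rough sketch of my setup, via the ollama Python client
# (pip install ollama; assumes a local ollama server is running).
# The model tag is a placeholder - substitute your own Q4_K pull.
import ollama

response = ollama.chat(
    model="mistral-nemo:12b-instruct-2407-q4_K_M",  # Q4_K quant
    messages=[{"role": "user", "content": "Hello there."}],
    # 48k context - on a 12 GB card this forces heavy CPU offloading
    options={"num_ctx": 48 * 1024},
)
print(response["message"]["content"])
```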

[–] admin@lemmy.my-box.dev 34 points 4 months ago* (last edited 4 months ago) (2 children)

WAKE UP!

It works offline. When you use it with ollama, you don't have to register or agree to anything.

Once you have downloaded it, it will keep on working; Meta can't shut it down.
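Don't take my word for it - here's a minimal sketch with the ollama Python client (the model tag is an assumption; pick whatever size fits your hardware):

```python
# Minimal sketch: pull the weights once, then everything runs locally.
# No account, no EULA prompt - and once downloaded, nobody can revoke it.
# The model tag is an assumption; adjust to the size your hardware fits.
import ollama

ollama.pull("llama3.1:8b")  # one-time download of the weights

reply = ollama.chat(
    model="llama3.1:8b",
    messages=[{"role": "user", "content": "Hello from my own machine."}],
)
print(reply["message"]["content"])
```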

[–] admin@lemmy.my-box.dev 97 points 4 months ago (35 children)

Technically correct (tm)

Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.

[–] admin@lemmy.my-box.dev 1 points 4 months ago (3 children)

Ah, that's a wonderful use case. One of my favourite models has a storytelling LoRA applied to it - maybe that would be useful to you too?

At any rate, if you end up publishing your model, I'd love to hear about it.

[–] admin@lemmy.my-box.dev 7 points 4 months ago (5 children)

Yeah, there's a massive negative circlejerk going on, but mostly with parroted arguments. Being able to locally run a model with this kind of context is huge. Can't wait for the finetunes that will result from this (*cough* NeverSleep's *-maid models come to mind).

[–] admin@lemmy.my-box.dev 1 points 4 months ago* (last edited 4 months ago)

Agreed. So in other words - everybody wins.

I'm by no means under the impression that LibreWolf will take over and become more dominant than Firefox anytime soon. So if Firefox does the heavy lifting and the dirty work, the community will still benefit from these better versions downstream.

[–] admin@lemmy.my-box.dev 3 points 4 months ago (1 children)

I haven't tested it very thoroughly, and I'm by no means an expert, but from the few prompts I've run so far, I'd have to hand it to Nemo on quality.

Using openrouter.ai, I've also given Llama 3.1 405B a shot, and it seems to be at least on par with (if not better than) Claude 3.5 Sonnet, whilst being a bit cheaper as well.
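If anyone wants to reproduce that: OpenRouter exposes an OpenAI-compatible endpoint, so something like this sketch should do it (the model slug is my best guess from their catalogue):

```python
# Sketch: querying Llama 3.1 405B via OpenRouter's OpenAI-compatible API
# (pip install openai). The model slug is a guess from their catalogue,
# and the API key placeholder is yours to fill in.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",  # your OpenRouter key
)

completion = client.chat.completions.create(
    model="meta-llama/llama-3.1-405b-instruct",
    messages=[{"role": "user", "content": "Summarise the plot of Hamlet."}],
)
print(completion.choices[0].message.content)
```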

[–] admin@lemmy.my-box.dev 5 points 4 months ago (2 children)

Orrr... It's like saying Firefox should keep on doing whatever it's doing, and people who care will get its benefits without having to suffer its drawbacks.

[–] admin@lemmy.my-box.dev 3 points 4 months ago (4 children)

Get downstreamed into LibreWolf.
