plixel

joined 1 year ago
[–] plixel@programming.dev 9 points 5 days ago (1 children)

You can install Ollama in a Docker container and use it to pull models that run locally. Some are really small and still pretty effective: Llama 3.2, for example, comes in 3B and even 1B variants. You can access it through the terminal, or use something like Open WebUI for a more "ChatGPT"-like interface (rough sketch of the terminal/API side below).
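For anyone curious what talking to it outside the web UI looks like, here's a minimal sketch in Python that calls the container's HTTP API. It assumes the default port mapping (11434) and that a small model like `llama3.2:3b` has already been pulled; swap in whatever model tag you actually installed.

```python
# Minimal sketch: query a locally running Ollama container over its HTTP API.
# Assumes the container exposes the default port (-p 11434:11434) and that a
# small model such as llama3.2:3b has already been pulled into it.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2:3b",   # or "llama3.2:1b" for the smallest variant
        "prompt": "Explain what a Docker volume is in one sentence.",
        "stream": False,          # return a single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```

Open WebUI talks to the same API under the hood, so if a quick call like this works, pointing the web interface at the container should too.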

[–] plixel@programming.dev 1 points 8 months ago

Thanks for this!

[–] plixel@programming.dev 4 points 9 months ago

I really liked Kagi at first, especially since I use it mainly for programming as well, but recently I feel like the quality has gone downhill. Right around the time they integrated the Brave stuff, I started having to scroll past the usual Google-like fluff results before getting to anything actually relevant. It's a little sad to see, because when I first used it, it was so good; now it basically feels like a skinned Google-lite. I'm still a customer, but only because I haven't found a good alternative yet.