What I did was install ollama and create an alias for ollama run <model> for ease of access.
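For example, a minimal sketch of that alias (the alias name and the codellama model tag are just examples, not what the commenter necessarily used):

```shell
# add to ~/.bashrc or ~/.zshrc; "codellama" is one example model tag
alias cl='ollama run codellama'
```

After reloading the shell, typing cl drops you straight into an interactive session with the model.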
amzd
If you have a smart TV you can use stremio, don’t even need a box.
They are harvested for their organs
You should make sure you're running a model that fits in your VRAM; for me it runs faster than any online LLM I've tried.
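The fit check above can be sketched roughly like this (the numbers are placeholders; on a real machine they'd come from nvidia-smi and ollama list):

```shell
# placeholder numbers: in practice you'd read these from
#   nvidia-smi --query-gpu=memory.total --format=csv   (total VRAM)
#   ollama list                                        (model size)
vram_gb=15
model_gb=13
if [ "$model_gb" -le "$vram_gb" ]; then
  echo "fits in VRAM"
else
  echo "spills to system RAM, will be slow"
fi
```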
ollama + codellama works perfectly. I use it from neovim with a plugin called gen.nvim, I think.
Do I understand correctly that you use the install script for files outside the home dir? If so, could you share it? I'm running into that issue.
CodeLlama. It takes 15 GB of VRAM, which my GPU has.
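As a rough sanity check on that figure: VRAM use is roughly parameters times bytes per weight, plus overhead. A quick back-of-envelope sketch (assuming a 7B model at fp16; actual quantized ollama builds are much smaller):

```shell
# back-of-envelope VRAM estimate: params (billions) * bytes per weight
params_b=7      # e.g. a 7B-parameter model
bytes_per_w=2   # fp16 weights
echo "$((params_b * bytes_per_w)) GB (plus KV-cache/overhead)"
```

That lands around 14 GB, which is in the same ballpark as the 15 GB quoted above.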
Run your own AI to help with coding
I love installing a 61GB game to play one game mode that would’ve been a 10GB game by itself
A Steam Deck can run all the games those two can, and it runs Linux, which means it will probably never be obsolete.