It’s pretty easy with Ollama. Install it, then run: ollama run mistral
(or another model; there are a few available out of the box). https://ollama.ai/
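Once the Ollama server is running, you can also talk to it programmatically over its local REST API. A minimal sketch, assuming the default port 11434 and the /api/generate endpoint with a model you've already pulled (the helper names here are my own):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port


def build_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks for one complete JSON response instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="mistral"):
    """Send a prompt to a locally running Ollama server and return the reply text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL + "/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server up, generate("Why is the sky blue?") returns the model's answer as a string; nothing leaves your machine.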
Another option is Llamafile. https://github.com/Mozilla-Ocho/llamafile
If you want to get into the nitty-gritty or play with options beyond just chat, I recommend Text Generation WebUI.
Installation is pretty easy, and then you just download your desired model from Hugging Face.
Or if you want to use it for roleplay or adventure-style games, KoboldCPP is easy to set up.
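Text Generation WebUI can also expose an OpenAI-compatible API for other apps to use; assuming you've enabled it and it's listening on its default port (both the port and the endpoint path below are assumptions to verify against your setup), a request looks like this sketch:

```python
import json
import urllib.request

# Assumed default for Text Generation WebUI's OpenAI-compatible API.
API_URL = "http://localhost:5000/v1/chat/completions"


def build_chat_request(model, user_message):
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,  # often ignored by local servers that load one model
        "messages": [{"role": "user", "content": user_message}],
    }


def chat(user_message, model="local-model"):
    """POST a chat request to the local server and return the assistant's reply."""
    body = json.dumps(build_chat_request(model, user_message)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

The nice part of the OpenAI-compatible format is that the same request shape works against several local backends, so you can swap servers without rewriting clients.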
Sounds like a really cool project; sadly, I don't have much knowledge to contribute. Still, what kind of issues have you run into? Any specific errors or problems?
Maybe Serge would fit your use case.
Serge is probably the easiest way to get a basic setup. If you just want to download a model and chat, I recommend it.
If you're low on hardware, look into the Petals or Kobold Horde frameworks. Both share models in a p2p fashion, as far as I know.
Petals, at least, lets you create private networks, so you could host part of a model on your 24/7 server, part on your laptop CPU, and the rest on your laptop GPU, as an example.
Haven't tried it myself though, so good luck ;)
I haven't looked into specific apps, but I've been wanting to try various trained models, and I figured self-hosting JupyterHub and pulling models from Hugging Face would be a quick and flexible way to do it.
I've heard good things about H2O AI if you want to self-host and tweak the model by uploading your own documents (so that you get answers based on your dataset). I'm not sure how difficult it is; maybe someone more knowledgeable will chime in.