Someone made a GPT-like chatbot that runs locally on Raspberry Pi, and you can too
(www.xda-developers.com)
Direct link to the GitHub repo:
https://github.com/nickbild/local_llm_assistant?tab=readme-ov-file
It's a small model by comparison. If you want something offline that's actually comparable to ChatGPT 3.5, you'll want the Mixtral 8x7B model instead (running on a beefy machine):
https://mistral.ai/news/mixtral-of-experts/
Sick, I only need 90 GB of VRAM!
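The ~90 GB figure checks out as a back-of-envelope estimate. Mixtral 8x7B has roughly 46.7B total parameters (a commonly cited figure, assumed here), and at half precision each parameter takes 2 bytes. A quick sketch of the arithmetic, with quantization shown for contrast:

```python
# Rough weight-memory estimate for a large model.
# The 46.7B parameter count for Mixtral 8x7B is an assumption,
# and this ignores KV cache and activation overhead.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    """Approximate GB needed just to hold the model weights."""
    return params_billion * bytes_per_param

fp16 = weight_memory_gb(46.7, 2.0)   # half precision: ~93 GB
q4 = weight_memory_gb(46.7, 0.5)     # 4-bit quantized: ~23 GB

print(f"fp16: ~{fp16:.0f} GB, 4-bit: ~{q4:.0f} GB")
```

Which is why 4-bit quantized builds are what actually make these models reachable on high-end consumer hardware.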
Hopefully we'll see more purpose-built hardware for this, like expansion cards with little more than tensor cores and their own RAM.
I'd love to see some consumer-level AI hardware; sadly, it all seems to be designed for server farms, and by the time it ages out into consumer prices it's so obsolete there's no point in buying it.
It's not quite consumer-level, I'd say, but Coral.ai sells some small Google Edge TPUs.
Do they even want consumer AI cards to exist, though?
Think about the data!
Card makers? They only want money; if there's enough consumer-level demand, they'll make them.
I guess you're right.