Priceless.
avidamoeba
Pulled whatever is available on Ollama by this name and it seems to just fit on a 3090. Takes 23GB VRAM.
So you're telling me there's a chance.
Does anyone have an idea how much RAM this would need?
Hm, this might not be a bad replacement for my Unifi access points, if its radio is up to snuff. It's significantly cheaper than Unifi for WiFi 6.
E: ordered one
Weird. Been upgrading several OpenWrt machines for many years now. Click a button in the UI, select a file, click another button to update.
Yes. Search generally pulls data from databases. It doesn't compute weather forecasts. The addition of AI results is a net addition of computation. In the worst-case scenario, where the AI results are generated on-the-fly, that's a lot more computation. I'm sure they pre-compute a lot of them, so they're not in the worst-case scenario. However, even in the best-case scenario they still have to do this new, heavy computation (check LLM compute usage) once per result. So the profit margin for search is very likely lower than it used to be, isolating for this variable. If they're somehow increasing their revenue from these results, that's another variable that might offset it. I've no idea. What I'm certain about is that the cost is higher after AI results were introduced, because more energy is used.
Looks a bit insensitive but it's not even expressing support for Israel's war.
Even though that surely results in them being able to access more money and makes shareholders richer, that's not a factor in profit margins. Profit margins are just about revenue vs cost. In this case - how much they make from each search vs how much it costs to produce that search.
Yeah. Also I'm guessing their AI additions to search made their profit margins worse, since those take a lot more computation to produce. Although they probably cache a lot of them for common searches.
They ruined it without AI before AI was commonplace. They ruined it with higher profit margins. 🥹
WiFi