this post was submitted on 09 Apr 2024
151 points (94.2% liked)
Technology
It's going to hit an index, not the actual data, and it's going to return approximate, not exact, results. Tons of engineering has gone into basic search precisely to get more data locality.
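To make the index point concrete, here's a toy inverted index in Python: the query only ever touches the (small) term-to-document map and a rough match count, never the full documents. All names and the scoring scheme here are illustrative, nothing like a real search engine's internals.

```python
from collections import defaultdict

docs = {
    1: "large language models are expensive to serve",
    2: "search engines answer queries from an inverted index",
    3: "an index maps terms to the documents containing them",
}

# Build the index: term -> set of doc ids (posting lists).
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    # Approximate ranking: count how many query terms each doc matches.
    # The documents themselves are never read at query time.
    scores = defaultdict(int)
    for term in query.split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("inverted index"))  # doc 2 matches both terms, doc 3 one
```

Real engines layer compression, caching, and smarter scoring on top, but the shape is the same: the hot path is the index, which is why it can be made cheap and cache-friendly in a way an LLM forward pass can't.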
I read a blog post a while back (please don't ask me where) comparing Bing vs. Google when Bing started using ChatGPT, and it basically boiled down to "Google has the tech to do it; they don't roll it out because they don't want to eat the electricity bill, this is MS spending money to buy market share". The cost difference between serving a search result and having ChatGPT answer the question was something like 10x. It might not stay that way forever, though, what with beating models down to work in ternary and such: that's not just massive quantisation but also much simpler maths. Convolutions don't need much maths when all you deal with is -1, 0, 1; IIRC you can throw out the multiplication unit and work with nothing but shifts and adds.
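The ternary-weights point can be sketched in a few lines: when every weight is -1, 0, or 1, a dot product (the core of a convolution) reduces to adds, subtracts, and skips, no multiplier needed. This is purely illustrative; real ternary kernels pack weights into a couple of bits and vectorize, but the arithmetic trick is this.

```python
def ternary_dot(weights, activations):
    """Dot product with weights restricted to {-1, 0, 1}."""
    acc = 0
    for w, x in zip(weights, activations):
        if w == 1:
            acc += x   # multiply by 1 is just an add
        elif w == -1:
            acc -= x   # multiply by -1 is just a subtract
        # w == 0: contributes nothing, skip entirely
    return acc

w = [1, 0, -1, 1]
x = [3, 7, 2, 5]
print(ternary_dot(w, x))  # 3 - 2 + 5 = 6, same as sum(a*b for a, b in zip(w, x))
```

Hardware-wise that's the whole appeal: accumulators and sign flips are far cheaper in silicon and energy than full multipliers, and the zeros cost nothing at all.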