I think it takes a while for that kind of competitor to emerge and gain enough traction to become a genuine alternative. The primary option everyone adopted long ago kinda has to suck for a while first :/
It's also going to take another algorithmic leap.
It was a hard problem to solve when Google's founders cracked it, but it's an even harder problem now that state-of-the-art spam bots are filling the internet with shit that looks like it was written by humans.
If someone cracks how to tell whether something is AI-generated or not (for real, not the fake solutions we have now), adds that to a good search algorithm, and filters out the fake shit by default, they'll have a hell of a product on their hands.
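Just to make that concrete, here's a hypothetical sketch of what "detector plus search ranking" could look like; the detector function and the result tuples are invented stand-ins, since a detector that actually works is exactly the part nobody has solved:

```python
# Hypothetical sketch only: assumes some future detect_ai_probability(text)
# that actually works, which is the unsolved part.

def filter_and_rank(results, detect_ai_probability, threshold=0.5):
    # results: iterable of (url, text, relevance_score) tuples
    kept = [
        (url, text, score)
        for url, text, score in results
        if detect_ai_probability(text) < threshold  # drop likely AI-generated pages by default
    ]
    # rank whatever survives by ordinary relevance
    return sorted(kept, key=lambda r: r[2], reverse=True)
```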
I'm of the opinion that it will require human interaction to fix this. It can't be purely solved via algorithms.
What people don't realize is that the original Google search algorithm, PageRank, effectively looked at how real humans interacted with the websites it was indexing. A page only ranked well if other websites linked to it, and at the time, that meant real human beings were making those links. This gave Google a real advantage over other, purely algorithmic search engines.
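For anyone curious, the core idea is small enough to sketch in a few lines of Python; the link graph and damping factor below are made-up toy values, not anything from Google's actual implementation:

```python
# Minimal PageRank sketch (power iteration) over a toy link graph.

damping = 0.85  # probability of following a link vs. jumping to a random page

# hypothetical link graph: page -> pages it links to
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

pages = list(links)
rank = {p: 1 / len(pages) for p in pages}  # start from a uniform distribution

for _ in range(50):  # iterate until ranks stabilize
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share  # each link passes on a share of its page's rank
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # heavily linked pages rank highest
```

The point of the design is that every human-made link acts as a vote, so the ranking piggybacks on human judgment instead of trusting page content alone.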
Something like this will have to be recreated. We will have to figure out a way of prioritizing search results that real human beings have found to be useful.
Tough to do when those services tend to get infiltrated by bots as well.
DDG has been around for quite a while. It's been a few years since I last used it, but the reason I switched back to Google was that I was clearly less productive with DDG.
I don't think something like DuckDuckGo is gonna be the eventual contender to take on Google. I think it'll have to be an engine with its own index, or some kind of lateral solution.
Something like Brave, Kagi, Qwant, or Stract could maybe turn into something exciting with more momentum, but honestly I have a hard time seeing any of them as the kind of scrappy competitor with a new approach that unseats the old king who has lost its way chasing more profit at the expense of product quality. None of them seem like they truly have a new approach, but only time will tell how that story plays out this time.
I honestly think it will have to be semi-curated and look a lot like a more complex digital version of an encyclopedia.
I mean, I think the stupid LLM craze comes from trying to make something like that in the vein of the Hitchhiker's Guide, but without doing the actual work, relying on autogenerated articles instead, except that makes them prone to being false. The Hitchhiker's Guide still had writers, and people entering and double-checking information.
It can then further link you to related stuff from the web, but the widespread sharing of information is somewhat dead, since information is now the product to be sold, and free and easy sharing of it would ruin profit margins.