This post was submitted on 14 Feb 2024
675 points (95.7% liked)

Technology

[–] mods_are_assholes@lemmy.world -3 points 9 months ago (6 children)

You really have no fuckdamn idea how naive your statement is. You don't want an AI war, and we cannot avoid one.

[–] Flumpkin@slrpnk.net 3 points 9 months ago* (last edited 9 months ago) (5 children)

I know there is currently a massive PR campaign for a power grab to consolidate control over AI software. They want to control the means of generation. Only MozillAI can save us from King GhAIdorah!

Sorry I'm upsetting you. I know we're entering an acceleration of technology at a time when our institutions globally are in an absolutely horrendous state. People on all sides are brainwashed as hell. The AI watchdogs are insane as well. What's left but gallows humor? I do hold out some hope though.

[–] mods_are_assholes@lemmy.world 2 points 9 months ago (4 children)

You cannot upset me any more than everyone's current common misunderstandings about AI already do.

I don't think you understand the implications of undetectable AI being used to shift social conversation, or the kind of world those AI owners want to create.

[–] Flumpkin@slrpnk.net 0 points 9 months ago (1 children)

That might actually be the kind of thing where open source AI could help, at least I hope: detecting bias, lies, or AI-powered filtering and sorting of content.

[–] mods_are_assholes@lemmy.world 5 points 9 months ago (2 children)

OK, so this is one of the naive thoughts that makes me upset.

The open source community can't even make a Linux distro that is functional out of the box for everyday users, and you think they are somehow going to outcompete billion-dollar companies that can afford the best gear and devs?

Look, I bought heavily into open source early on in the 90s, and I have done my best to go open source for every tool I can, but the simple fact is that even the 'best' open source projects are severely lacking in many respects, and YOU CAN'T TRUST THE DEVELOPMENT OF AI TO THAT.

Compare GIMP to Photoshop. It isn't even close. Why? Because Adobe has a fucktonne of cash to throw at their projects, and they have clear direction and motivation.

I don't like it

I'd prefer a fully open source world

But it isn't going to happen. Open source AI will always lag behind corporate AI, and considering how fast the field has been developing, even being three months behind renders a tool useless as an AI detector.

We aren't prepared for this. 90% of what everyone on the internet says about AI is poorly informed and full of confabulation, and WORST of all, when you try to explain this to them, they get antagonistic.

We already saw the threat AI can pose in 2016, with Cambridge Analytica helping hand trumpty dumpty the election by using AI to target vulnerable Facebook groups.

AND THAT AI WAS A FUCKING INFANT compared to what we have now.

It's going to be so bad and almost none of you have the slightest clue.

[–] Kedly@lemm.ee 1 points 9 months ago

See, THIS is the criticism of AI I can actually empathize with. I might even agree with it somewhat.

[–] OhNoMoreLemmy@lemmy.ml 1 points 9 months ago

Honestly, most of what Cambridge Analytica did was blackmail, illegal spending, and collusion between campaigns that were legally required to be separate.

Much of the data processing/ML was intended as a smokescreen to distract from the big stuff that was known to work and had consequently been legislated against. The problem is that they were so incompetent that the distraction technique was also illegal.

Maybe the machine learning also worked, but it's really not clear.
