this post was submitted on 24 May 2024
255 points (98.9% liked)

Technology

The only exception is private messages, and some users have reported difficulty opting out.

[–] trollbearpig@lemmy.world 10 points 6 months ago* (last edited 6 months ago) (1 children)

Hahaha, I don't know why people are so shocked. I'm sure we'll see something useful come out of AI any day now, just like with crypto hahaha.

In the meantime, it's obvious these companies are using AI as an excuse to bypass laws and regulations, and people are cheering them on... They are bypassing copyright laws (in a direct attack on open source) with their autocomplete bots, but we should not worry, it's not copyright infringement because the LLMs are smart (right?), so that makes it ok... They are using this to steal the work of real artists through image generation bots, but people love this for some reason. And they are using this to bypass the few privacy laws in place now, as if Facebook/Meta could ever have another incentive.

Maybe I'm an extremist, but if the only useful thing we are getting out of this is mediocre code autocomplete that works some of the time, I think the price we are paying is way too high. Too bad I'm in the minority here.

[–] Grimy@lemmy.world 3 points 6 months ago (2 children)

LLMs need a lot of data, to the point that applying copyright to training would leave only a few companies in the game (Google and Microsoft). It would kill the open source scene, and most of the data is owned by websites like Reddit, Stack, and Getty anyway. Individual contributors wouldn't get anything out of it.

You also have to be willfully blind in my opinion to seriously think generative AI has as narrow a scope as crypto.

This stuff is rocket fuel for the gaming industry, for instance. It will let indie companies put out triple-A games, and I'm guessing next-gen RPGs will have fully interactive NPCs.

[–] Hackworth@lemmy.world 1 points 6 months ago (1 children)

I've heard every possible combination of thoughts on A.I. We need like a 6-dimensional alignment chart.

[–] trollbearpig@lemmy.world 1 points 6 months ago (1 children)

Nah dude, you need to build an opinion of your own lol. No chart is doing that for you.

[–] Hackworth@lemmy.world 1 points 6 months ago (1 children)

Oh I have an opinion, I just meant to keep track of everyone's for the sake of conversation.

[–] trollbearpig@lemmy.world 1 points 6 months ago

Oh, oops. Sorry lol.

[–] trollbearpig@lemmy.world -1 points 6 months ago* (last edited 6 months ago) (1 children)

Only Google, Microsoft, and Meta are playing the field anyway. The investment in energy alone you need to get these kinds of results is absurd. The only economically viable alternative is open source IMO, and I doubt that's going to happen if these companies have any say in the matter. Funnily enough, that would also fix the training data problem: the data could be created with consent, like any open source collaboration. But instead we need to allow rampant copyright infringement hahahaha.

And about the games, I guess we'll see if any of them ever show up in the real world. To me games are about playing, not about almost-human NPCs anyway. But to each their own and all that; I'm the first to admit I have weird tastes lol.

[–] Grimy@lemmy.world 2 points 6 months ago (1 children)

There are already some open source models from other companies. I also think the requirements will go down with time, but energy is definitely an issue.

Meta actually released an open source model, which jumpstarted the whole ecosystem; anyone can fine-tune a base model now. You can take your favorite hobby, accumulate data on it, build something in a few days, and share it.

I just think the good outweighs the bad, and individuals weren't going to get paid anyway. Most of the data is owned by specific websites, big publishing houses, and the like, so I can overlook the infringement issues.

[–] trollbearpig@lemmy.world -1 points 6 months ago* (last edited 6 months ago)

I don't know, man. I mean, if you assume that the people making the art, music, code, etc. that's being stolen were not going to get paid anyway, then yeah. If they were doing it just for the love of it, they may continue, and with new toys lol.

But I don't think that's a good assumption. Even if it's not a lot, some people do get paid for this kind of work. And now they will not get paid anymore. Maybe that's leveling the playing field. Or maybe that's telling people with talent to stop doing what they do well. Probably both. But at the end of the day we are going to see less art made by people and more made by "AI", much more made by "AI".

And that's the biggest problem IMO: for most people art is social, and part of the reward is the recognition we get from others when we make good art. But with AI that's gone, on the internet at least. The sea of superficially good but mediocre shit we are already seeing is going to kill a lot of indie art.

And then there are the hallucinations, which seem unsolvable... and the environmental damage... and the labor abuses... and the monopolization of the technology... and the misleading marketing... I honestly just see so much damage and almost no benefit yet. Maybe some day it will all pay off, but I don't see it.