this post was submitted on 15 Mar 2024
491 points (95.4% liked)

[–] whoisearth@lemmy.ca 9 points 8 months ago (1 children)

So my work uses ChatGPT as well as all the other flavours. It's getting really hard to stay quiet about all the moral quandaries being raised over how these companies source their AI training data.

I understand we all feel like we're on a speeding train that can't be stopped or even slowed down, but this shit ain't right. We need to start forcing businesses to have a moral compass.

[–] RatBin@lemmy.world 3 points 8 months ago

I spot a lot of people GPT-ing their way through personal notes and research. Where you used to see Evernote, Office, Word, or a note-taking app, you now see a lot of GPT. I feel weird about it.

[–] Fedizen@lemmy.world 7 points 8 months ago* (last edited 8 months ago)

This is why code AND cloud services shouldn't be copyrightable or licensable without some kind of transparency legislation to ensure people are honest. Either forced open source, or some kind of code-review submission to a government authority that can be unsealed in legal disputes.

[–] RatBin@lemmy.world 3 points 8 months ago

Obviously nobody fully knows where so much training data comes from. These companies used web scraping tools like there was no tomorrow, and with that amount of information you can't tell where all the training material came from. That doesn't mean the tool is unreliable, but it does mean we don't truly know why it's that good, unless you can somehow access all the layers of the digital brains operating these machines; that isn't doable with a closed-source model, so we can only speculate. This is what's called a black box, and we use it because we trust the output enough to do so. Knowing the process behind each query in detail would thus be taxing.

Anyway... I'm starting to see more and more AI-generated content. YouTube is slowly but surely losing significance and importance for me, as I no longer search for information there, AI being one of the reasons for this.
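As a minimal illustration of the "web scraping tools" mentioned above, the sketch below fetches a single page and extracts its visible text using only Python's standard library. The URL and the whole setup are hypothetical, and this is nothing like the scale of a real training-data pipeline, which involves crawling, deduplication, and filtering across billions of pages.

```python
# Minimal sketch: download one page and keep only its visible text,
# the kind of raw material web-scale scrapers collect for training corpora.
from html.parser import HTMLParser
from urllib.request import urlopen


class TextExtractor(HTMLParser):
    """Collects text nodes, skipping <script> and <style> content."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def scrape_text(url: str) -> str:
    """Fetch one URL and return its visible text, one chunk per line."""
    with urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    return "\n".join(parser.chunks)


if __name__ == "__main__":
    # Hypothetical target; any public URL would do.
    print(scrape_text("https://example.com")[:500])
```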
