This post was submitted on 21 May 2024
422 points (97.7% liked)
Technology
As long as this is opt-in and users understand the risks, I don't have a problem with it. I wouldn't use it on my personal PC, but it would probably be handy for my work PC. (Although my organization would probably block the feature for security reasons, so maybe it's not actually that useful after all.)
It'll be opt-out, with the setting buried in some obscure, hard-to-find menu, just like every other AI program. And that's assuming they're even required to let you opt out.
This is conjecture. Maybe we should wait before we make assumptions? Am I being too logical for /c/technology?
It's conjecture based on how other companies have handled AI data, as well as how Microsoft itself generally handles things.
I'd rather prepare for corporate greed and be pleasantly surprised than be disappointed when Microsoft once again does something that negatively impacts its userbase in the name of profits (or MAUs, or whatever else looks good on the quarterly report).