this post was submitted on 24 Apr 2024
-60 points (15.1% liked)

[–] eager_eagle@lemmy.world 45 points 7 months ago* (last edited 7 months ago) (6 children)

sigh

The news:

So, in conclusion: If your face is large, you’re a conservative; if it’s skinny, you’re a liberal; and facial recognition is bad—we all know that. That seems to be all you need to know.

The paper:

Our results, suggesting that stable facial features convey a substantial amount of the signal, imply that individuals have less control over their privacy. The algorithm studied here, with a prediction accuracy of r = .22, does not allow conclusively determining one's political views, in the same way that job interviews, with a predictive accuracy of r = .20, cannot conclusively determine future job performance.

r = 0.22 is a weak-to-moderate correlation, btw. An actual predictor would need more data than just someone's face to have a decent chance.
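
For a rough sense of scale (toy numbers of my own, not data from the paper): a correlation of r = 0.22 corresponds to an R² of about 0.05, so a face-based score of that strength explains roughly 5% of the variance, and under simple bivariate-normal assumptions it gets a binary liberal/conservative call right only about 57% of the time. A minimal sketch:

```python
# Illustrative sketch (simulated toy data, not from the paper):
# what a correlation of r = 0.22 buys you as a predictor.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
r = 0.22

# A latent "true orientation" score and a predictor correlated with it at r.
orientation = rng.standard_normal(n)
noise = rng.standard_normal(n)
predictor = r * orientation + np.sqrt(1 - r**2) * noise

print("empirical r:        ", np.corrcoef(predictor, orientation)[0, 1])
print("explained variance: ", r**2)  # ~0.048, i.e. ~5% of the variance

# Binary framing: how often does the predictor's sign match the true sign?
# With r = 0.22 this lands around 57%, only modestly better than a coin flip.
print("sign agreement:     ", np.mean(np.sign(predictor) == np.sign(orientation)))
```

Under those simplified assumptions the sign agreement works out to 1/2 + arcsin(r)/π ≈ 0.57, which is why a single weak signal like this would have to be combined with other data before it predicts much of anything.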

[–] brian@lemmy.ca 2 points 7 months ago

What's amusing to me is that they pointed out that job interviews have similar reliability, but didn't say whether that's actually good or not. They just let the framing imply that both are highly reliable.
