Companies have done this on purpose. They all want you to stay in their walled garden, their "ecosystem" of various products. So they make it easy to get into and get connected to people and things, and then make it hard to leave because you're "invested."
EldritchFeminity
I really don't understand how people use Instagram. I've tried, but it's about 45% ads, 10-15% posts by people I don't follow, it's not in chronological order (or any sense of order for that matter), and regardless of whether I was on there yesterday or 2 months ago, it'll show me about 40 posts before saying "You're all caught up from the past 3 days!" and then refuse to show me any more.
I guess this is why I'm here on Lemmy and went crawling back to Tumblr, one of the last vestiges of the old internet. At this point, I'd rather watch a platform die than become marketable to advertisers and shareholders.
There's a much stronger correlation between wealth and conservatism than between age and conservatism. Almost like those who begin to benefit from the system of oppression are incentivized to keep it going.
Tipping is ingrained into our basic economic culture. Restaurant staff (waiters and waitresses in particular) make 80%+ of their money through tips. The federal minimum wage is $7.25 USD, and many states set a higher one (in some places it's easily double that), but almost anywhere it's completely legal to pay wait staff a tipped minimum of around $2.13 an hour and expect them to make up the difference to $15-20 per hour in tips. A standard "good" tip at a restaurant is 20%. Even going to a grocery store you'll often see a tip jar on the counter that people toss their spare change into. Outside of restaurants, no other job is completely dependent on tips to live, but in many service industries it's still customary to tip as a way to show appreciation for a service rendered (especially if they go above and beyond).
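To put rough numbers on that, here's a quick back-of-the-envelope sketch using the federal $2.13 tipped minimum, a hypothetical $15/hour target, and a made-up $40 average check (not any particular state's actual rules):

```python
# Rough illustration of the tip math described above. The target wage and
# average check are hypothetical figures, not legal or real-world values.

TIPPED_MIN_WAGE = 2.13   # federal tipped minimum, USD/hour
TARGET_WAGE = 15.00      # the living-wage-ish figure servers aim for
AVG_CHECK = 40.00        # hypothetical average table check
TIP_RATE = 0.20          # a standard "good" tip

# How much per hour has to come from tips just to hit the target
tips_needed_per_hour = TARGET_WAGE - TIPPED_MIN_WAGE
print(f"Tips needed: ${tips_needed_per_hour:.2f}/hour")   # $12.87/hour

# How many $40 tables tipping 20% that works out to each hour
tip_per_table = AVG_CHECK * TIP_RATE
print(f"Tip per table: ${tip_per_table:.2f}")                          # $8.00
print(f"Tables per hour: {tips_needed_per_hour / tip_per_table:.1f}")  # ~1.6
```

In other words, nearly all of the wage is expected to come from customers rather than the employer.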
On the one hand, yes, and Fandom is a blight on the internet.
On the other hand, AIs like ChatGPT are wrong some 53% of the time. The fact that this is another "use nontoxic glue to keep your cheese from sliding off of your pizza" situation doesn't mean that Google isn't equally culpable for doing nothing to prevent these sorts of occurrences even when the sources are right (the AI is as likely to make up things that aren't even in its cited sources as it is to actually give you info from them).
If they haven't been swayed already, this won't do a damn thing.
Sexism? Absolutely. Self-awareness? Not so much.
4chan is where incels were born.
Another Millennial here, so take that how you will, but I agree. I think Gen Z is very tech literate, but only in specific areas, and that doesn't necessarily translate to the kinds of competency we actually mean when we say "tech savvy" - especially when you start talking about job skills.
I think Boomers especially see anybody who can work a smartphone as some sort of computer wizard, while the truth is that Gen Z grew up with it and were immersed in the tech, so of course they're good with it. What they didn't grow up with was having to type on a physical keyboard and monkey around with the finer points of how a computer works just to get it to do the thing, so of course they're not as skilled at it.
Because we're talking pattern recognition levels of learning. At best, they're the equivalent of parrots mimicking human speech. They take inputs and output data based on the statistical averages from their training sets - collaging pieces of their training into what they think is the right answer. And I use the word think here loosely, as this is the exact same process that the Gaussian blur tool in Photoshop uses.
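To be concrete about what I mean by "statistical averages," here's a deliberately tiny sketch of that kind of pattern-matching - a toy bigram model. It's obviously nothing like a real LLM in scale or architecture, but it shows the basic idea of parroting back whatever statistically tended to come next in the training data:

```python
from collections import Counter, defaultdict
import random

# Toy bigram "model": it only learns which word statistically tends to
# follow which, then parrots those patterns back. No understanding,
# just frequencies pulled from the training text.
training_text = "the cat sat on the mat the dog sat on the rug".split()

follows = defaultdict(Counter)
for current, nxt in zip(training_text, training_text[1:]):
    follows[current][nxt] += 1

def generate(start, length=6):
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:
            break
        # Pick the next word weighted by how often it followed in training
        choices = follows[word]
        word = random.choices(list(choices), weights=choices.values())[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat on the rug"
```

It never produces anything that isn't a recombination of what it was fed; it just remixes the statistics of its training set.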
This matters in the context of the fact that these companies are trying to profit off of the output of these programs. If somebody with an eidetic memory tried to sell pieces of works they'd consumed as their own - or even somebody copy-pasting bits from CliffsNotes - they'd get in trouble; the same should go for these companies.
Given A and B, we can understand C. But an LLM will only ever be able to give you AB, A(b), and B(a). And they've even been caught spitting out A and B wholesale, proving that they retain their training data and will regurgitate copyrighted material in its entirety.
Reminds me of when I read about a programmer getting turned down for a job because they didn't have 5 years of experience with a language that they themselves had created 1 to 2 years prior.
The argument that these models learn in a way that's similar to how humans do is absolutely false, and the idea that they discard their training data and produce new content is demonstrably incorrect. These models can and do regurgitate their training data, including copyrighted characters.
And these things don't learn styles, techniques, or concepts. They effectively learn statistical averages and patterns and collage them together. I've gotten to the point where I can guess which model of image generator was used based on the same repeated mistakes they make every time. Take a look at any generated image, and you won't be able to identify where the light source is, because the shadows come from all different directions. These things don't understand the concept of a shadow or lighting; they just know that statistically lighter pixels are followed by darker pixels of the same hue, and that some places have collections of lighter pixels.

I recently heard about an AI that scientists had trained to identify pictures of wolves, and it was working with incredible accuracy. When they went in to figure out how it was distinguishing wolves from dogs like huskies so well, they found that it wasn't looking at the wolves at all. 100% of the images of wolves in its training data had snowy backgrounds, so it was simply searching for concentrations of white pixels (and therefore snow) to decide whether or not a picture was of a wolf.
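The shortcut that wolf classifier landed on is basically equivalent to something like this deliberately dumb sketch (the threshold numbers are made up, purely to illustrate the point - it classifies on snow and never looks at the animal):

```python
import numpy as np

# Hypothetical illustration of the wolf/husky story: a "classifier" that
# just measures how much of the image is near-white (i.e. snow).
# The image is assumed to be an RGB array with values in 0-255.
def predict_wolf(image, white_threshold=220, snow_fraction=0.3):
    near_white = (image > white_threshold).all(axis=-1)  # pixels that look like snow
    return near_white.mean() > snow_fraction             # lots of snow -> "wolf"

# A fake all-snowy image gets labeled "wolf" regardless of what's in it
snowy = np.full((64, 64, 3), 240, dtype=np.uint8)
grassy = np.full((64, 64, 3), 90, dtype=np.uint8)
print(predict_wolf(snowy), predict_wolf(grassy))  # True False
```

High accuracy on the training set, zero understanding of what a wolf is - which is the whole problem.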
It's fairly common for women to start taking estrogen post-menopause because their estrogen levels drop, and that can cause issues like losing bone density. I believe HRT was originally developed specifically for that reason, and it was only later that it was tried as a treatment for gender dysphoria.
But in this case, they specifically call out being a femboy on HRT, which I think pretty clearly says that they're on estrogen for that female figure. Whether it's DIY or just informed consent is anyone's guess, though.