this post was submitted on 18 Feb 2024
114 points (96.7% liked)
Technology
Google asks for tax records, a passport, or an ID card for their developer account and AdWords. They could certainly ask for more information to verify the sender before acting on a DMCA notice.
I doubt they can. Once they learn about something infringing, they have to remove the link "expeditiously". A proper DMCA notification is not even strictly necessary. Even so, the DMCA specifies what information a proper notification must contain, and an email address is sufficient identification.
Look at it from the perspective of the copyright industry. They want to be able to take down infringing material as quickly and easily as possible. If some small players get caught in the crossfire, that doesn't matter. In fact, that's still competition removed. The "official" outlets are safe. Everything else can go.
And it's not like Google would make any friends by lawyering this. If Google made it even a little bit hard for the copyright industry, you'd get complaints about how they are ripping off those poor, starving artists. Everybody's darlings, like Taylor Swift, would make noise. Lemmy would be up in arms against the evil corporation.
Look at this thread: crickets. Look at the threads on AI. What people here want is more takedowns and more restrictive copyright. And the people here are apparently left-wing, traditionally opposed to that sort of thing. I always get gloomy when this comes up.
It reminds me of that bit from It's Always Sunny in Philadelphia where Dennis is talking about Hollywood movies.
It's just so silly and yet so accurate. Whether it's social values, politics, or even just the opinion of AI and its capabilities vs. its potential vs. how people actually use it, there's this pervading idea that restrictions en masse are a viable solution. I feel almost the opposite: to some extent, oversaturation intrinsically lowers the negative reception of a thing. Prohibition philosophy - when something's not allowed, people will work even harder to use it in those ways; when it's not only allowed but widely used and even encouraged, people just inherently care less over time.
We're at a point right now where we are getting some pretty poor-quality oversaturation of AI content, and the tool alone is what gets blamed, to the point where copyright is being touted as a saving grace despite it consistently having been used against smaller artists like us whenever corporate money is involved. Copyright isn't promoting small artists - it rarely has - nor is it preventing AI, but it's somehow suddenly meant to ensure that the art you uploaded isn't reproduced? That seems not only unlikely, but like a scapegoat for a larger issue. Generative art isn't a problem because Ms. Jane, working two 40-hour jobs, uses it to make art featuring existing characters. That circumstance never was and never will be a problem, because Jane very likely would never have had the money to commission an artist in the first place. What Jane makes is 100% irrelevant so long as she's not claiming it as her original creation and trying to sell it - beyond that, I don't think anyone should care or fault her, because she is doing the amount of art that her circumstances allow.
What I absolutely agree is an issue is businesses and corporations using AI to cut staff, further overworking the employees who remain. However, that Secret Invasion intro that seemed likely AI-generated? I can't in good faith argue that "they should be tried for infringement", but I can fully support the idea that they should have hired an artist who would at least try to make better use of the tools at their disposal. I can simultaneously feel that the fact that Deforum may have been used is absolutely awesome, while also being annoyed and frustrated that they didn't utilize artists who deserve the work.
There is a very large difference between Ms. Jane making AI images, even movies, and any corporate product - or that AI-generated rat for the science journal. The former is, IMO, fully necessary for Jane to be able to enjoy the experience of a creative process under the bullshit system we've worked out. The latter is a completely unnecessary replacement used to cut costs. And yet, for neither does the concept of infringement actually matter that much, because copyright isn't the fundamental issue with AI; it's just the one people are latching on to, without realizing that the likelihood of copyright laws helping someone like us is nil. Especially since there's probably an overlap of people who laugh at NFTs and pirate files because bits of data aren't a physical commodity that runs out, yet a generative imaging tool that works the same way is... too far?
I think AI's issues are separate from what I've mentioned here. What people blame AI for is something else entirely. AI is still just a tool that speeds up the process. We have the concept of safeguards - signs, barriers, and nets - so that if someone wants to use a bridge for the wrong purpose, there are some measures in place to prevent them. We don't blame bridges for what the person is trying to do; we recognize that there is some reasonable level of safeguard, and beyond that we just have to trust the person to do the right thing. And even when it does prove to be a pervasive issue, pretty much the bare minimum is done - add another layer and a net and call it a day - instead of focusing on why people in society are so inclined to jump.
The issue is always us. Yes, AI makes evil's job easier, like so many tools have. But trying to safeguard AI to the point of non-existence is absurd from every angle, given that the bad stuff is likely going to happen in abundance regardless. I don't particularly see AI as the evil so much as the humans creating the meaningless AI-generated articles.
What really gets to me is the simple-mindedness of it all. Rich people own a lot of property. They own much more than everyone else. That's the very definition of rich. So when money has to be paid for the "use" of some property, it will disproportionately benefit the rich. I feel that this is obvious.
And yet when it comes to copyright and intellectual property, so many people seem unable to put this together. Somehow, paying money for AI training is supposed to benefit "the starving artist". At first, I thought these were far-right libertarians who, as per usual, put all their faith in property rights. Now I just... I don't even know.
The point would be to require that data before someone could file a DMCA notification, but I agree that you could not fix a badly written law that way.