this post was submitted on 10 Jan 2024
1237 points (96.5% liked)
Technology
Their argument is that copying works into their training dataset is "research", which would make the unauthorised copying legal fair use. However, research normally produces a prototype, and that prototype is distinctly different from the final commercial product. With LLMs the prototype is the finished commercial product, and they keep adding to it, so this isn't normal fair use.
When a court considers fair use, the first step is the purpose of the use; the recognised categories include education, research, news reporting, comment, and criticism. Next the court considers the nature of the use, in particular whether it is commercial. Calling their copying "research" is a stretch: they aren't writing academic papers or making their data publicly available for review by other scientists, and their use is unquestionably commercial. Still, it takes a judge to decide, and it's very difficult for anyone to show a cause of action, if only because all the copying happens secretly behind closed doors.
The output of the AI itself is a bit more difficult. The model ChatGPT runs on does not contain the whole works it learned from; all the copying happens in the training dataset. However, ChatGPT and other LLMs can sometimes still manage to reproduce the original works, and arguably that should be an offense. If a human being read a book and later wrote a story that replicated significant parts of it, they would be guilty of plagiarism and copyright infringement, regardless of whether they genuinely believed they were coming up with original ideas.