News
Claude maker Anthropic's use of copyright-protected books in its AI training process was "exceedingly transformative" and fair use, US senior district judge William Alsup ruled on Monday.
Anthropic’s use of copyrighted books to train its artificial intelligence assistant Claude was “exceedingly transformative” and a fair use, a federal judge ruled.
A federal judge ruled that Meta did not violate the law when it trained its AI models on 13 authors’ books.
IBM sees enterprise customers using ‘everything’ when it comes to AI; the challenge is matching the LLM to the right use case ...
A judge’s decision that Anthropic’s use of copyrighted books to train its AI models is a “fair use” is likely only the start of lengthy litigation to resolve one of the most hotly ...
A federal judge in California has issued a complicated ruling in one of the first major copyright cases involving AI training, finding that while using books to train AI models constitutes fair ...
A federal judge found that the startup Anthropic’s use of books to train its artificial-intelligence models was legal in some circumstances, a ruling that could have broad implications for AI ...
A judge has sided with Anthropic in a copyright case that determined that the company training its AI models on purchased books is fair use.
Key fair use ruling clarifies when books can be used for AI training. In landmark ruling, judge likens AI training to schoolchildren learning to write.
While the startup has won its “fair use” argument, it potentially faces billions of dollars in damages for allegedly pirating over 7 million books to build a digital library.
The judge also ruled fair use law allowed Anthropic to take purchased physical books and scan them into a digital “research library” that can be used to train its models.