Former Meta Exec Claims AI Doomed If Artists Demand Permission for Model Training
Former Meta executive Nick Clegg recently voiced concerns about the potential impact of artist-consent requirements on the UK's AI industry. Speaking at the Charleston Festival while promoting his forthcoming book "How to Save the Internet," Clegg was asked about artists' demands for stricter copyright rules governing the use of their creative works in training AI models.

Clegg acknowledged that artists should have the right to opt out of having their work used for AI training. However, he argued that requiring explicit consent from every artist before using their work would be impractical given the vast amounts of data involved in training these models. "I just don’t know how you go around, asking everyone first. I just don’t see how that would work," he said. He further warned that imposing such a requirement in the UK alone would effectively halt the local AI industry, saddling it with barriers that other countries might not impose.

The debate over AI copyright law in the UK has been heating up in recent months. In October 2024, the UK government introduced the Data (Use and Access) Bill, which allows tech companies to use creative works such as books and music for AI training unless the copyright holder explicitly opts out. The bill aims to foster innovation and growth in the AI sector by providing access to essential training data.

Earlier in May, the House of Lords proposed an amendment to the bill requiring tech companies to disclose their use of copyrighted work and seek consent from rights holders. The House of Commons rejected the change, drawing criticism from various high-profile artists. Elton John, in a BBC interview on May 18, expressed strong opposition to the bill, calling it "criminal" and vowing to "fight it all the way." He said he felt betrayed, viewing the bill as enabling large-scale theft of artistic work.
Paul McCartney, Dua Lipa, Elton John, and Andrew Lloyd Webber, along with hundreds of other musicians, writers, designers, and journalists, signed an open letter supporting the amendment earlier in May. Beeban Kidron, the film producer and director behind the amendment, emphasized the importance of transparency: if tech companies were forced to disclose the copyrighted content used in their AI models, she argued, it would deter misuse and strengthen copyright enforcement.

Technology Secretary Peter Kyle defended the rejection of the amendment, saying that both the AI and creative sectors need to succeed and prosper. Kidron and her supporters remain determined, however. In a Guardian op-ed, Kidron promised that "the fight isn’t over yet," noting that the Data (Use and Access) Bill returns to the House of Lords in early June, where further debate and potential amendments could still occur.

The controversy highlights the tension between fostering AI innovation and protecting intellectual property rights. Tech companies argue that stringent consent requirements could hinder progress and stifle innovation, while artists and creators contend that transparency and consent are crucial for ethical development and fair compensation.

Clegg's views draw on extensive experience in both politics and technology. As the UK's deputy prime minister from 2010 to 2015, he gained a deep understanding of the political process. His tenure at Meta, where he served as vice president for global affairs and communications from 2018 to 2022 and later as president of global affairs, gave him firsthand knowledge of the challenges tech companies face in developing and deploying AI technologies. His arguments are echoed by industry insiders who share his concerns about the practical difficulty of obtaining individual consent for massive datasets.
They believe such requirements could create significant legal and logistical hurdles, potentially derailing the UK's AI industry. Creative professionals and their supporters, on the other hand, argue that existing copyright law is inadequate and must evolve to address the unique challenges posed by AI.

This ongoing debate underscores the need for a balanced approach that protects artists' rights without stifling technological advancement. Policymakers will continue to grapple with the issue as they seek to strike the right balance between innovation and fairness. The outcome of the upcoming House of Lords discussions will be closely watched, as it could set a precedent for how other countries choose to regulate their AI industries.