The United States Federal Trade Commission's (FTC) report, Generative Artificial Intelligence and the Creative Economy, offers new insight into how generative artificial intelligence is affecting professionals in music, filmmaking, software development, and other creative fields.
The report follows an October 2023 roundtable that brought together visual artists, programmers, screenwriters, and musicians, each of whom articulated their individual and collective experiences with generative AI. The central theme was unmistakable: generative AI, despite its potential to revolutionise creative processes, is a double-edged sword.
AI Training: Use of Creative Works Without Consent?
A critical concern discussed was the unauthorised scraping of creative content for training AI models. Participants like Duncan Crabtree-Ireland of SAG-AFTRA emphasised that AI’s capabilities are inextricably tied to the creative input it receives, noting that “no AI algorithm is able to make something out of nothing.” This reliance on human creativity, however, is marred by opaque practices. Douglas Preston, author of 32 New York Times bestsellers and former President of the Authors Guild, reported that his work had been used without his consent, raising pressing questions about copyright, ownership, and moral rights.
AI companies “refuse to answer any questions about what data sets they're using, where they're getting their books, and how they're being used. There's no transparency at all. It's an absolute black hole.” — Douglas Preston
Legally Confused? Rights, Ownership, and Transparency
Many artists, unaware of the terms embedded in their contracts or those governing the platforms they use, find themselves inadvertently surrendering their creative rights. This situation is exacerbated by a lack of transparency from AI developers about their data sources and training practices. The need for legal clarity and advice has never been more pronounced. Although the landscape is evolving quickly, proper legal representation can help creators understand the full scope of how their work may be used.
This is especially important as AI-generated content is increasingly appearing alongside human-made works, blurring lines and creating confusion. As Umair Kazi from the Authors Guild pointed out, the inundation of AI-created content in marketplaces like online book retailers is not just an issue of market saturation; it's a direct threat to the livelihoods and reputations of human creators.
"A voice actor in New York worked for a company for three years, and year four, they were let go because they were told the company had enough of their audio, and they were going to now create a synthetic version of their voice." — Tim Friedlander, president, National Association of Voice Actors
Imitation or Infringement? The Peril of Style Mimicry
A particularly alarming concern is AI’s ability to mimic the unique styles and voices of individual creators. Illustrators like Karla Ortiz and writers represented by the Authors Guild are witnessing the emergence of AI-generated works that imitate their styles, potentially misleading audiences and infringing upon their personal brands. This phenomenon extends beyond visual arts and literature: voice actors, represented by Tim Friedlander, face similar challenges with AI-generated voice clones.
The Response: Unionising and Advocating for Rights?
In response to these challenges, many creators are turning to collective action. Union contracts and advocacy efforts are becoming vital tools for artists seeking to enshrine their rights to choose whether and how AI is used in their work. The collective voice of creators could be a powerful force in pushing for fairer practices and greater transparency in the AI arena.
For studio executives, producers, and all decision-makers in the creative sector, the message from the FTC roundtable seems clear: embrace the future, but do so with your eyes wide open. In navigating these uncharted waters, staying informed and proactive is key. Understanding the legal intricacies of AI technology and its impact on creative rights is not just a matter of compliance but a strategic imperative. It is also about preserving the sanctity of creative expression, and ensuring that innovation does not come at the cost of the creators who fuel it.