On 18 January 2024, the UK Government released The Generative AI Framework for HMG, which offers insight on using generative AI (primarily Large Language Models, or "LLMs") safely and securely. Although the Framework is intended for civil servants and people working in government organisations, the ten Principles it contains for using generative AI responsibly could also be applied to the creative and audiovisual sectors.
The 10 Principles for the responsible use of AI - and what the media and entertainment industry can learn
Principle 1: "You know what generative AI is and what its limitations are"
Understanding the capabilities and limitations of generative AI is crucial to making informed, intelligent decisions. In the entertainment industry, stakeholders should understand that AI models do not possess creative intuition or storytelling abilities on their own. Because AI lacks personal experiences, emotions, and contextual awareness, it can assist - but not replace - the creative instincts and artistic vision of human professionals. Overreliance on AI might compromise artistic quality, audience engagement, and the unique talents of human professionals.
Principle 2: "You use generative AI lawfully, ethically, and responsibly"
Creatives should use generative AI in ways that respect intellectual property rights, privacy, and ethical considerations. The topic of AI ethics is complex and dynamic, and perhaps beyond the scope of a blog focused on legal issues! Nevertheless, anyone using AI for their project should be mindful of ethical issues such as AI bias, the perpetuation of stereotypes, and the likelihood of job displacement across the industry. As for the "lawful" use of AI, new laws including the European Union's incoming AI Act will introduce new regulatory obligations, and lawsuits like Authors Guild v. OpenAI Inc may have considerable implications for how copyright protections apply (or don't!) to LLM-generated content. It's always best practice to take legal advice on how to properly attribute content and sources, and to observe licences.
Principle 3: "You know how to keep generative AI tools secure"
As with any sort of technology - especially those connected to the Internet - media and entertainment companies should ensure that AI tools used in content creation or distribution are secure, to protect sensitive information and confidential data. Amongst other things, proper authentication and access control mechanisms should be in place to prevent not only accidental data breaches, but also intentional leaks or the distribution of fake (but realistic) footage. This is vital to maintaining the confidentiality of unreleased content and proprietary information.
Principle 4: "You have meaningful human control at the right stage"
In the entertainment industry, human sensibility, empathy, and cultural understanding are indispensable for storytelling and character development - and these aspects should remain under human control to create relatable and authentic content. Decision-makers in the creative process (which ideally includes performers too) should have oversight and control over the content generated by AI, to ensure it aligns with their artistic vision, and doesn't compromise their creative authority.
Principle 5: "You understand how to manage the full generative AI lifecycle"
Entertainment companies must comprehend the entire generative AI lifecycle, from software selection and data gathering, to model training and ongoing maintenance. Effective deployment and continuous improvement are essential to ensure that AI enhances creativity and aligns with the creative vision.
Principle 6: "You use the right tool for the job"
Media professionals should choose generative AI technologies that best suit specific creative tasks, while considering cost-effectiveness and sustainability. Whether using GANs for images or language models like GPT-3 for text generation, choosing the right AI technology is critical to maintain content quality and align with the intended creative vision.
Principle 7: "You are open and collaborative"
If producers, studios and other parties with significant control over a project are willing to collaborate with performers, creatives, and the public, it can help to ensure transparency and accountability in the use of AI. Open dialogue with writers' guilds, actors' unions, and industry associations can lead to fairer agreements, better working conditions, and creative insights... or at the very least, help to avoid problems down the line! (See the SAG-AFTRA strike!) As for being "open", if AI-generated content is used in a production or recommended to viewers, transparency about its involvement should be considered - or indeed, depending on the outcome of pending regulation, it may soon be required.
Principle 8: "You work with commercial colleagues from the start"
Creatives should work closely with their commercial partners to ensure they're on the same page regarding budget, resource allocation, and other strategic goals. A lack of communication with business partners - including technology suppliers and technical specialists - can potentially disrupt project timelines and creative plans. In addition to speaking with commercial partners, engaging with legal teams early in AI-related projects can help to avoid legal uncertainties, particularly regarding intellectual property rights, licensing and royalty agreements, and other contractual obligations.
Principle 9: "You have the skills and expertise needed to build and use generative AI"
Without the required skills, content generated using AI may lack the quality, creativity, and relevance expected by audiences. This can lead to decreased viewer engagement and diminished artistic value, as well as frustration from financial backers and other stakeholders. No less importantly, mishandling personal data or failing to protect content from unauthorised access can result in legal liability and potential lawsuits from creators, authors, performers, or other rights holders.
Principle 10: "You use these principles alongside your organisation’s policies and have the right assurance in place"
This Principle emphasises the integration of the above AI Principles into an organisation's existing policies and procedures, so that it's easier to hold individuals and teams accountable for AI-related actions and outcomes. For instance, content development and approval policies can include checks to ensure compliance with Principle 7 ("You are open and collaborative") before AI-generated ideas or content progress down the production line. This is likely to be more relevant to larger studios and production companies with many employees, but it is worth bearing in mind for smaller organisations and even freelancers, too.
A few final thoughts
Although written for civil servants, the UK Government's Generative AI Framework for HMG provides helpful insight on how the entertainment industry might navigate the complex landscape of generative AI. The ten Principles underscore the importance of security, collaboration, and technical skills development, and call for a conscientious approach towards legal and ethical use. Importantly, they also stress the necessity of human oversight. Generative AI is a powerful tool, but its effective and responsible use hinges on a careful balance between technological possibilities and the enduring value of human creativity.