Image Credit - Deadline

BFI Warns Against AI’s Threat to UK Film

June 12, 2025

Technology

UK Screen Sector Faces ‘Direct Threat’ from AI Script Raid

Companies developing artificial intelligence are using an enormous library of 130,000 television and film screenplays to train their systems, a practice the British Film Institute has declared a “direct threat” to the United Kingdom's screen industry. This massive, unauthorized harvesting of protected works poses a significant danger to the long-term health of Britain's entertainment production sector.

In a detailed analysis, the BFI explores the potential advantages and disadvantages of AI for Britain's film, television, video game, and visual effects businesses. The institute also voices serious apprehension that the automation of certain tasks could eradicate entry-level opportunities, cutting off a vital pathway for emerging professionals entering the workforce.

The Core of the Copyright Crisis

The document identifies the most critical problem confronting the £125 billion sector as the use of protected creative assets, also known as intellectual property, to train generative AI systems without authorization or compensation for the original creators. Creative professionals are contesting this massive data collection, which they view as a widespread violation of copyright law.

Britain's creative fields are strongly advocating for a system requiring explicit consent. Such a framework would obligate AI developers to obtain approval and negotiate licensing agreements before they could use existing content, in contrast with the "opt-out" models that critics say unfairly burden creators. The government is actively reviewing which regulations to implement to address the issue.

A Looming Threat to New Talent

The capacity of artificial intelligence to handle tasks once performed by junior employees is creating widespread alarm. Functions such as scriptwriting, translation, and certain technical visual effects work can now be performed by machines. This development is fuelling anxieties about job obsolescence, with particular concern for roles that have traditionally offered a first step on the career ladder for new entrants to the industry.

The BFI’s analysis highlights a stark reality: AI’s power to automate functions directly endangers job security, especially for those in starting positions. Industry leaders and unions echo this concern, warning that the displacement of junior workers could have long-term consequences for the diversity and skill base of the entire workforce. Preparing the current and future workforce for this integration is therefore an essential, urgent task.

Government Pledges and Industry Doubts

The UK government has acknowledged the necessity for action, with officials aiming to find a workable balance that encourages AI innovation while protecting the nation's creators. Lisa Nandy, the Culture Secretary, has worked to calm the creative sector's fears, promising a series of roundtables with industry representatives to help draft enforceable legislation.

However, a sense of urgency and scepticism pervades the creative sector. High-profile leaders, including the director-general of the BBC and the chief executive of Sky, have publicly criticised proposals that would permit technology firms to use copyrighted material without permission. The creative industries launched a "Make it Fair" campaign to raise public awareness, arguing that current proposals favour large tech companies to the detriment of UK artists.

BFI

Image Credit - OECD

The Double-Edged Sword of Technology

The study recognizes the technical upsides of artificial intelligence, such as making performers appear younger and enhancing the believability of spoken dialects, as in the controversial use on The Brutalist featuring Oscar-winner Adrien Brody. These tools can dramatically accelerate production timelines and make content creation more accessible. Yet the report equally highlights anxieties about workforce reductions.

This duality places the industry at a critical inflection point. Rishi Coupland, the BFI's director of research and innovation, said that AI presents considerable potential but could also weaken established commercial structures, displace experienced professionals, and erode public confidence in media. The technology offers transformative opportunities alongside significant risks to the established order.

A Critical Shortfall in Skills and Training

To navigate this new landscape, the workforce must adapt. The BFI report, however, identifies a "severe deficit" in the availability of AI-related training. Much of the current instruction on AI is informal, and the study notes that many professionals, especially freelancers, cannot access the resources they need to develop skills that complement artificial intelligence.

Without structured programmes, there is a risk that the workforce will be unprepared for the widespread integration of AI. The report, which was released in collaboration with Loughborough, Edinburgh, and Goldsmiths universities, stresses that upskilling is essential. It calls for targeted investments to ensure that creative professionals can work alongside AI, leveraging its capabilities rather than being replaced by them.

UK’s Global Standing at Risk

London is a global powerhouse in creative fields, second only to Mumbai as a centre for visual effects talent. It is home to world-leading companies such as Framestore and DNEG, which contributed to blockbuster franchises including Avengers: Endgame, Harry Potter, and the BBC's television adaptation of His Dark Materials. The United Kingdom is also a base for more than thirteen thousand creative technology firms.

This world-leading position is now in jeopardy. The BFI report warns that without a clear strategy and robust regulatory framework, the UK could lose its competitive edge. The current paradigm of uncompensated data scraping threatens the economic model that has allowed these industries to flourish. Protecting intellectual property is therefore not just a matter of fairness, but a crucial component of maintaining the UK's status as a creative superpower.

The Call for Responsible Innovation

The 45-page document sets out nine key recommendations designed to guide the industry through this period of disruption. Central among them is the urgent establishment of a formal marketplace for intellectual property licensing, supported by new frameworks and agencies to broker such deals. This would create a clear, legal pathway for AI developers to access creative works while ensuring rights holders are compensated.

Other recommendations focus on responsible and ethical AI development. The report calls for fostering "market-aligned, culturally sensitive AI instruments" to avoid the risk of cultural homogenisation that could result from models trained on narrow datasets. It also highlights the significant environmental impact of large-scale AI models, urging the adoption of sustainability standards to minimise their carbon footprint.

The Role of National Research Labs

This analysis was also produced alongside the CoStar Foresight Lab, a £75.6 million national network of research facilities dedicated to developing new technologies for the screen sector. The partnership underscores the importance of collaboration between industry, academia, and research bodies. CoStar aims to provide the sector with the intelligence and tools needed to adapt to technological shifts and maintain its global competitiveness.

Jonny Freeman, the director of CoStar, remarked that AI offers powerful tools for boosting creativity and productivity throughout all phases of content creation. He cautioned, however, that the technology raises pressing concerns about skills, workforce adjustment, ethical standards, and the industry's long-term viability. The lab's role is to provide an evidence-led roadmap to help all parties make informed decisions.

High-Profile Criticisms and Government Assurances

The debate over AI and copyright has intensified recently, drawing in major industry figures. The heads of both the BBC and Sky have voiced strong opposition to proposals that would allow tech companies to use protected creations without consent. Their intervention highlights the unified front the creative sector is presenting against what it perceives as an existential threat.

In response, Culture Secretary Lisa Nandy has stated that her team approached the matter without any predetermined leanings. She affirmed that, for a Labour government, the principle that creators should be paid for their work is foundational, and gave her assurance that any policy that does not work for the creative industries will not be acceptable, with the aim of rebuilding trust and fostering collaboration.

Empowering a New Wave of Creators

Despite the profound challenges, the study also identifies a silver lining. By lowering technical and financial barriers, artificial intelligence could enable a fresh generation of UK artists to produce higher-quality work. Independent creators working with modest budgets could gain access to sophisticated production tools that were once the exclusive domain of major studios, leading to a surge in innovation.

This potential can only be realised, however, if the underlying framework is equitable and viable. The report argues for increasing funding for accessible tools and fostering ethical AI products designed to support independent talent. If managed correctly, the AI revolution could create a more inclusive and dynamic creative ecosystem, where new voices can thrive regardless of their resources or prior experience.

BFI

Image Credit - Broadcast Now

Navigating the Legal and Ethical Maze

The current legal landscape surrounding AI and copyright is fraught with uncertainty. Tech companies training their models on vast swathes of internet data often do so without a clear legal basis, prompting high-stakes litigation. The UK government’s consultation on the issue is an attempt to provide clarity and establish a framework that can command the trust of both the tech and creative sectors.

Transparency is a key demand from Britain’s creative fields. They are calling for AI developers to disclose what materials their models are trained on. Furthermore, there is a push for clear labelling so that audiences know when AI has been used in the creation of media content. A recent survey showed strong public support for such disclosures, indicating a desire for authenticity.

The Freelancer’s Predicament

The UK’s screen world relies heavily on a flexible, freelance workforce. These independent professionals are often the most vulnerable to industry shifts. The BFI's report highlights that freelancers, in particular, often cannot get the formal training and resources needed to adapt to the rise of AI. Without institutional support, many risk being left behind as new, AI-integrated workflows become standard practice.

Addressing this gap is crucial for the health of the entire ecosystem. The report’s call for greater funding in skills and training is especially pertinent for this segment of the workforce. Ensuring that freelancers have opportunities to upskill will be essential for maintaining the talent pipeline and ensuring the UK’s creative industries remain resilient and adaptable.

The Spectre of Homogenisation

A significant ethical concern raised in the report is the risk of cultural homogenisation. Many of the most powerful generative AI systems are trained on datasets that are predominantly American or otherwise culturally narrow. If these tools become standard within Britain, they could inadvertently sideline local narratives and marginalise unique British voices. This would be a profound loss for a nation known for its distinctive creative output.

To counter this, the report advocates for supporting the creation of culturally sensitive AI tools. This involves cross-disciplinary collaboration between technologists and creative professionals to build models and platforms better suited to the specific needs of the UK industry and its audiences. National institutions are already experimenting with fine-tuning AI models to reflect their own editorial standards.

An International Perspective

The UK is not alone in grappling with these issues. Governments and creative industries around the world are facing the same set of challenges. The European Union has moved to implement a copyright framework that includes an opt-out for rights holders, a model the UK government is now considering. How the UK decides to regulate AI will have significant implications for its international competitiveness.

The BFI's suggestions aim to position Britain as a global leader by creating a fair and solid marketplace for intellectual property rights in the era of AI. By creating a clear, ethical, and commercially viable framework, the UK could attract investment and become a global hub for responsible AI-supported content production, setting a standard for other nations to follow.

The Path Forward

The analysis from the BFI serves as both a stark warning and a call to action. It paints a picture of a creative sector at a crossroads, facing both an existential threat and a moment of unprecedented opportunity. The path forward requires a delicate balancing act: fostering innovation in AI while protecting the intellectual property that is the foundation of the creative fields.

Success will depend on collaboration. Policymakers, tech companies, industry bodies, and creative professionals must work together to build a future where technology serves human creativity rather than supplanting it. The coming months of consultation and legislative debate will be critical in determining whether the UK can successfully navigate this transition and ensure its world-leading screen sector continues to thrive.
