Press "Enter" to skip to content

Will AI reinforce bias in the movie-making decision process or remove it?

Using artificial intelligence to make decisions about movies offers a starring role for explainability and a chance to build trust.

Last week at CES 2020, Lynne Parker asked, “Whose job is it to protect us from the black box?” during a session on global leadership in artificial intelligence (AI). Parker is the deputy chief technology officer at the White House Office of Science and Technology Policy.


On the same day, The Hollywood Reporter broke the news that Warner Bros. studio had signed a deal with Cinelytic, an AI-driven project management system. The platform uses predictive analytics “to guide decision-making at the greenlight stage” and “assess the value of a star in any territory and how much a film is expected to make.” 

Several studios already use AI platforms with the goal of avoiding expensive flops like “Playmobil” (production budget: $40 million vs. global box office: $13 million) and “Cats” (production budget: $100 million vs. global box office: $10.9 million).

With more and more studios using these tools, there will be a high-profile test of bias in AI and of the power of explainability.

Reinforcing bias or removing it?

Adding artificial intelligence to the movie-making process could go either way in terms of getting more profitable movies on the big screen. Relying on data to pick winners could result in more horrible sequels if it's powered by a dataset of “we've always done it this way” numbers (see Amazon's failed hiring system and the algorithm Goldman Sachs uses to approve credit limits for the Apple Card).

If the dataset is reviewed for bias at the outset and regularly updated, AI-driven decision-making could ease the bias inherent when one homogeneous group makes all the funding decisions. UCLA's annual Hollywood Diversity Report suggests audiences prefer movies and TV shows that feature relatively diverse casts. “Captain Marvel” and “Black Panther” each made more than $1 billion worldwide, but women and Black actors still don't get as many lead roles as white men.

There are even more important decisions that AI will influence, such as who gets approved for a loan or who ends up on a watch list, but these Hollywood decisions could be higher profile than most. Looking at which movies have a better chance of being made is an easier way to understand the potential bias in one AI system that could be present in many other applications. The questions about how that process works in a creative industry are worth asking in every other industry where AI is influencing and making decisions (a rough sketch of what one such check could look like follows the list):

  • What is the data set being used to make these decisions?
  • How often is it updated? 
  • Has the data been tested for bias? 
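To make that last question concrete, here is a minimal, hypothetical sketch of a bias check on historical greenlight decisions. The column names (lead_gender, greenlit) and the values are invented for illustration; a real audit would run against a studio's actual data and examine many more attributes.

```python
# Minimal sketch of the kind of bias check the questions above imply.
# The dataset and column names are hypothetical; a real audit would run
# against the studio's actual historical greenlight decisions.
import pandas as pd

# Toy historical greenlight decisions (illustrative values only).
history = pd.DataFrame({
    "lead_gender": ["male", "male", "female", "male", "female", "male"],
    "greenlit":    [1,      1,      0,        1,      1,        0],
})

# Greenlight rate per group.
rates = history.groupby("lead_gender")["greenlit"].mean()
print(rates)

# Disparate-impact-style ratio: a value far below 1.0 suggests the
# historical data favors one group, and a model trained on it would
# likely learn to do the same.
ratio = rates.min() / rates.max()
print(f"selection-rate ratio: {ratio:.2f}")
```

A selection-rate ratio well below 1.0 would flag that the historical record itself is skewed, and that any model trained on it could simply reproduce that skew.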

To write the diversity report, researchers at the UCLA College of Social Sciences consider the race and gender of lead actors, show creators, writers, and directors, as well as global and domestic box office returns, viewer and social media ratings, Oscar and Emmy awards, genre, show locations, international distribution, and the demographics of ticket buyers.

Women are slightly more than half the US population, but they represented 33% of film leads, 40% of TV leads, and 43% of cable leads. People of color made up less than 22% of leading roles in movies, TV shows and cable shows, even though this group represents nearly 40% of the US population. 

In the 2019 Hollywood Diversity Report, the researchers concluded that despite some improvements, “the kind of structural change necessary … simply has not occurred in the film sector.” AI-powered decision-making has the power to reinforce existing bias or to support the structural changes that have not happened yet.

Opening up the AI black box

Another opportunity for AI in the movie world is the chance to explain explainability. IBM lists “explainability” as one of the four pillars of trusted AI required to build confidence in the datasets and the automated decision-making process, and it is the big opportunity here. Actors obviously have more ways to make money than the warehouse employees or food service workers whose jobs are most likely to be affected by machine learning, but some of the same issues will come to light.
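As a hypothetical illustration of what an explanation could look like, the sketch below fits an interpretable model to a handful of invented projects and prints the weight each feature carries in the score. The features, data, and model are assumptions made for the sake of example; commercial platforms like Cinelytic have not published how their systems work.

```python
# A minimal sketch of one explainability technique: fitting an interpretable
# model and reading its coefficients. All features and data are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy features per project: [budget ($M), franchise (0/1), diverse_lead (0/1)]
X = np.array([
    [150, 1, 1],
    [100, 1, 0],
    [ 40, 0, 0],
    [ 90, 0, 1],
    [200, 1, 1],
    [ 30, 0, 0],
])
y = np.array([1, 1, 0, 1, 1, 0])  # 1 = profitable, 0 = flop (illustrative)

model = LogisticRegression().fit(X, y)

# The coefficients give a first, human-readable answer to the question
# "why was this project scored the way it was?", which is the core of
# explainability.
for name, coef in zip(["budget", "franchise", "diverse_lead"], model.coef_[0]):
    print(f"{name}: {coef:+.3f}")
```

Even a crude readout like this answers the question a producer, or an actor, would actually ask: which factors pushed the score up or down?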

This new decision-making process in the movie industry also provides a new area of study for the UCLA College of Social Sciences researchers who write the diversity report. It's also a possible competitive edge for AI firms looking to win more business in the creative field. Trust is the key to winning humans over when it comes to AI, and explainability is key to building that trust.


Source: TechRepublic