
Invisible Data Biases Threaten to Undermine AI’s Progress, Set Sports Properties Back


Artificial Intelligence (AI) is revolutionizing the sports industry. It now powers everything from new player recruitment and performance evaluation platforms to innovative fan personalization and revenue optimization tools.

However, hidden biases in the data underpinning these applications threaten to undermine the technology’s progress and set the industry back.

That may sound dramatic. But the real-world examples below show why sports properties should address potential biases in their datasets from the start (or as close to it as possible), before the AI’s output becomes a problem.

  • Racial Bias in Sports Commentary

A June 2020 study by the Danish firm RunRepeat, conducted in collaboration with the Professional Footballers' Association, analyzed 2,000+ commentator statements from 80 professional matches across Europe’s top leagues. It found that players with lighter skin were praised more frequently for their intelligence, work ethic, and leadership qualities than their darker-skinned teammates and opponents.

That kind of bias will influence audience perceptions. It can also affect AI-powered player performance analysis systems. When models are trained on skewed media coverage, they risk perpetuating stereotypes that can lead to unfair and/or inaccurate evaluations. Overlooking a potential star can cost an organization tens of millions of dollars in performance value and lost local revenue.

  • Facial Recognition Bias at Stadiums/Arenas

Several NBA and MLB teams have explored or are using AI-powered facial recognition applications for security purposes or to provide an enhanced fan experience. However, studies, including one conducted by the National Institute of Standards and Technology in 2019, have found that these systems misidentify people of color at higher rates.

That imprecision can lead to misidentification or profiling, souring the fan experience and exposing the organization to costly lawsuits and reputational damage.

  • Gender Bias in Media Coverage

A 2021 UNESCO study found that women's sports account for just 4% of total sports media coverage worldwide. AI models trained on that data may undervalue female athletes and the leagues they play in, suppressing revenue-generating opportunities (think: media rights, sponsorships, merchandising).

These hidden biases stem from historical inequalities in training datasets and a dearth of diversity in both data sources and AI development teams, which unintentionally reinforces stereotypes and further excludes underrepresented groups.

By addressing prejudices head-on, sports organizations can promote fairness and inclusivity, and work to eliminate biases that may negatively impact the business.

Not just from an ethical perspective, either.

Targeting data biases will help AI systems make more accurate decisions and unlock talent across the organization. And by proactively establishing measures to do so, organizations can prevent compliance issues down the line.

It’s widely expected that governmental regulations are coming. 

Sports properties that rely heavily on AI would be wise to commission an external audit that provides an objective evaluation of their systems and identifies any hidden biases internal teams may have overlooked.

However, even an organization just dabbling in the tech should have a basic series of best practices to mitigate potential biases and ensure fair use. We suggest doing the following, at a minimum: 

1. Diversify Your Data Sources

Ensure your datasets accurately represent stakeholders across all demographics, including underrepresented groups, so that the resulting algorithms treat everyone fairly.
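A simple first check is whether each demographic group appears in the training data roughly in proportion to the audience or talent pool the model will serve. The sketch below is a minimal, hypothetical illustration; the column name, group labels, and benchmark shares are placeholders rather than figures from any real dataset:

```python
# Minimal sketch: flag groups that are underrepresented in a training dataset
# relative to a reference benchmark. All names and numbers are hypothetical.
import pandas as pd

# Hypothetical training data with a single demographic attribute.
train = pd.DataFrame({"group": ["A"] * 70 + ["B"] * 20 + ["C"] * 10})

# Hypothetical benchmark: the shares expected across the full stakeholder population.
benchmark = {"A": 0.50, "B": 0.30, "C": 0.20}

observed = train["group"].value_counts(normalize=True)
for group, expected in benchmark.items():
    actual = observed.get(group, 0.0)
    status = "UNDERREPRESENTED" if actual < 0.8 * expected else "ok"
    print(f"{group}: expected {expected:.0%}, observed {actual:.0%} -> {status}")
```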

2. Conduct Regular Internal Audits of AI Systems

Routine checks on datasets, algorithms, and AI outputs can help identify and quickly correct potential biases. IBM’s AI Fairness 360, Microsoft’s Fairlearn, and Google’s What-If Tool are among the solutions designed to help organizations detect and mitigate these invisible opponents hampering AI systems.
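To make that concrete, here is a minimal sketch of the kind of check such tools run, using Fairlearn’s MetricFrame to compare a model’s selection rates across demographic groups. The data and column names are hypothetical; a real audit would use the organization’s own model outputs:

```python
# Minimal sketch of a fairness audit with Fairlearn (hypothetical data).
import pandas as pd
from fairlearn.metrics import MetricFrame, selection_rate, demographic_parity_difference

# Hypothetical audit data: ground truth, model decisions, and a demographic attribute.
df = pd.DataFrame({
    "actual":    [1, 0, 1, 1, 0, 1, 0, 0],
    "predicted": [1, 1, 1, 0, 0, 0, 1, 0],
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
})

# Break the selection rate (share of positive model decisions) out by group.
audit = MetricFrame(
    metrics=selection_rate,
    y_true=df["actual"],
    y_pred=df["predicted"],
    sensitive_features=df["group"],
)
print(audit.by_group)

# One-number summary: the gap in selection rates between groups.
gap = demographic_parity_difference(
    df["actual"], df["predicted"], sensitive_features=df["group"]
)
print(f"Demographic parity difference: {gap:.2f}")
```

A large gap between groups is a signal to investigate the underlying data and features before the model’s outputs reach fans or front-office decision makers.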

3. Foster Cross-Functional Collaboration

Assigning diverse teams to AI projects can help ensure fairness, inclusivity, and more holistic decision-making from the systems. Organizations may want to consider including ethicists, sociologists, former players, and coaches in the development and oversight of AI-powered applications.

4. Maintain Transparency and Accountability

Publishing open documentation on AI methodologies will increase trust and reduce concerns (think: biases, unfair practices) among all stakeholders. Transparency can also foster opportunities for collaboration and/or innovation, as other organizations learn from and build upon your best practices.

5. Invest in Internal Bias Training

Educate your staff about the potential for AI bias in data and algorithms. Awareness empowers employees to recognize and address problems when they see them, which reduces the potential for flawed decisions, unfair evaluations, and negative fan experiences.

AI offers rights owners across the industry unparalleled opportunities to incrementally grow revenues and enhance fan engagement. However, without diligent efforts to address data bias, these innovations will only exacerbate existing inequalities and bring about new challenges. 

It is on the industry’s decision makers to implement best practices, seek out and correct biases, and ensure this revolutionary technology serves stakeholders equitably. Doing so will foster growth, trust, and long-term success.

Editor’s Note: Back in July, we explained how sports properties are leveraging Gen-AI to enhance partnership proposals. But it’s hard to convey the value prop without seeing the output firsthand. Send a .ppt or PDF file of a recent sponsorship deck to [email protected] and we’ll run it through ChatGPT-4 for you. You’ll be amazed by the tangible ideas it provides to make the pitch more effective and convincing.

About The Author: Former Washington Commanders chief strategy officer Shripal Shah has spent much of the last decade helping media companies, big box retailers, and innovative startups enhance their businesses using AI. He’s now transforming sports businesses using much of the same playbook. 

Shah is also a professor in Georgetown University's Sports Industry Management Program and the author of “Leveling Up With AI: A Strategic Guide to AI in Sports Marketing” and “The Art of Victory: Generative AI and the New Frontier of Global Sports.” You can reach him directly at [email protected].

Top 5 Sports Business Headlines

  • HBSE, Fox Sports, Playfly teaming up to put on Coretta Scott King Classic

  • Coaching High Performers in the Sports and Business World with Senior Executive Coach Pam Borton

  • Exclusive Interview with Alabama's Jalen Milroe

  • Jaylen Brown Says He Turned Down $50 Million to Start His Own Shoe Company

  • UNLV QB Matthew Sluka leaving team amid allegations of NIL payment issues