For Sports and Athletics
Your organisation bought Catapult AI, Second Spectrum, or Hudl AI to get faster insights. But performance data showing an athlete is 5% slower tells you nothing about whether they will fight harder in week 10 of the season. Scouting AI surfaces athletes with strong measurable attributes while missing the competitive character that separates good players from great ones. The risk is real: organisations that let algorithms drive decisions about athlete development, team selection, and fan engagement often find themselves optimising away the human elements that made them successful.
These are suggestions. Your situation will differ. Use what is useful.
Catapult AI and Second Spectrum excel at measuring movement patterns, heart rate variability, and spatial positioning. They cannot measure an athlete's response to adversity, their willingness to push through discomfort, or how their behaviour changes when the stakes rise. A player's WHOOP data might show poor sleep and high resting heart rate, but it will not tell you whether they are mentally fragile or simply dealing with something outside sport. Use these tools to notice patterns that need human investigation, not to make final decisions about athlete readiness or potential.
Scouting AI can measure sprint times, pass accuracy, and physical attributes across hundreds of players. It cannot recognise which athletes keep competing when they are losing, which ones adapt their game to new systems, or which ones will thrive in your club's specific culture. A scout who has watched 200 hours of tape on a player develops intuition about their decision-making under pressure. That intuition often identifies athletes that measurable attributes alone would miss. Your scouting department should use AI to process volume and flag candidates. Your scouts should use their judgement to make recruitment decisions.
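One way to keep that division of labour explicit is to make the AI stage produce a review queue rather than a decision. A minimal sketch of the idea, with hypothetical field names and thresholds (your scouting data model will differ):

```python
# Hypothetical shortlisting step: measurable attributes flag candidates,
# but the output is a queue for scouts to review, not a selection.
from dataclasses import dataclass

@dataclass
class Prospect:
    name: str
    sprint_40m: float      # seconds over 40 m (lower is better)
    pass_accuracy: float   # completion rate, 0.0 to 1.0

def flag_for_review(prospects, max_sprint=5.2, min_accuracy=0.80):
    """Return candidates who clear the measurable thresholds.

    Deliberately returns a shortlist for human review; it never
    ranks, rejects, or recruits anyone on its own.
    """
    return [p for p in prospects
            if p.sprint_40m <= max_sprint and p.pass_accuracy >= min_accuracy]

pool = [
    Prospect("A. Carter", 5.0, 0.86),
    Prospect("B. Osei", 5.4, 0.91),   # slower, but a scout may still rate them
    Prospect("C. Liang", 5.1, 0.78),
]
queue = flag_for_review(pool)
# Scouts work through `queue` first, and can still pull from the full pool —
# the thresholds filter volume; they do not make the recruitment call.
```

The point of the design is the narrow return type: a list of people to watch, not a verdict.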
Performance science tools like Hudl AI and WHOOP can track physical adaptation and indicate when athletes need recovery. They cannot manage team culture, individual motivation, or the confidence that comes from coaching that treats athletes as people, not data points. Young athletes develop character through relationships with coaches who notice them as individuals. When programme decisions are driven purely by metrics, athletes learn that their effort is valued only when the numbers match expectations. Development programmes that stay athlete-centred produce more resilient competitors than those optimised around data.
Algorithmic fan engagement systems optimise for engagement metrics and often produce content that drives reaction rather than connection. A perfectly algorithmic feed maximises clicks but can destroy the sense of shared community that makes supporters loyal across seasons. Fans stay committed to organisations because they feel part of something real. Algorithmic optimisation often erodes that feeling by treating fans as attention units rather than members of a community. Use analytics to understand what your fans care about, then create authentic content and experiences that reflect that knowledge.
The organisations that use AI most effectively treat it as a tool that surfaces information for human experts to interpret. A selection committee that starts with Catapult load data and ends with conversation between coaches is using AI correctly. A selection committee that starts with an athlete's metrics and uses conversation to confirm what the data already suggests has inverted the process. Create decision-making processes where the AI output is the starting point for judgement, not the endpoint. Train your staff to ask what the data is missing, not whether the data is correct.
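That ordering can be enforced structurally: require a human interpretation against every algorithmic flag before a review counts as complete. A sketch of what such a template might look like, with hypothetical names throughout (no real product exposes this API):

```python
# Hypothetical review template: every algorithmic flag must be paired
# with a human interpretation before it can feed a selection decision.
from dataclasses import dataclass, field

@dataclass
class Flag:
    source: str              # e.g. "Catapult load report"
    observation: str         # what the data shows
    interpretation: str = "" # filled in by a coach, never by the system

@dataclass
class SelectionReview:
    athlete: str
    flags: list = field(default_factory=list)

    def ready_for_decision(self) -> bool:
        # Complete only once a human has interpreted every flag:
        # the data is the starting point, not the endpoint.
        return bool(self.flags) and all(f.interpretation for f in self.flags)

review = SelectionReview("J. Mensah", [
    Flag("Catapult load report", "Training load down 18% this week"),
])
# Data alone is not a decision:
assert not review.ready_for_decision()
review.flags[0].interpretation = "Managed taper after travel; cleared by coach"
assert review.ready_for_decision()
```

Whether you encode it in software or on a meeting agenda, the mechanism is the same: the process refuses to conclude until someone has asked what the data is missing.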