Using AI Technology to Deliver More Content from Live Sports Events
May 16, 2022
IQ AI Sports Producer in use

Every year, major live sporting events are consistently among the most watched programmes by TV audiences around the world. At the same time, broadcasters and content producers are increasingly challenged to extend viewer engagement with their content, in both duration and depth, and to do so efficiently.

Video technology has always played a key part in influencing how deeply the viewing audience is immersed in video programming. An early example of this was a study showing that the proportion of people who reported dreaming in black and white dropped from 25% to 7% after colour TV was launched. More recently, social media has played a growing role in lengthening engagement among both avid and casual fans of sporting events. Conviva's Q1 2020 Streaming Report suggested that a 30% increase in video content in sports social media posts increases public engagement by over 100%.

Using social media and over-the-top (OTT) platforms, sports video content producers are now keenly focused on identifying programming material that can drive wider and more in-depth engagement with their output. Popular new content types include discussion and analysis programming and team training footage delivered as build-up to events. Coverage of major sports events is being broadened with matches involving minor teams or players offered through on-demand platforms. Some sports leagues are also looking to launch their own programming that provides more unique camera viewpoints and statistical analysis.

While this additional, broader content is achieving its aim of wider and deeper audience engagement, content monetisation and production cost efficiency remain critical issues.

Keeping up with content demands

Remote production has become a key enabler for the production of live content, delivering more efficient use of staff at production centres, reduced travel costs, a better work-life balance that creates more attractive job opportunities, and more environmentally friendly operations. But, as a way of working, remote production does not in itself reduce the burden that an increasing volume of produced content places on the production team.

For many forms of content, new artificial intelligence production technologies are evolving that can play a role in automating or assisting content production by tracking in-play action and selecting potentially interesting shots. Through machine learning trained on many hours of sporting action, and through production rules, it is now possible to identify players and teams, recognise match referees and track game play within a camera's field of vision. Other AI production techniques being used include motion prediction to track ball or puck movement, and interpolation techniques to maintain smooth tracking action.
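To make the last two ideas concrete, the Python sketch below shows one simple way motion prediction and interpolation can fit together: when the ball detector misses a frame, a constant-velocity prediction bridges the gap, and exponential smoothing keeps the virtual camera centre moving fluidly. It is a minimal illustration under assumed inputs (hypothetical pixel detections), not any vendor's implementation.

```python
# Minimal sketch: constant-velocity motion prediction plus smoothing of a
# virtual camera centre. Detections are hypothetical (x, y) pixel positions,
# or None when the detector misses the ball for a frame.

from dataclasses import dataclass, field
from typing import Optional, Tuple


@dataclass
class BallTracker:
    smoothing: float = 0.15                                  # lower = smoother camera motion
    _pos: Optional[Tuple[float, float]] = None               # last known/predicted ball position
    _vel: Tuple[float, float] = (0.0, 0.0)                   # estimated velocity in px/frame
    _camera: Optional[Tuple[float, float]] = None             # smoothed camera centre

    def update(self, detection: Optional[Tuple[float, float]]) -> Tuple[float, float]:
        """Return the virtual camera centre for this frame."""
        if detection is not None:
            if self._pos is not None:
                # Update the velocity estimate from the last two positions.
                self._vel = (detection[0] - self._pos[0], detection[1] - self._pos[1])
            self._pos = detection
        elif self._pos is not None:
            # No detection: predict the next position with constant velocity.
            self._pos = (self._pos[0] + self._vel[0], self._pos[1] + self._vel[1])
        target = self._pos or (0.0, 0.0)
        if self._camera is None:
            self._camera = target
        else:
            # Exponential smoothing interpolates towards the target so the pan stays fluid.
            a = self.smoothing
            self._camera = (self._camera[0] + a * (target[0] - self._camera[0]),
                            self._camera[1] + a * (target[1] - self._camera[1]))
        return self._camera


tracker = BallTracker()
for det in [(100.0, 200.0), (110.0, 205.0), None, None, (145.0, 220.0)]:
    print(tracker.update(det))
```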

One of the most impressive features of leading artificial intelligence (AI) sports production systems is how they replicate coverage over a wide field of vision, a capability previously unique to fully-manned productions. Such classic productions use multiple cameras that pan and zoom to track the action, create excitement and maintain viewer interest.

The top AI systems achieve this by using wide field-of-view cameras. The most advanced employ high-quality optics across camera arrays, with image stitching and de-warping techniques to present a flat field of vision. Some systems have been extended to integrate multiple cameras sited in key locations, for example near the corner post or goal on a soccer pitch. The AI technology is trained to switch camera views to give the most dynamic production. Using these techniques, AI systems are able to produce content that engages viewers at a level that rivals traditional productions.
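The output view in such systems is effectively a "virtual camera": a cropped, scaled window that pans and zooms across the stitched panorama. The sketch below illustrates that idea only; the panorama is synthetic and the function name, sizes and parameters are illustrative assumptions rather than any product's API.

```python
# Minimal sketch of a virtual pan/zoom window cropped from a stitched panorama.

import numpy as np
import cv2  # pip install opencv-python


def virtual_camera(panorama: np.ndarray,
                   centre_x: float, centre_y: float,
                   zoom: float,
                   out_w: int = 1280, out_h: int = 720) -> np.ndarray:
    """Crop a window around (centre_x, centre_y) and scale it to the output size."""
    pano_h, pano_w = panorama.shape[:2]
    win_w, win_h = int(out_w / zoom), int(out_h / zoom)
    # Clamp the crop window so it stays inside the panorama.
    x0 = int(np.clip(centre_x - win_w / 2, 0, pano_w - win_w))
    y0 = int(np.clip(centre_y - win_h / 2, 0, pano_h - win_h))
    crop = panorama[y0:y0 + win_h, x0:x0 + win_w]
    return cv2.resize(crop, (out_w, out_h), interpolation=cv2.INTER_LINEAR)


# A synthetic 8K-wide frame standing in for a stitched camera-array panorama.
pano = np.zeros((2160, 7680, 3), dtype=np.uint8)
frame = virtual_camera(pano, centre_x=3800, centre_y=1000, zoom=1.5)
print(frame.shape)  # (720, 1280, 3)
```

In a real system the centre and zoom would be driven by the game-play tracker, so the cropped window follows the action just as a manned camera operator would.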

In addition to being used as fully automated solutions, AI systems can be employed to assist directors in their tasks. An AI system can perform core game-play tracking, allowing the human director to focus on additional content by manually switching in camera views as desired. For example, a wireless sideline camera can capture the emotion of the team's reaction on the bench to a win. Directors can also manually trigger an action replay of a key moment when appropriate.
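One simple way to think about this assisted workflow is as a priority rule: a director-triggered replay or manual camera selection takes precedence, and otherwise the AI's tracking choice goes to air. The sketch below is a hypothetical illustration of that rule; the camera names and frame counts are invented for the example.

```python
# Minimal sketch of assisted direction: replay > manual override > AI tracking.

from typing import Optional


class AssistedSwitcher:
    def __init__(self) -> None:
        self.manual_choice: Optional[str] = None   # e.g. "sideline_wireless"
        self.replay_frames_left = 0

    def director_select(self, camera: Optional[str]) -> None:
        """Director forces a camera; passing None hands control back to the AI."""
        self.manual_choice = camera

    def director_replay(self, frames: int = 250) -> None:
        """Director triggers an action replay, here roughly 10 seconds at 25 fps."""
        self.replay_frames_left = frames

    def next_source(self, ai_choice: str) -> str:
        """Pick the source for this frame."""
        if self.replay_frames_left > 0:
            self.replay_frames_left -= 1
            return "replay_server"
        return self.manual_choice or ai_choice


switcher = AssistedSwitcher()
print(switcher.next_source("main_tracking"))    # AI keeps covering game play
switcher.director_select("sideline_wireless")   # director cuts to the bench reaction
print(switcher.next_source("main_tracking"))
```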

Sports content production is about more than just the action taking place on the field of play. For sports fans, it is increasingly the discussion, debate and analysis around events that creates true engagement. AI-driven studio production systems have exciting possibilities in this area. AI-based camera direction tools now have the power to automatically find and track the movement of presenters and switch to the person talking. The result: a natural, fluid discussion with correct pacing.
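A very simplified version of "switch to the person talking" can be sketched as choosing the camera of the loudest microphone, with a hold time and level margin so cuts do not become jumpy. The thresholds and camera names below are illustrative assumptions, not the behaviour of any specific studio product.

```python
# Minimal sketch: active-speaker camera switching from per-mic levels,
# with a minimum hold time and a loudness margin to keep pacing natural.

from typing import Dict, Optional


class SpeakerSwitcher:
    def __init__(self, hold_frames: int = 50, margin_db: float = 6.0) -> None:
        self.hold_frames = hold_frames     # minimum frames between cuts
        self.margin_db = margin_db         # a new speaker must be clearly louder
        self.current: Optional[str] = None
        self.frames_since_cut = 0

    def update(self, mic_levels_db: Dict[str, float]) -> str:
        """mic_levels_db maps a presenter's camera to that presenter's mic level in dB."""
        self.frames_since_cut += 1
        loudest = max(mic_levels_db, key=mic_levels_db.get)
        if self.current is None:
            self.current = loudest
        elif (loudest != self.current
              and self.frames_since_cut >= self.hold_frames
              and mic_levels_db[loudest] >= mic_levels_db[self.current] + self.margin_db):
            self.current = loudest
            self.frames_since_cut = 0
        return self.current


switcher = SpeakerSwitcher()
print(switcher.update({"cam_host": -20.0, "cam_analyst": -45.0}))  # stays on cam_host
```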

AI is the answer

Can AI-automated production systems accurately claim to deliver all the major elements of manually produced programming? Increasingly, the answer is yes. AI technology can capably help a production team offload some of the work required to produce engaging programming. This is being borne out by the rapidly increasing number of AI live sports production systems being deployed.

Can AI technology fully automate event production? The answer to this is possibly more nuanced. AI systems can confidently accomplish the key elements of producing and directing an event to create an engaging production. Systems can follow the action, create dynamic shots that maintain viewer interest and deliver high-quality video, graphical overlays and audio.

But, to some extent, it will always remain true that nothing understands what it is to be human, and how to play on our emotions, like another human. A skilled director who can create the story of the event will always have the potential to deliver something extra that an artificial system cannot.

For content production businesses, the question is different: can an AI system automatically produce sporting events that engage viewers at a price point that enables new profitable content revenue streams? For the top AI systems, the answer to this question is also yes.

Michel Bais, Managing Director, Mobile Viewpoint