
AI and the Future of Robotic Studios

Robotic studios allow broadcasters to do more with less – while doing it better. AI and robots have always occupied centre stage in science fiction, but now it’s happening in real life. In this article, we look at the role of AI in robotic studios. We look at the present, the likely future – and offer some reassurance as well.

What we are talking about

First, we need to agree on how we will talk about AI. Depending on your technical background, what you read and where you get your news, you’ll likely have a different understanding of AI.

Over the last year or so, generative AI models like ChatGPT and text-to-image generators have exploded into the public consciousness, with adoption rates off the scale. These two model types now represent most people’s default idea of AI. They are indeed important, but they shed little light on how people might use AI in studio production scenarios.

So, when we talk about AI, we simply mean an electronic system that’s capable of learning (as in “machine learning”) and then applying that “knowledge” to a specialised task. It’s not, “When will our AI technology become sentient?” so much as, “How can we improve what we do in the studio in a useful, safe, repeatable and dependable way?”.

Where is AI already used in studios?

While there are plenty of examples where AI is used to generate newsroom content – for headlines and even on-air scripts – there is less evidence of its use in making a studio-based TV programme. But that is changing, and some transformative AI-based applications are already available for studio production.

Voice Prompting

An absolute mainstay of live production, the prompter is a labour-intensive device. An operator must advance the script to match the presenter’s pace and position. It’s a tedious job, but it’s also critical. When it goes wrong, the presenter must ad-lib awkwardly.

With Autoscript Voice, keyword recognition and AI track the presenter’s pace, work out where they are in the script, and scroll the text to the correct position at the ideal speed. When there is an interruption, such as a breaking news story, the system can stop the prompter and resume scrolling and position matching as soon as it hears the presenter reading the script again.
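To make the idea concrete, here is a minimal, purely illustrative sketch of voice-driven prompting logic. It assumes a word-level transcript arriving from a speech recogniser; the matching function, threshold and names are our own simplification for demonstration, not how Autoscript Voice is actually implemented.

```python
# Illustrative sketch only: a simplified model of voice-driven prompting.
# The matching logic and threshold below are assumptions for demonstration.

from difflib import SequenceMatcher

def locate_in_script(script_words, heard_words, window=8):
    """Return the approximate script position that best matches the last heard words."""
    tail = " ".join(heard_words[-window:]).lower()
    best_index, best_score = 0, 0.0
    for i in range(max(1, len(script_words) - window + 1)):
        candidate = " ".join(script_words[i:i + window]).lower()
        score = SequenceMatcher(None, tail, candidate).ratio()
        if score > best_score:
            best_index, best_score = i + window, score
    # Below a confidence threshold (an ad-lib or breaking news, say),
    # report no match so the prompter can pause rather than scroll blindly.
    return best_index if best_score > 0.6 else None

script = "good evening and welcome to the six o'clock news tonight we look at the day's top stories".split()
heard = "welcome to the six o'clock news".split()
print(locate_in_script(script, heard))  # approximate word position to scroll to
```

A real system would also smooth the scroll speed from the presenter’s measured words-per-minute rather than jumping to each matched position.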

Presenter Tracking

Robotic cameras are traditionally programmed to move to preset positions and follow prearranged paths. Incorporating AI opens a palette of possibilities to accurately track and frame presenters. VEGA Presenter Tracking can identify talent and guests, enabling person-specific camera positions and settings. It can also judge a presenter’s position and intention by – essentially – reading their movements and body language. This leads to much tighter, more accurate, more deterministic tracking.
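As a rough illustration of the underlying idea, the sketch below turns a presenter detection into pan and zoom corrections using a simple proportional rule. The detection format, gains and function names are assumptions for demonstration; they do not represent the VEGA Presenter Tracking implementation.

```python
# Illustrative sketch only: keeping a presenter framed with a simple
# proportional correction. The gains and detection format are assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    centre_x: float  # presenter centre in the frame, 0.0 (left) to 1.0 (right)
    height: float    # presenter height as a fraction of frame height

def framing_correction(det: Detection, target_x=0.5, target_height=0.55,
                       pan_gain=20.0, zoom_gain=1.5):
    """Translate a detection into pan (degrees/s) and zoom corrections."""
    pan_speed = (det.centre_x - target_x) * pan_gain        # positive = pan right
    zoom_delta = (target_height - det.height) * zoom_gain   # positive = zoom in
    return pan_speed, zoom_delta

# Presenter drifting slightly left and framed a little too wide:
print(framing_correction(Detection(centre_x=0.42, height=0.48)))
```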

Reasons for using AI

Humans are typically not deterministic. Different individuals will behave differently, and even the same person will not always do the same thing all the time. This variability makes humans human and is an intrinsic part of our ability to be creative and solve problems. It’s possible that if there were no variability between humans or between our own actions, we would achieve very little. We would all make the same mistakes and keep repeating them.

So, we should celebrate humanness. It’s not randomness; it’s difference with a purpose. And perhaps ironically, it’s one reason why AI will have such a central role in studio automation.

AI can learn from our behaviour and that of others. In a live broadcast studio, the nature of the task – to present to the camera – constrains the presenter’s degrees of freedom. In “narrow” environments like this, AI has much to contribute.

Humans struggle to concentrate on tedious tasks over long periods, and because mistakes are instantly visible to viewers, this leads to errors and stress for both the presenter and the production team. Existing automation can handle consistently recurring tasks, but AI-based automation can also adapt to unplanned events, such as changes to a running order.

A dynamic response to unplanned or rapidly evolving circumstances is just one example of how and why broadcasters should introduce AI into their studios…

Want to read the full guide?


Download your FREE technology guides

Discover how Vinten technology guides can help you leverage AI and machine learning in robotic broadcast studios with practical advice and actionable insights.

What’s covered?

Part 1: Software architecture in the digital studio age

Look at the benefits of transitioning to IP-based solutions and the challenges of enterprise network integration, see real-world applications of network-friendly systems, and explore the potential of cloud-based broadcasting.

Part 2: AI and the future of robotic studios

Explore the benefits of AI automation, including enhanced accuracy and streamlined operations, and gain a glimpse into future trends such as responsive presenter settings and voice-controlled robotics.

Part 3: Doing more with less – empowering the studio control room team

Gain comprehensive insights into how AI-driven automation can streamline studio setup, optimize real-time performance, and enhance productivity with voice prompting and presenter tracking.

Part 4: Making robots more creative on-air

Learn how AI technology overcomes current limitations, generates smoother camera movements, and enables new production possibilities, empowering creative professionals to innovate in live studio environments.