This article will be published in the 2018 edition of Pure Live from Live Production.

The BBC has a long and respected tradition of covering large sporting events and has constantly refined and improved its coverage over the decades. Having worked for them for 35 years, Lead Sound Supervisor Dave Lee has gained a wealth of experience planning and delivering the sound and communications of a wide range of TV programmes.

Today, remote production is increasingly used to help deliver more content, to more screens, across more devices; and it is proving a key challenge for modern broadcasters.

The case for remote production is compelling: it reduces the carbon footprint, maximises utilisation of existing studio equipment at the home location, keeps quality high and costs low, and lets staff work in a well-established, familiar production environment that's close to home.

From the start, sports broadcasting has been central to the growth of remote production. Manufacturers and broadcasters have had to work together to overcome some fundamental challenges. In the broadcast audio world, the biggest challenge is how to combat latency; not in the overall transmission signal, but in the live on-air conversations between reporters, presenters and experts at the remote venue(s), as well as in any remote studio.

These are relatively new problems for broadcast workflows. The traditional way to cover large international events is to drive an outside broadcast truck to the event, set up flypacks, or build a transmission suite on site and mix the entire event locally. As broadcasters begin to embrace remote broadcasting, they are finding that with careful planning they can maintain quality levels while saving money.

Remote broadcasting cuts travel budgets, saves on shipping and equipment, and gives more time to staff. It maximises a broadcaster’s investment in existing studio architecture, increases content across a variety of delivery methods and allows broadcasters to be more creative with content.

But these benefits come with challenges that broadcasters haven't had to deal with before. BBC Sport's Dave Lee has been central to the development of remote production for the organisation.

“Latency is absolutely key to any live sports production – the main consideration being the talent hearing what they need to hear to do their job properly. They need to hear a combination of things: mainly instructional talkback information from the production team plus the programme into which they are contributing – a mix minus themselves. They must be able to talk to one another – presenter to commentator to reporter and so on. This involves a lot of bi-directional audio traffic.”

It may surprise many that BBC Sport has been embracing remote production for many years: for example, at the Vancouver 2010 Winter Olympics and again in Sochi in 2014. Lee explains, “We had a very small team on location in Sochi with all video and audio sent via international connectivity back to the UK for transmission. The majority of the production and technical team members were located at BBC Sport in Salford, where we have our state-of-the-art transmission suites.”

When audio engineers mix live TV content, they combine local content at base, where transmission occurs – such as video from servers, audio play-ins and studio content – with a number of outside sources (OS). The remote OS contribution from the venues generally includes commentary, presentation and reporters, and often involves physical studios at the remote venue too.

These outside sources must hear the programme into which they're contributing. To achieve this, broadcasters use a mix minus feed for every outside source. Some ground-based staff also need specific programme mixes that include their own voices, for example, when Presentation is stationed in a noisy environment such as amid an enthusiastic and vocal crowd.
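The mix minus idea can be sketched in a few lines of code: each contributor receives the sum of every programme source except their own. The source names and levels below are purely illustrative, not drawn from any real BBC configuration.

```python
# Sketch of mix-minus feeds: each outside source hears the programme
# minus their own audio. Names and levels are illustrative assumptions.

def mix_minus(sources: dict, exclude: str) -> float:
    """Sum every source's level except the excluded contributor's own feed."""
    return sum(level for name, level in sources.items() if name != exclude)

sources = {
    "commentary": 0.8,
    "presenter": 1.0,
    "reporter": 0.6,
    "studio_playout": 0.5,
}

# One mix-minus feed per outside source.
feeds = {name: mix_minus(sources, name) for name in sources}
```

In a real console this summation happens per-sample on DSP hardware, but the bookkeeping is the same: one dedicated output bus per contributor.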

The various mixes can all be adversely affected when working remotely.

Traditional mix minus working is forgiving and successful when dealing with small latencies, because people at the venue are not hearing (echoes of) themselves. If this latency doesn’t affect the flow of conversations between contributors, then everything is good.

But as soon as you move into remote production, with its inherent higher latencies, the conversations start to suffer. This was exactly the challenge that BBC Sport encountered in Sochi – as have other broadcasters working in remote production environments.

Lee explains, “In Sochi we did use remote production successfully. However, whenever any onsite talent needed to talk to any other onsite talent, that traffic came over our international links, through the UK sound desk and back out on the mix minus to the other talent. They replied and then that came back to the UK, through the sound desk and back out again to the other talent. This tortuous signal path introduced a considerable amount of latency, a combination of multiple international round trips plus video encoding/decoding (with the embedded audio). It all adds up to a significant delay.”

This results in slow hand-overs, laboured conversations and interruptions, which can be particularly confusing and frustrating for viewers when presenters and reporters who appear at the same venue exhibit a significant delay. “They are all within metres of one another, but there’s a delay because they hear each other via this international latency. We had to find a way of making this better,” says Lee.
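To see why this signal path hurts conversations, consider a rough back-of-envelope calculation. The figures below are assumptions for illustration, not measured Sochi values.

```python
# Illustrative figures only; actual link latencies are not given in the article.
ONE_WAY_LINK_MS = 150   # assumed one-way international IP/satellite hop
CODEC_MS = 80           # assumed video encode + decode (with embedded audio)

# Talent A speaks: audio travels to the UK desk (encode + link),
# then back out to talent B (link + decode).
leg_ms = (ONE_WAY_LINK_MS + CODEC_MS) * 2

# Talent B's reply makes the same journey, so one conversational
# exchange accumulates two such legs.
exchange_ms = 2 * leg_ms
print(f"delay per leg: {leg_ms} ms, per exchange: {exchange_ms} ms")
```

Even with modest per-hop figures, the delay between two people standing metres apart quickly reaches the better part of a second per exchange, which is more than enough to break conversational flow.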

Calrec and BBC Sport have a longstanding relationship, most recently from using Calrec consoles and Hydra2 networking technology at Salford. After Sochi, meetings were held to exchange ideas about how to eradicate remote site latency.

It was concluded that the talent on the ground inevitably needed to hear the UK talkback and UK programme content via the international link – mix minus all OS contributors – but also be connected locally to one another to negate the international latency. That was the technical nut to be cracked.

Traditionally, this could be achieved by having a physical mixer onsite at the event. The Eureka moment came when it became clear that this system works – why change it? What’s required is the ability to achieve a local mix of dialogue, but control this remotely.

This collaborative working resulted in Calrec's RP1 remote production unit, which sits at the remote venue. The latency challenge is solved by providing local DSP channels for mixing the venue audio locally, along with switched talkbacks and a mix-minus-all-venues feed added to each contributor's mix. The remote control aspect is fundamental: fader data generated by the transmission sound console in the UK is sent via an international IP link.

At a big event like the Commonwealth Games, broadcasters will have technicians at the venue during set-up, before they connect with the team back home. This is an aspect which needs careful consideration. Configuration and basic operation should be possible ‘offline’ to test the system, and also to provide a redundant back-up should there be technical connectivity issues later on. For this reason, control of the RP1 can be local (via a web-based GUI), but once set up, local control can be locked.

The transmission audio engineer takes control and ‘blocks’ the use of the fader and cut facilities of the GUI controlling the remote RP1. The same content that is put to air in Salford is also mixed within the RP1, so the talent hears precisely what’s going on. The talent hear each other via local connectivity in real time when they are faded up. These faders mirror the host console faders in the home production facility, as the host console controls the RP1 onsite.
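The control model described above – local audio summing, host-driven fader levels, lockable local control – can be sketched in miniature. The class and attribute names here are invented for illustration; this is not Calrec's API.

```python
# Minimal sketch of an on-site remote mixer with host-mirrored faders.
# All names are hypothetical; real units work on per-sample DSP, not floats.
from dataclasses import dataclass, field

@dataclass
class RemoteMixer:
    """Stands in for an on-site unit: audio is summed locally,
    but fader moves are mirrored from the host console."""
    faders: dict = field(default_factory=dict)
    locked: bool = False  # once set up, local GUI control can be locked

    def set_fader(self, channel: str, level: float, from_host: bool = False):
        if self.locked and not from_host:
            raise PermissionError("local control locked; host console has control")
        self.faders[channel] = level

    def local_mix(self, audio: dict) -> float:
        # Venue audio mixed on site -- no round trip to the host.
        return sum(audio[ch] * lvl for ch, lvl in self.faders.items())

rp1 = RemoteMixer()
rp1.set_fader("presenter", 1.0)                  # allowed during offline set-up
rp1.locked = True                                # handover to the host console
rp1.set_fader("reporter", 0.8, from_host=True)   # mirrored fader move
mix = rp1.local_mix({"presenter": 0.5, "reporter": 0.25})
```

The key property is that only the (tiny) fader-control messages cross the international link; the audio the talent hears never leaves the venue.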

Lee explains, “We can now treat audio content generated in the UK, which is behind-time, separately from the instantaneous audio content generated locally. Anything that’s available on the event side of the latency, the talent only hears through the RP1 remote mixer; it doesn’t pass to the UK and back.” Of course, this scenario applies to any production suffering from delays, not just international events.

Calrec’s RP1 solution was successfully deployed at both the Winter Olympics in Pyeongchang and the Commonwealth Games in the Gold Coast, Australia.

But then consider a studio at the remote venue; remote talent might want a full programme mix in their earpiece rather than a mix minus, so that the Presenters and Guests can hear each other clearly. “When there’s ambient noise, foldback and a lot of talkback traffic, the talent can’t always hear what’s going on around them – even people sat next to them!” Lee says, “You have to feed the studio mics into the contributors’ ears. The latency must be zero to avoid echoes of themselves and the people they can half-hear sat next to them.”

So how can zero remote studio latency be achieved?

Lee says, “There’s nobody mixing the mics locally in the studio, but in remote production that’s ideally what’s required. We can liken these mix requirements to those of a foldback mixer at a concert enabling each performer to hear clearly. The requirement is for the mic mix generated at the studio to be controlled remotely by the host console back in Salford.”

For every mic in the studio there’s a fader on the remote mixer GUI – the RP1 – that’s controlled by the equivalent fader back on the console in Salford. Whatever decision the broadcast audio engineer makes is mirrored within the remote RP1 mixer. However, the audio content used in the RP1 is direct, not via any international link.

The audio from mics fed into the RP1 is also sent to the host console over the international circuits. This allows the broadcast engineer to control the main output mix and the local venue mix at the same time. It is one, fully integrated solution. The RP1 studio faders are paired with and follow the transmission faders, so Presenters and Guests hear almost exactly what viewers hear.
