By Pete Walker, Senior Product Manager.
As IP connectivity becomes increasingly widespread and more accessible to all sectors of our industry, those with their sights set on the true goals of interoperability are working hard to play together. AES67 defines a common language for the streaming of high-quality, low-latency live audio over IP. But what are we doing to manage the transition from traditional connectivity, and how do we achieve the full benefits that have been promised with IP workflows?
A key touted advantage of moving to IP is the ability to use existing network infrastructures and COTS (Commercial-Off-The-Shelf) hardware. Broadcasters want to be able to pass audio, video, control and other data over shared IP networks, and they want to use open standards to do so between devices made by different manufacturers. Standardised IP connectivity eradicates much of the cost, space, system complexity and cabling overhead of having a multitude of interfaces for analogue, AES3, MADI, SDI, etc. This is the goal of both AES67 and ST2110.
AoIP has been around for many years, with lots of broadcasters already relying on it daily to produce live on-air content. In many cases, it’s been radio stations that have been pioneering the way in terms of “full IP” system-wide integration, as well as the linking of geographically remote facilities, with flagship examples like the BBC’s Virtual Local Radio (ViLoR) project. This is in part because radio has not been held back by the video over IP debate of ST2022-6 (SDI over IP) vs ST2110 (separate RTP streams for video, AES67 audio and metadata) that has been going on in the TV world. Also, the bandwidth requirements of audio are negligible compared to those of video, making the network infrastructure considerably more affordable.
Although AoIP networking is on the increase, there’s still a tendency to go for single-manufacturer solutions, ensuring all the devices have the same parameter set and interpretation of AES67, with the same discovery and connection management methods. This makes successful deployment easier and provides a clear line of technical support from the vendor. However, this leaves broadcasters feeling tied to their investment and it isn’t the true goal of IP; if you’ve bought 10 HP PCs, you would not be concerned about going out and buying a Dell, at least in terms of whether it will be able to access, edit and share the same files. IP broadcast equipment should be no different.
In TV, audio equipment has to interface with a wider range of kit from vendors in different fields, which has highlighted the challenges of interoperability further.
So what’s missing? AES67 gives us a standardised protocol and parameter set, meaning, for example, a Wheatnet device can exchange audio streams with a Livewire device, as both can be configured to operate within the bounds of AES67. However, establishing stream connections between devices from different vendors is not necessarily simple: it typically requires an engineer to configure output streams on each device, and often to manually enter complex configuration details in order to receive streams from other devices. While this works reliably, it depends on engineers for setup and results in a static streaming configuration. For dynamic routing of audio, providing the operational workflows needed for live broadcast, we’re still relying on expensive broadcast routers, albeit IP ones. This is not the goal of using COTS IP.
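The configuration details an engineer typically has to transfer by hand take the form of an SDP (Session Description Protocol) description of the sender’s stream. A representative example for a two-channel, 24-bit/48 kHz AES67 stream might look like the following (all addresses, IDs and clock references here are illustrative, not taken from any particular vendor’s device):

```
v=0
o=- 1423986 1423994 IN IP4 192.168.1.10
s=Example AES67 Stream
c=IN IP4 239.69.0.121/32
t=0 0
m=audio 5004 RTP/AVP 96
a=rtpmap:96 L24/48000/2
a=ptime:1
a=ts-refclk:ptp=IEEE1588-2008:00-1D-C1-FF-FE-12-34-56:0
a=mediaclk:direct=0
```

Every field here — multicast address, port, payload format, packet time, PTP clock reference — has to match at both ends before any audio flows, which is why hand-configuring each connection does not scale.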
With AES3, you plug in a BNC, and the receiving device knows it is expecting two channels of audio on that connection. With AoIP, a single connection to a network allows for the exchange of lots of channels of audio with lots of different pieces of equipment, but you can only receive that audio if you know it exists in the first place. The fundamental part missing from both AES67 and ST2110 is advertisement and connection management. In the absence of an agreed standard, many, but not all, vendors have followed the Ravenna approach, which is helpful for advertising AoIP streams between those vendors, but this still leaves us with labour-intensive configuration and static streaming connections.
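The “know it exists” problem is really a matter of getting the sender’s stream description to the receiver. Once a receiver has an SDP description in hand, extracting the parameters it needs is mechanical. The sketch below illustrates that step; it is not a full RFC 4566 parser, and assumes a single audio media section of the kind typically advertised for AES67:

```python
def parse_aes67_sdp(sdp: str) -> dict:
    """Extract the parameters a receiver needs from an AES67-style SDP
    description: multicast address, UDP port, payload type and audio format.

    Illustrative sketch only -- assumes one audio media section.
    """
    params = {}
    for line in sdp.strip().splitlines():
        line = line.strip()
        if line.startswith("c=IN IP4 "):
            # Connection line: multicast address, possibly with a /TTL suffix
            params["address"] = line[len("c=IN IP4 "):].split("/")[0]
        elif line.startswith("m=audio "):
            # Media line: "m=audio <port> RTP/AVP <payload type>"
            fields = line.split()
            params["port"] = int(fields[1])
            params["payload_type"] = int(fields[3])
        elif line.startswith("a=rtpmap:"):
            # e.g. "a=rtpmap:96 L24/48000/2" -> encoding / sample rate / channels
            encoding, rate, channels = line.split(" ", 1)[1].split("/")
            params["encoding"] = encoding
            params["sample_rate"] = int(rate)
            params["channels"] = int(channels)
    return params
```

The parsing is trivial; the hard part, and the gap the article describes, is the standardised advertisement that delivers this description to every interested receiver in the first place.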
This is where the likes of AIMS (Alliance for IP Media Solutions) and the JT-NM (Joint Task Force on Networked Media) become very important. They are promoting full interoperability, including NMOS (Networked Media Open Specifications): a standardised mechanism not only for discovery/advertisement (NMOS IS-04) but also for connection management (NMOS IS-05), from a centralised point. This means you do not have to log in to each device on a network to configure its connections; instead, a user-friendly UI can dynamically route streams between devices from different manufacturers – using the network to perform the routing function.
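Under IS-05, a controller makes a connection by PATCHing the sender’s details to the receiver’s “staged” endpoint and requesting activation. The sketch below builds such a request; the host name, receiver ID and sender ID are hypothetical, and a real controller would discover them via IS-04 rather than hard-coding them:

```python
import json

# Hypothetical IS-05 single-receiver connection endpoint. In practice the
# node address and receiver ID come from IS-04 discovery, not from code.
RECEIVER_ID = "d3c9a821-1b2c-4a5b-9f00-000000000001"  # hypothetical
STAGE_URL = ("http://node.example/x-nmos/connection/v1.0/"
             f"single/receivers/{RECEIVER_ID}/staged")

def build_is05_patch(sender_id: str, sdp: str) -> str:
    """Build the JSON body for an IS-05 staged PATCH that connects a
    receiver to a sender and activates the route immediately."""
    body = {
        "sender_id": sender_id,
        "master_enable": True,
        # The sender's stream description travels as a transport file
        "transport_file": {"type": "application/sdp", "data": sdp},
        "activation": {"mode": "activate_immediate"},
    }
    return json.dumps(body)
```

The significant point is that this is plain HTTP and JSON against a published specification: any controller can route any compliant device, which is exactly the multi-vendor dynamic routing the article argues broadcast routers currently provide.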
NMOS has gained strong buy-in across the industry, from both manufacturers and broadcasters. It’s widely seen as the route to true interoperability, but its uptake has been slow in some areas.
As broadcast equipment manufacturers, we might perceive true interoperability as a threat because it increases competition. But we defend our market share by making reliable, high-performing and easy-to-use products with a feature set designed for live broadcast applications, not by needlessly implementing proprietary or unique methods. To unlock the full potential of IP and give broadcasters the workflows, cost and efficiency savings they need to compete in the modern broadcast era, we must all work together, fully and properly implementing the agreed standards to provide proven multi-vendor systems that are easy for operational staff to use. At Calrec we are working with our partners and following the JT-NM roadmap – our Type R, IP-native radio system being a prime example – working towards the ultimate goal of making life better for our customers.