Sport is the most popular genre on television. It brings excitement and real jeopardy because you never know what is going to happen in a live sporting event.
The result of this is that large sports bodies and clubs have built up very strong business cases, making a high proportion of their income through the sale of intellectual property rights. Broadcasters and production companies, keen to make the most of these expensive rights, drive standards ever upwards, bringing new engagement and insights to audiences (and delivering them to advertisers).
To ensure live coverage, the major stadiums in the major leagues have always had strong connectivity. Originally leased video lines were installed at every venue, linking them back to broadcast hubs. Now those video lines have been replaced by high capacity dark fibre, again providing direct, secure, reliable contribution links, now for multiple synchronised signals.
If you are a niche sport, or a lower tier of a major sport, you look on this with enormous envy. You want to showcase your sport to your fans, to expand your supporter base, and introduce a sport to a new audience to encourage participation.
You can see how to save on production costs, maybe going for six cameras rather than 24. But the insurmountable problem is getting the signal from the venue to the broadcaster. Microwave links are unheard of today; SNG capacity is scarce and very expensive; and there is no high-capacity fibre available.
But the venue will have business internet. Could this provide the solution?
The problem with the public internet, as we all know, is that it is the Wild West. You can put a signal in at one point and get it out at another, but you have absolutely no control over how it gets there, what might happen to it on the way, and how long it might take. If you want to use the public internet to deliver broadcast signals, then you need a marshal in town to patrol the Wild West.
Routing on the internet is established using the shortest path first principle. This should result in the lowest latency, but it does not take quality into account. You may be inserting video at 10Mbps or more, but if there is a link somewhere in the internet chain between origin and destination that is limited to 2Mbps, you will only get 2Mbps end to end.
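This bottleneck effect can be sketched in a few lines. The link names and capacities below are purely illustrative, not measurements from any real path:

```python
# Hypothetical link capacities (Mbps) along a single internet path
# between origin and destination; names are illustrative only.
path_links = {
    "venue_uplink": 100,
    "regional_exchange": 40,
    "congested_transit_link": 2,
    "broadcaster_downlink": 1000,
}

# End-to-end throughput is capped by the slowest hop, regardless
# of how fast video is inserted at the origin.
bottleneck = min(path_links.values())

video_bitrate = 10  # Mbps inserted at the venue
achievable = min(video_bitrate, bottleneck)

print(f"Bottleneck: {bottleneck} Mbps, achievable: {achievable} Mbps")
```

However high the insertion bitrate, the 2Mbps transit link dictates what arrives at the far end.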
Similarly, if there is a noisy link somewhere in the chain, then latency will vary wildly while error correction struggles to keep up, or you will get a glitchy, blocky signal at the output.
The solution is at once very simple and extremely complex. If you route the signal not down one path, which is not under your direct control, but across multiple paths in parallel, you can achieve the quality, reliability and stability that you need for broadcast signals.
Stability and predictability are crucially important. There is little point creating and delivering excellent content if you have no way of monetising it. You need to be able to insert advertising and splice other content into the stream. The technologies to do this are readily available, but they depend upon stable inputs to cut into.
As I said, if the idea is simple to grasp, its realisation is very difficult. It depends upon artificial intelligence and machine learning. The transmitting end of the circuit must be equipped with historical data, and the means to derive empirical evidence from the connections in real time, which combine to create an optimum delivery path. These decisions are dynamic, so the pathing can be changed as necessary.
The transmitter adds extra flags into the header of each data packet, which determine which sub-path the packet will take, how the data is divided across paths and, most importantly, how it is synchronised at the receiver. It also needs excellent forward error correction to ensure minimum, consistent latency, with the recombined signal automatically healing itself.
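The principle of tagging packets for multiple paths and recombining them can be sketched very simply. This is a toy model, not any vendor's implementation: the path names are invented, and the round-robin scheduler stands in for the AI-driven path selection described above.

```python
import itertools
import random

def transmit(payloads, paths):
    """Tag each payload with a sequence number and a chosen sub-path.
    A round-robin scheduler stands in for the real-time, ML-driven
    path chooser; a real system would weight paths by measured quality."""
    seq = itertools.count()
    scheduler = itertools.cycle(paths)
    for payload in payloads:
        yield {"seq": next(seq), "path": next(scheduler), "data": payload}

def receive(packets):
    """Packets arrive out of order across paths with different delays;
    the sequence number in the header restores the original order."""
    return [p["data"] for p in sorted(packets, key=lambda p: p["seq"])]

packets = list(transmit(["frame0", "frame1", "frame2", "frame3"],
                        paths=["fibre", "5G"]))
random.shuffle(packets)  # simulate differing path delays
print(receive(packets))  # ['frame0', 'frame1', 'frame2', 'frame3']
```

Forward error correction, omitted here, would add redundant packets so the receiver can rebuild the stream even when one sub-path drops data entirely.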
Given the ability to find enough bandwidth, the solution is applicable to any sort of carrier, from 5G to dedicated fibre, and to any combination of circuits. It is also applicable to any length of circuit: Major League Baseball is available, live, in Taiwan thanks to it. And, while I have talked about it as a real-time solution, it could also be used for high-stability, high-security content transfer, for instance for synchronising server networks in multiple locations.
Implementing this sort of system delivers stable, broadcast-quality video at a significant reduction in cost compared with dedicated circuits. That makes it affordable for all those sports and clubs who have wanted to exploit their rights but have not been able to. It also means they can consider added-value content, like news from the training ground and features from away games.
What will make it successful is the development and adoption of open, published standards around this, so that connectivity can be integrated into the whole ecosystem of content creation and delivery. If the industry works together, we can make the whole process seamless, which is what producers desperately need.
We know that the internet can do video, but in general it is limited to the quality of a Zoom conference call. The application of AI and machine learning to the management of internet circuits has the power to transform it.
This promises the same simplicity of video delivery, but smoothed out: glitch free, consistent quality, and suitable for carrying high value content through the contribution and distribution chain. It brings interoperability, a very high degree of resilience and reliability, and the ability to scale to the requirements of the production. Above all, it is affordable: an exciting combination for the future of sports coverage.
This article first appeared online with SVGEurope