This article originated from the IABM website
By Roger Thornton, Copy Cruncher at IABM
The Publish block of the BaM Content Chain® came out as one of the most important in the recent IABM Buying Trends survey, with 57% of respondents rating it their top investment priority. Unsurprisingly, this is reflected in buyers’ top media technology purchasing priority being multi-platform content delivery. Even though BIY (build it yourself) is on the rise across many content chain blocks, 90% of broadcasters have no plans for this in their Publish operations. With most technology buyers looking to increase their technology investment by up to 10% in the coming year, the future looks bright for media technology suppliers in Publish over the coming months and years.
For this issue’s feature article, we asked IABM members with products and services in Publish to assess its current drivers, opportunities and challenges, and to take out their crystal balls to look into the future – including the potential impact of 5G. As Deluxe’s Sr. Director Product Solutions, Nav Khangura, says, “Publish is arguably the most important stage of the content supply chain. Satisfying the demands of an increasingly demanding audience is the most important factor in keeping you in the race.”
Despite some heavy editing, this is a long article – for which I make no apology. 13 IABM member companies contributed to it and each has a different opinion and approach to the market. It will be an illuminating read for everyone involved in delivering content in our rapidly changing M&E landscape.
What are the current market drivers?
Changing viewing habits and demands were top of most people’s list. “The main drivers of change in Publish all stem back to one key theme: the ongoing change in consumer viewing habits,” Deluxe’s Nav Khangura says. “Whether it’s the volume increase of original content, the explosion of localization needs, the evolution of higher quality formats or the continual increase in the number of digital platforms, all drivers lead back to meeting consumers’ viewing demands and the endeavor to have a competitive edge in the race to win consumers’ viewing time. As original content grows year over year, so does the need for instant scalability and intelligent automation to transform, package, distribute and publish.”
Consumers in charge
Shawn Carnahan, CTO at Telestream, also sees consumers driving change – and driving Telestream too. “We have been witnessing some basic changes in consumer behavior for some time. Consumers are seeking to transition to more flexible ways of consuming content. The MSO/Aggregator model is steadily being replaced by direct relationships between content creators and their consumers. This is fragmenting the market. Then there is a large group of linear TV broadcasters seeking to replicate their channels in OTT.
“These factors are definitely driving the technology companies to support these moves. At Telestream, it is driving our roadmap, and in conversations with colleagues at Harmonic and other companies, they say the same thing. Our collective challenge is how to build more cost-effective, flexible and elastic origins,” Carnahan adds.
Unsurprisingly perhaps, Harmonic’s Director Playout Solutions, Andy Warman, agrees, and suggests the route to success. “We see three main drivers shaping buying decisions: growing demand for OTT content delivery, adoption of media over IP technologies and the move to cloud-based technologies. This underscores the market’s desire to transition toward software-based solutions and away from an appliance-based approach. It provides customers with flexibility and the ability to quickly adapt to changing market needs.”
Richard Heitmann, VP General Manager at IBM Aspera, also sees the cloud providing the current answer and looks to 5G’s potential future impact. “Cloud has become an acceptable option for traditional publishing workflows, with several channels deploying ‘playout origination’ from the cloud as commodity editing in the cloud becomes more prevalent. In the coming years, more and more production workflows and traditional broadcast editing will move to the cloud. In the US, the reallocation of C-band satellite frequencies to mobile carriers is driving traditional network distribution from satellite to terrestrial carriers. Furthermore, with the advent of 5G (though widespread consumer usage is likely several years out), edge compute will offer consumers tremendous opportunity for customized consumption from ads to long-form content,” Heitmann says.
Choice of viewing devices
Nick Fielibert, CTO, Video Network at Synamedia, identifies the profusion of different viewing devices as driving the market: “The main drivers are devices (smart TV, media players, tablets, smartphones, PCs etc.) that can consume video over the internet. Content providers see the audiences growing if they publish content outside the traditional model of a Service Provider (SP) as a content aggregator. For this reason, all Content Providers (CP) create a direct connection to consumers with an OTT service. Because of the abundance of devices, SPs also see a need to provide the aggregated experience to their subscribers.”
Rohde & Schwarz Strategic Marketing Manager, Tim Felstead, also sees the change in consumer habits as a driver, but adds a second, as well as looking forward to 5G: “Technology development is allowing us to improve and further integrate the variety of processes required to produce content for consumers. These include integrating and virtualizing playout solutions with great graphics capability, and automating content generation for VOD file delivery and promo versioning, to give two direct examples. There is also the increasing value of live events, and at the same time the opportunity presented by highly capable hand-held mobile devices and associated networks. Our research, product development and trial deployments, driven by our transmitter group along with many key industry partners, are enabling the tight integration of telecommunications (wireless and mobile broadband) and linear broadcast delivery through the 5G Broadcast standards,” Felstead adds.
As a service provider at the sharp end of Publish, Red Bee Media is well placed to see where the market is heading – and what is required of the technology. “It’s about multi-platform more than ever. In this rapidly evolving business and tech landscape, it’s crucial to be able to get content to the right place/right time/looking perfect. Agility, control, visibility,” sums up Steve Russell, Head of OTT & Media Management at Red Bee Media.
Quality of experience
For Antonio Corrado, CEO, Mainstreaming, it’s back to the consumer – and QoE. “For us, the main driver of change is consumer demand in regard to quality. Traditionally, streaming over the internet has been a risk, as companies could only guarantee a ‘best effort’ service when streaming content for broadcasters or content owners. That is dramatically different from what they were used to with satellite providers, where they purchased services based on quality of experience. The delivery network providers whose technology and services are built specifically for video, and focused on enabling the future of streaming over the internet, will be the providers able to guarantee quality of experience, backed by QoS, to broadcasters and content owners as they go direct-to-consumer,” Corrado asserts.
Broadpeak VP Marketing, Nivedita Nouvel, concurs with Corrado: “More and more, content providers are driving the delivery of valuable content, both live and VOD. This has an impact on the technologies required by network operators to serve their subscribers with the best QoE possible.”
Joined up linear to VOD workflows
“The drive to adopt IP technology and – beyond that – to virtualise continues to drive interest and investment in linear playout technology,” says Pebble Beach Systems Marketing Manager, Alison Pavitt. “From a Pebble perspective, several years after the first deployment of our ‘Orca’ virtualised playout solution, there are now multiple deployments in diverse applications, including a full cross-continent business continuity service, and a large-scale multi-channel deployment which broadcasts in multiple languages requiring precise synchronisation and comprehensive audio playout rules.
“The buzz around virtualised playout keeps growing. However, questions remain about the economic, logistical and technical benefits to the end user, and judging by the high volume of on-premise playout solutions that we continue to install and commission – whether IP or baseband – it’s clear that this path is not one that every broadcaster or media company is ready to follow,” Pavitt continues.
“The adoption of new technologies and standards is not an end in itself. In our experience the key drivers of change continue to be the need for greater efficiency, and for ‘joined-up’ linear to VOD workflows. The fact is that linear playout continues to be where broadcasters make their money, and with pressure for them to invest and expand to other platforms to help them compete with on-demand and OTT services, any efficiencies that can be gained in their playout workflows and infrastructure become very attractive,” Pavitt adds.
OTT or linear – where’s the investment going?
We asked our respondents where they are seeing the market moving in terms of what their customers are asking for. While OTT is getting a lot of attention, it’s clear that linear isn’t going anywhere soon, and an integrated, hybrid approach is often preferred.
A hybrid world
Videon CEO, Todd Erdley, sees OTT-first as the ultimate destination. “Videon is seeing a hybrid approach to the market, where linear will continue to be provided in parallel to OTT. Our focus is supporting low latency live events, particularly sports. In this area, the focus is on OTT being delivered with the same delay as linear. That is step one, where the linear experience and the OTT experience are effectively the same. With advances in areas like eSports and other forms of sports gamification, more and more we are seeing providers put much greater focus on OTT delivery. Videon anticipates a highly differentiated workflow where OTT offers second-screen users – who increasingly treat the second screen as their first screen – a much more interactive, engaged production. This necessitates a transition to OTT-first,” Erdley says.
“We believe that there will be a mixed environment for some decades to come,” says Rohde & Schwarz’s Tim Felstead. “While there has been a relentless trend towards OTT (including VOD and streaming live services) over recent years, linear terrestrial delivery remains a powerful and often still primary market. The key for our customers is to ensure that solutions they build to service their customers serve all of the publish mechanisms at once in the most efficient manner possible. We believe workflow automation, integration, virtualization and great file or signal processing capabilities hold the keys to this expectation. In summary, no, it is not OTT first, it is everything first.”
Red Bee Media’s Steve Russell again provides the view from the output end. “Our customers like the fact we span across these domains. It’s about an integrated business plan that leverages all distribution models. It’s a mistake to see these as separate domains. It’s a continuum of distribution and monetization possibilities. And one channel supports the other,” he asserts.
Telestream sees the focus as very much on OTT, perhaps reflecting its broadcaster customer base. “Looking at one of our core markets – owner operated linear multi-channel broadcasters – they are almost exclusively focused on OTT build out,” says Shawn Carnahan. “Currently, virtually everything we are doing with them is in response to the challenge of how to create an OTT network that is on a par with linear television in terms of quality of service and experience. The transition from OTT being a novelty to having the same consumer expectations as the linear TV experience in your living room is massive.”
“Broadpeak’s customers are mostly Pay-TV operators, and we find that they are investing more in OTT technologies like ABR streaming rather than legacy cable or IPTV streaming. The same is true for satellite operators. They need to address multiscreen and are hence moving to ABR formats,” Nivedita Nouvel explains.
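The ABR (adaptive bitrate) formats Nouvel mentions work by encoding the same content at several bitrates and letting each player pick the rendition its connection can sustain. The core selection logic can be sketched in a few lines – the bitrate ladder and 0.8 safety margin below are illustrative assumptions, not any operator’s actual values:

```python
# Minimal sketch of adaptive bitrate (ABR) rendition selection.
# The bitrate ladder and safety margin are illustrative assumptions.

LADDER_KBPS = [400, 800, 1600, 3200, 6000]  # available renditions, lowest first
SAFETY = 0.8  # only spend ~80% of measured throughput, to absorb variance

def pick_rendition(measured_throughput_kbps):
    """Return the highest bitrate rendition the connection can sustain."""
    budget = measured_throughput_kbps * SAFETY
    candidates = [b for b in LADDER_KBPS if b <= budget]
    return candidates[-1] if candidates else LADDER_KBPS[0]

if __name__ == "__main__":
    for throughput in (500, 2500, 10000):
        print(throughput, "kbps ->", pick_rendition(throughput), "kbps rendition")
```

In a real player this decision is re-evaluated on every segment download, which is what lets the stream step up and down the ladder as network conditions change.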
Linear lives on
“OTT is certainly growing in prominence, with customers increasingly looking to take on the FAANG companies via their own Direct-to-Consumer offerings,” adds Deluxe Director of Strategic Planning, Ian Robbins. “For traditional linear players, their budgets are still heavily focused on their core offerings, but naturally they are having to move more focus towards their digital platforms as a means to diversify their revenue streams and compete in the digital space. That said, linear is still an important medium for the big players and something we envisage continuing to carry a significant level of focus for the foreseeable future.
“It is now crucial that vendors enable their customers with a technology platform to seamlessly orchestrate fulfilment for both initiatives via the same tool set. This inflection point led Deluxe to create the One platform. Now more than ever, Deluxe is positioned to meet high-volume processing requirements for both linear and OTT fulfilment as well as having the ability to support customers seeking a D2C solution,” Robbins adds.
For Mainstreaming too, OTT is the prime focus.
“All of our customers are transitioning toward OTT or adopting an OTT-first approach, as we are a streaming service provider focused on delivering video over the internet. As the OTT industry continues to grow exponentially year over year and with the now direct-to-consumer approach many content owners or traditional broadcasters are taking, this is exactly where we see the market heading,” says Antonio Corrado.
For Harmonic, customer priorities depend on the size and type of organization. “On the production side, we are seeing continued growth in the adoption of OTT,” says Andy Warman. “And this appears to be a priority for larger organizations, though linear channels still receive significant ongoing investment. In small- to mid-size organizations there is relatively little movement on the adoption of OTT, with linear being the focus. On the distribution side, we see more investment in OTT than classical broadcast.”
For IBM Aspera’s Richard Heitmann, while OTT is the overriding trend, cloud-based distribution is not yet universally feasible. “There is a massive transition to OTT networks – from the largest of content owners to the smallest of niche creators. However, with the reallocation of C-Band spectrum, traditional linear networks – particularly ‘digital’ and secondary sub-channels – are testing transmission over the internet. OTT-first distribution still relies on on-premises infrastructure, followed by data center hosting and CDNs, due to the favorable financial models. While linear channel migration may lead to a surge in budgetary spend, cloud-based distribution is not yet financially viable due to the fees that come from moving content out of public cloud providers. Broadcasters and OTT providers will still face trade-offs between long-term capital investments and the high operating costs associated with cloud-based distribution. Where it is more economical – especially when launching pilot content or channels – cloud will be the preferred platform. Cloud usage is even more logical as more facilities try to move their operations off site for economic, scale and update incentives,” Heitmann explains.
Will more broadcasters internalize OTT capabilities as Disney has done – or will they continue to outsource them?
“The bigger operators in this market want to control the viewer experience and build their brands,” says Telestream’s Shawn Carnahan. “Today, the technology is not experimental: the options for how they distribute content across all the consumer platforms they need to cover have narrowed. It’s not as much of a mystery for them as it once was. So, I believe that many of the bigger players will build their OTT channels out internally, whereas earlier the strategy was to wholesale outsource the challenge. This internalization process will include the entire workflow, including app development for these platforms. This will enable them to better control consumer behavior, consumer experience, monetization and analytics of the process.
“At Telestream, the development of our iQ portfolio is driven by the problems that are unique to the delivery of media over IP,” Carnahan continues. “This continues to be true for OTT delivery – it’s just a different set of problems. Our customers seek control over this, and also to control costs. When OTT was an experimental novelty, organizations weren’t staking revenue on these new channels. But as they transition to OTT becoming the core business, then bottom-line profitability is directly related to costs. For Telestream, this speaks to the level of integration we have within our customers’ workflows. As a technology vendor, we can do so much more depending on the amount of the whole process that we’re participating in. OTT is not as componentized as television was in the past. We must have a broader portfolio of system solutions. We’re not there yet, but strategically it is where we are heading.”
For Rohde & Schwarz, it’s all about the old maxim of sticking to core competencies. “We envisage some customers making some elements in-house,” says Tim Felstead. “The determining factor will be the answer to the question ‘what is core to their businesses?’. Put another way, when is it imperative to do in-house development and when is it not? Undertaking a technology development program while at the same time offering a media service to consumers can increase financial and operational risks if not executed with great care. In some cases we see in-sourcing by our customers, and we enter development operations with them, but these are always case by case and often conclude with the realization that some core competences can be bought in from technology suppliers with far less risk than BIY (build it yourself).”
Harmonic sees different requirements depending on the scale of the organization, and like Rohde & Schwarz, sees concentration on core business as a key consideration too. “We have seen a trend with AT&T buying Quickplay, and most recently Disney buying BAMTech. They are choosing to develop in-house technology for their OTT delivery platform and, as such, create a vertical integration strategy,” Andy Warman explains. “This is a trend for large tier-1 distributors that has some impact on Harmonic’s business, but we also anticipate they will be buying solutions from vendors like Harmonic.
“Other organizations are doing the exact opposite: having launched their OTT services with in-house technology, they are buying certain components of their platform from vendors like Harmonic when they need an upgrade. They want to focus all their efforts on creating the best content and viewer apps, which are key to making their OTT services a success and providing significant differentiation in terms of QoE. They concluded that the internal R&D effort required to keep their platform maintained and up to date with the latest technologies was ultimately too complex and expensive,” Warman adds.
IBM Aspera’s Richard Heitmann also sees the key being in delivering the services. “‘Build it or buy it’ has been an expansion mantra for years. There will continue to be a few dominant players in OTT origination and management services, and they will be used as outsourced providers for smaller services. Other big networks/content owners are poised to launch their own services and will likely also offer those services to other parties. Regardless of who is originating, managing or distributing the content, we expect the use of IBM Aspera for file and stream transport of OTT content to continue to grow.”
For Mainstreaming, bringing the expertise in-house is the way to go – and it has advantages for Mainstreaming too, as Antonio Corrado explains: “We think that this is smart, as it will allow them to control the technology flow on their platforms with an experienced team instead of starting from scratch. It helps us too, as we will now be able to partner directly with broadcasters instead of having to go through third parties, which may speed up the process of technology adoption.”
For Red Bee Media’s Steve Russell, it depends on scale. “The technology is really a hygiene factor, it just has to work, at scale and across all platforms. Perhaps a handful of truly global players can own and operate the full stack. Our view is that it makes more sense to partner with a Service provider that is laser-focused on getting the platform right, so that our customers can focus on their brands, their narratives, their fans, viewers and their business model.”
Synamedia’s Nick Fielibert agrees: “Some big media companies can afford to do their development in-house. They will probably keep doing this until they feel they can buy solutions on the open market that fit their needs. We see this happening as vendors like ourselves meet media companies’ specific requirements.”
Riding both horses
Deluxe sees advantages in both approaches – and feels it is set up to benefit whichever route the customer chooses. “As the volume of content continues to grow, we see customers opting for a range of models, from those that continue to wish to operate via a 3rd party managed service to those, such as Disney, who take some steps towards internalizing,” says Ian Robbins.
“Via its One platform, Deluxe is in the position to work with customer wishes and provide a range of options from the traditional managed service model through to a PaaS (Platform as a Service) model where the client is in more control. By designing and building One in a modular fashion, we are able to offer platform services in the same way — providing customers with the choice of what capabilities they would like to internalize and what they would like to outsource to the market for self-service.
“For clients wishing to go further and develop bespoke capability in-house, Deluxe has a wealth of experience and know-how that we are open to use in partnership with customers to ensure they get the outcome they desire and can continue to publish content to all necessary output points. Whether the trend continues towards further internalization of technology development of publish capabilities or if it swings in the opposite direction, Deluxe is strategically positioned with API integration at the core of the platform to support the industry and lean into either model,” Robbins adds.
“The trend established by Disney is an indicator of what will happen with greater pace,” says Videon’s Todd Erdley. “If OTT and linear delivery offered the same experience, rapid change would not be needed. We do not believe linear = OTT is the trend, and this is due to how people are consuming video on second-screen devices. The data indicates more and more people are using the second screen as their primary screen. This creates an opportunity for a highly interactive delivery where differentiated content must be enabled. Broadcasters will look to emerging platforms and service providers to create that fast differentiation.”
Following the moves of companies such as Discovery into the cloud for playout, is it now the natural destination for all playout – and is this putting pricing pressure on playout vendors?
Not any time soon according to Pebble Beach Systems! “[Cloud playout] is absolutely still the exception, but these pioneering projects rightly attract much interest within the broadcast community, along with much debate as to whether the hoped-for cost savings actually materialize once systems are deployed,” says Marketing Manager, Alison Pavitt. “We are regularly asked to put proposals together for such deployments, and to participate in exploratory proof of concepts from end users who expect to be playing out from the cloud at some point in the future.
“Customers do see the cloud as a way of gaining efficiency, but let’s not forget that the majority of channels are always ‘on’, whereas cloud deployments are particularly suited to ‘bursty’ activity. Is it really useful to have an infinitely elastic ecosystem for playout, when playout is usually static? Pricing, however, remains robust. With linear playout still at the heart of broadcasters’ revenue generation, there is emphatically still a strong market for proven, specialist enterprise solutions from expert vendors with experience and ability in this field,” Pavitt adds.
“The cloud does not solve everything, especially when we talk about public cloud, where valuable content is treated just like any other data,” says Broadpeak’s Nivedita Nouvel. “The public cloud is good for some processing, analytics, and head-end functions, but when it comes to delivery, you need to control the network and its equipment (even virtualized and containerized in the operator’s private cloud) to achieve the best QoE possible.”
For Deluxe, “At the moment, we see major broadcasters moving their main channels to the cloud as an exception,” says Ian Robbins. “That said, it’s a trend we expect to see continue as broadcasters gain more trust in the cloud and seek continued workflow and cost efficiencies. But the move to the cloud is more than just a technology change, it’s a transformational exercise for broadcasters and playout service providers which can result in attractive reductions in content distribution costs. As companies take on this transformation, it’s critical they keep the end goal in mind and realize that the software-defined and cloud-based nature of this model ensures that, operationally, the service can be run from any location – it is geographically agnostic, enabling a new variety of servicing models to broadcasters.”
IBM Aspera also sees playout moving towards the cloud, while acknowledging some of the barriers Pebble Beach points out. Says Richard Heitmann: “While the cost for cloud playout with the same resilience and redundancy as traditional infrastructure is still high – and the mentality gap between cloud and traditional broadcast engineering still needs to be overcome (good enough vs. ‘broadcast quality’, iterative vs. stable and wait, etc.) – cloud-based linear playout is expected to grow.
“IBM Aspera is helping broadcasters with the transition. Our Orchestrator and Aspera on Cloud (AoC) automation tools can assist with automating complex content ingest and prep workflows both in and out of the cloud to reduce cost, increase reliability, and therefore, build confidence,” Heitmann asserts.
“Harmonic is also a player in the linear playout cloud space,” says Andy Warman, who also sees the move to the cloud bringing benefits to its customers. “We refer to this as Channel Origination as it represents the start of the content delivery chain, originated in the cloud. We offer a full, end-to-end video delivery solution running on the cloud – supporting everything from live and clip playout to delivery via broadcast and OTT to consumers. This puts us in a unique position, as we can offer as much of the linear and OTT delivery chain as the end user needs. The move to cloud-based playout is actually helping rather than hindering our ability to assist customers in leveraging cloud-based technologies.”
The $64 million question
“The $64 million question is: what is a broadcaster achieving by migrating linear playout to the cloud? Essentially, they are third-partying their IT team and renting their data center. Is it more cost-effective than building it yourself – eventually, yes,” says Telestream’s Shawn Carnahan. “Moving to the cloud for linear playout is not an inherently difficult thing to do: the key question is one of cost. If broadcasters are migrating to the cloud, they want to exploit any potential economies of scale.
“With the cloud, broadcasters have more elasticity to cater for increases in channel demand. Whenever they win new business, they can build up the new channels in hours. And just as easily, they can tear them down again if they lose the channel playout contract. At Telestream, we introduced OptiQ this year to meet exactly this demand: it lets you create channels automatically – perhaps event-based channels – that meet all the service level expectations associated with 24x7x365 live linear playout. In developing OptiQ, we have solved the challenges associated with this need and applied it to a much broader audience,” Carnahan adds.
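The elasticity argument can be made concrete with simple arithmetic: an always-on channel is billed for all 8,760 hours in a year, while an event-based channel spun up and torn down around each event pays only for the hours it runs. A rough sketch, where the hourly rate and event counts are hypothetical placeholders, not real cloud or vendor pricing:

```python
# Hypothetical cost comparison: always-on vs event-based cloud channels.
# All rates below are placeholders, not actual cloud or vendor pricing.

HOURS_PER_YEAR = 24 * 365  # 8760

def always_on_cost(rate_per_hour):
    """A 24x7x365 channel is billed for every hour of the year."""
    return rate_per_hour * HOURS_PER_YEAR

def event_based_cost(rate_per_hour, events, hours_per_event):
    """An elastic channel is spun up per event and torn down afterwards."""
    return rate_per_hour * events * hours_per_event

# e.g. 50 events a year, 6 hours each, at an assumed $12/hour of compute
print(always_on_cost(12.0))             # 105120.0
print(event_based_cost(12.0, 50, 6.0))  # 3600.0
```

The gap only exists for channels with genuinely bursty schedules, which is exactly Pavitt’s caveat above: a channel that is always on pays the full-year figure either way.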
“We have made a very significant investment in moving to a cloud model for playout and can now deliver extremely complex broadcast channels from our cloud-based operations,” says Red Bee Media’s Steve Russell, who also points to their “agility and flexibility, which boost opportunities for growth for our customers. What cost benefits cloud does bring come through faster deployment time rather than lower licensing costs. If they focus on automated deployment and efficient operations, vendors can still offer a great deal of value in the linear playout space. Our new platform has been born out of investment in software, engineering, operational innovation, partnerships and private cloud infrastructure. Public cloud is fantastic for many use cases, but it is not ready for primetime, live, uncompressed, low latency linear experiences at this point on numerous levels.”
Is latency still an issue in live streaming – what hurdles still have to be overcome?
“If you ask 10 people to define ‘latency’, you might get 12 different answers!” jokes IBM Aspera’s Richard Heitmann. “Where possible and necessary, latency should be kept to a minimum to keep the in-venue fan experience in time with the event, support remote control/operations (REMI), and prevent ‘New York Neighbor’ syndrome. Further, artificial latency – while in some cases offering a higher class of service – gives the originator less control over end-to-end timing and release.
“Encoding/decoding technology could still be improved to provide lower latency at lower cost for contribution. Recent innovations like CMAF (Common Media Application Format) and low latency HLS may help with reduced latency over the ‘last mile’ to the consumer. IBM Aspera’s streaming technology can substantially reduce contribution transport latency while maintaining high fidelity, resiliency and reliability. When combined with low-latency encoding and decoding, end-to-end contribution latency can be significantly reduced, leaving any significant latency on the distribution side,” Heitmann adds.
Barriers still to be overcome
“Latency is the #1 issue for live streaming,” says Videon’s Todd Erdley. “The days of linear being delivered 30+ seconds in advance of OTT will come to an end very soon. By 2022, the latency discussion will go away for OTT vs linear. Latency will then take another twist as OTT provides the opportunity for true interaction. And with the second screen becoming the first screen, OTT will experience another rush to move from linear latency levels to truly interactive latency, and this will be done at mass scale. Overcoming the first hurdle can be done through broad adoption of CMAF with HTTP streaming. That can solve the linear/OTT problem. Moving to interactive delivery will require a shake-out of technology including WebSocket, WebRTC, Apple LLHLS and advances in HTTP streaming, along with new signaling standards. This shake-out is slowly happening but will not be available for wide adoption until we first solve the OTT/linear latency barrier.”
“Latency continues to improve,” asserts Harmonic’s Andy Warman. “We are seeing latency in the 5- to 6-second range for OTT-based live streaming and playout from the cloud. This puts latency for OTT in the same range as broadcast. The result is that, for example, live sports viewers watching via OTT and broadcast see the game action at the same time. This is possible thanks to CMAF (fragmented MP4, or fMP4), which enables much lower latency delivery to consumer devices. The challenge is that iOS went a different route than low-latency CMAF, and we still do not have a unified workflow where we can package, encrypt and stream one file format with two manifests (i.e., DASH/HLS).”
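The latency gain Warman describes comes from CMAF’s chunked transfer: the player can start on the first chunks of the live segment instead of buffering several whole segments. A back-of-envelope comparison, with segment durations, chunk durations and the encode/CDN overhead chosen purely for illustration:

```python
# Back-of-envelope live streaming latency budget (illustrative numbers only).

def glass_to_glass(segment_s, buffered, chunk_s=None, encode_cdn_s=2.0):
    """Rough end-to-end latency for HTTP streaming, in seconds.

    Classic HLS/DASH players wait for whole segments and buffer several
    of them; CMAF chunked transfer lets playback start on the first
    chunks of the live segment instead.
    """
    unit = chunk_s if chunk_s is not None else segment_s
    return encode_cdn_s + unit * buffered

classic = glass_to_glass(segment_s=6.0, buffered=3)               # 6s segments, 3 buffered
chunked = glass_to_glass(segment_s=6.0, buffered=3, chunk_s=1.0)  # 1s CMAF chunks
print(f"classic: {classic:.0f}s  CMAF chunked: {chunked:.0f}s")
```

With these assumed numbers, whole-segment delivery lands around 20 seconds glass-to-glass, while one-second chunks bring it to roughly 5 seconds – in line with the 5- to 6-second range quoted above.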
“The big question is what are your expectations?” says Telestream’s Shawn Carnahan, reflecting Richard Heitmann’s answer above. “The elegance of HTTP-based OTT is that it enables media distribution through content delivery networks that don’t treat media any differently than a web page. When we first did it, people said, ‘Wow, that’s cool! Yes, there’s latency but the positives more than outweigh the negatives.’ Now, as this is understood, there has been a search to optimize the video through conventional approaches. If we can reduce latency to around 10 seconds without having to change the network, that is largely problem solved for the vast majority of applications… the standardization efforts of DASH and HLS have largely solved the problems for mainstream applications.”
“Latency is still an issue for linear OTT, but the technology is ready to overcome this and provide latencies similar to traditional broadcast,” says Nick Fielibert. “Synamedia is a leader in this space and provides an end-to-end approach that avoids a weak link in the chain. For example, in some cases the OTT player is not supported by the devices. This is now also changing with Apple providing its own low latency HLS specification, which we support.”
Red Bee Media’s Steve Russell thinks the latency problem is being mastered but standards work still needs to be done. “We have received a lot of attention and recognition for our market-leading work here – delivering as low as 3.5 second latency for live OTT streams. The challenge we overcame was to deliver low-latency results using open standards so that we can still inter-operate with the wider tech ecosystem. These technologies are still maturing, as standards adoption takes time.”
Standards are also an issue identified by Deluxe. “Beyond production we are still seeing latency being a typical sticking point due to the variety of formats and standards that need to be supported,” says Nav Khangura. “CMAF is a new standard announced by Apple and Microsoft which aims to address the low latency requirements by introducing chunked encoding and chunked transfer encoding. This approach not only reduces costs and complexity (by doing away with multiple formats for multiple devices), but it also has the potential to reduce latency to the sub-three second mark. The major hurdle that needs to be overcome now is actually rolling out CMAF broadly across the industry. The success of CMAF will rely solely on industry uptake. For example: Apple recently announced its own ‘Low-Latency HLS’ which once again contradicts the push to standardize the industry on CMAF.”
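The sub-three-second figure Khangura mentions follows from simple arithmetic: with chunked encoding and chunked transfer, the player buffers a handful of ~200 ms chunks instead of several whole 6-second segments. A back-of-envelope model, with illustrative numbers rather than measurements:

```python
# Back-of-envelope model of why CMAF chunked transfer encoding cuts
# glass-to-glass latency (illustrative numbers, not a measurement).

def classic_hls_latency(seg_dur=6.0, buffered_segments=3):
    # The encoder must finish a whole segment before publishing it,
    # and players conventionally buffer several segments before playback.
    return seg_dur + buffered_segments * seg_dur

def cmaf_chunked_latency(chunk_dur=0.2, buffered_chunks=10):
    # With chunked encoding + chunked transfer, each ~200 ms chunk is
    # pushed as soon as it is encoded; the player buffers chunks, not
    # whole segments.
    return chunk_dur + buffered_chunks * chunk_dur

print(classic_hls_latency())   # 24.0 s -- typical multi-segment buffering
print(cmaf_chunked_latency())  # 2.2 s -- in the sub-three-second range
```

The model ignores encode time, CDN propagation and player start-up, which is why real deployments land somewhat above these floors.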
Mainstreaming has also reduced latency problems, through a combination of direct connections and AI.
“Network workflows are essential to optimizing the way video is streamed, which is why we realized we couldn’t rely on legacy providers and instead built up our own network that connects to clients at the point of ingest and directly interconnects with ISPs to deliver video seamlessly to audiences,” says Antonio Corrado. “With this approach, plus our AI that prevents congestion, we have decreased latency by up to 3x the standard rate, essentially increasing the speed of stream TTFF (Time To First Frame) and E2E (end-to-end) delivery.”
“Latency has become a way for network operators to differentiate themselves,” says Broadpeak’s Nivedita Nouvel. “Many technology providers pitch low-latency solutions based only on head-end (i.e., encoding and packaging) and player optimizations, but they forget the network involved in the delivery. It is a mistake that leads to low-latency solutions that work in the lab but not in real conditions. At Broadpeak, we have solved this by combining CMAF Low Latency with a multicast ABR managed network, creating the conditions necessary to reduce buffer sizes in players without impacting service continuity and hence QoE.”
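The bandwidth argument behind multicast ABR in a managed network comes down to one line of arithmetic: unicast load grows with the audience, while a multicast stream crosses each network link once regardless of how many people watch. The numbers below are illustrative assumptions, not Broadpeak figures.

```python
# Illustrative arithmetic (hypothetical numbers) for why multicast ABR
# helps a managed network: unicast cost grows with the audience, while a
# multicast stream crosses each network link once no matter who watches.

BITRATE_MBPS = 8          # one ABR profile, e.g. a 1080p rendition
VIEWERS = 100_000

unicast_load = VIEWERS * BITRATE_MBPS    # every viewer pulls a copy
multicast_load = BITRATE_MBPS            # one copy per network link

print(f"unicast:   {unicast_load:,} Mbps")   # 800,000 Mbps
print(f"multicast: {multicast_load:,} Mbps") # 8 Mbps per link
```

With the delivery network no longer the choke point, players can safely run the smaller buffers that CMAF Low Latency makes possible.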
Try and try again
“Yes, latency is still an issue,” says Tim Felstead from Rohde & Schwarz, giving us a recent example – and providing me with the perfect link to the next question on 5G. “One of our Munich-based staff members was watching the Rugby World Cup semi-final over a streamed service. They received text messages from New Zealand (the country) saying the English team had scored very early in the match BEFORE seeing it happen on the screens in Munich. That said, I think there is a difference between the state of the art in deployments and the state of the art in current technology. There are solutions to reduce latency across network components and R&S is busy building one in the shape of 5G Broadcast. With 5G Broadcast technology, latency in that part of the network is tiny, so if the other parts fall into line (encoding, network buffering etc.) then this is certainly a solvable problem.”
5G – an incremental advance or a revolution for content distribution?
“Advances in networks are crucial to industry advancement,” says Steve Russell of Red Bee Media. “It’s about wider pipes and greater agility in points of content acquisition and distribution. Yes, it’s all incremental, of course, but 5G offers a really important step-change in capacity and capability. We are leaders in live and are really excited by what 5G offers in that domain for sports, events and greater interaction for fans everywhere.”
Videon agrees on the content acquisition side of the 5G equation, but sees some barriers to full-on consumer adoption. “In the next few years 5G will revolutionize event production in a controlled environment,” says Todd Erdley. “5G will merely be a small incremental gain for consumers until such time as consumers can use high speed data delivery in the manner they expect. Such expectations will only be met when propagation issues are solved, which is a very, very, very significant problem for the highest data rate 5G.”
Revolutionizing content distribution
IBM Aspera has some skin in the 5G game, and sees a bright future. “5G will revolutionize content distribution, particularly for more personalized content,” says Richard Heitmann. “5G as a cellular platform offers faster speeds and reduced latency resulting in higher resolution and higher bitrates. The user experience can also be greatly enhanced, especially when combined with edge computing. New AR/VR experiences require real-time processing in the supply chain.
“For 5G as a wireless LAN, there will be substantially improved network throughput and offerings. The most common use case for this will be in (sports) venue real-time fan interaction. If content is served locally (from the venue), the offerings could be endless with instant and fast delivery. IBM Aspera will continue to provide transport layer improvements with 5G, in addition to core enhancements to FASP transfer and streaming technologies which will continue to have a prominent place in the delivery/distribution ecosystem.”
Will mobile become 1st screen with 5G?
While seeing some advantages, Telestream’s Shawn Carnahan is skeptical of the wider benefits of 5G. “Will 5G make better experiences more portable – yes, highly likely. There’s definitely something to be said for portability. It doesn’t affect the content – it just further moves the viewing experience to smartphones. But a key question to me is just how good can that experience be on a four-inch screen? At Telestream, we’re doing a lot of work with HDR, which is a technology that really positively impacts the home viewing experience. But, how much of that quality can be seen on a smartphone in bright daylight?”
But just maybe the future will be mobile-first viewing, in which case the optimism of Mainstreaming’s Antonio Corrado is justified. “5G will be great for content distribution as it will increase the bandwidth every device is able to utilize, which may push the industry into a mobile-first focus for video streaming. The best part about 5G is that delivery networks will essentially be able to stream video more easily compared to today’s 4G networks, as it will increase speeds and the ability to stream heavier files,” Corrado says.
Deluxe’s Ian Robbins also sees 5G driving mobile viewing. “Improved speeds [with 5G] should continue to drive the demand from consumers to watch more and more video on the go, with this increasingly stretching to the streaming of feature content and live events versus the streaming of short form and the viewing of downloads. I expect that 5G will also be an enabler for the type of content being viewed on the go with an increasing desire to watch things in UHD with the greater transfer speeds offered by 5G. In this sense, I think it’s likely that 5G will have a big impact on content distribution as we’ll see increased volumes of content, in increasingly advanced formats, being distributed to ‘on the go’ platforms,” Robbins says.
Harmonic’s Andy Warman also sees a bright 5G future. “5G is a new technology that will lead to new services and therefore new revenue streams. Some of the services that will be offered with 5G not currently available for 4G include: multi-game, multi-view, virtual reality and augmented reality to mobile devices. We expect mass adoption in 2022-2024, given that only a few countries will have a 5G network set up in 2020.”
Eliminating the network bottleneck
Avraham Poupko, Manager of the Architects Team at Synamedia, also predicts a massive impact for 5G: “We believe that 5G will revolutionize content distribution.
“One of the main barriers to offering truly interactive content at very high definition is the network. 5G will eliminate that bottleneck. The increase in delivery capacity with 5G will allow video service providers to distribute bandwidth-intensive content including 4K and high-quality interactive content such as 8K-based virtual reality.
“We then expect breakthroughs in related areas such as high availability storage and video processing. Synamedia is paying close attention to this space and plans to play a leading role in this revolution,” Poupko concludes.
Final word on 5G goes to Rohde & Schwarz, who are heavily involved in 5G broadcast testing and rollout.
“R&S believes in 5G Broadcasting (and as a corporation the wider 5G standard as a whole),” says Tim Felstead. “We believe that 5G Broadcasting is a huge opportunity for several industries including and beyond live entertainment distribution. In the media space we see it as a goal that our industry has been working towards for some time; linear media delivery to mobile / telecoms-based consumer devices. For the first time we have a telecoms-based standard that has been written with broadcasting input from the very beginning. This makes delivery to mobile devices highly network efficient (read any number of articles about the explosion of video content consumption that will saturate networks), with very low latency and that can operate SIM-free (to tablets without a contract for example).”
Piracy remains a major issue with losses in the $billions being reported. How are you helping your customers to fight back?
With $billions of potential revenues being lost to video piracy, this is a key battlefront if broadcasters and media companies are going to be able to get the returns on their content investments to fund the creation of new content – and of course, remain profitable. Alan Ogilvie, Lead Product Manager at Friend MTS, recognizes this, and has some weapons to help the industry fight back. “Friend MTS recognizes that piracy directly impacts your investment in content, whether that’s the money spent on content creation or the revenue associated with content distribution,” Ogilvie says.
Protecting investment and revenue
“We have tools to aid the fight to retain control and protect your investment and revenue models. On OTT, some people still believe that Digital Rights Management is enough. In the Conditional Access systems in Satellite, some believe that’s enough too. It certainly isn’t. While these methods are important, it’s vital you let us apply an audience-imperceptible watermark in the distribution chain and/or on the client side in a robust and trusted manner. Then you need our global monitoring and investigations services to go looking for your content being pirated and to determine the subscriber that leaked it or the distribution method used. These proprietary technologies work together to help you to detect, deter and disable so that you can protect your investment or revenue,” Ogilvie concludes.
“IBM Aspera has always offered the ability for both encryption in transit and encryption at rest,” says Richard Heitmann, who also agrees that integration with specialist services is the way forward for complete protection. “At rest, encryption supports both client-side and server-side secrets. This is just part of a solid content security framework. When used properly, encryption at rest is a reliable deterrent.
“Further, we recently released an automation feature for Aspera on Cloud that allows integration with third-party services as part of the file transfer process. Specifically, along with our partner, Irdeto, we have integrated forensic watermarking capabilities into our distribution workflow, thus enabling content to be tracked to each recipient. This automation feature to incorporate third-party best-of-breed services such as watermarking, fingerprinting, blockchain and the like, significantly improves security for high-value content,” Heitmann concludes.
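The general idea behind tracking content to each recipient can be illustrated with a hypothetical sketch (this is not Aspera’s or Irdeto’s actual scheme): derive a recipient-unique payload from a keyed hash, embed it imperceptibly in the delivered video, and later match a payload recovered from a pirated copy back to the subscriber who leaked it.

```python
import hmac
import hashlib

# Hypothetical sketch of per-recipient forensic watermark payloads -- not
# any vendor's actual scheme. The distributor derives a short bitstring
# unique to (content, recipient), embeds it imperceptibly in the video,
# and matches a recovered payload back to the leaking recipient.

SECRET = b"distributor-signing-key"   # assumed key, held by the distributor

def watermark_payload(content_id, recipient_id, bits=64):
    """Derive a recipient-unique bitstring to embed in this delivery."""
    msg = f"{content_id}:{recipient_id}".encode()
    digest = hmac.new(SECRET, msg, hashlib.sha256).digest()
    value = int.from_bytes(digest[:bits // 8], "big")
    return format(value, f"0{bits}b")

def identify_leaker(content_id, recovered, recipients):
    """Match a payload recovered from a pirated copy back to a recipient."""
    for r in recipients:
        if watermark_payload(content_id, r) == recovered:
            return r
    return None

mark = watermark_payload("movie-123", "subscriber-42")
assert identify_leaker("movie-123", mark,
                       ["subscriber-41", "subscriber-42"]) == "subscriber-42"
```

Real systems must also make the embedded payload survive re-encoding, cropping and collusion attacks, which is where the specialist watermarking vendors earn their keep.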
Staying ahead of the pirates
Final word on piracy goes to Synamedia, whose Video Security Product Manager, Rinat Burdo, points out that “Synamedia has a track record in securing Pay-TV services and revenues for over 30 years. We support our customers with layers of security to stay ahead of the pirates – anticipating their attacks and their reaction to our security countermeasures. This includes preventative measures as well as reactive measures to fight the inherent ecosystem vulnerabilities that pirates exploit.”
Burdo continues, “Our approach is to offer an end-to-end solution rather than point products. This approach is more effective against attacks that exploit the weakest link, and provides our customer with a continually enhanced umbrella against evolving attacks.”
Burdo agrees with Friend MTS’s Alan Ogilvie that to complete the circle, you need to be actively on the lookout for piracy too. “Proactive intelligence is the key to an effective anti-piracy solution. This intelligence has to be global because streaming piracy ignores country borders. We employ technology tools and data analytics to track pirate activities at scale. We combine this with human intelligence to investigate pirate operations and anticipate pirates’ next steps, as well as forensic analysis of the tools pirates use. Armed with this knowledge, our solutions are ready to defend our customers’ systems against new and evolving attacks,” Burdo concludes.