
White Paper
The Digital Supply Chain – Managing Media In A New Paradigm
“The need to implement ‘just in time’ supply chain management, as seen in other industries, becomes increasingly important as media organisations are driven to improve content management efficiencies. To achieve this, the gathering of data also becomes increasingly important. It is the orchestration platform that needs to gather and utilise this data for intelligent, automated decision making across the digital supply chain.” ~ Tony Taylor, Executive Chairman, TMD Ltd
Media networks today are under enormous pressure. The industry is in a state of disruption as consumers drive demand for more content on more platforms. As broadcasters move to OTT, the volume of business is substantial, but the margins are not, making profitability a challenge. This drives the need for cost efficiencies, flexibility, and speed throughout the supply chain. An agile, software-defined process orchestration platform that delivers business intelligence and automated decision making for cost savings and efficiencies is therefore essential.
Introduction
Broadcast television production has always been based on a series of processes, from acquisition – shooting on location or in the studio – to post-production to distribution and final transmission. In the days of 16mm film and early videotape these were largely self-contained steps rather than a fully integrated chain. The advent of digital brought the concept of the workflow, a smooth, interconnected structure that has become more end-to-end with the wide-scale adoption of tapeless, file-based operations.
The need for such an incorporated, joined-up approach was always apparent on the professional production side of broadcasting, but now there is an ever greater requirement for an all-encompassing infrastructure that connects with the way people are receiving and watching programmes and other media content.
There has been a proliferation of devices such as smartphones, tablets and smart TVs, all connected to a wide array of platforms, from conventional TV channels to OTT services such as Netflix and Amazon to a huge number of YouTube channels, supplying a broad variety of content. This is being matched by the growing demand for films and other programming, with a parallel rise in the number of formats being developed to reach these new platforms. The situation can be summed up by the term ‘Content Everywhere’. This is not just another technology buzz phrase; it accurately conveys a situation in which more material is available on many more outlets and received by even more devices.
Workflow Orchestration
Workflow orchestration, although not new, is one of the big phrases in the industry today. Along with IP connectivity, it is the only way we are going to survive. In simple terms, workflow orchestration means we define all the things we might want to do with content as process tasks, and we push the content from task to task, whether fully automated or with manual human interaction, until we reach the required deliverable. Workflow orchestration is our friend. It gives us both a technology and a business platform that can quickly adapt to changing business requirements for broadcasters, media organisations, content owners, producers, and the wider creative community. Where once, for a broadcaster, it could take months to put a new channel on air, today we can often measure that in hours. We are no longer hard-wiring workflows, but drawing them on screen and allowing the orchestration engine to manage them.
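To make the idea concrete, below is a minimal sketch, in Python, of content being pushed from task to task until the required deliverable is reached. The Asset structure and the task names are illustrative assumptions, not taken from any particular orchestration product; the point is that the workflow is data that can be redrawn, not hard-wired code.

```python
# A minimal sketch of workflow orchestration: content is pushed from
# task to task until the deliverable is produced. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    metadata: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

def ingest(asset):
    asset.metadata["ingested"] = True
    return asset

def transcode(asset):
    asset.metadata["format"] = "H.264/MP4"
    return asset

def quality_check(asset):
    asset.metadata["qc_passed"] = True  # a real QC task would analyse the media
    return asset

def deliver(asset):
    asset.metadata["delivered_to"] = ["linear", "OTT"]
    return asset

# The workflow is configuration, not wiring: reordering or extending it
# means editing this list, not rebuilding the system.
WORKFLOW = [ingest, transcode, quality_check, deliver]

def orchestrate(asset, workflow=WORKFLOW):
    for task in workflow:
        asset = task(asset)
        asset.history.append(task.__name__)
    return asset

if __name__ == "__main__":
    result = orchestrate(Asset("EP-0042"))
    print(result.history)  # ['ingest', 'transcode', 'quality_check', 'deliver']
```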
But the orchestration engine is only a mechanism to carry out the processing. What really enables operational efficiencies to be unlocked, and is critical to success, is metadata. If you acquire, manage and use metadata efficiently, the capacity for delivering more content to more devices on more platforms is maximised. Departments within and across organisations all have their own applications, and each business area cares about its own slice of the metadata, whether rights information, technical parameters or content descriptions. Taken together, and enriched with AI-derived metadata generated as the content is processed, such as speech-to-text, facial recognition, people detection, on-screen text and brand/logo detection, this enables a significant level of automated decision making, and it is this that releases the power of the modern content supply chain.
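As a hedged illustration of that automated decision making, the sketch below routes an asset purely on its accumulated metadata. The field names (rights_cleared, explicit_speech_detected, qc_passed) are hypothetical stand-ins for the business, technical and AI-derived metadata described above.

```python
# A sketch of metadata-driven routing; every field name here is hypothetical.
def route_asset(metadata: dict) -> str:
    if not metadata.get("rights_cleared", False):
        return "hold_for_rights_review"
    if metadata.get("explicit_speech_detected", False):
        return "manual_compliance_review"  # flagged by AI speech-to-text
    if metadata.get("qc_passed", False):
        return "auto_publish"
    return "send_to_qc"

print(route_asset({"rights_cleared": True, "qc_passed": True}))  # auto_publish
```

The richer the metadata gathered along the way, the further such rules can push the fully automated path before a human needs to intervene.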
To make all this work together calls for a new concept behind how the existing workflows are organised. The result is what we term the Digital Supply Chain (DSC). It enlarges end-to-end workflows beyond the studio or playout centre, encompassing everything from acquisition at the beginning of the production process through to final delivery at the other end.
Existing technologies and processes, such as transcoding, quality control (QC) and post-production, form part of the DSC. But these will have to work with new techniques that are needed to bring everything together to form software-defined, agile workflows. The key here is ‘orchestration’, which neatly sums up the concept of making many disparate components work alongside each other to create a fully integrated whole.
The aim of this White Paper is to discuss the evolution of both the DSC and the technologies that are making it possible – the most obvious example being the Cloud – and examine how it can be used to create a service-oriented architecture that seamlessly and cost-effectively interlocks the production domain with the consumer world of multiple platforms and devices.
The Development of the Digital Supply Chain and How It Caters for ‘More of Everything’
A Digital Supply Chain can be simply defined as the means by which digital media is delivered from its source, in this case a broadcaster, OTT streaming service or other content provider, to the consumer, specifically the audience watching on TVs, computers, mobile phones or tablets.
The diagram below provides a simplified view of the DSC relating to a content provider or broadcaster. At the centre is the digital ecosystem of the media organisation, with the content acquisition processes, content management and preparation processes, and content packaging and distribution processes. Each of these business areas has interactions with third party suppliers as well as geographically dispersed internal divisions and departments.
On the left we have inputs to the acquisition process from third parties. These might be film studios, production houses, content owners, and any number of other suppliers. There are also outputs to these third parties during negotiations or, for example, if content doesn’t pass the specified format or quality standards.
At the bottom we have inputs from and outputs to third party suppliers covering subtitling, closed captioning, transcription, audio dubbing and post-production, along with other geographically dispersed internal operations. And on the right are the outputs to third parties and internal divisions for the distribution of media content to linear and on-demand platforms as well as other consumption environments such as international sales and licensing.
So we can see, even from this simplified view, that there are a significant number of content, business, and technology processes, both internal and external, that need to be coordinated and managed if the volume of content required to meet ever-increasing consumption demands is to be achieved. This is where orchestration solutions come to the fore.
The Evolution of Media Asset Management
The industry has had many iterations of solutions to manage media content. In the early days, playout automation was the only solution in broadcast operations that managed content in file form. It would ingest the content from tape to the playout cache or directly to the servers, and content was only important within a fairly short window prior to transmission. Playout automation was essentially the first orchestration layer for linear broadcast environments. Then came the early iterations of Media Asset Management (MAM) systems.
All these processes are established parts of today’s broadcast production workflows, but to create a full DSC they need to be completely integrated with each other. Managing a DSC is becoming increasingly complex because of the growing number of platforms, devices, formats, bit rates, and technologies involved. There is also more third party involvement, with broadcasters and content providers working with different manufacturers and service providers (including playout). On top of this, the amount of distributed storage is increasing, and the content it holds needs to be processed and managed.
Bringing all these functions together as a DSC comes under the heading of media orchestration. This covers not only how automated playout is controlled and the MAM involved but also how material is configured and monitored. In effect, orchestration facilitates and smooths the whole supply process as well as connecting and integrating the different elements involved.
The roots of orchestration lie in early playout automation systems for linear broadcast centres. This expanded to include the first iterations of MAMs, which could be used to search for information about content but were rarely able to orchestrate any major tasks without referring to a file system that enabled a process to be performed manually.
More sophisticated MAMs were developed that used scripting and watch folders to move content from one storage location to another or send it to be transcoded. The crucial part of any MAM is metadata. Information about a file should not exist just for the creation stage; it has to be updated and enriched at every stage of the production process. Every parameter and change should be noted in the metadata to ensure the content remains in the correct format, conforms to standards, including those for video coding on different platforms and audio loudness, and is directed to the correct outlet.
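A minimal sketch of the kind of watch-folder scripting those MAMs relied on is shown below. The folder paths, the .mxf filter and the submit_for_transcode() helper are illustrative assumptions, not part of any particular product.

```python
# A minimal watch-folder loop: poll a drop folder and hand any new file
# to a transcode step. All paths and helpers here are illustrative.
import shutil
import time
from pathlib import Path

WATCH_DIR = Path("/media/incoming")        # hypothetical ingest drop folder
PROCESSED_DIR = Path("/media/processed")   # where handled files are parked

def submit_for_transcode(path: Path) -> None:
    # Stand-in for a call to a real transcode farm or MAM API
    print(f"submitting {path.name} for transcode")

def watch(poll_seconds: int = 10) -> None:
    PROCESSED_DIR.mkdir(parents=True, exist_ok=True)
    seen = set()
    while True:
        for f in WATCH_DIR.glob("*.mxf"):
            if f.name not in seen:
                submit_for_transcode(f)
                shutil.move(str(f), str(PROCESSED_DIR / f.name))
                seen.add(f.name)
        time.sleep(poll_seconds)
```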
Agile software development is now being used to define workflows, with automated business decisions driven by adaptive planning based on metadata values. It is these technological implementations that make the DSC possible and enable it to be fully orchestrated.
The ideal orchestration platform has to be significantly more sophisticated and intelligent than its scripted predecessors if broadcasters and media groups are to satisfy the demand for content consumption while meeting the needs of monetisation. To do this an orchestration system has to not only control and monitor a platform but also receive feedback and metadata from all third party devices and systems in the supply chain, including transcoders. In effect it has to both gather the data and use it to apply intelligent decision making across the DSC.
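As a sketch of that feedback gathering, the code below polls a transcoder's job status over HTTP and turns the result into a workflow decision. The /jobs endpoint and the response fields are assumptions for illustration; a real transcoder exposes its own API.

```python
# A sketch of gathering feedback from a third-party transcoder and
# feeding it into a decision. Endpoint and fields are hypothetical.
import json
import time
import urllib.request

def get_job_status(base_url: str, job_id: str) -> dict:
    with urllib.request.urlopen(f"{base_url}/jobs/{job_id}") as resp:
        return json.load(resp)

def wait_and_decide(base_url: str, job_id: str) -> str:
    while True:
        job = get_job_status(base_url, job_id)
        if job["state"] == "completed":
            return "proceed_to_qc"
        if job["state"] == "failed":
            return "retry_or_alert"   # the failure feeds back into the DSC
        time.sleep(30)                # still running; poll again
```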
Content Everywhere – The Cloud and Beyond
Content is no longer just stored at a broadcaster’s or media supplier’s premises. Neither is it delivered to an audience over a single medium. Programmes and other material can be held at third party facilities, such as playout centres, from where it can be distributed to multiple platforms and devices.
Another possible dimension to the modern DSC that is being adopted increasingly by media organisations of all sizes is the Cloud. Essentially Cloud computing is an IT infrastructure based on computer networks, servers, storage systems and software applications that allows content owners and suppliers to store large amounts of data in locations other than their own premises but with full access and control.
There has been a lot of hype and discussion surrounding the Cloud in recent years, but the concept behind it is not new. What we are working with today is the third iteration of the Cloud. In the 1960s, 70s and 80s, mainframe and mini-computer bureaux offered time-sharing and subscriptions to fulfil the vision of computing as a utility. By the mid to late 1990s there was the ASP (Application Service Provider) model. This was not wholly successful due to the lack of virtualisation technologies combined with the restricted availability of fast, affordable communications links.
The Cloud we have now is well established and will continue to be successful for a number of reasons:
- Affordable, fast communication links are readily available.
- Hardware platforms are powerful enough to support the virtualisation software needed to create economies of scale.
- Organisations are seeking to benefit from an OpEx (operational expenditure, which covers ongoing, regular costs) model rather than having to invest in expensive hardware that requires a refresh every three to five years.
- The OpEx model for software subscriptions is gaining momentum.
Despite these obvious benefits, the Cloud is not suited to every organisation or application. When considering Cloud systems and technologies for a DSC, a potential user has to make judgements concerning the development and implementation of an overall system and what orchestration platforms will be needed to manage it.
The first major decision in selecting a Cloud is the type. Clouds can be public, private, or hybrid. If going public, there is the subsequent question of which service provider to choose; suppliers include Amazon, Google, Microsoft, Oracle and a whole host of others. A private system can be housed in a company’s own data centre, at a commercial hosted complex, or across a combination of the two. A hybrid Cloud can be any permutation of public and private.
Making this decision requires an understanding of what services are needed and which will be housed in the Cloud environment. For example, do you want to use Amazon S3 (Simple Storage Service) in the Cloud to store material that can be accessed by third parties for collaborative working?
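For that collaborative scenario, a minimal sketch using the boto3 AWS SDK might look like the following. The bucket name, object key and expiry period are illustrative; the presigned URL lets a partner download the material without holding AWS credentials of their own.

```python
# A hedged sketch: upload a finished package to Amazon S3 and hand a
# third party a time-limited presigned URL. Names are illustrative.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-media-archive"   # hypothetical bucket name

def share_package(local_path: str, key: str, expires_seconds: int = 3600) -> str:
    s3.upload_file(local_path, BUCKET, key)
    # The URL grants read access to this one object until it expires
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=expires_seconds,
    )

url = share_package("/media/processed/EP-0042.mxf", "packages/EP-0042.mxf")
print(url)
```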
A major content producer and broadcaster faced such questions when producing programmes for several online platforms, including YouTube. The aim was to implement an orchestration layer with Cloud integration to make this possible. Producers and editors search for and then select material using the MAM. Editing is carried out on a non-linear desktop system, from where the finished package is loaded back into the MAM.
This starts the orchestration process, which integrates an Amazon S3 storage system with the encoding and distribution platforms. This happens through APIs (Application Programming Interfaces), which signal if the process has been completed successfully or why it is interrupted or has failed.
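Where the platform signals status by pushing a callback rather than being polled, the handling might look like this hedged sketch; the payload fields (status, checkpoint, error) are hypothetical, standing in for whatever a given API actually returns.

```python
# A sketch of acting on an API callback from the encoding/distribution
# platform. Payload fields are hypothetical examples.
def handle_callback(payload: dict) -> str:
    status = payload.get("status")
    if status == "completed":
        return "notify_mam_and_close_job"
    if status == "interrupted":
        return f"resume_from:{payload.get('checkpoint', 'start')}"
    return f"raise_alert:{payload.get('error', 'unknown failure')}"
```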
The Benefits of Cloud Implementation and Orchestration for Digital Supply Chains
Content is a valuable commodity and digital technology has made it possible for unauthorised and unscrupulous people to get their hands on material. The controversy over leaked episodes of Game of Thrones amply illustrates this point. But digital technology can also make systems extremely secure, which is why security is a primary advantage of Cloud-based storage and distribution.
Access to Cloud instances (virtual servers), services, traffic and storage has to be thoroughly secured using access policies, secure protocols and firewall rules, including IP range and geo-fencing. There are also extensive options for encryption, digital watermarking and other security measures involving DRM.
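As one concrete example of IP-range restriction, the sketch below applies an Amazon S3 bucket policy that denies access from outside a given address range, again via boto3. The bucket name and CIDR block are illustrative assumptions.

```python
# A hedged sketch of IP-range locking on Cloud storage: deny all S3
# actions unless the request comes from an approved range.
import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideApprovedRange",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
            "arn:aws:s3:::example-media-archive",       # hypothetical bucket
            "arn:aws:s3:::example-media-archive/*",
        ],
        # 203.0.113.0/24 is a documentation range; substitute your own
        "Condition": {"NotIpAddress": {"aws:SourceIp": "203.0.113.0/24"}},
    }],
}

boto3.client("s3").put_bucket_policy(
    Bucket="example-media-archive", Policy=json.dumps(policy)
)
```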
Budgeting for Cloud deployments is a big challenge: compared with on-premises installations, costs are harder to estimate because of the complex pricing of storage, transfer and computing.
This is complicated by the rise of non-linear and on-demand services, including OTT, VoD and IPTV. The need to expand facilities to accommodate more outlets makes the traditional CapEx approach of buying equipment and software more difficult to estimate accurately.
But the implementation of an agile cloud-native platform, with full orchestration, will provide comprehensive workflow solutions and capabilities. It will also allow extensive monitoring and control of operations through a centralised ‘dashboard’ display that covers all systems, whether on-premises or in the Cloud.
Conclusion
Broadcasting and the means of distributing material has changed and will continue to do so. Streaming and other non-linear platforms are now firmly established and growing. In several cases they are run by corporations with a lot of financial, technological, and production clout. And their reach is only extending.
During 2019 one of the big players in OTT, Netflix, reached 151 million streaming subscribers worldwide. This means that, potentially, in any given hour Netflix could be streaming 151 million unique hours of content. That is a lot of programming to deliver manually.
Therefore, a Digital Supply Chain backed up by intelligence in how all the processes involved in acquisition, production, and distribution are orchestrated is the only logical approach to take.