With the ever-increasing consumption of content comes an equal demand to acquire, manage and deliver it. Clearly, the pace of new content creation and production will continue to generate significant volume. However, a large amount of valuable content also exists within archives, in both physical and digital forms. Gaining any benefit from this content, financial, social or otherwise, requires improvements in discoverability and accessibility, both inside and outside the organisations that hold it.
What’s in the archives?
To go into the details of every type of content format or object held within archives would take considerable time and is really a subject for another day. For the purposes of this discussion, however, we will explore the types typically held by organisations such as cultural and heritage film and sound archives, public and commercial television broadcasters, national archives and private collections.
The earliest film formats held in archives around the world generally date back to the late 19th and early 20th centuries. These include nitrate, a highly combustible stock, and acetate, commonly known as safety film. More recently, polyester film has been widely used and is therefore also prevalent in many archives. Much of this content is still held only in these physical forms, which means it is in danger of being lost forever if it is not preserved and digitised soon.
For sound, the wax cylinder is one of the oldest content carriers generally held within archives. More recent formats include audio on film, magnetic tape, vinyl disc, CDs and DVDs. Again, much of this content could be lost if it is not preserved and digitised.
Other objects held in these archives include videotape, still images, posters, documents and artefacts, some of which could also be lost if not preserved and/or digitised.
Many public television broadcasters have an archive dating back to the formation of the organisation; the BBC, founded in 1922, holds the oldest in the world. Many of these organisations hold a considerable number of film, videotape and audiotape formats in their physical form, putting the content in jeopardy if it is not digitised. Additionally, a huge range of commercial broadcasters, content creators, content owners and media organisations, some dating back to the 1920s but only becoming prevalent from the 1980s onwards, also have content at risk.
Of course, none of this includes the explosion in born-digital content that is becoming so ubiquitous, both within and outside the traditional sectors and organisations described above.
As we can see, the timespan of content creation is long and ongoing, so the volume of content is also immense and growing. Archival methods and processes have undergone numerous changes over time. In the past, the prime focus of archiving was typically the institutionalisation and preservation of content and associated, potentially useful, materials. These functions have since merged and expanded to encompass the cross-media opportunities the content presents, with the ability to repurpose and reuse it rather than simply creating an untouched 'memory' bank.
The storage of and access to content is challenging in its physical forms, and equally challenging, although for different reasons, once it is in the digital domain. Storage and access are not the only concerns, however. Efficient and effective acquisition, preservation and digitisation, along with curation and cataloguing, make a huge difference to the value that can be extracted from the archive, from both a historical and a financial perspective. Capturing metrics across the whole spectrum of the organisation is also key to extracting value from the archive and making efficient use of its resources.
The implementation of both a content strategy and a flexible software platform will significantly improve the acquisition, curation, preservation, storage and accessibility of content and objects, in all their physical and digital renditions. Over time this software has been known variously as MAM (Media Asset Management), DAM (Digital Asset Management), CMS (Collections or Content Management System), or any of the many other labels attributed to the business systems around content, media, object and collections management.
If we take away the label attributed to the solution and concentrate on the organisational needs and use cases, then the most effective platform is one that enables the organisation to integrate the four pillars of the business:
- Content – In all forms, including the information (metadata) about it.
- Workflow – Any process that is carried out on the content and/or metadata.
- Resources – Every process will use a resource of some form, whether human, technical or both.
- Analytics – To provide the organisation with all the business intelligence it needs.
When a software platform seamlessly integrates all four of the above pillars, it provides the foundation for comprehensive management of anything from a departmental requirement to a full enterprise-wide solution. You can call this a MAM, a DAM, a CMS, or any other acronym out there today; the reality is that it is a solution for efficiently managing the business of the organisation, irrespective of the label you put on it.
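To make the four pillars concrete, here is a minimal sketch of how content, workflow, resources and analytics could be linked in one model. All class names, fields and figures are illustrative assumptions, not any real product's schema.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:                       # Content pillar: the item plus its metadata
    asset_id: str
    title: str
    metadata: dict = field(default_factory=dict)

@dataclass
class Task:                        # Workflow pillar: a process applied to content
    name: str
    resource: str                  # Resources pillar: who or what performs it
    duration_hours: float

def utilisation_report(tasks):     # Analytics pillar: basic business intelligence
    """Total hours booked per resource across all workflow tasks."""
    report = {}
    for t in tasks:
        report[t.resource] = report.get(t.resource, 0.0) + t.duration_hours
    return report

tasks = [
    Task("digitise", "film-scanner", 2.0),
    Task("catalogue", "archivist", 1.5),
    Task("qc-review", "archivist", 0.5),
]
print(utilisation_report(tasks))
```

The point is not the code itself but that all four pillars reference the same underlying records, so a utilisation or throughput report can be produced without a separate reporting silo.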
Organisations often debate whether it is better to build a solution internally from the ground up, customised solely for their own needs, or to buy a solution from a specialised supplier with broader experience and capabilities. Typically, these discussions have been based on on-premise solutions, with both approaches having their pros and cons. The choice, however, should always be based on the organisation's use cases and core competencies, not purely on a technology decision.
One common perception is that building internally keeps tighter control over the design, development, deployment and ongoing support of the platform and therefore meets all of the organisation's unique needs. In reality, this is often not the case. It is equally common to assume that a supplier-delivered solution will only meet some organisational needs and will therefore require a significant amount of custom development, especially around workflow orchestration, before the platform is usable, with control also ceded to the supplier. Again, this does not have to be the case.
Alternatives to a binary build-versus-buy decision should also be considered. A combined approach, implementing a platform that puts the user organisation in control and enables it to enhance, configure and develop additional business functions as and when required, is also an option. This overcomes the "blank canvas" challenge of designing a solution from scratch and speeds up its deployment to the business.
Workflow orchestration is one example of where significant benefits can be obtained from implementing a supplier platform. When a business can build a set of workflow templates from a library of tasks using a graphical UI, the speed of both initial workflow deployment and future modification delivers an agile operation. Some tasks will be specific to archive organisations, while others may be more generic digital media processes, and they may be manual, user-interactive or automated. In these scenarios there is no need for specialist programming or scripting, and so no reliance on technical specialists, the supplier or third parties. The efficiencies gained by being in total control of both the archive and the business systems and processes should not be underestimated.
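The template-from-a-library idea above can be sketched as follows. The task names and templates are invented for illustration, not a vendor's API; the point is that a template is just an ordered selection of reusable tasks, which is exactly the kind of thing a graphical UI can let non-technical staff compose.

```python
# A library of reusable tasks: each takes an item's state and returns
# an updated copy. Real tasks might be transcodes, QC steps or approvals.
TASK_LIBRARY = {
    "ingest":    lambda item: {**item, "ingested": True},
    "transcode": lambda item: {**item, "format": "mezzanine"},
    "catalogue": lambda item: {**item, "catalogued": True},
    "publish":   lambda item: {**item, "published": True},
}

# A template is simply an ordered list of task names, so it can be
# assembled and modified without programming or scripting.
TEMPLATES = {
    "film-digitisation": ["ingest", "transcode", "catalogue"],
    "web-delivery":      ["transcode", "publish"],
}

def run_workflow(template_name, item):
    """Apply each task in the chosen template to the item, in order."""
    for task_name in TEMPLATES[template_name]:
        item = TASK_LIBRARY[task_name](item)
    return item

result = run_workflow("film-digitisation", {"id": "reel-042"})
print(result)
```

Because templates are data rather than code, adding or reordering a step is a configuration change, which is where the agility claimed above comes from.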
Into the Cloud
The next question is where to deploy. This used to be simple, as the only realistic approach was an on-premise platform, but today archive organisations have a choice: on-premise, cloud or a hybrid approach. The initial decision does not have to be final, either. Technology platforms today provide comprehensive functionality to integrate on-premise and cloud environments, and these can morph over time to create agile and elastic inter- and intra-organisational ecosystems.
On-premise archiving has been particularly appropriate for organisations with the resources, infrastructure and sufficient physical capacity, and it can arguably be less expensive over time. If expansion is unlikely, the only costs incurred will generally be the maintenance of the software and hardware infrastructure. But other, less tangible costs should also be considered, such as infrastructure support, power, cooling and hardware refresh, which may tip the balance of value towards a cloud-based platform. A further consideration for some archive organisations is security, especially for archives holding classified or highly sensitive information.
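The trade-off above is ultimately arithmetic, so a back-of-the-envelope model can help frame it. Every figure below is an assumed placeholder; real prices vary widely by region, supplier and storage tier, and any serious comparison needs the organisation's own numbers.

```python
def on_prem_cost(tb, years, hw_per_tb=30.0, refresh_years=5,
                 power_support_per_tb_year=8.0):
    """Hardware purchase (repeated each refresh cycle) plus running costs.
    All unit costs are assumed placeholders, not real market prices."""
    refreshes = -(-years // refresh_years)   # ceiling division
    return tb * hw_per_tb * refreshes + tb * power_support_per_tb_year * years

def cloud_cost(tb, years, per_tb_month=12.0):
    """Flat pay-as-you-go storage subscription, again with an assumed rate."""
    return tb * per_tb_month * 12 * years

tb, years = 500, 10
print(f"on-prem over {years}y: {on_prem_cost(tb, years):,.0f}")
print(f"cloud over {years}y:   {cloud_cost(tb, years):,.0f}")
```

Even a crude model like this makes the article's point visible: for a static archive the on-premise figure can come out lower, while the hidden items (support, power, cooling, refresh) and any growth assumptions are what tip the balance one way or the other.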
Cloud-based platforms are quickly evolving to provide agile and elastic solutions to many of the challenges archive organisations face, including the explosion in required digital storage capacity and the ability to curate, catalogue and discover efficiently.
Operating in a highly competitive media environment, clients look for cost-effective storage options that maximise return on investment. Cloud-based storage is a cost-conscious option, particularly as infrastructure costs are no longer a concern of the user. Subscription plans are often available, making this archiving method financially accessible to both small and large organisations. It is also highly scalable, supporting the growth of an organisation as increased storage is required over time. Similarly, cloud-based platforms enable solutions to be deployed and managed in a multi-tenanted, geographically distributed environment, again providing accessibility to organisations of all sizes.
Curation and cataloguing is one of the resource-intensive processes archives have struggled with. However, that is now changing: cloud-based AI technology is starting to provide an answer. Facial recognition, speech-to-text and other insight extraction provide a significant amount of additional metadata, both technical and intellectual, enabling enhanced discovery. But it is how AI is applied to the organisation's use cases that will extract the maximum value from the content, not AI on its own.
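A minimal sketch of that enrichment step is below. The two recogniser functions are hypothetical stand-ins for cloud AI services (they return canned values here); the part that matters is how their output is merged into the catalogue record so that discovery improves.

```python
def speech_to_text(asset_id):
    """Placeholder for a cloud speech-to-text service (canned result)."""
    return "interview about the 1948 olympics"

def detect_faces(asset_id):
    """Placeholder for a cloud facial-recognition service (canned result)."""
    return ["unknown-speaker-1"]

def enrich(record):
    """Attach AI-derived metadata alongside the existing catalogue fields."""
    record = dict(record)  # copy; don't mutate the original catalogue entry
    record["transcript"] = speech_to_text(record["id"])
    record["faces"] = detect_faces(record["id"])
    # A simple keyword index derived from the transcript, for discovery.
    record["keywords"] = sorted(set(record["transcript"].split()))
    return record

enriched = enrich({"id": "tape-107", "title": "Sports interview"})
print(enriched["keywords"])
```

In practice the AI output would feed the platform's search index and the organisation's own use cases (rights clearance, clip discovery, compilation), which is where the value is actually extracted, as the paragraph above argues.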
There is a huge volume of content and objects stored in archives around the world. Extracting value from them requires a strategy for acquisition, management and accessibility. This strategy needs to encompass both the business systems and ongoing storage to ensure that value is maximised, not only from a financial perspective but from a heritage and social standpoint too.