Eric Bassier, senior director of product marketing at Quantum, outlines the long-term challenges of looking after your irreplaceable content.
One of the biggest issues facing the media industry is protecting and maximising its invaluable content. The original copy of the first Star Wars film isn’t simply a piece of data; it’s history, art, and part of humanity’s heritage.
The same can be said of live broadcast sporting events such as the Super Bowl, the World Cup, and Wimbledon.
This calls for an intelligent and highly secure long-term storage infrastructure. But the realities of what it will take to manage, store, protect, and enrich this data over the coming years and decades are an emerging challenge.
The goal of any content archive is to store and preserve digital content for decades, and to ensure it can be retrieved at a moment’s notice, regardless of physical location.
The first basic requirement is ensuring that your archiving platform has the capability to scale and handle petabytes and even exabytes of content in a cost-effective way.
Cameras with 1080p, HDR and 4K capabilities are creating unstructured files and object data at unprecedented rates. It’s not uncommon for unstructured data sets to grow into many hundreds of petabytes and even exabytes in a matter of just a few years.
With VR, AR, and 3D formats still on the rise, any long-term archive plan needs to be able to handle that sort of scale.
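To make the scale concrete, a back-of-envelope calculation shows how quickly camera output accumulates. The bitrate, camera count and shooting hours below are illustrative assumptions, not figures from the article or any real production:

```python
# Back-of-envelope estimate of raw archive growth from camera output.
# The bitrate, camera count and shooting hours are illustrative
# assumptions, not figures from any real facility.

def hours_to_terabytes(hours: float, bitrate_gbps: float) -> float:
    """Convert recording hours at a given bitrate (Gbit/s) to decimal TB."""
    return hours * 3600 * bitrate_gbps / 8 / 1000

# Hypothetical facility: 10 cameras shooting 8 hours/day, 365 days/year,
# each producing a 1 Gbit/s 4K stream.
yearly_tb = hours_to_terabytes(10 * 8 * 365, 1.0)
print(f"{yearly_tb / 1000:.1f} PB per year")  # → 13.1 PB per year
```

Even at this modest assumed bitrate, a mid-sized operation crosses into petabyte territory within its first year, which is why archive platforms need headroom measured in exabytes.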
The technology on which the data is stored is also key – flash, hard drives, tape, optical, cloud or synthetic DNA.
Each technology offers its own advantages and disadvantages, but the key considerations for most media creatives and executives are accessibility and cost. Flash offers fast access and lower power consumption, but it’s expensive. Tape comes at a lower cost, offers a 30+ year shelf life, is more reliable than disk, consumes far less energy, and, because cartridges sit offline, provides stronger protection against ransomware attacks.
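The cost trade-off between tiers can be sketched as a simple total-cost-of-ownership model combining media price and electricity. All of the per-terabyte prices, power draws and the electricity rate below are hypothetical placeholders for illustration, not vendor figures:

```python
# Illustrative archive tier cost model: media cost plus electricity.
# Every number here is a hypothetical assumption, not real pricing.

def archive_cost(capacity_tb: float, media_cost_per_tb: float,
                 watts_per_tb: float, years: float,
                 kwh_price: float = 0.15) -> float:
    """Rough total cost of ownership in dollars over the given lifetime."""
    energy_kwh = watts_per_tb * capacity_tb / 1000 * 24 * 365 * years
    return capacity_tb * media_cost_per_tb + energy_kwh * kwh_price

# Hypothetical tier parameters: (media $/TB, sustained watts per TB).
tiers = {"flash": (80, 0.5), "disk": (25, 5.0), "tape": (5, 0.05)}

for name, (cost, watts) in tiers.items():
    total = archive_cost(capacity_tb=10_000, media_cost_per_tb=cost,
                         watts_per_tb=watts, years=10)
    print(f"{name}: ${total:,.0f} for 10 PB over 10 years")
```

Under these assumptions tape comes out cheapest by a wide margin, largely because idle cartridges draw essentially no power; the point of the sketch is that energy, not just media price, shapes long-term archive economics.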
An efficient archive system needs to give production staff quick and efficient access to content that was stored decades ago, regardless of their location. It also needs to allow for quick upload and ingest into the server when content is first created.
Post-production houses, broadcasters, and industry creatives need to keep an eye on developments in this space over the coming years. The need to physically migrate data every five to 10 years to keep up with formats and customer demands will be a sobering realisation for many, as will the underlying challenge of archiving data while driving down costs. It is impossible to predict which new applications, systems and programs will further accelerate data growth. The only certainty is that data is, and always will be, the most valuable commodity that businesses possess.
News Source: Broadcast Now