
Hollywood 2.0: Networked Data Centers

New hubs of content creation are changing digital film and television workflows to the delight of industry coffers.

By Shane Guthrie, Senior Manager, Global Solutions Architecture West, Equinix


In the race to stay competitive, producers and broadcasters are constantly seeking ways to improve production with flexible, high-performance, and more economical infrastructure and workflows. Achieving this is the “Holy Grail,” but getting there can be a Herculean task for media and entertainment organizations, which must operate within a strict framework of fixed capital recouped on a continual basis, limiting their ability to invest in new technologies.

The question is: How does an organization raise production quality and lower costs while maximizing speed-to-market? Increasingly, it’s being accomplished by leveraging the magic that happens behind the scenes when professionals on location utilize low-latency networks and Web-based workflows that make content accessible to counterparts in other parts of the world. Low-latency* file transfer may be a boring name for this process, but it is a technology movement that could very well transform the capital structure and future workflows of the film and television industry.

What’s in a Name?

Before we explore low-latency file transfer, for those unfamiliar with the traditional way of moving assets, here’s a primer on physical asset distribution – also known as the way things have been done since the advent of production. Not too long ago, it was the norm for broadcasters to ship hard drives of the media assets they created, a practice that predates the specialized network technologies now available for transporting large media files. An intern would take hard drives loaded with raw footage from remote filming locations or productions to the closest courier, which would then send them for review or processing to those with a stake in the project’s progress and direction.

This physical process created a logistics and expense nightmare, slowing down workflows by days, if not weeks. This one-way flow of information to solitary endpoints also made it very difficult – if not impossible – to respond in real-time to changes or requests by producers or financiers of the project. Once a project was finished, films and television spots would again be sent by physical courier to content delivery networks, which then would distribute the completed work in the same laborious, expensive manner.

Figuring out a way to transfer gigantic copyrighted files worldwide, faster and more securely than a courier service, while allowing for constructive dialogue between sender and receiver, was crucial.

Many productions initially addressed this need by committing their own capital to purpose-built connectivity. For example, if a production was shot in Tunisia, England and Guatemala, studios and broadcasters would sign long-term contracts with service providers in those nations to build out additional infrastructure to handle the bandwidth necessary for that particular production. This process often took months, but it would ensure that large amounts of raw footage could be sent back to headquarters and, from there, to the various teams involved in production.

To complete the transfer, production companies also would have to beef up the networks of the intended endpoints to cope with receiving a high volume of large assets, as well as sending edited or annotated versions back into the field. When production ended, studios and broadcasters often would be stuck with a long-term contract for fiber and throughput that they might never use again. While building physical infrastructure for a singular purpose is nothing new in the movie business, having to continually pay for these high-speed, data-intensive connections after wrap was a new financial burden that few could handle.

Thankfully, this initial fix gave rise to another trend: carrier networks fortifying their own infrastructure to handle the explosive amounts of data being created and transferred daily worldwide. These network providers were co-locating in third-party, multi-tenant data centers to exchange traffic and connect directly to their customers, ensuring quality of service and security were not compromised by over-reliance on the public Internet. The huge influx of capital spent by wide-area network providers and Ethernet carriers to upgrade the capacity and reach of their networks is a boon to the media and entertainment business, as it is to a variety of other industries such as finance and cloud computing.

A Boon in the Ethernet Boom

The boom in metropolitan Ethernet has created millions of potential drop-off points for assets and film feeds. Essentially, any building with a direct fiber line interconnected to a network that terminates in a data center housing multiple carriers can serve as the new courier drop-off. For an entertainment industry now beginning to handle UltraHD and 4K content, delivering these massive files through buildings “pre-lit” with globally interconnected fiber has brought a tremendous improvement in transfer latency – not to mention the ability to work with massive files in real time.

As a result of moving electronic assets over extensive fiber networks, studios and broadcasters benefit not only from lower file-transfer latency, but also from heightened security. Routing traffic through data centers that house rich ecosystems of network providers gives media companies a level of infrastructure control they hadn’t enjoyed previously. Providers can ensure data goes directly to a company’s private storage arrays, or those of a trusted third party, with routing customized to security and encryption preferences. This allows companies to maintain control of their assets and to work on 4K and even 8K assets loaded with metadata.

In fact, assets can go right from a broadcaster’s storage system to the intended destination – provided that production houses, content delivery networks (CDN), distributors and other parties of interest are co-located. Prior to moving files to a CDN or distributor, assets can be enriched with live event feeds that also pass through data centers. This creates a chain reaction of efficiencies every step of the way, with reduced hops between live event feeds, production houses, content delivery networks, and distributors.

Beyond speed and security, moving files electronically also affords filmmakers and broadcasters the benefit of moving data more flexibly and cost effectively. By leveraging the existing and growing carrier infrastructure, companies no longer need to source their own fiber. Instead, they can request the bandwidth they require for production from the networks that already have the capacity to meet those needs, and have this capacity delivered in a matter of hours versus months. This allows production companies to move assets at a cost basis that is a significant improvement over what they pay today.

On the carrier end, providers are increasingly able to offer different pay structures and contracts to meet media client demands. Broadcasters now can lease throughput on a short-term basis, rather than pay for fiber capacity to sit dormant for the majority of a long-term contract. This benefits the carriers as well; they can leverage flexible revenue streams and recoup their capital investments faster.

Film Meets Cloud

[Image: Cloud storage. Courtesy of Stuart Miles / FreeDigitalPhotos.net.]

In fact, low-latency file transfer over networks housed in data centers is just the tip of the iceberg for broadcasters looking to lower capital expenditure by taking advantage of existing infrastructure. With files already flowing into a storage facility located in a data center, media companies can latch onto the growing trend of cloud computing. While leveraging the cloud for digital production workflows is a recent concept, other industries, such as financial services, have been reaping the benefits of the cloud for years.

In media industry terms, using cloud-based applications allows for additional cost savings to broadcasters and producers as they are able to “pay-as-you-go” for both infrastructure and production applications. For example, a studio may only need to use a specific visual effects software program for the duration of post-production; this model allows companies to “rent” the necessary applications, infrastructure, and direct interconnection to cloud providers such as Microsoft Azure and Amazon Web Services to support their work, versus purchasing applications and infrastructure outright.
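The economics of that “rent versus buy” decision can be sketched with back-of-the-envelope arithmetic. The figures below are purely hypothetical assumptions for illustration, not real vendor pricing:

```python
# Illustrative sketch of the pay-as-you-go economics described above.
# All dollar figures and durations are hypothetical assumptions.

upfront_license = 120_000   # outright purchase: VFX software + infrastructure
monthly_rental = 8_000      # cloud-based app + infrastructure, per month
post_months = 6             # the only period the studio actually needs it

rental_total = monthly_rental * post_months
savings = upfront_license - rental_total

print(f"pay-as-you-go total:  ${rental_total:,}")
print(f"savings vs. purchase: ${savings:,}")
```

Under these assumed numbers, renting for the duration of post-production costs a fraction of an outright purchase; the break-even point shifts, of course, if the studio needs the tooling continuously across many productions.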

More broadly, end-to-end workflows, from live feed acquisition to content delivery, can be accomplished in the cloud and applied across multiple infrastructure boundaries. As more and more production capabilities are offered as a service, producers have an ever-increasing range of options for fulfilling production needs. Data centers serve as the hub where all of this comes together: companies can source service providers and switch partners as needed with very little cost to their business.

The combination of low-latency file transfers, the ability to connect with multiple partners and customers in multi-tenant data centers, and the breadth of cloud-based applications being developed to increase production efficiency is heralding a major transformation for all companies involved in digital production workflows. Increasingly, what used to live in a physical box can live in the cloud – and maybe, the interns who would otherwise ship hard drives can more quickly realize their moviemaking dreams.


* Latency is the amount of time a message takes to traverse a system. In a computer network, it is an expression of how much time it takes for a packet of data to get from one designated point to another.
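To make the definition concrete, round-trip latency can be measured directly: timestamp a message when it is sent and again when the echo returns. The sketch below does this over a local loopback TCP connection; loopback latency is far lower than the wide-area latency the article discusses, which depends on distance, routing, and carrier networks:

```python
import socket
import threading
import time

# Minimal sketch: measure the round-trip latency of one small message
# over a local TCP loopback connection. Loopback numbers are illustrative
# only; wide-area latency is dominated by distance and routing.

def echo_once(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(64))  # echo the payload straight back

server = socket.socket()
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
start = time.perf_counter()
client.sendall(b"ping")
client.recv(64)                      # block until the echo arrives
rtt_ms = (time.perf_counter() - start) * 1000
print(f"round-trip latency: {rtt_ms:.3f} ms")
client.close()
server.close()
```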


July 28, 2014