
Go Big or Go Home: Four Must-Haves for Modern Media Workflows

In the world of digital media workflows, the challenge has always been managing big.

By Janet LaFleur, Product Marketing, Quantum

Image courtesy of Stuart Miles / FreeDigitalPhotos.net.

In digital’s early years, enterprise storage infrastructure couldn’t keep pace with the demands of SD. The first non-linear editing systems were turnkey solutions with proprietary file and file system formats. In the last decade or so, IT-based solutions have caught up, and open solutions for shared storage are readily available at price points affordable even for boutique post houses.

Now that we’ve made the shift from digital SD to HD, there’s a push toward the quality viewing experience that 4K brings. At the same time, new avenues for distribution are driving workflows to provide quick and direct access to all content for monetization and reuse, not just work-in-progress content. For most facilities, this means considering four key issues before making any decisions.

 

1. The Rise of 4K

At NAB 2014, virtually every storage vendor claimed to offer a solution for 4K workflows. But what are they truly offering? Technically, 4K is just a frame size – 4,096 pixels × 2,160 lines for cinema or 3,840 × 2,160 for consumer – but achieving the quality people expect from 4K takes more than a high-resolution image. You also need a frame rate of 24 frames per second or greater, and a compression codec that doesn’t degrade the 4K image quality. After all, to your creative team 4K means 4K raw, not the lower-resolution proxy codecs most vendors were actually offering.
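
To put numbers on this, here’s a back-of-the-envelope calculation of what a single uncompressed 4K stream demands. It assumes 10-bit RGB at DCI resolution; actual camera raw formats vary (many record 12- or 16-bit data), so treat the figures as illustrative:

    # Rough data rate for one uncompressed DCI 4K stream.
    # Assumes 10-bit RGB (30 bits per pixel); real raw formats vary.
    WIDTH, HEIGHT = 4096, 2160    # DCI 4K frame size
    BITS_PER_PIXEL = 30           # 10 bits x 3 color channels
    FPS = 24                      # minimum cinema frame rate

    frame_mb = WIDTH * HEIGHT * BITS_PER_PIXEL / 8 / 1e6
    print(f"Per frame:  {frame_mb:.1f} MB")            # ~33.2 MB
    print(f"Per stream: {frame_mb * FPS:.0f} MB/s")    # ~796 MB/s sustained

At roughly 800 MB/s sustained per stream, even a handful of concurrent editors can saturate a storage system’s bandwidth, which is why the concurrent-streams question matters so much.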

The dirty little secret is that it’s difficult to stream 4K content at this rate from a shared storage system without dropping frames. So to support their 4K raw claims, some workflow storage vendors are reverting to direct-attached storage or proxy editing. Going that route means modifying your 4K workflow to download content locally for editing instead of accessing it directly from shared storage. That can mean giving up the collaborative environment that has shortened your production cycles and lets you meet the tight deadlines that drive this competitive industry.

The good news is there are industry-standard, high-performance solutions available that don’t require you to disrupt your workflow with tedious downloads and uploads to and from direct-attached storage. So when you investigate storage for 4K workflows, your key questions should be: What frame rate does it support? Which 4K codec? Is it 4K raw? Does it work with shared storage or only local storage? How many concurrent streams can it support?

 

2. The Need for Scale

Higher-resolution workflows are not just about increasing storage performance; they’re about increasing storage capacity to three to four times that of HD, depending on codec (the quick comparison below shows why). And that’s not the only trend driving the storage explosion. Much of the storage demand starts right at capture. Live events are now being shot with more cameras, and in feature production, cameras are less likely to be shut off between takes.
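
The three-to-four-times figure follows directly from pixel counts, as this quick comparison shows (before codec and bit-depth differences, so the ratios are approximate):

    # Pixel-count ratio of 4K formats to HD 1080, the raw driver
    # behind the 3-4x capacity growth (actual growth depends on codec).
    hd  = 1920 * 1080    # HD 1080
    uhd = 3840 * 2160    # consumer 4K (UHD)
    dci = 4096 * 2160    # cinema 4K (DCI)

    print(f"UHD vs HD: {uhd / hd:.1f}x")   # 4.0x
    print(f"DCI vs HD: {dci / hd:.1f}x")   # 4.3x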

In facilities that turn projects over frequently, the large-scale file creation that happens at ingest, or the file deletion when a project is completed, often conflicts with times when users need high-performance access. If the storage can’t handle it, facilities often are forced to set up “delete windows” of limited file system access. With tight production deadlines, this can be the difference between staying on schedule and falling behind.

The heavy storage consumption continues throughout the workflow, with transcoders spitting out more distribution formats for more connected devices, and content owners creating more second-screen content for both live and on-demand markets. If you’re storing content in single-file-per-frame formats such as DPX, or if you generate large numbers of files as in VFX, your storage will need to scale accordingly without degrading performance; the rough count below shows how quickly files accumulate.
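
How quickly do per-frame files accumulate? A rough count, assuming one DPX file per frame at 24 frames per second:

    # File count for a single-file-per-frame format such as DPX,
    # assuming one file per frame at 24 fps (illustrative only).
    FPS = 24
    runtime_min = 90                            # a 90-minute feature
    files = FPS * 60 * runtime_min
    print(f"{files:,} files per version")       # 129,600 files

Multiple cuts, VFX passes, and color versions multiply that number, which is why file-count limits matter as much as raw terabytes.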

So in addition to asking what a storage system’s capacity is, you should ask: How easy is it to expand capacity? What’s the maximum number of files (not just terabytes) this storage system can support? Does access performance degrade as the storage fills, as the number of files increases or as files are deleted?

 

3. The Need for Long-Term Access

At a time when there are more ways to monetize and reuse content than ever before, capacity demands have made it harder to keep digital assets accessible to production teams. The all-too-common strategy is to store as much content on high-performance storage as budgets permit, then move older content to offline tape archives as the storage fills. In many facilities, unused raw footage is simply deleted after a project is completed, with no regard for its future potential.

With an architecture that allows content to be stored on less expensive – but still secure – storage media, content owners can capitalize on multiple revenue opportunities. The trick is ensuring the architecture still allows direct, seamless access by production teams. Popular options include object storage-based solutions, which offer disk-speed access to petascale content, and LTO/LTFS(1) digital tape solutions, an even more economical choice, particularly for content owners who want to store copies offsite for disaster recovery.
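
For illustration only, here is a minimal sketch of what policy-based migration does conceptually: sweep content that hasn’t been touched within a threshold from online storage to an archive tier. The mount points and threshold here are hypothetical, and real archive managers automate this transparently behind the file system rather than with explicit moves:

    import os
    import shutil
    import time

    # Conceptual sketch of age-based archive migration. Paths and
    # threshold are hypothetical; real systems automate this via
    # policy engines and recall content transparently on access.
    ONLINE = "/mnt/online"       # high-performance shared storage
    ARCHIVE = "/mnt/archive"     # object store or LTFS tape mount
    MAX_AGE_DAYS = 90            # migrate content untouched for 90 days

    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for root, _dirs, files in os.walk(ONLINE):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getatime(src) < cutoff:   # last access time
                dst = os.path.join(ARCHIVE, os.path.relpath(src, ONLINE))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)            # free online capacity

The point of a well-integrated solution is that no one has to run scripts like this: archived content stays visible in the file system and is recalled on demand.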

You’ll need to find a solution that integrates into your workflow and is economical even after you factor in system management. Key questions to ask include: Which types of archive media does the storage system support? Is there automation to migrate content to the archive based on policy? Can archived content be quickly, easily, and directly accessed by users? What are the limits to the archive’s scale?

 

4. The Need for the Cloud

Content production has never been a simple process, but the number of moving parts and the scale involved have grown to global proportions. Even a low-budget film might shoot in the rainforest of Costa Rica, be edited in Vancouver, have visual effects added in Korea, undergo color correction in Toronto, and finish up in Hollywood. At the same time, there’s more pressure to transcode and deliver content worldwide on more platforms. Doing all this without the added complexity of making and transmitting duplicate copies between remote teams is critical.

That’s why many content producers are looking to the cloud to share content across distributed teams. The sticking point is that most public cloud offerings were designed as development platforms for software vendors to build applications and services, and they offer little, if any, of the integration necessary for the complex multistage workflows common in media production. Public clouds often require users to adopt new, unproven, and unfamiliar tools and to move assets between stages without careful regard for security or QA checks.

The ideal solution moves the same workflow your team uses today into the cloud, so people can work remotely, sharing content stored on economical, robust storage with the scalability, flexibility, and security your team needs. Key questions to ask: Was the cloud built specifically for the demands of media workflows? How much will my users have to change their daily processes to access cloud-based content? Are workflow tools for ingest, editing, and transcoding available in this solution? Will they work over lower-bandwidth connections?

In a business where bigger is always better and change is the only constant, you need higher performance, greater scale, easier long-term access, and the distributed access of the cloud. You can’t afford to do anything but go big, or go home.

 

(1) Linear Tape-Open/Linear Tape File System (LTO/LTFS) refers to the format of data recorded on magnetic tape media and the software that uses this data format to provide a file system interface to content stored on tape.


July 28, 2014