Post Positions: Big media’s big data problem
Issue: October 1, 2012



By Jason Danielson
M&E Products/
Solutions Marketing
NetApp
www.netapp.com
Sunnyvale, CA


For the managing engineers, VPs of engineering and CIOs of today’s rapidly growing media organizations, there is good news and there is bad news. The bad news is that the relentless big data challenges IT departments have wrestled with over the past several years are now a daily reality for media companies as well. The good news is that big data solutions have matured substantially over the same period and now offer tremendous benefits to the companies that can fairly be called “Big Media.”

Unfortunately, because media engineering and IT have often developed independently within many organizations, resulting in silos, many media engineering decision makers overlook or ignore powerful tools for increasing efficiency, expanding bandwidth and managing content. For many of those organizations, these solutions could not come soon enough.

At a time when operating budgets are tighter than ever, media organizations must contend with content needs and data requirements that hobble workflows and strain the limits of storage. Any lingering doubt about the parallels between Big Data and Big Media can be dispelled with a quick look at the three defining pillars of Big Data (analytics, bandwidth and content) and how profoundly they affect media companies today.
Consider one example from the IBC 2012 Innovation Awards shortlist in the Content Management category: “Sports ingest, management, multi-screen distribution.” End user: Turner Studios; technical partners: Dalet, EVS, Stainless Code, Active Storage, NetApp, Quantum, Azzurro Systems Integration, CineSys, Telecom Network Solutions and Enterprise Apps.

TURNER SPORTS
Turner Sports covers 40 different sports and five national leagues (the NBA, NCAA basketball, Major League Baseball, the PGA and NASCAR). Each year it covers 6,000 events, recording 27,000 hours of live content (an average of 74 hours a day) and close to two million highlights. The database has to reference 3,000 teams and 30,000 players. The asset management challenge is clearly huge.
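The yearly averages above hold up to a quick arithmetic check (a throwaway sketch, not part of Turner’s tooling):

```python
# Sanity check of the volumes cited above.
hours_per_year = 27_000
events_per_year = 6_000

hours_per_day = hours_per_year / 365        # 27,000 hours spread over a year
hours_per_event = hours_per_year / events_per_year

print(f"{hours_per_day:.0f} hours of live content per day")   # prints "74 hours of live content per day"
print(f"{hours_per_event:.1f} hours recorded per event")      # prints "4.5 hours recorded per event"
```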

Its new centralized feeds recording, logging, highlight editing and playout system was launched on Christmas Day 2011, at the end of just 15 months of specification and development. Turner brought together best-of-breed technology partners from around the world, including Dalet Digital Media, EVS, Stainless Code, Active Storage, Quantum and NetApp. Despite the huge throughput of the system (it is currently configured for 26 ingest and 16 playout channels), any content is available for logging, editing and viewing within 10 seconds.

If traditional media solutions were all that were available, this simply would not be practical. The storage costs for the media repository alone would swamp the overall budget. Even then, the operation would have a growing enterprise-wide media repository on its hands that lacked enterprise-caliber resiliency and flexibility.
 
Until recently, media organizations expanded their media infrastructure as needs developed. The result: individual organizations forced to manage multiple edit suites, each with just enough storage to support its own room. There is no fault here; in 1999, when shared editing began, bandwidth and capacity were only sufficient to support a single edit room.
 
Hands-on, in-the-trenches media management engineers are all too familiar with where this process led: multi-gigabyte file transfers frequently tied up two rooms at once, and editors spent more time waiting for files to transfer than actually editing. With 30 percent of a typical workday spent moving files, production schedules and deadlines were regularly stretched past their limits.

Just as the information that can be parsed from Big Data is often highly time sensitive — extraordinarily valuable if it can be calculated immediately, possibly worthless if it takes hours or days — so too does Big Media depend on technology that can deliver content while it is timely and relevant. Without it, enormous efforts by entire teams working in tandem can quite easily become old news. That’s not just a waste of resources, it can also be an extremely serious competitive disadvantage.

HOW DO THEY DO IT?
To produce the massive number of sporting events it covers, Turner uses both proxy- and broadcast-resolution workflows. The proxy workflow is supported by a two-node clustered storage system accessed over a 10GbE network, running a high-performance operating system equipped to handle big media challenges. Game feeds are ingested into two SAN storage systems (for redundancy) accessed over 8Gb Fibre Channel. The system gives loggers and editors access to incoming feeds while they are still ingesting.
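The dual-resolution architecture described above can be sketched as a configuration. This is an illustration only: the tier names and mount points are my own assumptions, while the channel counts, link speeds and redundancy follow the text.

```python
# Illustrative sketch of a dual-resolution (proxy + broadcast) ingest
# architecture like the one described above. Mount points and tier names
# are hypothetical, not Turner's actual configuration.
STORAGE_TIERS = {
    "proxy": {
        "purpose": "logging, browsing, rough-cut editing",
        "topology": "two-node clustered storage",
        "network": "10GbE",
        "mount": "/mnt/proxy",          # hypothetical path
    },
    "broadcast": {
        "purpose": "full-resolution game feeds",
        "topology": "dual SAN systems for redundancy",
        "network": "8Gb Fibre Channel",
        "mount": "/mnt/hires",          # hypothetical path
    },
}

CHANNELS = {"ingest": 26, "playout": 16}  # current configuration per the text

def route_ingest(feed_id: int) -> dict:
    """Each incoming feed is written to both tiers at once, so loggers
    and editors can open it while ingest is still in progress."""
    return {
        "feed": feed_id,
        "proxy_target": STORAGE_TIERS["proxy"]["mount"],
        "hires_targets": [STORAGE_TIERS["broadcast"]["mount"]] * 2,  # two SANs
    }
```

Writing every feed to a low-bitrate proxy tier alongside the full-resolution SANs is what keeps logging and browsing traffic off the expensive broadcast storage.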

Turner’s loggers annotate and log plays in realtime during games and events, just 10 to 15 seconds behind the live broadcast. Editors then cut the logged plays into a highlights package and publish it for use later in the broadcast or in subsequent broadcasts. Without a single large volume that can be shared by the ingest servers, the logging and editing workstations, the publish servers and the MAM clients, the entire production workflow would fail to meet such basic objectives as getting replays and highlights to air during the event.
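The log-then-edit flow above can be sketched as follows. This is a minimal illustration of logging against a still-growing recording on a shared volume; the class and function names are my own, not the Dalet or EVS API.

```python
from dataclasses import dataclass, field

@dataclass
class SharedFeed:
    """A growing recording on the shared volume, readable mid-ingest."""
    name: str
    duration_s: float = 0.0          # grows while ingest continues
    logs: list = field(default_factory=list)

    def ingest(self, seconds: float):
        self.duration_s += seconds

    def log_play(self, t_in: float, t_out: float, note: str):
        # Loggers work 10-15 seconds behind live, so their marks always
        # fall inside the portion of the feed already written to disk.
        assert t_out <= self.duration_s, "cannot log beyond ingested media"
        self.logs.append((t_in, t_out, note))

def build_highlights(feed: SharedFeed, keyword: str) -> list:
    """Editors assemble a highlights package from the logged plays."""
    return [(t_in, t_out) for t_in, t_out, note in feed.logs if keyword in note]

feed = SharedFeed("game_feed_07")
feed.ingest(600)                          # ten minutes recorded so far
feed.log_play(120, 135, "dunk by #23")
feed.log_play(480, 495, "three-pointer")
print(build_highlights(feed, "dunk"))     # prints "[(120, 135)]"
```

The single shared volume is what lets the logger, the editor and the publish server all reference the same file without any copy step in between.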

COMPANIES OF ALL SIZES
While Turner is a clear example of a company reaping the benefits of advances driven by Big Data, it is not alone. Video, music and photography websites driving millions of media transactions a day, and managing billions of media objects within their repositories, rely on Big Data solutions. As relentless technological innovation has turned traditional media into Big Media, more and more organizations are realizing that their business viability hinges on embracing the kinds of Big Data solutions that provide extreme bandwidth and performance for realtime workflows, boundless object storage for high-volume global content libraries, and realtime analysis of complex data sets to drive better business decisions. For companies that increasingly struggle with Big Media challenges, it is time to employ Big Data solutions.