Does it take you 1/2 million years to test your workflow?
Posted: Tuesday, January 21, 2014

It is now obligatory to start every broadcast technology blog post, article or presentation with a statement reminding us that we are now living in a multi-format, multi-platform world, where consumers want to view the content they choose, when they want it, where they want it, on the device they want. Unlike other marketing platitudes, however, this one is actually true.

Many of us in this industry spend our days trying to develop infrastructures that will let us deliver content to all of these platforms, ageing prematurely in the process because, to be honest, it is a really hard thing to do.

Why is it so hard? 

Let me explain. For each device you have to define the resolution: a new iPad, for example, has more pixels than HDTV (2048 wide) and a 4:3 aspect ratio, while Android phones come in a range of different screen sizes and resolutions. Don't even get me started on interlaced versus progressive.

That video has to be encoded using the appropriate codec – and of course different devices use different codecs.

Along with the pictures there will be sound, which could be mono, stereo or surround; surround in turn could be 5.1, 7.1 or something more exotic. The sound could be encoded in a number of different ways, and digital audio could be sampled at 44.1kHz or 48kHz, at a whole range of bit depths.

Then the audio and video need to be brought together with the appropriate metadata in a wrapper, and the wrapper needs to be put into a delivery stream. If it is for mobile use, we now routinely adopt one of three different adaptive bitrate formats, which essentially means we have to encode the content at three different data rates for the target device to switch between.

If you want to achieve the admirable aim of making your content available on all common platforms, then you have to take into consideration every combination of resolution, video codec, audio codec, track layout, timecode options, metadata and ancillary data formats and bitrate options. This is a very large number.
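To get a feel for how large "very large" is, the sketch below multiplies out some per-category option counts. The counts themselves are invented for illustration; the post does not publish real figures, and a real facility's matrix would differ:

```python
# Hypothetical option counts per output category -- illustrative only.
options = {
    "resolution": 8,        # SD, 720p, 1080i, 1080p, tablet and phone sizes...
    "video_codec": 6,
    "audio_codec": 5,
    "track_layout": 4,      # mono, stereo, 5.1, 7.1
    "timecode": 3,
    "metadata_format": 4,
    "bitrate": 5,
}

total = 1
for count in options.values():
    total *= count          # combinations multiply, category by category

print(f"Output combinations: {total:,}")  # 8*6*5*4*3*4*5 = 57,600
```

Even these modest, made-up counts yield tens of thousands of output permutations, and adding a single new option to any one category multiplies the whole total.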

And it does not stop there. That is only the output side. What about the input? How many input formats do you have to support? Are you getting SD and HD originals? What about 2K and, in the not too distant future, 4K originated material? If you are producing in-house, you may have ARRIRAW and REDCODE (R3D) files floating around.

The content will arrive in different forms, on different platforms, with different codecs and in different wrappers. We are on to the third revision of the basic MXF specification, for example.

Any given end-to-end workflow could involve many thousands of input-to-output processes, each with its own special variants of audio, video, control and metadata formats, wrappers and bitrates. Each time a new input or output type is defined, the number increases many-fold.

Quality Control 

All of which is just mind-boggling. Until you consider quality control. If you were to test, in real time, every variant of, say, a three minute pop video, it would take a couple of hundred years. This is clearly not going to happen.
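As a sanity check on "a couple of hundred years" (using my own illustrative arithmetic, not the author's actual figures), we can ask how many variants of a three-minute clip fit into 200 years of real-time viewing:

```python
# How many variants of a three-minute clip would take ~200 years to review
# in real time? Illustrative arithmetic only.
minutes_per_year = 365.25 * 24 * 60          # ~525,960 minutes in a year
variants = (200 * minutes_per_year) / 3      # three minutes per real-time test

print(f"~{variants:,.0f} variants")          # roughly 35 million
```

So "a couple of hundred years" corresponds to tens of millions of variants, which is entirely plausible once input, processing and output permutations are multiplied together.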

It’s all right, I hear you say. All we need do is define a test matrix so that we know we can transform content from any source to any destination. If the test matrix works, then we know that real content will work, too.

Well, up to a point. I have done the calculations on this and, to complete a test matrix that really does cover every conceivable input format, through every server option, to every delivery format for every service provider, on every variant of essence and metadata, it is likely to take you half a million years. Maybe a bit more.
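The post does not show its working, but a back-of-envelope version (with hypothetical counts of my own choosing) lands in the same half-million-year ballpark:

```python
# Back-of-envelope check of the "half a million years" claim.
# All counts below are hypothetical -- the post does not publish its figures.
input_formats  = 1_000   # codecs x wrappers x resolutions on ingest
server_options = 200     # processing / transform paths
delivery_specs = 1_000   # platform + service-provider output profiles
variants       = 500     # essence and metadata permutations per path

tests = input_formats * server_options * delivery_specs * variants
minutes = tests * 3                      # three-minute clip, tested in real time
years = minutes / (365.25 * 24 * 60)

print(f"{tests:,} tests -> {years:,.0f} years")
```

With these assumed counts the matrix comes out at around 570,000 years of real-time testing: half a million years, maybe a bit more, exactly as the title suggests.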

So are you going to start at workflow path one and test every case, working until some time after the sun explodes? Of course not.

But what is the solution? Do you just ignore all the possible content flows and focus on the relatively few that make you money? Do you accept standardized processing, which may make you look just like your competitors, or do you implement something special for key workflows, even though the cost of doing it, and of testing it, may be significant?

We have never had to face these questions before. Apart from one pass through a standards converter for content to cross the Atlantic, everything worked pretty much the same way. Now we have to consider tough questions about guaranteeing the quality of experience, and make difficult commercial judgments on the right way to go.

If you want to find out more about how to solve your interoperability dilemma, why don't you register for our next webinar on:

Wednesday 29th January at:

1pm GMT, 2pm CET, 8am EST, 5am PST

OR

5pm GMT, 6pm CET, 12pm EST, 9am PST

I hope you found this blog post interesting and helpful. If so, why not sign-up to receive notifications of new blog posts as they are published?

Broadcast IT Training - Bruce's Shorts

Posted by Bruce Devlin

Copyright 2016. Dalet Academy by Dalet Digital Media Systems - Agence web : 6LAB