
Transcoding in Post Production
03/08/17 • 16 min
1. Onset and Editorial Transcoding
When does Post actually begin?
Since we’ve moved from celluloid to digital, the answer to this question has steadily crept forward into production. Over the past 20 years, a new position has emerged as a direct response to the need to coordinate digital acquisition with subsequent post-production: the DIT, or Digital Imaging Technician. In fact, the DIT is such an instrumental part of the process that they are often the liaison connecting production and post.
Adding a watermark, timecode window burn, and LUT inside Blackmagic Resolve
Now, this can vary depending on the size of the production, but the DIT will not only wrangle the metadata and media from the shoot and organize it for post; they may have added responsibilities. These can include syncing second-system audio to the camera masters, adding watermarks to versions for security during the dailies process, or putting a LUT on the camera footage. Lastly, the DIT may also create edit-ready versions – either high or low res – depending on your workflow. A very common tool for this is Blackmagic Resolve, but tools like EditReady, Cortex Dailies, or even your NLE will do the job.
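To make the dailies tasks above concrete, here's a minimal sketch of how a LUT, a timecode window burn, and a text watermark could be combined in one pass. The episode mentions Resolve and EditReady; this sketch instead assumes ffmpeg (not mentioned in the episode) and its standard `lut3d` and `drawtext` filters, and the helper name `dailies_cmd`, file names, and burn-in positions are all hypothetical.

```python
import shlex

def dailies_cmd(src, dst, lut_path, watermark_text):
    """Hypothetical helper: build an ffmpeg command that applies a
    3D LUT, a timecode window burn, and a security watermark."""
    filters = ",".join([
        f"lut3d={lut_path}",  # apply the show LUT to the camera footage
        # burn a timecode window at the bottom of the frame
        ("drawtext=timecode='01\\:00\\:00\\:00':rate=24"
         ":x=(w-tw)/2:y=h-2*lh:fontcolor=white:box=1:boxcolor=black@0.5"),
        # semi-transparent watermark across the middle for dailies security
        (f"drawtext=text='{watermark_text}'"
         ":x=(w-tw)/2:y=(h-th)/2:fontsize=48:fontcolor=white@0.3"),
    ])
    return ["ffmpeg", "-i", src, "-vf", filters,
            "-c:v", "libx264", "-crf", "20", "-c:a", "aac", dst]

cmd = dailies_cmd("A001C001.mov", "A001C001_dailies.mp4",
                  "show_lut.cube", "PROPERTY OF PRODUCTION")
print(shlex.join(cmd))
```

The point isn't the tool: whether it's Resolve, EditReady, or a command line, the DIT is stacking the same three operations into a single render.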
Now, having the DIT do all of this isn’t a hard-and-fast rule, as often assistant editors will need to create these after the raw location media gets delivered to post. What will your production do? Often, this comes down to budget. Lower budget? That usually means the assistants in post are doing the majority of this rather than the folks on set.
As for the creation of edit-ready media, this speaks to the workflow your project will utilize. Are you creating low res offline versions for editorial, and then reconforming to the camera originals during your online?
A traditional offline/online workflow
Or, are you creating a high-res version as a mezzanine file that you’ll work with throughout the creative process?
A mezzanine format workflow
OK, now on to actually creating edit-worthy media.
This can be challenging for several reasons.
You need to create media that is recognized by and optimized for the editorial platform you’re cutting on. For Avid Media Composer, this is OP-Atom MXF-wrapped media, commonly DNxHD, DNxHR, or ProRes. What these have in common is that they are non-Long-GOP formats, which makes them easier for Avid to decode in real time.
The go-to has been the 20-year-old offline formats of 15:1, 14:1, or 10:1.
These formats are very small in size and easy for computers to chomp through, but they look like garbage. If it means not spending money on larger storage or new computers, though, it’s tolerated. Recently, productions have been moving to the 800 Kbps and 2 Mbps Avid H.264 variants so they can keep larger frame sizes.
You can create this media through the import function, using dynamic media folders, or consolidate and transcode within Media Composer itself.
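Outside of Media Composer's own import and transcode functions, the same OP-Atom media can be generated with other tools. As an illustrative sketch (assuming ffmpeg, which has a `dnxhd` encoder and an `mxf_opatom` muxer; the helper name and file names here are hypothetical):

```python
def avid_dnxhd_cmd(src, dst, bitrate="36M"):
    """Hypothetical helper: transcode a camera original to DNxHD
    wrapped as OP-Atom MXF, the flavor Media Composer links to
    natively. DNxHD 36 is a classic low-bandwidth offline choice."""
    return ["ffmpeg", "-i", src,
            "-c:v", "dnxhd", "-b:v", bitrate,
            "-s", "1920x1080", "-pix_fmt", "yuv422p",
            "-r", "24000/1001",
            "-an",                 # OP-Atom holds one essence per file;
                                   # audio would go into its own MXF
            "-f", "mxf_opatom", dst]

print(" ".join(avid_dnxhd_cmd("A001C001.mov", "A001C001.mxf")))
```

Note the `-an`: OP-Atom is a single-essence wrapper, which is why Avid media folders fill up with separate video and audio MXF files per clip.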
Adobe Premiere is a bit more forgiving in terms of formats. Premiere, like Avid, will work best with non-Long-GOP formats.
Previous Episode

Live Streaming
Editor’s Note: This episode was not live-streamed.
1. What components do I need for live streaming?
Getting started is pretty easy. Thanks to platforms like Facebook and YouTube, you can do it with a few clicks from your phone. But that won’t help you much when you want to up your production value. Here is what you need to get started.
Live Streaming from your CDN to end users
To successfully plan your setup, we need to work backward and follow the old credo of “begin with the end in mind”.
So, the last part of live streaming is “how do I get my video to viewers?” This is normally done through a CDN – a content delivery network. A CDN takes your video stream and not only pushes it out to your legion of fans, but also does things like create lower-resolution variants – a process called transcoding and transrating. A good CDN also supports different platforms, such as streaming to mobile devices and various computer platforms. A CDN also takes the load off of your streaming machine. Imagine your single CPU being tasked with sending out a specific video stream, in a specific format, to every single user.
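To picture what "transrating" means in practice, here's a sketch of the kind of rendition ladder a CDN might generate from your one contribution stream. The heights and bitrates are illustrative, not a standard, and the sketch assumes ffmpeg (not named in the episode) as the transcoder:

```python
def rendition_ladder(src):
    """Hypothetical sketch: one command per lower-resolution variant
    a CDN might create from a single contribution stream."""
    ladder = [(1080, "5000k"), (720, "2800k"), (480, "1200k"), (360, "700k")]
    cmds = []
    for height, vbr in ladder:
        cmds.append(["ffmpeg", "-i", src,
                     f"scale=-2:{height}".join([]) or f"scale=-2:{height}",
                     "-c:v", "libx264", "-b:v", vbr,
                     "-c:a", "aac", "-b:a", "128k",
                     f"out_{height}p.mp4"])
    return cmds

for cmd in rendition_ladder("contribution.flv"):
    print(" ".join(cmd))
```

Multiply those four encodes by thousands of simultaneous viewers on different devices and the value of offloading this work becomes obvious.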
OK, now that we have our CDN, we need to send a high-quality signal to it. I’ll address the specifics of that later in the episode, but suffice it to say, for a successful broadcast, you’ll need a decent Internet connection with plenty of headroom.
Live Streaming to a CDN
All CDNs will provide you with a protocol – that is, the way in which they want to receive the live feed for your broadcast. Now, this is different from the protocols that your end device will use, as a good CDN will take care of that mumbo-jumbo translation for you... but only if you get the video to them in the right format.
These upload streaming formats can include HLS, RTSP, RTMP, and Silverlight-based formats. The end game is that your software needs to be able to stream in the CDN’s mandated format for it to be recognized.
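As an example of pushing to a CDN's mandated ingest format, here's a sketch of an RTMP contribution feed. It assumes ffmpeg as the encoder (the episode doesn't prescribe a tool), and the ingest URL, stream key, and bitrates are all hypothetical placeholders your CDN would supply:

```python
def rtmp_contribution_cmd(src, ingest_url, stream_key):
    """Hypothetical helper: push a local source to a CDN's RTMP
    ingest point with live-friendly x264 settings."""
    return ["ffmpeg", "-re", "-i", src,      # -re: feed at real-time rate
            "-c:v", "libx264", "-preset", "veryfast",
            "-b:v", "4500k", "-maxrate", "4500k", "-bufsize", "9000k",
            "-g", "60",                      # keyframe every 2s at 30 fps
            "-c:a", "aac", "-b:a", "160k",
            "-f", "flv",                     # RTMP carries an FLV payload
            f"{ingest_url}/{stream_key}"]

print(" ".join(rtmp_contribution_cmd(
    "program_feed.mp4", "rtmp://ingest.example.com/live", "YOUR-STREAM-KEY")))
```

Dedicated streaming software hides these knobs behind a GUI, but under the hood it's doing exactly this: encoding to the CDN's protocol and payload format.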
Speaking of software, this is one of the most critical decisions to make.
Will your streaming software ONLY stream, or will it also handle production tasks, like switching between multiple sources, playing prerecorded video, adding graphics, and more? Of the utmost importance: does your software support the camera types you’re sending it?
Cameras to I/O device to Streaming Device
...which leads us to the beginning of your broadcast chain: what kind of cameras are you using? Are you using USB cameras? Or cameras with HDMI or HD-SDI outputs? If the latter, you need an I/O device on your computer that can not only take all of the inputs you want, but take them at the frame size and frame rate you want.
You’ll quickly see that the successful technical implementation of a live stream is based on each part playing with the others seamlessly.
A complete live streaming workflow
2. CDNs?!
In case I wasn’t clear, your CDN choice is one of the most critical decisions. Your CDN will dictate your end users’ experience.
Here are some things to look for in a CDN.
Does your CDN allow an end user a DVR-like experience, where they can rewind to any point in the stream and watch? Does this include VOD options, so viewers can watch later after the event is over?
Many CDNs have a webpage front end, where you can send users to watch the stream. However, most users prefer to take the video stream and embed it in their own website so they can control the user experience.
Also, “is this a private stream?” If so, ensure your CDN has a password feature. Speaking of filtering viewers, does your CDN tie into your RSVP system – or are you using your CDN’s RSVP system? This is another way to create a more personalized experience for the viewer, as well as to track who watches and get their contact info, so you can follow up with them after the event if needed. It also lets you track the stream’s metrics, so you can improve the content and experience for the next live stream.
CDNs can’t do all of this for free. This is why most CDNs restrict video stream quality on free accounts. This means your upload quality may be throttled, or the end user’s viewing quality or even total viewer count may be lim...
Next Episode

5 THINGS Update – NAB 2017, Appearances, Meet-Ups, Rampant Dance, and more!
Hey 5 THINGS fans:
While the next episode of 5 THINGS is in production, I wanted to give you a quick update on other 5 THINGS news. So let’s get to it:
NAB, the National Association of Broadcasters convention kicks off next week. It’s the annual tech nerd pilgrimage and business card exchange, and I’ll be there Friday, April 21st to Thursday, April 27th. I have a few appearances I’d like to clue you in on.
Teradek – Fishbowl Panel Discussion. Wednesday, April 26th, 11:30am. C6025
1. Post Production Panel Discussion – Streamed!
First is the Teradek Post Production Panel, in conjunction with Pro Video Coalition.
We affectionately call it the fishbowl, for somewhat obvious reasons, as we’re encased in a clear plexiglass stage in the middle of Central Hall. I’ll be on the panel with Woody Woodhall, Graham Sheldon, and Scott Simmons. If you’re at NAB, check it out at Booth C6025 on Wednesday, April 26th at 11:30 am Pacific.
Not going to NAB? Go to provideocoalition.com or teradek.com for streaming info. I think it will be posted online after the show as well. If I get the link early enough, I’ll update this post with the info. Update-LIVE Link: https://www.provideocoalition.com/event/teradek-nab-2017-live-show/
2. 5 THINGS – a live episode!
5 THINGS – a live episode, followed by a panel discussion.
Also on Wednesday the 26th, I’ll be doing a 5 THINGS episode LIVE thanks to the kind folks at Lumaforge.
Come by at 2:30 pm Pacific at the Courtyard by Marriott. Lumaforge has 3 days of panels and speakers so I HIGHLY recommend registering ASAP at lumaforge.com/nab.
As if a live episode of 5 THINGS wasn’t enough, I’ll be sticking around to be on another panel discussion immediately following, this time with the brilliant Oliver Peters and Bill Davis. Scott Simmons again will round out the panel.
3. Parties and Meet-Ups
Blue Collar Post Collective (BCPC) and #Postchat Meet-up – Sunday, April 24th – O’Sheas at the LINQ Promenade.
There are a few after parties and meet-ups that I do recommend if you dig 5 THINGS.
First one is the joint Blue Collar Post Collective and #Postchat meet-up at 6 pm on Sunday. We’re again at O’Sheas, at the LINQ Promenade. Just a bunch of creative post people getting out of a dark edit bay.
I’ll also be hitting the 16th annual Supermeet on Tuesday the 25th hoping to win something. Hundreds of thousands of dollars of awesome prizes are to be raffled off, plus a whole host of speakers and demos. I think a few tickets are left, so register at Supermeet.com and I’ll buy you a drink.
4. Rampant Danc...