4K Post Production Hardware

Last updated by Yermy Weiss on Apr 7

With all the buzz surrounding 4K (dubbed UltraHD in the home theater world), many filmmakers, videographers and digital content creators are anxious to understand just what a switch to 4K will mean in terms of new investments. In this article we will explore what 4K means to the editor and colorist: the kind of hardware that might be required and what to consider when building a fresh system or upgrading an existing setup. As we will show, a lot will depend on the specific application; therefore this article will attempt to provide a general overview of post production hardware as it relates to 4K rather than a comprehensive blueprint for building a system.

What is 4K?

4K is a resolution―actually two resolutions. The DCI (Digital Cinema Initiatives) cinema specification for 4K is a 17:9 format, normally with a resolution of 4096 x 2160: the equivalent of four 2K (2048 x 1080) quadrants. The other resolution, a consumer specification often designated QFHD (quad full HD), has a 16:9 aspect ratio and a resolution of 3840 x 2160. As you probably guessed, it's equal to four 1920 x 1080 HD quadrants. The difference between these resolutions is slight, but a DCI-format display will introduce pillars, or black side bars, to display QFHD video without cropping, whereas a 16:9 monitor, whether QFHD or HD, will letterbox the display, applying black bars to the top and bottom, when displaying DCI content. In either case, the amount of black will be slight, but it's worth noting to avoid surprises.
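
To make the mismatch concrete, here is a quick back-of-the-envelope calculation, sketched in Python, of how much black each scenario adds (assuming the display scales to fit without cropping):

    # Back-of-the-envelope: black bars when mixing DCI 4K and QFHD material.
    DCI = (4096, 2160)   # DCI 4K, roughly 17:9
    QFHD = (3840, 2160)  # QFHD / UltraHD, 16:9

    # QFHD content on a DCI display: same height, narrower image -> pillars.
    pillar = (DCI[0] - QFHD[0]) / 2
    print(f"Pillars on a DCI display: {pillar:.0f} px per side")  # 128 px

    # DCI content on a QFHD display: width scales 4096 -> 3840 -> letterbox.
    scaled_height = DCI[1] * QFHD[0] / DCI[0]                     # 2025 px
    letterbox = (QFHD[1] - scaled_height) / 2
    print(f"Letterbox on a QFHD display: {letterbox:.0f} px top and bottom")  # ~68 px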

As an aside, the term UltraHD properly refers to consumer QFHD (3840 x 2160), but in practice many companies confusingly use the term to describe the DCI format as well.

What hardware is required to edit 4K?

The answer to this question is: it depends. As noted, 4K is just a resolution. A compressed 4K format such as Apple ProRes can often be edited on hardware suitable for HD, but 16-bit linear RAW video from a high-end cinema camera such as the Sony PMW-F55 will likely require a server-grade workstation connected to a fibre channel SAN, with as many as eight additional GPUs, to achieve smooth real-time playback. In other words, what really matters, more than resolution, is encoding.

So what spec do I look at to determine what hardware I'll need?

Again, things aren't so simple: much depends on what software one is using. For example, the same ProRes 4K clip that plays flawlessly in Final Cut Pro may choke up Premiere on the exact same computer.

Starting with the computer itself, we will break up hardware into the following categories and address each in turn:

Hardware: a quick breakdown:

  • Computer, including system hard drive, RAM, power supply and most of the standard components that come preinstalled from the factory.
  • GPU, or graphics card. (We explain below why this doesn't fall under the computer itself.)
  • Mass storage. Not the drive your editing software and operating system runs on, but the drive on which the video will actually be stored.
  • I/O, or Input/Output cards. Sometimes just called “capture cards”, even though most 4K workflows are already file-based, so these cards will actually be used only for output.

The Computer: Sounds like I'll need a pretty serious machine―is 12 cores even enough?

Paradoxical as it might seem, fixating on getting a computer with the fastest processor may not be the best place to start. The reason is that video software is now able to offload much of the heavy lifting onto the GPU (or even several GPUs), leaving the CPU free to do its job of delegating tasks: applications, APIs, hardware processes, I/O device requests, and so on. The CPU just makes sure all the basic tasks run in harmony while the GPU takes care of the video.

The specifics of GPU acceleration are discussed below, but for now it is important to know that for all but the most basic video (and certainly for any form of 4K), the computer should have a dedicated graphics card.

The most important computer feature is expandability, which usually means a workstation such as a Mac Pro or HP Z-series tower, as opposed to a basic desktop, all-in-one or laptop.

What do you mean by expandability?

Expandability is the capacity to install additional hardware to increase the performance or functionality of the workstation. The most important type of hardware expansion is PCI Express, or PCIe. PCIe 2.0 (and later) slots provide the fastest, most direct access to the motherboard of any interface and are mandatory for certain hardware; specifically, PCIe enables the installation of additional GPUs, I/O cards and ultra-fast solid-state drives. Many of these devices will require at least four-lane (often designated “x4”) or wider slots to run at optimal performance, and GPUs need at least x8, often x16. So make sure the workstation has at least one free x8 slot besides the one the stock graphics card is installed in.

It is also important to consider the physical size and power requirements of additional PCIe cards. Small mini towers may not have room for high-performance GPUs, and GPUs are power-hungry, so make sure the original power supply is sufficient or can be replaced (in most cases it can). If your system doesn’t have enough slots, more PCIe devices can be added externally using products like the Cubix Xpander.

What about Thunderbolt? Isn't it as fast, if not faster than PCIe?

Thunderbolt is a port that provides external access to the PCIe bus of the host computer for up to six peripherals, as well as passing a display signal. Thunderbolt “1” is indeed fast enough for I/O cards and eSATA drives. However, its bandwidth is roughly the equivalent of an x4 slot and is limited to 10 Gb/s in one direction (20 Gb/s bidirectionally). Thunderbolt 2, now gaining traction, is faster in that it can move more than 10 Gb/s in one direction. However, for things like graphics cards, it's still effectively x4.
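
For a rough sense of scale, the sketch below compares the approximate one-direction bandwidth of PCIe 2.0 slot widths against Thunderbolt (the 500 MB/s-per-lane figure follows from PCIe 2.0's 5 GT/s rate and 8b/10b encoding; real-world throughput will be somewhat lower):

    # Approximate one-direction bandwidth: PCIe 2.0 slots vs. Thunderbolt.
    PCIE2_MB_PER_LANE = 500  # 5 GT/s with 8b/10b encoding ~= 500 MB/s per lane

    for lanes in (1, 4, 8, 16):
        print(f"PCIe 2.0 x{lanes:<2}: ~{PCIE2_MB_PER_LANE * lanes} MB/s")

    # Thunderbolt channels run at 10 Gb/s; version 2 bonds two channels into one.
    for name, gbps in (("Thunderbolt 1", 10), ("Thunderbolt 2", 20)):
        print(f"{name}: {gbps} Gb/s ~= {gbps * 1000 // 8} MB/s")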

So the processor doesn't matter at all?

It all depends on the application. Some software may not be able to do much with GPUs―older applications especially. And even those that do harness GPU power may still need help. Generally, recent hyper-threaded quad-core processors are enough: a lot of CPU performance these days comes from tricks that let the processor operate more efficiently rather than raw clock speed or even the number of cores. And many programs may not be able to use those additional cores; Final Cut Pro 7 can only use 1.5 cores, max. In this regard any workstation will be capable. Having said that, investing in a six-, eight- or even twelve-core system certainly won't hurt, but it just may not be as beneficial as is often assumed. Note that laptops and some all-in-ones use throttled, or otherwise inherently slower, processors than their desktop counterparts, even if the specs look the same on paper.

I probably want a good graphics card, right? You keep talking about it.

You will. In fact, you will probably need several for any serious 4K work, especially RAW. However, as we'll discuss below, what comes stock in the computer may not matter so much. It just matters that it is a dedicated GPU and not “integrated” (that is, essentially, an extension of the CPU). The stock graphics card just needs to drive the number and resolution of whatever computer monitors one will use. Most workstations―even bare-bones ones―can now drive three displays at around HD resolution or slightly higher, and virtually all can drive at least two (for viewing in 4K, see the monitoring discussion below). As we'll see shortly, this is all the OEM card needs to worry about. But you will probably want one or more additional aftermarket cards for other purposes.

RAM, the more the better, right?

Adding RAM to boost speed may be disappointing. NLEs such as Adobe Premiere and Apple Final Cut Pro aren't going to use more than four gigs anyway. Where more RAM comes in is applications that require a lot of rendering. For RAM-intensive applications like After Effects, the duration of one's preview may be limited by available memory, and here a lot of RAM is important. But just cramming in as much as will fit may not be the best plan. Each program has its own quirks and, depending on how it was designed, may work better with a specific total amount of RAM installed in a specific way. For example, as noted in the DaVinci Resolve Configuration Guide, certain Resolve setups are tested to be more stable with 24GB of RAM even though the host system can physically hold more. Therefore, for these applications it is best to check the manual when selecting RAM. And when upgrading, it may be best to completely replace any existing RAM with whole new DIMMs of equal size.
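
As a rough illustration of why memory governs preview length, here is a back-of-the-envelope estimate for an uncompressed RAM preview of the After Effects variety (the 8-bit RGBA frame size is an assumption; higher bit depths multiply it by two to four times):

    # Rough estimate: seconds of uncompressed 4K RAM preview per gigabyte.
    # Assumes 8-bits-per-channel RGBA frames; higher bit depths use 2-4x more.
    width, height, bytes_per_pixel, fps = 3840, 2160, 4, 24

    frame_mb = width * height * bytes_per_pixel / 1e6  # ~33 MB per frame
    seconds_per_gb = 1000 / frame_mb / fps             # ~1.25 s per GB

    for ram_gb in (8, 16, 32):
        print(f"{ram_gb} GB of preview RAM: ~{ram_gb * seconds_per_gb:.0f} s of 4K preview")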

I'm sure 4K takes up a lot of hard drive space? … Or does that depend, too?

Indeed, storage requirements also vary wildly. The old, simpler days of 7200 rpm and FireWire are long behind us.

Throughput is key. Compressed 4K formats like ProRes or Avid DNxHD run at about four times the data rate of their HD counterparts. 4K ProRes 422 at 30 fps runs roughly 630 Mb/s, or just under 80 MB/s, which is more than a USB 2.0 or FireWire hard drive can really handle, but not crazy. There's a good chance whatever decent eSATA or internal drives one is currently using to edit HD will still cut it. Obviously, it depends on the number of tracks one is editing. To be safe, four-bay or larger RAID arrays like the G-Technology G-Speed series, ideally configured in RAID 0, are recommended as a baseline for ProRes 422 or an equivalent compressed format, as recorded in the Blackmagic 4K Production Camera or Convergent Design Odyssey7Q.
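
The arithmetic is simple enough to run yourself. A rough sanity check in Python (the throughput figures are ballpark real-world numbers, not interface maximums):

    # Sanity check: how many 4K ProRes 422 streams a given drive can sustain.
    stream_mb_per_s = 630 / 8  # ~630 Mb/s per stream ~= 79 MB/s

    # Ballpark sustained throughput in MB/s for common storage options.
    for name, throughput in (("USB 2.0 drive", 35), ("FireWire 800 drive", 80),
                             ("single eSATA disk", 130), ("4-bay RAID 0", 450)):
        print(f"{name:18s}: ~{throughput / stream_mb_per_s:.1f} streams")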

Although prosumer drives should cut it at the entry point, for RAW, or even large projects in compressed 4K, something much more serious is needed. For a turnkey solution that doesn't require a network engineer, SAS (Serial Attached SCSI) is the most popular high-speed interface. Unlike prosumer RAIDs such as G-Tech's G-Speed Q line or the Promise Pegasus, which often have built-in RAID controllers and may use a variety of common interfaces, such as eSATA, USB 3.0 or Thunderbolt, SAS requires a dedicated host card installed internally, typically in at least an x8 PCIe 2.0 slot. Therefore, SAS drives can only achieve the required performance when used with workstations. The Dulce Systems PRO RXmpd is a good example.

But even SAS direct-attached storage (DAS) may not be enough. For smooth real-time playback of many 4K RAW formats, such as Canon 10-bit log from the EOS C500 as recorded in the Convergent Design Odyssey7Q, even SAS would feel the strain. Here one will need a fibre channel SAN (storage area network), a server and, hopefully, a Linux engineer to make it work.

SSDs are really fast, no?

Solid-state memory promises to be faster; unfortunately, many consumer-grade drives fail to achieve their advertised performance when working with video. The main reason is that these drives rely on lossless data compression to boost write speed. This is great for normal data full of repetitive patterns. But video is too random, and in many cases (including many RAW formats) already compressed, for lossless compression to be able to do very much.
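
You can see the problem in miniature with any lossless compressor: repetitive data shrinks dramatically, while random bytes, a fair stand-in for noisy or already-compressed video, barely budge. A quick illustration:

    import os
    import zlib

    # Repetitive data (like typical documents) compresses extremely well...
    repetitive = b"ABCD" * 250_000       # 1 MB of a repeating pattern
    # ...but random data (noisy or already-compressed video) does not.
    random_like = os.urandom(1_000_000)  # 1 MB of random bytes

    for label, data in (("repetitive", repetitive), ("random", random_like)):
        ratio = len(zlib.compress(data)) / len(data)
        print(f"{label:10s}: compressed to {ratio:.1%} of original size")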

For speed freaks, enterprise-grade products like the Fusion ioFX SSDs are more than capable for just about anything, including 4K. Unlike the other drive types discussed, these install directly into an x4 PCIe slot and cut out the middle-man SCSI or SATA bus. Their main drawback is capacity. The larger of the two ioFX drives is 1650GB, which equates to not much more than an hour of footage for many RAW formats. This may not be a problem for color correction, since only the final, edited footage is needed, but for multiple projects, or during the initial cut, their capacity will likely be insufficient.
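
To put that capacity in perspective, a rough calculation (the RAW data rate here is an illustrative assumption; actual rates depend on the camera, bit depth and frame rate):

    # Rough record time on a 1650GB drive at various sustained data rates.
    CAPACITY_GB = 1650

    # Data rates in MB/s; the RAW figure is illustrative, not camera-specific.
    for fmt, mb_per_s in (("4K ProRes 422 at 30 fps", 79), ("typical 4K RAW", 400)):
        minutes = CAPACITY_GB * 1000 / mb_per_s / 60
        print(f"{fmt:24s}: ~{minutes:.0f} minutes")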

PCIe SSDs can be RAIDed in an external housing, but conventional hard drives are usually considered more cost-effective. As costs drop and capacities increase, this state of affairs will almost certainly change.

You mentioned GPU acceleration? A few times, actually!

If there's one thing that's make-or-break when it comes to 4K, it is liable to be GPU acceleration. In fact, GPU acceleration is probably the single most important variable in determining performance.

A common and understandable misconception is that a strong graphics card is needed to draw video on a monitor. But the reality is that modern GPUs brush off drawing HD video, and even 4K, with a shrug. Video is 2D, and all but the most expensive monitors only display 8-bit color (many claim 10-bit but really cheat, and dither 8-bit); even 3D video is really just 2D played at double the frame rate as far as the GPU knows. What traditionally makes video playback so taxing―causing glitchy, stuttery playback and freezing―is decoding (often more of a problem with compressed formats like H.264 that use complex algorithms) and rendering. These are tasks that, until recently, were the sole burden of the CPU.

For many years, powerful GPUs were really only important to gamers drawing complex 3D worlds in real time, until software developers like Apple and chip makers like ATI and Nvidia realized they were sitting on a gold mine of untapped computational power. In response, two important standards emerged: OpenCL and CUDA.

OpenCL was originally developed by Apple but is an open standard anyone can adopt, and it is currently supported by both ATI and Nvidia. The other standard, and probably the one more familiar to those of us in the video world, is CUDA. CUDA is a proprietary specification exclusively available on specific Nvidia GPUs, including all of their professional Quadro models.

It is an open debate which standard is better. Suffice it to say that tests show CUDA beating OpenCL at floating-point math and OpenCL on ATI products beating Nvidia at integer math. As always, the specific task will determine which is better. Either way, they both do the same thing: make editing systems scream, because they can process video far better than any CPU ever could.
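
If you're curious which compute devices your own system exposes, OpenCL makes it easy to check. A minimal sketch using the third-party PyOpenCL bindings (assumes PyOpenCL is installed and a working OpenCL driver is present):

    # List every OpenCL platform and device the system exposes.
    import pyopencl as cl

    for platform in cl.get_platforms():
        print(f"Platform: {platform.name}")
        for device in platform.get_devices():
            print(f"  {device.name}: {device.max_compute_units} compute units, "
                  f"{device.global_mem_size / 1e9:.1f} GB memory")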

You're contradicting yourself. Now you're saying I do need a good graphics card?

Actually, one needs more than one GPU. The original stock card need only drive whatever monitors you have hooked up to it. One or more additional GPUs should be there to provide the accelerated parallel processing crucial for any serious workstation, and it's these additional cards that should be as powerful as one can get them. (Note that additional GPUs, when configured to provide acceleration, cannot drive additional monitors.)

But my tower only has one or two extra slots.

Even beefy towers probably don't have the room or power to support more than two, or at best three, GPUs internally (including the stock card). Yet some Resolve configurations, for example, call for as many as eight. To add these additional cards, a PCIe expansion chassis like the Cubix Xpander may be attached. The chassis has a single host card (at least x8) that is installed in the computer, while an external tower or rackmount provides two or more slots as well as its own power supply. As far as the computer knows, they're all installed internally. Some care needs to be taken when first building such a system: typically, cards needing drivers should be installed inside the computer initially; once the system is happy with the new hardware, the card can be removed and put in its new home. How many cards one can add and still get a performance bump is mainly limited by software―between two and four is typical right now.

In the near future things are likely to change. At the moment most computers ship with only one, often wimpy, factory-installed GPU (fine for display). However, following Apple's lead with the new Mac Pro, we will probably start seeing more and more companies offering multiple GPUs as a standard configuration option.

Okay, now I've got the system down. How do I view my 4K and have it look great?

For normal editing, you need nothing other than a decent computer monitor, ideally two. For color correction, something higher-end is certainly recommended, but in this case color accuracy, more than resolution, is what counts. (Color correction is a whole topic unto itself, and a subject for another article.)

But if you really want to view your beautifully shot 4K footage in 4K―and I'm sure you will―there are some options.

The difficulty is that, because 4K as a delivery format is still in its infancy (it was really born as a set of digital cinema acquisition formats), there is not a lot of standardization. That shiny new Sony TV and CUDA-on-steroids Nvidia K5000 might not talk to each other. Not to mention, most graphics cards currently either don't support 4K at all, or may not meet the correct specifications to sync with the intended monitor―a reality which can make things very confusing.

Currently, the safest bet is to use a video I/O card (misleadingly called a capture card), which is a dedicated PCIe card or Thunderbolt device whose job it is to send video out to a video monitor rather than a computer monitor (they offer inputs too, but that's a topic for another time). Specifically, you’ll want one with SDI connectors that support so-called “quad link”. Quad link splits the 4K image into four HD or 2K quadrants and sends each part out over its own 3G (or in some cases 1.5G) SDI port. Professional monitors like the Sony Trimaster PVM-X300 can normally accept quad link directly and piece the image back together.
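
Conceptually, the quad-link split is just carving the frame into four corners. A toy model in Python with NumPy (real quad-link hardware does this in silicon, and some equipment uses a two-sample interleave arrangement instead of square quadrants):

    import numpy as np

    # Toy model of square-division quad link: one DCI 4K frame, four 2K links.
    frame = np.zeros((2160, 4096, 3), dtype=np.uint8)  # height x width x RGB

    h, w = frame.shape[0] // 2, frame.shape[1] // 2    # 1080 x 2048 per quadrant
    quadrants = {
        "link 1 (top left)":     frame[:h, :w],
        "link 2 (top right)":    frame[:h, w:],
        "link 3 (bottom left)":  frame[h:, :w],
        "link 4 (bottom right)": frame[h:, w:],
    }
    for link, quad in quadrants.items():
        print(f"{link}: {quad.shape[1]} x {quad.shape[0]}")  # 2048 x 1080 each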

This is great if you feel like spending big. But a broadcast-grade display may not always be needed, or desired. A large TV or projector may be preferred, especially for screenings where a broadcast monitor (usually 30” or less) would be on the small side. Oriented more toward the home theater and A/V market, TVs and projectors will most probably have HDMI 1.4a as their only 4K input. Therefore, since most computers don't currently support 4K HDMI through built-in hardware, either an HDMI I/O card like Blackmagic's Decklink 4K Extreme or a quad link SDI to HDMI 1.4a converter like AJA's Hi5-4K (used alongside a quad link SDI I/O) will be needed to connect one of these fancy displays to a computer.

All sounds wonderful. But earlier you said I'll probably need to be a network engineer to get real-time RAW performance? I'm not a network engineer.

Proxy to the rescue...

During the transition to 4K, most post-production will realistically be done using a proxy workflow. A proxy is a lower-quality copy of the original footage, either recorded separately on the shoot or rendered out in the edit room in cases where only the high-quality original is available. The former method is preferred because it saves time, but may require an additional recorder for some camera systems. The proxies themselves are in a widely used, NLE-friendly, space-efficient codec like ProRes LT, and it is with these that the picture edit is made.
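
For the render-your-own route, a proxy pass is really just a batch transcode. Below is a hedged sketch that drives the free ffmpeg tool from Python (the prores_ks encoder and its profile numbers are standard ffmpeg options, but the folder layout and naming convention here are hypothetical; adapt them to your project):

    import pathlib
    import subprocess

    # Batch-render ProRes LT proxies from camera originals using ffmpeg.
    SOURCE = pathlib.Path("camera_originals")   # hypothetical folder layout
    PROXIES = pathlib.Path("proxies")
    PROXIES.mkdir(exist_ok=True)

    for clip in SOURCE.glob("*.mov"):
        out = PROXIES / f"{clip.stem}_proxy.mov"
        subprocess.run([
            "ffmpeg", "-i", str(clip),
            "-c:v", "prores_ks", "-profile:v", "1",  # profile 1 = ProRes LT
            "-vf", "scale=1920:-2",                  # downscale to HD width
            "-c:a", "copy",                          # pass audio through
            str(out),
        ], check=True)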

Once the picture edit is done, the original RAW footage is conformed to match the proxy edit. Any color correction is applied, and a 4K, HD or any other desired delivery codec can then be rendered for export. For this workflow, the editing system only needs to be powerful enough to handle the proxy.

Color grading is probably best outsourced. But a slower computer may be used, provided it meets the color grading application's minimum specs (a newer iMac with an OpenCL GPU is good enough for DaVinci Resolve). One won't get real-time previewing, or anywhere close to it, but since most color correction and grading is applied globally (i.e., to the entire frame for the duration of the shot), the result can often be reliably judged from a still image. It’s just going to take a while to render, so get a coffee.
