4K production – what do you really need?

Review: In DP 02:2014, our author Michael Radeck put True 4K to a practical test.

This article originally appeared in Digital Production 02:2014.

In my article “4K & HFR” in DP 07/13, I covered the technical basics on the subject of “True 4K”. Now it’s time to put it into practice.

To get you started, here is a summary of the most important facts:

    • A 4K monitor/projector has 8.85 megapixels at 4,096 × 2,160 according to the DCI standard. As we know, each pixel consists of three colour subpixels: red, green and blue. This means that 8.85 megapixels times three equals 26.5 megapixels in RGB.
    • A camera sensor labelled as 4K, however, has only 8.85 megapixels in total for all of RGB at 4,096 × 2,160 – and, because of its Bayer pattern, only 2,048 green photosites across the horizontal. To match the monitor’s resolution it would actually need 26.5 megapixels. A camera sensor currently labelled as 4K therefore only achieves a luminance resolution of 2K! (A short calculation sketch follows this list.) Tip: just google “Bayer pattern sensor”.
    • To deliver true, full 4K, i.e. 26.5 megapixels for RGB, you would need at least an 8K × 4K sensor – a resolution that neither the Epic Dragon sensor nor the Sony F65 currently reaches.
    • In order to be able to see real 4K (which currently only exists in the photo and animation sector), you must not be further than 1.5 times the image height away from the image.
    • In order to perceive real 4K while the image content is moving, you need an image motion resolution/frame rate above the perception threshold of around 90 to 100 frames per second (HFR = high frame rate).
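As a quick illustration of these figures (my own sketch, not part of the original article), a few lines of Python are enough to compare the RGB subpixel count of a DCI 4K display with the photosite count of a Bayer-pattern “4K” sensor:

```python
# Rough pixel arithmetic for DCI 4K (illustrative only).
W, H = 4096, 2160

display_pixels = W * H                    # 8.85 million full RGB pixels
display_subpixels = display_pixels * 3    # ~26.5 million R, G and B subpixels

# A Bayer-pattern "4K" sensor has only one photosite per pixel position:
# 50% green, 25% red, 25% blue.
sensor_photosites = W * H                 # ~8.85 million photosites in total
green_per_row = W // 2                    # ~2,048 green samples per line

print(f"Display subpixels: {display_subpixels / 1e6:.1f} MP")
print(f"Sensor photosites: {sensor_photosites / 1e6:.2f} MP")
print(f"Green samples per horizontal line: {green_per_row}")
```

The sensor therefore delivers roughly a third of the sample count that the display can reproduce – the core of the “4K is really 2K” argument above.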

With this in mind, it is fair to ask: does it make sense to even try to produce 4K at the moment?

Projection

The good news is that there are actually applications where 4K makes sense even under the current technical framework conditions: for large projections at events (car shows, for example), anywhere viewers (e.g. visitors to trade fair stands) can get close to the screens, or in the field of design and development, where only artificially, digitally generated images are presented – often many still images or paused virtual live 3D animations.

With computer-generated images, there is usually no limit to the frame rates, as long as the computing power is sufficient. The automotive industry has been working with 4K projections under these “conditions” for around ten years.

There is another aspect to large projections: even if you can no longer perceive 4K beyond a distance of 1.5 times the image height, the pixel structure of a 2K or HD projection can still be perceived at up to 10 times the image height, especially with high-contrast graphics or fonts. A 4K projection, even of HD content, therefore looks much better in the last cinema row, because the scalers built into 4K projectors convert this pixel structure into smooth lines. Likewise, if you produce the graphics in 4K and the film content in HD, the result looks better on a 4K projector than on an HD projector.

Alternative applications

Another application for 4K was presented by Canon in its latest roadshow: photo shoots with the Canon 1D in 4K video mode. Here, at least 24 images per second can be recorded in 4K (but only in Motion JPEG, quite heavily compressed and with limited dynamic range compared to raw photos).

The photographer then no longer presents printed photos, but short snippets of movement on 4K monitors with a long pause in between – sometimes just a blink of the eye or a twitch of the corner of the mouth. A completely new art form? Well, not quite. Basically a video installation, albeit in unprecedented image quality, which certainly fulfils the increased image quality demands of viewers.

For consumers, too, there are some areas of application for 4K that can already be exploited. Viewing photos, for instance: even an iPhone takes photos with 8 megapixels – as many pixels as a Sony F5 or F55 4K sensor has. A photo from a Canon 5D has over 20 megapixels, one from a Nikon D800 36 megapixels – such photos look simply fantastic on 4K displays, and the full quantum leap of 4× HD becomes visible.

If you want to achieve this quality in 4K moving images, timelapses – i.e. series of individual stills from such cameras – are the only true 4K content to date, apart from computer animations. Current new games consoles and gaming PCs can also reproduce 4K at up to 60 FPS.

However, there are also other examples. Sony F65 users tell us: “We bought an F65 because we saw that a Red or an Alexa reach their limits on the huge screens that Audi operates at these trade fairs. The images just look soft.” (The quote comes from the application report of a 4K production with the Sony F65, in which the frame rate problem is also vividly described: Kropac-Media at http://bit.ly/1dqNb9f.)

Apart from timelapses, maximum resolution in images can only be achieved digitally and artificially, or you have to shoot at a multiple of the target resolution and scale down very carefully. However, there is still a lack of cameras that can produce true 4K and, above all, of cameras that significantly exceed true 4K resolution (26 megapixels plus!) at a simultaneous frame rate of at least 60 FPS. The most common argument in favour of 4K productions that I have heard recently was: “Then we can still zoom into the image because we have so much resolution.” Or: “We only shoot the slow motion in 2K, that’s still enough for HD utilisation.” However, if you record 2K cropped with a Bayer-pattern sensor, you are left with significantly less resolution than HD. 2K Bayer-pattern resolution is just about good enough for 16:9 SD.

HFR – more than 30 FPS required

Another quote from the aforementioned application report reads: “In our tests, we realised that 4K resolution is only half the battle. The essential thing is recording in 50p. This is what creates the real wow effect. Only then does the material look really sharp.”

HD at 59.94 frames per second has been the standard for BMW trade fair films for over ten years. Audi now also requires every large-scale projection in 4K to be delivered in at least 50p, both filmed content and 2D/3D animations. For cinema projection, the frame rate in the DCI standards has been increased to up to 60 frames per second, particularly for stereo 3D. Almost all high-end finishing systems (such as those from Quantel and DVS) can also process at least 2K at up to 60 FPS in stereo 3D, or 4K 2D at 60 FPS.

However, consumer TV sets and computer displays pose a problem when it comes to 60 FPS. I recently visited a production company that is currently producing films in 4K for a TV set manufacturer for product presentations of 4K TV sets. This visit led to a joint field research odyssey, which I would like to report on below because it clearly illustrates the “teething troubles” of 4K.

Great expectations and unvarnished reality

With its 65 inches, the TV set supplied by the manufacturer for testing already offered an impressive picture size, creating anticipation of brilliant images. In addition, the playback device from the same manufacturer – a kind of tablet with a docked keyboard – was supposed to make 4K easy for anyone to operate, with the swipe of a finger, so to speak. So much for the marketing. The reality looked like this: after several attempts at connecting and disconnecting the 4K TV set to various HDMI ports on the computer (Windows 8.1), it identified itself as 4K, i.e. 4,096 × 2,160. When the first images lit up, the colleague who had previously dragged the heavy device into the office by the sweat of his brow couldn’t contain his excitement: “It looks great!” Sitting on the sofa about three metres away, he didn’t notice the image deficiencies or the lack of sharpness. About one metre away from the screen, however, the unvarnished 4K reality was revealed in the form of massive losses of image resolution.

The automatic screen recognition delivered a completely incorrect recommendation and, on top of that, an incorrect default setting that deviated from that recommendation. Without a pixel-native screen setting, the supposed 4K image is dramatically degraded in quality because it has to be scaled by the graphics card of the playback system.

Massive loss of image resolution

Now we began our search for the causes of the disappointing picture quality when viewed up close. To anticipate the outcome: at the end of this search, which can fairly be described as an odyssey, we had arrived at the following findings:

      • The monitor sends incorrect resolution information to the graphics card.
      • The graphics card scales Quad HD up to the DCI standard (because the Quad HD film is displayed in full screen).
      • The monitor then scales back down from the DCI standard to its native Quad HD resolution and zooms into the picture using overscan.

Here is our field research odyssey in detail:

    • Check: Resolution/image content

As we didn’t have a 4K test chart, we produced one: first, we downloaded my HD test chart from digitalproduction.com (http://bit.ly/1b2tUXr) and created a Quad HD version, i.e. 3,840 × 2,160, in After Effects, and then also a 4,096 × 2,160 version by padding it. The aim: to determine whether the monitor follows the DCI standard or Quad HD.
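The original chart is the one downloadable from digitalproduction.com; purely to illustrate the principle, a minimal Python/Pillow sketch along the following lines could generate a comparable black/white line-pair chart and pad it from Quad HD onto a DCI 4K canvas (my own sketch, not the author’s chart):

```python
import numpy as np
from PIL import Image  # pip install pillow

QHD_W, QHD_H = 3840, 2160   # Quad HD
DCI_W, DCI_H = 4096, 2160   # DCI 4K

# Alternating 1-pixel black/white vertical lines: the hardest case for any scaler.
chart = np.zeros((QHD_H, QHD_W), dtype=np.uint8)
chart[:, ::2] = 255                      # every second column white

# Horizontal line pairs in the lower half to test vertical scaling as well.
chart[QHD_H // 2:, :] = 0
chart[QHD_H // 2::2, :] = 255

# Pad ("fill up") the Quad HD chart onto a DCI 4K canvas, centred horizontally.
canvas = np.zeros((DCI_H, DCI_W), dtype=np.uint8)
x0 = (DCI_W - QHD_W) // 2
canvas[:, x0:x0 + QHD_W] = chart

Image.fromarray(chart).save("testchart_3840x2160.png")
Image.fromarray(canvas).save("testchart_4096x2160.png")
```

Single-pixel line pairs of this kind are unforgiving: as soon as any link in the chain is not pixel-native, they immediately turn grey or show moiré.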

    • Check: Resolution/picture reproduction devices/playback system

Loading the test chart revealed that the played-back still image already showed a resolution below HD. So we looked in the graphics card settings to see what was offered as an alternative. The next entry was 3,840 × 2,160 – but without the additional label “native”, which was also missing for 4,096 × 2,160. With computer displays, this label is usually present, so that the appropriate native resolution is easy to set or is even selected automatically. However, the image only became a little sharper; it was still not pixel-native.

    • Check: Resolution/picture display device/TV set

We searched the display menu for the scaling or overscan setting; there is no standard designation here, and it can be called “full”, “1:1”, “native” or similar. Once we had found the setting, the image was finally mapped pixel-accurately, but it still looked blurred. My test chart contains various elements that make it quick to pinpoint such problems.

    • Check: Image processing functions/TV set

TV set manufacturers apparently assume that the majority of picture content played back to consumers at home is of such poor quality that the TV sets process the picture with a considerable number of functions. For high-quality picture material, however, these picture-enhancement functions prove counterproductive. It is therefore advisable to switch off or correctly set all these functions, which affect saturation, colour space, colour temperature, frame rate algorithms and, in particular, sharpness. Normally, sharpening is always active and only becomes inactive at a value of 0, but on our device it turned out that image sharpening was inactive at 50 (the default value). I was only familiar with this type of setting from cameras, which can also go into the negative, i.e. can even blur the image. After I had deactivated the image sharpening function, the test chart finally looked perfect and pixel-accurate.
    • Check: Frame rate/image display devices

In the graphics card settings, the frame rates were hidden in the Advanced Settings. After we had changed the resolution from 4,096 × 2,160 to 3,840 × 2,160, we were able to set 25 FPS, 29.97 FPS and 30 FPS instead of just 23.976 or 24 FPS. As the 4K film had been produced at 25 FPS, we set 25 FPS accordingly. The fact that frame rates up to 30 FPS became available at 3,840 × 2,160 is a further indication that this is the native resolution of the TV set’s panel. To display 4,096 × 2,160, the internal scaler has to use additional computing capacity, which comes at the expense of the frame rates. In the end, we exported the 4K file at a significantly lower data rate – from the original 300 Mbit/s down to 25 Mbit/s. Fortunately, the production company had produced the file in Quad HD, i.e. 3,840 × 2,160, which now also corresponded to the resolution of the TV set. After all these tedious changes to the settings, the 4K film looked halfway decent, provided you stood two metres or more away from the screen.
    • Final check: Motion resolution

Now that the TV set was displaying the native frame rate of the film and we could play the H.264 file at the now significantly lower data rate of 25 Mbit/s almost smoothly, the differences in quality between almost motionless image content and faster-moving content were drastically visible – provided you stood close enough to the 4K TV set. There are at least two reasons for this fluctuating picture resolution:

a. Sensor resolution for slow motion: in order to achieve higher frame rates, the camera sensor usually has to be read out cropped. This means that the available pixel count is not fully used – often in order to save storage capacity. For example, even the Sony F65 only achieves 4K × 1K at 120 FPS, i.e. only a quarter of the resolution, although the sensor has 8K × 2K.

b. Motion blur due to excessively long exposure times: the higher the actual still-image resolution and the closer the viewer looks at the image on a higher-resolution display, the more noticeable the differences in image resolution become. Arri has shown in various presentations, with impressive tests, how dramatic the resolution losses in 4K due to motion blur are. It therefore often makes more sense to produce consistently in HD or 2K at a minimum of 60 FPS than in 4K at 24 FPS – especially when the 4K cannot be real 4K anyway.
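To make the second point concrete, here is a minimal back-of-the-envelope sketch in Python; the subject speed (the frame width crossed in four seconds) and the 180° shutter are my own illustrative assumptions, not figures from the Arri tests:

```python
# Back-of-the-envelope motion blur estimate (illustrative assumption: a subject
# edge crossing the full 4,096-pixel frame width in 4 seconds).
def blur_in_pixels(frame_rate: float, shutter_angle: float = 180.0,
                   speed_px_per_s: float = 4096 / 4) -> float:
    exposure_time = (shutter_angle / 360.0) / frame_rate   # seconds per exposure
    return speed_px_per_s * exposure_time                  # pixels smeared per frame

for fps in (24, 48, 60):
    print(f"{fps:>2} FPS, 180° shutter: ~{blur_in_pixels(fps):.0f} px of blur")
# 24 FPS -> ~21 px, 48 FPS -> ~11 px, 60 FPS -> ~9 px
```

At 24 FPS the smear on such an edge is around 21 pixels per frame, so any “true 4K” detail there is gone; at 60 FPS it shrinks to below 10 pixels.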

HDMI 2.0 must come soon

At least the TV set in our field research had an HDMI 1.4 interface, the minimum required for 4K. The computer/playback device also had HDMI 1.4, although HDMI 1.4 is only specified up to 30 FPS at 4K. The frame rate, however, was selected automatically by the graphics card/TV set: 24 FPS. An average consumer would already be overwhelmed by the task of setting the device up correctly.

Only HDMI 2.0 will support 4K at 60 FPS. Until the end devices can handle this, 60 FPS films should not be released. They can, however, already be produced: various consumer TV manufacturers point out that their devices will be easily upgradeable to HDMI 2.0 – “easily” because the HDMI interfaces are often built into docked boxes precisely to enable such upgrades. You can already produce at 48, 50 or 60 FPS – and then simply output every second frame.

“The Hobbit”, for example, was shot at 48 and not 60 FPS in order to be able to deliver a normal 24 FPS for the 2D cinema release, which does not judder as badly. At 60 FPS, the exposure times would have been even shorter, which would have led to more judder. So if you want to be downward compatible, you always have to shoot in such a way that you only need to halve the FPS: 48 FPS for a 24 FPS output, or 60 FPS for a 30 FPS output or a later full 60 FPS release.
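Halving in this sense simply means dropping every second frame; a trivial sketch of the principle (my own illustration):

```python
# Downward-compatible delivery by frame decimation: a 48, 50 or 60 FPS master
# can be halved by keeping every second frame.
def halve_frame_rate(frames: list) -> list:
    return frames[::2]           # 48 -> 24, 50 -> 25, 60 -> 30

master_60p = list(range(60))     # one second of a 60 FPS master (frame indices)
delivery_30p = halve_frame_rate(master_60p)
print(len(master_60p), "->", len(delivery_30p), "frames per second")
```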

With HDMI 2.0, 4K at 30 FPS can also be displayed with up to 12-bit colour depth, making a higher colour resolution than is common in post-production possible for the first time. At 60 FPS, however, only the previous 8-bit colour depth will be possible. Either way, the amount of data and therefore the technical requirements will increase dramatically.

Bear in mind that wireless HDMI transmission currently only works up to Full HD; wireless 4K will not be possible for a long time yet. The requirements for cable lengths and cable quality also increase considerably with 4K – especially if 60 FPS rather than 30 FPS, i.e. two to almost three times the 24 FPS data rate, has to be delivered through the cable.
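A rough sketch of the payload arithmetic behind these statements (my own illustration; it ignores blanking intervals, chroma subsampling and link overhead, so real HDMI link rates are higher):

```python
# Rough uncompressed video bandwidth estimate (RGB payload only).
def video_gbps(width: int, height: int, fps: float, bits_per_channel: int) -> float:
    bits_per_frame = width * height * 3 * bits_per_channel   # R + G + B
    return bits_per_frame * fps / 1e9

for fps, bits in ((24, 8), (30, 12), (60, 8)):
    rate = video_gbps(3840, 2160, fps, bits)
    print(f"3840x2160 @ {fps} FPS, {bits}-bit: ~{rate:.1f} Gbit/s")
# ~4.8, ~9.0 and ~11.9 Gbit/s respectively – the step from 24 to 60 FPS
# roughly matches the "two to almost three times the data rate" mentioned above.
```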

Tip: to solve such problems in production, you can still fall back on classic SDI technology; there are various affordable adapter products from Blackmagic, AJA and others. The first 4K live productions confirm this in their experience reports: we will have to work with HD technology for a long time to come, especially when monitoring 4K productions, as the effort and costs would otherwise explode. Videos and additional material from a live music production can be found at http://www.4k-concerts.com.

The Hisense 4K TV set, which claims to be capable of 4K playback, only accepts HD sources. A macro shot, however, proves that the LCD panel actually has 4K pixel resolution: the HD test chart consists of alternating black and white lines, both horizontally and vertically, each reproduced here as two rows of red, green and blue pixels. As the HD source is simply mapped to twice the resolution without any 4K interpolation, the HD image looks much worse on this set than on a real 4K display that upscales properly, such as the one from Sony.

TV sets: 800 Hz refresh rate?

Motion optimisation, Motionflow XR 800 Hz – this or something similar is the name of the most heavily advertised feature of current TV sets. Without this technology, viewers would not perceive material produced at 50 or 60p as sharply as it actually is.

With images produced at only 24 or 25p, it depends very much on the content whether they appear sharp at all. Because 24 FPS is far too low a frame rate, it is usually necessary to work with a shallow depth of field and a longer exposure time in order to avoid judder as far as possible.

A long exposure time, however, produces motion blur. At 24p, slight head movements in a dialogue scene are enough to generate so much motion blur that all facial details disappear. Especially with real or near-4K detail, the image would constantly switch between sharp and blurred, which normal viewers find extremely distracting. This problem has been confirmed both in extensive scientific tests and by cinema-goers watching 4K presentations.

Due to their technology, however, today’s LCD monitors also create a motion blur problem during image processing in the human brain, known as “edge blurring”. This blurring can only be eliminated with frame rates well above the human perception limit of around 120 FPS. For this reason, motion flow technology must be active, especially with 4K, otherwise the blurring would be even more pronounced than it already is in the material.

4K production

To get close to true 4K, it is advisable to shoot at least with a Sony F65 in 8K or a Red Epic with Dragon sensor in 6K. For budget reasons alone, however, most people will opt for the Sony F55 or F5, and the XAVC codec will have to be used to keep the data volumes manageable. The active pixel count of most 4K displays for home applications is limited to 3,840 × 2,160; as the XAVC format covers both the 4,096 and 3,840 horizontal scan formats, the XAVC production tools can be used for both cinema and television.

Sony’s new PMW-F55 camera records 4K XAVC Intra internally at data rates between 240 Mbit/s (at 24P) and 600 Mbit/s (at 60P). With a 128 GB SxS Pro memory card, you can record up to 50 minutes in 4K/24P or around 20 minutes in 4K/60P – i.e. about 6.4 gigabytes per minute at 60 FPS. With the latest firmware, further XAVC recording formats are available.

In contrast, the Sony F65 in 4K 60P raw consumes 128 gigabytes in 7 minutes, i.e. 18.4 gigabytes per minute. A real production for a two-minute trade fair film quickly comes to 5 terabytes of raw material with Sony F65 4K raw. This needs to be duplicated at least once as a backup and stored away on LTO tape for insurance purposes. The latest MacBooks with Thunderbolt RAIDs are therefore a must on set – and even with these, you still have to reckon with copy times of twice real time.
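The arithmetic behind these figures is simple enough to sketch in a few lines of Python (my own illustration, derived from the card capacities and record times quoted above):

```python
# Storage arithmetic: GB per minute from card capacity and quoted record times,
# then the raw-material volume projected for a shoot.
card_gb = 128

gb_per_min_f55_24p = card_gb / 50      # ~2.6 GB/min (F55, 4K XAVC, 24P)
gb_per_min_f55_60p = card_gb / 20      # ~6.4 GB/min (F55, 4K XAVC, 60P)
gb_per_min_f65_raw = card_gb / 7       # ~18.3 GB/min (F65, 4K raw, 60P)

raw_material_tb = 5.0                  # figure quoted for a two-minute trade fair film
shoot_minutes = raw_material_tb * 1000 / gb_per_min_f65_raw

print(f"F55 60P: {gb_per_min_f55_60p:.1f} GB/min, F65 raw: {gb_per_min_f65_raw:.1f} GB/min")
print(f"5 TB of F65 raw corresponds to roughly {shoot_minutes:.0f} minutes of footage")
# ~273 minutes of raw footage for a 2-minute film, i.e. a shooting ratio well
# above 100:1 – and the same volume again for the LTO backup copy.
```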

4K post-production

As 4K raw cannot be played back in real time and practically nobody has 4K reference monitors, HD offline editing will take place first. Here you have a free choice between all current editing systems: Avid Media Composer v7, Adobe Premiere CC and FCP X can all process XAVC. Adobe Premiere CC can even play back 4K XAVC at 30 FPS in real time via HDMI on a consumer display, on a current HP laptop. For 4K raw, there is also a classic dailies workflow: the free DaVinci Resolve Lite can debayer Sony 4K/8K raw and downsample to HD in ProRes or DNxHD.

For finishing in 4K, you go back to the raw material in DaVinci, and the 4K output can then be rendered at the appropriate frame rate. The workflow is therefore almost standard, apart from the battle with storage space. Render times in 4K should not be underestimated either, as they grow sharply with resolution. Real-time playback of uncompressed 4K (DPX etc.) is also a challenge: a RAID hard disc system should offer around 1,500 megabytes per second in order to achieve real-time uncompressed 4K 60p with one fade. That usually means buying a new RAID. A DaVinci system that provides the necessary graphics card performance and memory bandwidth on its own, or that needs at least two Red Rockets for 6K debayering, costs upwards of 40,000 euros – without the large panel, of course.
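As a rough orientation for the throughput figure, here is a minimal estimate in Python (my own sketch; it assumes 10-bit RGB DPX packed into 32 bits per pixel and ignores file headers and RAID/filesystem overhead, so treat it as a floor):

```python
# Rough throughput estimate for uncompressed DPX playback.
def stream_mb_per_s(width: int, height: int, fps: float, streams: int = 1) -> float:
    bytes_per_frame = width * height * 4     # 10-bit RGB packed into 4 bytes/pixel
    return bytes_per_frame * fps * streams / 1e6

print(f"4K 24p, 2 streams (fade): ~{stream_mb_per_s(4096, 2160, 24, 2):.0f} MB/s")
print(f"4K 60p, 1 stream:         ~{stream_mb_per_s(4096, 2160, 60, 1):.0f} MB/s")
# Both land in the 1,700-2,100 MB/s region – the order of magnitude behind
# the ~1,500 MB/s RAID recommendation above.
```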

Quad HD TV set in self-experiment

My test device was a Hisense Ultra HD TV for just €999: it seemed like a bargain when you consider that the competition costs at least €3,000. That should have been a warning. In addition to the super-affordable price, there was another consideration: I wanted to try out the 4K display as a computer monitor. Having 50 inches of 4K right in front of me would have made a decent monitor for DaVinci, After Effects and even for editing with Avid. But things turned out differently. Reality quickly became apparent in the graphics card settings: no 4K to choose from, and even a driver update didn’t help. A search of the manufacturer’s technical specifications was equally unrewarding: no clear information on the HDMI version or the possible PC resolutions – an indication that something is being concealed. As a macro photo of my test chart shows, the panel is indeed 4K, but the processing or the interface can only handle HD. The device is a 4K fake. Even HD looks horrible on it; after some tuning of the settings, the result is just about acceptable. I will be sending it back.

Caption: Only the source acceptance and image processing of the Hisense are native HD – 4K (Ultra HD) is missing.

Conclusion

Even if sensor/camera and TV set have been optimally selected and adjusted, the decisive factor for achieving true 4K quality is the production itself. In our 4K TV field research, the content had been produced with a Red Epic in Quad HD and with a Sony F700, whose sensors are read out cropped to Quad HD – i.e. with just under 8.3 megapixels in total for RGB. A Quad HD TV set has 24.9 megapixels, i.e. around three times the resolution in RGB and around twice the resolution in luminance on the horizontal axis. Accordingly, graphics, fonts and logos are crisp and sharp, but the “4K film content” behind them is still quite soft, although no longer pixelated. Most readers probably know from their VFX experience that artificially generated, crisply sharp graphic content sometimes has to be degraded with grain, noise, chromatic aberrations and blurring so that it harmonises with the “real shots”. Well, this additional workload will remain with 4K too. I dread the thought of having to accommodate SD archive material in 4K.