HDR reality and monitoring – a DOP's Perspective

These are some excerpts of a CML thread about HDR and monitoring HDR, which I have reproduced with permission from Geoff Boyle and Art Adams. It was really interesting, and it looks at HDR from the DOP's perspective.

HDR reality [Geoff Boyle]

There is huge confusion between HDR at the exposure stage and HDR at the display stage, so a few basics.

What most of you know and think of as HDR is exposure-side HDR, where multiple exposures are used to create a combined image that squashes an HDR scene into an SDR display. This is the stills photography view of HDR, and also one that some of us have experimented with in moving pictures; in my case with multiple Amiras in a mirror rig, set to zero, to give a combined 20+ stops of DR.

What we are talking about in the case of moving images is HDR as a display format. The UHD Premium standard is approximately 15 stops of DR, peaking at 1,000 nits. However, in most viewing environments this will not be the range you see; it'll be more like 12 stops, because of the effect of ambient light on the black level. A Rec 709 monitor will be capable of 6 to 8 stops of DR [Dynamic Range]; the upper end is pushing it, and I work on 7 stops as achievable.

So, in reality, if we shoot with cameras with a DR capability of more than 12 stops, we don't have to worry about SDR or HDR as long as we expose in the middle of that range, i.e. mid grey should be exposed to give us +/- 6 stops. If we expose like this, we can use the entire image in either display format. As cameras become available that give us more DR, we will have more room to play with.

Bit depth: just as it is best to originate in 10 bit when finishing for 8 bit displays, so that we have room to manoeuvre, if we are finishing for a 10 bit display we need to originate and record at 12 bit or greater to give us room to work.
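Geoff's stop counts are just doublings of luminance, so they can be sanity-checked with a few lines of arithmetic. This is a minimal sketch; the black-level figures used below are illustrative assumptions, not numbers from the thread:

```python
import math

def display_stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range of a display in photographic stops (doublings of luminance)."""
    return math.log2(peak_nits / black_nits)

# A 1,000 nit panel in a dark room, assuming an illustrative 0.05 nit black level:
print(round(display_stops(1000, 0.05), 1))  # ~14.3 stops, i.e. roughly the 15 quoted

# The same panel with ambient light lifting the effective black to ~0.25 nits:
print(round(display_stops(1000, 0.25), 1))  # ~12.0 stops
```

Raising the black level five-fold costs a little over two stops, which is why the quoted "approximately 15 stops" collapses to "more like 12" in a normal living room.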
If you are shooting for an SDR [Standard Dynamic Range] finish now but wish to future-proof what you shoot, just expose to use the complete range of the camera and you'll be cool.

Monitoring for HDR is difficult, partly because monitors that display HDR are bloody expensive, and secondly you are highly unlikely to be able to view them in conditions that allow you to judge what is in the picture. The solution to this is, well, use a meter; become a cinematographer again. Read shadows, read highlights, and you then know if you're in range.

ACES is a huge help in this process. I was demonstrating at HPA in February the same rushes on various monitors in different modes. I had original material in raw form from Alexa, C500, F65 & Red Epic. Using both Resolve and Daylight I was feeding the original Dolby HDR monitor in both SDR and HDR, and a Sony X300 in HDR. I also had a Colorfront system with a Sony X300 showing HDR. I was loading the rushes and not grading, just applying the relevant IDT and ODT and switching back and forth so that people could see what happened.

Essentially nothing happened! The HDR material looked better, but it looked great without anything other than the relevant ODT being used. The bigger DR made a huge difference, but bear in mind that all this material had been shot for Rec 709 display; actually some was shot for DCDM, but still DR-limited compared to HDR. I exposed it all as I normally would, using meters.

For the moment we have to use Rec 709 monitoring and use our experience to know what will work. Hmm, the cinematographer is the one who understands what the images will look like! Oh, and lots of people will try and sell you all kinds of magic sauces and solutions; some work, some don't, but ultimately you don't need them.

What about display-side HDR? [Geoff Boyle]

Manufacturers have to sell boxes. HDR is a much easier sell than 3D was: you can clearly see it in a store, especially if everything is turned up to 11.
The HDR sets being sold at the moment peak at 650 to 1,000 nits; my current SDR set peaks at 350 to 400. I just wish there was a way to use that extra brightness for more dynamic range rather than lighting up my living room.

What about monitoring HDR? [Art Adams]

Monitoring for HDR is difficult, partly because monitors that display HDR are bloody expensive, and secondly you are highly unlikely to be able to view them in conditions that allow you to judge what is in the picture. They also don't show the entire dynamic range of the image. If a 1,000 nit display shows about 5-5.5 stops of linear dynamic range above middle gray, and you're shooting with a camera that has 6-7+ stops of dynamic range above middle gray, the extra 0.5-1.5 stops gets clipped off, unless, of course, one builds a LUT that prevents this. So far the monitors I've worked with don't address this.

As for using experience alone to judge HDR from SDR… it seems to me that experience can only be gained by watching HDR displays. HDR is so lifelike that we suddenly have to become aware of things that we routinely take for granted when it comes to human vision. In SDR we know that mid-tones are the sweet spot, and that all else is gravy. In HDR it's all sweet spot, but highlights can be both more vibrant and more distracting than ever before. Highlights attract my attention in daily life, but if they aren't attached to something interesting I learn to ignore them. That's not possible once one puts a frame around an image: everything within that frame becomes significant, and its placement or relative brightness can add to or subtract from that significance.

Even film experience doesn't necessarily help, as HDR's dynamic range is certainly beyond that of print film, and the images are vastly brighter. Plus there's the issue of OLEDs (to a greater extent) and LEDs (to a lesser extent) clamping down on large bright areas in order to save themselves.
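Art's clipping arithmetic can be sketched in a few lines. The 26 nit mid-gray level below is an assumption chosen for illustration (it is consistent with his "about 5-5.5 stops above middle gray on a 1,000 nit display" figure), not a number from the thread:

```python
import math

def stops_above(peak_nits: float, mid_gray_nits: float) -> float:
    """Headroom above middle gray, in stops, for a given display peak."""
    return math.log2(peak_nits / mid_gray_nits)

# Assuming mid gray is rendered at ~26 nits, a 1,000 nit panel leaves
# roughly 5.3 stops of highlight headroom above it:
print(round(stops_above(1000, 26), 1))  # ~5.3

def clipped_stops(camera_headroom: float, display_headroom: float) -> float:
    """Highlight range (in stops) the display cannot show without a compressing LUT."""
    return max(0.0, camera_headroom - display_headroom)

# A camera with 7 stops above middle gray against a display showing 5.5:
print(clipped_stops(7.0, 5.5))  # 1.5 stops clipped, matching the upper estimate
```

The point of the sketch is that the clipped range depends on how the grade anchors middle gray, which is exactly why a LUT that compresses highlights can recover it.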
It's important to recognize when we're shooting something that will cause the display to go into damage-control mode, which affects the overall feel of the image.

When I first saw HDR imagery I was stunned. The more I learn about the current state of HDR, the more horrified I am. Not only can I not be sure that my visual intent will make it through the grade, but it's likely that whatever does make it through the grade will be presented radically differently on every television on which it is viewed.

Still… HDR television can't come fast enough for me. I love new technology, but it helps if that new technology has been thought out well in advance. So far, HDR hasn't been, in many practical ways. That doesn't mean I won't embrace it, but it will be more important than ever to be aware of exposure and lens flare and camera movement and lots more. We'll have to retrain ourselves. The more I learn, the more complex it seems.

Reproduced from a CML thread with thanks to Geoff Boyle and Art Adams.

Art Adams | Director of Photography
Geoff Boyle FBKS, Cinematographer, www.gboyle.co.uk