blog

git clone https://git.ce9e.org/blog.git

commit  c750e26c7e5021107e88b9c72940524bf2dbc2d3
parent  83c571398b8494fb482bc6825c360a0350bd9173
Author: Tobias Bengfort <tobias.bengfort@posteo.de>
Date:   2025-07-18 16:21
add article on HDR

Diffstat

A _content/posts/2025-07-18-hdr/index.md 146 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

1 file changed, 146 insertions, 0 deletions


---
title: What is HDR, really?
date: 2025-07-18
tags: [color]
description: "HDR is about having more details in shadows and highlights. A higher dynamic range is one piece of the puzzle, but not all of it."
---

Technically, dynamic range refers to the ratio between the brightest and
darkest absolute luminance that can be displayed. SDR (standard dynamic range)
consumer equipment can produce something like 0.32 cd/m² to 320 cd/m². HDR
(high dynamic range) consumer equipment can produce roughly 0.064 cd/m² to
1,000 cd/m² (see [Poynton
(2022)](https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/30/1/6)).

I was a bit confused when people talked about HDR, until I realized that they
were referring to the wider goal of having more details in shadows and
highlights. A higher dynamic range is one piece of the puzzle, but not all of
it.

In this post I will explore some of the other changes that contribute to
that goal.

## More Bits

A simple way to increase detail is to increase the number of bits we use
for each channel. In sRGB (the most common SDR color space for digital media)
we typically use 8 bits per channel, while HDR often uses 10 bits, giving us
four times as many steps per channel.

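In concrete numbers, going from 8 to 10 bits quadruples the number of code values per channel. A minimal sketch:

```python
# Number of distinct code values per channel for a given bit depth.
def levels(bits: int) -> int:
    return 2 ** bits

sdr = levels(8)   # 8 bits per channel
hdr = levels(10)  # 10 bits per channel

print(sdr, hdr, hdr // sdr)  # 256 1024 4
```
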
## Beyond White and Black

In SDR, black is 0% and white is 100%. HDR on the other hand makes a
distinction between surfaces and highlights. Surface white might for example be
defined at 90%, so that there is some headroom for highlights that are even
brighter. In fact, surface white on an HDR screen will in practice not be much
brighter than on an SDR screen.

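As a back-of-the-envelope illustration of that headroom: assuming the roughly 203 cd/m² that ITU-R BT.2408 suggests for diffuse ("surface") white and a typical consumer peak of 1,000 cd/m² (both figures are assumptions for the sake of the example), the room left for highlights can be expressed in photographic stops:

```python
import math

# Assumed figures: ~203 cd/m² diffuse white (per ITU-R BT.2408) and a
# 1,000 cd/m² peak, typical for HDR consumer screens.
surface_white = 203.0    # cd/m²
peak_luminance = 1000.0  # cd/m²

# Headroom for highlights above surface white, in stops (doublings).
headroom_stops = math.log2(peak_luminance / surface_white)
print(round(headroom_stops, 1))  # ≈ 2.3 stops
```
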
## Wider Gamut

While not technically related to high dynamic range, HDR standards such as
[HDR10](https://hdr10plus.org/wp-content/uploads/2024/01/HDR10_Ecosystem_Whitepaper.pdf)
also define the exact colors of the red, green, and blue lights (called
*primaries*) that make up each pixel. Most commonly these are the
ones defined in [ITU-R BT.2020](https://www.itu.int/rec/r-rec-bt.2020/), which
can produce a much wider range of colors than sRGB, P3, or even Adobe RGB.

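One way to see how much wider BT.2020 is: convert one of its primaries to linear sRGB and check whether the result stays inside [0, 1]. A minimal sketch using the standard D65 conversion matrices (values copied from the respective specifications, rounded; treat this as illustrative):

```python
# Linear BT.2020 RGB -> CIE XYZ (D65 white point), per ITU-R BT.2020.
BT2020_TO_XYZ = [
    [0.636958, 0.144617, 0.168881],
    [0.262700, 0.677998, 0.059302],
    [0.000000, 0.028073, 1.060985],
]
# CIE XYZ -> linear sRGB, per IEC 61966-2-1.
XYZ_TO_SRGB = [
    [ 3.240970, -1.537383, -0.498611],
    [-0.969244,  1.875968,  0.041555],
    [ 0.055630, -0.203977,  1.056972],
]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Pure BT.2020 green, in linear light.
green_2020 = [0.0, 1.0, 0.0]
green_srgb = mat_vec(XYZ_TO_SRGB, mat_vec(BT2020_TO_XYZ, green_2020))

# Components outside [0, 1] cannot be represented in sRGB.
print([round(c, 3) for c in green_srgb])  # ≈ [-0.588, 1.133, -0.101]
```

The negative red and blue components show that the BT.2020 green primary lies well outside the sRGB gamut.
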
## Curves

Some color spaces have the property of being *linear*, by which we mean that
adding the values of two colors in such a color space has the same effect as
mixing two corresponding physical light sources.

However, using such a linear color space would result in inefficient encoding,
because human perception is not linear in that sense. A lot of bits would be
used to encode differences that we cannot even perceive, while only a few
bits would be left for areas that make a big difference to us. So while a lot
of processing happens in linear color spaces, storage and transmission often
use color spaces that apply a non-linear *transfer function* before encoding
the values as integers.

Transfer functions actually take up most of the space in the relevant standards
(e.g. [SMPTE ST 2084](https://pub.smpte.org/latest/st2084/st2084-2014.pdf) or
[ITU-R BT.2100](https://www.itu.int/rec/R-REC-BT.2100)). The most common
one for HDR is called the *perceptual quantizer (PQ)*.

I honestly don't care all that much about transfer functions though. They are
applied during encoding and reverted during decoding, so they don't actually
change anything about the colors. The worst-case scenario is that encoding is
not as efficient as it could be. With the move from 8 bits to 10 bits, that
shouldn't be a major problem.
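
For the curious, the PQ curve itself is compact enough to sketch. The constants are from SMPTE ST 2084; `pq_encode` is the inverse EOTF (absolute luminance to signal) and `pq_decode` reverses it:

```python
# Constants from SMPTE ST 2084 (PQ).
m1 = 2610 / 16384       # ≈ 0.1593
m2 = 2523 / 4096 * 128  # ≈ 78.84
c1 = 3424 / 4096        # ≈ 0.8359
c2 = 2413 / 4096 * 32   # ≈ 18.85
c3 = 2392 / 4096 * 32   # ≈ 18.69

def pq_encode(luminance: float) -> float:
    """Inverse EOTF: luminance (0..10000 cd/m²) -> signal (0..1)."""
    y = luminance / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def pq_decode(signal: float) -> float:
    """EOTF: signal (0..1) -> luminance in cd/m²."""
    p = signal ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y

# The curve spends most of its code values on low luminances: an
# SDR-ish 100 cd/m² already uses about half of the signal range.
print(round(pq_encode(100), 3))  # ≈ 0.508
```
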

## Tone Mapping

When you want to display HDR content on an SDR screen, you have to do a lossy
conversion that is often called *tone mapping*. [ITU-R
BT.2408](https://www.itu.int/pub/R-REP-BT.2408) has some examples of how this
could be done.

This is actually not a new phenomenon though. There are very similar issues
e.g. in printing (because ink on paper has a very different gamut from
light-emitting screens) or in games (because the lighting systems in modern
game engines produce an extremely high dynamic range).

One option is to clip everything that cannot be displayed to the target space.
Since clipping individual channels might change the hue, people sometimes opt
to reduce the saturation instead. Another option is to shrink down the entire
space until it fits.
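
As a toy illustration of the "shrink the entire space" option, here is the classic Reinhard curve next to plain clipping. This is a generic example, not the mapping any particular standard prescribes:

```python
# Reinhard operator: maps luminance [0, inf) smoothly into [0, 1),
# compressing highlights instead of discarding them.
def reinhard(l: float) -> float:
    return l / (1.0 + l)

# Plain clipping: everything above 1.0 is lost.
def clip(l: float) -> float:
    return min(l, 1.0)

for l in (0.25, 1.0, 4.0, 16.0):
    print(l, round(clip(l), 3), round(reinhard(l), 3))
```

Note that clipping maps 4.0 and 16.0 to the same white, while Reinhard keeps them distinguishable.
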

Which option you choose depends on your specific use case. [ICC
profiles](https://www.color.org/icc_specs2.xalter) (a file format for
conversion between color spaces) can therefore contain multiple mappings with
different rendering intents.

A special kind of tone mapping is *local tone mapping*, where the mapping
differs depending on context. For example, a dark pixel in a bright area of
the image might be mapped to black, while a pixel of the same color in a dark
area of the image might be mapped to a lighter grey to maintain local contrast.

## Personalization

Games commonly have a slider for gamma, which allows players to adjust the tone
mapping to their specific viewing conditions and personal preferences. Another
common case of color customization is night mode, where screens are darker
and less blue at night.
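
Such a gamma slider usually boils down to a simple power curve on normalized values. A minimal sketch (the exponent convention varies between implementations):

```python
# Apply a gamma adjustment to a normalized [0, 1] value.
# With this convention, gamma > 1 lifts the shadows.
def apply_gamma(value: float, gamma: float) -> float:
    return value ** (1.0 / gamma)

dark = 0.1
print(round(apply_gamma(dark, 1.0), 3))  # 0.1 (neutral)
print(round(apply_gamma(dark, 2.2), 3))  # ≈ 0.351 (shadows lifted)
```
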

I have found [one article](https://lightillusion.com/what_is_hdr.html) that
interprets PQ as prohibiting these kinds of user customization. In this
interpretation, every color value maps to an absolute luminance. This would not
only take power away from users, it would also disregard the huge influence
that viewing conditions have on human sight. If anything, we need more options
for personalization, not fewer.

## Scene-to-Scene Differences

[ITU-R BT.2390](https://www.itu.int/pub/R-REP-BT.2390/) mentions that a higher
dynamic range could be used not just to increase the range within a scene, but
also to increase scene-to-scene differences. In my opinion, that would be the
worst possible outcome of all of this. I fear that we will end up with a
situation like with sound, where you have to turn the brightness of your screen
all the way up to see anything during a dark scene, only to get your eyes
burned when the next bright scene arrives.

## Is any of this relevant?

A lot of work is currently being put into supporting HDR across the stack. On
the hardware side we need cameras that can capture more detail and screens
that can display it. On the software side we need support for different color
spaces. Many image and video formats have already added that,
[CSS Color Level 4](https://www.w3.org/TR/css-color-4/) brings more color
spaces to the web, and Wayland also recently gained a [color management
protocol](https://wayland.app/protocols/color-management-v1).

While I think that it is nice to have the option of a higher dynamic range and
a wider gamut (especially during production), sRGB is fine for most
everyday activity. And adding color management across the stack adds a whole
lot of complexity.

What I would like to see is better images and more personalization. What I fear
we will get is less personalization and more scene-to-scene differences.

## Further Reading

-   The Wayland ecosystem has collected some information around HDR, among it a [list of all relevant specifications](https://gitlab.freedesktop.org/pq/color-and-hdr/-/blob/main/doc/specs.md).
-   [SMPTE RP 177](https://pub.smpte.org/pub/rp177/rp0177-1993_stable2010.pdf) explains how to calculate a conversion matrix from primaries and white point.
-   The kernel documentation has a surprisingly good [explanation of color spaces](https://www.kernel.org/doc/html/v4.9/media/uapi/v4l/colorspaces.html).
-   Matt Taylor has a good [article on tone mapping in games](https://64.github.io/tonemapping/) with lots of screenshots for comparison.