Since you clicked on this article, you must be committed to learning about the important, but admittedly boring, subject of color gamuts. Your interest may have stemmed from the latest smartphone releases, which boast features like P3 wide color and HDR support, or you may have realized that the monitor you have been using for years just doesn’t look quite right when it comes to photo editing. In any case, this should be a quick explanation of some common terms, what you should be looking for, and how it may be affecting your workflow.
What is a Gamut?
A gamut refers to the specific range of colors a device can display. Some of the most common gamuts in the creative industry are sRGB, Adobe RGB, and DCI-P3, each covering a different color range suited to different applications. Some are wider, some narrower, but each covers a specific slice of visible color, so it is important to know the general differences, and your expected output, to choose the right option.
Here are the most common ones you will encounter.
sRGB: The “standard Red Green Blue” color space is perhaps the most common gamut you will find in modern electronic devices. It also matches up with Rec.709, the space used for television and broadcast applications, from which sRGB’s primaries were derived. This efficient standard covers a good range for average viewing needs and is so common now that it is the default for the Web and for most images taken with consumer cameras. Its only real limitation is that it is, technically, the smallest of the widely available options.
Adobe RGB: Developed by Adobe in 1998, this space was optimized for printing by encompassing most of the colors achievable with CMYK printing systems. Compared to sRGB, Adobe RGB offers expanded coverage in the cyan-green hues.
DCI-P3: A video-oriented wide gamut color space, P3 is becoming ever more popular, even being included on smartphones and all-in-one computers. It offers a similarly wide range as Adobe RGB (about 25% greater than sRGB), though it expands more into the reds and yellows and less into the cyan and green areas. It also plays a part in defining a display as HDR capable.
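To put rough numbers on these comparisons, here is a short Python sketch that estimates the relative size of each gamut from the published chromaticity coordinates of its red, green, and blue primaries. The comparison is done in the CIE 1976 u'v' diagram, which is where figures like “about 25% greater than sRGB” typically come from; comparisons in the older CIE xy diagram yield larger percentages.

```python
def xy_to_uv(x, y):
    """Convert CIE 1931 xy chromaticity to CIE 1976 u'v'."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

def triangle_area(points):
    """Shoelace formula for the area of a triangle of (u', v') points."""
    (ax, ay), (bx, by), (cx, cy) = points
    return abs(ax * (by - cy) + bx * (cy - ay) + cx * (ay - by)) / 2

# Published primary chromaticities (CIE 1931 xy) for each color space,
# in red, green, blue order.
PRIMARIES = {
    "sRGB / Rec.709": [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)],
    "Adobe RGB":      [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)],
    "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}

areas = {name: triangle_area([xy_to_uv(x, y) for x, y in xys])
         for name, xys in PRIMARIES.items()}

for name, area in areas.items():
    rel = area / areas["sRGB / Rec.709"]
    print(f"{name}: {rel:.0%} of sRGB's u'v' area")
```

Running this gives DCI-P3 roughly a quarter more u'v' area than sRGB, consistent with the figure above. Note that this measures only the size of the triangle of primaries, not how fully a given display actually covers it.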
Why Does this Matter?
The simplest reason color gamuts matter is that they define the range of colors a display can reproduce. A wide-gamut display shows more colors than a standard one, leading to more vibrant tones and more realistic imagery. Knowing the gamut also helps you see more closely what the final output will look like, whether that is broadcast, print, or a digital cinema projector. For example, a colorist working on a program for TV may want a Rec.709 display, so that what they see closely matches what most TVs can show, and they can fine-tune to get the exact look they want. A photographer, on the other hand, would be better suited to an Adobe RGB monitor, because it produces colors closer to what they will see in the final print.
What about HDR?
High Dynamic Range, or HDR, technology is still quite new to the computer-monitor field and will take a while to become mainstream. The basics you need to know involve the term “wide color” and brightness.

Let’s start with wide color, which simply means the display can reproduce more colors than the average screen. Generally, this requires a true 10-bit panel and the ability to cover at least 90% of the P3 gamut, though some manufacturers try to get away with 8-bit+FRC panels, so pay attention to the actual specs when shopping for a monitor or TV.

Next is how brightness plays into HDR. For many standards, you will need a screen either capable of reaching 1000 nits (cd/m²) at peak brightness while dropping to 0.05 nits at the black level, or of hitting 540 nits peak with a black level of 0.0005 nits. Why two standards? It comes down to differing technologies: LED-backlit LCDs can generally get brighter but not as dark, while OLEDs can get much darker but not as bright. What matters is that the screen can span a wide range of brightness levels to produce a high-dynamic-range image, since brightness is quite relative in practice.
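The arithmetic behind these specs is easy to check yourself. The sketch below works out how many colors an 8-bit versus a true 10-bit panel can address, and the static contrast ratio implied by each of the two brightness tiers described above (the exact tier values vary by certification program, so treat these as representative numbers, not a definitive standard).

```python
def color_count(bits_per_channel):
    """Total addressable colors for an RGB panel of a given bit depth."""
    return (2 ** bits_per_channel) ** 3

def contrast_ratio(peak_nits, black_nits):
    """Static contrast ratio between peak white and black level."""
    return peak_nits / black_nits

print(f"8-bit panel:  {color_count(8):,} colors")    # 16,777,216
print(f"10-bit panel: {color_count(10):,} colors")   # 1,073,741,824

# The two brightness tiers described above:
lcd_cr = contrast_ratio(1000, 0.05)      # brighter-but-less-dark (LCD-style) tier
oled_cr = contrast_ratio(540, 0.0005)    # dimmer-but-much-darker (OLED-style) tier
print(f"1000 / 0.05 nits  -> {lcd_cr:,.0f}:1 contrast")
print(f"540 / 0.0005 nits -> {oled_cr:,.0f}:1 contrast")
```

A true 10-bit panel addresses over a billion colors versus roughly 16.8 million for 8-bit, which is why the distinction from 8-bit+FRC matters, and the OLED-style tier yields a far higher contrast ratio despite its lower peak brightness.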
Now, a lot more HDR formats and standards are being developed, so keep an eye out when you start seeing different labels being thrown around, and do the research if you don’t recognize something—but try to find something that can hit these key points if you want the best results.
Don’t Forget to Calibrate!
You may have found the perfect display with 100% coverage of all the gamuts you need, but it is worthless if you don’t perform regular calibrations. Factory calibrations can be quite good and, lately, manufacturers seem to be stepping up their game here. However, all displays gradually change over time, in increments too small to notice day to day. This drift can impact your images and, eventually, you may be wondering why your monitor no longer matches your prints. Also, if you are using a TV as a secondary display for clients, or for testing your HDR grading skills, keep in mind that these screens are generally tuned by the manufacturer to look good, not necessarily to look accurate. So, it is important to calibrate each display when you first set it up, and then recalibrate on a regular schedule.
Another aspect of calibration is matching your current workflow. For example, the HP Z31x 31.1" DreamColor Studio Display is a 10-bit monitor with support for 100% of sRGB, Rec.709, and Adobe RGB, as well as 99% of DCI-P3 and 80% of Rec.2020. However, if you are preparing work for print, it wouldn’t make sense to calibrate for P3; if you are creating something for broadcast in Rec.709, calibrating for Adobe RGB wouldn’t be helpful. Make sure your display is set to the color settings appropriate to your output needs.
Do you have any more questions about gamuts or what type of display you should consider in your workflow? Let us know in the Comments section, below.