Monitors with extended color gamut: Samsung SyncMaster XL24 and XL30

    © 2014 site

    A color space is an abstract mathematical model that describes a certain color palette, i.e. a fixed range of colors, using color coordinates. For example, palettes built according to the additive RGB scheme are described using a three-dimensional model, which means that any color in the palette can be uniquely determined by an individual set of three coordinates.

    The most complete color space, CIE XYZ, covers the entire spectrum of colors visible to humans. In 1931, the International Commission on Illumination (Commission internationale de l'éclairage, or CIE) approved CIE XYZ as the reference color space, and it is still used today to evaluate and compare all other models.

    It is important to remember that no device used to reproduce color images, be it a printer or a computer monitor, can display the full variety of colors available to a person with normal vision. Worse, color gamuts often don't match across devices, causing the same colors to look different depending on the specific monitor or printer model. To solve this problem, so-called working color spaces were introduced: standard palettes that more or less correspond to the color gamut of a certain class of devices. Using a standard color space when working with an image lets you stay within the color range of the final output device, and if going out of gamut is unavoidable, you can learn about the mismatch between color spaces in advance and take appropriate measures.

    Working color spaces

    The most commonly used working color spaces in digital photography are sRGB and Adobe RGB. Much less popular is ProPhoto RGB.

    sRGB

    sRGB is a universal color space created jointly by Hewlett-Packard and Microsoft in 1996 to unify color reproduction. sRGB is far from the widest space: it covers only about 35% of the colors described by CIE, but it is supported by all modern monitors without exception. sRGB is the worldwide standard for displaying images on the web, and all web browsers use this color space by default. When you save an image in sRGB, you can be sure that the colors you see on your monitor will be displayed on other monitors without significant distortion, regardless of the program used to view them. Despite its apparent narrowness, the sRGB palette is sufficient for the vast majority of an amateur photographer's practical needs, including shooting, photo processing and printing.

    Adobe RGB

    In 1998, Adobe Systems developed the Adobe RGB color space, which matches more closely than sRGB the palette available when printing on high-quality color printers. Adobe RGB covers approximately 50% of the CIE color gamut, but the differences between Adobe RGB and sRGB are hard to tell by eye.

    Visual comparison of the sRGB color range (colored area) and Adobe RGB (light gray area).

    It should be understood that mindlessly using Adobe RGB instead of sRGB because of its abstract superiority in color gamut will not only fail to improve the quality of your photos but will most likely degrade it. Yes, in theory Adobe RGB has a larger gamut than sRGB (mostly in the blue-green tones), but what is the point if in 99% of cases this difference is not noticeable, either on a computer monitor or in print, even with the right equipment and software?

    Adobe RGB is a highly specific color space used purely for professional photo printing. Images in Adobe RGB require special viewing and editing software, and a printer or mini photo lab that supports the appropriate profile. When viewed in programs that do not support Adobe RGB, such as web browsers, any colors that do not fit into the standard sRGB color space will be clipped and the image will fade. Likewise, when you print from most commercial photo labs, Adobe RGB will be messily converted to sRGB, and you'll end up with less saturated colors than if you originally saved the image in sRGB.

    ProPhoto RGB

    Because the full range of colors perceived by a digital camera sensor is so wide that it cannot be described even by Adobe RGB, Kodak proposed a new color space in 2003, ProPhoto RGB, covering about 90% of CIE colors and roughly corresponding to the capabilities of the sensor. However, the practical value of ProPhoto RGB for the photographer is negligible, since no monitor or printer has a gamut wide enough to take advantage of this ultra-wide color space.

    DCI-P3

    DCI-P3 is another color space proposed in 2007 by the Society of Motion Picture and Television Engineers (SMPTE) as a standard for digital projectors. DCI-P3 simulates the color palette of film. In terms of its coverage, DCI-P3 surpasses sRGB, and roughly corresponds to Adobe RGB, with the only difference being that Adobe RGB extends more into the blue-green part of the spectrum, and DCI-P3 into the red. In any case, DCI-P3 is of interest mainly to cinematographers, and is not directly related to photography. Of the mainstream computer monitors, only the Apple iMac Retina displays seem to be able to display DCI-P3 correctly.

    Choosing a color space should be based on specific practical considerations, not on the theoretical superiority of one space over another. Unfortunately, more often than not, the gamut of the color space a photographer uses correlates only with their level of snobbery. To keep this from happening to you, let us consider the stages of the digital photographic process that involve the choice of a color space.

    Shooting

    Many cameras allow the photographer to choose between sRGB and Adobe RGB. The default color space is sRGB, and I strongly advise you not to touch this menu item, whether you are shooting in RAW or JPEG.

    If you shoot in JPEG, then most likely you do it to save time and effort, and you don’t tend to fiddle with each shot for a long time, which means you definitely don’t need Adobe RGB.

    If you shoot in RAW, the choice of color space does not matter at all, since a RAW file has no color space as such: it simply contains all the data captured by the sensor, and that data is mapped into a specific range of colors only during subsequent conversion. Even if you are going to convert your photos to Adobe RGB or ProPhoto RGB, leave the camera set to sRGB to avoid unnecessary hassle when you suddenly need an in-camera JPEG.

    Editing

    A standard color space is assigned to an image only when the RAW file is converted to TIFF or JPEG. Up to that point, all processing in the RAW converter takes place in an internal, unnormalized color space corresponding to the gamut of the camera's sensor. That is why RAW files allow such freedom in handling color. When editing is complete, colors outside the target palette are automatically adjusted to their closest values within the color space you choose.
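
    The final step, mapping out-of-gamut colors to their nearest representable values, can be sketched in miniature (a hypothetical per-channel clip; real converters use more sophisticated gamut-mapping strategies):

```python
def clip_to_gamut(rgb):
    """Naive gamut mapping: clamp each channel into the [0.0, 1.0]
    range of the target color space. An out-of-gamut color is replaced
    by the nearest representable one, channel by channel."""
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

# A saturated color that falls outside the target space:
print(clip_to_gamut((1.2, -0.05, 0.5)))  # (1.0, 0.0, 0.5)
```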

    With rare exceptions, I prefer to convert RAW files to sRGB because I want results that are maximally versatile and reproducible on any hardware. I am quite happy with the colors I get in sRGB and find the Adobe RGB space to be overkill. But if you feel that using sRGB negatively affects the quality of your photos, you are free to use whichever color space you see fit.

    Some photographers prefer to convert files to Adobe RGB in order to have more freedom when post-processing the image in Photoshop. This is true if you really intend to carry out deep color correction. Personally, I prefer to do all the work with color in the RAW converter, because it is easier, more convenient and provides better quality.

    What about ProPhoto RGB? Forget about it! This is a mathematical abstraction and the feasibility of its practical application is even lower than that of Adobe RGB.

    By the way, if you still have to edit photos in Photoshop in spaces other than sRGB, do not forget to use 16 bits per channel. Posterization in large gamut color spaces becomes noticeable at equal bit depths sooner than in sRGB because the same number of bits is used to encode a larger range of hues.
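
    The arithmetic behind this advice is simple. The sketch below compares quantization step sizes; the relative gamut "widths" are illustrative assumptions, not measured values:

```python
def step_size(gamut_width, bits):
    """Spacing between adjacent encodable levels when a fixed number
    of quantization steps must span a given range of colors."""
    return gamut_width / (2 ** bits - 1)

# Assumed relative gamut widths, for comparison only:
srgb_width, wide_width = 1.0, 2.5

print(step_size(srgb_width, 8))   # ~0.0039  - sRGB at 8 bits
print(step_size(wide_width, 8))   # ~0.0098  - wide gamut at 8 bits: coarser steps
print(step_size(wide_width, 16))  # ~0.00004 - 16 bits restores the headroom
```

The wider the gamut, the larger the jump between adjacent codes at the same bit depth, which is exactly why posterization shows up sooner.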

    Printing

    Using Adobe RGB when printing photos can be justified, but only if you are well versed in color management, know what color profiles are and personally control the entire photo process, and also use the services of a serious photo lab that accepts files in Adobe RGB and has the appropriate equipment for their printing. Also, feel free to run some tests by converting the same pictures to both sRGB and Adobe RGB and printing them on the same equipment. If you can't see the difference, is it worth it to complicate your life? The sRGB palette is enough for most scenes.

    Internet

    All images intended for publication on the Internet must be converted to sRGB without fail. If you use any other color space, colors in the browser may not display correctly.

    If I didn't express my position clearly enough, then let me repeat once again: in case of the slightest doubt about which color space you should use in a given situation, choose sRGB, and you will save yourself from unnecessary trouble.

    Thank you for your attention!

    Vasily A.

    post scriptum

    If the article turned out to be useful and informative for you, you can kindly support the project by contributing to its development. If you did not like the article, but you have thoughts on how to make it better, your criticism will be accepted with no less gratitude.

    Do not forget that this article is subject to copyright. Reprinting and quoting are permissible provided there is a valid link to the original source, and the text used must not be distorted or modified in any way.

    The question of correct color display on a monitor is one of the eternal ones. Anyone who has ever needed to print what they see on screen (and exactly the way they see it) knows that this is no simple procedure. For print professionals the stakes are even higher, because the client's satisfaction with the result, and thus the success of the work and the business, depends on the quality of the monitor-to-printer chain. In addition, the idea of remote (soft, on-screen) color proofing is in the air and may become reality any day now. With the spread of color-demanding printing methods, such as extended-gamut printing with more than four inks, the demands placed on professional monitors have grown as well. A new approach is needed to the problem of matching colors produced by additive and subtractive synthesis.

    It is very difficult to choose a monitor from the wide range offered today. A professional monitor from a manufacturer specializing in such devices is an expensive pleasure. For most users, the difference between a consumer model with a flattering Pro in its name and a monitor designed for color work is not obvious, especially since it is not always clear from the specifications either. It therefore makes sense to figure out what features professional monitors have and what conditions they must satisfy to meet modern requirements.

    Increasing the color gamut

    Most TFT monitors can reproduce up to 75% of the NTSC color space. But while this gamut is theoretically large enough to include the colors of print synthesis, its size and position in the color space are such that these monitors are not suitable for displaying print colors on screen. The reason lies, again, in the fundamentally different color models of monitors (RGB) and printing devices (CMYK). To include all printable colors, the color gamut of RGB devices (monitors, in this case) needs to be greatly expanded.

    The best way to increase the color gamut of a TFT monitor is to optimize the spectral response of the backlight. By combining advances in colorimetry and chemistry, it became possible to create a phosphor with a modified spectral response and better reproduction in the red and green regions.

    The results of these changes are clearly visible in the illustration: the green and red regions of the spectrum have shifted, resulting in an increase in the size of the color gamut. Much brighter greens and reds became available.

    Color gamut optimization

    Unfortunately, gamut expansion alone does not capture all the colors reproduced by subtractive-synthesis devices (more simply, CMYK devices). The main goal was and remains the most complete possible match between the colors on the monitor and on the printout. The simple example in the figure shows that if the color gamut of one monitor (black line) is larger than that of another (red line), this does not mean it will reproduce the colors of printing devices (white line) better.

    In addition, you need to clearly understand the difference between the size of the color gamut, that is, the position of the extreme points on the graph, and the quality of the color gamut - the actual correspondence of the colors on the monitor to the printing device.

    This means that a monitor with a smaller but optimized color gamut may be a better choice for color grading or remote proofing than a solution with a nominally large gamut but conditionally acceptable color reproduction.

    Let's talk about spaces

    There are two main RGB working spaces in color management systems today that are very close to each other, Adobe-RGB and ECI-RGB.

    The Adobe-RGB system is a good solution for most tasks but, unfortunately, is poorly suited for reproducing the colors of printing devices and for on-screen color proofing. The reason is that it uses a white point of 6500 K and a gamma of 2.2, whereas a 5000 K white point is the accepted standard for color management in printing, and gamma 2.2 does not match the dot-gain curve of classic offset printing. In addition, the Adobe-RGB gamut practically cuts off the rich blue colors reproduced in offset printing.

    The ECI-RGB system is a much more acceptable option. It was created with all standardized printing methods in mind, it excludes colors that cannot be reproduced in the RGB system, and finally, ECI-RGB uses a white point with a color temperature of 5000 K and a gamma of 1.8. That is, it better corresponds to the generally accepted conditions of printing and control of the print. This space is an excellent basis for a hardware independent system: it includes most RGB devices and conforms to print standards. To be clear, ECI-RGB cannot reproduce the very rich blues that sRGB (and Adobe-RGB) can produce, but these colors also cannot be reproduced on any printing device.
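
    The practical difference between the two gamma conventions is easy to demonstrate with a simple power-law sketch (real transfer curves often have additional linear segments, which are ignored here):

```python
def gamma_encode(linear, gamma):
    """Encode a linear-light value (0..1) with a simple power-law gamma."""
    return linear ** (1.0 / gamma)

# The same 18% gray encodes to noticeably different values:
print(round(gamma_encode(0.18, 2.2), 3))  # ~0.459 (gamma 2.2, Adobe-RGB convention)
print(round(gamma_encode(0.18, 1.8), 3))  # ~0.386 (gamma 1.8, ECI-RGB convention)
```

The same pixel data therefore looks lighter or darker depending on which gamma the receiving system assumes, which is why the working space and the print-control conditions must agree.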

    If we take work with photographic images, where Adobe-RGB dominates, several interesting points can be noted. On the one hand, Adobe-RGB is the standard working space of professional digital cameras and the default space in the photographer's main tool, Adobe Photoshop. On the other hand, the ICC standard uses a D50 white point, and the vast majority of viewing stations also use 5000 K as their white point. The photograph itself is only the beginning of the process: most photographs are eventually printed, and the printing process is again best matched by a 5000 K white point and a gamma of 1.8. Using the corresponding color space, ECI-RGB, will therefore help you get the highest-quality result and avoid typical problems, especially since most RAW converters support the ECI-RGB space out of the box. Remarkably, no photo printer (including dedicated 12-ink models) can reproduce all the colors of Adobe-RGB, even though this system, as we saw earlier, cuts off blue tones that these devices can print. So in this situation, too, ECI-RGB offers the best coverage of the printing system's color space.

    Difference between "calibration" and calibration

    The accuracy of monitor calibration and profiling directly affects both the display of colors inside its gamut and the simulation of colors beyond it. There are many devices on the market designed to calibrate monitors, and although some of them are powerful and accurate solutions, the quality of the result depends on how much control you have over the monitor itself. In the most common case, it is not the monitor that is calibrated: instead, a measuring device (a colorimeter or a spectrophotometer) is used to make changes to the look-up table of the video card. The resulting profile is then forced to make too many corrections, which degrades color reproduction. For example, if a monitor's native white point is 7000 K and its gamma is 2.2, bringing it into compliance with printing requirements (lowering the white point by 2000 K and the gamma by 0.4) will cost up to 40 gradations per channel. This is noticeable in everyday work, and such a device cannot be recommended for professional color work.

    If the monitor allows the brightness of its color channels to be adjusted, the range of adjustment is usually limited to a hundred steps, which is not enough for accurate setup. Some of the error will be compensated by the profile, but the inability to adjust the monitor's gamma will still cost up to 19 gradations per channel in the recalculation. Where a gamma setting is available, it usually applies only to 50% gray. For a better result, a color-oriented monitor should offer preset gamma values that comply with the standard.

    Best of all is hardware calibration of the monitor's own look-up table (LUT), leaving the video card's LUT untouched. Professional monitors with hardware calibration offer an internal LUT with up to 14-bit precision: not 256 gradations, as in a conventional monitor, but 16,384, which practically eliminates color inaccuracy.
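
    The quoted losses can be checked directly. The sketch below builds an 8-bit correction table that makes a gamma-2.2 monitor behave like gamma 1.8 (the gamma-only case discussed above) and counts how many distinct levels survive:

```python
def distinct_levels(src_gamma=2.2, dst_gamma=1.8, bits=8):
    """Remap an n-level ramp through a correction LUT so that a
    src_gamma display ends up with an overall dst_gamma response,
    then count how many distinct output codes remain."""
    n = 2 ** bits
    lut = [round(((v / (n - 1)) ** (dst_gamma / src_gamma)) * (n - 1))
           for v in range(n)]
    return len(set(lut))

levels = distinct_levels()
print(levels, 256 - levels)  # roughly 237 levels remain, i.e. about 19 lost
```

With a 14-bit internal LUT (16,384 entries) the same correction leaves all 256 8-bit input levels distinct, which is exactly the point of hardware calibration.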

    What will you prove?

    The monitor is calibrated, the system is configured, all profiles are in place, and the client is still unhappy, or at least unsure that everything is really right. The way out, besides properly organized viewing conditions (correct ambient light, no bright or dark spots in the field of view, and so on, which the reader probably knows very well), may be to certify the monitor according to a generally accepted standard such as UGRA. Some professional solutions allow this. The procedure is based on measuring gray balance over the entire dynamic range and over a set of colors, in this case from the UGRA/FOGRA Media Wedge. The result, with the maximum and average color deviations, can be saved as a PDF so that its accuracy can be verified. This can be an additional argument in favor of a printing house or prepress department that offers such a service.

    Unfortunately, the scope of this article does not allow discussing many other interesting issues related to color rendering in general and to monitors as tools for working with color in particular. The current state of the printing industry and market trends place new demands on all aspects of production. A professional monitor today is not just a device but rather an approach to solving a problem; behind its development stand many years of experience and serious research, which distinguish it from mass products. Of course, the price of the device is sometimes the deciding factor, but things here are far from as gloomy as many think. The arrival of new developers is already driving down the price of high-end solutions, and more and more models appear in affordable configurations without sacrificing functionality. This positive trend is another argument in favor of buying a professional monitor adapted for printing tasks, one that lets you see color on screen the way it should be.

    Let me remind you that last time I covered such marketing tricks as frankly inflated contrast ratios, unrealistic refresh rates, and exaggerated color gamut claims. Now we move on to another popular topic: 4K resolution.

    The first commercial TV supporting Ultra HD resolution appeared in Russian retail in 2012. It was the Sony BRAVIA KD-84X9005, an 84-inch model priced at 1,000,000 rubles. Since then, TV manufacturers have made a decent leap forward: in three years, a large number of such devices have gone on sale, and at quite reasonable prices. For three years the marketing machine has been spinning its virtual gears, so much so that features like 3D support and Smart TV have faded into the background.

    The editors of this site are paying more and more attention to Ultra HD solutions: reviews of 4K TVs are published regularly, and powerful gaming graphics cards are tested at 2160p. The Ultra HD era will obviously come into its own sooner or later. But that does not mean that today, after listening to sweet-voiced marketing pitches, you must immediately run to the store for a new TV.

    Marketing fluff: what is behind the "new technologies" in TVs. Part 2

    Did it ever really exist?

    What is Ultra HD? The simplest explanation is a very high resolution of 3840x2160 pixels. Ultra HD has two equivalent synonyms: 4K and 2160p. However, marketing creeps in at the very definition of the term. Let me try to explain.

    Popular resolution formats

    On October 22, 2012, the Consumer Electronics Association (CEA) approved the Ultra HD name and its minimum specifications. This was done by an anonymous vote of the working group's council. According to the official document, Ultra HD projectors, monitors and TVs must have at least 8 million active pixels: at least 3840 horizontally and at least 2160 vertically, with an aspect ratio of at least 16:9. In addition, the device must have at least one digital input capable of accepting a 3840x2160 video signal, that is, HDMI 1.4, HDMI 2.0 or DisplayPort. Such TVs, projectors and monitors receive the Ultra HD Ready label.
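
    The CEA minimums reduce to a simple predicate. A sketch (the function name and structure are mine, not part of the CEA document):

```python
def is_ultra_hd_ready(width, height):
    """CEA minimum for the Ultra HD label: at least 3840x2160
    active pixels (at least 8 million in total) at an aspect
    ratio of at least 16:9."""
    return (width >= 3840 and height >= 2160
            and width * height >= 8_000_000
            and width / height >= 16 / 9)

print(is_ultra_hd_ready(3840, 2160))  # True  (3840 * 2160 = 8,294,400 pixels)
print(is_ultra_hd_ready(1920, 1080))  # False (plain Full HD)
```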

    Logo symbolizing support for Ultra HD

    However, Ultra HD is a technology, not just the screen resolution mentioned above. The Japanese broadcaster NHK (Nippon Hōsō Kyōkai), rightfully considered the pioneer of UHD television, has been developing it for a long time. The Japanese began their 4K experiments back in 2003, but only in August 2012 (that is, before the CEA approved the Ultra HD name and minimum specifications) did the International Telecommunication Union (ITU), which celebrated its 150th anniversary this year, publish, based on NHK's work, a unified technical standard for Ultra HD television: ITU-R Recommendation BT.2020 (Rec. 2020). It has served ever since as the main reference point not only for equipment manufacturers but also for broadcasters. For clarity, the main characteristics of Rec. 2020 are given in the table below. As you can see, they greatly exceed the parameters of the current Rec. 709, adopted back in 1990 for HDTV. The difference between the two standards, above all in signal quality, is enormous.

    Color gamut comparison for popular TV formats

    But what about modern 4K panels? Most of them work with Rec. 709. There are also TVs on sale whose color gamut covers 98% or 90% of DCI-P3, but not Rec. 2020. In the previous installment of this series I already described how manufacturers brag about the increased color gamut of their products, implemented through hardware and software algorithms. In practice, it turns out that either it is of no use, or the device's built-in logic adjusts the source image to the "fictitious" palette and noticeably distorts the colors. Along with equipment that supports Rec. 2020, matching content must also appear; here, not only corporations such as NHK but also the leading film companies will have to make an effort.

    Ultra HD is not just a resolution of 3840x2160 pixels. It is a whole technology with serious signal-quality requirements

    So it turns out that modern 4K TVs, with the CEA's blessing, carry the Ultra HD Ready label while failing to fully comply with the more demanding ITU standard. In my opinion, this is marketing at its most ordinary: conventional HDTVs have simply been fitted with higher-resolution panels. Devices with real Ultra HD (read: Rec. 2020) are still a matter of the future, although it is worth acknowledging that progress is being made in this direction.

    Panasonic TC-65CX850U - 98% DCI-P3 color gamut TV

    Good enough as it is

    Let's continue with the point that Ultra HD is not only about resolution. The first commercial 4K TVs had problems from the start, which did not stop marketers from launching their relentless campaign. The UHD models of those years used the HDMI 1.4 interface, which could transmit a 4K signal only at 30 Hz. Today many models come with an HDMI 2.0 port, and the problem is partially solved. However, models with only HDMI 1.4 (including 2014 lines) can still be found on sale. If you do decide to buy such a device, by all means take a model with HDMI 2.0: it is a guarantee that the hardware of the "box" will not become obsolete within the next few years.

    Ultra HD TV must be equipped with HDMI 2.0

    A prime example is budget 4K TVs. Let me clarify right away: in current realities, "budget" means models costing 50-60 thousand rubles. Take the Philips 49PUS7809. This "box" has only HDMI 1.4 ports and does not support the H.265/HEVC codec; the built-in player cannot handle 4K content. Finally, by default the 49PUS7809 starts up at Full HD resolution. The declared 2160p can be activated in the settings, but even then 4K does not always work properly. The manufacturer, however, keeps silent about all this, focusing the potential buyer's attention on, and I quote, "unrivaled 4K Ultra HD image quality." Marketing? Marketing! The funny thing is that for the same price you can get a very good and functional Full HD TV. In short, don't chase pseudo-4K.

    An example of an inexpensive 4K model, the Philips 49PUS7809. Note how high its score is on Yandex.Market, even though this TV supports neither HDMI 2.0 nor the H.265/HEVC codec

    The same old song

    Even after three years there is very little publicly available 4K content, although there has been some progress. More and more consumer equipment can, for example, shoot video in Ultra HD, and popular foreign services (NETFLIX, Amazon Instant Video, ASTRA, PlayMemories Online and Privilege Movies 4K) have marked their presence in this market. When such online cinemas will appear in Russia is a good question. Marketers are untroubled by such inconsistencies: presentations show magnificent, specially prepared videos, while actual works available in Ultra HD are precious few. The main thing is to keep repeating the mantra that "4K captures four times the detail of conventional HD."

    "Look how many great movies are already available in 4K," Sony tells us. I count 68 films in four years. For comparison: according to Kinopoisk, 43 films were released in Russian theatrical distribution in October 2015 alone.

    External storage media should play an important role in promoting 4K content. However, the Ultra HD Blu-ray format was adopted only this year, on August 24, and the first commercial players will not appear until 2016. For the near future, then, our compatriots will have to rely on upscaling lower-resolution video to 4K.

    No matter what anyone says, there is still very little Ultra HD content

    In a few words, upscaling is the process of "stretching" lower-resolution video to 2160p by the TV's internal logic. Marketing comes into play here as well: manufacturers are not shy about claiming that their products scale the image superbly. Here is what the official Philips website says: "An Ultra HD TV has 4 times the resolution of a regular Full HD TV. With 8 million pixels and unique Ultra Resolution technology, image quality will not depend on the original content." In reality this is impossible in principle: there will always be a difference in quality between native 4K and upscaled 4K. All that remains is to find out how well a given TV handles the processing. The Panasonic VIERA TX-65AXR900, for example, does an excellent job of it, while the Samsung SUHD UE65JS9000TXRU has some problems.
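
    In its crudest form, what the scaler does can be sketched as nearest-neighbor interpolation. Commercial engines add far more elaborate filtering and sharpening, but the principle stands: no new detail is created.

```python
def upscale_nearest(image, factor):
    """Nearest-neighbor upscaling: every source pixel is simply
    repeated factor x factor times, so no new detail appears."""
    out = []
    for y in range(len(image) * factor):
        src_row = image[y // factor]
        out.append([src_row[x // factor] for x in range(len(src_row) * factor)])
    return out

small = [[10, 20],
         [30, 40]]
for row in upscale_nearest(small, 2):
    print(row)
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]
```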

    TV Panasonic VIERA TX-65AXR900. One of the few 4K models that does an excellent job of upscaling video to Ultra HD resolution

    Four times stronger

    Let's suppose the content shortage is solved in the near future. Throughout this post I have been quoting TV manufacturers' claims that 4K is four times sharper than Full HD. This is one of the most common marketing statements, and it seems logical: Ultra HD resolution is indeed four times that of Full HD. But many people confuse higher resolution with better image quality, and the confusion applies not only to large TVs but also to tiny smartphones. Such a definition of image clarity simply ignores the distance from which the viewer looks at the screen.

    Optimal TV viewing distance based on screen size and resolution

    There are several methods for determining the optimal TV viewing distance depending on screen size and resolution, and even special calculators. I see no reason to argue over the correctness of any particular scheme, but in front of a Full HD "box" with a 55-inch diagonal you need to sit at a distance of about 2-2.5 meters. For Ultra HD, the distance shrinks to 1-1.5 meters. As a result, the viewer only has to sit a little farther back for the perceived detail to drop noticeably: at a distance of 2.5-3 meters, Ultra HD will not differ from Full HD.
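
    These figures follow from a simple model: the eye resolves roughly one arcminute, so the "optimal" distance is where a single pixel subtends about that angle. The one-arcminute threshold and the flat-screen geometry are assumptions of the sketch, not part of any particular calculator:

```python
import math

def optimal_distance_m(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance at which one pixel subtends ~1 arcminute; sitting
    farther away, individual pixels can no longer be resolved and
    extra resolution stops being visible."""
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_m = width_m / horizontal_pixels
    return pixel_m / math.tan(math.radians(1 / 60))

print(round(optimal_distance_m(55, 1920), 2))  # ~2.18 m for a 55" Full HD panel
print(round(optimal_distance_m(55, 3840), 2))  # ~1.09 m for a 55" Ultra HD panel
```

Doubling the horizontal pixel count exactly halves the distance, which matches the 2-2.5 m versus 1-1.5 m figures above.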

    4K image clarity depends on viewing distance

    At the very beginning of the article I mentioned the very first commercial 4K TV from Sony. During its testing, when watching specially prepared Ultra HD video, we were advised to sit at a distance of 1.6-2 meters. At first it seemed utopian, but in fact watching video on the BRAVIA KD-84X9005's screen turned out to be as comfortable as reading a newspaper, even though the distance between the screen and the viewer was less than the diagonal of the device itself (2.13 m). This leads to a simple conclusion: there is no point in buying a 4K TV with a diagonal of less than 55-60 inches. Sitting 2-3 meters away, you simply will not feel the effect of the ultra-high resolution.

    I have only one question: why?

    Entertainment in Ultra HD

    Recently, questions about buying a UHD TV for gaming have become more frequent, and marketers are hard at work in this field as well. Everything seems logical: 4K resolution lets you sit very close to the TV; all you have to do is get the right equipment. But the latest-generation consoles, the Sony PlayStation 4 and Microsoft Xbox One, will not do: they often struggle even with 1080p. There are rumors that 4K versions of these consoles may soon be announced, but this concerns multimedia playback rather than the games themselves, in particular via the NETFLIX service.

    Ultra HD TV and gaming computer - a very expensive tandem

    It turns out that the only way to game on a UHD TV is to buy a powerful computer. Video card manufacturers, for their part, are actively promoting the idea of "true" 4K gaming. Unfortunately, today only a handful of graphics adapters can cope with modern computer games at near-maximum settings in Ultra HD resolution, and even then with difficulty. Regular visitors to the site who follow computer hardware have seen this more than once. Gaming in 4K requires a very powerful computer, which can easily cost over $2,000.

    Marketing 2-in-1

    Ultra HD and curved screens are the most hyped "innovations" of the last two years, and they are closely intertwined. The main message for this type of device is simple: the curved surface and 4K let you immerse yourself more deeply in what is happening on screen. Here, for example, is how Samsung puts it: "Samsung's revolutionary curved SUHD TV lets you immerse yourself in fantastic virtual reality and feel yourself at the center of what's happening on screen."

    From the WebKit blog:

    The last few years have seen significant improvements in display technology. First came the move to higher-resolution screens, starting with mobile devices and then spreading to desktops and laptops; web developers had to understand what high DPI meant for them and how to design pages for it. The next big improvement in displays is happening right now: better color reproduction. In this article, I'd like to explain what that means and how you, as developers, can detect these displays and provide a better experience for your users.

    Take a typical computer monitor - the type you've been using for over a decade - an sRGB display. Recent Apple designs, including the Retina iMac (Late 2015) and iPad Pro (Early 2016), can show more colors than an sRGB display. Such displays are called wide color gamut displays (an explanation of the terms "sRGB" and "gamut" will be given later).

    Why is this useful? A display with a wide color gamut can often reproduce the original colors more accurately. For example, my colleague Hober owns a pair of flashy sneakers.

    Hober's bright orange sneakers

    Unfortunately, what you see above does not convey how impressive these shoes really are! The problem is that the color of the shoe material cannot be represented on an sRGB display. The camera that this photo was taken with (Sony a6300) has a sensor that perceives more accurate color information, and the corresponding data is in the original file, but the display cannot show it. Here's a variant of the photo, in which every pixel that has a color that goes beyond the boundaries of a typical display is replaced with light blue:


    The same bright orange Hober sneakers, but here all the out-of-gamut pixels are replaced with blue

    As you can see, the color of the sneaker material and a large part of the grass extends beyond what an sRGB display can show. In fact, less than half of the pixels represent their colors accurately. As a web developer, you need to be aware of this. Imagine that you are selling these sneakers through an online store: your customers won't know exactly what color they ordered and may be surprised when their purchase arrives.
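    The blue-pixel substitution described above comes down to a per-pixel gamut test: convert each Display-P3 color to linear sRGB and check whether any channel falls outside [0, 1]. Below is a rough sketch of that math; the matrices are the standard D65 P3-to-XYZ and XYZ-to-sRGB conversions, and this is my illustration rather than the exact pipeline the author used.

```javascript
// Sketch: test whether a Display-P3 color lies outside the sRGB gamut.
// Display-P3 shares sRGB's transfer curve, so we linearize,
// convert P3 -> XYZ -> linear sRGB, then check the channel range.

// sRGB/Display-P3 transfer function (decode to linear light)
const toLinear = c => (c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4));

// Linear Display-P3 (D65) -> CIE XYZ
const P3_TO_XYZ = [
  [0.4865709, 0.2656677, 0.1982173],
  [0.2289746, 0.6917385, 0.0792869],
  [0.0000000, 0.0451134, 1.0439444],
];

// CIE XYZ -> linear sRGB
const XYZ_TO_SRGB = [
  [ 3.2404542, -1.5371385, -0.4985314],
  [-0.9692660,  1.8760108,  0.0415560],
  [ 0.0556434, -0.2040259,  1.0572252],
];

const mul = (m, v) => m.map(row => row[0] * v[0] + row[1] * v[1] + row[2] * v[2]);

function p3ToLinearSrgb([r, g, b]) {
  const linP3 = [toLinear(r), toLinear(g), toLinear(b)];
  return mul(XYZ_TO_SRGB, mul(P3_TO_XYZ, linP3));
}

// Out of gamut if any linear sRGB channel escapes [0, 1] (epsilon for rounding)
const outsideSrgb = p3 => p3ToLinearSrgb(p3).some(c => c < -1e-4 || c > 1 + 1e-4);

console.log(outsideSrgb([1, 0, 0]));       // pure P3 red: true (no sRGB equivalent)
console.log(outsideSrgb([0.5, 0.5, 0.5])); // neutral gray: false
```

    Note that neutral colors survive the round trip unchanged, while saturated P3 reds and greens land outside [0, 1] in linear sRGB, which is exactly why they had to be painted over in the illustration.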

    This problem is reduced when using a display with a wide color gamut. If you have one of the devices mentioned above, or similar, then here is a photo option that will show you more colors:


    The same bright orange Hober sneakers, but with a color profile added.

    On a wide-gamut display, you can see the sneakers in a brighter orange, and the greens of the grass are more varied too. If, unfortunately, you do not have such a display, you are most likely seeing something very close to the first photo. In that case, the best I can do is highlight the areas of the image whose color you are missing.

    Anyway, this is good news! Wide color gamut displays are brighter and provide a more accurate representation of reality. Naturally, you will want to make sure you can take advantage of this technology for your users.

    Below is another example, this time with a generated image. Users on an sRGB display see a uniform red square. This is a bit of a trick, though: the image actually contains two shades of red, one of which can be distinguished only on a display with a wide color gamut. On such a display, you will see a faint WebKit logo inside the red square.


    Red square with pale WebKit logo

    Sometimes the difference between a normal image and a wide color image is very subtle. Sometimes it is expressed much more sharply.
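    The red-square image is one way to probe a display; in browsers that expose the CSS `color-gamut` media query through `matchMedia` (an assumption about the reader's environment, not something this excerpt states), the same check can be made from script. A minimal sketch:

```javascript
// Sketch: detecting a wide-gamut (Display-P3) screen from script.
// Assumes a browser exposing window.matchMedia and the
// "(color-gamut: p3)" media query; returns false elsewhere (e.g. Node).
function supportsP3() {
  if (typeof window === "undefined" || !window.matchMedia) return false;
  return window.matchMedia("(color-gamut: p3)").matches;
}

console.log(supportsP3() ? "wide gamut display" : "sRGB (or unknown) display");
```

    A page could use such a check to swap in wide-gamut image variants only for users who can actually see the difference.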

    WebKit looks forward to implementing these features when we're sure they're worth it.

    Wide color gamut in HTML

    While CSS covers most of what you see in an HTML document, there is one important area it does not reach: the canvas element. Both 2D and WebGL canvases assume they work in the sRGB color space, which means that even on a display with a wide color gamut, it is impossible to create a canvas that uses the full range of colors.

    As a solution, it is proposed to add an optional flag to the getContext function specifying the color space the canvas should use. For example:

    // NOTE: Proposed syntax. Not yet implemented.
    canvas.getContext("2d", { colorSpace: "p3" });
    This brings up some points to consider, such as how to create canvases that have increased color depth. For example, in WebGL, you can use half-float textures that give 16 bits of precision per color channel. However, even if such deeper textures are used in WebGL, you will be limited to 8-bit precision when embedding this WebGL image in the document.

    Developers also need a way to set the color buffer depth of a canvas element.

    Things are further complicated by the getImageData/putImageData functions (and the WebGL equivalent, readPixels). With today's 8-bits-per-channel buffers, there is no loss of precision when moving data into and out of the canvas, and the conversion can be efficient in both performance and memory, because the canvas data and the program data share the same type. With a different color depth, this is no longer guaranteed. For example, a WebGL half-float buffer has no equivalent type in JavaScript, which forces either a data conversion on every read or write (and extra memory to store the result), or working with the raw array buffer and performing cumbersome bit-mask arithmetic.
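    To make that "bit-mask arithmetic" concrete, here is a sketch (my illustration, not WebKit code) of packing a JavaScript number into IEEE 754 half-float bits, the kind of conversion a script would have to do by hand to write into a half-float buffer:

```javascript
// Sketch: packing a JS number into IEEE 754 half-float (binary16) bits.
const f32 = new Float32Array(1);
const u32 = new Uint32Array(f32.buffer); // view the same bytes as bits

function floatToHalfBits(value) {
  f32[0] = value;
  const x = u32[0];
  const sign = (x >>> 16) & 0x8000;
  let exp = (x >>> 23) & 0xff;
  const mant = x & 0x7fffff;
  if (exp === 0xff) return sign | 0x7c00 | (mant ? 0x200 : 0); // Inf / NaN
  exp = exp - 127 + 15;               // re-bias exponent: float 127 -> half 15
  if (exp >= 0x1f) return sign | 0x7c00; // overflow -> Inf
  if (exp <= 0) return sign;             // underflow -> 0 (denormals flushed here)
  return sign | (exp << 10) | (mant >>> 13); // truncate mantissa 23 -> 10 bits
}

console.log(floatToHalfBits(1.0).toString(16));  // "3c00"
console.log(floatToHalfBits(-2.0).toString(16)); // "c000"
```

    Even this simplified version (it truncates instead of rounding and flushes denormals to zero) shows how much heavier the path is compared with simply copying 8-bit channel values.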

    Such discussions are under way at the WHATWG and will continue soon at the W3C. Once again, we invite you to join in.

    Conclusions

    Wide color gamut displays have entered the market and are the future of computing devices. As the number of users with these displays grows, developers will become more interested both in mastering the wider palette of colors on offer and in delivering an ever more compelling web experience.

    WebKit software gives developers a lot of power to improve color performance through color matching and gamut detection, available today in Safari Technology Preview, as well as in macOS Sierra and iOS 10 betas. We're also interested in starting to implement more advanced color features, such as wide gamuts in CSS, introducing profiles to canvas elements, and using increased color depth.


    Almost everything that the user does on the iPhone is reflected on its display. This is where we look at photos, read messages, browse websites. Apple's new generation of smartphones, unveiled on September 7th, features the brightest and most colorful Retina display ever seen on an iPhone. Now iPhone has an even wider cinematic-standard color gamut and richer colors.

    On the iPhone 7 and iPhone 7 Plus displays, photos and videos look even more realistic and more immersive thanks to the expanded color gamut. The Wide Color technology provides the highest color fidelity, unattainable for "ordinary" display panels.

    The displays on iPhone 7 have a wider color gamut, making on-screen colors appear brighter and more realistic: more shades, a wider dynamic range, and more accurate rendering of every color. The smartphone's display operates in the same color space as the digital cinema industry.


    On "regular" displays, the picture is filled with one color, on Wide Color the WebKit logo is visible

    “The Retina HD display with wide color gamut delivers cinematic color reproduction. More shades of the spectrum are used for each image, so everything looks truly realistic on the screen. Whether you're viewing a collection of wedding dresses or Live Photos of tropical landscapes, the colors will be so natural that you won't be able to distinguish them from reality,” Apple says.

    The more accurate the colors, the more vivid and natural the picture on screen. Standard smartphone screens limited to the sRGB color space display significantly fewer shades than exist in reality. The display panels in iPhone 7 use the DCI-P3 gamut, whose color space is about 25% larger. With more colors, images look brighter and more realistic, and reveal more detail in every photo.

    Apple first used the DCI-P3 color space in the latest generation of iMac all-in-ones. This is the color space used in modern cinemas; it covers a larger part of the naturally occurring spectrum, which makes a serious improvement in color realism possible.

    According to Apple, the iPhone uses the best color rendering system of any smartphone on the market.