Hacker News

Brajeshwar
sRGB Gamut Clipping (2021) bottosson.github.io

herf13 days ago

It's a good generalization of several techniques. The main thing I want to know is this: how does it look with actual HDR exposures, not the synthetic ones made by "increasing exposure and colorfulness significantly"? We shouldn't choose a method based on how natural the result is, when there is a "stretching" step like this that is not at all natural.

jacobolus13 days ago

I think any kind of hard clipping approach which takes each pixel independently is going to necessarily create some significant artifacts, though as the examples here make clear, some directions of clipping work better than others. The most important part to preserve for the image to look reasonable is lightness contrast, as when it gets crunched away the image loses visible detail, and the better methods demonstrated in this post manage to save at least some lightness contrast. But all of the methods here end up blowing out / compressing away chroma contrast in some regions where it existed in the original image.

What I'd like to see someone try is to record the relative lightness and chroma, and (adaptation/context dependent) color relationships from the original image, and then try to preserve those to the extent practical even when bringing out-of-gamut colors into gamut. This will necessarily require modifying in-gamut colors to some extent as well.

This is what good Photoshop users do when they manually adjust the color of an image from one display medium to another, but it involves some amount of careful conscious choice and skill.

pixelpoet13 days ago

OMG, it's Mike Herf! I even linked your "give me a gigabyte" article randomly yesterday to someone complaining of an application using 1gb on his 64gb machine, and remember most of your website off by heart still, in particular the soft shadows / roundrects, all the way to random funny things like SreeD :) Thanks so much for the great articles <3

mattdesl13 days ago

I’ve implemented[1] some of these algorithms into @texel/color, a modern JS color library, and it also supports gamut mapping to certain wide gamut color spaces (Display P3, Rec2020, Adobe 1998) rather than just sRGB.

https://github.com/texel-org/color

Many popular color libraries (Colorjs.io, culori) attempt to match the CSS gamut mapping spec, which is an order of magnitude slower than the approach in Ottosson’s blog post, and also less accurate (CSS gamut mapping may not fall neatly on the gamut boundary).

[1] “Ported” might be a better term as I used a combination of Ottosson’s own JS OKHSL picker, Colorjs.io code, and Coloraide (Python), and adjusted it for performance, more gamuts, and smaller bundle sizes.

jacobolus12 days ago

The method described at https://drafts.csswg.org/css-color/#css-gamut-mapping is equivalent to "Keep lightness constant, only compress chroma" from this blog post. They presumably picked that one because it is straightforward and should be good enough for the purpose at hand, which is figuring out what to do when someone puts an out-of-gamut color in their CSS.

The CSS spec doesn't say you have to implement this gamut clipping method via binary search. It's just one possible algorithm that is straightforward to describe so that people can get the intended result with a naïve implementation. Other clipping methods and better algorithms for accomplishing them require significantly more explanation with a bunch of math and color-model-dependent data.

The binary search approach is plenty fast for the CSS use case involving single isolated colors, but would not be appropriate for e.g. adapting photographic images from one output medium to another.
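For anyone curious what that looks like in practice, here is a minimal Python sketch of the "hold lightness and hue, binary-search chroma" idea in OKLCh. It is not the exact spec algorithm (the real one adds a deltaE-OK tolerance and a final clip); the function names are made up for this example, and the matrix coefficients are the published Oklab ones from Ottosson's posts.

    import math

    def oklch_to_linear_srgb(L, C, h_deg):
        """OKLCh -> Oklab -> linear sRGB; values fall outside [0, 1] when out of gamut."""
        a = C * math.cos(math.radians(h_deg))
        b = C * math.sin(math.radians(h_deg))
        # Oklab -> non-linear LMS, then undo the cube root
        l_ = L + 0.3963377774 * a + 0.2158037573 * b
        m_ = L - 0.1055613458 * a - 0.0638541728 * b
        s_ = L - 0.0894841775 * a - 1.2914855480 * b
        l, m, s = l_ ** 3, m_ ** 3, s_ ** 3
        # LMS -> linear sRGB
        return (
            +4.0767416621 * l - 3.3077115913 * m + 0.2309699292 * s,
            -1.2684380046 * l + 2.6097574011 * m - 0.3413193965 * s,
            -0.0041960863 * l - 0.7034186147 * m + 1.7076147010 * s,
        )

    def in_srgb_gamut(rgb, eps=1e-6):
        return all(-eps <= c <= 1 + eps for c in rgb)

    def gamut_map_keep_lightness(L, C, h_deg, iterations=32):
        """Keep L and hue fixed; binary-search the largest chroma that stays inside sRGB."""
        if in_srgb_gamut(oklch_to_linear_srgb(L, C, h_deg)):
            return L, C, h_deg
        lo, hi = 0.0, C
        for _ in range(iterations):
            mid = (lo + hi) / 2
            if in_srgb_gamut(oklch_to_linear_srgb(L, mid, h_deg)):
                lo = mid  # still in gamut, try more chroma
            else:
                hi = mid  # out of gamut, back off
        return L, lo, h_deg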

smittywerben13 days ago

I used to ignore all of this color stuff until I made a custom health potion asset (Note: not an artist). It looked surprisingly good. I sent it to my friend on Discord and immediately noticed a change: it converted my beautiful Blood Red into Bog Water Brown. I had an existential crisis. Is every Blood Red on the web converted to Bog Water Brown? Does sRGB's red not go more "red" than that? Now every red on the internet looks like brown to me.

cmovq13 days ago

In games it’s common to have a tone mapping step [1] to map the HDR image to sRGB while maintaining pleasant colors.

The exposure parameter is usually dynamically chosen by using the average brightness of a previous frame.

[1]: http://filmicworlds.com/blog/filmic-tonemapping-operators/
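To make that concrete, here is a rough Python sketch of the pipeline the comment describes: pick an exposure from the previous frame's average (log-mean) luminance, run each pixel through a filmic curve, then encode for display. The curve and its constants are the Hable / Uncharted 2 operator from the linked post; the 0.18 "key", the 11.2 white point, and the function names are conventional choices rather than anything from the article.

    import math

    def hable(x):
        # Uncharted 2 filmic curve constants (shoulder, linear section, toe, ...)
        A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
        return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

    def log_average_luminance(luminances):
        """Geometric mean of scene luminance, a common adaptation value from the last frame."""
        return math.exp(sum(math.log(max(l, 1e-4)) for l in luminances) / len(luminances))

    def tonemap_pixel(rgb_linear, avg_luminance, key=0.18, white_point=11.2):
        exposure = key / max(avg_luminance, 1e-4)    # auto-exposure from the previous frame
        white_scale = 1.0 / hable(white_point)       # normalise so the chosen white maps to 1.0
        mapped = [hable(c * exposure) * white_scale for c in rgb_linear]
        # linear -> display encoding (a simple 1/2.2 gamma here, standing in for proper sRGB)
        return [min(max(c, 0.0), 1.0) ** (1 / 2.2) for c in mapped]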

SideQuark13 days ago

Those ideas fail for anyone with a modern screen, which goes far beyond sRGB and its ancient 80-nit brightness. I doubt there's a phone, laptop, PC monitor, or TV made with such low limits now.

TeMPOraL13 days ago

Ah so that's why so many movies, shows and even videogames got so dark you can barely see a thing, unless you're viewing them on a relatively recent TV?

SideQuark13 days ago

That and overall poor color management practices. Most likely this will all get smoothed out as specs, knowledge, and ecosystems mature.

jiggawatts13 days ago

Sooo... there's a whole story here of interacting forces, technology advancements, etc...

I've recently dipped a toe into this space because I got myself a camera that can shoot 8K HDR "raw" footage with 12-bit color. As I learned to edit and process this into something I can upload to YouTube I got a glimpse into the madness that's going on in the industry.

Basically, digital cameras got really good really quickly, and at the same time OLED monitors and super-bright LCDs with local dimming became available for mere mortals.

The industry meanwhile was stuck in the mindset that there is only one standard, and it is basically "whatever the RGB phosphors of a CRT TV made in the 1980s did". The software, the hardware, the entire pipeline revolved around some very old assumptions that were all violated by advancing technology. The changes were so rapid that mistakes are still common.

Essentially, video editing tools had to "grow up" and deal with color management properly, but there was an awful lot of push-back from both the editors/colorists, and the vendors themselves.

Examples:

- Professional grading monitors have buttons on the side to emulate various color spaces. These buttons generally don't "report back" the active color space to the OS or the grading software. It's possible to complete an entire project and not notice that what you see on your setup is not at all what anyone else will see. ("Oops.")

- Some OLED grading monitors are so good now that in a dark room you won't notice that you've accidentally packed the brightness range into the bottom 10% of the signal range. (This is basically what happened with that episode of Game of Thrones.)

- Both recording devices and color grading software like DaVinci Resolve treat color "artistically" and are basically incapable of the equivalent of "strong typing". Video from the camera is often not tagged with the color space used, which is just crazy to me, but this is the "way things are done"! Similarly, these tools generally don't have a strict mapping into the working space; they allow overrides and it's all very flexible and squishy.

- Colorists are so used to the specifics of old formats that they think in terms of RGB values in the encoded output space. Same as Photoshop users who think in terms of 255,127,0 being a specific color instead of a specific encoding. ("In what space!?") This extends to tooling like Resolve that shows the output-space values instead of the actual color space in all controls.

- Video cards and monitors "do their own thing". Without closing the loop with a hardware calibrator, there is absolutely no way to know what you're actually seeing is what others will see.

- The software has a mind of its own too. It's absurdly difficult to avoid "black crush" especially. It just happens. Forums are full of people basically fiddling with every combination of checkboxes and drop-down options until it "goes away"... on their specific monitor and driver setup. Then it looks too bright on YouTube... but not NetFlix. Or whatever.

- Basic assumptions one might have about HDR editing are still mired in the SDR world. E.g.: I would expect a fade-out to be equivalent to turning down exposure of the camera. It isn't! Instead it's the equivalent of blending the output tone mapped image with black, so bright parts of the scene turn grey instead of less bright.
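A tiny Python sketch of that last point, with Reinhard's x/(1+x) standing in for whatever tone mapper is actually used (the numbers are arbitrary): scaling scene-linear light before tone mapping behaves like stopping down the camera, while blending the tone-mapped output with black just drags everything toward grey.

    def reinhard(x):
        return x / (1.0 + x)

    bright = 50.0   # a very bright scene-linear value, e.g. a practical light
    fade = 0.1      # 10% of the way into a fade-to-black

    print(reinhard(bright * fade))   # exposure-style fade: the light still reads bright (~0.83)
    print(reinhard(bright) * fade)   # output-blend fade: the light is pinned at dark grey (~0.10)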

For reference, outside of the video editing space, tools like Adobe Lightroom (for stills photo editing) are much more strict and consistent in the way they work. RAW stills photos are always tagged with the input gamut and color space, are automatically mapped to a very wide gamut that won't "clip" during editing, and all editing controls operate in this ultra-HDR space. It's only the final output step that tone maps into the target HDR or SDR space.

As a random example of this, switching from SDR to HDR mode in Lightroom will just make the highlights get brighter and more colorful. In DaVinci Resolve, unpredictable brightness shifts will occur throughout the image.

Etherlord8712 days ago

Let me add two things to this madness that come to my mind right off the bat:

- observer metamerism (somewhat obvious)

- I don't have a name for this one, but once (I tried it again at some point and couldn't reproduce it) I dragged a window between my 2 monitors. Once it was mostly on the 2nd monitor, the image colors suddenly shifted in hue. Not just on the 2nd monitor, on the 1st one too. And the colors were obviously wrong (way too purple) on both monitors.

strogonoff13 days ago

If you shoot video raw, do yourself a favour and use proper development tools to deal with footage. As an example, RawTherapee uses probably the most “strongly typed” approach to colour (and RawPedia is a treasure trove of advice for nearly every step starting with pre-production, such as creating relevant calibration profiles and flat/dark frames).

jiggawatts12 days ago

Still image RAW editing has mostly been correct for the popular tools for about a decade now, maybe longer.

The history behind this is that a 12-bit or even 14-bit still image (photo) is big but "not that big" (100 MB at most), so it was possible to process them in a wide-gamut HDR space for a long time now even on very slow hardware.

For video, even 10-bit support is pushing the limits of computer power! Sure, playback has been sorted for a while now, but editing of multiple overlapping pieces of footage during a transition can be a problem. Unlike with a still photo, the editing software has to be real-time, otherwise it stutters and is unusable.

Consider that Lightroom does everything in a floating point 32-bit linear space, because... why not? It takes a lot of computer power, sure, but cheap enough not to matter in practice.

Video editing tools try to keep video in the original 8-bit or 10-bit encoding as much as possible for performance.

There's also another reason: older cameras would output only 8-bit files with maybe 7 bits of dynamic range in them, so if you manipulated them too much things would fall apart. Just the floating point rounding error converting to a linear space and back would cause visible banding! So all editing tools tried to operate in the camera's native encoding space to minimise precision issues. E.g.: many video editing tools have controls that just add a constant value in the output space, so 255,127,10 is mapped to 255-5,127-5,10-5 = 250,122,5. This is naive and "wrong" but it preserves the dynamic range as much as possible.
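As a rough illustration of that precision point (assumptions: the helper names are mine, and I'm quantising the intermediate to 8-bit linear, which is a cruder version of the roundoff described above, just to make the loss visible):

    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    survivors = set()
    for code in range(256):                                    # every 8-bit sRGB code value
        lin8 = round(srgb_to_linear(code / 255) * 255)          # quantise the linear value to 8 bits
        survivors.add(round(linear_to_srgb(lin8 / 255) * 255))  # and convert back
    print(256 - len(survivors), "code values lost")             # dozens of (mostly dark) codes collapse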

strogonoff11 days ago

Developing a raw (not an acronym) into some wide gamut space and applying whatever transitions at that point is still fine.

jiggawatts11 days ago

This isn't just "fine", it's essentially the only sane thing to do. That intermediate space ought to be "linear light" so that operations like resize or blur work properly.

This is the only mode in which Lightroom operates. There's basically no other way of using it.

It's not only not the default in video editing tools, it's decidedly non-standard and you have to go out of your way to enable it. Then everything breaks because the tools still have too many baked-in assumptions that you're operating in the output space or some other non-linear space.

In a tutorial video one professional Hollywood colorist said something like: "This color space has a gamma curve of blah-blah-blah, which means that the offset control will apply an exposure compensation."

That blew my mind: edit controls that change behaviour based on the input and output formats to the point that "addition" becomes "multiplication"!!

SideQuark3 days ago

Addition becoming multiplication is simply the difference between linear and gamma-encoded space. Gamma curves are mostly log-like, since that maximizes bit usage in formats, and given two linear colors A and B, adding their encoded values gives log A + log B = log(A*B).
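A tiny numeric check of that identity, assuming an idealised pure-log encoding (real transfer functions are only roughly log-like): adding an offset to the encoded value comes back out of the decode as a multiplicative gain, i.e. an exposure change.

    import math

    linear = 0.25                      # some linear-light value
    offset = 0.3                       # an "offset" slider applied in the encoded domain

    encoded = math.log2(linear)        # idealised log encoding
    back = 2 ** (encoded + offset)     # decode after the offset

    print(back, linear * 2 ** offset)  # identical: the offset became a gain of 2**offset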

strogonoff11 days ago

Sure. So back to your previous point: when you develop the source CinemaDNG footage you do not deal with any transitions, are not cutting it, and do not need it to be real-time. It is simply a bunch of stills, which you process into a straightforward sequence of pre-graded PNGs in your desired colour space, which you are then free to edit together however you like at blazing fast speeds. Unless you go crazy on some very specific transitions that don't work in the final colour space (and who uses transitions in film these days anyway), you do not need raw footage at edit time, do you?

I guess I know the answer: if you work in this industry then punching in/out or stabilizing in post is a common requirement, and you mentioned some tools can mess up even at the resizing stage.

jiggawatts9 days ago

I’m not a professional, I just “dabble”.

Something I’ve noticed in fields where I am a professional is that it’s the “WTFs per hour” from intelligent beginners that best measures how broken some ecosystem is.

Video editing is very high on the WTF/hr metric.

strogonoff9 days ago

Then try the workflow I described. Too cumbersome and exotic for a pro, with somewhat clumsy or nonexistent GUIs, but almost entirely open source and with very few WTFs.

qingcharles13 days ago

Madness is the right word. The situation is mad, and you'll drive yourself mad trying to get things to "look right."

I hate this situation because all the hardware available now is great, but none of the software works properly together.

And of course, it looks great on your hardware calibrated HDR OLED display with every piece of software using the correct color profiles, but then it looks like a turd on grandma's Windows 7 PC.

Don't even read this or it's straight to the asylum:

https://www.mux.com/blog/your-browser-and-my-browser-see-dif...

jiggawatts13 days ago

Oh absolutely, the entire field is just gibbering eldritch madness and arguments to the contrary are basically this: https://knowyourmeme.com/memes/this-is-fine

Sharing HDR content as intended is basically impossible outside of a major streaming vendor like NetFlix or Apple TV. That's academic anyway, because Dolby Vision is unavailable to mere mortals. YouTube mostly works most of the time, but still has maddening issues like processing SDR first, and then HDR some unspecified time later. Eventually. Maybe. Just keep refreshing!

It blows my mind that pretty much the only easy consumer-grade HDR content sharing that Just Works is Apple iMessage.

There's basically nothing else.

Try uploading a HDR still image or a HDR video to any social site and see what happens!

Or email it...

Or put it in a web page.

Or anything that involves another, unspecified device.

SSLy13 days ago

Instagram handles HDR content fairly smoothly too.

strogonoff13 days ago

Clipping is a danger not just with sRGB gamut but in any case where you process a wider dynamic or colour range into a narrower one—for example, that is essentially what photography is (because a sensor capable of capturing the full range of human sight does not exist, and neither does display media capable of reproducing the range of values captured by a modern digital camera sensor).

If you are a photographer, this process starts with camera settings at shooting time and ends with delivery to preferred display space. I believe the whole process can or even should be a creative one rather than purely algorithmic, and fitting dynamic range is an opportunity for you to accentuate/attenuate the right aspects of the image to convey a desirable effect.

(This is not unlike microphone placement in audio recording, or equalisation/compression in subsequent mixing. First you deal with extreme range of actual physical sound and limit it to capabilities of recording media, then you further fit it to the range of audio reproducing capabilities of consumer equipment and to modern listening conditions, and throughout the entire process you accentuate and attenuate.)

suzumer13 days ago

I haven't gone through the whole article, but it seems to be conflating chroma and saturation. If the lightness of a color is scaled by a factor c, then chroma needs to be scaled by that same factor, or saturation won't be preserved, and the color will appear more vibrant than it should.
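A small sketch of the relationship this relies on, taking saturation ≈ chroma / lightness (as in CIELUV-style saturation correlates; the numbers are arbitrary):

    def saturation(chroma, lightness):
        return chroma / lightness

    L, C = 60.0, 30.0
    c = 0.5                              # scale lightness down by half
    print(saturation(C, L))              # 0.5  original
    print(saturation(C, c * L))          # 1.0  lightness-only scaling: looks more vivid
    print(saturation(c * C, c * L))      # 0.5  chroma scaled too: saturation preserved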

refulgentis13 days ago

Well, no, it's not straight up scaling.

(Not directed at you) Color science is a real field, CAM16 addresses all of the ideas and complaints that anyone could have, and yet, because it's 400 lines of code, we are robbed of principled, grounded color. Instead people reach for the grab bag of simple algorithmic tricks.

itishappy13 days ago

> CAM16 addresses all of the ideas and complaints that anyone could have...

Here's some complaints that better color scientists than me have had about CAM16:

> Bad numerical behavior, it is not scale invariant and blending does not behave well because of its compression of chroma. Hue uniformity is decent, but other models predict it more accurately.

https://bottosson.github.io/posts/oklab/

Here's more:

> Although CAM16-UCS offers good overall perceptual uniformity it does not preserve hue linearity, particularly in the blue hue region, and is computationally expensive compared with almost all other available models. In addition, none of the above mentioned color spaces were explicitly developed for high dynamic range applications.

https://opg.optica.org/oe/fulltext.cfm?uri=oe-25-13-15131

Color is hard.

refulgentis13 days ago

You've discovered my White Whale.

It spells out a CAM16 approximation via 2 matmuls, and you are using it as an example of how CAM16 could be improved.

The article, and Oklab, is not by a color scientist. He is/was a video game developer taking some time between jobs to do something on a lark.

He makes several category errors in that article, such as swapping in "CAM16-UCS" for "CAM16", and most importantly, he blends polar opposite hues in cartesian coordinates (blue and yellow), and uses the fact this ends up in the center (gray) as the core evidence for not liking CAM16 so much.

> better color scientists than me

Are you a color scientist?!

Sesse__13 days ago

> The article, and Oklab, is not by a color scientist. He is/was a video game developer taking some time between jobs to do something on a lark.

As a non-color scientist sometimes dealing with color, it would probably be nice if the color scientists came out sometimes and wrote articles that are as readable as what Ottosson produces. You can say CIECAM16 is the solution as much as you want, but just looking at the CIECAM02 page on Wikipedia makes my brain hurt (how do I use any of this for anything? The correlate for chroma is t^0.9 * sqrt(J/100) * (1.64 - 0.29^n)^0.73, where J comes from some Cthulhu formula?). It's hard enough to try to explain gamma to people writing image scaling code; there's no way ordinary developers can understand all of this until it becomes more easily available somehow. :-) Oklab, OTOH, I can actually relate to and understand, so guess which one I'd pick.

suzumer13 days ago

Mark Fairchild, one of the authors of CIECAM02, recently published a paper that heavily simplified that equation: https://markfairchild.org/PDFs/PAP45.pdf

If the link doesn't work, the paper is called: Brightness, lightness, colorfulness, and chroma in CIECAM02 and CAM16.

Also, if you want a readable introduction to color science, you can check out his book Color Appearance Models.

jacobolus12 days ago

Thanks for the link! To anyone looking for a summary: Fairchild's paper explains the origin and nature of various arbitrary, empirically/theoretically unjustified, and computationally expensive complications of CAM16 (from Hunt's models from the 80s–90s via CIECAM02 and CIECAM97s), which apparently originated as duct-taped workarounds that are no longer relevant but were kept out of inertia. And it proposes alternatives which match empirical data better.

Mark Fairchild is great in general, and anyone wanting to learn about color science should go skim through his book and papers: he does the serious empirical work needed to justify his conclusions, and writes clearly. It was nice to drop by his office and shake his hand a few years ago.

In an email a couple years ago he explained that he had nothing to do with CAM16 specifically because the CIE wouldn't let him on their committees anymore (even as a volunteer advisor) without signing some kind of IP release giving them exclusive rights to any ideas discussed.

octacat12 days ago

J is the lightness channel, similar to the lightness formulas in other colorspaces for SDR. I.e., the usual idea is to take a lightness formula and just arrange the hues/chromas for each value of J.

Yeah, Jab instead of Lab in CIECAM, haha. Btw, CIECAM is pretty bad at predicting highlights; it was designed for SDR to begin with. The lightness formula in ICtCp is more interesting (there it is "I").

But yeah, the difficulty of CIECAM02 comes from the fact that it tries to work for different conditions: where usual colorspaces only need to worry about how everything works at one color temperature (usually 5500 or 6500 K), CIECAM02 tries to predict how the colorspace looks at different temperatures and under different viewing conditions (viewing conditions do not contribute much difference though).

Oh, and of course, CIECAM02 defines 3 colorspaces, because it is impossible to arrange the ab channels in a Euclidean space :) TL;DR: there is a metric, deltaE 2000, for comparing two colors, but this metric defines a non-Euclidean space, while every colorspace tries to bend that metric to fit into a Euclidean one. So we have a lot of spaces that try it with different degrees of success.

CAM02 is over-engineered, but it is pretty easy to use if you just care about the CAM-UCS colorspace (one of those three) and standard viewing conditions.

If you just want to see the differences between colorspaces, good comparison papers have nice visual graphs. If you want to compare them for color editing, I've implemented a color-grading plugin for Photoshop: colorplane (ah, kinda an ad ;)).

Among the most interesting spaces, I'd point to the colorspaces optimized using machine learning (papers from 2023/2024). This means they run on TensorFlow, though, so you need batching when converting from/to RGB. What they did is take CIELAB (yes, that old one), keep its L, and stretch the ab channels to better fit the deltaE 2000 metric. That's basically how many other colorspaces are designed, just with machine learning to minimise the errors in a half-automatic way. Heh, someday I should write a looong comparison of colorspaces in plain language with examples :)

itishappy13 days ago

> Are you a color scientist?!

I would say yes, but if you're going to argue Björn Ottosson isn't, then no.

refulgentis13 days ago

They called me a color scientist at work and I didn't like it much :( until I started doing it. But I don't think I could again.

I was just asking because I'm used to engineers mistaking the Oklab blog for color science, but not color scientists. It's legit nothing you want to rely on, at all; the clearest example I can give is the blue-to-yellow gradient having grey in the middle. It's a mind-numbing feeling to have someone tell you that someone saying that, then making a color space "fixing" this, is something to read up on.

jacobolus13 days ago

> CAM16 addresses all of the ideas and complaints that anyone could have

A statement this emphatic and absolute can't possibly be true.

Here's a concrete complaint that I have with CAM16: the unique hues and hue spacing it defines for its concept of hue quadrature and hue composition are nontrivially different than the ones in CIECAM02 or CIECAM97s, but those changes are not justified or explained anywhere, because the change was just an accidental oversight. (The previous models' unique hues were chosen carefully based on psychometric data.)

> because it's 400 lines of code, we are robbed

It's not really surprising that people reach for math which is computationally cheap when they need to do something to every pixel which appears in a large video file or is sent to a computer's display.

8n4vidtmkvmk13 days ago

Then give us both: fast_decent_colormap() and slow_better_colormap(), and hide away all your fancy maths.

Give me some images and the kinds of transforms each color space is good at, and let me pick one, already implemented in a library in a couple different languages.

What's the best color space if I want pretty gradients between 2 arbitrary colors?

What's the best color space if I want 16 maximally perceptually unique colors?

What color space do I use for a heat map image?

What color space do I use if I want to pick k-means colors from an image or down sample an image to 6 colors?

refulgentis12 days ago

This is a category-error question, and that's what makes these hard to answer. There are very good & clear answers, but you see the reaction to them in this thread. I wish I could share the tools I made that made these problems visible at BigCo, and thus easy to resolve, but alas.

TL;DR: A color space just tells you where a color is in relation to others. From there, focus on your domain (eg. mine was "~all software", 2D phone apps) and apply. Ex. the gentleman above talking about specularity and CAM16 is wildly off-topic for my use case, but might be crucially important for 3D (idk). In general, it's bizarre to be using something besides CAM16, and if that's hard to wrap your mind around, fall back to L*a*b* (HCL) and make sure you're accounting for gamut mapping if you're changing one of the components.

8n4vidtmkvmk11 days ago

Is it a category error? I can see that if I blend linearly in one color space vs another space I'll get a different result. And if I try to cluster colors using one color space vs another I'll get different results. Surely the color space is relevant and my questions aren't completely nonsensical?

CAM16 can't be the best answer to all of them, can it? It's possible but I'd think some color spaces are better suited for some tasks than others.

Which CAM16 are we even talking about? A quick Google reveals CAM16 UCS, SCD and LCD.

CIELAB I've heard good things about but then OKLAB became all the rage and now I don't know what's better.

SideQuark13 days ago

While CAM16 helps, it doesn't address all the ideas and complaints. The field that brought you CAM16 has many more advanced models to address shortcomings, and there are papers published nearly daily addressing flaws in those models.

It's by no means a solved problem or field.

creata13 days ago

Like most people, I think, I'm just using Oklab for interpolation between colors on my screen, and in color pickers that feel a little easier to use than the usual HSV one. As you mentioned, it's easy to throw in anywhere.

Is there a reason why it would be more appropriate to use CAM16 for those use cases?

I think an even simpler approximation than Oklab might be appropriate for these cases - it'd be nice if the sRGB gamut were convex in Oklab, or at least didn't have that slice cut out of it in the blue region.
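For the interpolation use case, this is roughly all it takes (a sketch, not library code): decode sRGB, apply the published Oklab matrices from Ottosson's post, and lerp the channels. Converting the result back for display uses the inverse matrices (as in the gamut-mapping sketch further up the thread); the colors and mix factor below are arbitrary.

    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def srgb_to_oklab(r, g, b):
        r, g, b = srgb_to_linear(r), srgb_to_linear(g), srgb_to_linear(b)
        l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
        m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
        s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
        l, m, s = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)   # non-negative for in-gamut sRGB
        return (
            0.2104542553 * l + 0.7936177850 * m - 0.0040720468 * s,
            1.9779984951 * l - 2.4285922050 * m + 0.4505937099 * s,
            0.0259040371 * l + 0.7827717662 * m - 0.8086757660 * s,
        )

    def lerp(p, q, t):
        return tuple(a + (b - a) * t for a, b in zip(p, q))

    red = srgb_to_oklab(1.0, 0.0, 0.0)
    blue = srgb_to_oklab(0.0, 0.0, 1.0)
    print(lerp(red, blue, 0.5))   # perceptual midpoint of red and blue in Oklab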


Rapzid13 days ago

Color management is such a shit show on PCs. Most phones these days support a large percentage of DCI-P3 and are configured for it.

But even if you have a monitor that supports DCI-P3 you have to slog through modes, profiles, and blog posts to get it setup.

Should you always have HDR on? Why does SDR content always look "wrong" when HDR is on? Oh, it's because the peak brightness and color saturation blah blah blah.

Gamma 2.0, 2.2, or 2.4?

Now you learn the hard way the "desktop" is not color managed, it's the individual applications... if they want to be. Maybe they'll use the Windows-configured profile, maybe not.

pier2513 days ago

> Now you learn the hard way the "desktop" is not color managed

On Windows and probably Linux.

macOS is fully color managed, which has its pros and cons, since so many desktop users are on Windows, which simply assumes sRGB.

At least P3 and sRGB have the same gamma. People working with video are so confused about uploading Rec. 709 content to YouTube with 2.4 gamma.

kuschku13 days ago

At least KDE nowadays supports color management, which is how the Steam Deck implements HDR.

qingcharles13 days ago

I read this recently and it just added to the madness:

https://www.mux.com/blog/your-browser-and-my-browser-see-dif...


Jasper_13 days ago

[flagged]

SideQuark13 days ago

Maybe because others interested in color management, including how it works in modern tech, could be attracted to this topic. Then it makes sense to put related things here.

If you don't like this comment, which seems pretty useful for getting discussion going, perhaps even solving the user's issues, then don't engage with it.

Others will find it useful and topical.

jacobolus13 days ago

To be fair, it's pretty hard to perform or judge reasonable gamut clipping/adaptation if you don't have a well characterized display.

Rapzid13 days ago

[flagged]

