LegionMammal978 12 hours ago [-]
sRGB has bugged me from the start, since it's not even clear which matrix to use to convert between linear sRGB colors and XYZ colors. I count at least 3 different matrices in IEC 61966-2-1, each of which I have seen different people endorse as the true version:
1. The matrix implied by the reference primaries in Table 1: [X; Y; Z] = [506752/1228815, 87881/245763, 12673/70218; 87098/409605, 175762/245763, 12673/175545; 7918/409605, 87881/737289, 1001167/1053270]*[R; G; B].
2. The matrix in section 5.2: [X; Y; Z] = [1031/2500, 447/1250, 361/2000; 1063/5000, 447/625, 361/5000; 193/10000, 149/1250, 1901/2000]*[R; G; B].
3. The inverse of the matrix in section 5.3: [X; Y; Z] = [248898325000/603542646087, 71938950000/201180882029, 36311670000/201180882029; 128304856250/603542646087, 143878592500/201180882029, 14525360000/201180882029; 11646692500/603542646087, 23977515000/201180882029, 191221850000/201180882029]*[R; G; B].
The distinction starts to matter for 16-bit color. The CSS people seem to take the position that the matrix implied by primaries is the true version, but meanwhile, the same document's Annex F (in Amd. 1) seems to suggest that the 5.2 matrix is the true version, and that the 5.3 matrix should be rederived to the increased precision. There's no easy way to decide, as far as I can tell.
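A minimal sketch (assuming Python with numpy) of how far apart candidates 1 and 2 sit: derive the matrix from the Table 1 primaries and compare it against the rounded section 5.2 values. The largest component gap is a few 16-bit quantization steps, which is exactly why the choice stops being academic at 16 bits:

    import numpy as np

    def srgb_matrix_from_primaries():
        # xy chromaticities from IEC 61966-2-1 Table 1, with the D65 white point
        prims = {'R': (0.64, 0.33), 'G': (0.30, 0.60), 'B': (0.15, 0.06)}
        white = (0.3127, 0.3290)
        xyz = lambda x, y: np.array([x / y, 1.0, (1.0 - x - y) / y])
        M = np.column_stack([xyz(*prims[c]) for c in 'RGB'])
        # Scale each primary so that R = G = B = 1 maps exactly to the white point
        scale = np.linalg.solve(M, xyz(*white))
        return M * scale

    M_table1 = srgb_matrix_from_primaries()
    M_52 = np.array([[0.4124, 0.3576, 0.1805],   # rounded matrix from section 5.2
                     [0.2126, 0.7152, 0.0722],
                     [0.0193, 0.1192, 0.9505]])

    print(np.abs(M_table1 - M_52).max())  # ~3.9e-5
    print(1 / 65535)                      # one 16-bit step: ~1.5e-5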
Meanwhile, I agree with the author that the ICC's black-point finagling in their published profiles has not helped with the confusion over what exactly sRGB colors are supposed to map to.
kurthr 9 hours ago [-]
If you're using sRGB with 16-bit color, you already have problems.
It is an 8-bit-per-channel hack that worked perfectly well with CRTs and early LCDs.
There were multiple hacky variants from different vendors that were visually indistinguishable on displays of the day.
Even most modern displays are not really capable of more than 10-bit color (RGB mini-LED and QD-OLED barely are). Even Rec. 2020 doesn't need 16-bit.
sRGB doesn't even have a consistent gamma, and it's nowhere close to uniformly covering the color volume. Why use it? DCI-P3 works fine.
SideQuark 4 hours ago [-]
Your eye also doesn’t have a consistent gamma, nor does the camera, nor does any viewing technology. If you’re complaining about the small linear section of many gamma curves, it is very important for avoiding various artifacts.
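For reference, the curve under discussion: sRGB splices a short linear toe onto a 1/2.4-power segment, precisely so the slope stays finite near black (a sketch of the standard IEC 61966-2-1 formulas, in Python):

    def srgb_encode(linear):
        # Forward transfer: linear toe below ~0.0031308, power law above
        if linear <= 0.0031308:
            return 12.92 * linear
        return 1.055 * linear ** (1 / 2.4) - 0.055

    def srgb_decode(encoded):
        # Inverse transfer; the breakpoint 0.04045 = 12.92 * 0.0031308
        if encoded <= 0.04045:
            return encoded / 12.92
        return ((encoded + 0.055) / 1.055) ** 2.4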
bflesch 10 hours ago [-]
It's perfectly fine for fingerprinting, though. Innocuous artifacts in file formats, such as custom matrices, digits in the seventh decimal place of a floating-point number, or millisecond-precision timestamps, allow identification and cross-referencing of internet users.
Just last week I noticed that when a reddit user uploads a screenshot taken on macOS as a PNG image to a reddit post, the PNG still contains uniquely identifying information about the monitor attached to the macOS system and when it was last calibrated. You can deduce the type of MacBook they are using from the screen resolution, and see when they switched machines once you notice a different monitor-calibration timestamp. All from a single PNG image that was uploaded by the user themselves. And if those two pieces of information are not stored in the PNG, you know they must be a Windows or Linux user.
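A minimal sketch (plain Python, standard library only; 'screenshot.png' is a stand-in filename) of how you'd spot this particular breadcrumb: walk the PNG chunk list and pull out the embedded iCCP profile:

    import struct, zlib

    def png_chunks(path):
        with open(path, 'rb') as f:
            assert f.read(8) == b'\x89PNG\r\n\x1a\n'   # PNG signature
            while True:
                head = f.read(8)
                if len(head) < 8:
                    break
                length, ctype = struct.unpack('>I4s', head)
                data = f.read(length)
                f.read(4)  # skip CRC
                yield ctype, data

    for ctype, data in png_chunks('screenshot.png'):
        if ctype == b'iCCP':
            name, _, rest = data.partition(b'\x00')
            profile = zlib.decompress(rest[1:])   # rest[0] is the compression method
            # (the profile's creation/calibration timestamp is the 12-byte
            # dateTimeNumber at offset 24 in the ICC header)
            print('embedded profile:', name.decode('latin-1'), len(profile), 'bytes')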
It's these small breadcrumbs all over the place which make forensics so interesting.
I would have loved to have found this page back when I was adapting a PDF-generating program to conform to PDF/A (which requires a colour profile in some cases). I found several sRGB profiles and could see that they were different, but knowing almost nothing about them, I just chose the one that seemed to come from the most authoritative source (I forget which). This page must have existed then, actually.
grvbck 11 hours ago [-]
It is a rabbit hole. I just checked the latest release of GIMP (3.2.4). The "GIMP built-in sRGB" profile is supposed to be a functional match to the ArgyllCMS sRGB color space – the true sRGB profile according to the addendum in the above profile comparison.
But if I embed it in a photo and then open the photo in GraphicConverter, it shows up as "sRGB IEC61966-2.1", which to my understanding is identical to Apple’s sRGB Color Space Profile.icm.
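One way to check what actually got embedded, rather than trusting the viewer's label (a sketch assuming Pillow; 'photo.png' is a stand-in filename):

    from io import BytesIO
    from PIL import Image, ImageCms

    im = Image.open('photo.png')
    icc = im.info.get('icc_profile')
    if icc:
        prof = ImageCms.ImageCmsProfile(BytesIO(icc))
        print(prof.profile.profile_description)   # e.g. 'sRGB IEC61966-2.1'
    else:
        print('no profile embedded')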
But that's an sRGB v2 profile. Should I download and use a v4 profile instead? Or download the ArgyllCMS sRGB.icm [1] and convert all photos to it? Or just select the Apple default sRGB profile everywhere?
I'm not a pro and don't have a calibrated display, but it annoys me when photos I upload online look vastly different in my browser than they look in my editing software on the same display.
[1] https://argyllcms.com/icclibsrc.html
The page says it's from 2012, updated in 2015. A recent update for comparison would have been interesting.
smallstepforman 9 hours ago [-]
No mention of 601, BT.709, BT.2020, BT.2100, etc. He did mention the P and D profiles. Unorm vs. linear.
There is always a historic reason for a colour profile; sadly, most software avoids the terminology like the plague.
voidUpdate 11 hours ago [-]
Wow, I'm glad I'm not a graphic designer. My head hurts just trying to understand this. I just pick the colours that look good to me.
kllrnohj 11 hours ago [-]
Graphic designers don't really see any of this, either. It's going to be photo junkies or people working on image-processing systems (either building them or using them) who have to deal with this.
But for the most part this shouldn't really matter much. A huge number of things these days are properly color managed, so as long as the thing that wrote the profile actually, you know, wrote what it wanted, it'll display just fine regardless of how many different "sRGB" profiles there are floating around. We're largely past the days of just hoping that the image and the display happen to agree on roughly the same colors.
gpvos 9 hours ago [-]
The problem, as I described in another comment, is that the average programmer doesn't know enough about colour spaces, and sometimes must choose a colour profile while not knowing nor understanding what they actually want. They can figure out that an "sRGB" profile is probably what they want, but then there should not be such a plethora of different versions of that, as choosing between them is impossible for anyone not in the know.
esafak 10 hours ago [-]
This is more like the stuff Linux users had to endure in the bad old days of setting up drivers. Concerns of twenty years ago. I remember the days people compared their colorimeters and profiled their own monitors. I'm too old for this.
kllrnohj 10 hours ago [-]
> I remember the days people compared their colorimeters and profiled their own monitors.
That would be calibration and it's still necessary if you want color accuracy. It's about ensuring that what your monitor thinks it's displaying and what it's actually physically emitting are the same. The main thing that's changed is that factory calibration has become a lot more common and is often more than good enough for anything short of serious professional work, even for things that aren't professional displays. Most flagship or even midrange smartphones are factory calibrated with dE values that would make reference monitors from 20 years ago blush. Right up until the OEM intentionally shoves a shitty color curve on top to make it "pop" or be more "vibrant" (Samsung calls this "Vivid", Pixel calls it "Adaptive", etc.), but they at least usually have a "natural" option that gets you back to the properly calibrated display.
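For reference, the dE (ΔE) figures being quoted are distances in L*a*b* space; the classic 1976 form is just Euclidean distance (modern specs usually quote ΔE2000, which weights the axes more perceptually, but the idea is the same). A sketch assuming numpy:

    import numpy as np

    def delta_e_76(lab1, lab2):
        # CIE 1976 colour difference: Euclidean distance in L*a*b*
        return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

    # Roughly: dE < 1 is imperceptible; dE ~ 2-3 is noticeable side by side
    print(delta_e_76((50.0, 10.0, 10.0), (50.0, 12.0, 11.0)))  # ~2.24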
spider-mario 6 hours ago [-]
> That would be calibration and it's still necessary if you want color accuracy.
Your correction is backwards. Profiling + colour-managed apps gets you accurate colours regardless of source colourspace. Calibration doesn’t, and is not strictly needed either.
https://discuss.pixls.us/t/rip-displaycal/21775/130
No, my correction is correct. Colorimeters are physical measurement devices used to adjust the display so that there is an accurate color profile to color-manage to in the first place. That's required for the monitor itself to be able to display accurate colors at all.
spider-mario 2 hours ago [-]
> Colorimeters are physical measurement devices used to adjust the display so that there is an accurate color profile to color-manage to in the first place.
No. You can profile an uncalibrated (unadjusted) display and the profile will be correct. There is zero inherent requirement to adjust the display, or for the display itself to report a profile.
> That's required for the monitor itself to be able to display accurate colors at all.
Also no. Once you have profiled the uncalibrated display, you can accurately display colours within its gamut by converting to the profile.
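A minimal sketch of what "converting to the profile" looks like in practice (assuming Pillow's ImageCms; 'display.icc' stands in for a profile measured from the uncalibrated display, 'photo.png' for an sRGB image):

    from PIL import Image, ImageCms

    im = Image.open('photo.png').convert('RGB')
    src = ImageCms.createProfile('sRGB')   # source colourspace of the image
    dst = 'display.icc'                    # hypothetical measured display profile
    out = ImageCms.profileToProfile(
        im, src, dst, renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
    out.save('for_this_display.png')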
jiggawatts 2 hours ago [-]
> We're largely past the days of just hoping that the image and the display happen to agree on roughly the same colors.
<Heath Ledger Joker>Ah haa ha ha haaaa!</Heath Ledger Joker>
We're nowhere near past that point, we haven't even begun to approach that point. That point is something I would like to reach before I die, but since that's maybe just a couple of decades away, it's not looking likely.
In general, Windows and Linux do not color manage, or do it so badly that it's counterproductive.
Most sub-$500 monitors do not report their native gamut! By default, operating systems assume monitors are sRGB (they're typically not) and send uncalibrated 8-bit RGB as-is.
On Windows and macOS, enabling HDR mode typically sets the correct gamut, etc., and mostly makes things "just work", but that's at the OS level only.
Almost all applications map wide-gamut images to sRGB even on HDR monitors, or simply re-interpret the RGB values as if they're sRGB without even bothering to convert color spaces.
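A sketch (numpy; the matrices are the commonly published linear Display P3 and sRGB D65 ones) of the difference between actually converting and merely re-interpreting:

    import numpy as np

    # Linear Display P3 -> XYZ, and XYZ -> linear sRGB
    P3_TO_XYZ = np.array([[0.48657, 0.26567, 0.19822],
                          [0.22897, 0.69174, 0.07929],
                          [0.00000, 0.04511, 1.04394]])
    XYZ_TO_SRGB = np.array([[ 3.24097, -1.53738, -0.49861],
                            [-0.96924,  1.87597,  0.04156],
                            [ 0.05563, -0.20398,  1.05697]])

    p3_red = np.array([1.0, 0.0, 0.0])            # pure P3 red, linear light
    converted = XYZ_TO_SRGB @ P3_TO_XYZ @ p3_red  # proper conversion
    print(converted)  # ~[ 1.225 -0.042 -0.020]: outside sRGB, needs gamut mapping
    print(p3_red)     # what a naive re-interpreting app shows: a duller sRGB red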
Firefox has color management off by default. Microsoft Edge defaults to "crush to sRGB". Apps with embedded web view controls are "who knows?"
In general, wide-gamut, 10-bits-per-channel, and HDR support are all a total shit show. I'm perpetually surprised if any of it works!
As a random example, my Nikon Z8 can natively record HDR 10-bit wide-gamut HEIF files in-body. Windows can't display those at all. macOS and iPhones can... sometimes... but then the viewer apps will often "get confused" and the brightness will jump around randomly and non-deterministically as you switch between thumbnail and full-screen views. You can't forward such an image to anyone via iMessage, they'll get gibberish on their end, and SMS/MMS is hopeless.
Meanwhile, YouTube HDR generally "just works" on most devices, so I've started sending people my still photography by converting it to an HDR 4K slideshow in DaVinci Resolve and giving them a YouTube link.
It's sad and pathetic that Meta set $80 billion on fire for the Metaverse and the rest of the industry found a decent chunk of a trillion dollars under the couch cushions to throw at AI slop, but nobody can "afford" to have one or two engineers fix their imaging pipeline.
Upload an HDR or wide-gamut image to Facebook successfully and then tell me it "just works".
Or send one in an email.
Or do anything with it other than view it on your own device.