For a legacy structure like the Intel GDT that has grown over many decades, or for ease of hardware implementation like RISC-V, it makes sense. But for a purely software-parsed structure like the HDMI EDID, why would you do this?

@th Hm.. What's your complaint? I don't understand the problem. :D

@th needless bitpinching in design leads to hours wasted in debugging the implementation, but the designers must feel really clever!

@th well, at least it's documented. (below is from the RF transceiver I'm working with)

@th Everything about HDMI makes more sense once you understand it as first a *restraint*, and only second as a means of moving images from point A to point B.

HDMI's mission is, "Under no circumstances display something unpermitted; all other considerations secondary; crew expendable."

The EDID thing probably falls out of that on the basis of: the kind of people who would willingly work on the design of such a system are terrible engineers, technically and ethically compromised.

@jwz @th It’s much simpler: HDMI is built on the ruins of VGA.

@frumble @jwz @th

I mean, those bit mingling things look like a 10-bit extension of a pre-existing 8-bit thing, although that doesn't explain why the LSBs come first.
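If that guess is right, reassembling the wide value is trivial. A hypothetical sketch (the field split is illustrative, not the actual EDID byte layout):

```python
def widen(msb8: int, lsb2: int) -> int:
    """Rebuild a 10-bit value from a legacy 8-bit field plus 2 extra LSBs
    packed elsewhere -- the 'LSBs come later' scheme described above."""
    return ((msb8 & 0xFF) << 2) | (lsb2 & 0x3)

def narrow(v10: int) -> tuple[int, int]:
    """Split a 10-bit value back into the legacy 8 MSBs and the 2 new LSBs."""
    return (v10 >> 2) & 0xFF, v10 & 0x3
```

The nice property is that an old parser that only reads the 8-bit field still gets the value to within 1/4 of a step; the ugly property is exactly what's being complained about upthread.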

@jwz @th wow, I'd assumed hdcp was tacked onto hdmi, not its original goal

@glassresistor @th HDCP actually predates HDMI, since it was part of DVI as well. But really HDCP is the purpose of HDMI; they are inseparable.

@jwz @th the better solution is called SDI (all versions of it)

@jwz @th This looks similar to how they extended the colour space on the Atari STE. The Atari ST had 3-bit colour per channel (512 different colours), stored in a single 16-bit value (0000 0RRR 0GGG 0BBB). They then extended it to 4 bits per channel (4096 different colours). So in order to stay binary compatible, they encoded the new least significant bit for R, G and B in the previously unused high bit of each nibble.

In the worst case, this would lead to some colours being off by one shade, but I don't think this was ever visible.
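A sketch of that encoding as I understand it (function names are mine, but the bit layout matches the description above):

```python
def ste_encode(v4: int) -> int:
    """Pack a 4-bit STE channel value into one nibble: the three high bits
    land where the ST's 3-bit value lived (bits 0-2), and the new LSB goes
    in the previously unused top bit (bit 3)."""
    return ((v4 >> 1) & 0x7) | ((v4 & 0x1) << 3)

def ste_decode(nibble: int) -> int:
    """Recover the full 4-bit value on STE-aware software."""
    return ((nibble & 0x7) << 1) | ((nibble >> 3) & 0x1)

def st_decode(nibble: int) -> int:
    """What an original ST sees: it ignores bit 3, so it effectively
    displays the 4-bit value with the LSB dropped."""
    return nibble & 0x7
```

So old software reading an STE palette entry gets `v4 >> 1` per channel, which is indeed at most one shade off in the 4-bit scale.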

@jwz it gets worse... @mjg59 points out that EDID can't properly represent some common screen sizes and antiquated NTSC overscan choices still factor into HDMI connections: mjg59.dreamwidth.org/8705.html

@th Well you've still got to make sure that 10 bit monitor works fine with every existing HDMI source on the planet, so you can't change the existing layout, just add? It seems odd unless they're tight on space that they didn't just add wider versions.

@th Is it giving the RGB values a shave and a haircut?
