chilloutdocdoc
Full time elf
For the sake of this discussion, please consider DC voltages only. No AC sine waves necessary.
So many people who have used RGB this past year, or in years past, have noticed one thing: color mixing isn't what you'd expect. You set something to 5% and it looks more like 30%, then you go from 80% to 90% and see no difference whatsoever. A lot of people have attributed this to the fact that human vision is roughly logarithmic. For the sake of what I've been thinking, I'm going to treat that as the number one reason we don't like linear dimming (irrespective of controllers, cabling, hardware, outdoor lighting conditions, street lights, moon position, you get the point).
The way I see to fix this is gamma correction. In short, gamma correction creates an output curve shaped like the second half of the letter U. It accounts for the fact that our eyes can see very small differences at low light levels but need larger and larger changes to see a difference at higher levels.
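The usual power-law form is output = max * (input/max)^gamma. Here's a minimal C sketch of what that curve looks like, assuming a gamma of 2.2 (just a typical value, not the only choice):

#include <math.h>
#include <stdio.h>

/* Print a few points on a gamma curve: out = 255 * (in/255)^gamma.
   gamma = 2.2 is an assumed, typical correction value. */
int main(void)
{
    const double gamma = 2.2;
    for (int in = 0; in <= 255; in += 51) {
        double out = 255.0 * pow(in / 255.0, gamma);
        printf("in=%3d -> out=%6.2f\n", in, out);
    }
    return 0;
}

Note how the bottom fifth of the input range (in=51) produces an output below 8, while the top fifth (in=204 to 255) spans nearly 100 output steps. That's the "second half of the U."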
Most people currently approach correction with software dimming curves. The problem with this is a huge loss of resolution. Software dimming doesn't operate with decimals, and it only uses 8 bits to store each dimming value. 8 bits = 2^8 = 256 values (0-255). That seems like a lot, but when you consider that a gamma curve of 2.0-2.2 (typical values used for correction) collapses half or more of those values onto each other, we are left with something on the order of 113-130 distinct levels.
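You can see the collapse for yourself by building the 8-bit-in / 8-bit-out table and counting the unique outputs. A minimal C sketch, assuming round-to-nearest and a gamma of 2.2 (the exact count depends on the gamma and the rounding used):

#include <math.h>
#include <stdio.h>

/* How many distinct 8-bit levels survive a software gamma curve?
   Build the 8-bit-in / 8-bit-out table and count unique outputs. */
int main(void)
{
    const double gamma = 2.2;       /* assumed value */
    int seen[256] = {0};
    int distinct = 0;
    for (int in = 0; in < 256; in++) {
        int out = (int)(255.0 * pow(in / 255.0, gamma) + 0.5);
        if (!seen[out]) { seen[out] = 1; distinct++; }
    }
    printf("distinct output levels: %d of 256\n", distinct);
    return 0;
}

Many low inputs land on the same output byte (the curve is flatter than one step per step down there), and those levels are simply gone.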
The obvious solution: we need more bits. There is one problem with that solution. More bits would require a rewrite of the software and the protocols we currently use. We would have to send more bits down the line, which would mean fewer channels per unit of data speed (fewer channels per universe, or a slower refresh rate).
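To put rough numbers on it: a DMX512 universe carries 512 one-byte slots, so 8-bit dimming gives you 512 channels per universe. At 16 bits per channel, that drops to 256 channels, or you keep the channel count and roughly halve the refresh rate.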
A better solution would be to increase the resolution in the hardware: have the hardware take in an 8-bit value and map it to a 9-, 10-, 12-, or any other higher-resolution value. (Even 9-bit dimming would look great, but if we have to move up to 2 bytes to store a value anyway, why not use more of them?) The mapping could be done in hardware, and users could even create their own curves if they wanted to; however, if everyone were using DC and the LEDs were fairly balanced in output, that shouldn't be necessary.
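A minimal C sketch of that idea, assuming an 8-bit-in / 12-bit-out lookup table feeding a 12-bit PWM engine (the gamma value, the 12-bit depth, and set_pwm are all illustrative, not any particular controller's API):

#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Precompute an 8-bit-in / 12-bit-out gamma table so the PWM
   engine dims with 4096 steps instead of 256. gamma and the
   12-bit depth are assumed for illustration. */
static uint16_t lut[256];

static void build_lut(double gamma)
{
    for (int in = 0; in < 256; in++)
        lut[in] = (uint16_t)(4095.0 * pow(in / 255.0, gamma) + 0.5);
}

int main(void)
{
    build_lut(2.2);
    /* A controller would do something like:
       set_pwm(channel, lut[dmx_value]);   (set_pwm is a placeholder) */
    printf("in=128 -> out=%u of 4095\n", lut[128]);
    return 0;
}

With the extra output resolution, nearly every one of the 256 incoming codes keeps its own distinct brightness step, so the curve no longer throws levels away.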
Just my view of one idea as to how we could fix color-mixing curves, and it could apply to any non-pixel RGB element.