Please consider the following snippet:
var el = document.getElementById('x');
el.style.backgroundColor = "rgba(124, 181, 236, 0.25)";
alert(el.style.backgroundColor);
<div id="x">Testing</div>
On Windows 8.1 this gives back the exact input for backgroundColor in IE 11 and Firefox 37, but Chrome 43 changes the alpha value, and the alert says:
rgba(124, 181, 236, 0.247059)
Notice that the alpha value unexpectedly comes back as 0.247059 instead of 0.25.
I've gone through the background-color spec as well as the rgba spec, and more specifically the bit about alpha values, but failed to determine whether this is a bug or whether the UA (in this case Chrome) is allowed to do this.
Do any of the relevant specs explain whether Chrome's behavior is "allowed"? As a bonus, can anyone explain why Chrome would subtly change the alpha value?
Footnote: to check whether the "setter" (el.style.backgroundColor = ...) is to blame, I've also tried declaring the style on the element inside the DOM itself. This has the same (unexpected) result. See this snippet:
document.addEventListener("DOMContentLoaded", function(event) {
  var el = document.getElementById('x');
  alert(el.style.backgroundColor);
});
<div id="x" style="background-color: rgba(124, 181, 236, 0.25);">Testing</div>
This has been a known issue in Chrome for quite a long time...
It is due to how Chrome calculates the computed style's alpha value. For example, your input alpha value is 0.25:
0.25 * 255 = 63.75, which is stored as the 8-bit value 63. Converting back, 63 / 255 = 0.24705882352941178, approximately 0.247059.
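Here is a minimal sketch of that round trip in plain JavaScript (not Chrome's actual code; the truncation step is inferred from the observed output):
// Store an alpha value in an 8-bit channel, then convert it back to a float.
function roundTripAlpha(alpha) {
  var byte = Math.floor(alpha * 255); // 0.25 * 255 = 63.75, truncated to 63
  return byte / 255;                  // 63 / 255 = 0.24705882352941178
}
console.log(roundTripAlpha(0.25)); // 0.24705882352941178, i.e. ~0.247059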
It occurs not only when you set the color with JS, but also with CSS. For example:
<style>
  div { color: rgba(100, 100, 100, 0.3) }
</style>
<div></div>
<script>
  // The color comes from the stylesheet, so read the computed style.
  console.log(getComputedStyle(document.querySelector('div')).color); // rgba(100, 100, 100, 0.298039)
</script>
A thought: RGBA is represented in 32 bits, so each channel, including alpha, gets only 8 bits. That means there is in fact no such thing as an exact 0.25 in an 8-bit alpha channel, so 0.247059 is arguably the correct value for what is actually rendered. So is Chrome wrong? Or is it in fact correct, and the other browsers are giving you a number that is not the true representation of what is rendered on the page?
You could then argue that the W3C standard is not entirely correct and that it should only allow values that are exactly representable with an 8-bit alpha. But then it is just a recommendation, not law...
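Following that thought, here is a quick sketch (a hypothetical helper, plain JavaScript) that checks whether an alpha value survives the 8-bit round trip exactly:
// An alpha value is exactly representable iff alpha * 255 is an integer.
function isExactlyRepresentable(alpha) {
  return Number.isInteger(alpha * 255);
}
console.log(isExactlyRepresentable(0.25)); // false: 63.75 is not an integer
console.log(isExactlyRepresentable(0.2));  // true: 0.2 * 255 = 51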
Below is a stripped-down version of Chromium's customised WebKit Color.cpp code that looks to be doing the color conversions, though I'm no Chromium expert: http://www.chromium.org/developers/how-tos/getting-around-the-chrome-source-code
Source: https://code.google.com/p/chromium/codesearch#chromium/src/third_party/WebKit/Source/platform/graphics/Color.cpp
#include <algorithm> // std::max, std::min
#include <cmath>     // lroundf
#include <iostream>

using namespace std;

typedef unsigned RGBA32;

// Convert a float channel in [0, 1] to an 8-bit value, clamped to [0, 255].
int colorFloatToRGBAByte(float f)
{
    return std::max(0, std::min(static_cast<int>(lroundf(255.0f * f)), 255));
}

// Pack four float channels into a single 32-bit ARGB value.
RGBA32 makeRGBA32FromFloats(float r, float g, float b, float a)
{
    cout << "Alpha: " << a << endl;
    return colorFloatToRGBAByte(a) << 24 | colorFloatToRGBAByte(r) << 16 | colorFloatToRGBAByte(g) << 8 | colorFloatToRGBAByte(b);
}

int main()
{
    // Channels are floats in [0, 1] (the original snippet passed 255.0f,
    // which the clamp reduces to the same result).
    RGBA32 t = makeRGBA32FromFloats(1.0f, 1.0f, 1.0f, 0.25f);
    cout << static_cast<unsigned>(t) << std::endl;
    return 0;
}