I have the following code:
var oneHeight = Math.ceil(0.012*window.innerHeight).toString()+"px";
var usboxshadow="0px "+oneHeight+" 0px rgba(0,140,255,1), 0px "+oneHeight+" 25px rgba(0,0,0,.7)";
console.log(usboxshadow);
$(".unselected").css("-webkit-box-shadow",usboxshadow);
When I output usboxshadow to the console, I get what I should:
0px 20px 0px rgba(0,140,255,1), 0px 20px 25px rgba(0,0,0,.7)
(the -webkit-box-shadow property)
However, when I retrieve the property with jQuery's .css() method,
console.log($(".unselected").css("-webkit-box-shadow"));
I get a very different result:
rgb(0, 140, 255) 0px 20px 0px 0px, rgba(0, 0, 0, 0.701961) 0px 20px 25px 0px
First, where did the extra 0px in each shadow come from?
Second, why is the rgba alpha (opacity) 0.701961 when it should be 0.7?
Please tell me what I have done wrong.
Edit:
After running the code, the box-shadow of the elements with the class unselected doesn't display.
[Updated for second question, see at the bottom.]
1) In CSS, you can specify a box-shadow
with two, three, or four length values.
The first two are the x and y offsets; the next two are the blur radius and the spread radius, in that order. See MDN for more info.
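To make that order concrete, here is a small plain-JS sketch (the variable names are mine, just for illustration) that pulls the length values out of one shadow string:

```javascript
// A box-shadow's lengths appear in this order:
// offset-x, offset-y, blur-radius, spread-radius (the last two default to 0).
var shadow = "0px 20px 25px rgba(0,0,0,.7)";

// Keep only the px-suffixed parts of the shadow string.
var lengths = shadow.split(" ").filter(function (part) {
  return part.slice(-2) === "px";
});

console.log(lengths); // ["0px", "20px", "25px"] — spread omitted, so it defaults to 0px
```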
When you set a CSS rule (whether in a stylesheet or with jQuery), the browser normalizes it to its computed form (in this case, four px
values). That's why you get the extra 0px when reading it back with jQuery: the missing spread radius is filled in by the browser's default behaviour (see the Computed panel in the Chrome inspector: it shows you values you didn't set because they are defaults).
2) .7 and .701961 aren't very different. Your code looks fine; the discrepancy most likely comes from how the browser stores the color internally. Engines commonly keep the alpha channel as an 8-bit integer, so 0.7 gets quantized to one of 256 steps and serialized back as 179/255 ≈ 0.701961. (Plain floating-point rounding can cause similar artefacts; see this subject on StackOverflow.)
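You can check this in plain JS: the serialized alpha is exactly an 8-bit fraction. (That the engine stores alpha as one byte in 0–255 is an assumption about internals, not something the spec guarantees.)

```javascript
// 0.701961 is exactly the 8-bit alpha value 179 written over 255.
var quantized = 179 / 255;
console.log(quantized.toFixed(6)); // "0.701961"

// One plausible engine mapping from 0.7 to that byte (again, an
// assumption about internals): scale into [0, 256) and truncate.
var byteAlpha = Math.floor(0.7 * 256);
console.log(byteAlpha); // 179
```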
I don't think you should worry about that. There may be a way to force a particular representation in JS or jQuery, but I can't say more.
Hope I could help. :)
Here are some experiments on JSFiddle (an updated version of the one in my comment on this answer). The problem comes from var oneHeight = Math.ceil(0.012*window.innerHeight).toString()+"px";
I didn't pay much attention to it at first (sorry), but what window.innerHeight
returns is the height of the browser window. Therefore, the y offset of the shadow will vary depending on the window size. That's why we obtain a very different shadow.
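A quick sketch of that dependence (the helper name is mine, and the sample heights are arbitrary):

```javascript
// Reproduces the questioner's offset calculation for a given window height.
function shadowOffset(innerHeight) {
  return Math.ceil(0.012 * innerHeight).toString() + "px";
}

console.log(shadowOffset(1600)); // "20px" — matches the logged value in the question
console.log(shadowOffset(900));  // "11px" — same code, smaller window
```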
Once again, I have no other explanations for the decimal issue. ^^