I am developing a scalable mobile/desktop friendly website using CSS rem units based on the average viewport size of the site.
I have been setting rem sizes as low as 0.001rem for certain padding and margins, which is working fine in my Firefox browser... but will it work in all modern browsers?
I am only questioning the ability to use 0.001rem units because the finest granularity I have seen prior to thousandths is hundredths, for example opacity: 0.25.
How low can rem units go? Is 0.00000001rem an acceptable value in modern browsers?
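To illustrate, here is a sketch of the kind of declarations I mean (the class names and the 16px root size are just examples):

    html   { font-size: 16px; }       /* root size everything else scales from (example value) */
    .card  { padding: 0.001rem; }     /* 16px × 0.001 = 0.016px */
    .label { margin-top: 0.001rem; }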
To recap, the rem unit means "the root element's font-size" (rem stands for "root em"). The <li> elements inside the <ul> with a class of rems take their sizing from the root element (<html>). This means that each successive level of nesting does not keep getting larger.
rem is defined relative to the font size of the root element. The root element is matched by the :root pseudo-class or the html selector. 1rem therefore takes on the value given to the font-size of the root element, which means 1rem keeps the same value throughout your whole CSS code.
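A minimal sketch of that behavior (the 16px root size and the contrasting .ems class are illustrative assumptions, not from the question):

    html     { font-size: 16px; }     /* 1rem = 16px for the whole document */
    .rems li { font-size: 1.3rem; }   /* always 1.3 × 16px = 20.8px, however deeply the <li>s are nested */
    .ems  li { font-size: 1.3em; }    /* compounds: each nested <li> becomes 1.3 × its parent's computed size */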
Ultimately, your granularity is 1 physical pixel (well, technically the sub-pixel in modern browsers, but I will ignore that for the purposes of this discussion). You can have different calculated pixel values based on em or rem, even down to several digits of precision. You then run into the real-world problem that, when rendering, that decimal precision is lost when the browser ultimately rounds off to fit the pixels available at whatever the device pixel density is relative to the reference pixel density (96 ppi).
In essence, this reference pixel is 1/96th of an inch, so 1px in CSS terms basically means 1/96" at 96 ppi. On screens with higher pixel densities (say, the 326 ppi of many Apple "retina" screens), scaling takes place to convert the CSS reference pixel to physical pixels. For the retina display mentioned, this scaling factor would be ~3.4 (326 / 96). So if you specified a CSS rule to set something to, say, 10px, the retina display browser should paint it across 34 physical pixels (assuming no other HTML, e.g. meta viewport elements, that would change display behavior). Because of this scaling behavior, the physical size of the element would still be 10/96", which is exactly the same physical size as if the element were rendered on a 96 ppi screen.
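As a small sketch of that scaling (the selector is hypothetical; the ~326 ppi figure and ~3.4 factor are the assumptions above):

    .box { width: 10px; }             /* 10 reference px = 10/96 in; at ~326 ppi the browser paints roughly 34 physical px (10 × ~3.4) */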
Now let's add em and rem to the mix. Let's use an example of a 10px root element font size with a declaration on some other element of 0.001rem. That means you are trying to render this element at 0.01 reference pixels (10px × 0.001), which would translate to 0.034 physical pixels on the retina display. You can clearly see that the rem value of 0.001 is at least one order of magnitude away from making a significant difference in the physical display, as 0.01rem in this case would translate to 0.34 physical pixels, which rounds for display to exactly the same result as the "more precise" 0.001rem specification.
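Putting that arithmetic into a concrete stylesheet (a sketch assuming the 10px root size and ~326 ppi screen from above; the class names are hypothetical):

    html   { font-size: 10px; }       /* 1rem = 10 reference px */
    .tiny  { padding-left: 0.001rem; } /* 10 × 0.001 = 0.01 reference px ≈ 0.034 physical px: rounds away entirely */
    .small { padding-left: 0.01rem; }  /* 10 × 0.01  = 0.1 reference px  ≈ 0.34 physical px: still rounds to nothing */
    .ok    { padding-left: 0.1rem; }   /* 10 × 0.1   = 1 reference px    ≈ 3.4 physical px: finally big enough to paint */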
So I think you are defining rem-based CSS rules with far more precision than can actually be honored in real-world terms when physical pixels are being painted, unless you have defined a very large root element font size and/or you have a physical screen with a pixel density an order of magnitude greater than that of a retina display. I am guessing the latter is not the case.
Just because a CSS value can be calculated to 3 decimal places of precision, that doesn't mean that physical rendering can occur at that same level of precision.