I'm plotting many thousands of points on a canvas, where each point is an arc with a small alpha value: basically a density plot. At first I used 0.1 for the alpha, but I get better results with 0.05. I was surprised that 0.05 worked (in Chrome), as I had never seen an alpha value with more than one decimal place in any code.
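Roughly what I'm doing (a sketch; `points`, the dot radius, and the color are placeholders):

```ts
declare const points: { x: number; y: number }[]; // the data to plot

const canvas = document.querySelector('canvas')!;
const ctx = canvas.getContext('2d')!;

// Each point is a small, nearly transparent disc; where many discs
// overlap the color accumulates, which gives the density effect.
ctx.fillStyle = 'rgba(0, 0, 0, 0.05)';
for (const p of points) {
  ctx.beginPath();
  ctx.arc(p.x, p.y, 2, 0, Math.PI * 2);
  ctx.fill();
}
```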
Is there some official precision for the alpha value or is it just up to the browser's implementation?
The precision should be 1/256 (~0.004, or about "0.4% opacity increments").¹
The range is [0, 1]; values outside it make no sense, as the endpoints correspond to completely transparent (0% opaque) and completely opaque (100%), respectively. Out-of-range values are not used as given: an out-of-range assignment to the canvas globalAlpha property is ignored outright, while the alpha component of a CSS rgba() color is clamped to the range.
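For instance, per the HTML canvas spec, globalAlpha silently keeps its previous value when assigned something outside [0, 1] (a minimal illustration, assuming a canvas element exists in the page):

```ts
const ctx = document.querySelector('canvas')!.getContext('2d')!;

ctx.globalAlpha = 0.05; // accepted: within [0, 1]
ctx.globalAlpha = 1.5;  // out of range: the assignment is ignored
console.log(ctx.globalAlpha); // logs 0.05
```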
The value 0.05 is perfectly fine (and distinct from 0.1) on both counts. The actual amount of visual change (or visual appeal) will depend on the display hardware² and on human perception; in any case, I would expect 0.05 to read as "95% transparent" whereas 0.1 would be "90% transparent".
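To see that the two values stay distinct after quantization, here is a quick sketch of the usual 8-bit storage model, round(a * 255) / 255 (an assumption on my part, not something the spec guarantees):

```ts
// Snap an alpha in [0, 1] to the nearest of the 256 representable
// 8-bit levels, then convert back to a float.
const quantize = (a: number): number => Math.round(a * 255) / 255;

console.log(quantize(0.05)); // 13 / 255 ≈ 0.0510
console.log(quantize(0.1));  // 26 / 255 ≈ 0.1020 (still distinct)
```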
¹ This assumes an 8-bit alpha channel and an underlying 32bpp canvas (although I cannot find any supporting documentation!). Also note that currently only 8 bits per channel can be supplied via the #RRGGBBAA notation, so while a higher internal color depth may be supported (e.g. when loading from a 48-bit image with 16 bits/channel), it is not currently exposed.
² If the display hardware has a lower color depth (e.g. 16bpp or 24bpp), some of the internal precision will be lost on display. Lower-quality displays are also prone to bias or poor calibration. Additionally, the lighting in the environment can be a huge factor in how the image is perceived.
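One way to check what a particular browser actually stores is to draw with a fractional alpha and read the alpha byte back via getImageData (a sketch; it assumes an 8-bit backing store, and getImageData returns un-premultiplied bytes, so small rounding differences are possible):

```ts
const probe = document.createElement('canvas');
probe.width = probe.height = 1;
const ctx = probe.getContext('2d')!;

// Paint one pixel with an opaque color at 5% global alpha.
ctx.globalAlpha = 0.05;
ctx.fillStyle = '#000000';
ctx.fillRect(0, 0, 1, 1);

// Stored alpha is a byte: expect round(0.05 * 255) = 13.
const alphaByte = ctx.getImageData(0, 0, 1, 1).data[3];
console.log(alphaByte); // likely 13, i.e. ≈ 0.051 rather than exactly 0.05
```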