Please note: I know the differences, so this question is not about the difference between them but about something else.
I am putting my understanding below (for those who are not aware), compiled from reliable sources of information.
Gigabyte and gigabit are decimal notation and represent bytes and bits respectively in powers of 10.
Gibibyte and gibibit are binary notation and represent bytes and bits respectively in powers of 2.
For the sake of symbols, a byte will always be represented as B and a bit as b.
To the best of my knowledge, the above information is true.
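To make the decimal-vs-binary distinction above concrete, here is a minimal sketch (the variable names are just illustrative labels, not standard identifiers):

```python
# Decimal (SI) prefixes are powers of 10; binary (IEC) prefixes are powers of 2.
GB = 10**9    # gigabyte: 1,000,000,000 bytes
GiB = 2**30   # gibibyte: 1,073,741,824 bytes

print(GB)        # 1000000000
print(GiB)       # 1073741824
print(GiB - GB)  # 73741824 -> a GiB is ~7.4% larger than a GB
```

The same powers apply to bits: a gigabit (Gb) is 10^9 bits and a gibibit (Gib) is 2^30 bits.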
Below is a quote that can be seen in many places, including the Wikipedia article on 32-bit, Stack Overflow, etc. As per the quote, a 32-bit memory address can access 4 GiB (which means 4 gibibytes, not 4 gibibits) of byte-addressable memory.
Hence, a processor with 32-bit memory addresses can directly access 4 GiB of byte-addressable memory.
Question:
I am really, really confused (in fact going nuts): how can a 32-bit memory address access 4 gibibytes of addressable memory? Shouldn't it be 4 gibibits, or 0.5 gibibytes?
1 byte has 8 bits, so if we are talking about memory in terms of bytes, wouldn't 2^32 bits mean 0.5 gibibytes, or 4 gibibits?
I mean, how can 2^32 be represented as 4 GiB, i.e. 4 gibibytes? It cannot be some convention, because bytes and bits cannot be interchanged like this. So there HAS to be a proper reason.
If there are 2^32 memory addresses, each storing 1 byte, then I have 2^32 * 2^3 = 2^35 bits of memory. Then it is no longer 2^32 but 2^35. No?
P.S.: I am not a CS graduate, so please forgive me.
2^32 = 4294967296, so on a 32-bit architecture you can reach about 4 billion addresses in memory, and one byte (8 bits) is stored at each address. The 32 bits are the width of the address itself, not a count of bits of storage.
So your "maximum memory" is indeed 2^32 bytes = 4 × 2^30 bytes = 4 GiB.
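The arithmetic in the answer can be sketched as follows; this also reproduces the asker's 2^35 figure, which is correct but counts bits, not addresses (the variable names are illustrative only):

```python
# A 32-bit address selects one byte, not one bit.
num_addresses = 2**32              # distinct values a 32-bit address can take
total_bytes = num_addresses * 1    # one byte stored at each address
total_bits = total_bytes * 8       # = 2**35 bits, as computed in the question

print(num_addresses)            # 4294967296
print(total_bytes // 2**30)     # 4  -> 4 GiB of byte-addressable memory
print(total_bits == 2**35)      # True: 2**35 bits, but still 2**32 addresses
```

So both statements are true at once: the machine has 2^35 bits of storage, yet only 2^32 addresses, because addresses name bytes.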