 

Confused about "c lg n" in Introduction to Algorithms book - what is c?

I am reading the book "Introduction to Algorithms" and I am confused by this part:

> We also assume a limit on the size of each word of data. For example, when working with inputs of size n, we typically assume that integers are represented by c lg n bits for some constant c >= 1. We require c >= 1 so that each word can hold the value of n, enabling us to index the individual input elements, and we restrict c to be a constant so that the word size doesn't grow arbitrarily.

What is the purpose of this constant c?

asked Jun 11 '14 by user3731608

1 Answer

The reason for the constant c is the case where you're working with small inputs on a machine with a large word size. For example, suppose you are working on a 64-bit machine but only processing inputs of size, say, 2^12. The value of lg n is then 12, but the machine word is 64 bits, so it wouldn't be correct to assume that the word size is exactly lg n = 12 bits. The added factor of c lets us assume that the machine word size is at least lg n bits, but possibly much larger, without breaking the analysis.
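To make the numbers concrete, here is a minimal sketch (in Python; the variable names and values are my own, not from the book) showing how c relates a 64-bit word to the 12 bits actually needed to index an input of size 2^12:

```python
import math

# Assumed setup: a 64-bit machine processing a small input of size 2^12.
w = 64            # machine word size in bits (assumption for illustration)
n = 2 ** 12       # input size from the example above

lg_n = math.log2(n)   # 12.0 -- bits needed to index n input elements
c = w / lg_n          # ~5.33 -- the constant the book's model allows for

# The model requires c >= 1 so that a word can hold an index into the input.
assert c >= 1
print(f"lg n = {lg_n:.0f}, word size = {w}, so c = {c:.2f}")
```

For a fixed machine and input size, c is just a constant (here about 5.33), so the analysis can assume words of c lg n bits without the word size growing arbitrarily as n grows.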

Hope this helps!

answered Oct 22 '22 by templatetypedef