
Difference between Logarithmic and Uniform cost criteria

I've got some trouble understanding the difference between logarithmic (LCC) and uniform (UCC) cost criteria, and also how to use them in calculations.

Could someone please explain the difference between the two, and perhaps show how to calculate the complexity for a problem like A+B*C?

(Yes this is part of an assignment =) )

Thanks for any help!

/Marthin


2 Answers

Uniform cost criteria assign a constant cost to every machine operation, regardless of the number of bits involved, while logarithmic cost criteria assign to every machine operation a cost proportional to the number of bits involved.
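As a rough sketch of how the A+B*C example from the question could be costed: under uniform cost, each of the two operations (one multiplication, one addition) costs one unit; under logarithmic cost, one common convention is to charge each operation the total bit length of its operands. The values 5, 3, 12 below are made up purely for illustration, and this is just one way of assigning logarithmic costs, not the only one.

```python
import math

def bits(x):
    """Number of bits needed to represent a positive integer x."""
    return max(1, math.floor(math.log2(x)) + 1)

def uniform_cost_a_plus_b_times_c(a, b, c):
    # Uniform cost: every operation costs 1, regardless of operand size.
    # One multiplication (B*C) and one addition (A + ...) -> 2 units.
    return 2

def logarithmic_cost_a_plus_b_times_c(a, b, c):
    # Logarithmic cost: each operation is charged roughly the number of
    # bits of the operands it touches (one possible convention).
    mul_cost = bits(b) + bits(c)          # cost of B*C
    add_cost = bits(a) + bits(b * c)      # cost of A + (B*C)
    return mul_cost + add_cost

print(uniform_cost_a_plus_b_times_c(5, 3, 12))      # -> 2
print(logarithmic_cost_a_plus_b_times_c(5, 3, 12))  # -> (2+4) + (3+6) = 15
```

The key point the example shows: the uniform cost never changes, while the logarithmic cost grows as the operands get larger, because bigger numbers need more bits.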



Problem size influences complexity. Since complexity depends on the size of the problem, we define complexity to be a function of problem size. Definition: let T(n) denote the complexity of an algorithm applied to a problem of size n. The size n of a problem instance I is the number of (binary) bits used to represent the instance, so the problem size is the length of the binary description of the instance. This is called the logarithmic cost criterion.
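For example, under this criterion a (hypothetical) instance (A, B, C) = (5, 3, 12) has size

n = (⌊log₂5⌋ + 1) + (⌊log₂3⌋ + 1) + (⌊log₂12⌋ + 1) = 3 + 2 + 4 = 9 bits.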

Unit cost criteria: if you assume that

- every computer instruction takes one time unit,
- every register is one storage unit, and
- a number always fits in a register,

then you can use the number of inputs as the problem size, since the length of the input (in bits) will be a constant times the number of inputs. A sketch of this view follows below.
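A minimal sketch of the unit-cost view, using a simple summation loop as a made-up example (not from the answer itself): each addition counts as one time unit, so the cost depends only on how many inputs there are, not on how large they are.

```python
def sum_with_op_count(numbers):
    """Sum a list, counting each addition as one time unit (unit cost)."""
    total, ops = 0, 0
    for x in numbers:
        total += x
        ops += 1  # every instruction costs one time unit
    return total, ops

# For n inputs the cost is n additions, i.e. T(n) = n, regardless of how
# large the individual numbers are: each is assumed to fit in one register.
print(sum_with_op_count([5, 3, 12, 7]))  # -> (27, 4)
```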
