I'm having some trouble understanding the difference between the logarithmic (Lcc) and uniform (Ucc) cost criteria, and also how to use them in calculations.
Could someone please explain the difference between the two, and perhaps show how to calculate the complexity for a problem like A+B*C?
(Yes this is part of an assignment =) )
Thanks for any help!
/Marthin
The uniform cost criterion assigns a constant cost to every machine operation, regardless of the number of bits involved, while the logarithmic cost criterion assigns each machine operation a cost proportional to the number of bits involved.
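To make that concrete for A+B*C, here is a minimal Python sketch (my own illustration, not a standard routine) that tallies the cost of the two operations, B*C and then the addition of A, under both criteria. For the logarithmic criterion I assume the common convention that an operation is charged the sum of the bit lengths of its operands:

```python
# Cost of evaluating A + B * C under the two cost criteria.
# Assumption: logarithmic cost of an operation = sum of the bit lengths
# of its operands (a common textbook convention for the log-cost RAM).

def bits(n):
    """Number of bits needed to write n in binary (at least 1, even for 0)."""
    return max(n.bit_length(), 1)

def cost_of_a_plus_b_times_c(A, B, C):
    uniform = 0
    logarithmic = 0

    # Step 1: t = B * C
    t = B * C
    uniform += 1                      # one machine operation, cost 1
    logarithmic += bits(B) + bits(C)  # cost grows with operand length

    # Step 2: result = A + t
    result = A + t
    uniform += 1
    logarithmic += bits(A) + bits(t)

    return result, uniform, logarithmic

print(cost_of_a_plus_b_times_c(5, 6, 7))
print(cost_of_a_plus_b_times_c(2**100, 2**100, 2**100))
```

So under the uniform criterion A+B*C always costs 2 (one multiplication plus one addition), i.e. O(1), while under the logarithmic criterion the cost grows with the operands' bit lengths, on the order of O(log A + log B + log C).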
Problem size influences complexity. Since complexity depends on the size of the problem, we define complexity to be a function of problem size. Definition: let T(n) denote the complexity of an algorithm applied to a problem of size n. The size n of a problem instance I is the number of (binary) bits used to represent the instance, so problem size is the length of the binary description of the instance. This is called the logarithmic cost criterion.
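As a small illustration of that size measure (a sketch, assuming the instance is just a list of natural numbers written in binary):

```python
# Problem size under the logarithmic cost criterion:
# the total number of bits needed to write the instance down.

def problem_size_in_bits(instance):
    return sum(max(x.bit_length(), 1) for x in instance)

print(problem_size_in_bits([5, 6, 7]))    # 3 + 3 + 3 = 9 bits
print(problem_size_in_bits([2**100, 1]))  # 101 + 1 = 102 bits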
Unit (uniform) cost criterion. If you assume that:

- every computer instruction takes one time unit,
- every register is one storage unit,
- and a number always fits in a register,

then you can use the number of inputs as the problem size, since the length of the input (in bits) will be a constant times the number of inputs (see the sketch below).
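In other words, if every number fits in one machine word of some fixed width, the binary description of the input is just that width times the number of inputs, so counting inputs and counting bits differ only by a constant factor. A quick sketch (the word width of 64 is my assumption, not part of the criterion):

```python
# Under the unit (uniform) cost assumptions, problem size can simply be
# the number of inputs: with a fixed word width, the bit length of the
# input is WORD_WIDTH * n, only a constant factor away from n itself.

WORD_WIDTH = 64  # assumed machine word size

def size_unit_cost(instance):
    return len(instance)               # number of inputs

def size_in_bits_fixed_words(instance):
    return WORD_WIDTH * len(instance)  # length of the binary description

nums = [5, 6, 7, 8]
print(size_unit_cost(nums))            # 4
print(size_in_bits_fixed_words(nums))  # 256 = 64 * 4
```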