What is the time and space complexity of an algorithm that calculates the dot product of two vectors of length n?
We can find the dot product by multiplying the corresponding values in each vector and adding them together, i.e. (a1 * b1) + (a2 * b2) + (a3 * b3) + ... + (an * bn). We can calculate the dot product for vectors of any length, but both vectors must contain an equal number of terms.
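For concreteness, here is a minimal Python sketch of that component-wise computation (the function name `dot_product` and the example vectors are just illustrative, not from the question):

```python
def dot_product(a, b):
    """Multiply corresponding entries of a and b, then sum the products."""
    if len(a) != len(b):
        raise ValueError("vectors must contain an equal number of terms")
    return sum(x * y for x, y in zip(a, b))

print(dot_product([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```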
The standard way of multiplying an m-by-n matrix by an n-by-p matrix has complexity O(mnp). If all three dimensions equal n, that is O(n^3), not O(n^2).
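As a rough illustration of where the O(mnp) comes from, here is a sketch of the standard three-nested-loop multiplication (the function name and example matrices are hypothetical):

```python
def mat_mul(A, B):
    """Naive multiplication of an m-by-n matrix A by an n-by-p matrix B."""
    m, n, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(m)]
    for i in range(m):              # m
        for j in range(p):          # * p
            for k in range(n):      # * n  ->  m*n*p scalar multiplications
                C[i][j] += A[i][k] * B[k][j]
    return C

print(mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```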
First find the magnitudes of the two vectors a and b, i.e., |a| and |b|. Second, find the cosine of the angle θ between the two vectors. Finally, take the product of the two magnitudes and the cosine of the angle between them to obtain the dot product of the two vectors.
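A small sanity check of this geometric definition (the example vectors are made up, and cos θ is recovered from the components purely for the demonstration):

```python
import math

a = [3.0, 4.0]
b = [4.0, 3.0]

mag_a = math.sqrt(sum(x * x for x in a))                        # |a| = 5.0
mag_b = math.sqrt(sum(x * x for x in b))                        # |b| = 5.0
cos_theta = sum(x * y for x, y in zip(a, b)) / (mag_a * mag_b)  # cos(theta) = 0.96

# |a| * |b| * cos(theta) agrees with the component-wise sum.
print(mag_a * mag_b * cos_theta)         # 24.0 (up to floating-point rounding)
print(sum(x * y for x, y in zip(a, b)))  # 24.0
```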
If the two vectors are a = [a1, a2, ..., an] and b = [b1, b2, ..., bn], then the dot product is given by a.b = a1 * b1 + a2 * b2 + ... + an * bn.
To compute this, we must perform n multiplications and (n-1) additions. (I assume that this is the dot-product algorithm you are referring to.)
Assuming that multiplication and addition are constant-time operations, the time complexity is therefore O(n) + O(n) = O(n).
The only auxiliary space we require during the computation is to hold the "partial dot-product so far" and the last product computed, i.e. ai * bi.
Assuming we can hold both values in constant space, the space complexity is therefore O(1) + O(1) = O(1).
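Putting the two observations together, a straightforward single-pass loop achieves O(n) time and O(1) auxiliary space. The sketch below is one possible implementation under those assumptions, not code from the question:

```python
def dot(a, b):
    """Single-pass dot product: O(n) time, O(1) auxiliary space."""
    total = 0                 # the "partial dot-product so far" -- constant space
    for ai, bi in zip(a, b):  # n iterations
        total += ai * bi      # one multiplication and one addition per step
    return total

print(dot([1, 2, 3], [4, 5, 6]))  # 32
```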