How to identify the linearly independent rows of a matrix? For instance, in

[[0, 1, 0, 0],
 [0, 0, 1, 0],
 [0, 1, 1, 0],
 [1, 0, 0, 1]]

the 3rd row is the sum of the 1st and 2nd, while the 4th row is independent.
Since the matrix is square, we can simply take the determinant. If the determinant is non-zero, the rows are linearly independent; otherwise they are linearly dependent. Since the determinant here is zero, the rows are linearly dependent.
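A minimal sketch of that check with NumPy, using the example matrix from the question (np.linalg.det returns a float, so in practice compare against a small tolerance rather than exact zero):

import numpy as np

mat = np.array([[0, 1, 0, 0],
                [0, 0, 1, 0],
                [0, 1, 1, 0],
                [1, 0, 0, 1]])

det = np.linalg.det(mat)
print(det)                # effectively zero -> the rows are linearly dependent
print(abs(det) < 1e-12)   # True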
Given a set of vectors, you can determine if they are linearly independent by writing the vectors as the columns of the matrix A, and solving Ax = 0. If there are any non-zero solutions, then the vectors are linearly dependent. If the only solution is x = 0, then they are linearly independent.
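One way to carry that out in code, as a sketch using SymPy (Matrix.nullspace() returns a basis for the solutions of Ax = 0, so an empty result means x = 0 is the only solution):

import sympy

# the vectors to test go in as the columns of A
# (here: the rows of the example matrix, hence the transpose)
A = sympy.Matrix([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 1, 1, 0],
                  [1, 0, 0, 1]]).T

basis = A.nullspace()
if basis:
    print('linearly dependent, e.g. A*x = 0 for x =', basis[0].T)
else:
    print('linearly independent')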
You can find the vectors spanning the column space of the matrix with the columnspace() method of SymPy's Matrix object. They are automatically a maximal set of linearly independent columns of the matrix.
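For example (a short sketch; columnspace() returns the pivot columns as a list of column vectors, and applying it to M.T gives the independent rows instead):

import sympy

M = sympy.Matrix([[0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 1, 1, 0],
                  [1, 0, 0, 1]])

# a basis of the column space: a maximal set of linearly independent columns
for vec in M.columnspace():
    print(vec.T)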
With sympy you can find the linearly independent rows using sympy.Matrix.rref:
>>> import sympy
>>> import numpy as np
>>> mat = np.array([[0, 1, 0, 0], [0, 0, 1, 0], [0, 1, 1, 0], [1, 0, 0, 1]])  # your matrix
>>> _, inds = sympy.Matrix(mat).T.rref()  # to check the rows you need to transpose!
>>> inds
[0, 1, 3]
This tells you that rows 0, 1, and 3 are linearly independent, while row 2 isn't (it's a linear combination of rows 0 and 1).
Then you could drop the dependent rows by slicing with the independent indices:
>>> mat[list(inds)]  # list() also guards against rref() returning the pivots as a tuple
array([[0, 1, 0, 0],
       [0, 0, 1, 0],
       [1, 0, 0, 1]])
This also works well for rectangular (not only square) matrices.
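For instance, continuing the session above with a hypothetical 3x4 matrix:

>>> mat2 = np.array([[0, 1, 0, 0], [0, 0, 1, 0], [0, 1, 1, 0]])  # 3x4: row 2 = row 0 + row 1
>>> _, inds2 = sympy.Matrix(mat2).T.rref()
>>> mat2[list(inds2)]
array([[0, 1, 0, 0],
       [0, 0, 1, 0]])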
First, your 3rd row is linearly dependent with your 1st and 2nd rows. However, your 1st and 4th columns are linearly dependent.
Two methods you could use:
Eigenvalue
If one eigenvalue of the matrix is zero, its corresponding eigenvector signals a linear dependence. The documentation of eig states that the returned eigenvalues are repeated according to their multiplicity and not necessarily ordered. However, assuming the eigenvalues correspond to your row vectors, one method would be:
import numpy as np

matrix = np.array([[0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [0, 1, 1, 0],
                   [1, 0, 0, 1]])

lambdas, V = np.linalg.eig(matrix.T)
# the rows matching (numerically) zero eigenvalues are flagged as dependent;
# np.isclose is used because eig rarely returns an exact 0.0
print(matrix[np.isclose(lambdas, 0), :])
Cauchy-Schwarz inequality
To test linear dependence of vectors and figure out which ones, you could use the Cauchy-Schwarz inequality. Basically, if the absolute value of the inner product of two vectors equals the product of their norms, equality holds in Cauchy-Schwarz and the two vectors are linearly dependent (parallel). Note that this only detects pairwise dependence. Here is an example for the columns:
import numpy as np

matrix = np.array([[0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [0, 1, 1, 0],
                   [1, 0, 0, 1]])

print(np.linalg.det(matrix))

for i in range(matrix.shape[0]):
    for j in range(matrix.shape[0]):
        if i != j:
            inner_product = np.inner(matrix[:, i], matrix[:, j])
            norm_i = np.linalg.norm(matrix[:, i])
            norm_j = np.linalg.norm(matrix[:, j])

            print('I: ', matrix[:, i])
            print('J: ', matrix[:, j])
            print('Prod: ', inner_product)
            print('Norm i: ', norm_i)
            print('Norm j: ', norm_j)

            # equality (up to sign) in Cauchy-Schwarz means the columns are parallel
            if np.abs(np.abs(inner_product) - norm_j * norm_i) < 1e-5:
                print('Dependent')
            else:
                print('Independent')
Testing the rows follows the same approach.
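A sketch of the row version (a hypothetical adaptation; note that this pairwise test finds nothing for the example matrix, because its dependence involves three rows rather than two parallel ones):

import numpy as np

matrix = np.array([[0, 1, 0, 0],
                   [0, 0, 1, 0],
                   [0, 1, 1, 0],
                   [1, 0, 0, 1]])

for i in range(matrix.shape[0]):
    for j in range(i + 1, matrix.shape[0]):
        inner_product = np.inner(matrix[i, :], matrix[j, :])
        norm_i = np.linalg.norm(matrix[i, :])
        norm_j = np.linalg.norm(matrix[j, :])
        # equality (up to sign) in Cauchy-Schwarz means rows i and j are parallel
        if np.abs(np.abs(inner_product) - norm_i * norm_j) < 1e-5:
            print('rows', i, 'and', j, 'are dependent')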
Then you could extend this to test all combinations of vectors, but I imagine this solution scales badly with size.