I want to convert a vector of indices into a matrix that has, in each row, a single one in the column given by the corresponding index.
x = [2;1;3;1];
m = someFunc(x,3)
% m =
%
% 0 1 0
% 1 0 0
% 0 0 1
% 1 0 0
The Deep Learning Toolbox (formerly Neural Network Toolbox) offers a pair of functions for exactly this representation: [ind, N] = vec2ind(vec) takes a matrix of vectors, each containing a single 1, and returns the indices of the ones, ind, and the number of rows in vec, N. Its inverse, ind2vec, lets indices be represented either by themselves or as vectors containing a 1 in the row of the index they represent.
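Assuming that toolbox is available, a minimal sketch: ind2vec expects a row vector of indices and returns a sparse matrix with one column per index, so the result is made full and transposed to match the layout asked for above.

x = [2;1;3;1];
m = full(ind2vec(x'))';   % 4x3, row i has its 1 in column x(i)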
Relatedly, [row, col] = ind2sub(sz, ind) returns the arrays row and col containing the row and column subscripts corresponding to the linear indices ind for a matrix of size sz; it is the inverse of the sub2ind function used in the answer below.
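A small round-trip illustration using the numbers from the example above (the values are my own, just to show the correspondence):

sz = [4 3];
ind = sub2ind(sz, [1 2 3 4], [2 1 3 1]);  % linear indices of the ones: [5 2 11 4]
[row, col] = ind2sub(sz, ind);            % recovers row = [1 2 3 4], col = [2 1 3 1]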
I tested the sub2ind function, but on the Coursera Machine Learning forum I was pointed to this beauty.
num_cols = 3;
m = eye(num_cols)(x,:);
It uses the identity matrix to select the appropriate row for each value in x: row k of the identity matrix has its single 1 in column k, so stacking rows x(1), x(2), ... produces exactly the matrix above. Note that indexing the result of a function call like this works in Octave but not in MATLAB.
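For MATLAB, a minimal equivalent sketch that first names the intermediate matrix (the variable name I is my own):

I = eye(3);    % 3x3 identity
m = I(x,:);    % pick row x(i) for each element of x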
One way is to use the SUB2IND function:
colN = 3;
assert(max(x)<=colN,'Not enough columns') %# check that you have enough columns
%# one could also check that x contains positive integers
m = zeros(numel(x),colN);                %# start from all zeros
m(sub2ind(size(m),1:numel(x),x')) = 1;   %# place a 1 at (i, x(i)) for every row i
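For completeness, a minimal alternative sketch that avoids explicit index arithmetic; it assumes implicit expansion (MATLAB R2016b or newer, or Octave), where comparing a column vector against a row vector broadcasts:

m = double(x == 1:colN);   %# x(i) == j is true exactly where row i needs its 1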