I have a 1D tensor that I wish to partition into overlapping blocks. I'm thinking of something like:
tensor = tf.constant([1, 2, 3, 4, 5, 6, 7])
overlapping_blocker(tensor, block_size=3, stride=2)
=> [[1, 2, 3], [3, 4, 5], [5, 6, 7]]
So far I've only found ways to partition a tensor into non-overlapping blocks. Does anybody know of a way to solve this?
This needs to work for an arbitrary input length (i.e. my input is something like tf.placeholder([None])).
You can use tf.slice on higher-dimensional tensors as well, and tf.strided_slice lets you extract slices by 'striding' over the tensor dimensions. Alternatively, to subset a tensor at some indices [a, b, c], build the indices in the format [[0, a], [1, b], [2, c]] and pass them to tf.gather_nd() to get the subset.
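For the 1D case in the question, the same index-gathering idea can be written with tf.gather: build one row of index positions per block and gather them in a single call. Here is a minimal sketch in TF 1.x style, matching the rest of this thread; overlapping_blocks_gather and its parameters are just illustrative names:
import numpy as np
import tensorflow as tf

def overlapping_blocks_gather(tensor, block_size=3, stride=2):
    # Number of complete blocks that fit (no padding, like 'VALID').
    num_blocks = (tf.shape(tensor)[0] - block_size) // stride + 1
    # indices[i, j] = i * stride + j, i.e. one row of positions per block.
    indices = stride * tf.range(num_blocks)[:, None] + tf.range(block_size)[None, :]
    return tf.gather(tensor, indices)

tensor = tf.placeholder(tf.int32, [None])
blocks = overlapping_blocks_gather(tensor, block_size=3, stride=2)
with tf.Session() as sess:
    print(sess.run(blocks, {tensor: np.array([1, 2, 3, 4, 5, 6, 7], np.int32)}))
    # [[1 2 3]
    #  [3 4 5]
    #  [5 6 7]]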
You can achieve the same using tf.extract_image_patches.
import numpy as np
import tensorflow as tf

tensor = tf.placeholder(tf.int32, [None])
def overlapping_blocker(tensor, block_size=3, stride=2):
    # Treat the 1D tensor as a [1, length, 1, 1] "image" and slide a block_size window over it.
    return tf.squeeze(tf.extract_image_patches(tensor[None, ..., None, None],
        ksizes=[1, block_size, 1, 1], strides=[1, stride, 1, 1],
        rates=[1, 1, 1, 1], padding='VALID'))
result = overlapping_blocker(tensor, block_size=3, stride=2)
sess = tf.InteractiveSession()
print(result.eval({tensor: np.array([1, 2, 3, 4, 5, 6, 7], np.int32)}))
# [[1 2 3]
#  [3 4 5]
#  [5 6 7]]
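Note that with padding='VALID' any trailing elements that don't fill a complete block are dropped; e.g. an input of length 8 with block_size=3 and stride=2 still yields the same three blocks, and the last element is discarded.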