
Expand tensor in TensorFlow

In TensorFlow, I want to manipulate a tensor with the Taylor series of sin(x), using a fixed number of approximation terms. I tried this on a grayscale image of shape (32, 32) and it works fine. Now I am having trouble applying the same thing to an RGB image of shape (32, 32, 3); it doesn't give me the correct array. Can anyone show me a possible way of doing this in TensorFlow?

my attempt:

Here is the Taylor expansion of sin(x) at x = 0 with three expansion terms: x - x**3/6 + x**5/120.
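As a quick sanity check in plain Python (just to pin down the terms I mean):

import numpy as np

x = np.pi / 4
approx = x - x**3 / 6 + x**5 / 120  # three-term Taylor series of sin(x)
print(approx, np.sin(x))            # 0.70714... vs 0.70711...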

import math
import numpy as np
from tensorflow.keras.datasets import mnist

(X_train, y_train), (X_test, y_test) = mnist.load_data()

x = X_train[1, :, :].astype('float32')  # one grayscale image
k = 3                                   # number of expansion terms
func = 'sin(x)'

# k columns per original column: one slot for each Taylor term
new_x = np.zeros((x.shape[0], x.shape[1] * k), dtype='float32')

nn = 0
for i in range(x.shape[1]):
    col_d = x[:, i].ravel()
    new_x[:, nn] = col_d  # the first term of sin(x) is x itself
    if k > 1:
        for j in range(1, k):
            if func == 'sin(x)':
                # j-th term: (-1)^j * x^(2j+1) / (2j+1)!
                new_x[:, nn + j] = ((-1) ** j / math.factorial(2 * j + 1)) * col_d ** (2 * j + 1)
    nn += k

I think I could do this more efficiently with TensorFlow, but it isn't obvious to me how. Can anyone suggest a possible way to make this work?

update:

In the 2-dim case, col_d = x[:, i].ravel() is the flattened pixel vector of one column. Similarly, we could flatten a 3-dim array to 2 dims with something like x.reshape(x.shape[0], -1) and loop over that, but this still doesn't give the correct result. I think TensorFlow might have a better way of doing this. How can we apply the Taylor series of sin(x) to the tensor more efficiently?
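For the record, I can vectorize the NumPy version with broadcasting, which avoids the per-column loop entirely and works for both (H, W) and (H, W, C) arrays (taylor_sin_terms is just a helper I wrote for this):

import math
import numpy as np

def taylor_sin_terms(x, n_terms=3):
    # stack the first n_terms Taylor terms of sin(x) on a new last axis
    j = np.arange(n_terms)
    coeffs = (-1.0) ** j / np.array([math.factorial(2 * i + 1) for i in j])
    # x[..., None] has shape (..., 1) and broadcasts against the (n_terms,) exponents
    return coeffs * x[..., None] ** (2 * j + 1)

rgb = np.random.rand(32, 32, 3).astype('float32')
terms = taylor_sin_terms(rgb)                          # shape (32, 32, 3, 3)
print(np.abs(terms.sum(axis=-1) - np.sin(rgb)).max())  # small for x in [0, 1]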

goal:

Intuitively, x in the Taylor series of sin(x) is a tensor, and if we want only the first 2 or 3 approximation terms of the series for each element, I want to collect them in a new tensor. How should we do that efficiently in TensorFlow?
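To make the goal concrete, here is a sketch of the shapes I have in mind, written with plain broadcasting (tf_sin_terms is just my name for the helper I am trying to write, and I am not sure this is the idiomatic TensorFlow way):

import tensorflow as tf

def tf_sin_terms(x, n_terms=3):
    # stack the first n_terms Taylor terms of sin(x) on a new last axis
    i = tf.range(n_terms, dtype=x.dtype)
    exponents = 2.0 * i + 1.0                             # 1, 3, 5, ...
    signs = 1.0 - 2.0 * tf.math.mod(i, 2.0)               # +1, -1, +1, ...
    factorials = tf.exp(tf.math.lgamma(exponents + 1.0))  # (2i + 1)!
    return (signs / factorials) * tf.pow(x[..., None], exponents)

x = tf.random.uniform([8, 32, 32, 3])
terms = tf_sin_terms(x)                 # shape (8, 32, 32, 3, 3)
approx = tf.reduce_sum(terms, axis=-1)  # ≈ tf.sin(x) for small x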

asked Nov 29 '25 by Jerry07

2 Answers

new_x = np.zeros((x.shape[0], x.shape[1] * k))

This allocation doesn't make sense: why reserve x.shape[1] * k (96 for a 32-pixel-wide image) columns for a 3-term Taylor expansion? In your original loop, most of the array is never even written:

(new_x[:, 3:] == 0.0).all()  # True

For a pixel-wise Taylor expansion with n terms:


import math
import numpy as np

def sin_exp_step(x, i):
    # i-th Taylor term of sin(x) at 0: (-1)^i * x^(2i+1) / (2i+1)!
    c1 = 2 * i + 1
    c2 = (-1) ** i / math.factorial(c1)
    return c2 * (x ** c1)

# validate at x = 45 degrees
n_terms = 3

x = (np.pi / 180.0) * 45.0
y = np.sin(x)

approx_y = 0.0
for i in range(n_terms):
    approx_y += sin_exp_step(x, i)

print(abs(approx_y - y) < 1e-4)  # True: the three-term error is about 3.7e-5

x = X_train[1, :, :, :]  # one RGB image (e.g. CIFAR-10), shape (32, 32, 3)
n_terms = 3
func = 'sin(x)'

# one slot per term on a new last axis
new_x = np.zeros((*x.shape, n_terms))

for i in range(n_terms):
    if func == 'sin(x)':
        new_x[..., i] += sin_exp_step(x, i)
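Summing over the new last axis recovers the truncated series (note this only approximates np.sin(x) well if the pixel values are first scaled to a small range, e.g. [0, 1]):

approx = new_x.sum(axis=-1)
print(np.abs(approx - np.sin(x)).max())  # small only for small x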

Numerical series approximations like this are commonly avoided: they are computationally expensive (the factorial, in particular) and less stable, so gradient-based optimization is usually the better choice. For higher-order information, algorithms such as BFGS and L-BFGS approximate the Hessian matrix (second-order derivatives), while first-order optimizers such as Adam and SGD are sufficient and much cheaper. Using a neural network, we might even be able to learn a much better expansion.


TensorFlow solution for an n-term expansion

import numpy as np

import tensorflow as tf
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.layers import Input, LocallyConnected2D
from tensorflow.keras.models import Model
from tensorflow.keras import backend as K

(x_train, y_train), (x_test, y_test) = cifar10.load_data()

x_train = tf.constant(x_train, dtype=tf.float32)
x_test = tf.constant(x_test, dtype=tf.float32)

def expansion_approx_of(func):

  def reconstruction_loss(y_true, y_pred):

    loss = (y_pred - func(y_true)) ** 2
    loss = 0.5 * K.mean(loss)

    return loss

  return reconstruction_loss

class Expansion2D(LocallyConnected2D): # n-term expansion layer: a 1x1 locally connected conv learns separate term weights for every pixel

  def __init__(self, i_shape, n_terms, kernel_size=(1, 1), *args, **kwargs):
    
    if len(i_shape) != 3:
      
      raise ValueError('i_shape must be (height, width, channels)')

    self.i_shape = i_shape
    self.n_terms = n_terms

    filters = self.n_terms * self.i_shape[-1]
    
    super(Expansion2D, self).__init__(filters=filters, kernel_size=kernel_size,
                                      use_bias=False, *args, **kwargs)
    
  def call(self, inputs):

    out = super().call(inputs)  # shape (batch, H, W, n_terms * C)

    # split the filters into (channels, terms), keep both the terms and their sum
    shape = (-1, self.i_shape[0], self.i_shape[1], self.i_shape[-1], self.n_terms)
    expansion = tf.reshape(out, shape)

    out = tf.math.reduce_sum(expansion, axis=-1)

    return out, expansion

inputs = Input(shape=(32, 32, 3))

# expansion: might be a taylor expansion or something better.
out, expansion = Expansion2D(i_shape=(32, 32, 3), n_terms=3)(inputs)

model = Model(inputs, [out, expansion])

opt = tf.keras.optimizers.Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999)
loss = expansion_approx_of(K.sin)

# train only the summed output to match sin(x); no loss on the raw expansion output
model.compile(optimizer=opt, loss=[loss, None])

model.summary()

model.fit(x_train, x_train, batch_size=1563, epochs=100)  # batches of 1563 samples: 32 steps per epoch on CIFAR-10's 50000 images

x_pred, x_exp = model.predict_on_batch(x_test[:32])

# the first output is the sum of the expansion terms
print(np.allclose(x_exp[0].sum(axis=-1), x_pred[0]))

err = np.abs(x_pred - np.sin(x_test[:32].numpy())).mean()
print(err)
answered Nov 30 '25 by 4.Pi.n


Put three expansion terms into a tensor at axis=1

x = tf.ones([8, 32, 32, 3], tf.float32) * 0.5  # example batchsize=8, imageshape=[32, 32, 3]
x = tf.stack([x, - (1/6) * tf.math.pow(x, 3), (1/120) * tf.math.pow(x, 5)], axis=1) # expansion of three terms of sin(x), [8, 3, 32, 32, 3]
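Summing over that new axis then gives the three-term approximation itself:

approx = tf.reduce_sum(x, axis=1)  # ≈ tf.sin of the original tensor, shape [8, 32, 32, 3]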

If you are using the tf.keras Functional API or Sequential API, you can wrap this in a Keras custom layer (see the sketch below).

See the docs for tf.math.pow and tf.stack.

Edit: in the first version of this answer I recommended tf.keras.layers.Lambda, but it might not work with tf.math.pow or tf.stack (I haven't tried). A custom layer is the safer route.
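A minimal sketch of such a custom layer (the class name SinExpansion is my own choice here, with the same three hard-coded terms as above):

import tensorflow as tf

class SinExpansion(tf.keras.layers.Layer):
    # stacks the first three Taylor terms of sin(x) along a new axis=1
    def call(self, x):
        terms = [x,
                 -(1.0 / 6.0) * tf.math.pow(x, 3),
                 (1.0 / 120.0) * tf.math.pow(x, 5)]
        return tf.stack(terms, axis=1)  # (batch, 3, H, W, C)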

answered Nov 30 '25 by Watanabe.N


