Is it possible to multiply two ndarrays A and B elementwise and add the result to C, without creating a large intermediate array for A times B?
NumPy has the out keyword argument for the case C = A times B:
numpy.multiply(A, B, out=C)
How about the case of C += A times B?
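For reference, the straightforward C += A * B does the job but first materializes the full product A * B as a temporary array. A minimal sketch of what that looks like (array names as in the question, sizes chosen arbitrarily):
import numpy as np

A = np.random.rand(1000, 1000)
B = np.random.rand(1000, 1000)
C = np.zeros_like(A)

# A * B is allocated as a full temporary array before being added into C in place.
C += A * B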
NumPy only evaluates one operation at a time, so there is no single fused multiply-add. With that said, there are several workarounds.
The simplest solution is to use in-place operations via += and *=:
import numpy as np
n = 100
b = 5.0
x = np.random.rand(n)
y = np.random.rand(n)
z = b * x   # allocates z to hold the product
z += y      # adds y in place, no further temporaries
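Applied to the question's C += A * B, the same trick still needs one buffer for the product, but you can preallocate that buffer once and reuse it across calls. A sketch (tmp, A, B, C are illustrative names):
import numpy as np

A = np.random.rand(1000)
B = np.random.rand(1000)
C = np.zeros_like(A)
tmp = np.empty_like(A)        # scratch buffer, allocated once and reused

np.multiply(A, B, out=tmp)    # elementwise product written into tmp, no fresh allocation
C += tmp                      # in-place add into C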
You can access the underlying BLAS routines and call them directly. Sadly, there is no fused multiply-add routine, but there is the "AXPY" routine, which performs
y <- a * x + y
This can be called via:
import scipy.linalg

axpy = scipy.linalg.blas.get_blas_funcs('axpy', arrays=(x, y))
axpy(x, y, n, b)   # updates y in place: y = b * x + y
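A self-contained version, reusing the names from the first snippet (as far as I understand, for contiguous float64 arrays get_blas_funcs resolves to daxpy and updates y in place; for other dtypes or layouts a modified copy is returned instead):
import numpy as np
import scipy.linalg

n = 100
b = 5.0
x = np.random.rand(n)
y = np.random.rand(n)
expected = b * x + y

axpy = scipy.linalg.blas.get_blas_funcs('axpy', arrays=(x, y))
axpy(x, y, n, b)
print(np.allclose(y, expected))   # True: y was overwritten with b * x + y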
Another option is to use a package like numexpr, which allows you to compile array expressions:
import numexpr
z = numexpr.evaluate('b * x + y')
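numexpr.evaluate also takes an out argument, so the result can be written into a preallocated array rather than a freshly allocated one (continuing the variable names from above):
import numexpr
import numpy as np

b = 5.0
x = np.random.rand(100)
y = np.random.rand(100)
z = np.empty_like(x)                   # preallocated output buffer

numexpr.evaluate('b * x + y', out=z)   # result written directly into z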
Recently, several machine-learning packages have started supporting compiled expressions; one such package is Theano. You could do something like:
import theano
xs = theano.tensor.vector()          # declare symbolic variable
ys = theano.tensor.vector()          # declare symbolic variable
out = b * xs + ys                    # build symbolic expression
f = theano.function([xs, ys], out)   # compile function
z = f(x, y)                          # evaluate with the NumPy arrays from above
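As a quick sanity check (assuming the NumPy arrays x and y from the first snippet are still in scope), the compiled function should reproduce the plain NumPy result:
import numpy as np
print(np.allclose(z, b * x + y))   # True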
I compared the different variants and found that you can't go wrong with SciPy's BLAS interface:
scipy.linalg.blas.daxpy(x, y, len(x), a)

Code to reproduce the plot:
import numexpr
import numpy as np
import perfplot
import scipy.linalg
import theano
a = 1.36
# theano preps
x = theano.tensor.vector()
y = theano.tensor.vector()
out = a * x + y
f = theano.function([x, y], out)
def setup(n):
    x = np.random.rand(n)
    y = np.random.rand(n)
    return x, y
def manual_axpy(data):
    x, y = data
    return a * x + y
def manual_axpy_inplace(data):
    x, y = data
    out = a * x
    out += y
    return out
def scipy_axpy(data):
    x, y = data
    n = len(x)
    axpy = scipy.linalg.blas.get_blas_funcs("axpy", arrays=(x, y))
    axpy(x, y, n, a)
    return y
def scipy_daxpy(data):
    x, y = data
    return scipy.linalg.blas.daxpy(x, y, len(x), a)
def numexpr_evaluate(data):
    x, y = data
    return numexpr.evaluate("a * x + y")
def theano_function(data):
    x, y = data
    return f(x, y)
b = perfplot.bench(
    setup=setup,
    kernels=[
        manual_axpy,
        manual_axpy_inplace,
        scipy_axpy,
        scipy_daxpy,
        numexpr_evaluate,
        theano_function,
    ],
    n_range=[2 ** k for k in range(24)],
    equality_check=None,  # some kernels modify y in place, so outputs are not compared
    xlabel="len(x), len(y)",
)
# b.save("out.png")
b.show()