I'm doing astronomical image processing in Python, and numpy.std(a) is consuming far too much memory. Some searching turns up the ncreduce package by Luis Pedro, but I'm having difficulty building my download of the package from here. ActiveState seems to suggest that this package won't build on Windows. I'm using Windows 7 and Python 2.7.
Is it possible to use ncreduce on Windows? If not, is there an alternative fast algorithm for computing standard deviation or variance that isn't as memory-hungry as numpy.std(a)?
The package requires a few small changes to build with MSVC. It is quite old and has no tests, so use it at your own risk.
--- ncreduce/reduce.cpp Thu Aug 14 13:02:50 2008
+++ ncreduce/reduce.cpp Thu Sep 26 11:56:04 2013
@@ -6,6 +6,7 @@
#include <iterator>
#include <vector>
#include <cmath>
+#include <limits>
extern "C" {
#include <Python.h>
#include <numpy/ndarrayobject.h>
@@ -98,7 +99,7 @@
}
*result /= N;
if (extra.is_std) {
- *result = std::sqrt(*result);
+ *result = std::sqrt((double)(*result));
}
}
@@ -142,7 +143,7 @@
for (unsigned i = 0; i != result.diameter(); ++i) {
first_result[i] = divide(first_result[i],ArrSize/result.diameter());
if (extra.is_std) {
- first_result[i] = sqrt(first_result[i]);
+ first_result[i] = sqrt((double)first_result[i]);
}
}
--- setup.py Thu Aug 14 13:54:48 2008
+++ setup.py Thu Sep 26 12:03:16 2013
@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
from numpy.distutils.core import setup, Extension
-ncreduce = Extension('ncreduce', sources = ['ncreduce/reduce.cpp', 'ncreduce/numpy_utils.hpp'], extra_compile_args=['-Wno-sign-compare'])
+ncreduce = Extension('ncreduce', sources = ['ncreduce/reduce.cpp', 'ncreduce/numpy_utils.hpp'], extra_compile_args=['/EHsc'])
classifiers = [
'Development Status :: 4 - Beta',
I put the binaries at http://www.lfd.uci.edu/~gohlke/pythonlibs/ . Search for ncreduce.
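If building the patched extension is not an option, a chunked one-pass variance keeps memory bounded: process the array in fixed-size slices and merge per-chunk statistics with Chan et al.'s parallel update (the same idea behind Welford's algorithm). Below is a minimal sketch; `chunked_std` and the default chunk size are illustrative choices of mine, not part of ncreduce or NumPy:

```python
import numpy as np

def chunked_std(a, chunk=1 << 20):
    """Standard deviation via chunked one-pass accumulation.

    Temporaries never exceed `chunk` elements, unlike numpy.std,
    which materialises a full-size (a - mean) array.
    """
    a = np.asarray(a).ravel()
    n = 0          # running element count
    mean = 0.0     # running mean
    m2 = 0.0       # running sum of squared deviations
    for start in range(0, a.size, chunk):
        x = a[start:start + chunk].astype(np.float64)
        nb = x.size
        mb = x.mean()
        m2b = np.square(x - mb).sum()   # chunk-sized temporary only
        # Chan et al. merge of (n, mean, m2) with the chunk's stats
        delta = mb - mean
        total = n + nb
        mean += delta * nb / total
        m2 += m2b + delta * delta * n * nb / total
        n = total
    return np.sqrt(m2 / n)   # population std, matching numpy.std's ddof=0
```

The merge step is numerically far safer than the naive sum-of-squares shortcut `E[x**2] - E[x]**2`, which loses precision badly on image data with a large mean relative to its spread.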