
How to force larger steps on scipy.optimize functions?

I have a function compare_images(k, a, b) that compares two 2d-arrays a and b.

Inside the function, I apply a gaussian_filter with sigma=k to a. My idea is to estimate how much I must smooth image a in order for it to be similar to image b.

The problem is that my function compare_images will only return different values if the variation in k is over 0.5, so if I do fmin(compare_images, init_guess, (a, b)) it usually gets stuck at the init_guess value.

I believe the problem is that fmin (and minimize) tend to start with very small steps, which in my case reproduce the exact same return value for compare_images, so the method thinks it has already found a minimum. It will only try a couple of times.

Is there a way to force fmin or any other minimizing function from scipy to take larger steps? Or is there any method better suited for my need?

EDIT: I found a temporary solution. First, as recommended, I used xtol=0.5 or higher as an argument to fmin. Even then, I still had some problems, and a few times fmin would return init_guess. I then created a simple loop so that whenever fmin returned init_guess, I would generate another, random init_guess and try again.

It's pretty slow, of course, but now I got it to run. It will take 20h or so to run it for all my data, but I won't need to do it again.
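
Roughly, the retry loop looks like this (just a sketch; compare_images, a, and b are from my setup, and the guess range is only illustrative):

    import numpy as np
    from scipy.optimize import fmin

    def fit_sigma(a, b, init_guess=5.0):
        k = fmin(compare_images, init_guess, args=(a, b), xtol=0.5)[0]
        while np.isclose(k, init_guess):
            # fmin never left the starting point: retry from a random guess
            init_guess = np.random.uniform(0.5, 20.0)  # illustrative range
            k = fmin(compare_images, init_guess, args=(a, b), xtol=0.5)[0]
        return k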

Anyway, to better explain the problem for those still interested in finding a better solution:

  • I have 2 images, A and B, containing some scientific data.
  • A looks like a few dots with variable values (it's a matrix in which each valued point represents where an event occurred and its intensity)
  • B looks like a smoothed heatmap (it is the observed density of occurrences)
  • B looks just as if you had applied a gaussian filter to A, with a bit of semi-random noise.
  • We are approximating B by applying a gaussian filter with a constant sigma to A. This sigma was chosen visually, but only works for a certain class of images.
  • I'm trying to obtain an optimal sigma for each image, so that later I can find some relation between sigma and the class of event shown in each image.

Anyway, thanks for the help!

asked Dec 09 '13 by user2329994


4 Answers

I realize this is an old question but I haven't been able to find many discussions of similar topics. I am facing a similar issue with scipy.optimize.least_squares. I found that xtol did not do me much good; it did not seem to change the step size at all. What made a big difference was diff_step. This sets the step size taken when numerically estimating the Jacobian according to the formula step_size = x_i * diff_step, where x_i is each independent variable. You are using fmin, so you aren't calculating Jacobians, but if you used another scipy function like minimize for the same problem, this might help you.
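
For example, a minimal sketch (the piecewise-constant residual here is made up to mimic the situation in the question):

    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical residual that is flat below a 0.5 change in k,
    # mimicking compare_images from the question.
    def residuals(k):
        return np.array([np.floor(k[0] * 2.0) / 2.0 - 3.0])

    # diff_step sets the relative finite-difference step used for the
    # Jacobian: step_size = x_i * diff_step. A large value lets the
    # estimate see past the flat region.
    res = least_squares(residuals, x0=[10.0], diff_step=0.5)
    print(res.x, res.fun)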

answered by Max


Basin hopping may do a bit better, as it has a high chance of continuing anyway when it gets stuck on a plateau.

I found that it does reasonably well on this example function with a low temperature:

>>> import scipy.optimize as opt
>>> opt.basinhopping(lambda x: int(0.1*x[0]**2 + 0.1*x[1]**2), (5, -5), T=.1)
    nfev: 409
     fun: 0
       x: array([ 1.73267813, -2.54527514])
 message: ['requested number of basinhopping iterations completed successfully']
    njev: 102
     nit: 100
answered by Alex


Quick check: you probably really meant fmin(compare_images, init_guess, (a,b))?

If gaussian_filter behaves as you say, your function is piecewise constant, meaning that optimizers relying on derivatives (i.e. most of them) are out. You can try a global optimizer like anneal, or brute-force search over a sensible range of k's.
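
For instance, a brute-force search could look roughly like this (compare_images, a, b, and the range of k are assumed from your question):

    from scipy.optimize import brute

    # Evaluate compare_images on a grid of 40 k-values in [0.5, 20] and
    # keep the best grid point. finish=None skips the local polishing
    # step, which would stall on a piecewise-constant function.
    k_best = brute(compare_images, ranges=[(0.5, 20.0)], args=(a, b),
                   Ns=40, finish=None)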

However, as you described the problem, in general there will only be a clear, global minimum of compare_images if b is a smoothed version of a. Your approach makes sense if you want to determine the amount of smoothing of a that makes both images most similar.

If the question is "how similar are the images", then I think pixelwise comparison (maybe with a bit of smoothing) is the way to go. Depending on what images we are talking about, it might be necessary to align the images first (e.g. for comparing photographs). Please clarify :-)

edit: Another idea that might help: rewrite compare_images so that it calculates two versions of smoothed-a -- one with sigma=floor(k) and one with sigma=ceil(k) (i.e. k rounded down/up to the nearest integer). Then calculate a_smooth = a_floor*(1-kfrac) + a_ceil*kfrac, with kfrac being the fractional part of k. This way the compare function becomes continuous w.r.t. k.
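
A sketch of that idea (I'm assuming the similarity metric is a simple sum of squared differences; the question doesn't specify it):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def compare_images(k, a, b):
        # k may arrive as a 1-element array from the optimizer
        k = float(np.atleast_1d(k)[0])
        k_lo, k_hi = np.floor(k), np.ceil(k)
        kfrac = k - k_lo
        # blend the two integer-sigma smoothings so the result
        # varies continuously with k
        a_smooth = (gaussian_filter(a, sigma=k_lo) * (1.0 - kfrac)
                    + gaussian_filter(a, sigma=k_hi) * kfrac)
        # assumed metric: sum of squared differences
        return np.sum((a_smooth - b) ** 2)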

Good Luck!

answered by Torben Klein


I had the same problem and got it to work with the 'TNC' method.

    from scipy.optimize import minimize

    res = minimize(f, [1] * 2, method='TNC', bounds=[(0, 15)] * 2, jac='2-point',
                   options={'disp': True, 'finite_diff_rel_step': 0.1,
                            'xtol': 0.1, 'accuracy': 0.1, 'eps': 0.1})

The combination of 'finite_diff_rel_step' and setting 'jac' to one of {'2-point', '3-point', 'cs'} did the trick for the Jacobian calculation step, and 'accuracy' did the trick for the step size. I don't think 'xtol' and 'eps' are needed; I added them just in case.

In the example, I have 2 variables initialized to 1 with bounds [0, 15], because I'm approximating the parameters of a beta distribution, but it should apply to your case as well.

answered by AxelWass