I am trying to dig deeper into the optimization of multivariable functions with scipy.
I have a function that returns a prediction from a data-mining tool after calling the tool via a batch file:
import os
import numpy as np

def query(x):
    file_direc_in = "path_to_input_file.csv"
    file_direc_out = "path_to_output_file.csv"
    # write the current parameter vector to the input file
    with open(file_direc_in, 'w') as f:
        np.savetxt(f, x, delimiter=';', fmt='%.3f', newline='\r\n')
    # the batch file takes the array I wrote to input_file and estimates a result
    os.system("Dataset_query.bat")
    # afterwards the output is read back from the output file
    with open(file_direc_out, 'r') as f:
        out = np.array([[float(f.readlines()[0])]])
    return out
from scipy.optimize import minimize
from calc import query
import numpy as np

x0 = np.array([[1.5, 50, 30]])
bnds = ((1, 2), (0.1, 100), (20, 100))
res = minimize(query, x0, method='SLSQP', bounds=bnds,
               options={'maxiter': 10, 'disp': True})
When I run the script I see the loop in my console, but it seems that no values are actually being tested for my variables, and I get the initial guess back:
Optimization terminated successfully. (Exit mode 0)
Current function value: [[ 1636.724]]
Iterations: 1
Function evaluations: 5
Gradient evaluations: 1
However, I know that for this problem the minimum lies at x_minimum = [1, 0.1, 100], with a value of about out = 400 (I have to decrease the first and second variables and increase the third to get a lower out).
What am I doing wrong here?
The solution in my case was to increase the finite-difference step size ('eps'), because my prediction function query is not smooth:
res = minimize(query, x0=x0, args=(hist, ana), method='SLSQP',
               bounds=bnds, options={'disp': True, 'eps': 1e0})
Searching for local minima does not make sense in my case, so I am now searching for the minimum in integer steps.
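One simple way to do such an integer-step search is scipy.optimize.brute over the same bounds; here with a hypothetical cheap stand-in for the external query() call (the real one shells out to the batch file):

```python
import numpy as np
from scipy.optimize import brute

# Hypothetical stand-in for the external query() call, chosen so that its
# minimum lies at [1, 0.1, 100], as in the question.
def query(x):
    return float((x[0] - 1) ** 2 + x[1] + (100 - x[2]))

# Same bounds as in the question; a step of 1 gives an integer-spaced grid.
ranges = (slice(1, 2.01, 1), slice(0.1, 100.1, 1), slice(20, 100.1, 1))

# finish=None returns the best grid point instead of polishing it further.
x_min = brute(query, ranges, finish=None)
print(x_min)  # best grid point: [1, 0.1, 100]
```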
According to @ali_m, scipy.optimize.basinhopping could be used instead to search for a global minimum. I will give it a try in the next few days.
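A minimal basinhopping sketch along those lines, again with a hypothetical smooth stand-in for query() (bounds and start value taken from the question):

```python
import numpy as np
from scipy.optimize import basinhopping

# Hypothetical smooth stand-in for query(), with its minimum at the
# point [1, 0.1, 100] mentioned in the question.
def query(x):
    return float(np.sum((x - np.array([1.0, 0.1, 100.0])) ** 2))

x0 = np.array([1.5, 50.0, 30.0])
bnds = ((1, 2), (0.1, 100), (20, 100))

# Each hop runs a bounded SLSQP local minimization; the coarse eps from
# the answer is passed through to that local step.
minimizer_kwargs = {'method': 'SLSQP', 'bounds': bnds,
                    'options': {'eps': 1e0}}
res = basinhopping(query, x0, minimizer_kwargs=minimizer_kwargs, niter=20)
print(res.x, res.fun)
```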