I am trying to solve an optimization problem using the package nloptr
in R. I am not sure what is wrong with the following code, as I keep getting this error:
Error: nlopt_add_equality_mconstraint returned NLOPT_INVALID_ARGS.
Here is the problem (note that (A+)^T is the transpose of the Moore-Penrose inverse of matrix A)
and the code:
library( MASS ) ## for the Moore-Penrose inverse ginv()
library( zoo )
library( nloptr )
A = matrix(runif(27, -0.5, 0.5), nc = 3)
sigma = diag(runif(3,0.001,0.002))
M = ncol(A)
b = 1 / M
n = nrow(A)
init_y = rep(1/M, M)
c = -ncol(A)*log(ncol(A))
#### Objective function
eval_f <- function(y)
{
return( sqrt( as.numeric( t(y) %*% sigma %*% y ) ) )
}
#### Gradient of objective function
eval_grad_f <- function(y)
{
return( ( 2* sigma %*% y) / as.numeric(sqrt( t(y) %*% sigma %*% y )) )
}
#### Equality Constraint
eval_g_eq <- function(y)
{
return( ( t(ginv(A)) %*% y ) - 1 ) ## ginv(A) is the Moore-Penrose inverse of A
}
#### Inequality constraint
eval_g_ineq <- function(y)
{
return( c - sum( log(y) * b ) )
}
#### Jacobian of equality constraint
eval_jac_g_eq <- function(y)
{
return( ginv(A) )
}
#### Jacobian of inequality constraint
eval_jac_g_ineq <- function(y)
{
return( (-1/y) * b )
}
opts <- list( "algorithm" = "NLOPT_LD_SLSQP",
"xtol_rel" = 1.0e-14)
res0 <- nloptr( x0=init_y,
eval_f=eval_f,
eval_grad_f=eval_grad_f,
lb = rep(0,ncol(A)),
ub = rep(Inf,ncol(A)),
eval_g_eq = eval_g_eq,
eval_jac_g_eq = eval_jac_g_eq,
eval_g_ineq = eval_g_ineq,
eval_jac_g_ineq = eval_jac_g_ineq,
opts = opts
)
I've seen this error come up when people define a constraint function with the wrong number of arguments. More generally, the problem is probably in how you're setting up the constraints; the nloptr documentation describes the form they are expected to take.
In this case the culprit appears to be the function you're passing as eval_g_eq. It returns t(ginv(A)) %*% y - 1, which is a vector of length nrow(A) = 9, i.e. nine equality constraints, while there are only three decision variables. As far as I can tell, NLopt refuses to register more equality constraints than there are variables, which is exactly what the NLOPT_INVALID_ARGS message is telling you. If I set eval_g_eq to either NULL or the example function given on Page 4 of the documentation, then there are no errors.
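To see the mismatch for yourself, compare the number of equality constraints returned by eval_g_eq with the number of decision variables. The sketch below assumes the objects from your question (A, sigma, init_y, the eval_* functions and opts) are already defined in the session; res1 is just a name I've picked for a re-run with the equality constraint dropped, which mirrors the "set it to NULL" fix above.
length(init_y)              ## 3  -- number of decision variables
length(eval_g_eq(init_y))   ## 9  -- number of equality constraints, one per row of A
dim(eval_jac_g_eq(init_y))  ## 3 x 9 -- nloptr expects one row per constraint and one
                            ## column per variable, so this would need to be t(ginv(A)) anyway
## Same call as before, but with the equality constraint dropped. This only
## removes the NLOPT_INVALID_ARGS error; it does not enforce the original
## equality constraints, so the solution may not be the one you are after.
res1 <- nloptr( x0 = init_y,
                eval_f = eval_f,
                eval_grad_f = eval_grad_f,
                lb = rep(0, ncol(A)),
                ub = rep(Inf, ncol(A)),
                eval_g_eq = NULL,
                eval_jac_g_eq = NULL,
                eval_g_ineq = eval_g_ineq,
                eval_jac_g_ineq = eval_jac_g_ineq,
                opts = opts )
If you genuinely need all nine row constraints, you would have to reformulate the problem so that the number of equality constraints does not exceed the number of variables.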