
I am trying to run a minimization problem using scipy.optimize, including a NonlinearConstraint. I really don't want to code the derivatives myself, so I'm using autograd to do it. But even though I follow the exact same procedure for the arguments to minimize and to NonlinearConstraint, the first seems to work and the second doesn't.

Here is my MWE:

useconstraint = False

import autograd
import autograd.numpy as np
from scipy import optimize

def function(x): return x[0]**2 + x[1]**2
functionjacobian = autograd.jacobian(function)
functionhvp = autograd.hessian_vector_product(function)

def constraint(x): return np.array([x[0]**2 - x[1]**2])
constraintjacobian = autograd.jacobian(constraint)
constrainthvp = autograd.hessian_vector_product(constraint)

constraint = optimize.NonlinearConstraint(constraint, 1, np.inf, constraintjacobian, constrainthvp)

startpoint = [1, 2]

bounds = optimize.Bounds([-np.inf, -np.inf], [np.inf, np.inf])

print(optimize.minimize(
  function,
  startpoint,
  method='trust-constr',
  jac=functionjacobian,
  hessp=functionhvp,
  constraints=[constraint] if useconstraint else [],
  bounds=bounds,
))

When I turn useconstraint off (at the top), it works fine and minimizes at (0, 0) as expected. When I turn it on, I get the following error:

Traceback (most recent call last):
  File "test.py", line 29, in <module>
    bounds=bounds,
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_minimize.py", line 613, in minimize
    callback=callback, **options)
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_trustregion_constr/minimize_trustregion_constr.py", line 336, in _minimize_trustregion_constr
    for c in constraints]
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_constraints.py", line 213, in __init__
    finite_diff_bounds, sparse_jacobian)
  File "/home/heshy/.local/lib/python2.7/site-packages/scipy/optimize/_differentiable_functions.py", line 343, in __init__
    self.H = hess(self.x, self.v)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/wrap_util.py", line 20, in nary_f
    return unary_operator(unary_f, x, *nary_op_args, **nary_op_kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/differential_operators.py", line 24, in grad
    vjp, ans = _make_vjp(fun, x)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/core.py", line 10, in make_vjp
    end_value, end_node =  trace(start_node, fun, x)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 10, in trace
    end_box = fun(start_box)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/wrap_util.py", line 15, in unary_f
    return fun(*subargs, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/differential_operators.py", line 88, in vector_dot_grad
    return np.tensordot(fun_grad(*args, **kwargs), vector, np.ndim(vector))
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 44, in f_wrapped
    ans = f_wrapped(*argvals, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/autograd/tracer.py", line 48, in f_wrapped
    return f_raw(*args, **kwargs)
  File "/home/heshy/.local/lib/python2.7/site-packages/numpy/core/numeric.py", line 1371, in tensordot
    raise ValueError("shape-mismatch for sum")
ValueError: shape-mismatch for sum

What am I doing wrong? I think the problem is in hessian_vector_product, since I see hess in the error message, but I'm not sure.


1 Answer


Ok, I found the answer. It was very confusing.

The hessp argument to minimize expects a function that returns the "Hessian of objective function times an arbitrary vector p" (source). By contrast, the hess argument to NonlinearConstraint expects "A callable [that] must return the Hessian matrix of dot(fun, v)" (source).
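
To make the difference concrete, here is a shape-level sketch of the two callback contracts for this MWE. The demo functions and hard-coded Hessians below are illustrative stand-ins of my own, not anything from scipy or autograd:

import numpy as np

n = 2  # number of variables in the MWE

# minimize(..., hessp=...): hessp(x, p) must return the *vector* H_f(x) @ p
def hessp_demo(x, p):
    H_f = 2 * np.eye(n)           # Hessian of x[0]**2 + x[1]**2
    return np.dot(H_f, p)         # shape (n,)

# NonlinearConstraint(..., hess=...): hess(x, v) must return the *matrix*
# sum_i v[i] * H_{c_i}(x), i.e. the Hessian of dot(c(x), v)
def hess_demo(x, v):
    H_c0 = np.diag([2.0, -2.0])   # Hessian of the single constraint x[0]**2 - x[1]**2
    return v[0] * H_c0            # shape (n, n)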

If you read the first quote the way I did, as "Hessian of (objective function times an arbitrary vector p)", it means pretty much the same thing as "Hessian matrix of dot(fun, v)". I therefore assumed you could use the same autograd function for both.

However, the correct reading is "(Hessian of objective function) times an arbitrary vector p", which is completely different. The hessian_vector_product function in autograd gives the right result for the first, but you need a different function for the second.
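
The answer doesn't show the replacement, but one way to build a suitable hess for NonlinearConstraint is to differentiate the scalar dot(constraint(x), v) with autograd.hessian. This is only a sketch of that idea (constrainthess and nonlinear are my own names, not part of either library), reusing the constraint from the MWE:

import autograd
import autograd.numpy as np
from scipy import optimize

def constraint(x): return np.array([x[0]**2 - x[1]**2])
constraintjacobian = autograd.jacobian(constraint)

# NonlinearConstraint wants the full (n, n) Hessian of dot(constraint(x), v),
# i.e. sum_i v[i] * Hessian(constraint_i)(x). Differentiating the scalar
# dot(constraint(y), v) with autograd.hessian gives exactly that matrix.
def constrainthess(x, v):
    return autograd.hessian(lambda y: np.dot(constraint(y), v))(x)

nonlinear = optimize.NonlinearConstraint(constraint, 1, np.inf,
                                         constraintjacobian, constrainthess)

Passing constrainthess instead of the hessian_vector_product-based constrainthvp in the MWE should avoid the shape mismatch, since trust-constr calls hess(x, v) expecting a matrix rather than a Hessian-vector product.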

answered 2019-03-12T20:34:10.357