
I am trying to use minimize_scalar to find the minimum and maximum of a one-dimensional polynomial.

The polynomial is x^6 - 2x^5 - 26x^4 + 28x^3 + 145x^2 - 26x - 80

The code looks like this:

import numpy as np
from scipy.optimize import minimize_scalar

ppar = np.array([1, -2, -26, 28, 145, -26, -80])
p = np.poly1d(ppar)
print(p)

# maximize p by minimizing -p (np.poly1d supports negation)
maximum = minimize_scalar(-p, bounds=(-3.978, 3.068), method='bounded')
print("(-3.978, 3.068)", -maximum.fun)
print(maximum)

minimum = minimize_scalar(p, bounds=(-3.978, 3.068), method='bounded')
print("(-3.978, 3.068)", minimum.fun)
print(minimum)

The result is:

   6     5      4      3       2
1 x - 2 x - 26 x + 28 x + 145 x - 26 x - 80
(-3.978, 3.068) 86.0883584933
  status: 0
    nfev: 12
 success: True
     fun: -86.0883584932823
       x: -1.5444720061831096
 message: 'Solution found.'
(-3.978, 3.068) -81.1476243092
  status: 0
    nfev: 11
 success: True
     fun: -81.147624309245643
       x: 0.08767224353999728
 message: 'Solution found.'

However, the actual solutions on this interval should be a maximum of 264.155 at x = 2.176 and a minimum of -436.947 at x = -3.391.

Does anyone know what is wrong with my code, or what I am missing?

Thanks for any help.


2 Answers


Polynomials oscillate and generally have several extrema. What you got are simply different ones: a local minimizer finds *a* minimum, and if there are several, it reports one of them (which one depends on the bounds).
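To see this in action, narrowing the bracket so that only one extremum lies inside it steers the bounded minimizer to that one. A small sketch (the sub-interval (-3.978, -2.5) is chosen arbitrarily to isolate the extremum near x = -3.391):

```python
import numpy as np
from scipy.optimize import minimize_scalar

p = np.poly1d([1, -2, -26, 28, 145, -26, -80])

# Same call as in the question, but the bracket now contains only the
# extremum near x = -3.391, so the local minimizer has no other choice.
res = minimize_scalar(p, bounds=(-3.978, -2.5), method='bounded')
print(res.x, res.fun)
```

With the original bracket (-3.978, 3.068), the minimizer is free to converge to the shallower minimum near x = 0.088 instead.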

For polynomials, you are best off using specialized minimizers. E.g., differentiate and find the roots of the derivative using the companion matrix approach:

In [53]: coef = [-26, 2*145, 3*28, -4*26, -5*2]    # coefficients for the derivative

In [54]: coef = np.asarray(coef, dtype=float)

In [55]: coef /= 6  # make it monic

In [56]: coef
Out[56]: array([ -4.33333333,  48.33333333,  14.        , -17.33333333,  -1.66666667])

In [57]: a = np.diag(np.ones(4), -1)     # build the companion matrix

In [58]: a[:, -1] = -coef

Eigenvalues of the companion matrix are roots of the derivative (and vice versa), hence the extrema of the original polynomial:

In [61]: np.linalg.eigvals(a)
Out[61]: array([-3.39056572, -1.54447197,  0.08767236,  2.17555358,  4.33847842])

In [62]: pp = np.poly1d([1, -2, -26, 28, 145, -26, -80])   # sanity check

In [63]: pp(np.linalg.eigvals(a))
Out[63]: 
array([-436.94699498,   86.08835849,  -81.14762431,  264.15457395,
       -794.0522912 ])

An obligatory word of caution: polynomials of large degree are best avoided, because their root-finding is numerically ill-conditioned.
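The same computation is also available through poly1d itself: `p.deriv()` differentiates, and the `roots` attribute finds the critical points (internally this again goes through a companion-matrix eigenvalue problem), so it serves as a compact cross-check of the session above:

```python
import numpy as np

p = np.poly1d([1, -2, -26, 28, 145, -26, -80])

# Critical points of p = real roots of p'; here all five roots are real.
crit = np.sort(p.deriv().roots.real)
print(crit)     # candidate extrema
print(p(crit))  # polynomial values at the extrema
```

Filtering `crit` to the interval (-3.978, 3.068) and taking the argmin/argmax of `p(crit)` then gives the global minimum and maximum on that interval.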

answered 2019-11-25T21:13:53.010

The short answer is that scipy.optimize.minimize_scalar finds a local minimum; the details are in @ev-br's answer. However, you can use a global optimization method to find the global minimum of the function within the given bounds. Even then, be prepared for these methods to fail as well, since they use local minimizers under the hood. In the following example you can see that the shgo optimizer fails to find the global minimum:

import numpy as np
from scipy import optimize

p = np.poly1d([1, -2, -26, 28, 145, -26, -80])

bounds = [(-3.978, 3.068)]

results = {}

results['shgo'] = optimize.shgo(p, bounds)    
results['DA'] = optimize.dual_annealing(p, bounds)    
results['DE'] = optimize.differential_evolution(p, bounds)

print(results['shgo']['x'], results['DA']['x'], results['DE']['x'])
[0.08767235] [-3.39056566] [-3.39056573]
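When the global optimizers disagree like this, a 1-D problem on a closed interval can also be settled by a plain grid scan followed by a local polish, e.g. with scipy.optimize.brute (the grid density Ns=200 is an arbitrary choice; brute passes its parameters as a 1-element array, hence the lambda):

```python
import numpy as np
from scipy import optimize

p = np.poly1d([1, -2, -26, 28, 145, -26, -80])

# Evaluate p on a 200-point grid over the interval, then polish the best
# grid point with the default local minimizer (fmin).
res = optimize.brute(lambda v: p(v[0]), ranges=[(-3.978, 3.068)], Ns=200)
xbest = np.atleast_1d(res)[0]
print(xbest, p(xbest))
```

For a smooth one-dimensional function this is cheap and hard to fool, as long as the grid is fine enough to place a point in the basin of the global minimum.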
answered 2019-11-25T22:46:05.533