
I'm teaching myself ML, and I'm getting an error when I try to write logistic regression in Python. The exercise is from the Stanford online course. I've tried many things, including changing grad to grad.ravel() / grad.flatten(), but nothing worked.

Input:

import numpy as np

data=np.loadtxt(r'E:\ML\machine-learning-ex2\ex2\ex2data1.txt',delimiter=',')

X=data[:,:2]
y=data[:,2].reshape(-1,1)

def sigmoid(z):
    return 1/(np.exp(-1*z)+1)

def costFunction(theta,X,y):
    m=len(y)
    h=sigmoid(np.dot(X,theta))
    J=-1/m*np.sum((np.dot(y.T,np.log(h))+np.dot((1-y).T,np.log(1-h))))
    grad=1/m*np.dot(X.T,(h-y))
    return J,grad

m,n=np.shape(X)
X=np.hstack((np.ones([m,1]),X))
initial_theta=np.zeros([n+1,1])

import scipy.optimize as opt
result = opt.fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))

Output:

    ValueError:
    ---> 25 result = opt.fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))

    ValueError: tnc: invalid gradient vector from minimized function.

2 Answers


Well, I had forgotten to copy these lines: m,n=np.shape(X) and initial_theta=np.zeros(n+1). With them I got the answer. The x0 argument needs to be a 1-D array, but I was giving it a 2-D one. So just make initial_theta 1-D and reshape it back to 2-D inside costFunction.
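A minimal sketch of that fix, reusing the question's sigmoid and the X, y loaded at the top (before the intercept column is added); the grad.ravel() on the return is an extra precaution so the gradient handed back to TNC is definitely 1-D:

import numpy as np
import scipy.optimize as opt

def costFunction(theta, X, y):
    m = len(y)
    theta = theta.reshape(-1, 1)   # back to a column vector for the matrix algebra
    h = sigmoid(np.dot(X, theta))
    J = -1 / m * np.sum(np.dot(y.T, np.log(h)) + np.dot((1 - y).T, np.log(1 - h)))
    grad = 1 / m * np.dot(X.T, (h - y))
    return J, grad.ravel()         # flat gradient, same length as theta

m, n = np.shape(X)
X = np.hstack((np.ones([m, 1]), X))
initial_theta = np.zeros(n + 1)    # 1-D, as fmin_tnc's x0 expects

result = opt.fmin_tnc(func=costFunction, x0=initial_theta, args=(X, y))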

Answered 2019-08-12T04:35:44.813

I got the same error. I solved it by adding the parameter approx_grad=True to the fmin_tnc call (reference: https://towardsdatascience.com/building-a-logistic-regression-in-python-301d27367c24).

import numpy as np
from scipy.optimize import fmin_tnc

def sigmoid(z):
    z = z.astype(float)
    return 1 / (1 + np.exp(-z))

def net_input(theta, x):
    return np.dot(x, theta)

def probability(theta, x):
    return sigmoid(net_input(theta, x))

def cost_function(theta, x, y):
    m = x.shape[0]
    total_cost = -(1 / m) * np.sum(y * np.log(probability(theta, x)) + (1 - y) * np.log(1 - probability(theta, x)))
    return total_cost

def gradient(theta, x, y):
    m = x.shape[0]
    return (1 / m) * np.dot(x.T, sigmoid(net_input(theta, x)) - y)

def fit(x, y, theta):
    # with approx_grad=True, TNC approximates the gradient numerically
    opt_weights = fmin_tnc(func=cost_function, x0=theta, fprime=gradient,
                           approx_grad=True, args=(x, y.flatten()))
    return opt_weights[0]

X = np.c_[np.ones((X.shape[0], 1)), X]    # add the intercept column (only once)
y = loan_data_new.iloc[:, -1].to_numpy()
y = y[:, np.newaxis]
m, n = np.shape(X)
theta = np.zeros(n)                       # 1-D initial guess, one weight per column of X
parameters = fit(X, y, theta)
Answered 2022-02-03T21:08:02.580