
I am currently working on finding polynomial equations to describe data sets (simple tables with one x and one y value) in Python. I need a polynomial equation to describe the data set so that, given a number x, I can derive its y value.

If I enter these data sets into Apple's Numbers app and plot them on a graph, I get a very helpful polynomial equation that predicts my y values very accurately for my purposes. However, when I use Numpy to derive a polynomial equation from the same data set, I get a very different polynomial that makes incredibly inaccurate predictions. I want my Python program to create polynomial equations that are much closer to the ones produced by the Numbers app.

Some specifics:

My x values (x_list): 3.75652173913043, 3.79130434782609, 3.82608695652174

My y values (y_list): 0.0872881944444445, 0.0872522935779816, 0.0858840909090909

My polynomial from Numbers: y = -0.5506x^2 + 4.1549x - 7.7508

My polynomial from Numpy: y = -7.586x^2 + 57.53x - 108.7

How I'm using Numpy: polynomial = numpy.poly1d(numpy.polyfit(x_list, y_list, deg=2))

My numbers in Python are rounded to 8 decimal places, but I think the discrepancy is far too large to be a rounding issue.
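(A quick way to sanity-check that, using the lists above, is to refit with everything rounded to 8 decimal places and compare the coefficients. This is just a diagnostic sketch, not part of my program:)

    import numpy

    x_list = [3.75652173913043, 3.79130434782609, 3.82608695652174]
    y_list = [0.0872881944444445, 0.0872522935779816, 0.0858840909090909]

    # Fit on the full-precision inputs, then on inputs rounded to 8 decimal places
    full = numpy.polyfit(x_list, y_list, deg=2)
    rounded = numpy.polyfit(numpy.round(x_list, 8), numpy.round(y_list, 8), deg=2)
    print(full)
    print(rounded)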

In short, I am wondering how Numbers would have derived this polynomial vs. how Numpy would have, and how I can replicate the Numbers method, ideally without using Numpy. (I am going to have to translate my program from Python to Swift eventually.)
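To make the goal concrete, a dependency-free degree-2 least-squares fit (the kind of thing I could later port to Swift) might look something like the sketch below. The quadratic_fit helper is purely illustrative, not code I have working: it solves the 3x3 normal equations with Cramer's rule after centering x on its mean (which keeps the arithmetic well behaved when the x values are close together), then expands back to coefficients of y = c2*x^2 + c1*x + c0. It is not necessarily the algorithm Numbers or Numpy actually uses.

    def quadratic_fit(xs, ys):
        # Center x on its mean to improve conditioning
        m = sum(xs) / len(xs)
        ts = [x - m for x in xs]

        # Power sums S[k] = sum(t^k) and moment sums T[k] = sum(y * t^k)
        S = [sum(t ** k for t in ts) for k in range(5)]
        T = [sum(y * t ** k for t, y in zip(ts, ys)) for k in range(3)]

        # Normal equations A @ [a0, a1, a2] = T for y = a0 + a1*t + a2*t^2
        A = [[S[0], S[1], S[2]],
             [S[1], S[2], S[3]],
             [S[2], S[3], S[4]]]

        def det3(M):
            return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                  - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                  + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

        # Cramer's rule: replace each column of A with T in turn
        d = det3(A)
        a = []
        for col in range(3):
            Ac = [row[:] for row in A]
            for r in range(3):
                Ac[r][col] = T[r]
            a.append(det3(Ac) / d)
        a0, a1, a2 = a

        # Expand a0 + a1*(x - m) + a2*(x - m)^2 back into powers of x
        c2 = a2
        c1 = a1 - 2 * a2 * m
        c0 = a0 - a1 * m + a2 * m * m
        return c2, c1, c0


    x_list = [3.75652173913043, 3.79130434782609, 3.82608695652174]
    y_list = [0.0872881944444445, 0.0872522935779816, 0.0858840909090909]
    print(quadratic_fit(x_list, y_list))   # roughly (-0.5506, 4.1549, -7.7508)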


1 Answer


I get accurate results with this code. Maybe if you can post your code, we can help more.

    import numpy

    # same data as in the question
    x_list = [3.75652173913043, 3.79130434782609, 3.82608695652174]
    y_list = [0.0872881944444445, 0.0872522935779816, 0.0858840909090909]
    fit = numpy.polyfit(x_list, y_list, deg=2)
    y = numpy.poly1d(fit)
    print(y)

which prints

             2
    -0.5506 x + 4.155 x - 7.751
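As a side note, the poly1d object is callable, so once the coefficients look right you can predict a y value directly from an x (using the y from the snippet above):

    print(y(3.79130434782609))   # evaluates the fitted polynomial at that x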