Question:
Calculate the mean and standard deviation of a tightly clustered set of 1000 initial conditions as a function of iteration number. The bunch of initial conditions should be Gaussian distributed about x = 0.3 with a standard deviation of 10^-3.
My code:
from numpy import *

def IterateMap(x, r, n):
    # apply the logistic map n times
    for i in xrange(n):
        x = r * x * (1.0 - x)
    return x

output = "data"
nIterations = 1000
r = 4.0
x0 = 0.3
delta = 0.00005

L = []
for i in xrange(nIterations):
    x = x0
    x = IterateMap(x, r, 1)
    L[i] = x          # <- this line raises the IndexError: L is empty
    x0 = x0 + delta

A = array(L)
print 'mean: ', mean(A)
So what my code is supposed to do is take the initial value of x (x0), call the IterateMap function to get the new value of x, put it into the list (L), and then change x0 to a new value; this process repeats 1000 times. I'm getting the error "list assignment index out of range". Also, do you think I'm approaching the problem correctly?
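The IndexError comes from assigning to L[i] on an empty list; L.append(x) would avoid it. Separately, the stated problem asks for 1000 Gaussian-distributed initial conditions (not evenly spaced ones) and for the mean and standard deviation at *each* iteration, not just at the end. Below is a minimal sketch of that interpretation in Python 3 (the original code is Python 2: xrange, print statement); the choice of 50 iterations and the use of numpy's random generator are assumptions for illustration.

```python
import numpy as np

r = 4.0                  # logistic map parameter
n_points = 1000          # size of the ensemble
n_iterations = 50        # assumed; pick as needed

rng = np.random.default_rng(0)
# 1000 initial conditions, Gaussian about x = 0.3 with sd 1e-3
x = rng.normal(0.3, 1e-3, n_points)

means, stds = [], []
for i in range(n_iterations):
    # record ensemble statistics at this iteration
    means.append(x.mean())
    stds.append(x.std())
    # advance the whole ensemble one step (vectorized, no inner loop)
    x = r * x * (1.0 - x)
```

Working on the whole ensemble as one numpy array replaces the per-point loop and the L[i] assignment entirely; means[i] and stds[i] then give the statistics as a function of iteration number. At r = 4 the map is chaotic, so the standard deviation should grow rapidly with i.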