I am trying to compute the gradient of a function in Python using tf.GradientTape().
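
For context, the basic pattern I am following looks like this (a minimal sketch for illustration only, not my actual pricing code):

import tensorflow as tf

# Minimal GradientTape example: d(x^2)/dx at x = 3.0 should be 6.0.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x * x
print(tape.gradient(y, x))  # tf.Tensor(6.0, ...)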

This rather slow implementation works as expected:

import numpy as np
import tensorflow as tf

def simulate_BS(it, S):
    # Monte Carlo estimate of the call payoff E[(S_T - 100)^+]
    # under a driftless lognormal (Black-Scholes) model.
    sigma = 0.1
    T = 1
    weeks = 52
    dt = T/weeks
    sum = 0

    for i in range(it):
        Sold = S
        for t in range(weeks):
            Z = np.random.normal()
            Snew = Sold*np.exp((-0.5*sigma**2)*dt+sigma*np.sqrt(dt)*Z)
            Sold = Snew
        if Snew - 100 > 0:
            sum = sum + (Snew - 100)
    
    return sum/float(it)

x = tf.Variable(100.0)

with tf.GradientTape() as g:
    g.watch(x)
    C = simulate_BS(1000,x)
    delta = g.gradient(C,x)

It returns C ≈ 4.0 and delta ≈ 0.5.
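
For reference, these values are close to the closed-form Black-Scholes price and delta for my parameters (S0 = K = 100, sigma = 0.1, T = 1, zero rate). A quick cross-check with scipy.stats.norm (not part of my TensorFlow code):

import numpy as np
from scipy.stats import norm

# Closed-form Black-Scholes call price and delta with zero interest rate.
S0, K, sigma, T = 100.0, 100.0, 0.1, 1.0
d1 = (np.log(S0 / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
d2 = d1 - sigma * np.sqrt(T)
price = S0 * norm.cdf(d1) - K * norm.cdf(d2)  # roughly 3.99
delta = norm.cdf(d1)                          # roughly 0.52
print(price, delta)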

The following alternative implementation computes C equivalently, but for some reason the gradient comes back as None:

def simulate_BS(it, S0):
    sigma = 0.1

    T = 1
    weeks = 52
    dt = T/weeks

    # Broadcast the initial price into an (it, 1) column, then append one new
    # column of simulated prices per week.
    S = tf.reshape(tf.Variable(np.repeat(S0, it), dtype="float32"), shape=(it, 1))

    for i in range(weeks):
        Z = tf.random.normal(shape=(it, 1))
        temp = tf.math.multiply(tf.reshape(S[:, i], shape=(it, 1)), np.exp((-0.5*sigma**2)*dt + sigma*np.sqrt(dt)*Z))
        S = tf.concat([tf.reshape(S, shape=(it, i+1)), tf.reshape(temp, shape=(it, 1))], axis=1)

    # Average payoff over the final column of simulated prices.
    return tf.math.reduce_sum(tf.nn.relu(S[:, i+1] - 100))/it

x = tf.Variable(100.0)

with tf.GradientTape() as g:
    g.watch(x)
    C = simulate_BS(1000,x)
    delta = g.gradient(C,x)

My question is: why does the gradient come back as None, and how can I fix it in the second implementation?
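
One thing that might help narrow this down (a diagnostic sketch only; `probe` is just a throwaway name): rebuild the first step of the second implementation inside a persistent tape and check whether the gradient with respect to x survives that step before the weekly loop even runs.

import numpy as np
import tensorflow as tf

it = 1000
x = tf.Variable(100.0)

with tf.GradientTape(persistent=True) as g:
    g.watch(x)
    # Mirrors the first line of the second simulate_BS.
    S = tf.reshape(tf.Variable(np.repeat(x, it), dtype="float32"), shape=(it, 1))
    probe = tf.reduce_sum(S)

# If this prints None, the tape already lost track of x at this step.
print(g.gradient(probe, x))
del g  # release the persistent tape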
