I have always heard that batch_size should not affect the accuracy or quality of learning.
But in my simple model, a batch_size of 200,000 (on a total dataset of 2.5 million) immediately gives a loss of NaN and an accuracy of 0.0004. Reducing the batch_size to 2,000 gives a loss of 4.10 and an accuracy of 0.07, and the accuracy keeps increasing appropriately with each epoch, as expected.
So: does the fact that my batch_size affects accuracy mean my model is set up incorrectly?
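For intuition on why the same number of epochs can look so different, here is a minimal sketch on synthetic least-squares data (nothing here is the model or data above, it is all assumed): at a fixed epoch count, a larger batch_size means far fewer weight updates, so training can look much worse even though batch size by itself does not limit what can be learned.

```python
import numpy as np

# Toy illustration (assumed synthetic data): ordinary least squares fit with
# mini-batch SGD. With a fixed number of epochs, the number of weight updates
# is (samples / batch_size) * epochs, so a huge batch barely moves the weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
y = X @ rng.normal(size=5) + 0.01 * rng.normal(size=10_000)

def train(batch_size, lr=0.01, epochs=3):
    w = np.zeros(5)
    updates = 0
    for _ in range(epochs):
        for start in range(0, len(X), batch_size):
            xb, yb = X[start:start + batch_size], y[start:start + batch_size]
            grad = 2 * xb.T @ (xb @ w - yb) / len(xb)  # mean-squared-error gradient
            w -= lr * grad
            updates += 1
    return np.mean((X @ w - y) ** 2), updates

small_loss, small_updates = train(batch_size=100)    # 100 updates per epoch
big_loss, big_updates = train(batch_size=10_000)     # 1 update per epoch
print(small_updates, big_updates, small_loss < big_loss)  # prints: 300 3 True
```

So a big batch is not "wrong", but at the same learning rate and epoch count it has had far fewer chances to update the weights.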
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(26, activation='relu', input_dim=X.shape[1]))
model.add(Dropout(0.1))
model.add(Dense(Y.shape[1], activation='softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])

model.fit(X, Y, epochs=15, batch_size=2000)
score = model.evaluate(X, Y, batch_size=2000)
score
Epoch 1/15
44s - loss: 4.7164 - acc: 0.0781
Epoch 2/15
45s - loss: 4.3305 - acc: 0.0977
Epoch 3/15
45s - loss: 4.1886 - acc: 0.1065
Epoch 4/15
45s - loss: 4.1235 - acc: 0.1104
Epoch 5/15
45s - loss: 4.0881 - acc: 0.1122
Epoch 6/15
45s - loss: 4.0657 - acc: 0.1136
Epoch 7/15
45s - loss: 4.0506 - acc: 0.1148
Epoch 8/15
45s - loss: 4.0393 - acc: 0.1154
Epoch 9/15
47s - loss: 4.0305 - acc: 0.1159
Epoch 10/15
model.fit(X, Y, epochs=15, batch_size=200000)
score = model.evaluate(X, Y, batch_size=200000)
score
Epoch 1/15
42s - loss: NaN - acc: 0.0004
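The NaN loss looks like divergence rather than a broken model: a much larger batch gives a much less noisy gradient, and a learning rate (here 0.1 with Nesterov momentum) that was stable for small noisy steps can overshoot and blow up. A minimal sketch of that failure mode, using assumed toy data and plain gradient descent rather than Keras:

```python
import numpy as np

# Sketch of how a loss reaches NaN/inf (assumed toy least-squares data):
# full-batch gradient descent with a step size above the stability limit
# set by the curvature multiplies the error on every update, and the
# residuals overflow within a few hundred steps.
rng = np.random.default_rng(1)
X = rng.normal(size=(1_000, 5))
y = X @ rng.normal(size=5)

w = np.zeros(5)
with np.errstate(over='ignore', invalid='ignore'):  # silence overflow warnings
    for _ in range(500):
        grad = 2 * X.T @ (X @ w - y) / len(X)
        w -= 2.0 * grad  # lr = 2.0 is far beyond the stable limit here
    loss = np.mean((X @ w - y) ** 2)
print(np.isfinite(loss))  # prints: False -- the loss has blown up
```

If that is what is happening here, a smaller learning rate (or a learning rate scaled down along with momentum when the batch size goes up) would be the first thing to try; that is my assumption about the cause, not something the logs above prove.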