
I am looking for a Keras equivalent of scikit-learn's partial_fit (https://scikit-learn.org/0.15/modules/scaling_strategies.html#incremental-learning) for incremental/online learning.

I finally found the train_on_batch method, but I can't find an example that shows how to properly use it in a for loop for a dataset that looks like this:

x = np.array([[0.5, 0.7, 0.8]])  # input data
y = np.array([[0.4, 0.6, 0.33, 0.77, 0.88, 0.71]])  # output data

Note: this is a multi-output regression.

My code so far:

import keras
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

x = np.array([0.5, 0.7, 0.8])
y = np.array([0.4, 0.6, 0.33, 0.77, 0.88, 0.71])
in_dim = x.shape
out_dim = y.shape

model = Sequential()
model.add(Dense(100, input_shape=(1,3), activation="relu"))
model.add(Dense(32, activation="relu"))
model.add(Dense(6))
model.compile(loss="mse", optimizer="adam")

model.train_on_batch(x,y)

I get this error: ValueError: Input 0 of layer sequential_28 is incompatible with the layer: expected axis -1 of input shape to have value 3 but received input with shape [3, 1]


1 Answer


You should feed your data batch-wise. You are passing a single instance, but the model expects a batch of data, so you need to add a batch dimension to both the input and the target (for example with np.expand_dims).

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

x = np.array([0.5, 0.7, 0.8])
y = np.array([0.4, 0.6, 0.33, 0.77, 0.88, 0.71])
x = np.expand_dims(x, axis=0)  # add a batch dimension: (3,) -> (1, 3)
y = np.expand_dims(y, axis=0)  # add a batch dimension: (6,) -> (1, 6)
in_dim = x.shape   # (1, 3)
out_dim = y.shape  # (1, 6)

model = Sequential()
model.add(Dense(100, input_shape=(3,), activation="relu"))  # 3 features per sample
model.add(Dense(32, activation="relu"))
model.add(Dense(6))  # 6 regression targets
model.compile(loss="mse", optimizer="adam")

model.train_on_batch(x, y)  # single gradient update on this one-sample batch
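
For the incremental/online part of the question, the same train_on_batch call can simply be placed inside a for loop, which plays the role of scikit-learn's partial_fit. Below is a minimal sketch using the model compiled above; the stream generator here is a hypothetical stand-in for wherever your (input, target) pairs actually come from.

# Hypothetical stream of (input, target) pairs arriving one at a time;
# in practice this could be a file, a socket, a sensor, etc.
stream = ((np.random.rand(3), np.random.rand(6)) for _ in range(10))

for x_i, y_i in stream:
    x_batch = np.expand_dims(x_i, axis=0)  # (3,) -> (1, 3)
    y_batch = np.expand_dims(y_i, axis=0)  # (6,) -> (1, 6)
    loss = model.train_on_batch(x_batch, y_batch)  # one gradient update
    print(loss)

Each call to train_on_batch performs a single gradient update on that batch and returns the loss, so the model keeps learning as new samples arrive.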
answered 2020-09-16T16:23:44.500