I'm trying to create a neural network for my university project and have run into a problem. The network will be used for image recognition on 320 x 200 pixel images, which means the number of input neurons in my network is 64,000 (320 * 200).
The problem is that when calculating the weighted sum in each neuron of the hidden layer, I get very large numbers, with results like 16,000. This is my code for calculating the weighted sum; I hope it clarifies what I do:
for (int i = 0; i < sizes[layer - 1]; i++) {
    double sum = 0;
    // Weighted sum of all activations feeding into neuron i
    for (size_t j = 0; j < a.size(); j++) {
        sum += a[j] * weights[layer - 2][i][j];
    }
    // The bias is added once per neuron, not once per input
    sum += biases[layer - 2][i];
    out[i] = Sigmoid(sum);
}
I won't go into details about the code, but the idea is to multiply each input value by its corresponding weight. Naturally, when the weighted sums come out at, for example, 16,000 or -16,000 (with 64,000 inputs, each term only needs to average around 0.25 for the sum to reach that magnitude), running the Sigmoid function will always return 0 or 1.
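To make the saturation concrete, here is a minimal standalone check (assuming Sigmoid is the standard logistic function 1 / (1 + e^-x)):

#include <cmath>
#include <cstdio>

// Assumed definition: the standard logistic sigmoid
double Sigmoid(double x) {
    return 1.0 / (1.0 + std::exp(-x));
}

int main() {
    // For sums of this magnitude the output is exactly 1 or 0 in double precision
    std::printf("Sigmoid(16000)  = %g\n", Sigmoid(16000.0));   // 1
    std::printf("Sigmoid(-16000) = %g\n", Sigmoid(-16000.0));  // 0
    std::printf("Sigmoid(2)      = %g\n", Sigmoid(2.0));       // ~0.880797
    return 0;
}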
My question is: is there some way to get around this, to "normalize" the weighted sum so that the Sigmoid function returns something other than 0 or 1, or is it simply a matter of having a very large number of neurons in the hidden layer, or of scaling down the image?
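To clarify what I mean by "normalizing", these are the two ideas I've been considering, written as a rough, untested sketch (NormalizePixels and InitWeights are just placeholder names, and the pixels are assumed to be 8-bit greyscale values):

#include <cmath>
#include <cstdlib>
#include <vector>

// 1) Scale raw 8-bit pixel values (0-255) into [0, 1] before feeding them in.
std::vector<double> NormalizePixels(const std::vector<unsigned char>& pixels) {
    std::vector<double> a(pixels.size());
    for (size_t i = 0; i < pixels.size(); i++)
        a[i] = pixels[i] / 255.0;
    return a;
}

// 2) Initialize weights on the order of 1/sqrt(fanIn), so the weighted sum over
//    fanIn inputs stays small instead of reaching the thousands.
std::vector<double> InitWeights(size_t fanIn) {
    std::vector<double> w(fanIn);
    double scale = 1.0 / std::sqrt(static_cast<double>(fanIn));  // ~0.004 for 64,000 inputs
    for (size_t i = 0; i < fanIn; i++) {
        double r = 2.0 * std::rand() / RAND_MAX - 1.0;  // uniform in [-1, 1]
        w[i] = r * scale;
    }
    return w;
}

With inputs in [0, 1] and weights of roughly ±0.004, the weighted sum over 64,000 terms should stay on the order of 1 rather than in the thousands, so Sigmoid would not be completely saturated. Is something like this the right direction, or is there a more standard approach?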