I am trying to build a prediction network. I have normalized my inputs and outputs to the range -1 to 1.
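For reference, this is roughly how I scale the data (a minimal sketch assuming simple per-column min-max scaling; the function name and the sample values are just illustrative):

import numpy as np

# Scale each column of X (one row per example) linearly into [-1, 1].
def scale_to_minus1_1(X):
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    return 2.0 * (X - x_min) / (x_max - x_min) - 1.0

X_raw = np.array([[10.0, 200.0],
                  [15.0, 350.0],
                  [20.0, 500.0]])
print(scale_to_minus1_1(X_raw))   # every value now lies in [-1, 1]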

Currently I am using the sigmoid activation function at both the hidden layer and the output layer, but the sigmoid's output lies between 0 and 1. Should I normalize my outputs to the range 0 to 1 instead, or is there another activation function whose output lies between -1 and 1? A small sketch of the mismatch I mean follows below.
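The sketch (assumed names, just to illustrate the range problem): the sigmoid output never leaves (0, 1), so it can never match targets near -1.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(sigmoid(z))   # about [0.0067, 0.2689, 0.5, 0.7311, 0.9933] -- always in (0, 1)
# My targets go down to -1, which the sigmoid can never produce.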

I also want to know about the error function, as I came to know that it is a special function closely related to the sigmoid.

Which is better as an activation function: the error function or the sigmoid?
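To make this part of my question concrete, here is a small sketch comparing the two functions (I am assuming "error function" means the Gauss error function erf; both are S-shaped, but erf outputs values in (-1, 1) while the logistic sigmoid outputs values in (0, 1)):

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

for x in [-2.0, -1.0, 0.0, 1.0, 2.0]:
    # math.erf is the Gauss error function; its range is (-1, 1),
    # while the logistic sigmoid's range is (0, 1).
    print(x, round(math.erf(x), 4), round(sigmoid(x), 4))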
