I have attached my graph. Please explain how I can best evaluate my network. My understanding is to use the test set and the validation set, and that each should give a correlation coefficient near one if my network is good.
Well, the selection of the best-performing NN is crucial. There are a number of reasons why your test and validation data are not as well correlated as your training data. Logically, the test points are a subset of the same pool of data from which you also drew the training set, so the partitioning is in your hands: try different sampling techniques (e.g. random or stratified) to split the data, and each will give you somewhat different results. Better still, I would suggest 10-fold cross-validation, through which you can find the best model and bring the test correlation as close as possible to that of the trained model. Critically, the test/validation correlation should be close to that of the learned model; in your case the test data are widely scattered and poorly correlated with the fitted curve, which is why the plot shows exactly what you don't want to see.
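To make the 10-fold idea concrete, here is a minimal sketch of the mechanics in Python with NumPy. A linear least-squares fit stands in for the neural network (the function name `kfold_correlations` and the synthetic data are my own illustration, not from the toolbox): each fold is held out once, the model is refit on the rest, and the held-out correlation coefficient is averaged.

```python
import numpy as np

def kfold_correlations(X, y, k=10, seed=0):
    """Split (X, y) into k folds; for each fold, fit on the remaining
    folds and record the correlation between predictions and targets
    on the held-out fold. Returns the mean held-out correlation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    corrs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        # Linear least-squares fit: a stand-in for retraining the NN.
        A = np.c_[X[train], np.ones(len(train))]
        w, *_ = np.linalg.lstsq(A, y[train], rcond=None)
        pred = np.c_[X[test], np.ones(len(test))] @ w
        corrs.append(np.corrcoef(pred, y[test])[0, 1])
    return float(np.mean(corrs))

# Noisy linear data: the mean held-out correlation should stay high,
# close to the training correlation, if the model generalizes.
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = 3 * X.ravel() + 0.1 * np.random.default_rng(1).normal(size=200)
print(round(kfold_correlations(X, y, k=10), 3))
```

If the averaged held-out correlation is far below the training-set correlation, the model is overfitting, exactly the symptom in the attached graph.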
Your regression curves for the test and validation sets are not acceptable. You are able to fit the training set perfectly, but that is not the only requirement. You did not attach the performance plot of your experiment, which would say a lot. These curves are plotted with the MATLAB toolbox, so to get better performance you can retrain the network with different parameters: the number of hidden layers, the number of neurons per hidden layer, re-initialized weight matrices, different training and transfer functions, etc.
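The retraining loop described above, varying the hidden-layer size and re-initializing the weights, then keeping the configuration with the lowest validation error, can be sketched outside MATLAB as well. Below is a hedged Python/NumPy illustration with a tiny one-hidden-layer tanh network trained by plain gradient descent; `train_mlp`, the seeds, and the hidden sizes are all my own stand-ins, not toolbox settings.

```python
import numpy as np

def train_mlp(x, y, hidden, seed, steps=2000, lr=0.05):
    """Tiny 1-hidden-layer tanh network fit by gradient descent on MSE.
    Different seeds re-initialize the weight matrices, as suggested."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1)); b2 = 0.0
    X, Y = x.reshape(-1, 1), y.reshape(-1, 1)
    for _ in range(steps):
        H = np.tanh(X @ W1 + b1)            # hidden activations
        P = H @ W2 + b2                     # network output
        G = 2 * (P - Y) / len(Y)            # dMSE/dP
        gW2, gb2 = H.T @ G, G.sum()
        gH = G @ W2.T * (1 - H ** 2)        # backprop through tanh
        gW1, gb1 = X.T @ gH, gH.sum(axis=0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    return lambda t: (np.tanh(t.reshape(-1, 1) @ W1 + b1) @ W2 + b2).ravel()

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 60)
y = np.sin(np.pi * x) + 0.1 * rng.normal(size=x.size)
xv = np.linspace(-0.95, 0.95, 30)   # held-out validation points
yv = np.sin(np.pi * xv)

# Retrain with different hidden sizes and weight initializations,
# and keep the configuration with the lowest validation MSE.
results = {}
for hidden in (2, 8):
    for seed in (1, 2, 3):
        f = train_mlp(x, y, hidden, seed)
        results[(hidden, seed)] = float(np.mean((f(xv) - yv) ** 2))
best = min(results, key=results.get)
print("best (hidden, seed):", best)
```

The same loop structure applies in the MATLAB toolbox: rebuild the network with new layer sizes, retrain, and compare on data the training never saw.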
I have made a network with the Neural Network Toolbox in MATLAB, but the correlation coefficient is 0.9999 for the training set only; it is not for the validation set and test set. What is the problem? Can you suggest something?
Your figure shows a case of overfitting. You must try to overcome this by accepting a lower R value on the training set in exchange for higher values on the other two subsets. To achieve this, do not leave everything to MATLAB. By default the NN toolbox divides your data into three parts; prevent this default division by entering: net.divideFcn = '';
So, first draw your training subset from the original pool of data by some technique (random sampling or a statistical method). Then use only the training data to develop models with different MSE values, and simulate each model on your validation subset. The best model can be chosen as the one that gives almost the same accuracy for both the training and validation data.
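That selection rule, preferring the model whose validation accuracy stays close to its training accuracy, can be sketched as follows. This is a hedged Python/NumPy illustration only: polynomial degree stands in for network complexity, and the function `mses` and the data are my own invention.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(-1, 1, 150)
y = np.sin(np.pi * x) + 0.15 * rng.normal(size=x.size)

# Partition once, before any fitting: 100 training, 50 validation.
idx = rng.permutation(x.size)
tr, va = idx[:100], idx[100:]

def mses(degree):
    """Training and validation MSE for one candidate model
    (polynomial degree stands in for network complexity)."""
    c = np.polyfit(x[tr], y[tr], degree)
    err = lambda s: float(np.mean((np.polyval(c, x[s]) - y[s]) ** 2))
    return err(tr), err(va)

# Prefer the model with low validation error AND a small gap
# between training and validation error, i.e. similar accuracy
# on both subsets.
for d in (1, 3, 9):
    t, v = mses(d)
    print(f"degree {d}: train {t:.3f}  val {v:.3f}  gap {v - t:+.3f}")
```

A model that drives training error to near zero while the validation error stays large (a wide gap) is the overfitted case shown in the question's figure.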