Mukesh Kumar Yes, that is usually the case, since the model fails to generalize to distributions outside the training set. Still, I think cross-validation (CV) is usually a good way to detect overfitting.
The test or cross-validation error being greater than the training error is the definition of overfitting.
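A quick way to check this in practice is to look at the training score and the cross-validation score side by side. Here is a minimal sketch, assuming scikit-learn; the synthetic dataset and the decision tree are just placeholders for illustration:

```python
# Minimal sketch (illustrative only): compare training accuracy to
# cross-validation accuracy; a large gap suggests overfitting.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = DecisionTreeClassifier(random_state=0)  # an unpruned tree overfits easily

scores = cross_validate(model, X, y, cv=5, return_train_score=True)
print("train accuracy:", scores["train_score"].mean())
print("CV accuracy:   ", scores["test_score"].mean())
# Training accuracy near 1.0 with a clearly lower CV accuracy indicates overfitting.
```

If the training accuracy is close to 1.0 while the CV accuracy is clearly lower, the model is memorizing the training set rather than generalizing.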
If you want to reduce overfitting, you can:
1. Add more training samples
2. Simplify your model (not recommended in the case of deep learning techniques) by reducing the number of trainable parameters
3. Preprocess your training data to simplify it, using either feature extraction or feature selection techniques
4. Use ensemble learning techniques, where you train several models on bagged (bootstrapped) subsets of your training set and then have them vote on the final output (see the sketch after this list)
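For point 4, here is a minimal sketch of bagging with scikit-learn; the dataset and hyperparameters are illustrative assumptions, not a recipe. By default, BaggingClassifier trains decision trees on bootstrapped samples and combines their predictions by majority vote:

```python
# Minimal sketch (illustrative only): compare a single decision tree to a
# bagged ensemble of trees trained on bootstrapped subsets of the data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

single_tree = DecisionTreeClassifier(random_state=0)
bagged_trees = BaggingClassifier(      # base estimator defaults to a decision tree
    n_estimators=50,                   # number of models in the ensemble
    max_samples=0.8,                   # each model sees a bootstrapped 80% sample
    random_state=0,
)

print("single tree CV accuracy: ", cross_val_score(single_tree, X, y, cv=5).mean())
print("bagged trees CV accuracy:", cross_val_score(bagged_trees, X, y, cv=5).mean())
```

Averaging the votes of many models trained on different bootstrapped samples reduces variance, which is usually what drives the gap between training and cross-validation error.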