Solving a problem with Deep Learning in a particular domain depends on the availability of data. Large datasets are needed to make sure the model delivers the desired results. So, does the big challenge consist of finding a compromise between the data (quantity and quality), overfitting, and the hyperparameters?
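
To make the trade-off I mean concrete, here is a minimal sketch (my own illustration, not taken from any specific paper; it uses scikit-learn's MLPClassifier as a small stand-in for a deep network). With a deliberately small, noisy dataset, the regularization hyperparameter `alpha` largely determines how much the model overfits, measured by the gap between training and validation accuracy:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Deliberately small, noisy dataset to mimic limited data quantity/quality.
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.5,
                                                  random_state=0)

# Sweep one hyperparameter (L2 regularization strength) and watch the
# train/validation gap, i.e. the degree of overfitting.
for alpha in [1e-5, 1e-3, 1e-1, 1.0]:
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), alpha=alpha,
                        max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    gap = clf.score(X_train, y_train) - clf.score(X_val, y_val)
    print(f"alpha={alpha:g}  train-val accuracy gap={gap:.3f}")
```

With more (or cleaner) data the gap shrinks for every setting of `alpha`, which is exactly the compromise between data quantity/quality, overfitting, and hyperparameter choice that I am asking about.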