Apache Hadoop is an open-source framework, maintained by the Apache Software Foundation, that is used to efficiently store and process large datasets ranging in size from gigabytes to petabytes. Instead of using one large computer to store and process the data, Hadoop clusters multiple machines together so that massive datasets can be analyzed in parallel, more quickly.
Big Data simply refers to vast volumes of data, both structured and unstructured, whereas Hadoop is an open-source, Java-based framework capable of storing and analyzing those enormous volumes.
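The parallel processing model Hadoop popularized is MapReduce: input is split into chunks, mappers process the chunks independently, and reducers combine the intermediate results. The sketch below is not Hadoop code; it is a single-machine Python simulation of that pattern (a word count, the canonical MapReduce example), using process-level parallelism to stand in for a cluster of nodes. All function names here are illustrative.

```python
from collections import defaultdict
from multiprocessing import Pool

def map_phase(chunk):
    # Map: emit (word, 1) pairs for each word in this chunk of input.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group intermediate pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def word_count(chunks):
    # Run the mappers in parallel (real Hadoop would run them on
    # different cluster nodes), then reduce their combined output.
    with Pool() as pool:
        mapped = pool.map(map_phase, chunks)
    all_pairs = [pair for part in mapped for pair in part]
    return reduce_phase(all_pairs)

if __name__ == "__main__":
    data = ["big data is big", "hadoop processes big data"]
    print(word_count(data))
```

Because each chunk is mapped independently, adding machines (or processes) scales the map phase almost linearly, which is why Hadoop can handle datasets far larger than any single computer's storage or memory.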