Apache Hadoop is an open-source batch-processing framework that allows you to write applications that process large data sets by spreading computation over a cluster of nodes. Computations are expressed using the MapReduce programming model.
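To illustrate the MapReduce programming model, the sketch below shows the canonical word-count job written against Hadoop's org.apache.hadoop.mapreduce API. The class name and the input/output directories are illustrative; the paths are assumed to refer to directories on the cluster's distributed file system.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // optional local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

When such a job is submitted, the framework splits the input across the cluster, runs map tasks close to the data, shuffles the intermediate (key, value) pairs by key, and aggregates them in the reduce tasks.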
In his paper, Jimmy Lin called Hadoop the large-scale data analysis "hammer" of choice, since it is a good fit for many classes of algorithms. Hadoop has been widely adopted in industry, largely because it is fault-tolerant and scales horizontally to very large datasets. In addition, the Hadoop library keeps application development straightforward.
A list of institutions that use Hadoop for educational or production purposes can be found at the following link; each institution briefly describes its usage.
In short, Hadoop is a distributed, open-source framework developed by the Apache Software Foundation, used to store and process huge datasets that are beyond the processing capacity of an ordinary standalone system.
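The storage side of this is handled by HDFS, the Hadoop Distributed File System. As a minimal sketch of how an application interacts with it, the snippet below uses the org.apache.hadoop.fs.FileSystem API to copy a local file into HDFS and list the target directory; the file paths and the cluster configuration are assumptions for illustration only.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCopyExample {
  public static void main(String[] args) throws Exception {
    // Picks up fs.defaultFS (e.g. an hdfs://namenode address) from the
    // core-site.xml found on the classpath; the cluster address is assumed.
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);

    // Copy a local file into HDFS; both paths are hypothetical.
    Path local = new Path("/tmp/weblogs.txt");
    Path remote = new Path("/user/hadoop/input/weblogs.txt");
    fs.copyFromLocalFile(local, remote);

    // List the target directory to confirm the upload.
    for (FileStatus status : fs.listStatus(remote.getParent())) {
      System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
    }
    fs.close();
  }
}
```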