Hadoop is a software library framework that allows for the
distributed processing of large data sets across clusters of computers using
simple programming models. It is designed to scale up from single servers to
thousands of machines. Rather than relying on hardware to deliver
high availability, the library itself is designed to detect and handle
failures at the application layer, so the cluster keeps running even when individual machines fail.
Hadoop provides a framework to process data of any size using a computing cluster built from ordinary commodity hardware. There are two major components to Hadoop:
1. The Hadoop Distributed File System (HDFS), which splits large files into blocks and stores them across multiple machines;
2. The MapReduce framework, an application framework for processing the large data sets stored on the file system (see the word-count sketch after this list).
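To make the division of labor concrete, here is the classic word-count job written against Hadoop's Java MapReduce API, closely following the canonical example from the Hadoop documentation. The mapper emits a (word, 1) pair for each token in its split of the input, and the reducer sums the counts per word; the class and path names here are illustrative.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: for each line of input, emit a (word, 1) pair per token.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reducer: sum the counts collected for each word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // combine locally before the shuffle
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The job would typically be packaged into a jar and launched with something like `hadoop jar wordcount.jar WordCount /user/input /user/output`, where both paths are directories on HDFS (the names are placeholders). The combiner is an optimization: it pre-aggregates counts on each mapper's machine so less data crosses the network during the shuffle.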