Answer:
Hadoop.
Step-by-step explanation:
Apache Hadoop: an open-source framework, maintained by the Apache Software Foundation, created for the distributed storage and processing of very large data sets.
Apache Hadoop consists of several core components:
MapReduce for parallel processing of data (a minimal word-count sketch follows this list).
HDFS (Hadoop Distributed File System) for distributed storage.
YARN for cluster resource management and job scheduling.
Hadoop Common, the shared libraries needed by the other Hadoop subsystems.
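To make the MapReduce component concrete, below is a minimal word-count sketch using the standard Hadoop MapReduce API (org.apache.hadoop.mapreduce). The class names and the command-line input/output paths are illustrative assumptions, not something given in the question.

```java
// Minimal word-count sketch for Hadoop MapReduce.
// Class names and paths are illustrative assumptions.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every word in an input line.
  public static class TokenizerMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer tokens = new StringTokenizer(value.toString());
      while (tokens.hasMoreTokens()) {
        word.set(tokens.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts emitted for each word.
  public static class SumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    // Input and output paths (typically on HDFS) come from the command line.
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a JAR, a job like this would typically be launched with `hadoop jar wordcount.jar WordCount <input path> <output path>`, with YARN scheduling the map and reduce tasks across the cluster and HDFS holding the input and output files, which is how the components listed above work together.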