Final answer:
HDFS is a distributed storage system designed for handling large files. It is fault-tolerant: each data block is replicated across multiple nodes, so data remains available even if a node fails.
Step-by-step explanation:
The correct answer is B) HDFS is a distributed storage system designed for handling large files. The Hadoop Distributed File System (HDFS) provides high-throughput access to application data. It is designed to store very large datasets reliably across the nodes of a Hadoop cluster, where processing frameworks such as MapReduce can then operate on the data in parallel.
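To make this concrete, here is a minimal sketch of writing a file to HDFS through Hadoop's Java FileSystem API. The NameNode address hdfs://namenode:9000 and the path /user/demo/large.dat are assumptions made up for this example, not values from the question:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        // Point the client at the cluster's NameNode
        // ("hdfs://namenode:9000" is an assumed address for this sketch).
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        try (FileSystem fs = FileSystem.get(conf)) {
            // Write a file; HDFS transparently splits it into blocks
            // and replicates them across DataNodes.
            Path path = new Path("/user/demo/large.dat"); // hypothetical path
            try (FSDataOutputStream out = fs.create(path)) {
                out.writeBytes("example payload\n");
            }
        }
    }
}
```

Note that the client never manages blocks or replicas itself; it simply writes a stream, and HDFS handles the distribution behind the scenes.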
For example, when a large file is stored in HDFS, it is split into fixed-size blocks (128 MB by default in Hadoop 2 and later) that are distributed and replicated (three copies by default) across the cluster's DataNodes, while the NameNode keeps track of where each block lives. This layout allows many nodes to process different blocks of the same file in parallel, enabling efficient storage and retrieval for big data applications.
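As a rough illustration of that block layout, the following sketch asks HDFS where the blocks of a file are stored. It reuses the assumed NameNode address and hypothetical path from the previous example; the API calls (getFileStatus, getFileBlockLocations) are standard Hadoop FileSystem methods:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBlockLocations {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000"); // assumed address

        try (FileSystem fs = FileSystem.get(conf)) {
            Path path = new Path("/user/demo/large.dat"); // hypothetical path
            FileStatus status = fs.getFileStatus(path);

            // Each BlockLocation covers one block of the file and lists
            // the DataNodes that hold a replica of that block.
            BlockLocation[] blocks =
                fs.getFileBlockLocations(status, 0, status.getLen());
            for (BlockLocation block : blocks) {
                System.out.printf("offset=%d length=%d hosts=%s%n",
                    block.getOffset(), block.getLength(),
                    String.join(",", block.getHosts()));
            }
        }
    }
}
```

This per-block host information is exactly what schedulers like MapReduce use to run computation on the nodes that already hold the data, which is where the parallel-processing benefit comes from.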