Apache Spark can run on which two of the following cluster managers?

A. Apache Mesos
B. oneSIS
C. Hadoop YARN
D. Linux Cluster Manager
E. Nomad

asked by Navneeth (7.2k points)

1 Answer


Final answer:

Apache Spark can run on A. Apache Mesos and C. Hadoop YARN, two cluster managers that provide resource allocation and scheduling for distributed data processing.

Explanation:

Apache Spark can run on several cluster managers, which gives it flexibility and scalability for distributed data processing. Among the given options, the two it supports are A. Apache Mesos and C. Hadoop YARN. Apache Mesos is an open-source cluster manager that provides efficient resource isolation and sharing across distributed applications or frameworks. Hadoop YARN (Yet Another Resource Negotiator) is the resource-management layer of the Hadoop ecosystem; it allows multiple data processing engines to run against data stored on a single platform.
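To make this concrete, here is a minimal PySpark sketch showing how the master URL is what selects the cluster manager. The app name and the mesos-master host/port are placeholders; running against YARN also assumes HADOOP_CONF_DIR or YARN_CONF_DIR points at your cluster's configuration.

```python
from pyspark.sql import SparkSession

# The master URL chooses the cluster manager:
#   "yarn"                      -> Hadoop YARN (reads cluster config from
#                                  HADOOP_CONF_DIR / YARN_CONF_DIR)
#   "mesos://<host>:<port>"     -> Apache Mesos (host/port are placeholders)
spark = (
    SparkSession.builder
    .appName("cluster-manager-demo")
    .master("yarn")                        # or .master("mesos://mesos-master:5050")
    .getOrCreate()
)

# Trivial job to confirm the session is working on the cluster.
df = spark.range(100)
print(df.count())  # prints 100

spark.stop()
```

The same choice can be made without touching application code by passing --master yarn or --master mesos://host:port to spark-submit, which is the more common pattern in production.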

answered by Bhupendra Patel (8.3k points)