Your organization has a Spark application that sometimes consumes a large amount of compute resources. You need to run this on OCI. Which OCI service can be used to meet this requirement?

1 Answer

Final answer:

OCI Data Flow is a service that can be used to run Spark applications requiring significant compute resources, with OCI Compute and OCI Container Engine for Kubernetes (OKE) as other possible solutions.

Step-by-step explanation:

To run a Spark application that consumes a large amount of compute resources on Oracle Cloud Infrastructure (OCI), use the OCI Data Flow service. OCI Data Flow is a fully managed Apache Spark service that lets you run big data applications without setting up or managing any infrastructure, and it scales to handle large datasets and complex analytics. If you need more control over the environment, you can instead provision OCI Compute instances configured with the resources and environment your Spark application requires. For containerized workloads, OCI Container Engine for Kubernetes (OKE) offers a managed Kubernetes service on which Spark jobs can also be deployed. A sketch of starting a Data Flow run programmatically follows.
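To make this concrete, here is a minimal sketch of launching a Data Flow run with the OCI Python SDK. It assumes a Data Flow application has already been created; the OCIDs, shapes, and executor count are placeholder values to illustrate the call, not tested configuration.

import oci

# Load credentials from the default OCI config file (~/.oci/config).
config = oci.config.from_file()
client = oci.data_flow.DataFlowClient(config)

# Describe the run; all OCIDs and sizing values below are placeholders.
run_details = oci.data_flow.models.CreateRunDetails(
    compartment_id="ocid1.compartment.oc1..example",          # placeholder
    application_id="ocid1.dataflowapplication.oc1..example",  # placeholder
    display_name="large-spark-job",
    driver_shape="VM.Standard2.8",    # example shape; size to your workload
    executor_shape="VM.Standard2.8",
    num_executors=16,                 # raise this for compute-heavy jobs
)

# Data Flow provisions the Spark cluster for this run; no servers to manage.
run = client.create_run(run_details).data
print("Started Data Flow run:", run.id)

Because Data Flow allocates the driver and executors per run, compute-heavy jobs can simply request larger shapes or more executors without any standing infrastructure.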

Answered by Haseman