Final answer:
A worker node in Spark is responsible for executing tasks within a Spark cluster; managing cluster resources is the job of the cluster manager.
Step-by-step explanation:
A worker node in Spark refers to a node responsible for executing tasks within a Spark cluster. It is one of the main components of the Spark architecture, along with the driver node and the cluster manager.
A worker node performs the actual data processing and computation. It hosts executor processes that receive tasks from the driver, run them on partitions of the data stored in the cluster, and report results back to the driver.
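To make the division of labor concrete, here is a minimal sketch of a Spark application in Scala. The driver program builds the SparkSession and plans the job, while the actual map and reduce tasks run inside executors on the worker nodes. The application name and master URL below are placeholders for illustration, not values from the question.

```scala
import org.apache.spark.sql.SparkSession

object WorkerDemo {
  def main(args: Array[String]): Unit = {
    // The driver program creates the SparkSession and plans the job.
    val spark = SparkSession.builder()
      .appName("WorkerNodeDemo")
      .master("spark://cluster-manager-host:7077") // hypothetical standalone cluster URL
      .getOrCreate()

    // The driver splits this computation into tasks; executors running on
    // worker nodes do the actual map/reduce work on the data partitions.
    val sumOfSquares = spark.sparkContext
      .parallelize(1 to 1000, numSlices = 8) // 8 partitions -> up to 8 parallel tasks
      .map(x => x.toLong * x)
      .reduce(_ + _)

    println(s"Sum of squares: $sumOfSquares")
    spark.stop()
  }
}
```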
Therefore, the correct understanding of a worker node in Spark is 'a node responsible for executing tasks within the cluster', not 'a node responsible for managing cluster resources', since resource management is handled by the cluster manager.