Final answer:
Spark Streaming is the component of the Spark unified stack that provides real-time processing of data as it arrives.
Step-by-step explanation:
The component of the Spark unified stack that processes data arriving at the system in real time is Spark Streaming. It is a scalable, fault-tolerant stream-processing system built on top of Spark Core. It achieves high-throughput processing of live data streams by dividing the incoming stream into small micro-batches, each of which is processed by the Spark engine.
With Spark Streaming, real-time data can be processed using the same powerful Spark APIs used for batch jobs, so transformations, aggregations, and even machine learning algorithms can be applied to streaming data.
To use Spark Streaming, you set up a streaming context, define the input source (for example a socket, Kafka, or a file directory), apply transformations and output operations, and then start the streaming context to begin receiving and processing data in real time.
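To make the micro-batch idea concrete without requiring a Spark installation, here is a minimal pure-Python sketch (not the real `pyspark.streaming` API) that mirrors the lifecycle described above: records arrive as a stream, are grouped into small batches, and the same word-count transformation runs on each batch while a running total is maintained.

```python
from collections import Counter

# Toy simulation of the micro-batch model used by Spark Streaming.
# This is illustrative only; real code would use pyspark.streaming.

def micro_batches(records, batch_size):
    """Split a stream of records into fixed-size micro-batches."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

def process_stream(records, batch_size=3):
    """Run a word-count transformation over each micro-batch and
    merge results into a running total, mimicking stateful
    stream processing."""
    totals = Counter()
    for batch in micro_batches(records, batch_size):
        batch_counts = Counter(
            word for line in batch for word in line.split()
        )
        totals.update(batch_counts)  # fold batch result into state
    return dict(totals)

counts = process_stream(
    ["spark streaming", "spark core", "live data", "data streams"]
)
```

In real Spark Streaming, the batching and scheduling are handled by the StreamingContext, and the per-batch transformation would be expressed with DStream operations such as `flatMap`, `map`, and `reduceByKey`.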