Final answer:
Lazy evaluation in Apache Spark means that computations are delayed until an action is called, optimizing data processing.
Step-by-step explanation:
In Apache Spark, lazy evaluation means that transformations on RDDs (Resilient Distributed Datasets) or other data structures, such as DataFrames, are not executed immediately; the computation is deferred until an action is called. So the correct answer is A) Computations are delayed until an action is called. Deferring work lets Spark see the whole pipeline before running it, so it can group operations together and minimize data shuffling across the cluster. This is a key feature of Spark and a major contributor to its efficiency in big data processing.
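The transformation/action split can be illustrated with a minimal pure-Python sketch. Note this is an illustration of the concept only, not the real Spark API: the `LazyDataset` class and its methods are hypothetical stand-ins for Spark's RDD transformations (`map`, `filter`) and actions (`collect`).

```python
# Illustrative sketch of lazy evaluation (NOT the real Spark API):
# transformations are only recorded; nothing runs until an action
# such as collect() forces execution of the whole pipeline.

class LazyDataset:
    def __init__(self, data, ops=None):
        self.data = data
        self.ops = ops or []  # recorded, not-yet-executed transformations

    def map(self, fn):
        # Transformation: returns a new dataset, computes nothing yet.
        return LazyDataset(self.data, self.ops + [("map", fn)])

    def filter(self, pred):
        # Transformation: also deferred.
        return LazyDataset(self.data, self.ops + [("filter", pred)])

    def collect(self):
        # Action: only now is the recorded pipeline actually executed.
        result = self.data
        for kind, fn in self.ops:
            if kind == "map":
                result = [fn(x) for x in result]
            else:
                result = [x for x in result if fn(x)]
        return result

# Building the pipeline runs no computation at all...
ds = LazyDataset([1, 2, 3, 4]).map(lambda x: x * 10).filter(lambda x: x > 15)
# ...until the action is invoked:
print(ds.collect())  # [20, 30, 40]
```

Because the full chain of operations is known before anything runs, a real engine like Spark can optimize it as a whole, which is exactly what the eager, step-by-step alternative cannot do.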