Final answer:
Automatic clean-ups in Spark can be triggered by configuring a periodic cleanup interval (spark.cleaner.periodicGC.interval) or by restarting the Spark application.
Step-by-step explanation:
One way to trigger automatic clean-ups in Spark and keep accumulated metadata under control is to set a periodic cleanup interval via the spark.cleaner.periodicGC.interval property in Spark's configuration. This property controls how often Spark's context cleaner forces a garbage collection on the driver; when stale RDDs, shuffle outputs, and broadcast variables are garbage collected, the cleaner removes their associated metadata and files. The default interval is 30 minutes, so lowering it makes Spark clean up accumulated metadata more frequently.
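As a minimal sketch of setting this property programmatically (the application name, master URL, and the 10-minute value are illustrative assumptions, not requirements), the interval can be supplied when the session is built:

```scala
import org.apache.spark.sql.SparkSession

object PeriodicCleanupExample {
  def main(args: Array[String]): Unit = {
    // Build a session with a shorter periodic-GC interval so the context
    // cleaner reclaims stale RDD/shuffle/broadcast metadata more often.
    // "10min" is an illustrative value; Spark's default is 30min.
    val spark = SparkSession.builder()
      .appName("periodic-cleanup-example")
      .master("local[*]") // illustrative; point this at your real cluster master
      .config("spark.cleaner.periodicGC.interval", "10min")
      .getOrCreate()

    // ... run jobs as usual; cleanup now runs on the configured interval ...

    spark.stop()
  }
}
```

The same setting can also be passed at launch time with spark-submit's --conf option (for example, --conf spark.cleaner.periodicGC.interval=10min), which avoids hard-coding it in the application.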
Another option is to restart the Spark application. When the application is restarted, the metadata it accumulated on the driver is discarded, and cleanup effectively starts from a clean state.
Manually deleting the metadata files is not recommended, because removing files Spark still references can corrupt application state or cause job failures. Increasing the Spark cluster's memory does not trigger automatic clean-ups at all; it only delays the symptoms caused by accumulating metadata.