Final answer:
Option (C). In the field of computer science, "Big Data" refers to data sets that are so voluminous and complex that traditional data processing software is inadequate to deal with them. The Sloan Survey data, which amounted to 15 terabytes, is an example of Big Data, requiring sophisticated software and algorithms for processing. Big Data is characterized by its large volume, the variety of its data types, and the speed at which it is generated.
Step-by-step explanation:
When a computer scientist refers to "Big Data," they typically mean a volume of data that goes beyond the capacity of traditional data processing applications. Option (C) captures the essence of the term: the data sets are of such size and complexity that conventional data management tools cannot capture, store, manage, or analyze them within a reasonable timeframe.
The examples from the Sloan Survey illustrate what could be considered Big Data in practical terms. The survey generated 15 terabytes of data, which was managed through advanced computer algorithms and sometimes with the help of crowd-sourced human intelligence in projects such as the "Galaxy Zoo." As our ability to collect data has grown, so has the need to develop new methodologies for dealing with these vast datasets.
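To make the contrast with "traditional data processing software" concrete, here is a minimal sketch in Python (using pandas, with a hypothetical file name and column that are not actual Sloan Survey identifiers) of one basic Big Data technique: streaming a file that is far too large to fit in memory and processing it chunk by chunk instead of loading it all at once.

```python
import pandas as pd

CHUNK_ROWS = 1_000_000   # rows read per pass; keeps memory use bounded
brightest = []           # running list of candidate objects from each chunk

# "sky_catalog.csv" and the "magnitude" column are placeholders for illustration only.
for chunk in pd.read_csv("sky_catalog.csv", chunksize=CHUNK_ROWS):
    # Keep only the brightest objects in this chunk (lower magnitude = brighter).
    brightest.append(chunk.nsmallest(100, "magnitude"))

# Combine the per-chunk candidates and take the overall top 100.
top_100 = pd.concat(brightest).nsmallest(100, "magnitude")
print(top_100.head())
```

The same idea, splitting the work into pieces that each fit in memory, is what distributed Big Data frameworks do at larger scale, spreading the pieces across many machines rather than across many passes on one machine.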
It is important to note that Big Data is characterized not only by large volume but also by the variety of data types and the velocity at which new data is generated, often summarized as the three V's: volume, variety, and velocity. The ability to process and understand such data sets matters across fields, shaping decision-making in agriculture, business, science, and countless other sectors of the economy and society.