Final answer:
In a histogram, you commonly compare a value to the low or high endpoint of each of a series of numerical ranges, but not to the midpoint.
Step-by-step explanation:
A histogram is a graphical representation of the distribution of a dataset: the data is divided into a series of numerical intervals (bins), and the frequency of values falling into each interval is shown as a bar. To place a value, you compare it to the low or high endpoint of each interval, not to the midpoint.
For example, in a histogram of test scores, you compare each score to the low and high endpoints of an interval to determine which interval it falls into. Counting how many values land in each interval gives the frequency that each bar of the histogram displays.
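The endpoint comparison described above can be sketched in a few lines of Python. The bin edges and scores here are hypothetical, chosen only to illustrate the idea:

```python
def assign_bin(value, edges):
    """Return the index of the interval [edges[i], edges[i+1]) containing value."""
    for i in range(len(edges) - 1):
        low, high = edges[i], edges[i + 1]
        # Compare the value to the endpoints of each range, not the midpoint.
        if low <= value < high:
            return i
    return None  # value falls outside every interval

edges = [0, 60, 70, 80, 90, 101]   # hypothetical score ranges
scores = [55, 72, 88, 90, 100]
counts = [0] * (len(edges) - 1)
for s in scores:
    b = assign_bin(s, edges)
    if b is not None:
        counts[b] += 1
print(counts)  # frequency of scores in each interval: [1, 0, 1, 1, 2]
```

Note the half-open intervals `[low, high)`: each boundary value belongs to exactly one bin, which is the usual convention so that no score is double-counted.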