Final answer:
When the interval size of a histogram is decreased, the total number of bars increases, and each bar covers a narrower range of values, so it typically represents fewer data items. The heights of the bars do not necessarily decrease, because each height depends on how many data values fall within that smaller interval.
Step-by-step explanation:
When the size of a histogram's intervals is decreased while the scale stays the same, the heights of the bars need not decrease; the change that is certain is an increase in the total number of bars. Because each interval is smaller, more intervals (and therefore more bars) are needed to cover the same range of data. The width of each bar shrinks, since each interval now represents a smaller range of values, and each bar typically represents fewer data items than before, because the same data are spread across more bars, as the sketch below shows.
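A minimal sketch of this effect, using a small made-up dataset and `numpy.histogram` (the values and bin ranges here are purely illustrative, not from the original problem):

```python
import numpy as np

# Made-up data: 20 values spread over the range 0-10.
rng = np.random.default_rng(0)
data = rng.uniform(0, 10, size=20)

# Width-2 intervals: 5 bars cover the range 0-10.
counts_wide, _ = np.histogram(data, bins=np.arange(0, 12, 2))

# Width-1 intervals: 10 bars cover the same range.
counts_narrow, _ = np.histogram(data, bins=np.arange(0, 11, 1))

print(len(counts_wide), counts_wide)      # 5 bars, more items per bar
print(len(counts_narrow), counts_narrow)  # 10 bars, fewer items per bar
```

Halving the interval width doubles the number of bars, and the per-bar counts generally shrink, even though any single narrow bar can still be tall if the data happen to cluster inside it.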
To construct a histogram for a given set of data, you first decide how many bars (or classes) will represent the data. A classic example is the heights of 100 male semiprofessional soccer players: with continuous data such as height, the histogram consists of contiguous boxes, each covering one interval. If the original histogram used intervals of width 1, reducing the interval width to 0.5 would double the number of bars over the same range, assuming there are enough data points to populate them; the sketch below makes this concrete.
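A sketch of that doubling, assuming 100 simulated heights as a stand-in for the soccer-player data (the actual textbook values and the 60-74 inch range are assumptions for illustration):

```python
import numpy as np

# Simulated heights in inches; not the real textbook data.
rng = np.random.default_rng(1)
heights = rng.normal(loc=67, scale=3, size=100)

# Width-1 intervals over 60-74 inches: 14 bars.
bins_one = np.arange(60, 75, 1)
# Width-0.5 intervals over the same range: 28 bars, twice as many.
bins_half = np.arange(60, 74.5, 0.5)

counts_one, _ = np.histogram(heights, bins=bins_one)
counts_half, _ = np.histogram(heights, bins=bins_half)

# Values outside 60-74 are simply not counted by np.histogram.
print(len(counts_one), len(counts_half))  # 14 vs 28 bars
```

The same 100 data points are distributed over twice as many bars, so each bar represents fewer items on average, which is exactly the behavior described above.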