Final answer:
The question concerns how firms measure costs, focusing on two per-unit measures of output: average cost and marginal cost. Average cost is total cost divided by the quantity produced, while marginal cost is the additional cost of producing one more unit.
Step-by-step explanation:
Firms often analyze costs by looking at the cost per unit of output in addition to their total costs. There are two primary ways to measure per-unit costs: average cost and marginal cost. Average cost is the total cost divided by the quantity of output produced (AC = TC/Q). For instance, if producing two widgets costs $44 in total, the average cost is $22 per widget, as worked out below.
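Written out with the figures from the example:
AC = TC/Q = $44 / 2 = $22 per widget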
Marginal cost, by contrast, measures the cost of producing one additional unit of output and is calculated as the change in total cost divided by the change in output (MC = ΔTC/ΔQ).
Using the same example, if producing the first widget costs $32.50 and producing two widgets costs $44 in total, the marginal cost of the second widget is $11.50.
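Written out, the calculation takes the change in total cost as output rises from one widget to two:
MC = ΔTC/ΔQ = ($44.00 − $32.50) / (2 − 1) = $11.50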
The reason total cost and average cost are not depicted on the same graph is that they are measured in different units. Total cost, fixed cost, and variable cost are measured in absolute dollars for the entire quantity of output, while marginal cost, average cost, and average variable cost are expressed as costs per unit. Plotting them on the same graph would mix different units of measurement ($ versus $ per unit of output) and would therefore be confusing.