Final answer:
To show that if A and B are finite sets, then |A ∩ B| ≤ |A ∪ B|, we can compare cardinalities. The inequality |A ∩ B| ≤ |A ∪ B| always holds, and it becomes an equality exactly when A = B. (If A and B are disjoint and at least one is nonempty, the inequality is strict, since |A ∩ B| = 0 < |A ∪ B|.)
Step-by-step explanation:
To show that if A and B are finite sets, then |A ∩ B| ≤ |A ∪ B|, we can use the concept of cardinality. The cardinality of a finite set is the number of elements it contains. Let's consider two finite sets, A and B.
- First, we can write the size of the union of A and B as |A ∪ B| = |A| + |B| - |A ∩ B|. This is because the elements that belong to both A and B are counted twice when we add |A| and |B|, so we subtract |A ∩ B| to avoid double counting.
- Next, note that A ∩ B is a subset of A, so |A ∩ B| ≤ |A|. By the same reasoning, |A ∩ B| ≤ |B|.
- Substituting these bounds into the identity |A ∪ B| = |A| + |B| - |A ∩ B| gives |A ∪ B| ≥ |A ∩ B| + |A ∩ B| - |A ∩ B| = |A ∩ B|.
- Therefore |A ∩ B| ≤ |A ∪ B| holds for all finite sets A and B.
- Equality requires |A| = |A ∩ B| and |B| = |A ∩ B|. Since A ∩ B ⊆ A, A ∩ B ⊆ B, and all the sets are finite, this forces A ∩ B = A and A ∩ B = B, which means A = B. So the inequality becomes an equality exactly when A = B; in particular, for disjoint nonempty sets it is strict, since then |A ∩ B| = 0 < |A ∪ B|.
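The steps above can also be checked empirically. The following sketch (not a substitute for the proof) exhaustively verifies the inclusion-exclusion identity, the inequality |A ∩ B| ≤ |A ∪ B|, and the equality condition A = B over every pair of subsets of a small universe:

```python
# Empirical check of the argument above over all subsets of a small universe.
from itertools import chain, combinations

def subsets(universe):
    """Return every subset of `universe` as a frozenset."""
    items = list(universe)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))]

universe = {1, 2, 3, 4}
for A in subsets(universe):
    for B in subsets(universe):
        inter, union = A & B, A | B
        # Inclusion-exclusion: |A ∪ B| = |A| + |B| - |A ∩ B|
        assert len(union) == len(A) + len(B) - len(inter)
        # The inequality being proved: |A ∩ B| ≤ |A ∪ B|
        assert len(inter) <= len(union)
        # Equality holds exactly when A = B
        assert (len(inter) == len(union)) == (A == B)
print("all checks passed")
```

Running the script raises no assertion errors, which is consistent with the equality condition being A = B rather than disjointness.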