Final answer:
To prove the equation |A^C ∩ B^C| = |U| − |A| − |B| + |A ∩ B| for subsets A and B of the universal set U, we use De Morgan's laws and the principle of inclusion-exclusion.
Step-by-step explanation:
Let A and B be subsets of a finite universal set U. To prove that |A^C ∩ B^C| = |U| − |A| − |B| + |A ∩ B|, we must show that the number of elements lying outside both A and B equals |U| minus |A| and |B|, with |A ∩ B| added back to compensate for subtracting the overlap twice.
To begin, apply De Morgan's law: the intersection of the complements of two sets is the complement of their union:
A^C ∩ B^C = (A ∪ B)^C
Since U is finite, the cardinality of a complement satisfies |S^C| = |U| − |S| for any subset S of U. Applying this with S = A ∪ B gives:
|A^C ∩ B^C| = |(A ∪ B)^C| = |U| − |A ∪ B|
By the principle of inclusion-exclusion, the size of a union is the sum of the sizes minus the size of the overlap, which would otherwise be counted twice:
|A ∪ B| = |A| + |B| − |A ∩ B|
Substituting this into the previous equation:
|A^C ∩ B^C| = |U| − (|A| + |B| − |A ∩ B|)
= |U| − |A| − |B| + |A ∩ B|
Hence, the equation |A^C ∩ B^C| = |U| − |A| − |B| + |A ∩ B| is proved.
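As a quick sanity check, here is a minimal Python sketch that evaluates both sides of the identity on one small concrete universe (the sets U, A, and B below are illustrative choices, not part of the original problem):

    # Sanity check of |A^C ∩ B^C| = |U| − |A| − |B| + |A ∩ B|
    # on one small example; the specific sets are arbitrary.
    U = {1, 2, 3, 4, 5, 6}
    A = {1, 2, 3}
    B = {3, 4}

    lhs = len((U - A) & (U - B))                 # complements taken relative to U
    rhs = len(U) - len(A) - len(B) + len(A & B)  # right-hand side of the identity
    print(lhs, rhs)                              # prints: 2 2

Here U − A = {4, 5, 6} and U − B = {1, 2, 5, 6}, so their intersection is {5, 6}, which matches |U| − |A| − |B| + |A ∩ B| = 6 − 3 − 2 + 1 = 2.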