Let m and n be nonzero integers. Prove that gcd(m / gcd(m, n), n / gcd(m, n)) = 1.

asked by User Vikaspraj (7.4k points)

1 Answer


Final answer:

The question asks us to prove that dividing m and n by their greatest common divisor leaves two coprime integers: gcd(m / gcd(m, n), n / gcd(m, n)) = 1. The key idea is that any common divisor of the two quotients, multiplied by gcd(m, n), would be a common divisor of m and n larger than gcd(m, n) itself, which is impossible.

Step-by-step explanation:

The question concerns properties of the greatest common divisor (gcd). The expression in the question has lost its fraction bars; the intended statement is gcd(m / gcd(m, n), n / gcd(m, n)) = 1. In words: dividing m and n by their gcd produces two coprime integers. The proof rests on the defining property of the gcd.

To prove gcd(m / g, n / g) = 1, where g = gcd(m, n), write m = g·a and n = g·b for integers a and b; this is possible because g divides both m and n. Now suppose c is a positive common divisor of a and b. Then c·g divides both m = g·a and n = g·b, so c·g is a common divisor of m and n. Since g is the greatest common divisor, c·g ≤ g, which forces c = 1. For example, with m = 12 and n = 18 we have g = 6, and the quotients 12/6 = 2 and 18/6 = 3 are indeed coprime.

To state it more formally: if d is a common divisor of m/g and n/g, then d·g is a common divisor of m and n, and every common divisor of m and n divides their gcd, so d·g divides g. That is only possible if d = ±1. Hence gcd(m / gcd(m, n), n / gcd(m, n)) = 1, showing that the two quotients are coprime.
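The claim can also be sanity-checked numerically. The short Python sketch below uses the standard-library `math.gcd` to verify gcd(m/g, n/g) = 1 over a small grid of nonzero integer pairs; the range bound 12 is an arbitrary choice for illustration.

```python
from math import gcd

# Check the claim: for nonzero integers m, n with g = gcd(m, n),
# the quotients m/g and n/g are coprime.
for m in range(-12, 13):
    for n in range(-12, 13):
        if m == 0 or n == 0:
            continue
        g = gcd(m, n)  # math.gcd returns a positive value here
        # g divides m and n exactly, so // gives the exact quotients
        assert gcd(m // g, n // g) == 1, (m, n)
print("claim holds for all tested pairs")
```

This is only a finite check, of course; the proof above is what establishes the statement for all nonzero integers.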

answered by User Ksugiarto (7.9k points)