Solve the given equation for x: log10(x) + log10(x^5) = 2

1 Answer


To solve this, we first use the product rule of logarithms: the sum of two logarithms with the same base equals the logarithm of the product of their arguments. Therefore, we can rewrite:

log10(x) + log10(x^5) = 2

as

log10(x*x^5) = 2
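As a quick sanity check of the product rule, here is a minimal numeric sketch (the test value x = 3 is arbitrary, chosen only for illustration):

```python
import math

# Arbitrary positive test value (not part of the original problem).
x = 3.0

# Product rule: log10(x) + log10(x^5) == log10(x * x^5) == log10(x^6)
lhs = math.log10(x) + math.log10(x**5)
rhs = math.log10(x * x**5)

assert math.isclose(lhs, rhs)
```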

Combining the exponents, we get

log10(x^6) = 2

To remove the logarithm, we use the rule that a = log_b(c) is equivalent to b^a = c, so:

x^6 = 10^2

x^6 = 100

The next step is to take the sixth root of both sides:

x = 100^(1/6)

Over the complex numbers, x^6 = 100 has six roots in total. Two are real:

x = 10^(1/3) --> since 100^(1/6) = 10^(2/6) = 10^(1/3)

x = -10^(1/3)

And four are complex:

x = 10^(1/3)*(-1 - sqrt(3)*i)/2

x = 10^(1/3)*(-1 + sqrt(3)*i)/2

x = 10^(1/3)*(1 - sqrt(3)*i)/2

x = 10^(1/3)*(1 + sqrt(3)*i)/2

However, the original equation contains log10(x), which is only defined for x > 0. The negative real root and the four complex roots must therefore be discarded. The only valid solution is:

x = 10^(1/3) ≈ 2.154
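The result can be verified numerically; a short sketch, checking that x = 10^(1/3) satisfies the original equation and that all six sixth roots of 100 satisfy x^6 = 100:

```python
import cmath
import math

# The valid (positive real) solution.
x = 10 ** (1 / 3)
assert math.isclose(math.log10(x) + math.log10(x**5), 2.0)

# All six sixth roots of 100 lie on a circle of radius 10^(1/3) in the
# complex plane; each satisfies r**6 == 100, but only the positive real
# one (k = 0) is in the domain of the original logarithmic equation.
roots = [10 ** (1 / 3) * cmath.exp(2j * cmath.pi * k / 6) for k in range(6)]
for r in roots:
    assert cmath.isclose(r**6, 100, rel_tol=1e-9)
```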

by Douglas Gaskell (8.6k points)