5 votes
Given f(x) = x³ – 2x² – x + 2,

find the roots of f(x).

by User Nairolf (5.3k points)

2 Answers

6 votes

f(x) = x³ – 2x² – x + 2

0 = x³ – 2x² – x + 2

(x-2)(x-1)(x+1)=0

x₁ = -1

x₂ = 1

x₃ = 2
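
As a quick sanity check, here is a minimal Python sketch (plain standard Python, no extra libraries) that plugs each claimed root back into f(x); every printed value should be 0:

# Evaluate f(x) = x³ - 2x² - x + 2 at each claimed root
def f(x):
    return x**3 - 2*x**2 - x + 2

for root in (-1, 1, 2):
    print(root, f(root))  # each f(root) should print 0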

by User Smack Alpha (5.4k points)
4 votes

For this case, we must follow the steps below:

We factor the polynomial by grouping. First, we group the terms and factor out the greatest common factor of each group:

x³ – 2x² – x + 2 = x²(x - 2) - (x - 2)

We factor out the common factor (x - 2):

(x - 2)(x² - 1)

Now, by the difference of two squares identity we have:


a² - b² = (a + b)(a - b)

Where:

a = x and b = 1

Now, we can rewrite the polynomial as:


(x - 2)(x + 1)(x - 1)
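
If you want to double-check the factorization symbolically, a small sketch using the third-party sympy library (assumed here; any computer algebra system would do) expands the factored form back to the original polynomial:

from sympy import symbols, expand

x = symbols('x')
factored = (x - 2)*(x + 1)*(x - 1)
print(expand(factored))  # expected output: x**3 - 2*x**2 - x + 2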

To find the roots, we set the expression equal to 0:


(x - 2)(x + 1)(x - 1) = 0

So, the roots are:


x₁ = 2, x₂ = -1, x₃ = 1

Answer:


x₁ = 2, x₂ = -1, x₃ = 1
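
For completeness, a minimal sketch (again assuming sympy is available) that solves the equation directly and should return the same three roots:

from sympy import symbols, solve

x = symbols('x')
roots = solve(x**3 - 2*x**2 - x + 2, x)
print(roots)  # expected: [-1, 1, 2] (order may vary)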

by User Gadget (4.8k points)