If a polynomial function f(x) has roots 1 + √(2) and -3, what must be a factor of f(x)?

[Image: answer choices for the question]
User ClarkeyBoy

2 Answers


Answer:

The correct option is B.


User Grisha Levit
To find a factor of a polynomial from its roots, we set each root equal to x and then factor backwards.

We know from our problem that one of the roots of our polynomial is -3, so let's set -3 equal to x and factor backwards:

x=-3

x+3=0

(x+3) is a factor of our polynomial.

We also know that another root of our polynomial is 1 + √(2), so let's set 1 + √(2) equal to x and factor backwards:

x = 1 + √(2)

x - 1 = √(2)

x - 1 - √(2) = 0

x - (1 + √(2)) = 0

(x - (1 + √(2))) is a factor of our polynomial.

We can conclude that there is no correct answer in your given choices.
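You can check both factors numerically. As a sketch (not part of the original answer), here is a hypothetical cubic f built from the two derived factors together with the conjugate root 1 - √(2), which makes the coefficients come out as integers; both given roots then evaluate to zero:

```python
import math

def f(x):
    # Hypothetical polynomial for illustration:
    # (x + 3) * (x - (1 + sqrt(2))) * (x - (1 - sqrt(2)))
    # expands to x^3 + x^2 - 7x - 3
    return x**3 + x**2 - 7*x - 3

print(abs(f(-3)) < 1e-9)                # True: -3 is a root
print(abs(f(1 + math.sqrt(2))) < 1e-9)  # True: 1 + sqrt(2) is a root
```

Including the conjugate 1 - √(2) is the usual move when the polynomial is required to have rational coefficients, since irrational roots of such polynomials come in conjugate pairs.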
User Ishwr