The equation for the cost in dollars of producing computer chips is y = 0.000015x^2 - 0.03x + 35, where x is the number of chips produced. Find the number of chips that minimizes the cost. What is the cost for that number of chips?

2 Answers

Final answer:

To find the number of chips that minimizes the cost, locate the vertex of the quadratic cost function using x = -b/(2a). The number of chips that minimizes the cost is 1000, and the cost of producing 1000 chips is $20.

Step-by-step explanation:

To find the number of chips that minimizes the cost, we need to find the vertex of the quadratic cost function. The equation for the cost of producing computer chips is y = 0.000015x^2 - 0.03x + 35, where x is the number of chips produced. Because the leading coefficient is positive, the parabola opens upward, so the vertex is a minimum. The x-coordinate of the vertex is given by x = -b/(2a), where a and b are the quadratic and linear coefficients of the equation.
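For reference, the vertex formula follows from completing the square (a standard derivation, not part of the original answer):

```latex
\[
y = ax^2 + bx + c
  = a\left(x + \frac{b}{2a}\right)^2 + \left(c - \frac{b^2}{4a}\right)
\]
```

Since a > 0 here, the squared term is nonnegative and equals zero exactly when x = -b/(2a), so that is where the cost attains its minimum.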

For this equation, a = 0.000015 and b = -0.03. Substituting these values into the formula gives:

x = -(-0.03)/(2 · 0.000015) = 0.03/0.00003 = 1000

So, the number of chips that minimizes the cost is 1000. To find the cost for that number of chips, substitute x = 1000 into the equation and solve for y:

y = 0.000015(1000)^2 - 0.03(1000) + 35 = 15 - 30 + 35 = 20

Therefore, the cost for producing 1000 chips is $20.
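As a quick numerical check, here is a minimal Python sketch (not part of the original answer; the names a, b, c, and cost are just labels for the given equation):

```python
# Cost of producing x chips, from the equation in the question:
# y = 0.000015x^2 - 0.03x + 35
a, b, c = 0.000015, -0.03, 35

def cost(x):
    """Cost in dollars of producing x chips."""
    return a * x**2 + b * x + c

x_min = -b / (2 * a)     # vertex of the parabola: x = -b/(2a)
print(x_min)             # 1000.0 (up to floating-point rounding)
print(cost(x_min))       # 20.0 (up to floating-point rounding)

# Sanity check: neighboring production levels should cost slightly more.
assert cost(999) > cost(1000) < cost(1001)
```

The assertion passes because the cost at 999 or 1001 chips is $20.000015, just above the $20 minimum at 1000 chips.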


Answer:

The cost is minimized at 1000 chips, and the minimum cost is $20.

Step-by-step explanation:

The cost equation C = 0.000015x^2 - 0.03x + 35, where x is the number of chips produced, describes an upward-opening parabola, so its minimum occurs at the vertex. The x-coordinate of the vertex is x = -b/(2a) = -(-0.03)/(2 · 0.000015) = 1000 chips. Substituting back, C = 0.000015(1000)^2 - 0.03(1000) + 35 = 20, so the minimum cost is $20.
