A curve, given by y = (x-a)√(x-b) for x ≥ b, where a and b are constants, cuts the x-axis at A where x = b+1. Show that the gradient of the curve at A is 1.


1 Answer


Answer:

A curve is given by y=(x-a)√(x-b) for x≥b. The gradient of the curve at A is 1.

Solution:

We need to show that the gradient of the curve at A is 1.

Here it is given that


y = (x-a)√(x-b) --- equation 1

Also, according to the question, the curve cuts the x-axis at A, so point A is (b+1, 0).

Substituting the values of x and y at point A into equation 1:


0 = (b+1-a)√(b+1-b)

0 = b+1-a, that is, a = b+1 --- equation 2

Let u = x-a and v = √(x-b). According to the product rule of differentiation,

y' = u'v + uv'

so, we get


u' = 1

v' = 1/(2√(x-b))


y' = 1·√(x-b) + (x-a)·1/(2√(x-b)) = √(x-b) + (x-a)/(2√(x-b))
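
As a quick check of this step, here is a minimal sympy sketch (sympy and the symbol names are my own choice, not part of the original answer) confirming the derivative:

# Minimal sketch: verify the product-rule derivative of y = (x-a)*sqrt(x-b) with sympy.
import sympy as sp

x, a, b = sp.symbols('x a b', positive=True)
y = (x - a) * sp.sqrt(x - b)

dydx = sp.diff(y, x)
expected = sp.sqrt(x - b) + (x - a) / (2 * sp.sqrt(x - b))

# Prints 0, i.e. sympy's derivative matches the expression above.
print(sp.simplify(dydx - expected))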

By substituting the x-value of point A (x = b+1) and using equation 2 (a = b+1), we get


y' = √(b+1-b) + (b+1-a)·1/(2√(b+1-b))

Since √(b+1-b) = √1 = 1 and b+1-a = 0 by equation 2,

y' = dy/dx = 1 + 0 = 1

Hence it is proved that the gradient of the curve at A is 1.
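
If you want to sanity-check the result, here is a small sympy sketch (the use of sympy and the sample value b = 3 are mine, purely for illustration):

# Minimal sketch: evaluate the gradient at A = (b+1, 0), using a = b+1 from equation 2.
import sympy as sp

x, b = sp.symbols('x b', positive=True)
a = b + 1                          # from equation 2: 0 = b + 1 - a
y = (x - a) * sp.sqrt(x - b)

gradient_at_A = sp.diff(y, x).subs(x, b + 1)
print(sp.simplify(gradient_at_A))  # 1, for any value of b
print(gradient_at_A.subs(b, 3))    # 1 again for the sample value b = 3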
