The height of a rocket launched from the ground is given by the function y = 30t - 5t^2, where y is in meters and t is in seconds. An observer sits at ground level a distance d from the launch point. If r is the distance between the observer and the rocket, show that dr/dt = ((30t - 5t^2)(30 - 10t)) / sqrt((30t - 5t^2)^2 + d^2). Show all your work.
Please help, I need this by 11 pm. Thanks! :)
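A sketch of the derivation, assuming the rocket travels straight up from the launch point so that the height y, the ground distance d, and the observer-to-rocket distance r form a right triangle:

```latex
\begin{aligned}
&\text{Pythagorean relation (d is constant):} \\
&\quad r^2 = y^2 + d^2 \quad\Longrightarrow\quad r = \sqrt{y^2 + d^2}. \\[4pt]
&\text{Differentiate both sides of } r^2 = y^2 + d^2 \text{ with respect to } t: \\
&\quad 2r\,\frac{dr}{dt} = 2y\,\frac{dy}{dt}
\quad\Longrightarrow\quad
\frac{dr}{dt} = \frac{y}{r}\,\frac{dy}{dt}. \\[4pt]
&\text{With } y = 30t - 5t^2 \text{, we have } \frac{dy}{dt} = 30 - 10t \text{, so} \\
&\quad \frac{dr}{dt}
= \frac{(30t - 5t^2)(30 - 10t)}{\sqrt{(30t - 5t^2)^2 + d^2}}.
\end{aligned}
```

The key step is implicit differentiation of r^2 = y^2 + d^2: since d does not change with time, its derivative drops out, leaving only the y term on the right-hand side.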