Ayden earned a score of 50 on Exam A, which had a mean of 75 and a standard deviation of 10. He is about to take Exam B, which has a mean of 450 and a standard deviation of 20. How well must Ayden score on Exam B in order to do equivalently well as he did on Exam A? Assume that scores on each exam are normally distributed.

by Muhammad Adeel (2.7k points)

1 Answer


For the first exam we know that:


\begin{gathered} \mu_1 = 75 \\ \sigma_1 = 10 \\ \text{score}_1 = 50 \end{gathered}

So, Ayden's score was 2.5 standard deviations below the mean:


50 = \mu_1 - 2.5\sigma_1 = 75 - 2.5\cdot 10 = 75 - 25

For the second exam, we know that:


\begin{gathered} \mu_2 = 450 \\ \sigma_2 = 20 \end{gathered}

To do equivalently well, Ayden's score must be 2.5 standard deviations below the mean, so:


\text{score}_2 = \mu_2 - 2.5\sigma_2 = 450 - 2.5\cdot 20 = 450 - 50 = 400
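The conversion above can be sketched in a few lines of Python: compute the z-score on Exam A, then map that same z-score onto Exam B's distribution. The helper name `equivalent_score` is just an illustration, not part of the original answer.

```python
# Hypothetical helper: map a score on one normally distributed exam
# to the equivalent score on another, by matching z-scores.
def equivalent_score(score, mean_a, sd_a, mean_b, sd_b):
    z = (score - mean_a) / sd_a   # z-score on the first exam
    return mean_b + z * sd_b      # same z-score on the second exam

# Ayden's case: z = (50 - 75) / 10 = -2.5, so 450 + (-2.5) * 20 = 400
print(equivalent_score(50, 75, 10, 450, 20))  # -> 400.0
```
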

by Godheaper (2.6k points)