The heights of the apple trees in an orchard are normally distributed with a mean of 12.5 feet and a standard deviation of 1.2 feet. What percentage of the apple trees are between 10.1 and 13.7 feet tall?


1 Answer


Answer:

81.859%

Explanation:

We start by calculating the z-score for each height.

Mathematically:

z = (x - mean)/SD

for 10.1:

z = (10.1 - 12.5)/1.2 = -2.4/1.2 = -2

for 13.7:

z = (13.7 - 12.5)/1.2 = 1.2/1.2 = 1

So we need the probability that Z falls within this range:

P(-2 < Z < 1)

From the standard normal distribution table, P(Z < 1) = 0.84134 and P(Z < -2) = 0.02275.

That gives:

P(-2 < Z < 1) = P(Z < 1) - P(Z < -2) = 0.84134 - 0.02275 = 0.81859

Converting this to a percentage, we have 81.859%, or about 81.9% of the trees.
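
As a quick numerical check, here is a minimal Python sketch (assuming SciPy is installed) that reproduces the table lookup using norm.cdf:

from scipy.stats import norm

mean, sd = 12.5, 1.2

# z-scores for the two heights
z_low = (10.1 - mean) / sd    # -2.0
z_high = (13.7 - mean) / sd   #  1.0

# P(-2 < Z < 1) = P(Z < 1) - P(Z < -2)
prob = norm.cdf(z_high) - norm.cdf(z_low)
print(round(prob, 5), round(prob * 100, 3))  # 0.81859 81.859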

