Security A has a higher standard deviation of returns than security B. We would expect that:

I. Security A would have a risk premium equal to security B.
II. The likely range of returns for security A in any given year would be higher than the likely range of returns for security B.
III. The Sharpe ratio of A will be higher than the Sharpe ratio of B.

Multiple Choice
A. I only
B. I and II only
C. II and III only
D. I, II and III

User Rshankar

1 Answer


Answer:

B. I and II only

Step-by-step explanation:

Assumption: There appears to be an error in the question. Statement I has been read as "Security A would have a higher risk premium than Security B"; only then can option B (I and II only) be selected.

The standard deviation of a security's returns is a measure of its risk. A higher standard deviation means higher risk.

A security that has more risk would have a greater risk premium to compensate for the extra risk assumed.

The Sharpe ratio denotes excess return per unit of total risk, i.e., per unit of standard deviation.

In the given case, security A has a higher standard deviation than security B. This means Security A would have a higher risk premium than Security B (statement I, as corrected). And since the variation of returns is greater (the standard deviation is higher), the likely range of returns for Security A in any given year would be wider than the likely range of returns for Security B (statement II).

Statement III does not follow: a higher standard deviation increases the denominator of the Sharpe ratio, so even with a higher risk premium, Security A's Sharpe ratio is not necessarily higher than Security B's.
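As a quick illustration, here is a small sketch with hypothetical numbers (the returns and standard deviations below are made up for the example, not taken from the question) showing that a security can have a higher risk premium and higher standard deviation yet a lower Sharpe ratio:

```python
def sharpe_ratio(expected_return, risk_free_rate, std_dev):
    """Excess return per unit of total risk (standard deviation)."""
    return (expected_return - risk_free_rate) / std_dev

risk_free = 0.03  # hypothetical risk-free rate

# Security A: higher risk premium (8%) AND higher standard deviation (20%)
sharpe_a = sharpe_ratio(0.11, risk_free, 0.20)

# Security B: lower risk premium (4%), but much lower standard deviation (8%)
sharpe_b = sharpe_ratio(0.07, risk_free, 0.08)

print(f"Sharpe A: {sharpe_a:.2f}")  # 0.40
print(f"Sharpe B: {sharpe_b:.2f}")  # 0.50
```

Here A compensates investors with a larger risk premium, yet B delivers more excess return per unit of risk, which is why statement III cannot be concluded from a higher standard deviation alone.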

User Prabal Srivastava