The correct answer is option a: Alpha increases. An increase in Alpha increases the risk of making a Type I (Alpha) error in hypothesis testing.
In hypothesis testing, the risk of making a Type I (Alpha) error rises as the value of Alpha increases. Alpha, also known as the significance level, is the probability of rejecting the null hypothesis when it is actually true.
By increasing Alpha, you are lowering the bar for the evidence required to reject the null hypothesis. This increases the likelihood of falsely rejecting the null hypothesis and concluding that there is a significant effect or relationship when there isn't one in reality.
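To see this concretely, here is a minimal Python sketch (assuming NumPy and SciPy are available; the sample size and distribution are illustrative choices, not from the original question). It simulates many experiments in which the null hypothesis is actually true and counts how often a one-sample t-test rejects it at two different Alpha levels.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_experiments = 10_000
sample_size = 30

false_rejections = {0.05: 0, 0.10: 0}
for _ in range(n_experiments):
    # The null hypothesis is true here: the population mean really is 0.
    sample = rng.normal(loc=0.0, scale=1.0, size=sample_size)
    _, p_value = stats.ttest_1samp(sample, popmean=0.0)
    for alpha in false_rejections:
        if p_value < alpha:
            false_rejections[alpha] += 1  # Type I error: rejected a true null

for alpha, count in false_rejections.items():
    print(f"Alpha = {alpha:.2f}: estimated Type I error rate = {count / n_experiments:.3f}")

Under these assumptions, the estimated Type I error rates should come out close to 0.05 and 0.10, i.e. roughly equal to the chosen Alpha values themselves.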
To understand this better, let's consider an example. Suppose you are testing a new drug's effectiveness. The null hypothesis states that the drug has no effect, while the alternative hypothesis suggests that it does. If you set a high Alpha value (e.g., 0.10), you are more likely to reject the null hypothesis and conclude that the drug is effective, even if it isn't.
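Here is a hedged sketch of that drug example with simulated data (the group labels, sample sizes, and outcome scale are illustrative assumptions, not real trial data). Both groups are drawn from the same distribution, so the drug truly has no effect and any rejection is a Type I error; a p-value that falls between 0.05 and 0.10 would be rejected at Alpha = 0.10 but not at Alpha = 0.05.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
placebo = rng.normal(loc=50.0, scale=10.0, size=40)  # outcomes under placebo
drug = rng.normal(loc=50.0, scale=10.0, size=40)     # same mean: no real effect

_, p_value = stats.ttest_ind(drug, placebo)
print(f"p-value = {p_value:.3f}")
for alpha in (0.05, 0.10):
    decision = "reject H0 (claim the drug works)" if p_value < alpha else "fail to reject H0"
    print(f"Alpha = {alpha:.2f}: {decision}")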
It's important to choose an appropriate Alpha value to balance the risk of Type I errors against the risk of Type II errors. A Type II error, also known as a Beta error, occurs when you fail to reject the null hypothesis even though it is false. Changing Beta does not directly change the Type I error risk, which is set by Alpha; however, for a fixed sample size, lowering Alpha to guard against Type I errors tends to increase Beta, which is why the two must be balanced.
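The Alpha/Beta trade-off can be sketched the same way (again an illustrative simulation, with an assumed true effect of 0.5 standard deviations). Here the alternative hypothesis is true, so every failure to reject is a Type II (Beta) error, and smaller Alpha values tend to produce more of them at the same sample size.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 10_000
sample_size = 30

missed = {0.01: 0, 0.05: 0, 0.10: 0}
for _ in range(n_experiments):
    # The alternative is true here: the population mean is 0.5, not 0.
    sample = rng.normal(loc=0.5, scale=1.0, size=sample_size)
    _, p_value = stats.ttest_1samp(sample, popmean=0.0)
    for alpha in missed:
        if p_value >= alpha:
            missed[alpha] += 1  # Type II error: failed to reject a false null

for alpha, count in missed.items():
    print(f"Alpha = {alpha:.2f}: estimated Beta = {count / n_experiments:.3f}")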
In summary, an increase in Alpha increases the risk of making a Type I (Alpha) error in hypothesis testing. It's crucial to select an appropriate Alpha value to minimize the chances of drawing incorrect conclusions.