Recall the Akaike Information Criterion (AIC) and the related Bayesian Information Criterion (BIC) from our study of Variable Selection and Model Building. Without heavy mathematical notation, explain how both the AIC and BIC reward goodness-of-fit of a regression model while also controlling model complexity. Why would we wish to do this (that is, to promote goodness-of-fit while at the same time controlling model complexity)?

asked by User Daniellee

1 Answer


Explanation:

To judge how well a model balances accuracy and parsimony, we use the Akaike Information Criterion (AIC) or the Bayesian Information Criterion (BIC). Both reward goodness-of-fit through the model's likelihood but add a penalty that grows with the number of parameters, so they control model complexity. A lower AIC or BIC indicates a better model.

Akaike Information Criterion: AIC = 2k − 2·ln(L), where k is the number of estimated parameters and L is the maximized likelihood of the model. The related BIC = k·ln(n) − 2·ln(L), where n is the sample size, so for n larger than about 7 the BIC penalizes extra parameters more heavily than the AIC. In both, a higher k gives a higher (worse) criterion value unless the added parameters improve the fit enough to compensate.
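As a rough illustration of these formulas, here is a minimal sketch (the helper `aic_bic` is hypothetical, assuming ordinary least squares with Gaussian errors, so the maximized log-likelihood has a closed form):

```python
import numpy as np

def aic_bic(y, X):
    """Fit OLS and return (AIC, BIC) under a Gaussian likelihood.
    Illustrative helper; X should already include an intercept column."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n                     # MLE of the error variance
    # maximized log-likelihood of the Gaussian regression model
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k_total = k + 1                                # coefficients plus sigma^2
    aic = 2 * k_total - 2 * loglik
    bic = np.log(n) * k_total - 2 * loglik
    return aic, bic

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
X1 = np.column_stack([np.ones(n), x])                  # true model
X2 = np.column_stack([X1, rng.normal(size=(n, 3))])    # plus 3 noise columns
print(aic_bic(y, X1))
print(aic_bic(y, X2))
```

Because ln(100) > 2, the BIC charges the three extra noise columns a larger penalty than the AIC does, which is the behavior the formulas above predict.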

In the real world, overly complex models are discouraged and avoided because:

1. They overfit the data, capturing noise rather than the underlying signal.

2. They are harder to interpret.

3. They take more time and computation to fit and use.

Using these two criteria as measures of performance, we can select an optimal set of independent variables.

With forward/backward stepwise regression, we add new variables to the model or remove them from it, one at a time. The best model is the one with the lowest AIC (or BIC).
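The forward-selection half of this procedure can be sketched as follows (a minimal illustration, not a production routine; `fit_aic` assumes OLS with Gaussian errors, and `forward_select` greedily adds whichever candidate lowers the AIC most, stopping when no addition helps):

```python
import numpy as np

def fit_aic(y, X):
    """AIC of an OLS fit under a Gaussian likelihood (illustrative helper)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * (k + 1) - 2 * loglik

def forward_select(y, candidates):
    """Greedy forward selection: repeatedly add the predictor that lowers
    the AIC the most; stop when no addition improves it.
    `candidates` maps variable name -> column of values."""
    n = len(y)
    chosen, X = [], np.ones((n, 1))          # start from intercept only
    best = fit_aic(y, X)
    improved = True
    while improved:
        improved = False
        scores = {name: fit_aic(y, np.column_stack([X, col]))
                  for name, col in candidates.items() if name not in chosen}
        if scores:
            name = min(scores, key=scores.get)
            if scores[name] < best:
                best = scores[name]
                chosen.append(name)
                X = np.column_stack([X, candidates[name]])
                improved = True
    return chosen, best

rng = np.random.default_rng(1)
n = 200
cols = {f"x{i}": rng.normal(size=n) for i in (1, 2, 3)}   # x3 is pure noise
y = 3 * cols["x1"] - 2 * cols["x2"] + rng.normal(scale=0.5, size=n)
chosen, aic = forward_select(y, cols)
print(chosen)
```

On data like this, the two informative predictors are selected because each one lowers the AIC by far more than the 2-point penalty its parameter costs, while a pure-noise column usually cannot.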

answered by User Ang