Answer:
Step-by-step explanation:
You would expect the majority classifier to score about 50% under leave-one-out cross-validation, but it actually scores 0% every time.
This happens because leave-one-out cross-validation splits a data set of n examples into n folds of size one: each example is held out once while the classifier is trained on the remaining n − 1 examples.
From the given data, the data set consists of 100 positive and 100 negative examples, and we evaluate the majority classifier with leave-one-out cross-validation.
The majority classifier is given a set of training data and, regardless of the input, always outputs the class that forms the majority of the training set.
Under leave-one-out cross-validation, the class balance of the training set is sensitive to even this smallest perturbation: removing a single example breaks the 100/100 tie.
When an instance is held out, its own class becomes the minority in the remaining training set (99 versus 100), so the majority classifier predicts the opposite class and misclassifies the held-out instance every time.
Therefore the leave-one-out estimate of accuracy for the majority classifier is 0%: it predicts the wrong answer for every held-out example.
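The argument above can be checked with a short sketch (the helper names `majority_class` and `loocv_accuracy` are hypothetical, and binary labels are assumed):

```python
from collections import Counter

def majority_class(labels):
    """Return the most common label in the training set."""
    return Counter(labels).most_common(1)[0][0]

def loocv_accuracy(labels):
    """Leave-one-out accuracy of a classifier that always
    predicts the training-set majority class."""
    correct = 0
    for i, true_label in enumerate(labels):
        train = labels[:i] + labels[i + 1:]  # hold out example i
        prediction = majority_class(train)   # 99 vs. 100 -> opposite class
        correct += (prediction == true_label)
    return correct / len(labels)

data = [+1] * 100 + [-1] * 100  # 100 positive, 100 negative examples
print(loocv_accuracy(data))     # 0.0
```

Each of the 200 folds trains on 99 examples of the held-out class and 100 of the other, so the prediction is always wrong and the estimated accuracy is 0.0, even though the classifier's true accuracy on this balanced data set would be 50%.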