Explanation:
So, he predicted he had 12 questions wrong, which would have been a score of 38 (meaning the test had 50 questions in total).
But he actually got 42.
That is a difference of 42 - 38 = 4.
Now, "accuracy" could have different meanings here: the actual score vs. the predicted score, which individual questions were right or wrong (actual vs. predicted), only the questions that were answered correctly (actual vs. predicted), and so on.
I assume it is the first one.
So:
100% = 38 (the prediction)
1% = 38/100 = 0.38
How many percent is the 4-question gap between his prediction and reality?
4 / 0.38 = 10.52631579...
So his prediction was off by about 10.53%.
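
If you want to double-check this, here is a minimal Python sketch of the same calculation; the function name percent_off is just something I made up for the example:

```python
def percent_off(predicted: float, actual: float) -> float:
    """Gap between actual and predicted, as a percent of the prediction."""
    return abs(actual - predicted) / predicted * 100

predicted_score = 50 - 12  # he predicted 12 wrong out of 50, so a score of 38
actual_score = 42

print(f"{percent_off(predicted_score, actual_score):.2f}%")  # prints 10.53%
```

Dividing by the predicted score (not the actual one) is what makes 38 the "100%" baseline, matching the steps above.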