Final answer:
Legitimacy is the word from the list that best fits the definition: Whether those in power are viewed as having the right to govern.
Step-by-step explanation:
Legitimacy is the word from the list that best fits the definition. Legitimacy refers to the belief and acceptance that those in power hold the rightful authority to govern. In a democracy, a government's legitimacy derives from the fact that its leaders are elected by the people. By contrast, authoritarian governments such as dictatorships often lack legitimacy because they seize and maintain power without the consent of the governed.