Final answer:
Salespeople in the U.S. primarily work in sales to make more money.
Step-by-step explanation:
Sales positions often offer commission-based pay, which ties earnings directly to performance: the more a salesperson sells, the more they earn. For example, a salesperson with a 5% commission rate who closes $200,000 in sales would earn $10,000 in commission on top of any base salary. This incentive structure motivates salespeople to pursue higher sales volumes, which in turn raises their overall income.