Final answer:
An endothermic process absorbs heat from the surroundings, so the entropy of the surroundings decreases. Because heat flows from the hotter surroundings to the colder system, the system's entropy increases by more than the surroundings' entropy decreases, so the total entropy of the process still increases.
Step-by-step explanation:
When a process is endothermic, the system takes in heat from the surroundings, which lowers the surroundings' temperature and decreases their entropy. In contrast, an exothermic process releases heat into the surroundings, raising their temperature and increasing their entropy. The quantity of heat for a process is represented by the letter q: for endothermic processes q is positive, and for exothermic processes q is negative.
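The sign convention above can be sketched numerically. This is an illustrative example (the function name and values are not from the original answer), using the relation ΔS_surr = −q_sys/T_surr:

```python
def entropy_change_of_surroundings(q_system_joules, t_surroundings_kelvin):
    """Heat absorbed by the system (q > 0, endothermic) is lost by the
    surroundings, so the surroundings' entropy change is -q/T."""
    return -q_system_joules / t_surroundings_kelvin

# Endothermic example: the system absorbs 500 J from surroundings at 300 K.
ds_surr = entropy_change_of_surroundings(500.0, 300.0)
print(ds_surr)  # negative: the surroundings lose entropy
```

A positive q (endothermic) gives a negative ΔS_surr, matching the sign convention described above.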
It is also insightful to note why entropy increases overall when heat transfers from a hotter to a colder object. Since ΔS = q/T, the same quantity of heat produces a smaller entropy change at a higher temperature. The entropy decrease of the hotter object (the surroundings, in an endothermic process) is therefore smaller in magnitude than the entropy increase of the colder object (the system that absorbs the heat), resulting in a net increase in entropy for the process.
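This balance can be checked with a short calculation. The temperatures and heat value below are illustrative assumptions, chosen only to show that the colder object's entropy gain outweighs the hotter object's entropy loss:

```python
def total_entropy_change(q, t_hot, t_cold):
    """Net entropy change when heat q flows from a hotter body at t_hot
    to a colder body at t_cold (temperatures in kelvin, q in joules)."""
    ds_hot = -q / t_hot    # hotter body loses entropy
    ds_cold = q / t_cold   # colder body gains more entropy (smaller T)
    return ds_hot + ds_cold

# 500 J flowing from 350 K surroundings into a 300 K system:
# -500/350 + 500/300, which is positive (a net entropy increase).
print(total_entropy_change(500.0, 350.0, 300.0))
```

Whenever t_hot > t_cold, the result is positive, which is exactly the "overall increase in entropy" described above.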