When did the earth sciences become more recognized professions?

1 Answer

1 vote

Answer:

Earth sciences have always held a certain importance: through them we generate new information about energy flows in nature, and we learn to use that information to anticipate meteorological, seismic, volcanic, and environmental risks.

Explanation:

Earth sciences, or geosciences, are the disciplines that study the Earth and the changes it undergoes, such as its structure, movement, and evolution.

All of these disciplines help us plan the rational use of natural resources, understand the causes of natural disasters, and see how those events directly affect nature.

Some of these disciplines are geology, geophysics, oceanography, and climatology, among others.

answered by Greta (4.4k points)