Because every one of the sources of error in his experiment was bigger than what he was trying to measure.
Here's what he did, as I recall:
-- Galileo and his buddy took flashlights and hiked up two hilltops at night.
-- Galileo pointed his flashlight at his buddy on the other hill. Then he turned it on, and started his clock.
-- His buddy on the other hill pointed his flashlight at Galileo, and waited. When he saw the light from Galileo's flashlight, he turned HIS on.
-- When Galileo saw the light from his buddy's flashlight, he stopped his clock.
-- Galileo sent a message to his buddy: "Bel lavoro, amico. Andiamo alla taverna, ti offro una bibita fresca, e possiamo trovare ragazze." ("Nice work, friend. Let's go to the tavern, I'll buy you a cold drink, and we can find some girls.")
Now, Galileo figured that the time shown on his clock was the time it took light to make the round trip from him to his buddy and back.
Yes, that was part of it. But there was probably also some time ...
-- between turning his flashlight on and starting his clock,
-- between his buddy SEEING the light and turning HIS on,
-- between seeing his buddy's light and stopping the clock.
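In other words, the clock wasn't recording just the light's round trip; it was recording the round trip plus three human delays stacked on top. Here's that error model as a minimal sketch (Python, my notation; none of these names come from the actual experiment):

```python
def clock_reading(t_light, d_start, d_relay, d_stop):
    """What Galileo's clock actually records: the true round-trip light
    time plus the three human delays listed above (names hypothetical)."""
    return t_light + d_start + d_relay + d_stop
```

Every one of those d's lands on the clock looking exactly like extra light-travel time.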
How much time could each of those add to the clock? Well, one problem was that they couldn't be measured, and they were probably different every time the experiment was repeated.
What do you think? Could each of those reaction times be 0.1 second, maybe? So all together, there was an extra 0.3 seconds on the clock that had nothing to do with the light?
Well, 0.3 seconds is about what light takes to travel almost 56,000 miles! That's more than twice around the Earth!
If Galileo and his buddy were even 5 miles apart on their hilltops, their lights could make the round trip in something like 0.00005 second! Those "reaction times" of theirs, adding a few tenths of a second to the clock, totally gummed up the experiment for sure. The thing that they were really trying to measure was ... as engineers say ... "lost in the noise".
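If you want to check those numbers, here's the back-of-the-envelope arithmetic in Python (the 5-mile separation and the 0.1 second per reaction are the guesses from above, not measured data):

```python
# Rough numbers from the discussion above -- guesses, not measurements.
C_MILES_PER_SEC = 186_000    # speed of light, approximately

distance_miles = 5.0         # assumed separation between the hilltops
t_light = 2 * distance_miles / C_MILES_PER_SEC  # round trip: ~0.0000538 s
t_reactions = 3 * 0.1        # three ~0.1 s human delays: 0.3 s

print(f"light round trip:  {t_light:.7f} s")
print(f"reaction overhead: {t_reactions:.1f} s")
print(f"overhead / signal: ~{t_reactions / t_light:,.0f}x")  # about 5,600x
```

The thing they wanted to measure was thousands of times smaller than the delays they couldn't get rid of or even estimate. That's "lost in the noise" with room to spare.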