Final answer:
With their speeds reduced, the two programs' run times are extended to 12 minutes and 20 minutes, respectively. Running in parallel, both are done when the longer-running program finishes, which is after 20 minutes.
Step-by-step explanation:
To find how long it takes for both programs to finish when run in parallel, we first determine their new run times at the reduced speeds. The first program's speed is reduced by 50%, so it runs at half speed and takes twice as long: its original 6-minute run time becomes 12 minutes. The second program's speed is reduced by 40%, so it runs at 60% of its original speed and takes 100/60 ≈ 1.67 times as long: its original 12-minute run time becomes 12 × 5/3 = 20 minutes.
When both programs start together and run in parallel, everything is done once the slower one finishes, so the total time is the maximum of the two run times. The second program now takes the longest, so both programs will be finished after 20 minutes. Hence, the answer is 20 minutes.
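As a quick check, here is a minimal Python sketch of the same calculation. The original run times of 6 and 12 minutes are the values implied above by the doubling and the 1.67× factor:

```python
original_minutes = [6, 12]       # implied original run times of the two programs
speed_reduction = [0.50, 0.40]   # 50% and 40% speed reductions

# A program running at fraction (1 - r) of its original speed
# takes 1 / (1 - r) times as long to finish.
new_minutes = [t / (1 - r) for t, r in zip(original_minutes, speed_reduction)]
print(new_minutes)               # [12.0, 20.0]

# In parallel, both are done when the slower program finishes.
print(max(new_minutes))          # 20.0
```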