You are trying to enter a prize drawing at the radio station, but the lines are busy. You continue to call every 1–5 minutes, hoping to get on the air. Which reinforcement schedule is this?

Question 9 options:
a) variable interval
b) fixed interval
c) variable ratio
d) fixed ratio

asked by Acecool (7.9k points)

1 Answer


Final answer:

Calling every 1–5 minutes without knowing when you will get through to the radio station is an example of a variable interval reinforcement schedule.

Step-by-step explanation:

The behavior described in the scenario, where the caller redials the radio station every 1–5 minutes hoping to get on the air, illustrates a variable interval reinforcement schedule. The reinforcement (getting through to the radio station) becomes available after varying, unpredictable amounts of time rather than after a set number of responses. Because the caller cannot know exactly when a call will be picked up, they continue to exhibit the behavior (making calls) at a steady rate in hopes of receiving the reinforcement.

It's important to distinguish this from the others: a fixed interval reinforcement schedule occurs when behavior is rewarded after a set amount of time passes; a fixed ratio reinforcement schedule requires a set number of responses before the behavior is rewarded; and a variable ratio reinforcement schedule rewards a behavior after an unpredictable number of responses, which is what makes slot machines so engaging.
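The defining property of a variable interval schedule can be sketched in a short simulation. This is a minimal illustration, not part of the original answer: the `simulate_variable_interval` function and its parameters are hypothetical, and the 1–5 minute wait is taken from the scenario. The key point it shows is that reinforcement is gated by elapsed time, not by a count of responses, so extra calls made while the line is busy are never rewarded.

```python
import random

def simulate_variable_interval(num_rewards=5, seed=0):
    """Simulate a variable interval schedule (hypothetical sketch).

    A reward becomes available after a random 1-5 minute wait, and
    the first call made after that point is reinforced, no matter
    how many unrewarded calls were made while waiting.
    """
    rng = random.Random(seed)
    time = 0.0          # minutes elapsed
    calls = 0           # total calls made (responses)
    reward_times = []   # when each reinforcement was delivered
    for _ in range(num_rewards):
        # Reinforcement becomes available after an unpredictable interval.
        available_at = time + rng.uniform(1, 5)
        # The caller redials once per minute until getting through;
        # calls made before the interval elapses go unreinforced.
        while time < available_at:
            time += 1.0
            calls += 1
        reward_times.append(time)
    return calls, reward_times
```

On a ratio schedule the reward count would track the call count exactly; here the number of calls per reward varies with the random interval, which is why interval schedules produce steady, moderate response rates.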

answered by Mrfreester (8.3k points)