Catania and Reynolds 1968 - Interval Schedules


JOURNAL OF THE EXPERIMENTAL ANALYSIS OF BEHAVIOR 1968, 11, 327-383 NUMBER 3 (MAY) PART 2

A QUANTITATIVE ANALYSIS OF THE RESPONDING
MAINTAINED BY INTERVAL SCHEDULES
OF REINFORCEMENT¹
A. CHARLES CATANIA AND G. S. REYNOLDS
NEW YORK UNIVERSITY AND UNIVERSITY OF CALIFORNIA, SAN DIEGO

Interval schedules of reinforcement maintained pigeons' key-pecking in six experiments. Each
schedule was specified in terms of mean interval, which determined the maximum rate of
reinforcement possible, and distribution of intervals, which ranged from many-valued (varia-
ble-interval) to single-valued (fixed-interval). In Exp. 1, the relative durations of a sequence
of intervals from an arithmetic progression were held constant while the mean interval was
varied. Rate of responding was a monotonically increasing, negatively accelerated function of
rate of reinforcement over a range from 8.4 to 300 reinforcements per hour. The rate of re-
sponding also increased as time passed within the individual intervals of a given schedule.
In Exp. 2 and 3, several variable-interval schedules made up of different sequences of inter-
vals were examined. In each schedule, the rate of responding at a particular time within an
interval was shown to depend at least in part on the local rate of reinforcement at that time,
derived from a measure of the probability of reinforcement at that time and the proximity
of potential reinforcements at other times. The functional relationship between rate of re-
sponding and rate of reinforcement at different times within the intervals of a single schedule
was similar to that obtained across different schedules in Exp. 1. Experiments 4, 5, and 6
examined fixed-interval and two-valued (mixed fixed-interval fixed-interval) schedules, and
demonstrated that reinforcement at one time in an interval had substantial effects on respond-
ing maintained at other times. It was concluded that the rate of responding maintained by
a given interval schedule depends not on the overall rate of reinforcement provided but
rather on the summation of different local effects of reinforcement at different times within
intervals.

CONTENTS

Experiment 1: Rate of responding as a function of rate of reinforcement in arithmetic variable-interval schedules.
Experiment 2: Effects of a zero-sec interval in an arithmetic variable-interval schedule.
Experiment 3: Effects of the distribution of intervals in variable-interval schedules on changes in the local rate of responding within intervals.
Experiment 4: Overall and local rates of responding within three fixed-interval schedules.
Experiment 5: Effects of the separation in time of opportunities for reinforcement in two-valued interval schedules.
Experiment 6: Effects of the omission of reinforcement at the end of the long interval in two-valued interval schedules.
General Discussion.
Appendix I: Analysis in terms of interresponse times.
Appendix II: Constant-probability variable-interval schedules.

¹ This research was supported by NSF Grants G8621 and G18167 (B. F. Skinner, Principal Investigator) to Harvard University, and was conducted at the Harvard Psychological Laboratories. Some of the material has been presented at the 1961 and 1963 meetings of the Psychonomic Society. The authors' thanks go to Mrs. Antoinette C. Papp and Mr. Wallace R. Brown, Jr., for care of pigeons and assistance in the daily conduct of the experiments, and to Mrs. Geraldine Hansen for typing several revisions of the manuscript. We are indebted to many colleagues, and in particular to N. H. Azrin, who maintained responsibility for the manuscript well beyond the expiration of his editorial term, and to D. G. Anger, L. R. Gollub, and S. S. Pliskoff. Some expenses of preparation of the manuscript were defrayed by NSF Grant GB 3614 (to New York University), by NSF Grants GB 316 and GB 2541 (to the University of Chicago), and by the Smith Kline and French Laboratories. Expenses of publication were defrayed by NIH Grant MH 13613 (to New York University) and NSF Grants GB 5064 and GB 6821 (to the University of California, San Diego). Reprints may be obtained from A. C. Catania, Department of Psychology, New York University, University College of Arts and Sciences, New York, N.Y. 10453.

The statement that responses take place in time expresses a fundamental characteristic of behavior (Skinner, 1938, pp. 263-264). Responses occur at different rates, in different sequences, and with different temporal patterns, depending on the temporal relations between the responses and other events. One event of fundamental interest is reinforcement, and the rate at which responses occur and the changes in this rate over time are strongly determined by the schedule according to which particular responses are reinforced (e.g., Morse, 1966).

An interval schedule arranges reinforcement for the first response that occurs after a specified time has elapsed since the occurrence of a preceding reinforcement or some other environmental event (Ferster and Skinner, 1957). In such a schedule, the spacing of reinforcements in time remains roughly constant over a wide range of rates of responding. The schedule specifies certain minimum intervals between two reinforcements; the actual durations of these intervals are determined by the time elapsed between the availability of reinforcement, at the end of the interval, and the occurrence of the next response, which is reinforced. The patterns and rates of responding maintained by interval schedules usually are such that this time is short relative to the durations of the intervals.

Ferster and Skinner (1957, Ch. 5 and 6) have described in considerable detail some important features of the performances maintained by interval schedules. In a fixed-interval (FI) schedule, the first response after a fixed elapsed time is reinforced, and an organism typically responds little or not at all just after reinforcement, although responding increases later in the interval. In a variable-interval (VI) schedule, the first response after a variable elapsed time is reinforced, and a relatively constant rate of responding is maintained throughout each interval. Detailed examination shows, however, that this responding may be modulated by the particular durations of the different intervals that constitute the schedule. In other words, the distribution of responses in time depends on the distribution of reinforcements in time. For example, responding shortly after reinforcement increases with increases in the relative frequency of short intervals in the schedule (Ferster and Skinner, 1957, p. 331-332). Thus, it is important to study not only the rate of responding averaged over the total time in an interval schedule, but also the changes in the rate of responding as time passes within individual intervals. (The former, a rate calculated over the total time in all the intervals of a schedule, will be referred to as an overall rate; the latter, a rate calculated over a period of time that is short relative to the average interval between reinforcements, will be referred to as a local rate. The terms will be applied to reinforcement as well as to responding. The terminology has the advantage of pointing out that both reinforcement and responding are measured in terms of events per unit of time.)

In a VI schedule, a response at a given time after reinforcement is reinforced in some intervals but not in others. The probability of reinforcement at this time is determined by the relative frequency of reinforcement at this time, which may be derived from the distribution of intervals in the schedule. The distribution of intervals in a VI schedule may act upon behavior because the time elapsed since a preceding reinforcement (or since any other event that starts an interval) may function as a discriminable continuum. Skinner (1938, p. 263 ff.), in his discussion of temporal discrimination, included the discrimination of the time elapsed since reinforcement as a factor in his account of the performances maintained by FI schedules. The major difference between FI and VI schedules is that an FI schedule provides reinforcement at a fixed point along the temporal continuum, whereas a VI schedule provides reinforcement at several points. The present account analyzes performances maintained by different interval schedules in terms of the local effects of different probabilities of reinforcement on the local rates of responding at different times.

Within interval schedules, reinforcement may be studied as an input that determines a subsequent output of responses (cf. Skinner, 1938, p. 130). In this sense, the study of the performances maintained by interval schedules is a study of response strength. The concept of response strength, once a reference to an inferred response tendency or state, has evolved to a simpler usage: it is "used to designate probability or rate of responding" (Ferster and Skinner, 1957, p. 733). This evolution is a result of several related findings: that the schedule of reinforcement is a primary determinant of performance; that different measures of responding such as rate and resistance to extinction are not necessarily
highly correlated; that rate of responding is relatively insensitive to such variables as amount of reinforcement and deprivation; that rate of responding is itself a property of responding that can be differentially reinforced; and that rate of responding can be reduced to component interresponse times (e.g., Anger, 1956; Ferster and Skinner, 1957; Herrnstein, 1961; Skinner, 1938). Nevertheless, the relationship between reinforcement and responding remains of fundamental importance to the analysis of behavior. Many studies of response strength have been concerned with the acquisition of behavior (learning: e.g., Hull, 1943) or with the relative strengths of two or more responses (choice: e.g., Herrnstein, 1961). The present experiments emphasize reinforcement as it determines performance during maintained or steady-state responding, rather than during acquisition, extinction, and other transition states, and are concerned with absolute strength, rather than with strength relative to other behavior.

EXPERIMENT 1: RATE OF RESPONDING AS A FUNCTION OF RATE OF REINFORCEMENT IN VARIABLE-INTERVAL SCHEDULES

The relation between the overall rate of reinforcement and the overall rate of a pigeon's key-pecking maintained by interval schedules may be thought of as an input-output function for the pigeon. In Exp. 1, this function was determined for VI schedules over a range of overall rates of reinforcement from 8.4 to 300 rft/hr (reinforcements per hour). Each schedule consisted of an arithmetic series of 15 intervals ranging from zero to twice the average value of the schedule and arranged in an irregular order. Thus, the relative durations of the particular intervals that made up each schedule were held constant.

METHOD

Subjects and Apparatus

The key-pecking of each of six adult, male, White Carneaux pigeons, maintained at 80% of free-feeding body weights, had been reinforced on VI schedules for at least 50 hr before the present experiments.

The experimental chamber was similar to that described by Ferster and Skinner (1957). Mounted on one wall was a translucent Plexiglas response key, 2 cm in diameter and operated by a minimum force of about 15 g. The key was transilluminated by two yellow 6-w lamps. Two white 6-w lamps mounted on the chamber ceiling provided general illumination. The operation of the key occasionally produced the reinforcer, 4-sec access to mixed grain in a standard feeder located behind a 6.5-cm square opening beneath the key. During reinforcement, the feeder was illuminated and the other lights were turned off.

Electromechanical controlling and recording apparatus was located in a separate room. A device that advanced a loop of punched tape a constant distance with each operation (ratio programmer, R. Gerbrands Co.) was stepped by an electronic timer, and intervals between reinforcements were determined by the spacing of the holes punched in the tape. Thus, the absolute durations of the intervals depended on the rate at which the timer operated the programmer, but the relative durations were independent of the timer.

The punched holes in the tape provided a series of 15 intervals from an arithmetic progression, in the following order: 14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3. The numbers indicate the durations of the intervals between successive reinforcements in multiples of t sec, the setting of the electronic timer. (To permit the arrangement of a 0-sec interval, in which reinforcement was available for the first peck after a preceding reinforcement, the ratio programmer was stepped at each reinforcement as well as at the rate determined by the electronic timer.) In this series, the average interval of the VI schedule was 7t sec; with t equal to 6.5 sec, for example, the average interval was 45.5 sec.

At the end of each interval, when a peck was to be reinforced, the controlling apparatus stopped until the peck occurred; the next interval began only at the end of the 4-sec reinforcement. Thus, the apparatus arranged a distribution of minimum interreinforcement intervals; the actual intervals were given by the time from one reinforcement to the next reinforced response. In practice, the
rates of responding at most VI values were such that differences between the minimum and the actual interreinforcement intervals were negligible.

Stepping switches that stepped with each step of the ratio programmer and that reset after each reinforcement distributed key-pecks to the 14 counters, which represented successive periods of time after reinforcement. The time represented by each counter was t sec, and each counter recorded responses only within interreinforcement intervals equal to or longer than the time after reinforcement that the counter represented. For example, the first counter cumulated responses that occurred during the first t sec of all intervals except the 0-sec interval (the 0-sec interval was terminated by a single reinforced response). Correspondingly, the seventh counter cumulated responses during the seventh t sec of only those intervals 7t sec long or longer. The fourteenth counter cumulated responses only during the fourteenth t sec of the 14t-sec interval, the longest interval in the series. Thus, response rates at early times after reinforcement were based on larger samples of pecking than response rates at later times.

Procedure

Seven VI schedules with average intervals ranging from 12.0 to 427 sec (300 to 8.4 rft/hr) were examined. Each pigeon was exposed to VI 12.0-sec, VI 23.5-sec, and VI 45.5-sec, and to a sample of the longer average intervals, as indicated in Table 1. (Occasional sessions in which equipment failed have been omitted; none were within the last five sessions of a given schedule.) Each schedule was in effect for at least 15 daily sessions and until the pigeon's performance was stable, as judged by visual inspection of numerical data and cumulative records, for five successive sessions. With few exceptions, the rate of responding in each of the last five sessions of a given schedule was within 10% of the average rate over those sessions.

The first peck in each session was reinforced and the VI schedule then operated, beginning at a different place in the series of intervals in successive sessions. Thus, each scheduled interval, including the first in the session, began after a reinforcement. Sessions ended after each interval in the series had occurred four times (61 reinforcements). Thus, the duration of a session ranged from about 16 min (12 min of VI 12-sec plus 61 reinforcements) to about 431 min (427 min of VI 427-sec plus 61 reinforcements).

RESULTS

The overall rate of key-pecking as a function of the overall rate of reinforcement is shown for each pigeon in Fig. 1. The functions were, to a first approximation, monotonically increasing and negatively accelerated, perhaps approaching an asymptotic level for some pigeons. With increasing rates of reinforcement, the rate of responding increased more rapidly at low rates of reinforcement (for most pigeons, to roughly 50 rft/hr) than at higher rates of reinforcement. The shapes of the functions differed in detail from pigeon to pigeon: Pigeon 118, for example, produced a fairly smooth increasing function; Pigeon 121, an almost linear function; and Pigeons 278 and 279, a rapid increase to a near invariance in the rate of re-

Table 1

Mean intervals (sec) of the arithmetic variable-interval schedules arranged for each pigeon, with number of sessions for each schedule shown in parentheses.

Pigeon    118        121        129        278        279        281
          108 (52)   45.5 (52)  108 (29)   23.5 (35)  427 (52)   23.5 (35)
          45.5 (29)  23.5 (29)  216 (35)   12.0 (17)  216 (29)   45.5 (17)
          23.5 (22)  12.0 (58)  427 (29)   45.5 (29)  108 (22)   12.0 (29)
          12.0 (36)  108 (22)   23.5 (22)  216 (22)   23.5 (36)  427 (58)
          323 (37)   23.5 (15)  45.5 (36)  108 (36)   12.0 (22)  45.5 (22)
          108 (28)   12.0 (22)  45.5 (22)  45.5 (15)  12.0 (43)
          23.5 (15)  427 (15)   108 (29)
          108 (28)   427* (26)

* Reinstated after interruption.
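The series and session arithmetic described in the Method can be sketched as follows (a minimal illustration; the variable names and the helper function are my own, not part of the apparatus):

```python
# The 15-interval arithmetic series of Exp. 1, in multiples of t sec.
series = [14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3]
assert sum(series) / len(series) == 7          # mean interval = 7t sec

t = 6.5                                        # timer setting for VI 45.5-sec
mean_interval = 7 * t                          # 45.5 sec
rft_per_hour = 3600 / mean_interval            # about 79 rft/hr

# A session ran until each interval had occurred four times (60 scheduled
# intervals plus the reinforced first peck = 61 reinforcements), each
# reinforcement adding 4 sec of feeder access.
def session_minutes(mean_interval_sec, reinforcements=61, feeder_sec=4.0):
    return (60 * mean_interval_sec + reinforcements * feeder_sec) / 60.0

print(round(session_minutes(12.0)))    # about 16 min for VI 12-sec
print(round(session_minutes(427.0)))   # about 431 min for VI 427-sec
```

These values reproduce the session durations quoted in the Procedure (about 16 min for VI 12-sec, about 431 min for VI 427-sec).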
[Figure 1: rate of key-pecking as a function of reinforcements per hour, one panel per pigeon; representative reinforcement intervals (sec) appear on a reciprocal scale at the upper right.]
Fig. 1. Rate of key-pecking as a function of rate of reinforcement for six pigeons. Key-pecking was main-
tained by VI schedules consisting of 15 intervals in an arithmetic progression of size, but arranged in an ir-
regular order. Each point is the arithmetic mean of the rates of responding over the last five sessions of a given
schedule. Numerals 1 and 2 indicate first and second determinations. Some representative average interrein-
forcement intervals, proportional to reciprocals of the rates of reinforcement (rft/hr), are shown on the scale at
the upper right.
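The reciprocal scale in Fig. 1 reflects the fact that the overall rate of reinforcement is simply the reciprocal of the mean interreinforcement interval. A quick check of the seven schedules of Exp. 1 (a sketch; the variable name is my own):

```python
# Overall rate of reinforcement (rft/hr) for each mean interval (sec) in Exp. 1.
mean_intervals = [12.0, 23.5, 45.5, 108, 216, 323, 427]
for m in mean_intervals:
    print(f"VI {m}-sec: {3600 / m:.1f} rft/hr")
```

The values run from 300 rft/hr (VI 12.0-sec) down to about 8.4 rft/hr (VI 427-sec), matching the range quoted in the Procedure, and include the 153 and 79 rft/hr values cited in the Results.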
sponding. This near invariance might be called a "locked rate" (Herrnstein, 1955; Sidman, 1960), a term that has been applied to the occasionally observed insensitivity of a given pigeon's rate of responding to changes in the parameters of an interval schedule of reinforcement.

Despite the near invariance, the functions appear in general to increase over their entire range. (Reversals, as for Pigeon 129 at 33.3 rft/hr, were within the limits of variability implied by the redeterminations, which generally produced higher rates of responding than the original determinations.) The average rate of responding maintained by 300 rft/hr was higher than that maintained by 153 rft/hr for all pigeons. In addition, rates of responding at higher rates of reinforcement may be spuriously low, because the contribution of the latency of the first response after reinforcement to the overall rate of responding was greatest at the higher rates of reinforcement. A correction for this latency would slightly increase rates of responding at the higher rates of reinforcement (300, 153, and, perhaps, 79 rft/hr), but would have virtually no effect at the lower rates of reinforcement. Despite the small changes at high rates of reinforcement, it seems reasonable to conclude that overall rates of responding increase monotonically (perhaps approaching an asymptote) as overall rate of reinforcement increases.

Within individual intervals between two reinforcements, the rate of key-pecking increased with increasing time since reinforcement, as shown for each pigeon in Fig. 2, which plots local rates of responding against the absolute time elapsed since reinforcement. The functions reflect in their vertical separation the different overall rates maintained by each schedule (Fig. 1).

Data obtained with each arithmetic VI schedule for each pigeon are plotted against relative time since reinforcement in Fig. 3. The functions have been adjusted by multiplying local rates of responding by constants chosen to make the average rate of responding for each function equal to 1.0. When the differences in overall levels of the functions were removed by this adjustment, the local rate of responding within intervals grew as approximately the same function of relative time after reinforcement in most VI schedules studied with most pigeons. The major exceptions were some pigeons' data from the shorter VI schedules: Pigeon 121 at 12.0 and 23.5 sec; Pigeon 278 at 23.5 and 45.5 sec; and Pigeon 281 at 12.0 sec. It may be relevant that only in these schedules were rates of responding sometimes low enough to produce large differences between the minimum and actual interreinforcement intervals. For the remaining functions, there appeared to be no systematic ordering from one pigeon to another of the slopes or degrees of curvature of the several functions (see, however, Exp. 3, Discussion).

As with overall rates of responding (Fig. 1), the functions differed in detail from pigeon to pigeon, even if the atypical data from the shorter VI schedules are ignored. For a given pigeon, however, the functions in Fig. 1 and in Fig. 3 were generally similar: fairly smooth increasing functions for Pigeon 118, almost linear functions for Pigeon 121 except for data from the shorter VI schedules, and rapid increases to a near invariance for Pigeons 278 and 279. The similarity is debatable for Pigeon 281 even when the 12.0-sec function is disregarded, and no simple relationship is evident between the two sets of data for Pigeon 129. The possible significance of the similarities is that the same variables may have operated to produce changes in both the local rate of responding, as time passed within interreinforcement intervals, and in the overall rate of responding, when the overall rate of reinforcement was changed.

A cumulative record of the responding of Pigeon 118 is shown in Fig. 4. Upward concavity, which indicates an increasing rate of responding, is evident in almost every interval between reinforcements. The averaging of rates of responding across intervals assumed that there was no systematic change in the responding within intervals from one interval to another. No consistent sequential effects were evident in the cumulative records; if present, they constituted a relatively minor effect that, for the present purposes, will be ignored.

DISCUSSION

Overall rates of responding. Individual differences among pigeons were considerable, but the functions relating overall rate of responding to overall rate of reinforcement
[Figure 2: local rates of key-pecking plotted against time since reinforcement (seconds), one panel per pigeon, with functions labeled by mean interreinforcement interval (12.0 to 427 sec).]

Fig. 2. Rate of key-pecking as a function of time since reinforcement in several arithmetic VI schedules. The
function for each schedule, composed of averages of the local rates of responding over the last five sessions of
the schedule, is identified by the mean interreinforcement interval. Two of the 12.0-sec functions have been dis-
placed on the ordinate, as indicated by the inserted scales (Pigeons 278 and 279). For those schedules arranged
twice for a given pigeon, only one function, chosen on the basis of convenience of presentation, has been plotted.
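The counter scheme described in the Method (counter k pools pecks from the k-th t-sec period of every interval at least that long) can be sketched as follows; the function and the fabricated peck times are mine, for illustration only:

```python
# Local response rates from per-interval peck times, following the counter
# logic of Exp. 1: bin k pools only intervals at least (k+1) t sec long, so
# early bins rest on larger samples of pecking than late bins.
def local_rates(intervals, pecks_per_interval, t):
    """intervals: durations in multiples of t; pecks_per_interval: one list
    of peck times (sec since reinforcement) per interval."""
    n_bins = max(intervals)
    counts = [0] * n_bins
    samples = [0] * n_bins          # intervals contributing to each bin
    for dur, pecks in zip(intervals, pecks_per_interval):
        for k in range(dur):        # a dur-multiple interval spans bins 0..dur-1
            samples[k] += 1
            counts[k] += sum(1 for p in pecks if k * t <= p < (k + 1) * t)
    # local rate (responses per second) for each t-sec bin after reinforcement
    return [c / (s * t) if s else 0.0 for c, s in zip(counts, samples)]

series = [14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3]
# fabricated data: one peck per second in every interval -> flat local rate
pecks = [[i + 0.5 for i in range(d)] for d in series]
rates = local_rates(series, pecks, t=1.0)
```

Note that, as in the text, the 0-sec interval contributes to no bin, and the fourteenth bin is fed by the 14t-sec interval alone.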
[Figure 3: adjusted local rates of key-pecking plotted against time since reinforcement, relative to the average interreinforcement interval; VI values (sec) of 12, 23.5, 45.5, 108, 216, 323, and 427.]
Fig. 3. Rate of key-pecking, adjusted so that the average rate of pecking equals 1.0, as a function of relative
time since reinforcement in several arithmetic VI schedules. For those schedules arranged twice for a given pi-
geon, only the first determination has been plotted.
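The adjustment used for Fig. 3 multiplies each function's local rates by a single constant so that its average equals 1.0 (a sketch assuming an unweighted mean across bins; the rates below are fabricated):

```python
# Scale a list of local rates so that their average is 1.0, removing
# overall-level differences while preserving the relative temporal pattern.
def normalize(local_rates):
    mean = sum(local_rates) / len(local_rates)
    return [r / mean for r in local_rates]

adjusted = normalize([20.0, 30.0, 50.0, 60.0])   # fabricated rates (resp/min)
print(adjusted)   # [0.5, 0.75, 1.25, 1.5]
```

Dividing by a function's own mean leaves its shape intact, which is what allows functions from schedules with very different overall rates to be superimposed as in Fig. 3.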
[Figure 4: cumulative record of key-pecking (Pigeon 118, arithmetic VI 108-sec), with responses/min and 10-minute scales.]

Fig. 4. Cumulative record of a full session of key-pecking maintained by an arithmetic VI schedule with a
mean interreinforcement interval of 108-sec (Pigeon 118). The recording pen reset to baseline after each rein-
forcement, indicated by diagonal pips as at a, a reinforcement after a zero-sec interval. Curvature can be seen
most easily by foreshortening the figure.

were generally monotonically increasing and negatively accelerated. The general nature of this relationship is well supported by the literature on both VI and FI schedules. Both pigeons and rats have been studied in a variety of experimental contexts, usually over a narrower range of rates of reinforcement than was studied here. Monotonically increasing and negatively accelerated functions have been obtained from rats by Skinner (1936; data obtained early in the acquisition of FI performance), Wilson (1954; FI schedules), Clark (1958; VI schedules at several levels of deprivation), and Sherman (1959; FI schedules). The same relationship may hold for schedules of negative reinforcement (Kaplan, 1952; FI schedules of escape). Similar functions have been obtained from pigeons by Schoenfeld and Cumming (1960) and by Farmer (1963), whose data are discussed later (General Discussion). Other data have been obtained from pigeons by Cumming (1955) and by Ferster and Skinner (1957). In Cumming's experiment, rates of responding did not increase monotonically with rates of reinforcement, but rates of responding may not have reached asymptotic levels and the VI schedules alternated with a stimulus-correlated period of extinction. Ferster and Skinner presented data in the form of cumulative records selected to show detailed characteristics of responding; the data therefore were not necessarily representative of the overall rates of responding maintained by each schedule.

Monotonically increasing and negatively accelerated functions relating total responding to total reinforcement in concurrent schedules (Findley, 1958; Herrnstein, 1961;
Catania, 1963a), in which VI schedules were independently arranged for pigeons' pecks on two different keys, have been discussed by Catania (1963a). Additional data are provided by experiments with chained schedules (Autor, 1960; Findley, 1962; Nevin, 1964; Herrnstein, 1964), in which reinforcement of responding in the presence of one stimulus consists of the onset of another stimulus in the presence of which another schedule of reinforcement is arranged (cf. the review by Kelleher and Gollub, 1962).

Evidence for substantial individual differences among pigeons has been noted in the literature. Herrnstein (1955), for example, varied the overall rate of reinforcement provided by VI schedules in an experiment concerned with the effect of stimuli preceding a period of timeout from VI reinforcement. Monotonically increasing, negatively accelerated functions were obtained from two pigeons (S1, 6 to 120 rft/hr, and S3, 6 to 60 rft/hr), but the third pigeon's rate of responding was roughly constant over the range of reinforcement rates studied (S2, 6 to 40 rft/hr: this pigeon provided the basis for a discussion of "locked rate"). Individual differences among pigeons were also observed by Reynolds (1961, 1963), who obtained monotonically increasing, negatively accelerated functions when different VI schedules in the presence of one stimulus were alternated with a constant VI schedule in the presence of a second stimulus (multiple schedules).

The derivation of a mathematical function describing the relationship between reinforcement and responding for all pigeons is complicated by the idiosyncratic character of each pigeon's data, particularly if the functions are restricted to those involving simple transformations of the ordinate and/or abscissa and are limited in the number of arbitrary constants. In an earlier version of this paper (Reynolds and Catania, 1961; Catania and Reynolds, 1963), a power function was proposed, on the basis of a fit to average data for the group of pigeons (see also Catania, 1963a). This function, of the form: R = kr^0.2, where R is rate of responding, r is rate of reinforcement, and k is a constant depending on the units of measurement, was chosen in preference to a logarithmic function, of the form: R = k log r + n, where n is a constant and the other symbols are as above. The choice between these two functions was based more on logical considerations, i.e., that rate of responding should approach zero as rate of reinforcement approaches zero, than on the superiority of the fit of the power function to the data. This mathematical representation, however, does not provide an adequate fit to the data from individual pigeons. Fits to data from individual pigeons are possible (cf. Norman, 1966), but they are not essential for the present purposes and will not be considered further here.

Local rates of responding. It has been noted (Results) that the idiosyncratic characteristics of the present data from each pigeon were reflected, to some extent, in the changes in the local rate of responding with the passage of time since reinforcement. This relationship is not mathematically determined; a given overall rate of responding could have been produced by a variety of different temporal distributions of responses within the intervals of a given schedule. Aside from a few atypical functions at high rates of reinforcement, local rates of responding generally increased monotonically as time passed since reinforcement (Fig. 3). For a given pigeon, the adjusted local rates of responding at different relative times after reinforcement remained roughly invariant over a wide range of overall rates of reinforcement.

The changes in local rates of responding cannot be accounted for solely in terms of time since reinforcement. The distribution of responses throughout a given period of time since reinforcement can be manipulated within VI schedules by changing the distribution of intervals (e.g., from an arithmetic to a geometric progression of intervals; Ferster and Skinner, 1957). A variable that may operate together with time since reinforcement, however, is probability of reinforcement or some derivative of this probability. If a responding organism reaches a time after reinforcement equal to the longest interval in a VI schedule, the probability that the next response will be reinforced is 1.0. If, however, the organism has not yet reached that time, the probability is less than 1.0, and depends on the number of intervals that end at or after the time that the organism has reached. In the present arithmetic VI schedules, therefore, probability of reinforcement increased as time passed since reinforcement.
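This probability argument can be sketched for the arithmetic series of Exp. 1; the function below is my own formalization, counting the intervals that end at a given elapsed time among all intervals lasting at least that long:

```python
series = [14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3]

def p_reinforcement(series, k):
    """Probability that reinforcement is scheduled at elapsed time k*t,
    given that time k*t has been reached without reinforcement."""
    reachable = [d for d in series if d >= k]   # intervals not yet ended
    ending = [d for d in reachable if d == k]   # intervals ending now
    return len(ending) / len(reachable)

probs = [p_reinforcement(series, k) for k in range(15)]
# With one interval of each length 0..14, this yields 1/15, 1/14, ..., 1/2, 1:
# the probability of reinforcement rises monotonically with time since
# reinforcement, reaching 1.0 at the longest interval in the series.
```

This reproduces the two limiting cases stated in the text: a probability below 1.0 before the longest interval has been reached, and 1.0 once the elapsed time equals the longest interval.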
The calculation of probability of reinforcement is considered in greater detail in Exp. 3, in which the probability of reinforcement was explicitly manipulated. It is sufficient to note here that both probability of reinforcement and local rates of responding increased as time elapsed since reinforcement. The overall-rate functions (Fig. 1) and the local-rate functions (Fig. 3) may therefore be similar because the changes in the overall rate of reinforcement provided by an interval schedule also changed the probability of reinforcement for responses within any fixed period of time. Thus, the overall- and the local-rate functions may depend on the same relationship between probability of reinforcement and subsequent responding.

Relationship between overall and local rates of responding. This relationship between local and overall rates of responding suggests that a given overall rate of responding may not be determined directly by an overall rate of reinforcement. Rather, a schedule may produce a given overall rate of responding through its effects on local rates of responding at different times after reinforcement. The way in which local rates of responding contribute to overall rates of responding must therefore be considered.

An overall rate of responding is a weighted average of the local rates of responding at successive times after reinforcement. The early times after reinforcement are weighted more heavily than the later times because the early times represent a larger proportion of the total time in the schedule. For example, within the first t sec after reinforcement in the arithmetic VI schedules, responding was possible 14 times as often as within the last t sec (first and last points on each function in Fig. 2 and 3; cf. Method). Thus, a consistent change in the local rate of responding early after reinforcement would produce a greater change in the overall rate of responding than the same consistent change late after reinforcement. An alternative measure, therefore, is the average of the successive local rates of responding maintained by a particular schedule (e.g., the average of all the points on a given function in Fig. 2), because this measure does not weight early local rates more heavily.

When local-rate functions are similar at different rates of reinforcement (as to a first approximation for Pigeons 118, 129, and 279 in Fig. 3), the substitution of average local rate for overall rate of responding does not alter the functional relation between rate of responding and overall rate of reinforcement (Fig. 1); the average local rates and the overall rates of responding will differ slightly, by a multiplicative constant. This is not necessarily the case, however, when the local-rate functions are dissimilar. For example, in the 12.0-sec and 23.5-sec functions for Pigeon 121, the 23.5-sec function for Pigeon 278, and the 12.0-sec function for Pigeon 281 in Fig. 3, the local rates of responding shortly after reinforcement were relatively low compared to the local rates within other schedules for the same pigeons. The values of t in the 12.0-sec and 23.5-sec VI schedules were roughly 1.7 and 3.4 sec, respectively, and although rates of responding were high, occasional short pauses that occurred immediately after reinforcement reduced the number of responses in the early t-sec periods after reinforcement. Because these pauses were weighted more heavily in the overall rate of responding than in the average local rate, the overall rate was lower, relative to the average local rate, in these than in the remaining schedules. Conversely, the local rate of responding was relatively high after reinforcement for Pigeon 278 at VI 45.5-sec (Fig. 3), and the overall rate was higher, relative to the average local rate, in this than in the remaining schedules.

Figure 3 shows data from the initial determination of performance on each schedule. In three of the above cases (Pigeon 121 at VI 23.5-sec, Pigeon 281 at VI 12.0-sec, and Pigeon 278 at VI 45.5-sec), data from a redetermination were available. The redetermined local-rate functions (not shown in Fig. 3) deviated considerably less from other local-rate functions for the same pigeon than did the initial local-rate functions. These three cases represent three of the four largest discrepancies between initial and redetermined overall rates of responding (see Fig. 1), and it is of interest that the three discrepancies are each reduced by about 5 resp/min if initial and redetermined average local rates of responding are substituted for initial and redetermined overall rates of responding.

This observation is consistent with the assumption that the overall rate of responding is not directly determined by an overall rate of reinforcement. Reinforcement does not
produce a reserve of responses that are emitted irrespective of their distribution in time. Rather, a given rate of reinforcement produces a given overall rate of responding through its effects on local rates of responding at different times after reinforcement. The experiments that follow consider the effects of reinforcement on local rates of responding in detail, by varying the distribution of intervals in VI schedules.

EXPERIMENT 2: EFFECTS OF A ZERO-SEC INTERVAL IN AN ARITHMETIC VARIABLE-INTERVAL SCHEDULE

In the schedules of Exp. 1, the first response after a reinforcement was reinforced in one of every 15 intervals. This 0-sec interval may have had an effect on responding both immediately after reinforcement and later. The present experiment directly compared local rates of responding maintained by arithmetic VI schedules with and without a 0-sec interval. One consequence of the 0-sec interval was that the reinforced response was preceded by a latency (timed from the end of reinforcement) whereas the reinforced response in other intervals typically was preceded by an interresponse time (timed from the preceding response).

METHOD

Subjects and Apparatus

In sessions preceding Exp. 1, Pigeons 278 and 281 were exposed to two arithmetic VI schedules in the apparatus described previously. To permit a detailed examination of responding shortly after reinforcement, the recording circuitry cumulated responses separately during successive thirds of the first and second t sec of each interval.

Procedure

In the first schedule, t sec were added to each interreinforcement interval of the arithmetic VI schedule of Exp. 1. This increased the mean interval from 7t to 8t sec and eliminated the 0t-sec interval; the shortest interval in the schedule was 1t sec. The second schedule was the same as the schedule of Exp. 1; it included the 0-sec interval. With t equal to about 15.4 sec, the first schedule was arranged for 29 sessions (arithmetic VI 123-sec with no 0-sec interval) and the second schedule for 21 sessions (arithmetic VI 108-sec with 0-sec interval).

RESULTS

Local rate of responding as a function of absolute time since reinforcement is shown in Fig. 5. The first six points on each function represent local rates during successive thirds of the first and second t sec after reinforcement. For Pigeon 278, rates of responding remained roughly constant after about 50 sec since reinforcement in both schedules, and for Pigeon 281, rates of responding gradually increased up to the longest time since reinforcement in both schedules (cf. Fig. 2 and 3).

Fig. 5. Rate of key-pecking as a function of time since reinforcement in arithmetic VI schedules with and without a 0-sec interreinforcement interval.

The effect of the 0-sec interval was restricted primarily to the time shortly after reinforcement (first two or three points on each function). Relative to the schedule with no 0-sec interval, the 0-sec interval added a larger increment to the local rate of responding than would have been produced if only a single response
immediately after reinforcement was added to each interval. One additional response at the beginning of each interval would have raised the local rate of responding immediately after reinforcement by about 12 resp/min and would have had no effect on subsequent local rates of responding. The actual increment was of the order of 30 resp/min and persisted to some extent in subsequent local rates of responding (second and third points on each function).

The overall rates of responding maintained by the VI 123-sec and VI 108-sec schedules were 63.7 and 63.2 resp/min for Pigeon 278 and 61.1 and 73.7 resp/min for Pigeon 281. Thus, the overall rate of responding was higher for the schedule with no 0-sec interval for Pigeon 278 and lower for Pigeon 281. In Fig. 5, the difference is exaggerated for Pigeon 278 because the figure does not reflect the relatively large contribution of the high rate of responding shortly after reinforcement to the overall rate of responding maintained by the schedule with the 0-sec interval (Exp. 1, Discussion). The schedule with the 0-sec interval (VI 108-sec) provided about 5 rft/hr more than the schedule without the 0-sec interval (VI 123-sec), but the magnitude of the reversal for Pigeon 278 was well within the limits of variability suggested by the redeterminations in Fig. 1.

DISCUSSION

The increment in the local rate of responding immediately after reinforcement suggests that the 0-sec interval affected both the latency of the first response after reinforcement and the local rate of responding shortly after reinforcement. Continued exposure to the schedules with the 0-sec interval might have reduced the size of the increment, because first responses after reinforcement were occasionally reinforced whereas second and third responses were never reinforced. One factor that could have counteracted this effect was the occasional reinforcement of responses that followed the first response (about 15 sec later, at the end of the t-sec interval). Another possibility was that the reinforced first response in the 0-sec interval occasionally may have been preceded by a peck of insufficient force to operate the response key, with an effect on subsequent behavior equivalent to the reinforcement of a second peck after reinforcement. Only pecks of sufficient force, however, produced a feedback click, and the feedback presumably contributed to differentiation of the force of pecks over the course of the present experiment.

The fairly localized effect in time of the 0-sec interval suggests that it is reasonable to compare local rates of responding maintained at later times after reinforcement in VI schedules with different distributions of intervals even if some, but not all, of the schedules include a 0-sec interval. Such comparisons are made in Exp. 3, although the data presented here are limited. The data also suggest that reinforcement of the first response in each session, common to both schedules in Fig. 5, had at best a small effect on responding early in intervals compared to the effect of reinforcement of the first peck after a reinforcement in 0-sec intervals within the session.

EXPERIMENT 3: EFFECTS OF THE DISTRIBUTION OF INTERVALS IN VARIABLE-INTERVAL SCHEDULES ON CHANGES IN THE LOCAL RATE OF RESPONDING WITHIN INTERVALS

Experiment 1 demonstrated that local rate of responding increased as time passed since reinforcement in an arithmetic VI schedule. Evidence in the literature, however, demonstrates that VI schedules with other distributions of intervals have different effects. For example, Ferster and Skinner (1957) showed that local rates of responding decreased as time passed since reinforcement in a VI schedule in which the durations of intervals were derived from a geometric progression. Their demonstration that different distributions of intervals differently affect local rates of responding indicates that local rates are not controlled solely by time since reinforcement. The present experiment examined another variable: the probability of reinforcement at different times since reinforcement, which is determined by the distribution of intervals in a VI schedule.

The present treatment defines the probability of reinforcement as a relative frequency: the number of times the first peck is reinforced after a particular time since reinforcement divided by the number of opportunities for a peck after that time. This
statistic will be called reinforcements per opportunity (rft/op) by analogy to Anger's measure of response probability, interresponse times per opportunity (IRT/op; Anger, 1956). The method of calculation is illustrated in Fig. 6, which diagrammatically shows an arbitrary schedule with intervals of 0, 20, 20, 60, 120, and 200 sec. The intervals are arranged in order of size, although they would be arranged in an irregular order in practice. The first peck after a reinforcement is reinforced in the shortest interval but not in any of the remaining five intervals. The probability that this peck will be reinforced is therefore one-sixth (0.17). When the peck is reinforced, in the 0-sec interval, the reinforcement terminates the interval and serves as the starting point for another interval. When the peck is not reinforced, in the remaining five intervals, the probability of reinforcement for subsequent pecks becomes zero until the end of the next longer interval. In the example, the next opportunity for reinforcement occurs at 20 sec, when two of the five remaining intervals end. Thus, the first peck after 20 sec is reinforced on two of five opportunities, or with a probability of 0.40. Similarly, the first peck after 60 sec is reinforced with a probability of 0.33, the first peck after 120 sec with a probability of 0.50, and the first peck after 200 sec with a probability of 1.0. As Fig. 6 illustrates, the statistic can be calculated by dividing the number of intervals that end at a given time after reinforcement by the number that end at that time or later. (Reinforcements per opportunity is defined as the probability of reinforcement for the first response that occurs after a particular time since reinforcement. For convenience, the present discussion sometimes refers to the probability of reinforcement at a particular time.)

Reinforcements per opportunity rests on the assumption that, except at reinforcement, the organism cannot discriminate between a given time since reinforcement in one interval and the same time since reinforcement in an interval of different duration (e.g., such discrimination could be based on the sequence of intervals). Another assumption is that the organism responds rapidly enough, when reinforcement becomes available at the end of one interval, to emit the reinforced response before the time at which the next longer interval ends. For example, the probabilities of reinforcement for the first peck after reinforcement (in the 0-sec interval) and at 20 sec would not be separable if responses never occurred before 25 sec; the relevant probability of reinforcement would be 0.50 for both intervals. In most VI schedules, the rate of responding is high enough, relative to the time separating successive opportunities for reinforcement, to avoid violating this assumption (see, however, VI 12.0-sec and VI 23.5-sec in Exp. 1).

Probability of reinforcement does not necessarily increase monotonically as time passes since reinforcement. In Fig. 6, for example, the probability decreases from 0.40 at 20 sec to 0.33 at 60 sec, and then increases to 0.50 and 1.0 at later times after reinforcement. Each probability, however, occurs at a discrete point in time. The statistic is greater than zero only at times after reinforcement when intervals in the schedule end. An account of performance in terms of probability of reinforcement also must deal with other times, when the probability is zero. In addition, reinforcements per opportunity is independent of the absolute values on the time scale for an interval schedule. Probabilities would be unaffected, for example, if the values of the time scale of Fig. 6 were multiplied by 100. Because performance presumably would be different after this change, probability of reinforcement alone is probably not a sufficient determinant of performance; absolute durations must be taken into account by converting probabilities to local rates of reinforcement. Figure 6 illustrates a technique for computing such local rates. An opportunity for reinforcement is defined as a point on the continuum of time since reinforcement at which the probability of reinforcement is greater than zero, or at which at least one interval ends. The time over which a particular probability of reinforcement is assumed to be effective is arbitrarily taken as the time ranging from halfway back to the preceding reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement. This procedure takes into account the observation that a probability of reinforcement at a particular time since reinforcement can affect responding at both earlier and later times.

Consider, for example, the opportunity at 60 sec in Fig. 6.
Fig. 6. Schematic presentation of a VI schedule illustrating the computation of two statistics discussed in the text. The upper part of the figure shows the six interreinforcement intervals of the schedule in order of size: 0-sec, 20-sec, 20-sec, 60-sec, 120-sec, 200-sec. Each interval is shown starting from a preceding reinforcement (rft). The first statistic, reinforcements per opportunity (rft/op), is a measure of probability of reinforcement: the number of occasions that reinforcement becomes available at a particular time since reinforcement divided by the number of occasions that the time since reinforcement is reached (e.g., reinforcement is available at 20 sec on two of five occasions). The second statistic, reinforcements per second (rft/sec), is a measure of local rate of reinforcement: the number of reinforcements within a particular period of time since reinforcement divided by the number of seconds spent in that period of time. The periods of time since reinforcement are arbitrarily taken as centered at a given opportunity for reinforcement and extending halfway back to the preceding reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement (e.g., for the two reinforcements at 20 sec, the periods of time marked B and C: five 10-sec periods and three 20-sec periods for a total of 110 sec). In the figure, the rft/sec values at the successive opportunities are 0.0200, 0.0182, 0.0083, 0.0100, and 0.0250.

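The two statistics illustrated in Fig. 6 can be reconstructed computationally. The following sketch is ours, for illustration only (the function names are hypothetical, not the paper's); it applies the calculation rules stated in the text and caption to the example schedule:

```python
# Reconstruction (ours) of the two statistics in Fig. 6 for the example
# schedule with interreinforcement intervals of 0, 20, 20, 60, 120, 200 sec.

intervals = [0, 20, 20, 60, 120, 200]

def rft_per_op(intervals):
    """Reinforcements per opportunity: the number of intervals ending at
    time T divided by the number ending at T or later."""
    return {T: sum(i == T for i in intervals) / sum(i >= T for i in intervals)
            for T in sorted(set(intervals))}

def rft_per_sec(intervals):
    """Local rate of reinforcement: reinforcements arranged at an opportunity
    divided by the time spent in the period extending halfway back to the
    preceding opportunity (or reinforcement) and halfway forward to the next
    (or to the end of the longest interval)."""
    times = sorted(set(intervals))
    rates = {}
    for j, T in enumerate(times):
        start = (T + (times[j - 1] if j > 0 else 0)) / 2
        end = (T + (times[j + 1] if j + 1 < len(times) else T)) / 2
        # Time spent in [start, end], summed over one full cycle of the
        # schedule; an interval contributes only while it is still running.
        spent = sum(max(0.0, min(i, end) - start) for i in intervals)
        rates[T] = sum(i == T for i in intervals) / spent
    return rates
```

For this schedule, `rft_per_op` yields 1/6, 2/5, 1/3, 1/2, and 1 at 0, 20, 60, 120, and 200 sec, and `rft_per_sec` reproduces the 0.0200, 0.0182, 0.0083, 0.0100, and 0.0250 rft/sec shown in the figure.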
The time over which this probability of reinforcement is considered effective is marked D and E (halfway back to the opportunity at 20 sec and halfway forward to the opportunity at 120 sec). The organism spends 120 sec within the period of time represented by D and E in each full sampling of the six intervals, and one reinforcement is arranged, at the end of the 60-sec interval. In other words, the rate of reinforcement within this period is one reinforcement per 120 sec (0.0083 rft/sec). Correspondingly, the local rate of reinforcement at 20 sec is given by the number of reinforcements arranged at 20 sec divided by the time spent in periods B and C: this local rate is two reinforcements per 110 sec (0.0182 rft/sec). For the opportunity at 0 sec, which immediately follows a reinforcement, the local rate of reinforcement is based only on time period A. For the opportunity at 200 sec, after which a peck always terminates the interval with reinforcement, the local rate of reinforcement is based only on time period H. With this calculation, the local rates of reinforcement at 0 and 200 sec after reinforcement are almost equal, whereas the probabilities of reinforcement at these times differ by a factor of six (0.166 and 1.0). Other plausible techniques for assigning time to successive opportunities
for the purpose of calculating local rates of reinforcement are possible, such as bisection of the time interval separating two successive opportunities using the geometric rather than the arithmetic mean, or the assignment to a given opportunity of the time since the last opportunity. The present technique, though arbitrary, seems to involve the simplest ad hoc assumptions.

To recapitulate, reinforcements per opportunity expresses a conditional probability: the probability that the pigeon's response will be reinforced, given that the pigeon has reached a certain time since the last reinforcement. Defined in this way, the statistic does not take into account the separation in time of different opportunities (ends of intervals). By taking into account the temporal separation of successive opportunities, probabilities of reinforcement can be converted into local rates of reinforcement.

The present experiments compared five VI schedules, each providing roughly the same overall rate of reinforcement (rft/hr) but with different distributions of intervals. One schedule was the arithmetic VI schedule of Exp. 1. Two of the other four schedules differed from the arithmetic VI schedule primarily by including extra short intervals. The extra short intervals produced a higher probability of reinforcement shortly after reinforcement than was produced at the same time after reinforcement in the arithmetic VI schedule. In another schedule, the distribution of intervals was such that the probability of reinforcement, given an opportunity for reinforcement, was roughly a linearly increasing function of the time since reinforcement. In the last schedule, the probability of reinforcement was held roughly constant over most of the range of time since reinforcement. The relationships between local rates of responding and the probabilities and local rates of reinforcement were examined within each schedule.

METHOD

Apparatus

The apparatus was as described in Exp. 1 and 2. The recording circuitry subdivided the first and second t sec of each interval so that responses were cumulated separately during successive thirds of these time periods. In addition, the constant-probability schedule, described in detail below, included interreinforcement intervals longer than those in other schedules. For this schedule, therefore, the last three counters grouped responses in the twelfth and thirteenth, the fourteenth and fifteenth, and the sixteenth and seventeenth t sec after reinforcement.

Subjects and Procedure

Four of the six pigeons of Exp. 1 were each assigned an average interreinforcement interval: for Pigeons 118 and 129, VI 108-sec (33.3 rft/hr); for Pigeon 278, VI 427-sec (8.4 rft/hr); and for Pigeon 279, VI 45.5-sec (79 rft/hr). The arithmetic VI schedule of Exp. 1 was then compared with the four other VI schedules, the component t-sec intervals of which are indicated in Table 2. The table lists the schedules in the order in which they were presented. Each session consisted of 61 reinforcements. The sessions of the arithmetic VI schedule were the last sessions of Exp. 1 except for Pigeon 279, for which 29 sessions of arithmetic VI 108-sec intervened between 15 sessions of arithmetic VI 45.5-sec and this pigeon's other schedules in the present experiment (cf. Table 1).

Schedules were changed only after the performance of each pigeon had been stable over a period of at least two weeks. Some schedules were continued for a large number of sessions so that long-term stability of the performances could be examined. Data from this experiment are averages over the last five sessions of each schedule.

In making up the distribution of interreinforcement intervals for the constant-probability VI schedule, it was not convenient to match the mean interval to that of the other VI schedules. Thus, the constant-probability schedule was VI 79-sec (45.5 rft/hr) for Pigeons 118 and 129, VI 379-sec (9.5 rft/hr) for Pigeon 278, and VI 40.5-sec (89 rft/hr) for Pigeon 279.

RESULTS

Two kinds of graphs summarize the VI schedules. Probability of reinforcement (rft/op) plotted as a function of time since reinforcement describes the schedule, and local rate of pecking plotted as a function of time since reinforcement describes the performance maintained by the schedule.
Table 2
Sequence of minimum interreinforcement intervals, mean interval, and number of sessions for five variable-interval schedules. Interreinforcement intervals are expressed in terms of the number of t-sec steps from one reinforcement to the next opportunity for reinforcement.

Schedule | Sequence of Intervals | Mean | Sessions
Arithmetic | 14, 8, 11, 6, 5, 9, 2, 13, 7, 1, 12, 4, 10, 0, 3 | 7 | 28*
Extra short interval, I | 14, 8, 11, 6, 5, 9, 2, 12, 7, 1, 12, 4, 10, 1, 3 | 7 | 109
"Linear" | 13, 10, 10, 7, 4, 7, 7, 4, 10, 13, 1, 4, 10, 7, 10, 4, 7, 7, 10, 7, 7, 10, 4, 7, 10, 4, 1, 7, 4, 4 | 7 | 37
Extra short interval, II | 12, 1, 4, 13, 10, 1, 8, 11, 1, 14, 2, 1, 7, 14, 6 | 7 | 95
Constant probability (rft/op = 0.1) | 2, 10, 6, 17, 3, 5, 14, 3, 8, 15, 1, 13, 10, 9, 2, 3, 8, 2, 1, 2, 11, 5, 16, 9, 17, 6, 17, 7, 3, 4, 16, 1, 4, 17, 1, 7, 16, 12, 17, 8, 1, 4, 2, 16, 12, 13, 17, 3, 5, 7, 6, 11, 4, 1, 6, 14, 9, 16, 5, 15 | 8.25 | 127

*For Pigeon 278, 26 sessions; for Pigeon 279, 15 sessions (see text).
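As an illustrative check (our sketch, not part of the original report; variable names are ours), the constant-probability sequence in Table 2 can be verified directly: its mean is 8.25 t-sec steps, and reinforcements per opportunity stays near 0.10 until the last few steps:

```python
# Sketch (ours): rft/op for the constant-probability sequence of Table 2,
# with intervals expressed as numbers of t-sec steps.

constant_prob = [2, 10, 6, 17, 3, 5, 14, 3, 8, 15,
                 1, 13, 10, 9, 2, 3, 8, 2, 1, 2,
                 11, 5, 16, 9, 17, 6, 17, 7, 3, 4,
                 16, 1, 4, 17, 1, 7, 16, 12, 17, 8,
                 1, 4, 2, 16, 12, 13, 17, 3, 5, 7,
                 6, 11, 4, 1, 6, 14, 9, 16, 5, 15]

mean_steps = sum(constant_prob) / len(constant_prob)   # 8.25, as in Table 2

# Intervals ending at step k divided by intervals ending at step k or later.
rft_op = {k: sum(i == k for i in constant_prob) / sum(i >= k for i in constant_prob)
          for k in range(1, 18)}
# rft_op stays within 0.10 +/- 0.02 through step 13 and then necessarily
# climbs toward 1.0 at step 17, the longest interval.
```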

Fig. 7. Probability of reinforcement (upper frames) and the rate of Pigeon 278's key-pecking (lower frames)
as a function of relative time since reinforcement in each of three VI schedules. The schedules differed in the
number of short interreinforcement intervals and therefore in the probability of reinforcement (reinforce-
ments per opportunity) shortly after reinforcement. Two different times after reinforcement at which prob-
abilities of reinforcement were equal are indicated by arrows (extra short I and II). Dashed horizontal lines
show the overall rates of key-pecking maintained by each schedule.
Extra-short-interval schedules. The arithmetic VI schedule and the two schedules with extra short intervals are described in the upper frames of Fig. 7. The arithmetic VI schedule arranged monotonically increasing probabilities of reinforcement (rft/op) at the ends of successive t-sec periods of time after reinforcement. The other two schedules (labeled extra short I and II) included extra t-sec intervals, and therefore provided a higher probability of reinforcement at t sec. In these schedules, the 0-sec interval was omitted (see Exp. 2) and the probabilities at later times also were changed from those in the arithmetic VI schedule, so that the three schedules had equal average values (see Table 2). The data supported the assumption that the changes later after reinforcement would have minor effects compared to those produced by the addition of more short intervals.

The lower frames in Fig. 7, for Pigeon 278, show that the arithmetic VI schedule maintained local rates of responding that increased as time passed since reinforcement (cf. Fig. 2), and that the two schedules with the extra short intervals maintained higher local rates of key-pecking at t sec after reinforcement than did the arithmetic VI schedule. A smaller increment was generated by the schedule with two intervals that ended at t sec after reinforcement (extra short I) than by the schedule with four intervals that ended at t sec (extra short II). Thus, the local rate of responding at t sec depended on the probability of reinforcement at that time. Some independence of the effect of probability of reinforcement from time since reinforcement is suggested by the rates of responding later after reinforcement when the probability of reinforcement was roughly the same as that at t sec (arrows; the later probabilities did not actually occur in the schedules but were interpolated from the adjacent non-zero probabilities).

Figure 8 shows the performances of the other three pigeons (118, 129, and 279) in the arithmetic and the extra-short-interval schedules. The local rates of pecking plotted against time since reinforcement are similar to those of Pigeon 278 in Fig. 7. The rate of pecking at t sec after reinforcement increased with the probability of reinforcement at t sec. The one exception was that only the second extra-short-interval schedule produced an increment in the rate at t sec relative to that in the arithmetic VI schedule for Pigeon 129.

For all pigeons, the differences between the first points on each function can be attributed to the inclusion of a 0-sec interval in the arithmetic but not in the VI schedules with extra short intervals.

The overall rates of reinforcement were the same in each of the three schedules. The dashed horizontal lines in Fig. 7 and 8 show the overall rates of responding maintained by each schedule. The addition of extra short intervals produced increments in the overall rate of responding for Pigeon 278, but did not systematically affect overall rates of responding for the other three pigeons.

"Linear" schedule. The schedule in which probability of reinforcement was roughly linearly related to time since reinforcement ("linear" schedule) is compared with the arithmetic VI schedule in the upper frame of Fig. 9. In the "linear" schedule, non-zero probabilities of reinforcement occurred at only five discrete times after reinforcement, but the probabilities of reinforcement at successive opportunities increased more rapidly than in the arithmetic VI schedule.

The lower frame of Fig. 9 shows the performance of Pigeon 278. The local rate of responding increased as time passed since reinforcement in both the "linear" and the arithmetic VI schedules. Overall rate was higher in the "linear" than in the arithmetic VI schedule. The data for the other three pigeons are shown in Fig. 10. For Pigeons 129 and 279, the rate of key-pecking increased over time since reinforcement within both schedules, and for Pigeon 118, the rate of key-pecking decreased at later times after reinforcement in the "linear" schedule. For these three pigeons, overall rate was lower in the "linear" than in the arithmetic VI schedule.

The general similarity of the performances maintained by the arithmetic and "linear" schedules, given the considerable differences in the probabilities of reinforcement at particular times, appears inconsistent with the findings obtained with the schedules containing the extra short intervals. But the differences between these and arithmetic VI schedules were primarily in the probabilities of reinforcement shortly after reinforcement. These comparisons suggest, therefore, that the effect of a change in the probability of reinforcement
Fig. 8. Data from three VI schedules for three additional pigeons. Details as in Fig. 7.

forcement may depend on the time since re- specifications of Fleshler and Hoffman (1962),
inforcement or on the separation of different both of which are discussed in detail in Ap-
probabilities of reinforcement along the con- pendix II. In the present constant-probability
tinuum of time since the previous reinforce- VI schedule, shown in the upper frame of
ment. Fig. 11, the probability of reinforcement re-
Constant-probability schedule. The effects of mained equal to 0.10 + 0.02 over a range of
a roughly constant probability of reinforce- time since reinforcement within which, in the
ment over most of the range of time since arithmetic VI schedule, this probability in-
reinforcement were examined in a schedule creased almost five-fold, from 0.07 to 0.33. At
related to the random-interval schedules of the very late times since reinforcement, the
Farmer (1963) and Millenson (1963), and to probability of reinforcement necessarily in-
the constant-probability interval schedule ar- creased, because the series had to contain a
ranged by Chorney (1960) according to the longest interval.
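The Fleshler and Hoffman style of constant-probability series can be sketched numerically. The function below is our illustration of the construction described later in the Discussion (divide the exponential distribution of interreinforcement times into equal-area class intervals and take the mean of each); it is not the published formula, and the function name and parameter values are ours.

```python
import math

def fleshler_hoffman_like(mean, n):
    """Sketch of a constant-probability VI series: split the exponential
    distribution of interreinforcement times (with the given mean) into
    n equal-area class intervals and use the mean of each class as one
    constituent interval of the schedule."""
    # Equal-area cut points of an exponential distribution; the last
    # class interval is unbounded above.
    cuts = [-mean * math.log(1 - k / n) for k in range(n)] + [math.inf]
    intervals = []
    for a, b in zip(cuts, cuts[1:]):
        # Mean of the slice [a, b): the integral of t * f(t) over the
        # slice, divided by the slice's area 1/n, with f(t) = exp(-t/T)/T.
        tail = 0.0 if math.isinf(b) else (b + mean) * math.exp(-b / mean)
        intervals.append(n * ((a + mean) * math.exp(-a / mean) - tail))
    return intervals

# 25 intervals with a 180-sec mean, as in the Chorney schedule below.
ivals = fleshler_hoffman_like(180.0, 25)
```

With these values the shortest constituent interval comes out near 3.6 sec, in line with the range reported below for Chorney's schedule, and successive intervals are spaced further and further apart, so that the probability of reinforcement per unit of time stays roughly constant.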
346 A. CHARLES CATANIA and G. S. REYNOLDS

Fig. 9. Probability of reinforcement (reinforcements per opportunity) and rate of key-pecking as a function of time since reinforcement in a "linear" VI schedule. In this schedule, the probability of reinforcement, when not zero, was roughly proportional to time since reinforcement. The arithmetic VI schedule is presented for comparison.

Fig. 10. Data from the "linear" VI schedule for three additional pigeons. Details as in Fig. 9.

The performances maintained by the constant-probability and the arithmetic VI schedules are compared, for Pigeon 278, in the lower frame of Fig. 11. When the probability of reinforcement was held constant, the local rate of responding remained roughly constant throughout the interval between reinforcements. The increase in rate was only about 2 resp/min over the time from 2t to 17t sec, or roughly one-tenth the increase over the same range of time in the arithmetic VI schedule. A slight increase in response rate might have been expected, even in the constant-probability VI schedule, because the probability of reinforcement did increase eventually to 1.0 at the latest times after reinforcement. Responding began at a low rate within intervals of the constant-probability schedule, probably because no 0-sec interval had been included in the schedule, but the rate increased rapidly during the first t and part of the second t sec.
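As described in the Discussion below, the present constant-probability schedule spaced opportunities uniformly in time and varied the relative frequencies of the intervals ending at each opportunity. A minimal sketch of how such frequencies can be chosen is to let them fall off geometrically, which holds the probability of reinforcement at each opportunity constant; this construction and its parameter values are our illustration, not the frequencies actually used in the experiment.

```python
def constant_probability_weights(n, p=0.10):
    """Relative frequencies for intervals ending at n uniformly spaced
    opportunities, chosen so that the probability of reinforcement at
    each opportunity (given that it is reached) is p, rising to 1.0
    only at the last opportunity, which must end the longest interval."""
    weights = [p * (1 - p) ** k for k in range(n - 1)]
    weights.append((1 - p) ** (n - 1))  # all remaining mass at the end
    return weights

def hazards(weights):
    """Probability of reinforcement at each successive opportunity: the
    weight at an opportunity divided by the total weight remaining."""
    remaining, out = sum(weights), []
    for w in weights:
        out.append(w / remaining)
        remaining -= w
    return out

weights = constant_probability_weights(18)
per_opportunity = hazards(weights)
```

With opportunities every t sec, this yields a probability of reinforcement of 0.10 at every opportunity except the last, where it necessarily reaches 1.0, matching the shape described for the upper frame of Fig. 11.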
Fig. 11. Probability of reinforcement (reinforcements per opportunity) and rate of key-pecking as a function of time since reinforcement in a constant-probability VI schedule. In this schedule, the probability of reinforcement remained roughly constant until the latest times after reinforcement, when it increased abruptly to 1.0. The arithmetic VI schedule is presented for comparison.

Fig. 12. Data from the constant-probability VI schedule for three additional pigeons. Details as in Fig. 11.

The performances of the three other pigeons are shown in Fig. 12, again compared with the performances maintained by the arithmetic VI schedule. For all three pigeons, the local rate of responding changed considerably less over most of the range of time since reinforcement in the constant-probability schedule than it did in the arithmetic VI schedule. A transitory high local rate of responding shortly after reinforcement, for Pigeon 279 and to a lesser extent for Pigeon 118, may have persisted from the previous schedule with additional short intervals (Table 2). If so, it is not clear why the peak did not similarly persist in the performances of the other birds. If the very early times after reinforcement have characteristics that affect the local rate of responding maintained by a given probability of reinforcement, it may be relevant that the absolute value of the constant-probability VI schedule, and therefore the duration of the short interval, was shortest for Pigeon 279.

Figure 13 shows a cumulative record of Pigeon 118's performance on constant-probability VI 79-sec, and may be compared with Fig. 4, a record of arithmetic VI 108-sec for the same pigeon. The record in Fig. 13 shows that the constant-probability VI schedule maintained a roughly constant rate of responding within each individual interval. Thus, the constancies in local rate shown in Fig. 11 and 12 were not artifacts of averaging performances over many intervals. Any consistent effects on responding within successive intervals that might have been caused by the particular sequence of intervals were not evident in the cumulative records. If such effects were present, they were small and will be disregarded here.

Fig. 13. Cumulative record of a full session of key-pecking maintained by a constant-probability VI schedule with a mean interreinforcement interval of 79 sec (Pigeon 118). The recording pen reset to baseline after each reinforcement, indicated by diagonal pips. Compare Fig. 4.

DISCUSSION

Distributions of intervals in variable-interval schedules. In the arithmetic and "linear" VI schedules, two schedules in which the probability of reinforcement increased as time passed since reinforcement, local rates of responding also increased as time passed. The increases in local rate were somewhat comparable in the two schedules despite considerable differences in the way the probability of reinforcement changed over time (Fig. 9 and 10). When the probability of reinforcement early after reinforcement was made high relative to the probability at the same time in the arithmetic VI schedule, by the addition of extra short intervals, the local rate of responding at that time became relatively high (Fig. 7 and 8). Finally, when the probability of reinforcement was held roughly constant over most of the range of time since reinforcement, in the constant-probability VI schedule, local rates of responding remained relatively constant as time passed since reinforcement (Fig. 11 and 12).

Cumulative records presented by Ferster and Skinner (1957, Ch. 6) from schedules roughly equivalent to the arithmetic and the extra-short-interval VI schedules support the present findings: local rates of responding increased as time passed since reinforcement in the former schedules, and were relatively high shortly after reinforcement in the latter schedules. Ferster and Skinner also studied two other schedules, the geometric and the Fibonacci, which supplement the present schedules. A geometric VI schedule consists of a sequence of intervals in which the duration of a given interval is equal to the duration of the next shorter interval multiplied by a constant (by this specification, Ferster and Skinner's schedules are only approximately geometric). With a constant of 2, for example, one such schedule consists of the following intervals in an irregular order: 1, 2, 4, 8, 16, etc. sec. A Fibonacci VI schedule consists of a sequence of intervals in which the duration of a given interval is equal to the sum of the durations of the next two shorter intervals, as, for example, in an irregular ordering of the following intervals: 1, 1, 2, 3, 5, 8, 13, etc. sec.

In both of these schedules, the probability of reinforcement increases monotonically to 1.0 over successive opportunities for reinforcement (except for the first opportunity after reinforcement in the Fibonacci schedule, because the shortest interval is represented twice in that sequence of intervals). For both of these schedules, Ferster and Skinner's cumulative records show that local rates of responding decreased as time passed since reinforcement. This demonstration, that local rates of responding may decrease even while probabilities of reinforcement increase, indicates again that something more than probability of reinforcement alone must be taken into account in the analysis of performance within intervals of VI schedules.
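These two generating rules, and the reinforcements-per-opportunity measure used throughout, can be sketched as follows; the function names are ours, and this is a minimal illustration rather than Ferster and Skinner's procedure.

```python
from collections import Counter

def geometric_intervals(shortest, ratio, n):
    """Geometric VI: each interval is the next shorter one times a constant."""
    return [shortest * ratio ** k for k in range(n)]

def fibonacci_intervals(n):
    """Fibonacci VI: each interval is the sum of the two next shorter ones."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def rft_per_opportunity(intervals):
    """Of the intervals still running at each successive opportunity,
    the fraction that end there (reinforcements per opportunity)."""
    counts = Counter(intervals)
    remaining, probs = len(intervals), []
    for t in sorted(counts):
        probs.append(counts[t] / remaining)
        remaining -= counts[t]
    return probs

g = rft_per_opportunity(geometric_intervals(1, 2, 5))  # intervals 1, 2, 4, 8, 16 sec
f = rft_per_opportunity(fibonacci_intervals(7))        # intervals 1, 1, 2, 3, 5, 8, 13 sec
```

For the geometric example the probabilities rise monotonically to 1.0; for the Fibonacci example they dip at the second opportunity, because the shortest interval is represented twice, and rise monotonically thereafter.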
Further evidence is provided in Fig. 14, which shows data obtained by Chorney (1960) with arithmetic, geometric, and constant-probability VI 3-min schedules. The upper frames describe the schedules in terms of reinforcements per opportunity; the lower frames present local rates of responding averaged across data from three pigeons for each schedule. Each pigeon was exposed to only one schedule for about 26 sessions of 60 to 80 reinforcements each. Chorney's arithmetic and geometric VI schedules correspond to the examples of these two schedules already discussed: successively longer intervals differed in the arithmetic schedule by an additive constant, and in the geometric schedule by a multiplicative constant.

The constant-probability VI schedule, based on a formula proposed by Fleshler and Hoffman (1962), differed in its derivation from the constant-probability schedule of the present experiments. If a random generator arranged a constant probability of reinforcement within successive equal periods of time since reinforcement, the frequencies of different interreinforcement intervals would decline exponentially as a function of interval duration. In effect, Fleshler and Hoffman took this theoretical frequency distribution of intervals and divided it into equal areas, or, in other words, into successive class intervals in each of which an equal number of intervals ended. These class intervals became larger the longer the time since reinforcement because of the exponentially decreasing form of the frequency distribution. The average intervals of each of these class intervals were then taken as the constituent intervals of Fleshler and Hoffman's constant-probability schedule. One effect of this procedure was that the probability of reinforcement taken over extended periods of time since reinforcement was held roughly constant. For example, in the constant-probability VI schedule arranged by Chorney, 14 of the 25 intervals ended within the first 150 sec after reinforcement, or with a probability of 0.56; six of the remaining 11 intervals ended within the next 150 sec after reinforcement, or with a probability of 0.55; and in the next two 150-sec periods, the probabilities were 0.60 and 0.50, respectively (after 600 sec, when only the longest interval remained, the probability was necessarily 1.0).

The difference between two kinds of constant-probability VI schedules may be summarized as follows: the Fleshler and Hoffman schedule varied the separation of successive opportunities for reinforcement in time while holding equal the relative frequencies of the intervals ending at each opportunity; the present schedule spaced successive opportunities for reinforcement uniformly in time while varying the relative frequencies of the intervals ending at each opportunity. Some of the implications of those two methods of arranging constant-probability VI schedules are discussed in Appendix II.

Each of the schedules arranged by Chorney consisted of 25 intervals. In the arithmetic schedule, the intervals ranged from 1.0 to 358.6 sec with an additive constant of 14.9 sec. In the geometric schedule, the intervals ranged from 1.0 to 1150.0 sec with a multiplicative constant of 1.341. In the constant-probability schedule, the intervals ranged from 3.6 to 714.0 sec. The different ranges, produced when the mean value of each schedule was set at 180 sec, are reflected by the different scales for the abscissas of Fig. 14. In each schedule, the same probabilities of reinforcement were represented at successive opportunities: from 0.04 (1/25) at the end of the shortest interval, to 1.0 at the end of the longest interval. The schedules differed only in the spacing of successive opportunities for reinforcement in time. In the arithmetic schedule, the time from one opportunity to the next was constant; in the geometric and constant-probability schedules, the time from one opportunity to the next increased as time passed since reinforcement, but later after reinforcement the increase was more rapid in the geometric than in the constant-probability schedule.

The data, which show the effects of the different temporal spacings of successive opportunities, consisted of average rates of responding in successive thirds of intervals of comparable duration, about 350 sec, in each schedule (unfilled circles), and average rates of responding in all those intervals greater than 100 sec in each schedule (filled circles). The former showed that, during the first 300 sec after reinforcement, local rates of responding increased slightly in the arithmetic schedule and decreased in both the geometric and constant-probability schedules. The latter
showed that the local rate of responding decreased markedly as time passed since reinforcement in the geometric schedule, but did not change systematically in the arithmetic and constant-probability schedules. In summary, in the arithmetic schedule, the evidence for an increase in local rate of responding as time passed since reinforcement was ambiguous; in the geometric schedule, local rate decreased as time passed since reinforcement; and in the constant-probability schedule, local rate was fairly constant over most of the range of time since reinforcement, but was relatively high shortly after reinforcement. Taking into account the relatively short experimental histories on which these data are based, they are in reasonable agreement with the present findings and with other findings in the literature.

Fig. 14. Probability of reinforcement (reinforcements per opportunity) and the rate of key-pecking as a function of absolute time since reinforcement in three schedules, from Chorney (1960). Note the different abscissa scales. Details in text.

Local rates of reinforcement. The differences in performance produced by a fixed sequence of probabilities of reinforcement, when the temporal separations of successive opportunities for reinforcement were varied, indicate that the present analysis must be extended from probabilities of reinforcement to local rates of reinforcement. The basic premise in converting probabilities of reinforcement to local rates of reinforcement is that the effect of a given probability of reinforcement may spread over time and may depend on the closeness in time of other opportunities for reinforcement. A probability of reinforcement of 1.0 at one time, for example, may maintain responding at earlier times and may not maintain as high a rate when it is separated from the preceding opportunity by a long time (e.g., 300 sec in the geometric VI schedule of Fig. 14) as when it is separated from the preceding opportunity by a short time (e.g., 15 sec in the arithmetic VI schedule of Fig. 14). Local rates of reinforcement, illustrated in Fig. 6, take the separation of successive opportunities for reinforcement into account; they are calculated by dividing the number of reinforcements at a given opportunity by the period of time within which that opportunity is isolated. Local rates of reinforcement also can be considered equivalent to probabilities of reinforcement averaged over extended periods of time.
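A sketch of this calculation follows. The exact boundaries of the isolation period are not specified here, so the window convention in the code is our assumption: midpoints between neighboring opportunities, a lower bound at 0 sec for the first opportunity, and, for the last opportunity, only the period preceding it (as noted later for the latest opportunity). The sketch reproduces the directions of change described for Table 3, an increasing local rate for an arithmetic series and a decreasing local rate with a terminal rise for a geometric series, though not the exact tabled values.

```python
from collections import Counter

def local_reinforcement_rates(intervals):
    """Local rate of reinforcement (rft/hr) at each opportunity:
    reinforcements per opportunity divided by the period of time within
    which that opportunity is isolated (window conventions here are an
    assumption, not the paper's exact Fig. 6 method)."""
    counts = Counter(intervals)
    times = sorted(counts)
    remaining = len(intervals)
    rates = {}
    for i, t in enumerate(times):
        lo = 0.0 if i == 0 else (times[i - 1] + t) / 2
        # The last opportunity's window covers only time preceding it.
        hi = t if i == len(times) - 1 else (t + times[i + 1]) / 2
        rates[t] = 3600 * (counts[t] / remaining) / (hi - lo)
        remaining -= counts[t]
    return rates

# The arithmetic series (10, 20, ..., 140 sec) and the geometric series
# (10, 20, 40, 80, 160 sec) discussed in the text:
arith = local_reinforcement_rates(list(range(10, 150, 10)))
geo = local_reinforcement_rates([10, 20, 40, 80, 160])
```

Under these assumptions the arithmetic series gives strictly increasing local rates ending at 720 rft/hr, and the geometric series gives local rates that fall across the middle opportunities and rise again at the terminal one.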
Local rates of reinforcement within the five VI schedules of the present experiment and in a geometric VI schedule (intervals of 10, 20, 40, 80, and 160 sec) are shown in Table 3. The mean values of the schedules were chosen so that each schedule could be represented in terms of intervals that were integral multiples of 10 sec. Local rates of reinforcement are shown only at opportunities for reinforcement (times at which at least one interval in the schedule ended); the local rates take into account the absence of opportunities for reinforcement at times for which there are no entries in Table 3. The opportunity at 0 sec in the arithmetic VI schedule is omitted from the table for reasons to be discussed below.

Table 3 shows that the local rate of reinforcement increased with time since reinforcement in both the arithmetic and "linear" VI schedules. Except for the opportunity at 10 sec after reinforcement, it also increased in the VI schedules with the extra short intervals. In the constant-probability VI schedule, the local rate of reinforcement remained relatively constant, except at the latest times after reinforcement when the probability of reinforcement necessarily increased to 1.0. In the geometric VI schedule, the local rate of reinforcement decreased as time passed since reinforcement, except at the terminal opportunity (160 sec).

Table 3
Local rates of reinforcement (rft/hr) at successive opportunities for reinforcement in six variable-interval schedules. Details in text.

Time Since rft (sec)  Arithmetic VI 70-sec  Extra Short I VI 70-sec  Extra Short II VI 70-sec  "Linear" VI 70-sec  Constant-Probability VI 82.5-sec  Geometric VI 62-sec
10   27 51 111 13 38 80
20   29 29 23 35 72
30   31 31 37
40   34 34 19 40 34 51
50   38 38 38
60   42 42 28 42
70   48 48 48 80 36
80   55 55 41 39 45
90   66 66 44
100  79 79 42 160 33
110  103 103 80 36
120  144 288 103 40
130  240 144 240 45
140  720 360 720 51
150  60
160  209 90
170  720
Local rate of 55 rft/hr at 0 sec is omitted (see text).

The conversion of these local rates of reinforcement to those for equivalent VI schedules with different mean intervals involves only the multiplication of each of the local rates by a constant. For an arithmetic VI schedule with a mean of 35 sec, for example, in which each of the intervals in the first column of Table 3 is halved, the local rates of reinforcement in the table are multiplied by two.

Corresponding changes in local rates of reinforcement over time since reinforcement occur in the schedules arranged by Chorney. Except for some deviations at the earliest and latest opportunities for reinforcement, local rate of reinforcement increased monotonically as time passed since reinforcement in the arithmetic VI schedule, decreased monotonically in the geometric VI schedule, and remained roughly constant in the constant-probability VI schedule. In other words, these directions of change in local rate of reinforcement are characteristic, respectively, of these three classes of distributions of intervals in VI schedules, and changes in local rate of reinforcement correspond in direction to the changes in local rate of responding observed in a given schedule (an increasing local rate of responding in the arithmetic VI schedule, a relatively high local rate of responding shortly after reinforcement in the extra-short-interval VI schedules, and so on). Nevertheless, changes in the local rate of reinforcement are large compared to the changes in the local rate of responding. The local rate of reinforcement in the arithmetic VI schedule, for example, increases by a factor of almost 40 over the time from 10 to 140 sec after reinforcement, whereas the local rate of responding increases by a much smaller factor. In Exp. 1, however, the functions relating overall rate of responding to overall rate of reinforcement were generally monotonically increasing but negatively accelerated; beyond about 50 rft/hr, large changes in the overall rate of reinforcement produced relatively small changes in the overall rate of responding. It is therefore appropriate to compare the local rates of responding maintained by different local rates of reinforcement (Fig. 7-10) with the overall rates of responding maintained by different overall rates of reinforcement (Fig. 1; see also the top graph of Fig. 28, Appendix I).

Figure 15 compares for each pigeon the local rates of responding maintained by the arithmetic, extra-short-interval, and "linear" VI schedules (obtained: connected filled circles) and, from Fig. 1, the overall rates of responding maintained by overall rates of reinforcement that corresponded to the local rates of reinforcement at successive opportunities within each schedule (calculated: unconnected unfilled circles). The correspondence between the two sets of data is by no means perfect, but several features are encouraging. The two sets of data tend to increase and decrease together, even when they differ considerably in absolute value. In several cases, both sets of data agree fairly well in absolute rate as well as in the direction of change over time since reinforcement. Finally, some of the idiosyncratic characteristics of the performances of different pigeons, as in the larger rate changes for Pigeon 118 than for Pigeon 279, are reflected in both sets of data.

The disagreements between the two sets of data stem from several sources. One of the most important is the adequacy of the overall-rate data from Fig. 1. The data in Fig. 1 show average rates maintained by different overall rates of reinforcement, but the range of variation at a given overall rate of reinforcement is indicated only by the extent to which redeterminations differed from original determinations. Redetermined overall rates of responding were generally higher than the original rates. It therefore may be important that the local rates of responding in the schedules in Fig. 15 were generally higher, for three of the four pigeons, than the overall rates derived from Fig. 1, because the schedules in Fig. 15 were presented later than those in Fig. 1.

In addition, most of the overall rates of responding plotted in Fig. 15 were obtained indirectly, by interpolation between actual data points in Fig. 1, or, in the case of Pigeon 278 at VI 427-sec, by the linear extrapolation to zero from the lowest rate of reinforcement (8.4 rft/hr). For example, most of the local rates of reinforcement within the VI 427-sec schedules were lower than any of the overall rates of reinforcement arranged for this pigeon in Exp. 1, and this extrapolation yielded most of the overall rates of responding that were considerably lower than the local rates within each schedule for this pigeon in Fig. 15. Furthermore, the overall rates of responding in Fig. 1 were obtained with arithmetic VI schedules, in which local rates varied with time since reinforcement, rather than with constant-probability VI schedules. As indicated in Exp. 1 (Discussion), this characteristic of arithmetic VI schedules may have affected the overall-rate functions.

Variability in the performances maintained by the schedules in Fig. 15 also contributed to the disagreement between the two sets of data. The two most obvious cases are the idiosyncratic performance of Pigeon 118 in the "linear" schedule, within which the local rate of responding decreased at later times since reinforcement, and of Pigeon 129 in the first extra-short-interval schedule, within which the local rate of responding remained low shortly after reinforcement.

Finally, some systematic disagreement is evident at the earliest and the latest times since reinforcement. In the arithmetic schedule, the two sets of data disagree most at short times after reinforcement for Pigeons 118 and 279, and would do so also for Pigeons 129 and 278 if the two sets of data were adjusted to eliminate the differences in absolute value. In the extra-short-interval schedules, the overall rate of responding plotted at the latest time since reinforcement is relatively high in both schedules for Pigeon 118, in the first schedule for Pigeon 278, and in the second schedule for Pigeon 129. These
differences probably depend on the calculation of local rates of reinforcement (Fig. 6) rather than on the properties of the performances and their controlling variables. There is no reason to believe that the arbitrary method of calculation of local rates of reinforcement takes into account either the properties of the 0-sec interval (Exp. 2), which was arranged in the arithmetic but not the other schedules of Fig. 15, or of the latest time after reinforcement, at which the probability of reinforcement necessarily increases to 1.0 and at which the calculation of the local rate of reinforcement is unique in that it is based on periods of time preceding but not following the opportunity for reinforcement.

Fig. 15. Comparison of local rates of responding obtained in four VI schedules (Fig. 7-10) and local rates of responding calculated from overall rates of responding (Fig. 1) and local rates of reinforcement (Fig. 6 and Table 3). Details in text.

Adjustments could be made in the method of calculating local rates of reinforcement that would reduce the disagreements in the two sets of data at the earliest and latest times since reinforcement (cf. discussion of Fig. 6). In view of the sources of disagreement, however, and in the absence of additional data, such adjustments seem premature. It is sufficient to note that, given the qualifications outlined, the agreement between the two sets of data is good enough to suggest that both the overall rate of responding maintained by different interval schedules and the local rates of responding as time passes within a particular interval schedule are controlled, at least in part, by the same
variable: rate of reinforcement. Because overall rates of reinforcement and overall rates of responding are simply weighted averages of the local rates, it also follows that the overall rate of responding is indirectly determined by the effects of different local rates of reinforcement as time passes, rather than directly determined by the overall rate of reinforcement.

Some additional evidence, supplementing that in Fig. 15, is relevant to the relationship between overall and local rates of responding. In Exp. 1 (Discussion), it was suggested that, except for some of the arithmetic VI schedules with shorter mean values, the local rate of responding for a particular pigeon changed in roughly the same way as time passed since reinforcement in most of the schedules studied (Fig. 3). This was approximately so for most schedules and most pigeons, but it is possible, furthermore, to account for some of the deviations by comparing the overall-rate functions (Fig. 1) and the local rates of reinforcement within different arithmetic VI schedules.

Compare, for example, Pigeon 278's performance in the arithmetic VI 108-sec and VI 427-sec schedules (Fig. 3, but more easily seen by comparing Fig. 5 and 7). The local rate of responding became fairly constant after about 50 sec in the VI 108-sec schedule whereas it increased over almost the entire range of time since reinforcement in the VI 427-sec schedule. The local rates of reinforcement ranged from 17.4 to 465 rft/hr in the former schedule and from 4.4 to 118 rft/hr in the latter schedule (1t to 14t sec, with t equal to 15.4 and 61.0 sec in the two schedules, respectively). Looking at the difference between the two schedules in another way, note that the local rate of reinforcement reached 25 rft/hr at about 5t sec in the VI 108-sec schedule and at about 12t sec in the VI 427-sec schedule. The two schedules therefore covered different parts of the overall-rate function shown for Pigeon 278 in Fig. 1. This function was fairly flat beyond about 25 rft/hr or over most of the range of rates of reinforcement locally represented in the VI 108-sec schedule, but it increased steeply up to about 25 rft/hr or over most of the range of rates of reinforcement locally represented in the VI 427-sec schedule.

In some other cases in which local rates of responding changed differently in different arithmetic VI schedules for a particular pigeon (Fig. 3), the differences can be similarly related to the form of the overall-rate function for that pigeon. The major deviations for the arithmetic VI schedules with short mean values (e.g., Pigeon 121 at VI 12.0-sec, Fig. 3) cannot be assessed in this way, for the reasons outlined in Exp. 1 and because the local rates of reinforcement at later times after reinforcement in those schedules exceeded the overall rates of reinforcement in Fig. 1 (in the arithmetic VI 12.0-sec schedule, for example, the local rate of reinforcement exceeded 300 rft/hr by 8t sec after reinforcement).

EXPERIMENT 4:
OVERALL AND LOCAL RATES OF RESPONDING WITHIN THREE FIXED-INTERVAL SCHEDULES

The fixed-interval (FI) schedule is the limiting case of the VI schedule: the distribution of intervals is narrowed down to a single value. The performance maintained by an FI schedule is usually characterized by a pause before the first response in an interval, and then by a gradually increasing rate of responding as the end of the interval approaches. Occasionally, the rate passes through a maximal value some time before the end of the interval (Ferster and Skinner, 1957) or, after extended exposure to a fixed-interval schedule, the responding after the initial pause may be maintained at a relatively constant rate throughout the remainder of the interval (Cumming and Schoenfeld, 1958).

The present analysis cannot easily be applied to the FI schedule, which includes a single opportunity at which the probability of reinforcement is 1.0. This single opportunity combines two difficulties in the analysis: that of the earliest opportunity (end of the shortest interval), which is preceded by a reinforcement rather than by another opportunity, and that of the latest opportunity (end of the longest interval), at which the probability of reinforcement necessarily increases to 1.0. The present experiment therefore examined some properties of responding within fixed intervals. Three FI schedules were studied: FI 30-sec, FI 50-sec, and FI 200-sec.
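In the terms used in the preceding discussion, an FI schedule is the single-valued distribution, so the reinforcements-per-opportunity measure collapses to one entry; a minimal sketch (the helper function is ours):

```python
from collections import Counter

def rft_per_opportunity(intervals):
    """Fraction of still-running intervals that end at each opportunity."""
    counts = Counter(intervals)
    remaining, probs = len(intervals), {}
    for t in sorted(counts):
        probs[t] = counts[t] / remaining
        remaining -= counts[t]
    return probs

# A fixed interval is a single-valued distribution: one opportunity with
# probability 1.0, which is simultaneously the earliest opportunity
# (preceded only by reinforcement) and the latest (where the probability
# necessarily reaches 1.0).
fi = rft_per_opportunity([50.0] * 60)
```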
INTERVAL SCHEDULES OF REINFORCEMENT 355

METHOD

Apparatus
The standard experimental chamber was similar to that described in the preceding experiments. The response key was illuminated by an orange 6-w bulb. Reinforcement duration was 3 sec. The controlling and recording apparatus was located in a separate room, and included a stepping switch that arranged the FI schedules and distributed responses to 10 counters representing successive tenths of the interval.

Subjects and Procedure
Four adult, male, White Carneaux pigeons, maintained at about 80% of free-feeding body weight, were exposed to FI 50-sec, FI 200-sec, and FI 30-sec schedules in that order. Sessions of a different procedure (temporal discrimination: Reynolds and Catania, 1962) preceded FI 50-sec and intervened between FI 200-sec and FI 30-sec. Each FI schedule was maintained for approximately two months of daily sessions, at which time the performance of each pigeon had appeared stable (visual inspection of the data) for at least two weeks. Sessions of FI 50-sec and FI 30-sec consisted of 61 reinforcements: reinforcement of the first response of the session followed by reinforcement at the end of 60 intervals. Sessions of FI 200-sec consisted of only 31 reinforcements (30 intervals). Each interval was timed from the end of the preceding reinforcement. Data presented are averages over the last five sessions of each FI schedule.

RESULTS
Figure 16 shows local rates of responding in successive tenths of the fixed interval as a function of both absolute (left column) and relative (right column) time since reinforcement. In absolute time, the rate of responding increased most rapidly in the shortest fixed interval (FI 30-sec). In both the FI 30-sec and the FI 50-sec schedules, local rates of responding increased monotonically as time passed since reinforcement. In the FI 200-sec schedule, the local rate of responding began to decrease slightly about halfway through the interval for Pigeon 68, and decreased slightly shortly before reinforcement for Pigeon 236. About halfway through the 200-sec interval, the local rate of responding became roughly constant for Pigeon 237. Local rate of responding as a function of absolute time since reinforcement was typically lower in FI than in arithmetic VI schedules, but the two sets of data have some similar characteristics (cf. Fig. 2, VI 12.0-sec, VI 23.5-sec, and VI 108-sec).

Replotting the data against relative time since reinforcement (right column) shows that, for all four pigeons, the FI 200-sec schedule maintained a relatively higher rate of responding early after reinforcement and a relatively lower rate of responding later after reinforcement than the other two schedules. This change in the pattern of responding in relative time contrasts with that observed in VI schedules (Fig. 3). Despite the duration for which the schedules were continued, the relatively high rate of responding maintained early after reinforcement in the FI 200-sec schedule may have persisted from reinforcement at an earlier time in the preceding FI 50-sec schedule. It is also possible that reinforcement of the first response in each session had effects similar to those of a 0-sec interval, but this interpretation is contradicted by the results of Exp. 2 and it is unlikely that such effects would be most marked in FI 200-sec.

Overall FI rates of responding are plotted against reinforcements per hour in Fig. 17 (left). No systematic relationship is evident. In general, however, the terminal rate of responding (the local rate of responding just before reinforcement) increased as the overall rate of reinforcement increased (right). These data differ considerably from those obtained with VI schedules (Fig. 1), in which both overall and local rates of responding increased with increases in the overall rate of reinforcement over a roughly comparable range.

DISCUSSION
The FI schedules demonstrate the extensive effects of reinforcement at one time after reinforcement on local rates of responding at other times. For example, reinforcement at 200 sec after reinforcement for Pigeon 68 maintained a considerable rate of responding over most of the range of time since reinforcement. Thus, the calculation of local rates of reinforcement, in the introduction of Exp. 3 (Fig. 6), probably underestimates the effect of reinforcement on local rates of responding at remote points in time. The calculation of
356 A. CHARLES CATANIA and G. S. REYNOLDS

[Figure: FIXED-INTERVAL SCHEDULE; for each of four pigeons, local response rate plotted against absolute time since reinforcement (seconds, left column) and relative time since reinforcement (FI duration / 10, right column).]
Fig. 16. Local rates of key-pecking maintained by three FI schedules as a function of absolute and relative times since reinforcement (four pigeons).
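Local rates like those in Fig. 16 follow directly from the apparatus described above: each of the 10 counters accumulates the responses emitted in one tenth of the interval, so dividing a counter's total by the session time that tenth represents gives a local rate in responses per minute. A minimal sketch, with hypothetical counter totals (the function name and the numbers are ours, not the authors'):

```python
def local_rates(counter_totals, interval_sec, n_intervals):
    """Responses/min in each successive tenth of a fixed interval.

    counter_totals: responses accumulated in each of the 10 counters,
    summed over all n_intervals completed intervals of one schedule.
    """
    bin_sec = interval_sec / 10.0                # time represented by one counter
    minutes_per_bin = bin_sec * n_intervals / 60.0
    return [total / minutes_per_bin for total in counter_totals]

# Hypothetical totals for one FI 50-sec session of 60 timed intervals:
rates = local_rates([30, 60, 120, 240, 300, 420, 480, 540, 570, 600],
                    interval_sec=50, n_intervals=60)
# rates[0] is 6.0 responses/min; rates[-1] is 120.0 responses/min
```

With these made-up totals the local rate rises monotonically across the interval, the pattern the text describes for FI 30-sec and FI 50-sec.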

the local rate of reinforcement was based on a time extending only halfway back to a previous reinforcement or opportunity for reinforcement and halfway forward to the next reinforcement or opportunity for reinforcement. In many of the VI schedules discussed, this calculation involved a time of the order of only a few seconds, whereas reinforcement at one point in time affected rates of responding over a period of the order of 2 or 3 min in the FI 200-sec schedule.

In an FI performance, the time since reinforcement may function as one discriminable property of the many aspects of the experimental situation (cf. Skinner, 1938, Ch. 7). Times since reinforcement that are consistently correlated with nonreinforcement may come to control low rates of responding in much the same way as do other stimulus properties (e.g., intensity, wavelength). Discriminable periods of nonreinforcement in FI schedules probably contribute to the fact that an FI schedule generally maintains a lower overall rate of responding than a VI schedule providing the same overall rate of reinforcement. For example, except for Pigeon 121 in Fig. 1, the overall rates of responding maintained by arithmetic VI schedules were consistently higher than the overall rates maintained by corresponding FI schedules in Fig. 17.

Despite the possibility of temporal control, the performances maintained by FI schedules do not appear to be as well under the control of time since reinforcement as the capacity of the pigeon to discriminate durations would suggest (Reynolds and Catania, 1962; Reynolds, 1966; Stubbs, 1968). In an FI performance, responding occurs at appreciable rates even at times since reinforcement well before the opportunity for reinforcement at the end of the interval. Other factors presumably operate to attenuate temporal control in an FI schedule.

Fixed-interval reinforcement sets the occasion for the incidental but consistent correlation of responding at one time in an interval and subsequent reinforcement at the end of the interval. The early responding may be maintained by the later reinforcement (cf. Dews, 1962, who suggests that responding over time in an FI schedule reflects a delay-of-reinforcement gradient). Such delayed reinforcement must operate in conjunction with temporal discrimination: the time in the interval at which responding occurs must be to some extent discriminated if the responding is to be consistently controlled by the time that separates it from reinforcement at the end of the interval.

The incidental properties of the FI schedule may even produce effects in relatively trivial ways. For example, a decrease in response rate toward the end of an interval (Pigeon 68 at FI 200-sec, Fig. 16) may have its origin in an increase in the frequency with which the pigeon looked toward the feeder as the time approached when a response would operate the feeder.

Procedurally, the FI schedule is the simplest of the interval schedules but, paradoxically, the variables that appear to operate in FI schedules suggest that, in at least one respect, FI performance is more complex than VI performance. The FI schedule is at one extreme of a continuum of schedules that differ in the degree to which they allow discriminative control by time since reinforcement; at the other extreme is the constant-probability VI schedule, which simplifies performance by eliminating the temporal patterning of reinforcement as a controlling variable. The implication is that, although FI schedules show that effects of reinforcement extend over a considerable period of time since reinforcement, the contribution of FI performance to the quantitative analysis of VI schedules may not be simple and direct.

[Figure: two panels, AVERAGE FI RATE (left) and TERMINAL FI RATE (right); response rate plotted against reinforcements per hour (0 to 100) for four pigeons.]
Fig. 17. Overall and terminal rates of key-pecking as a function of the rates of reinforcement provided by three FI schedules (four pigeons).
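The halfway-back/halfway-forward calculation discussed above can be made concrete. One plausible reading (our sketch, not the authors' exact procedure): the local rate of reinforcement at an opportunity is its probability of reinforcement divided by a window that runs from the midpoint back to the previous opportunity (or back to the reinforcement at time zero) to the midpoint forward to the next opportunity.

```python
def local_rft_rate(times_sec, probs, i):
    """Local reinforcement rate (rft/hr) at opportunity i.

    The window extends halfway back to the previous opportunity (all the
    way back to the reinforcement at time 0 for the first opportunity)
    and halfway forward to the next opportunity (zero for the last one).
    """
    back = times_sec[i] - times_sec[i - 1] if i > 0 else times_sec[i]
    forward = times_sec[i + 1] - times_sec[i] if i + 1 < len(times_sec) else 0.0
    window_sec = back / 2.0 + forward / 2.0
    return probs[i] * 3600.0 / window_sec

# Opportunities every 30 sec, each with the same illustrative probability:
rate = local_rft_rate([30, 60, 90, 120], [0.25] * 4, i=1)
# 0.25 reinforcement per 30-sec window = 30 rft/hr
```

The text's caution applies to exactly this kind of window: it spans only a few seconds in many VI schedules, whereas a reinforcement's effect on responding evidently extends over minutes.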

The next experiment explores the effects of combining two FI schedules. In terms of procedure, such a combination produces the simplest VI schedule, which consists of only two intervals, and permits further examination of the spread of the effects of reinforcement over the continuum of time since reinforcement.

EXPERIMENT 5: EFFECTS OF THE SEPARATION IN TIME OF OPPORTUNITIES FOR REINFORCEMENT IN TWO-VALUED INTERVAL SCHEDULES

This experiment addressed two separate but related questions. First, what is the effect of one probability of reinforcement on the local rate of responding maintained by a second probability as the period of time that separates them is varied? Second, what is the effect of the magnitude of their separation on the local rates of responding during the period of time between them?

The experiment compared the performance maintained by a single-valued interval schedule, FI 240-sec, with the performances maintained by several two-valued interval schedules. One interval in the two-valued schedules was 240 sec; the other was shorter (30, 90, 150, or 210 sec) and was presented with a relative frequency of 0.05 or 0.50. For example, in a two-valued schedule of 90-sec and 240-sec intervals with a relative frequency of 0.05 for the 90-sec interval, reinforcement was available 90 sec after the previous reinforcement in one of every 20 intervals and at 240 sec after reinforcement in the remaining intervals. According to the terminology of Ferster and Skinner (1957, Ch. 11), this schedule is a mixed FI 90-sec FI 240-sec schedule. The present purposes, however, require a specification not only of the durations of the scheduled intervals but also of their relative frequencies (cf. Millenson, 1959).

The proximity in time of the two opportunities in the two-valued schedules changed when the duration of the short interval was varied over the range from 30 to 210 sec. Thus, the temporal separation of the two opportunities was necessarily confounded with the time since reinforcement at which the earlier opportunity occurred.

METHOD

Subjects and Apparatus
Four adult, male, White Carneaux pigeons were maintained at 80% of free-feeding body weights. The key-pecking of each pigeon had previously been maintained by FI schedules of reinforcement for at least 40 hr.

The experimental chamber was similar to that described in Exp. 1. The schedules were arranged by stepping switches operated every 10 sec by an electronic timer and reset after each reinforcement. The stepping switches arranged reinforcement either after 240 sec (24 steps of the stepping switches) or, according to the scheduled occurrences of the shorter interval, at the time specified for this interval. Each interval began only after the 4-sec reinforcement at the end of the preceding interval. The stepping switches also served to distribute responses to 24 counters that represented the twenty-four 10-sec periods of time since reinforcement.

Procedure
The schedules and sessions for each pigeon are summarized in Table 4. In each two-valued interval schedule, the long interval was 240 sec. The table shows the duration of the short interval and its probability or relative frequency of occurrence, in rft/op. An entry of 240 sec indicates that no short interval was arranged.

Each daily session consisted of 21 reinforcements: reinforcement of the first response of the session, followed by 20 intervals. When the relative frequency of the short interval was 0.05, a single short interval occurred at a different place in the sequence of 20 intervals in each session. When the relative frequency of the short interval was 0.50, an irregular sequence of short and long intervals varied from one session to the next. The sequence never included more than four successive occurrences of either interval, and was balanced so that the relative frequencies of the short and long intervals were independent of the duration of the preceding interval.

The session durations varied from about 45 min (short interval of 30 sec with a relative frequency of 0.50, or mixed FI 30-sec FI 240-sec) to about 80 min (no short interval, or FI
Table 4
Sequence of two-valued interval schedules for each pigeon. Entries show the duration (sec) and relative frequency (rft/op) of the short interval. The long interval was held constant at 240 sec.

     Pigeon 85           Pigeon 86           Pigeon 34           Pigeon 35
Short                Short                Short                Short
Interval  Rft/Op     Interval  Rft/Op     Interval  Rft/Op     Interval  Rft/Op     Sessions
  240      1.00*       240      1.00        240      1.00*       240      1.00*        40
   30      0.50        210      0.50         30      0.05        210      0.05         52
   30      0.05        210      0.05         90      0.05        150      0.05        102
  150      0.05         30      0.05        150      0.05        150      0.50         61
  240      1.00*        30      0.50        210      0.05         90      0.50         37
  240      1.00         90      0.50        240      1.00*        90      0.05         39
  240      1.00*       240      1.00*       240      1.00*       240      1.00*        47

* Fixed-interval schedule (FI 240-sec).
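The balanced irregular sequences described under Procedure can be sketched as follows. This is an illustration only, not the authors' actual balancing method: it enforces the equal relative frequencies and the no-more-than-four-successive-occurrences constraint, but not the full independence of each interval's frequency from the duration of the preceding interval.

```python
import random

def session_sequence(short_sec, long_sec, n_intervals=20, seed=0):
    """One session's irregular order of equal numbers of short and long
    intervals, rejecting any shuffle in which either interval occurs
    more than four times in succession."""
    rng = random.Random(seed)
    while True:
        seq = [short_sec] * (n_intervals // 2) + [long_sec] * (n_intervals // 2)
        rng.shuffle(seq)
        run = longest = 1
        for prev, cur in zip(seq, seq[1:]):
            run = run + 1 if cur == prev else 1
            longest = max(longest, run)
        if longest <= 4:          # constraint from the Procedure section
            return seq

seq = session_sequence(30, 240)   # mixed FI 30-sec FI 240-sec, rft/op = 0.50
```

Rejection sampling is wasteful but transparent; a generator that also balanced transition frequencies, as the original procedure did, would need to constrain pairs of successive intervals rather than single runs.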

240-sec). The data presented are averages over the last five sessions of each schedule.

RESULTS
Local rates of responding. Figure 18 shows local rates of key-pecking as a function of time since reinforcement for Pigeon 34 in five interval schedules. The performance maintained by the single-valued schedule (FI 240-sec) is represented in the bottom panel (filled circles). The performances maintained by the two-valued schedules, which consisted of 240-sec intervals and a shorter interval, are represented in the top four panels. The relative frequency of the shorter interval, here equal to rft/op, was always 0.05 and its duration was 30, 90, 150, or 210 sec. Reading the graphs from top to bottom shows the effect of moving the end of the short interval from an early time after reinforcement up to coincidence with the end of the 240-sec interval. Differences in the performances were therefore due at least in part to changes in the time separating the probability of reinforcement of 0.05 from the terminal probability of 1.0. These differences were of two sorts. First, when the probability of 0.05 was at 30 sec, the rate of responding declined for some time before it increased as the terminal probability of 1.0 at 240 sec was approached, whereas when the probability of 0.05 was at 90, 150, or 210 sec, the rate of responding did not decline.

Second, the rate of responding maintained by the probability of 0.05 was lower the earlier it occurred or, in other words, the longer the period of time that separated it from the terminal probability at 240 sec. These two variables, the time at which the probability of 0.05 occurred and its separation from the terminal probability, were confounded: the later the probability of 0.05, the less its temporal separation from the terminal probability. Data from Exp. 1 and 3, however, suggest that the temporal proximity of the terminal probability is the more relevant variable. In those experiments, increases in time since reinforcement generally produced decreases in the rate of responding maintained by a given probability. In the arithmetic VI schedule, for example, the response rate maintained by the probability of reinforcement at t sec decreased as t increased (the left-hand point on each function in Fig. 2, Exp. 1). Because this change in the local rate of responding maintained by a given probability with changes in its location in time was opposite to that obtained in the present experiment, the present findings cannot be attributed solely to the changes in the location in time of the probability of 0.05 in the two-valued interval schedules. A similar point can be made based on the FI data of Exp. 4 (Fig. 17, terminal rate).

Additional data are shown in Fig. 19. The data from Pigeon 86 (left column, upper three panels) show the effect of a probability of reinforcement of 0.50 at various times since reinforcement; the data from Pigeons 85 (left column, bottom panel) and 35 (right column) show the effect of a probability of 0.05 at various times. The responding maintained by the two-valued interval schedules (unfilled circles) is compared with that maintained by the one-valued schedule, FI 240-sec (filled circles).
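For the overall rates of reinforcement reported later in these Results, the scheduled rate follows from the mean interval: with a short interval of S sec at relative frequency p and a long interval of 240 sec, the mean interval is pS + (1 - p)240 sec, and the overall rate is 3600 divided by that mean, ignoring the time occupied by reinforcement itself. A quick check (our sketch, not the authors' computation):

```python
def overall_rft_per_hr(short_sec, p_short, long_sec=240.0):
    """Scheduled overall reinforcement rate (rft/hr) of a two-valued
    interval schedule, ignoring reinforcement duration."""
    mean_interval_sec = p_short * short_sec + (1.0 - p_short) * long_sec
    return 3600.0 / mean_interval_sec

# rft/op = 0.50: 26.7 rft/hr (30-sec short) down to 16.0 (210-sec short).
# rft/op = 0.05: 15.7 rft/hr (30-sec short; the text rounds this to 16)
#                down to 15.1 (210-sec short).
# FI 240-sec alone: 3600 / 240 = 15 rft/hr.
```

These values reproduce the 15 to 26.7 rft/hr range quoted in the Overall rates of reinforcement section.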

[Figure: five panels, response rate (responses per minute) plotted against time since previous reinforcement (sec), 0 to 240.]
Fig. 18. Rate of key-pecking as a function of time since reinforcement in five interval schedules (Pigeon 34). With a probability (rft/op) of 0.05, a shorter interval was introduced into an FI 240-sec schedule at the times since reinforcement indicated by the arrows in the top four panels. The bottom panel shows the performance maintained by FI 240-sec without a shorter interval.

The local rate of responding maintained by the probability of 0.50 (Pigeon 86) at 30 sec was approximately equal to the rate maintained 210 sec later by the terminal probability at 240 sec. This is consistent with findings of Exp. 1 and 3 (Fig. 3, 7, and 8) which showed that a probability of reinforcement of 0.50 maintained about the same rate of responding as a probability of 1.0. The equality of the rates cannot be attributed to the closeness of the two probabilities in time, because they were separated by 210 sec and because within these 210 sec the local rate of responding decreased and then increased.

The local rates of responding maintained by the probabilities of 0.50 and 1.0 were also about equal when the probability of 0.50 was at 90 sec (left column, panel b), but there was little if any decrease in response rate between 90 and 240 sec. Finally, with the probability of 0.50 at 210 sec (panel c), the performance was scarcely distinguishable from that maintained by the FI 240-sec schedule.

The performance of Pigeon 85 partially confirms the conclusions drawn from the performance of Pigeon 34 (Fig. 18). The rate maintained by the probability of 0.05 was lower, relative to the terminal rate at 240 sec, when this probability occurred at 30 sec than when it occurred at 150 sec. Even when this probability occurred at 30 sec, however, the rate of responding did not decline during the time between this and the terminal probability of 1.0. The performance of Pigeon 35 (right column) was atypical in that the local rate of responding consistently passed through a maximum earlier than 240 sec after reinforcement, even in the FI 240-sec schedule. Nevertheless, the maximum in the local rate of responding maintained by the probability of reinforcement of 0.05 occurred later when this probability was moved from 90 to 150 sec (panels a and b), and the performance became more like that maintained by the FI 240-sec schedule when the probability was moved to 210 sec.

Figure 20 directly compares the responding maintained by a probability of reinforcement of 0.05 at 30 sec after reinforcement and the responding maintained by a probability of 0.50 at the same time after reinforcement (data from Pigeon 85). Relative to the performance maintained by FI 240-sec (filled circles), the probability of 0.05 produced an increase in the local rate of responding at 30 sec. This rate was considerably lower than that at 240 sec, when the probability became 1.0. The probability of 0.50 produced a relatively higher response rate at 30 sec and the rate increased slightly throughout the remainder of the interval. This is consistent with the findings of Exp. 1 and 3 (Fig. 3, 7, and 8), in that the rate of responding maintained by a probability of 0.05 was low relative to that maintained by a probability of 0.50.

When the two intervals, 30 sec and 240 sec, occurred with equal relative frequencies for Pigeon 85, they maintained a performance

that was, over most of the range of time after reinforcement, similar to those maintained by some of the VI schedules examined in Exp. 1 and 3. Thus, two-valued interval schedules can sometimes sustain responding over a considerable period after reinforcement as effectively as can many-valued (VI) schedules. Similar findings have been discussed by Millenson (1959), who examined a mixed FI 30-sec FI 120-sec schedule in which the relative frequency of the shorter interval was 0.40. These data may indicate that different probabilities of reinforcement (e.g., 0.05 and 0.50) vary in the extent to which their effects spread in time.

[Figure: for Pigeons 86, 85, and 35, response rate plotted against time since previous reinforcement (sec), 0 to 240.]
Fig. 19. Rate of key-pecking as a function of time since reinforcement (Pigeons 86, 85, and 35). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when a shorter interval was introduced with the probabilities (rft/op) and at the times since reinforcement indicated by the arrows.

[Figure: for Pigeon 85, response rate plotted against time since previous reinforcement (sec), 0 to 240.]
Fig. 20. Rate of key-pecking as a function of time since reinforcement in three interval schedules (Pigeon 85). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when an interval of 30 sec was introduced with a probability (rft/op) of 0.05 or 0.50 into the FI schedule.

A comparison of the probabilities of 0.05 and of 0.50 at four different times after reinforcement is shown in Fig. 21 (data from Pigeons 86 and 35). At 30 sec, the difference between the rates of pecking maintained by these two probabilities was considerable (Pigeon 86, lower left panel). This pigeon's rate of responding, unlike that of Pigeon 85 in Fig. 19, decreased before again increasing during the time from 30 to 240 sec. A difference in the effects of the two probabilities was also evident at 90 sec (Pigeon 35, upper left panel) and, to a lesser extent, at 150 sec (Pigeon 35, upper right panel). At 210 sec (Pigeon 86, lower right panel), both probabilities maintained rates of responding about equal to those maintained 30 sec later, at 240 sec.

[Figure: four panels, response rate plotted against time since previous reinforcement (sec), 0 to 240, for Pigeons 35 and 86.]
Fig. 21. Rate of key-pecking as a function of time since reinforcement (Pigeons 35 and 86). Filled circles show the performance maintained by an FI 240-sec schedule. Unfilled circles show the performances maintained when a shorter interval (30, 90, 150, or 210 sec) was introduced with a probability (rft/op) of 0.05 or 0.50 into the FI schedule.

Overall rates of reinforcement. The introduction of short intervals increased the overall rate of reinforcement relative to the 15 rft/hr provided by the single-valued FI 240-sec schedule. The probability of 0.50 at the end of the short interval made available rates of reinforcement ranging from 16 (short interval of 210 sec) to 26.7 (short interval of 30 sec) rft/hr. The probability of 0.05 at the end of the short interval, however, made available only 15.1 (short interval of 210 sec) to 16 (short interval of 30 sec) rft/hr. To some extent, the changes in the overall rates of responding maintained by these schedules may have been determined by these changes in the overall rate of reinforcement. The changes in overall rate of responding, however, were not consistent with those predicted from the general form of the functions of Exp. 1 (Fig. 1) and therefore must have depended at least in part on the changes in the distribution of intervals in time. This is particularly the case for the probability of 0.05, for which a change in rft/hr of only about 6% (from 15 to 16 rft/hr) produced 50% to 90% increases in overall rates of responding.

Cumulative records. The relative consistency of the performances maintained within individual intervals by VI schedules (Fig. 4
and 13) was not a characteristic of the FI 240-sec and the mixed FI FI schedules of the present experiment. Figure 22 shows cumulative records of the performance of Pigeon 86 in five of the present schedules, and indicates that the average rates of responding illustrated in Fig. 18 through 21 are not necessarily representative of responding within individual intervals. The performance maintained by FI 240-sec, for example, in the top record of Fig. 22, is fairly typical of the variability in output from interval to interval in FI performances as reported by Ferster and Skinner (1957), among others. The temporal patterning of responding also varied considerably from interval to interval. A fairly constant rate of responding was maintained throughout most of some intervals in the record, for example, whereas a gradually increasing rate of responding was observed in other intervals.

[Figure: five cumulative records for Pigeon 86, labeled FI 240-sec; RFT/OP at 30 sec = 0.05; RFT/OP at 30 sec = 0.50; RFT/OP at 90 sec = 0.50; RFT/OP at 210 sec = 0.50.]
Fig. 22. Cumulative records of full sessions of Pigeon 86's key-pecking maintained by an FI 240-sec schedule and by four schedules in which a shorter interval (30, 90, or 210 sec) was added with a probability (rft/op) of 0.05 or 0.50. The recording pen reset to baseline after each reinforcement, indicated by diagonal pips.

The record illustrating the performance maintained with a probability of 0.05 at 30 sec shows that a transition from a high to a low rate followed by a return to a higher rate occurred in most intervals of the schedule. Intervals in which this pattern was absent tended to occur early in the session. Again, the record indicates that the smoothness of the average curve in Fig. 21 was not representative of the performances in individual intervals. Slow changes in local rates of responding were observed, as in the second full interval after the short interval in the illustrative record, but rather abrupt transitions from a fairly high rate to an almost zero rate followed by a return to a higher rate were also fairly common, as in the next-to-last interval of the record.

This pattern of responding, in which the rate decreased and then increased within individual intervals, also occurred when the probability at 30 sec was raised to 0.50, as illustrated by the third cumulative record. Within this schedule, however, the performance from interval to interval seemed somewhat more variable and, again, the temporal patterning was typically absent in the early intervals of the session.

When the probability of 0.50 was moved to 90 sec (fourth record), a fairly uniform rate of responding was maintained within each interval, consistent with the average data presented in Fig. 19 (86b). Although this pattern of responding was fairly regular, the particular rate of responding maintained throughout each interval tended to vary from interval to interval.

When the probability of 0.50 was moved to 210 sec (fifth record), the performance became more like that maintained by FI 240-sec (see Fig. 19, 86c, and Record 1 of Fig. 22), with perhaps somewhat more variability in the total output from interval to interval than was maintained by the FI schedule.

DISCUSSION
When the two probabilities of reinforcement in the two-valued schedules were separated by a considerable period of time, the probability of 0.05 at the end of the short interval maintained lower local rates of responding than did the probability of 1.0 at the end of the long interval. The difference between the local rates maintained by the two probabilities became smaller as the two probabilities moved closer in time. The probability of 0.50 at the end of the short interval, on the other hand, maintained about the same local rate of responding as the later probability of 1.0 even when the temporal separation of the two probabilities was large. The findings are consistent with the effects of different probabilities of reinforcement in the VI schedules of Exp. 1 and 3. The exceptions to these generalizations again demonstrate the consistency of individual differences among pigeons. For each of the schedules studied with Pigeon 35, for example, the local rate of responding passed through a maximal value at some time before the end of the 240-sec interval (Fig. 19 and 20), a characteristic of performance not noted for the other pigeons.

To some extent, the performances maintained by the present schedules can be considered simple combinations of the performances separately maintained by the component FI schedules. Compare, for example, Pigeon 69 (Fig. 16) and Pigeon 86 (Fig. 19), whose performances on FI 200-sec and FI 240-sec, respectively, were similar in both absolute level and the temporal patterning of responding. Data for Pigeon 69 show local rates of responding maintained separately by intervals of 30 and 200 sec (Fig. 16, left); data for Pigeon 86 show local rates of responding maintained by almost the same intervals, 30 and 240 sec, in combination (Fig. 19, upper left). The agreement between the two sets of data is fairly good, when it is considered that reinforcement at 30 sec was arranged with a probability of 1.0 for Pigeon 69 and 0.50 for Pigeon 86, and that the FI 30-sec schedule for Pigeon 69 necessarily prohibited responding after 30 sec since reinforcement. Some differences include the lower local rates of responding shortly after reinforcement in the FI 30-sec schedule for Pigeon 69 than in the two-valued schedule for Pigeon 86, and the somewhat larger difference between the terminal rates in the two FI schedules for Pigeon 69 than between the rates at the end of the two intervals in the schedule for Pigeon 86 (cf., however, the later performance of Pigeon 86 in the same schedule: Fig. 23, Exp. 6). Greater disagreement can be found by making the same kind of comparison between the FI data for Pigeon 69 and the corresponding data for Pigeon 85 (Fig. 20, rft/op = 0.50). Pigeon 85 differed from Pigeon 86 primarily in the higher local rates of responding maintained between the two opportunities for reinforcement in the two-valued schedule.

The relevance of the present findings to the analysis in terms of local rates of reinforcement in Exp. 3 lies mainly in their indication of the extensive period of time since reinforcement over which a particular probability of reinforcement can have its effect (cf. Exp. 4), and of the degree to which a later high probability of reinforcement can influence the local rate of responding maintained by an early low probability of reinforcement (e.g., rft/op = 0.05). The calculation of local rates of reinforcement described in Exp. 3 (Fig. 6) contributes little to the analysis of the two-valued schedules. By this calculation, the local rate of reinforcement remains constant at the end of the short interval and increases

about seven-fold at the end of the long interval as the short interval is moved from 30 to 210 sec since reinforcement. This inconsistency with the observed local rates of responding in the schedules again demonstrates the limited applicability of the method of calculating local rates of reinforcement.

EXPERIMENT 6: EFFECTS OF THE OMISSION OF REINFORCEMENT AT THE END OF THE LONG INTERVAL IN TWO-VALUED INTERVAL SCHEDULES

Experiment 5 suggested that the responding at and near the end of the short interval in two-valued interval schedules is maintained not only by reinforcement at the end of the short interval but also by reinforcement at the end of the long interval. The present experiment examined the role of the long interval in two-valued interval schedules by substituting timeout, an event that generally does not serve as a reinforcer, for reinforcement at the end of the long interval.

METHOD

The two-valued FI schedules of Exp. 5 were modified by substituting a 4-sec period of timeout (no key light or house light) for the 4-sec reinforcement at the end of the 240-sec interval. Reinforcement remained available for the first response of each session and at the end of the short interval. Table 5 summarizes the procedure and indicates the duration and relative frequency of the short intervals for each pigeon. When reinforcement was available at 240 sec (conditions 1 and 4 in Table 5), details were the same as in Exp. 5; each session consisted of 21 reinforcements. With timeout substituted for reinforcement at 240 sec (conditions 2 and 3 in Table 5), sessions consisted of 20 intervals: two reinforcements per session when the short interval occurred with a relative frequency of 0.05 (reinforcement of the first response of the session and at the end of the single short interval), and about 11 reinforcements per session when the short interval occurred with a relative frequency of 0.50 (reinforcement of the first response of the session and at the end of about 10 short intervals).

During the first 25 sessions of the second condition, timeout was produced by the first response after the end of the 240-sec interval. Thereafter, timeout occurred independently of responses at the end of the 240-sec interval.

RESULTS

For each pigeon, Fig. 23 shows the local rates of responding maintained by a probability of reinforcement of 0.50 when a response was reinforced with a probability of 1.0 at 240 sec (filled circles) and when timeout occurred at 240 sec (unfilled circles). The schedules with reinforcement at 240 sec maintained performances roughly comparable to those maintained by the equivalent schedules in Fig. 5. One exception, in the performance of Pigeon 86, was that the local rate of responding maintained by the probability of 0.50 at 30 sec was considerably higher than the rate maintained by the higher probability of 1.0 at 240 sec (Fig. 19, 86a).

The substitution of a 4-sec timeout for the 4-sec reinforcement had only small effects on local rates of responding. For all pigeons, the local rate of responding immediately after

Table 5

Sequence of interval schedules for each pigeon. Entries show the relative frequency (rft/op) of the short interval (in sec). The terminal event at 240 sec was either reinforcement of a response (Rft) or a 4-sec timeout (TO).

Pigeon:                    85        86        34        35
Short Interval (sec):      30        30        90        90

Terminal Event (240-sec)   Rft/Op    Rft/Op    Rft/Op    Rft/Op    Sessions
1. Rft                     0.05      0.50      0.05      0.50      44
2. TO                      0.05      0.50      0.05      0.50      57*
3. TO                      0.50      0.05      0.50      0.05      54
4. Rft                     0.50      0.05      0.50      0.05      41

*The 4-sec timeout was response-dependent during the first 25 sessions and response-independent thereafter.
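The session arithmetic in the Method above can be checked with a short calculation. This is a sketch only; the function and its defaults are ours, with the always-reinforced first response of each session counted separately, as in the text:

```python
def expected_reinforcements(n_intervals=20, p_short=0.05, timeout_at_240=True):
    """Expected reinforcements per session in the Exp. 6 procedure (a sketch).

    The first response of each session is always reinforced.  With
    reinforcement at 240 sec, every interval in the session also ends in
    reinforcement; with timeout at 240 sec, only the short intervals do,
    and these occur with relative frequency p_short.
    """
    if timeout_at_240:
        return 1 + p_short * n_intervals
    return 1 + n_intervals


assert expected_reinforcements(p_short=0.05) == 2    # the single short interval
assert expected_reinforcements(p_short=0.50) == 11   # about 10 short intervals
assert expected_reinforcements(timeout_at_240=False) == 21
```

With a relative frequency of 0.05, the reinforced first response plus the single short interval give the two reinforcements per session noted in the Method; with 0.50, about 10 short intervals give about 11.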
366 A. CHARLES CATANIA and G. S. REYNOLDS

Fig. 23. Rate of key-pecking as a function of time since the start of an interval, for four pigeons. In one schedule (filled circles), reinforcement was available at 240 sec. In the other schedule (unfilled circles), a response-independent 4-sec timeout (TO) was presented at 240 sec. In both schedules, reinforcement was available with a probability (rft/op) of 0.50 at either 30 (Pigeons 85 and 86) or 90 (Pigeons 34 and 35) sec after the start of an interval.

timeout (early times in each interval) was higher than the rate in the equivalent schedules with reinforcement at 240 sec. Local rates of responding after the end of the short interval (rft/op = 0.50) were fairly similar in the two types of schedules. One factor that may have contributed to the small effect of substituting timeout for reinforcement was that even with timeout at 240 sec reinforcement occasionally followed 30 or 90 sec later (in the short interval). This does not seem to account for the higher local rates early in intervals, however, because rates of responding immediately after timeout were, for Pigeons 86, 34, and 35, lower than local rates of responding immediately preceding timeout (initial and terminal local rates). An alternate possibility is that timeout did not fully acquire control as a temporal reference point for subsequent responding within intervals, so that the substitution of timeout for reinforcement produced higher local rates of responding early in intervals. In the preceding experiments, the durations of intervals had been timed consistently from a preceding reinforcement. The effects, however, were evident in the performances of Pigeons 85 and 34, for which the data presented are based on over 100 sessions with timeout at 240 sec, as well as in the performances of Pigeons 86 and 35. Finally, since timeout is sometimes followed by relatively high rates of responding in interval schedules, the rise in local rate early in intervals may reflect a direct effect of timeout on subsequent responding (e.g., Ferster, 1958; Neuringer and Chung, 1967).

Figure 24 compares, for each pigeon, the performances maintained with reinforcement at
240 sec (filled circles) and with timeout at 240 sec (unfilled circles) when the probability of reinforcement at the end of the short interval was 0.05. Again, with the exception of the elevated local rate of responding at 30 sec for Pigeon 86, the performances maintained by the schedules with reinforcement at 240 sec were roughly comparable to the equivalent performances in Exp. 5. When timeout was substituted for reinforcement at 240 sec, however, the schedules lost control over the distribution of responses in time. In other words, a relatively low and constant local rate of responding was maintained, independent of the time elapsed in the interval. For Pigeons 85 and 34, the local rate of responding was about equal to the local rate maintained by the probability of 0.05 with reinforcement at 240 sec. For Pigeons 86 and 35, the local rate of responding was considerably lower than the local rate maintained by the probability of 0.05 with reinforcement at 240 sec. The low local rates of responding, however, do not represent a stable and relatively continuous low rate of responding, but rather an average over higher rates of responding alternating irregularly with long periods of no responding. Thus, the probability of 0.05, unlike the probability of 0.50, was less effective in maintaining responding when the higher probability of reinforcement at a later time was removed. In other words, the rate of responding maintained by the probability of 0.05 probably was supported in part by the probability of 1.0 at 240 sec.

DISCUSSION

When reinforcement was eliminated at 240 sec, the temporal pattern of responding was maintained when the earlier probability was
Fig. 24. Rate of key-pecking as a function of time since the start of an interval, for four pigeons. Same as Fig. 23, except that reinforcement at either 30 (Pigeons 85 and 86) or 90 (Pigeons 34 and 35) sec after the start of the interval was available with a probability (rft/op) of 0.05.
0.50, but not when the earlier probability was 0.05. It is possible that the temporal pattern could have been maintained by the probability of 0.05 under different circumstances. The elimination of reinforcement at 240 sec not only changed the stimulus from which intervals were timed, but also, when the earlier probability was 0.05, drastically decreased the overall rate of reinforcement. Responding might have been maintained more consistently at and near the end of the short interval if the probability at 240 sec was gradually rather than abruptly reduced from 1.0 to zero or if, in the absence of reinforcement at 240 sec, a high probability at the end of the short interval (e.g., 0.90) was gradually reduced to 0.50 and then to 0.05. It may also be relevant that the number of intervals and number of reinforcements per session were relatively small, although the present results were obtained over a reasonably large number of sessions (cf. Method).

The finding that reinforcement at the end of a long interval may support the responding maintained by earlier opportunities for reinforcement has implications for the analysis of interval schedules in terms of local rates of reinforcement. The calculation of local rates of reinforcement should not be limited only to the opportunities for reinforcement within a particular schedule. Instead, the local rate of reinforcement at any time since reinforcement should be based on the opportunities that occur over an extended period of time, with the probabilities of reinforcement at the different opportunities weighted as a function of their proximity to the time in question. The period of time over which opportunities for reinforcement contribute to the local rate of reinforcement at a particular time probably should grow as a function of the absolute time since reinforcement. Local rates of reinforcement calculated for successive points of time since reinforcement, therefore, would be a kind of moving weighted average of the probabilities of reinforcement over successive overlapping ranges of time. Such a calculation would take into account some of the properties of the interaction of different probabilities of reinforcement and different times since reinforcement explored in Exp. 4, 5, and 6. But the formulation of the quantitative details, their application to the VI schedules of the earlier experiments, and the coordination of the overall-rate functions from VI schedules with the local rates in FI and two-valued interval schedules are beyond the scope of this paper. In the present research, different pigeons served in different experiments, and the magnitude of the individual differences among pigeons suggests that such an analysis would not stand up well to comparisons across pigeons. For the present, then, the formulation in Exp. 3 must be considered a first approximation with its application limited to many-valued interval schedules.

GENERAL DISCUSSION

The present experiments examined the effects on responding of a variety of characteristics of interval schedules of reinforcement. Experiment 1 established that the overall rate of responding maintained by arithmetic VI schedules was a monotonically increasing, negatively accelerated function of the overall rate of reinforcement (Fig. 1). The local rate of responding at a given time since reinforcement was, correspondingly, a monotonically increasing, negatively accelerated function of the probability of reinforcement at that time (Fig. 3). Experiments 2 and 3 examined VI schedules with different distributions of intervals. Experiment 2 demonstrated that reinforcement of a response immediately after a preceding reinforcement affected responding over only a relatively short period of time since reinforcement. In Exp. 3, two VI schedules with extra short intervals arranged various probabilities of reinforcement at a fixed time early in interreinforcement intervals (Fig. 7 and 8), and a constant-probability VI schedule arranged a fixed probability at various times within interreinforcement intervals (Fig. 11 and 12). These schedules demonstrated that the effect of a given probability of reinforcement could be independent of the time since reinforcement at which it occurred, but another schedule in Exp. 3, the "linear" VI schedule, suggested that the effect of a given probability also depended on its proximity in time to other probabilities. The "linear" VI schedule separated different probabilities widely enough in time to change the relationship between rate of responding and probability of reinforcement (Fig. 9 and 10). These effects, plus data in the literature
on geometric and Fibonacci VI schedules, led to the conclusion that responding is not simply controlled by the probability of reinforcement at a particular time within an interval, but rather by the probability taken over a period of time or, in other words, by the local rate of reinforcement. Limitations on a preliminary formulation of the control by local rates of reinforcement were indicated by Exp. 4, 5, and 6. Experiment 4 examined fixed-interval schedules, Exp. 5 showed in detail the combined effects of two probabilities of reinforcement as a function of their values and their separation in time in two-valued schedules (mixed FI FI), and Exp. 6 demonstrated that reinforcement at the end of the longest interval in a two-valued schedule supported the responding maintained by an earlier opportunity for reinforcement. These experiments suggested that the period of time within which reinforcement could contribute to a particular local rate of responding was large relative to the time since reinforcement. The spread of the effect of reinforcement at one time since reinforcement to local rates of responding at other times could be interpreted in terms of a gradient of temporal generalization. The performance maintained by an FI schedule may reflect such a gradient, but by its nature the FI schedule can provide only one side of such a gradient: up to the time at which reinforcement is made available but not beyond that time. The elimination of reinforcement at the end of the long interval in the two-valued schedules of Exp. 6 might have provided, but in fact did not provide, complete gradients (see the performance of Pigeon 86 in Fig. 19, upper left, for a suggestive example of a gradient of responding around 30 sec since reinforcement in Exp. 5).

Despite the ubiquitous individual differences among pigeons, the monotonically increasing, negatively accelerated form of the input-output function for interval schedules was consistent with many aspects of the data from the several experiments. The form of the function implies that the overall rate of responding maintained by a particular overall rate of reinforcement may be critically determined by the distribution of intervals in a schedule. The overall rate of responding is a weighted average of local rates of responding, and local rates of responding depend on local rates of reinforcement within intervals. If the distribution of intervals in a schedule is changed while the overall rate of reinforcement is held constant, the decrease in the local rate of reinforcement at one time after reinforcement must be accompanied by an increase at some other time after reinforcement. Because the local rate of responding is a negatively accelerated function of the local rate of reinforcement, the decreased local rate of reinforcement at one time will not necessarily be compensated, in rate of responding, by the increased local rate of reinforcement at some other time.

The dependence of overall rate of responding on the distribution of intervals is most easily demonstrated by the comparison of FI and VI schedules, as illustrated in Fig. 25. The FI schedule includes discriminable periods of time during which the local rate of reinforcement, as inferred from performance, is at or near zero (e.g., the responding of Pigeon 34 between 0 and 90 sec in the FI 240-sec schedule: Fig. 18). Such performance, which results in a large proportion of time when low rates of responding occur during each interval, produces an overall rate of pecking lower than that maintained by a schedule that provides no discriminable periods of nonreinforcement (e.g., the constant-probability VI schedule).

The overall rates of responding maintained by VI schedules are higher than those maintained by FI schedules that provide the same overall rate of reinforcement, except at 1800 rft/hr (VI or FI 2-sec), when the schedules approach continuous reinforcement and when responding is more appropriately treated as a series of latencies from reinforcement than as a rate. (A reversal may also occur at very low rates of reinforcement. For example, FI 24-hr may maintain higher rates of responding than VI 24-hr. Cf. Morse, 1966).

It seems reasonable to assume that the random-interval (constant-probability) VI schedule, in which the correlation between probability of reinforcement and time since reinforcement is minimal, and the FI schedule, in which the correlation between probability of reinforcement and time since reinforcement is maximal, represent the full range of overall rates of responding, at each overall rate of reinforcement, that can be maintained by interval schedules of reinforcement. The
random-interval and arithmetic VI data are in fair agreement, suggesting that effects of the distribution of intervals on the overall rate of responding are small provided that opportunities for reinforcement are reasonably closely and uniformly spaced along the continuum of time since reinforcement. On the basis of Fig. 25, it is not possible to say whether or not the VI and FI functions have the same form, and the form of the FI function would in any case depend on the way in which it is a derivative of the more fundamental function relating local rates of responding and local rates of reinforcement.

APPENDIX I: ANALYSIS IN TERMS OF INTERRESPONSE TIMES

A number of accounts of the performances maintained by interval schedules of reinforcement have been concerned with the differential reinforcement of interresponse times, or IRTs (Skinner, 1938; Newman and Anger, 1954; Anger, 1956; Morse, 1966; Shimp, 1967). This section relates the present findings to IRT analyses in a treatment that is an alternative to, but is not necessarily incompatible with, the treatment developed in the main body of the paper.
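Two quantitative points from the preceding discussion can be sketched in code. The first is the moving weighted average of probabilities of reinforcement proposed in the discussion of Exp. 6, with nearer opportunities weighted more heavily and the averaging window growing with absolute time since reinforcement; the second is the General Discussion's argument that, under a negatively accelerated input-output function, redistributing the same overall rate of reinforcement lowers the overall rate of responding. The exponential weighting, the `scale` parameter, and the hyperbolic function with its constants are illustrative assumptions, not forms given in the text:

```python
import math


def local_rate_of_reinforcement(t, opportunities, scale=0.5):
    """Moving weighted average of reinforcement probabilities at time t.

    `opportunities` maps times since reinforcement (sec) to probabilities
    of reinforcement (rft/op).  The weight decays exponentially with
    distance from t, on a timescale that grows in proportion to t, so
    that the averaging window widens as time since reinforcement passes.
    """
    width = max(scale * t, 1.0)
    num = den = 0.0
    for t_op, p in opportunities.items():
        w = math.exp(-abs(t_op - t) / width)
        num += w * p
        den += w
    return num / den


def response_rate(r, k=100.0, c=50.0):
    """An illustrative monotonically increasing, negatively accelerated
    input-output function (hyperbolic; constants arbitrary)."""
    return k * r / (r + c)


# In a two-valued schedule (rft/op = 0.50 at 30 sec, 1.0 at 240 sec), the
# early local rate is dominated by the nearby opportunity, while the late
# local rate is also pulled up slightly by the earlier one.
opps = {30: 0.50, 240: 1.0}
assert abs(local_rate_of_reinforcement(30, opps) - 0.50) < 0.01
assert local_rate_of_reinforcement(240, opps) > local_rate_of_reinforcement(30, opps)

# Redistributing 60 rft/hr from a uniform (VI-like) to a concentrated
# (FI-like) distribution of local rates lowers the average response rate.
uniform = [60.0, 60.0]
concentrated = [0.0, 120.0]
assert sum(map(response_rate, uniform)) > sum(map(response_rate, concentrated))
```

The second comparison restates the point that an FI-like distribution, with discriminable periods of near-zero local rate of reinforcement, supports a lower overall rate of responding than a VI-like distribution at the same overall rate of reinforcement.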

Fig. 25. Rates of pigeons' key-pecking as a function of the rates of reinforcement provided by FI and VI schedules. The present FI data are averaged across four pigeons (Fig. 17). The present VI data are averaged across six pigeons (Fig. 1); only the three VI schedules providing the highest rates of reinforcement were common to all six pigeons, so the average rates of responding on these three schedules were determined first, and the other rates were averaged only after they had been multiplied by a constant to adjust for the differences between birds in absolute levels maintained by the three common schedules. The FI data presented by Schoenfeld and Cumming (1960) come from different groups of two to four pigeons in different experiments (Hearst, 1958; Cumming and Schoenfeld, 1958; Clark, 1959), in all of which intervals were timed from the end of the preceding interval, rather than from the preceding reinforcement, and in which reinforcement, once arranged, was held available only for a time equal to the duration of the FI (limited hold). The FI and VI data from Farmer (1963) are averages across either two or three different pigeons at each point. Farmer arranged random-interval schedules (cf. Discussion, Exp. 3, or Appendix II) in which the probability of reinforcement, P, in each recycling time interval, T, was 1.0 (FI) or a lower value (VI). In each of Farmer's groups, T was constant and P was varied. Data were selected from Farmer's groups so that P was constant and T varied, and therefore the distributions of intervals were comparable within each set of connected points.
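Farmer's random-interval procedure, as described in the caption, can be sketched as a generator of interreinforcement intervals. This is an illustration only: responding and limited holds are ignored, and the function name and seeding are ours. With probability P applied at the end of each recycling T-sec period, the mean interval works out to T/P:

```python
import random


def random_interval_schedule(p, t, n=10, seed=0):
    """Generate n interreinforcement intervals for a random-interval
    schedule: at the end of each recycling t-sec period, reinforcement
    is arranged with probability p (a sketch of Farmer's procedure).
    """
    rng = random.Random(seed)
    intervals, elapsed = [], 0
    while len(intervals) < n:
        elapsed += t
        if rng.random() < p:
            intervals.append(elapsed)
            elapsed = 0
    return intervals


# With P = 1.0 every period ends in reinforcement: a fixed-interval schedule.
assert random_interval_schedule(1.0, 10, n=3) == [10, 10, 10]

# With P < 1.0 the intervals vary, and their mean approaches t / p.
intervals = random_interval_schedule(0.25, 10, n=2000, seed=1)
mean = sum(intervals) / len(intervals)
assert 30 < mean < 50   # expected mean: 10 / 0.25 = 40 sec
```

Because the probability at each period-end is independent of the time since reinforcement, this generator produces the minimal correlation between probability of reinforcement and time since reinforcement discussed above for constant-probability schedules.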
An IRT is the time separating two consecutive responses; the first response initiates the IRT, and the second terminates it (technically, the boundaries of an IRT should be defined in terms of response onsets, but response durations will be assumed negligible for the present purposes). An IRT is said to be reinforced when the response that terminates it is reinforced. For a given IRT, therefore, the probability of reinforcement is the probability that responses will be reinforced when they terminate an IRT of this duration.

Within schedules of reinforcement, IRTs and latencies are sometimes not distinguished, but the distinction may be important. The first response after reinforcement terminates a latency, timed from the end of the reinforcement. This response does not terminate an IRT but, so long as it is not itself reinforced, it initiates the IRT terminated by the next response. In other words, two consecutive responses define the temporal boundaries of an IRT only if the first of the two responses is not reinforced. This logical distinction is consistent with the interpretation of reinforcement as an event that not only maintains responding but also provides a discriminative stimulus for subsequent responding. It may also have a bearing on the special characteristics of the earliest times after reinforcement (see Exp. 2).

The Probability of Reinforcement for an Interresponse Time

Figure 26 illustrates the probabilities with which different IRTs are reinforced in several schedules of reinforcement (cf. Anger, 1956, p. 152; Morse, 1966, p. 69). To emphasize differences among the schedules, the figure shows a considerable range of IRTs; in practice, the left-most portion is usually the most relevant, because the longer IRTs occur relatively infrequently in most schedules.

In an FI schedule, the probability of reinforcement varies linearly with IRT, reaching 1.0 at a duration equal to the fixed interval. Consider, for example, a 50-sec IRT in the FI 100-sec schedule illustrated in Fig. 26. This IRT will be reinforced only if it begins during the last 50 sec of the 100-sec interval, and its probability of reinforcement is therefore 0.5. No IRT can begin 100 or more sec after reinforcement in this schedule, because the response that would initiate such an IRT would necessarily be reinforced.

This calculation of probabilities of reinforcement in an FI schedule assumes that, for any IRT, its distribution of starting times in the interval is uniform or rectangular, or, in other words, the probability that a given IRT will occur is independent of the time since reinforcement. This assumption usually is not satisfied within FI performances; for example, no 50-sec IRT would ever be reinforced if 50-sec IRTs never began after the first 25 sec of the 100-sec interval. The probabilities of reinforcement in Fig. 26, therefore, may be considered relative frequencies only with respect to all possible starting times of each IRT, and not necessarily with respect to actual relative frequencies in a particular FI performance. This observation imposes limitations on the present treatment, as discussed below, and indicates the importance of comparing recorded distributions of all IRTs with recorded distributions of reinforced IRTs; such data are not available for the present experiments.

When the availability of FI reinforcement is limited to a specified period of time (limited hold), the function relating probability of reinforcement to IRTs is altered for all IRTs longer than the limited hold, as illustrated in Fig. 26 by the 20-sec limited hold added to FI 100-sec. In that schedule, any IRT between 20 and 100 sec long will be reinforced only if it begins within a particular 20-sec period of time within the 100-sec interval; for these IRTs, therefore, the probability of reinforcement is 0.20. Any IRT of more than 120 sec cannot end before the limited hold is over and so cannot be reinforced. (The effects of reinforcement available in the next interval, after the limited hold is over, have been ignored in this computation.)

The effect of a limited hold on performance is similar to the effect of a ratio schedule (Ferster and Skinner, 1957; Morse, 1966), and in a ratio schedule as in an interval schedule with limited hold, the probability of reinforcement is constant over a considerable range of IRTs. The variable-ratio and fixed-ratio schedules in Fig. 26 show that when every tenth response on the average is reinforced (VR 10) or when exactly every tenth response is reinforced (FR 10), the probability of reinforcement is 0.10 and is indepen-
dent of IRT (assuming numerical positions of IRTs within the ratio can be ignored).

An FI schedule provides differential reinforcement for long IRTs, in that the probability of reinforcement is higher for long than for short IRTs. Such differential reinforcement is arranged more explicitly in a DRL (differential-reinforcement-of-low-rate) schedule. In the DRL schedule illustrated in Fig. 26, the probability of reinforcement is zero for IRTs shorter than 100 sec and 1.0 for IRTs equal to or longer than 100 sec.

Fig. 26. Probability of reinforcement as a function of interresponse time in several schedules of reinforcement. The upper frame shows the functions for fixed-interval (FI 100-sec), fixed-interval with limited hold (FI 100-sec LH 20-sec), variable-ratio (VR 10) and fixed-ratio (FR 10), and reinforcement of all responses terminating interresponse times that exceed a minimum value (DRL 100-sec). The lower frame shows the functions for three different types of VI schedules: arithmetic, constant-probability, and geometric.

In VI schedules, the relationship between IRTs and their probabilities of reinforcement depends on the distribution of interreinforcement intervals. Figure 26 (bottom frame) shows illustrative functions for three VI schedules with roughly equal mean intervals: an arithmetic VI schedule (11 intervals from 0 to 200 sec, with an additive constant of 20 sec); a geometric VI schedule (10 intervals from 1 to 512 sec, with a multiplicative constant of 2); and a constant-probability VI schedule (in which, at the end of successive 10-sec periods of time since reinforcement, reinforcement is scheduled with a probability, rft/op, of 0.10; cf. Exp. 3 and Appendix II).

The probabilities of reinforcement for IRTs in the arithmetic and geometric schedules in Fig. 26 were calculated by dividing all starting times of an IRT such that the IRT would be reinforced by all possible starting times of the IRT. Consider, for example, a 10-sec IRT in the arithmetic schedule. This IRT cannot occur in the 0-sec interval, in which the first response after reinforcement is reinforced. It will be reinforced if it begins during the last 10 sec of any of the 10 other intervals, from 20 to 200 sec; the sum of all reinforced starting times, therefore, is 100 sec. The IRT can begin at any time within an interval; the sum of all possible starting times, therefore, is the sum of all intervals, or 0 + 20 + 40 + ... + 200 = 1100 sec. Thus, the probability of reinforcement for the 10-sec IRT is 100 sec divided by 1100 sec, or 0.091. Correspondingly, the reinforced starting times of a 10-sec IRT in the geometric schedule consist of the 1-, 2-, 4-, and 8-sec intervals plus the last 10 sec of each of the six longer intervals, or 75 sec; all possible starting times consist of the sum of the intervals, or 1 + 2 + 4 + ... + 512 = 1023 sec. Thus, the probability of reinforcement for the 10-sec IRT is 75 sec divided by 1023 sec, or 0.073.

The constant-probability schedule, as specified, does not consist of a finite number of intervals over which all reinforced starting times and all possible starting times of an IRT can be summed. The probabilities of reinforcement for IRTs, however, can be derived from the probability of reinforcement (rft/op) of 0.10 at the end of each 10-sec period of time since reinforcement. For example, any 10-sec IRT must end at or after the end of one 10-sec period and before the end of a second 10-sec period. Its probability of reinforcement, therefore, is 0.10. On the assumption of a uniform distribution of starting times for each IRT, the probability of reinforcement for all IRTs of less than 10 sec increases linearly with IRT from 0 to 0.10. A 5-sec IRT, for example, can begin during the first 5 sec or the last 5 sec of a given 10-sec period, and its probability of reinforcement, therefore, is 0.50 times 0.10, or 0.05. For IRTs longer than 10 sec, the probability that reinforcement had become available at the end of either of two consecutive 10-sec
periods must be taken into account. For a 20-sec IRT, for example, this probability is 0.19; the probability of 0.10 at the end of the first 10-sec period plus the conditional probability, 0.90 times 0.10, at the end of the second 10-sec period. Again, the probabilities of reinforcement increase linearly, from 0.10 for a 10-sec IRT to 0.19 for a 20-sec IRT. (If a 10-sec limited hold were added to the schedule, as in the random-interval schedules of Farmer, 1963, the probability of reinforcement would remain constant at 0.10 for all IRTs longer than 10 sec).

A comparable procedure for calculating probabilities of reinforcement for IRTs could have been used for the arithmetic and geometric schedules, but would have been more complicated because of the different probabilities of reinforcement (rft/op) at each opportunity in those schedules. The VI functions in Fig. 26 appear to be smooth curves, but each is actually made up of linear segments. Changes in slope occur at IRTs equal to the durations of the intervals in a given schedule.

For IRTs up to about 40 sec, the probabilities of reinforcement in Fig. 26 are slightly higher in the constant-probability schedule than in the arithmetic schedule. For longer IRTs, the probabilities become higher in the arithmetic than in the constant-probability schedule. The probabilities in the geometric schedule are consistently the lowest. As mentioned above, the shorter IRTs are most significant in analyzing performance because the longer IRTs occur relatively infrequently.

The comparisons in Fig. 26 may imply that overall rates of responding maintained by a given overall rate of VI reinforcement should be slightly higher in constant-probability schedules than in arithmetic schedules and lowest in geometric schedules, but they neglect the possible role of different starting times of IRTs. Thus, they do not contribute to an account of how local rates of responding increase with time since reinforcement in arithmetic schedules, decrease in geometric schedules, and remain roughly constant in constant-probability schedules (Exp. 3).

The Probability of Reinforcement for Interresponse Times as a Function of Time Since Reinforcement

For an IRT that begins within a particular period of time since reinforcement in an interval schedule, its probability of reinforcement is given by two independent probabilities: the probability that the IRT will end at or after an opportunity for reinforcement and the probability of reinforcement at that opportunity (rft/op). The former can be calculated on the assumption of a uniform distribution of starting times of the IRT within the period of time considered; the latter can be calculated from the distribution of intervals in the schedule.

With time since reinforcement as a parameter, Fig. 27 shows the probability of reinforcement for IRTs in the three VI schedules and the FI schedule of Fig. 26. The periods of time represented for the arithmetic and geometric VI schedules are those between successive opportunities for reinforcement.

In the constant-probability VI schedule, the probability of reinforcement at each opportunity is, by definition, independent of whether reinforcement became available at the previous opportunity. The probabilities of reinforcement for IRTs, therefore, also remain independent of time since reinforcement. (For the constant-probability schedule illustrated, this is true as long as the probabilities are calculated over periods of at least 10 sec. For example, if the first and second 5 sec after reinforcement were considered separately, the function for the first 5 sec would be displaced to the right in Fig. 27, to begin at an IRT of 5 sec; the function for the second 5 sec would be the same as that in the figure.)

Within each period of time in the arithmetic schedule, the probability of reinforcement increases linearly from 0 to 1.0 with increasing IRT. The later the time since reinforcement, the steeper the function. In other words, for a given IRT (vertical cut through the functions), the probability of reinforcement increases as time passes since reinforcement. Or, for a given probability of reinforcement (horizontal cut through the functions), the IRT reinforced with that probability becomes shorter as time passes since reinforcement.

The broken line superimposed on the arithmetic-schedule functions shows the effect of adding three extra 20-sec intervals to the schedule. From 0 to 20 sec after reinforcement, the probabilities of reinforcement for IRTs up to 20 sec long become almost as high
374 A. CHARLES CATANIA and G. S. REYNOLDS

as the later probabilities of reinforcement for these IRTs within the period from 140 to 160 sec after reinforcement.

Fig. 27. Probability of reinforcement as a function of interresponse time, with periods of time since reinforcement as a parameter, in four different interval schedules: constant-probability, arithmetic, and geometric VI schedules and an FI schedule. The dotted line in the second frame shows the effect of adding extra short intervals to the arithmetic VI schedule. Details in text.

In the geometric schedule, the functions for most periods of time since reinforcement are concave downward, and are lower the longer the time since reinforcement (the function for 256-to-512 sec, omitted from Fig. 27, is a straight line; up to an IRT of 128 sec, it corresponds to the function for 32-to-64 sec). In general, the probability of reinforcement for a given IRT decreases as time passes since reinforcement, or the IRT reinforced with a given probability becomes longer as time passes since reinforcement.

For successive 20-sec periods of time since reinforcement in an FI 100-sec schedule, the functions are linear and parallel. Within the first 20 sec after reinforcement, for example, the probability of reinforcement is 0 for all IRTs of less than 80 sec. The probability then rises linearly to 1.0 for an IRT of 100 sec (cf. Morse, 1966, Fig. 3 and 4, pp. 70-71).

The schedules in Fig. 27 are illustrative, but the directions of change in the probabilities of reinforcement for IRTs as time passes since reinforcement are characteristic of the four classes of interval schedules. The choice of the periods of time over which probabilities were calculated was based in part on ease of computation, but also the further subdivision of periods of time within the arithmetic and geometric schedules would have produced functions in which the probability of reinforcement was zero for some range of IRTs. In the geometric schedule, for example, the subdivision of 128-to-256 sec into two equal periods would have produced a function in which, for 128-to-192 sec, the probability of reinforcement was zero for all IRTs less than 64 sec. This schedule, however, would probably maintain responding at a moderate rate throughout this period of time.

The problem of choosing periods of time over which probabilities of reinforcement can be calculated for IRTs is analogous to the problem of choosing periods of time within which to calculate local rates of reinforcement (Discussions, Exp. 3 and 6). The similarity of the two problems is illustrated by their common concern with the earliest times since reinforcement, but the treatment in terms of IRTs has the advantage that the difficulty can be related to an observed discrepancy between assumptions and data. Probabilities of reinforcement for IRTs are based on the assumption of a uniform distribution of starting times for each IRT, but local rates of responding, and therefore IRTs, are changing most rapidly during the earliest times after reinforcement (see Exp. 2 and 3). Thus, the difference between probabilities of reinforcement for IRTs, in Fig. 27, and recorded relative frequencies of reinforced IRTs, from a performance, is likely to be greatest at the earliest times since reinforcement (e.g., 0-to-20 sec in the arithmetic schedule in Fig. 27). Another advantage of the treatment in terms of IRTs is that no assumption is made that the rate of responding at


one opportunity is high enough to produce an available reinforcement before the time is reached for the next opportunity (see Exp. 3).

In any case, if IRTs become shorter and rates of responding increase as the probabilities of reinforcement for IRTs increase, then the functions in Fig. 27 agree in a general way with the performances maintained by each schedule (Exp. 3): in a constant-probability VI schedule, local rate of responding remains roughly constant over time since reinforcement; in an arithmetic VI schedule, local rate of responding increases but the addition of extra short intervals produces a relatively high local rate shortly after reinforcement; in a geometric VI schedule, local rate of responding decreases; and in an FI schedule, local rate of responding increases and very long IRTs often occur early in the interval.

The Relationship Between Observed Rates of Responding and the Probabilities of Reinforcement for Interresponse Times

Interresponse times and rate of responding are reciprocally related. For each overall and local rate of responding, there is a corresponding average IRT. The relationship between overall and local rates of responding, therefore, can be expressed in terms of average IRTs. In Exp. 3, overall rates of responding maintained by different overall rates of reinforcement in arithmetic VI schedules were compared with local rates of responding maintained by different local rates of reinforcement within intervals of various VI schedules (Fig. 15). Figure 28 illustrates three procedures for making this comparison, with the data from Pigeon 118 (Fig. 1, Exp. 1) as an example.

Fig. 28. Data from Pigeon 118 (Fig. 1) are replotted to illustrate three techniques for estimating local rates of responding in VI schedules. Dotted lines show estimations from 20 and 100 reinforcements per hour (rft/hr). Details in text.

The top frame of Fig. 28 shows the procedure used to derive the open circles in Fig. 15, Exp. 3. The data, average overall rates of responding obtained at each overall rate of reinforcement, were connected by straight lines, the function was extrapolated linearly to zero, and rates of responding corresponding to particular rates of reinforcement were read directly from the graph, as illustrated. The middle frame shows the same procedure, except that the ordinate has been converted from rate of responding to average IRT. Although average IRTs corresponding to different rates of reinforcement can be read directly from the graph, the abscissa does not lend itself to comparison with the probability-of-reinforcement functions for IRTs in Fig. 27.
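The first two procedures of Fig. 28 amount to linear interpolation on the rate-of-reinforcement axis, with extrapolation to the origin, followed by a reciprocal conversion of the ordinate from rate of responding to average IRT. A minimal sketch of that arithmetic (the data values here are hypothetical placeholders, not Pigeon 118's actual rates, which must be read from the figure itself):

```python
# Sketch of the top- and middle-frame estimation procedures of Fig. 28.
# The data points below are hypothetical, for illustration only.

def interpolate(x, xs, ys):
    """Linear interpolation between data points, extrapolated linearly
    to zero as in the top frame of Fig. 28."""
    pts = sorted(zip([0.0] + xs, [0.0] + ys))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("outside the range of the data")

# Hypothetical data: overall rate of reinforcement (rft/hr) vs.
# overall rate of responding (responses/min).
rft_per_hr = [20.0, 40.0, 80.0, 160.0]
resp_per_min = [40.0, 55.0, 70.0, 80.0]

# Top frame: read the rate of responding at a chosen rate of reinforcement.
rate = interpolate(100.0, rft_per_hr, resp_per_min)

# Middle frame: the same reading with the ordinate converted to the
# average IRT in seconds, the reciprocal of the rate of responding.
avg_irt = 60.0 / rate
```

Reading the middle frame of Fig. 28 corresponds to interpolating and then taking the reciprocal, as in the last line.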
The bottom frame of Fig. 28 illustrates a procedure that makes this comparison possible. Each of the straight lines, with overall rate of reinforcement (rft/hr) as a parameter, represents the probabilities of reinforcement for IRTs within a particular arithmetic VI schedule. The figure represents the schedules for Pigeon 118 in Exp. 1 (heavy lines) and a sample of other schedules (light lines). The functions are linear because longer IRTs are excluded; specifically, none of the functions extends beyond an IRT equal to the shortest non-0-sec interval in that VI schedule. A change in the overall rate of reinforcement provided by the schedule produces a corresponding change in the slope of the function. For example, when the overall rate of reinforcement is doubled (e.g., 20 to 40 rft/hr), the slope doubles.

The data for Pigeon 118 are plotted as intersections of the probability-of-reinforcement function for a given schedule and the average IRT maintained by that schedule. When the data are connected by straight lines, the average IRT maintained by other arithmetic VI schedules can be read from the graph, as illustrated for schedules providing 20 and 100 rft/hr. In addition, the average IRT at a particular time since reinforcement in a given schedule can be compared with the average IRT maintained by a given overall rate of reinforcement by superimposing the data function for Pigeon 118 in the lower frame of Fig. 28 on the appropriate probability-of-reinforcement functions, such as those in Fig. 27, for different times since reinforcement in a particular schedule.

The results of such a procedure, with respect to both general relationships and individual differences, do not differ much from the other two procedures, after average IRTs are converted to rates of responding. The major deviations are in those instances in which it is necessary to extrapolate beyond the range of the overall rates of reinforcement arranged for a given pigeon (cf. Pigeon 278, Fig. 15). In other words, the deficiencies of the analysis in terms of local rates of reinforcement (e.g., the problem of the earliest times after reinforcement) are not eliminated when the analysis is carried out in terms of the probabilities of reinforcement for IRTs. Nevertheless, the extent to which the deficiencies in the IRT analysis can be related to oversimplifications in the underlying assumptions (in particular, that of a uniform distribution of starting times of IRTs) suggests that the IRT analysis can serve as a useful and perhaps a preferable alternative to the analysis in terms of local rates of reinforcement. In addition, data are available on the form of the distribution of IRTs maintained by VI schedules (e.g., Anger, 1956; Farmer, 1963), and, although the average IRT does not provide information about the shape of the IRT distribution, the relationship between rates of responding and IRT distributions is more likely to be clarified if both are expressed in the same dimension, as IRTs.

Implications of Interresponse-Time Analyses

The data function in the bottom frame of Fig. 28 shows that the average IRT decreases as the slope of the probability-of-reinforcement function increases. The average IRT, however, does not change in such a way that its probability of reinforcement remains constant (Morse and Herrnstein, 1955); the probability of reinforcement increases as the average IRT decreases (e.g., from the point on the 33-rft/hr function in Fig. 28 to that on the 79-rft/hr function). It appears that the function, if extrapolated, would asymptotically approach a probability of 0 with increasing IRT, and would reach a finite IRT at a probability of 1.0 (paradoxically, this probability corresponds to a schedule of continuous reinforcement, in which, according to the present account, latencies but no IRTs can occur).

The higher probabilities of reinforcement at which shorter average IRTs are maintained suggest some sort of reciprocal relationship between reinforcement and IRTs. Reinforcement increases responding, and therefore shortens IRTs. Other effects, perhaps including effort and fatigue, decrease responding and therefore lengthen IRTs. Such an account has considerable precedent: in balancing these two opposing factors, the pigeon appears to compromise between obeying the Law of Effect and obeying the Law of Least Effort.

A number of factors, however, may operate to shorten or lengthen IRTs. To speak simply of a reinforced IRT is convenient, but reinforcement has several effects, and some may be antagonistic. The fundamental effect
of reinforcement, and its defining characteristic, is that it enhances the organism's tendency to emit the reinforced response. Reinforcement of responses, regardless of their associated IRTs, tends to shorten IRTs (such phenomena as the high frequencies of short IRTs in DRL performances, e.g., Staddon, 1965, may be an example of this effect of reinforcement).

The tendency for IRTs to shorten even in schedules of reinforcement in which long IRTs are differentially reinforced (e.g., VI schedules: Anger, 1956) has prompted the suggestion that short IRTs are more susceptible to reinforcement than long IRTs (Millenson, 1966). This view seems contradicted by the fact that, in Fig. 28, shorter average IRTs are maintained only at higher probabilities of reinforcement. An alternative account of the tendency for IRTs to shorten is based on another characteristic of reinforcement, its effectiveness even after a delay (cf. Dews, 1960). For example, consider the effect of reinforcement of a 10-sec IRT, and compare it with the effect of the reinforcement of the last of five 2-sec IRTs. In both cases, a response at the end of a 10-sec period of time is followed by immediate reinforcement. In the former case, however, one other response is reinforced incidentally with a delay of 10 sec, whereas in the latter case, five other responses are reinforced incidentally with delays of 10, 8, 6, 4, and 2 sec. It seems reasonable to assume that the reinforcement of more responses within a fixed period of time, even though the reinforcement of some responses is both delayed and incidental, will be more likely to increase subsequent responding or, in other words, to shorten IRTs. (This characteristic of reinforcement may be relevant to the development of short IRTs in other schedules that do not differentially reinforce short IRTs, such as concurrent DRL schedules: Malott and Cumming, 1964, 1966; ratio schedules: Millenson, 1966; and stochastic schedules: Weiss and Laties, 1964; Blough, 1966).

Another factor that may tend to shorten IRTs is that long IRTs can produce decreases in the overall rate of reinforcement. When reinforcements are made available during long IRTs, the long IRTs add to the minimum interreinforcement interval. In most interval schedules, however, this factor is usually negligible (except perhaps during acquisition), because the difference between the minimum and the actual interreinforcement intervals is likely to be small.

The lengthening of IRTs may depend on such potential factors as effort or fatigue, but reinforcement itself may also contribute. In interval schedules, reinforcement favors long IRTs because the probability of reinforcement increases with IRT. To the extent that the temporal spacing of responses comes under the control of differential reinforcement, IRTs lengthen and, as a consequence, the number of responses per reinforcement also decreases. The complication is that when reinforced IRTs are long, the tendency of reinforcement to shorten IRTs antagonizes the effects of the differential reinforcement of long IRTs. (This complication arises more obviously in DRL performances, in which IRTs too short for reinforcement sometimes preponderate.)

The final interval-schedule performance may emerge as a compromise between antagonistic effects of reinforcement. Reinforcement tends to shorten IRTs, directly and perhaps through an effect of delay of reinforcement. It also tends to lengthen IRTs, through the control produced by the higher probability of reinforcement for long IRTs. As IRTs become longer or shorter, one or the other effect of reinforcement may predominate, but the interaction that comes about because IRTs and their probabilities of reinforcement covary in interval schedules generates a balance between the effects that is reflected in the average IRT maintained for a given pigeon at a given time within a given schedule.

This analysis of the performances maintained by interval schedules of reinforcement treats them in terms of a process: the interaction of IRTs and their probabilities of reinforcement as a function of time since reinforcement. The analysis emphasizes the variables that come into direct contact with behavior, rather than the variables specified in the arrangement of schedules (cf. Schoenfeld, Cumming, and Hearst, 1956). According to this analysis, the power of positive reinforcement lies in its capacity to control not only the occurrence of responses, but also their temporal relationship to other responses and to events such as reinforcement. These temporal constraints, imposed on performance because the differential reinforcement of IRTs is different within each schedule and at different times within the same schedule, may bear on the relative insensitivity of interval-schedule performances to some variables (e.g., magnitude of reinforcement: Catania, 1963b). Sensitivity to such variables is more likely to be obtained with nontemporal measures of performance (e.g., the proportion of changeovers from one interval schedule to another when the schedules operate concurrently: Catania, 1966).

Schedules can be designed either to minimize or to maximize temporal constraints (cf. "synthetic" VI schedules, Newman and Anger, 1954; stochastic reinforcement of IRTs, Weiss and Laties, 1964; reinforcement of "least-frequent" IRTs, Blough, 1966), but eliminating or establishing constraints with respect to some variables is bound to affect constraints with respect to others. In other words, no particular schedule of reinforcement manipulates "response strength"; rather, it controls a particular sample of the various properties of responding.

APPENDIX II: CONSTANT-PROBABILITY VARIABLE-INTERVAL SCHEDULES

A constant-probability VI schedule is one with a minimal correlation between probability of reinforcement and the time since the last reinforcement. In other words, a constant-probability VI schedule provides that time since reinforcement cannot acquire discriminative control over responding through its relationship to the availability of subsequent reinforcement. This condition may be a prerequisite for local rates of responding that do not change with the passage of time since reinforcement. The condition is obviously not satisfied by an FI schedule, which makes reinforcement available at the same time in every interval; it is also not satisfied by a variety of standard VI schedules, including the arithmetic and the geometric (Exp. 3). The present section considers the design of constant-probability VI schedules, a problem significant for the technology of behavior because a constant rate of responding provides a useful baseline against which to assess the effects of many variables.

Two methods of designing constant-probability VI schedules will be considered. One, illustrated by the random-interval schedules of Farmer (1963) and Millenson (1963) and by the constant-probability schedule of Exp. 3, holds constant the separation in time of successive opportunities for reinforcement while varying the relative frequencies of different intervals. The other, illustrated by the schedules of Fleshler and Hoffman (1962: see Exp. 3, Discussion) and by a modified schedule described below, holds constant the relative frequencies of the different intervals while varying the separation in time of successive opportunities for reinforcement.

The random-interval schedules of Farmer and Millenson arranged a constant, recycling time interval, T. Within each T-sec interval, the first response was reinforced with a probability, P, corresponding to the statistic, reinforcements per opportunity. The timing of the T-sec intervals was not interrupted during reinforcement, so that a 0-sec interval (reinforcement of the first response after a reinforcement) was possible if T was less than the duration of reinforcement. As arranged by Farmer, the schedules also included a limited hold: a reinforcement made available within one T-sec interval was not kept available beyond the end of that interval.

Farmer studied a range of T from 1 to 60 sec, and a range of P from 0.0052 to 1.0 (when P equaled 1.0, these schedules corresponded to FI schedules). Cumulative records showed that rate of responding was roughly constant over time since reinforcement at only some combinations of T and P. The deviations can be attributed to at least three factors: the limited hold, particularly when T equaled 1 sec; the time to the first opportunity for reinforcement when T was large (30 or 60 sec), which produced long pauses after reinforcement (cf. constant-probability VI 379-sec for Pigeon 278 in Fig. 11, Exp. 3); and, trivially, the FI character of the schedules when P equaled 1.0.

Millenson chose 4 sec as an optimal value of T, and arranged schedules with P equal to 0.0667 and 0.0183. In cumulative records, local rates of responding appeared roughly constant over time since reinforcement, although one of three pigeons showed systematically low rates of responding for some time after reinforcement when P equaled 0.0667, and
all three pigeons showed cyclic short-term alternations between high and low rates when P equaled 0.0183.

The average rate of reinforcement in a random-interval schedule equals P/T. As P and T become small (see Millenson, 1963), the distribution of interreinforcement intervals approaches the exponential distribution:

f(t) = (e^(-t/t̄)/t̄) dt,

where t is the duration of an interval, f(t) is the relative frequency of the interval, t̄ is the mean interval, and e is the base of natural logarithms. The relative frequencies of the discrete intervals in the following distribution with T equal to 10 sec and P equal to 0.10 provide one approximation to the continuous distribution described by the equation (intervals are shown in parentheses): 0.100 (10-sec), 0.090 (20-sec), 0.081 (30-sec), 0.073 (40-sec), 0.066 (50-sec), 0.059 (60-sec), 0.053 (70-sec), 0.048 (80-sec), 0.043 (90-sec), 0.039 (100-sec), and so on. In this sequence, the exact relative frequency of tn, the nth interval with intervals ranked in order of duration, equals P(1 - P)^(n-1).

The constant-probability schedule of Exp. 3 provided a distribution of intervals similar to the distribution with P equal to 0.10 in the random-interval schedules of Farmer and Millenson. Each interval was an integral multiple of the minimum interval, t1 (see Table 2 and Fig. 11, Exp. 3). The schedule differed from the random-interval schedules in that the sequence of intervals within each session was predetermined by a punched tape. Consequently, the schedule specified a longest interval at the end of which the probability of reinforcement necessarily became 1.0.

Both the random and predetermined methods of arranging constant-probability schedules have certain advantages. In a random-interval schedule, an interval of any multiple of T-sec is possible, though less likely the larger the multiple. The probability of reinforcement never becomes 1.0 except in the ex post facto sense that there will always have been a longest interval when the relative frequencies of different intervals are tabulated at the end of a particular session.

Under some circumstances, however, a predetermined sequence of intervals may be preferable to a randomly generated sequence. A random generator will occasionally (and unpredictably) produce a long, locally regular sequence, which, through a temporary local effect on the rate of reinforcement or on the correlation between reinforcements and time since reinforcement, may have significant effects on performance, especially if it occurs in the early stages of acquisition. A predetermined sequence (for example, in the form of a loop of punched tape) not only avoids this possibility, but also may simplify data collection because the experimenter can predict the number of times the organism will reach various times since reinforcement within a particular experimental session.

A satisfactory sequence of intervals in which each interval is an integral multiple of the minimum interval, however, is necessarily long (e.g., the 60-interval sequence in Table 2). With the usual VI programmer (e.g., Ralph Gerbrands Co.), such a sequence requires excessively long tapes and produces the problem of tape breakage and tangling. A desirable sequence for many applications, therefore, would be short and yet would retain the basic characteristics of a constant-probability VI schedule.

The method proposed by Fleshler and Hoffman (1962) for generating a sequence of intervals roughly satisfies these requirements. Their progression of intervals is described by the equation:

tn = t̄ [1 + ln N + (N - n) ln (N - n) - (N - n + 1) ln (N - n + 1)],

where tn and t̄ are, again, the durations of the nth and the mean intervals respectively, N is the total number of intervals, and ln represents the natural (base e) logarithm. The equation is derived from the exponential distribution (cf. Discussion, Exp. 3). In effect, as the probability of reinforcement increases from one opportunity to the next (Fig. 14), the temporal separation of successive opportunities increases in such a way that the probability of reinforcement per unit of time (in other words, the local rate of reinforcement) remains roughly constant. In discussing the problem that this distribution provides reinforcement at discrete points in time and the probability of reinforcement at other times is zero, Fleshler and Hoffman state:

"This difficulty would be insurmountable if organisms had perfect temporal
discrimination. The fact that they do not means that the effects of rf [reinforcement] at a given point in time will spread to nearby points in time (at least within the difference limen). If the differences between successive terms in the progression were sufficiently small so that within the schedule context, discrimination between these terms were poor, the effective probability distribution would be continuous and would approximate the theoretical distribution" (Fleshler and Hoffman, 1962, p. 530).

This schedule, as arranged by Chorney (1960), maintained a relatively constant local rate of responding over most of the range of time since reinforcement, thus supporting the assumption that this and the preceding constant-probability VI schedules are equivalent and demonstrating the importance of the separation of different opportunities for reinforcement along the continuum of time since reinforcement. Chorney's finding of a higher rate of responding shortly after reinforcement than later within intervals, however, prompts a detailed examination of the early terms of the progression.

A sample sequence of intervals from the progression is the following (20 intervals, mean = 100 sec): 2.5, 7.7, 13.5, 19.3, 25.5, 32.2, 39.3, 47.0, 55.5, 64.5, 74.5, 85.6, 98.2, 112.5, 129.2, 149.4, 174.6, 208.6, 260.9, and 399.6 sec. Local rates of reinforcement are approximately one reinforcement per 100 sec at all opportunities except the first (2.5 sec, at which local rate of reinforcement is about 35% higher) and the last (399.6 sec, at which local rate of reinforcement is about 30% lower). The relatively high, early local rate of reinforcement might account for Chorney's finding, but this deviation alone cannot be taken too seriously because the computation of local rates of reinforcement is most arbitrary at early and late times after reinforcement.

The progression can also be evaluated in terms of interresponse times. Consider a 2.5-sec IRT that begins either within the first 2.5 sec after reinforcement or between 2.5 and 7.7 sec. If the IRT begins within 2.5 sec, its probability of reinforcement is 0.050 because it must end at least 2.5 sec after reinforcement and because reinforcement is arranged at this time in one of every 20 intervals. If this IRT begins between 2.5 and 7.7 sec, however, its probability of reinforcement is 0.053 (reinforcement at 7.7 sec in one of 19 intervals) multiplied by the probability of 0.48 that the IRT will end after 7.7 sec (see Appendix I). This probability equals 0.0252, or about half the earlier probability, and the probability remains at roughly this value through most of the remaining time since reinforcement. In light of Chorney's findings, therefore, its higher value shortly after reinforcement may be significant. The modification suggested below provides a progression similar to Fleshler and Hoffman's, but takes into account the probabilities of reinforcement for IRTs at different times since reinforcement.

In a sequence of intervals such that all intervals occur with the same relative frequency, the probability of reinforcement at the end of a given interval, tn, is given by the reciprocal of the number of intervals equal to or greater than tn; that is to say, reinforcements per opportunity grows with time since reinforcement as the reciprocal of the number of intervals ending at or after the given time since reinforcement. But given that a particular IRT is shorter than the time between successive opportunities, the probability of reinforcement of the IRT is directly proportional to reinforcements per opportunity (see Appendix I). Thus, to hold constant the probability of reinforcement for a given IRT, the increments in the durations of successive intervals in the progression must grow directly with reinforcements per opportunity.

For example, in a progression of 20 intervals, the probability of reinforcement at the end of the shortest interval, t1, is 0.050, and so the probability of reinforcement for an IRT of t1 sec is also 0.050. But the probability of reinforcement (rft/op) at t2, at the end of the next shortest interval, is 0.053 (1/19). The duration of the increment added to t1, therefore, must be 1.053 (20/19) times t1 if the probability of reinforcement for this IRT at t2 is to be held equal to 0.050. Correspondingly, the increment added to t2 must be 1.056 (19/18) times t1, and so on. A general progression that satisfies these requirements and so holds constant the probability of reinforcement for any IRT less than or equal to t1 sec is:
n _
t relatively low immediately after reinforce-
tn = Z (N+1)-i' ment at the end of a short interval. Only in-
formal data on the effects of the order of in-
where the symbols are the same as those in tervals are available. One schedule (colloqui-
the Fleshler and Hoffman equation. The fol- ally, the golden tape) was developed over a
lowing sequence of 20 intervals with a mean period of years by several investigators at the
of 100 sec is an example of the progression: Harvard Pigeon Laboratories. One feature
5.0, 10.3, 15.8, 21.7, 28.0, 34.6, 41.8, 49.4, 57.8, that persisted among several variations in the
66.9, 76.9, 88.0, 100.5, 114.8, 131.4, 151.4, schedule was that the two shortest intervals
176.4, 209.8, 259.8, and 359.8 sec. This progres- were separated by exactly two of the inter-
sion, with a shortest interval longer than that mediate intervals. One version of the sched-
in the Fleshler and Hoffman progression, is ule is the following (15 intervals, mean of 180
likely to generate relatively lower rates of re- sec): 560, 60, 220, 5, 140, 120, 5, 260, 500, 60,
sponding shortly after reinforcement than 300, 20, 60, 350, and 140 sec. The order of the
were observed in Chorney's experiment. intervals was assumed to contribute to the
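As an arithmetic check, the progression can be generated directly from this equation. The sketch below (Python; the function name is ours) reproduces the 20-interval, 100-sec example, matching the listed values to within rounding:

```python
# t_n = t * sum_{i=1}^{n} 1/(N + 1 - i): the n-th interval ending of a
# constant-probability VI sequence with mean interval t and N intervals.
def progression(t, N):
    intervals, total = [], 0.0
    for i in range(1, N + 1):
        total += t / (N + 1 - i)
        intervals.append(total)
    return intervals

seq = progression(100.0, 20)
print([round(x, 1) for x in seq])  # 5.0, 10.3, 15.8, ... 259.8, 359.8
```

The mean of the computed sequence is exactly t (here 100 sec): the reciprocal term 1/(N + 1 - i) appears in N + 1 - i of the partial sums, so each term contributes t to the grand total of tN.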
With the qualification of a uniform distribution of starting times for a particular IRT (see Appendix I), the sequence holds exactly constant the probability of reinforcement for any IRT less than or equal to the duration of the shortest interval. The sequence also incidentally holds local rates of reinforcement roughly constant at one reinforcement per 100 sec, with the sole exception of the last opportunity (end of the longest interval), when the local rate of reinforcement is higher.
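That near-constancy can be verified under a deliberately simplified definition of local rate: the probability of reinforcement at the k-th opportunity (1/(N - k + 1), since k - 1 of the N intervals have already elapsed) divided by the time since the preceding opportunity. The sketch below (Python; all names ours) shows that this quotient equals 1/t at every opportunity; the elevated value at the last opportunity arises only under the fuller proximity-weighted measure used in the text.

```python
# Simplified local rates of reinforcement for the constant-probability
# progression with mean t = 100 sec and N = 20 intervals.
N, t = 20, 100.0

# Interval endings t_1 .. t_N from t_n = t * sum_{i=1}^{n} 1/(N + 1 - i).
endings, total = [], 0.0
for i in range(1, N + 1):
    total += t / (N + 1 - i)
    endings.append(total)

rates, prev = [], 0.0
for k, end in enumerate(endings, start=1):
    p = 1.0 / (N - k + 1)            # probability of reinforcement at opportunity k
    rates.append(p / (end - prev))   # simplified local rate, reinforcements per sec
    prev = end

# Under this simplified measure every opportunity, including the last,
# comes out at 1/t = 0.01 reinforcements per sec (36 per hour).
```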
In practice, the performances maintained by the Fleshler and Hoffman schedule and by the present modification would probably not be appreciably different at any but the shortest times after reinforcement. Both schedules, as short sequences with the approximate characteristics of constant-probability VI schedules, have been in laboratory use and, on inspection of cumulative records, appear to maintain fairly constant local rates of responding over most of the range of time since reinforcement.

Additional study may suggest further refinement of the schedules. For example, if the earliest times after reinforcement have special characteristics, it may be desirable to find out whether a 0-sec interval could be included in the sequence without excessively raising the local rate of responding shortly after reinforcement (see Exp. 2).
The above progressions provide no information about the maximally effective order of the intervals that make up a constant-probability VI schedule. Sequential properties can produce systematic local changes in the rate of responding. For example, if short intervals are always followed by long intervals, the local rate of responding may become relatively low immediately after reinforcement at the end of a short interval. Only informal data on the effects of the order of intervals are available. One schedule (colloquially, the golden tape) was developed over a period of years by several investigators at the Harvard Pigeon Laboratories. One feature that persisted among several variations in the schedule was that the two shortest intervals were separated by exactly two of the intermediate intervals. One version of the schedule is the following (15 intervals, mean of 180 sec): 560, 60, 220, 5, 140, 120, 5, 260, 500, 60, 300, 20, 60, 350, and 140 sec. The order of the intervals was assumed to contribute to the schedule's success in maintaining roughly constant local rates of responding with only minor sequential effects (as observed in cumulative records). It is interesting to note that this schedule and a similar one designed by Anger (1956) had the property that local rates of reinforcement at successive opportunities, though not constant, varied considerably less than in arithmetic or geometric VI schedules.
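The stated structural feature of the tape can be confirmed directly from the listed version (Python; variable names ours):

```python
# One version of the "golden tape" VI schedule, as listed above (sec).
golden_tape = [560, 60, 220, 5, 140, 120, 5, 260, 500, 60, 300, 20, 60, 350, 140]

shortest = min(golden_tape)                                   # 5 sec
positions = [i for i, v in enumerate(golden_tape) if v == shortest]
gap = positions[1] - positions[0] - 1   # intervals between the two 5-sec entries

# The two shortest intervals are separated by exactly two intermediate intervals.
print(positions, gap)  # [3, 6] 2
```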
REFERENCES

Anger, D. The dependence of interresponse times upon the relative reinforcement of different interresponse times. Journal of Experimental Psychology, 1956, 52, 145-161.
Autor, S. M. The strength of conditioned reinforcers as a function of frequency and probability of reinforcement. Unpublished doctoral dissertation, Harvard University, 1960.
Blough, D. S. The reinforcement of least-frequent interresponse times. Journal of the Experimental Analysis of Behavior, 1966, 9, 581-592.
Catania, A. C. Concurrent performances: Reinforcement interaction and response independence. Journal of the Experimental Analysis of Behavior, 1963, 6, 253-263. (a)
Catania, A. C. Concurrent performances: A baseline for the study of reinforcement magnitude. Journal of the Experimental Analysis of Behavior, 1963, 6, 299-300. (b)
Catania, A. C. Concurrent operants. In W. K. Honig (Ed.), Operant behavior: areas of research and application. New York: Appleton-Century-Crofts, 1966. Pp. 213-270.
Catania, A. C. and Reynolds, G. S. A quantitative analysis of the behavior maintained by interval schedules of reinforcement. Paper presented at the meeting of the Psychonomic Society, Bryn Mawr, 1963.
Chorney, H. Variable-interval schedules: the behavioral consequences of the probability of reinforcement as a function of time since reinforcement. Unpublished Master's thesis, Pennsylvania State University, 1960.
Clark, F. C. The effect of deprivation and frequency of reinforcement on variable-interval responding. Journal of the Experimental Analysis of Behavior, 1958, 1, 221-228.
Clark, R. Some time-correlated reinforcement schedules and their effects on behavior. Journal of the Experimental Analysis of Behavior, 1959, 2, 1-22.
Cumming, W. W. Stimulus disparity and variable-interval reinforcement schedule as related to a behavioral measure of similarity. Unpublished doctoral dissertation, Columbia University, 1955.
Cumming, W. W. and Schoenfeld, W. N. Behavior under extended exposure to a high-value fixed interval reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1958, 1, 245-263.
Dews, P. B. Free-operant behavior under conditions of delayed reinforcement: I. CRF-type schedules. Journal of the Experimental Analysis of Behavior, 1960, 3, 221-234.
Dews, P. B. The effect of multiple SΔ periods on responding on a fixed-interval schedule. Journal of the Experimental Analysis of Behavior, 1962, 5, 369-374.
Farmer, J. Properties of behavior under random interval reinforcement schedules. Journal of the Experimental Analysis of Behavior, 1963, 6, 607-616.
Ferster, C. B. Control of behavior in chimpanzees and pigeons by time out from positive reinforcement. Psychological Monographs, 1958, 72 (8, Whole No. 461).
Ferster, C. B. and Skinner, B. F. Schedules of reinforcement. New York: Appleton-Century-Crofts, 1957.
Findley, J. D. Preference and switching under concurrent scheduling. Journal of the Experimental Analysis of Behavior, 1958, 1, 123-144.
Findley, J. D. An experimental outline for building and exploring multi-operant behavior repertoires. Journal of the Experimental Analysis of Behavior, 1962, 5, 113-166.
Fleshler, M. and Hoffman, H. S. A progression for generating variable interval schedules. Journal of the Experimental Analysis of Behavior, 1962, 5, 529-530.
Gollub, L. R. The relations among measures of performance on fixed-interval schedules. Journal of the Experimental Analysis of Behavior, 1964, 7, 337-343.
Hearst, E. The behavioral effects of some temporally defined schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 1958, 1, 45-55.
Herrnstein, R. J. Behavioral consequences of the removal of a discriminative stimulus associated with variable-interval reinforcement. Unpublished doctoral dissertation, Harvard University, 1955.
Herrnstein, R. J. Relative and absolute strength of response as a function of frequency of reinforcement. Journal of the Experimental Analysis of Behavior, 1961, 4, 267-272.
Herrnstein, R. J. Secondary reinforcement and rate of primary reinforcement. Journal of the Experimental Analysis of Behavior, 1964, 7, 27-36.
Hull, C. L. Principles of behavior. New York: Appleton-Century-Crofts, 1943.
Kaplan, M. The effects of noxious stimulus intensity and duration during intermittent reinforcement of escape behavior. Journal of Comparative and Physiological Psychology, 1952, 45, 538-549.
Kelleher, R. T. and Gollub, L. R. A review of positive conditioned reinforcement. Journal of the Experimental Analysis of Behavior, 1962, 5, 543-597.
Malott, R. W. and Cumming, W. W. Schedules of interresponse time reinforcement. Psychological Record, 1964, 14, 221-252.
Malott, R. W. and Cumming, W. W. Concurrent schedules of interresponse time reinforcement: probability of reinforcement and the lower bounds of the reinforced interresponse time intervals. Journal of the Experimental Analysis of Behavior, 1966, 9, 317-325.
Millenson, J. R. Some behavioral effects of a two-valued, temporally defined reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1959, 2, 191-202.
Millenson, J. R. Random interval schedules of reinforcement. Journal of the Experimental Analysis of Behavior, 1963, 6, 437-443.
Millenson, J. R. Probability of response and probability of reinforcement in a response-defined analogue of an interval schedule. Journal of the Experimental Analysis of Behavior, 1966, 9, 87-94.
Morse, W. H. Intermittent reinforcement. In W. K. Honig (Ed.), Operant behavior: areas of research and application. New York: Appleton-Century-Crofts, 1966. Pp. 52-108.
Morse, W. H. and Herrnstein, R. J. The analysis of responding under three different forms of fixed interval reinforcement. Paper delivered at the meeting of the Eastern Psychological Association, Philadelphia, 1955.
Neuringer, A. J. and Chung, S.-H. Quasi-reinforcement: control of responding by a percentage-reinforcement schedule. Journal of the Experimental Analysis of Behavior, 1967, 10, 45-54.
Nevin, J. A. Two parameters of conditioned reinforcement in a chaining situation. Journal of Comparative and Physiological Psychology, 1964, 58, 367-373.
Newman, E. B. and Anger, D. The effect upon simple animal behavior of different frequencies of reinforcement. Report PLR-33, Office of the Surgeon General, 1954. (Document #7779, ADI Auxiliary Publications Project, Photoduplication Service, Library of Congress, Washington, D.C. 20025.)
Norman, M. F. An approach to free-responding on schedules that prescribe reinforcement probability as a function of interresponse time. Journal of Mathematical Psychology, 1966, 3, 235-268.
Reynolds, G. S. Relativity of response rate and reinforcement frequency in a multiple schedule. Journal of the Experimental Analysis of Behavior, 1961, 4, 179-184.
Reynolds, G. S. Some limitations on behavioral contrast and induction during successive discrimination. Journal of the Experimental Analysis of Behavior, 1963, 6, 131-139.
Reynolds, G. S. Discrimination and emission of temporal intervals by pigeons. Journal of the Experimental Analysis of Behavior, 1966, 9, 65-68.
Reynolds, G. S. and Catania, A. C. Response rate as a function of rate of reinforcement and probability of reinforcement in variable-interval schedules. Paper presented at the meeting of the Psychonomic Society, New York, 1961.
Reynolds, G. S. and Catania, A. C. Temporal discrimination in pigeons. Science, 1962, 135, 314-315.
Schoenfeld, W. N. and Cumming, W. W. Studies in a temporal classification of reinforcement schedules: Summary and projection. Proceedings of the National Academy of Sciences, 1960, 46, 753-758.
Schoenfeld, W. N., Cumming, W. W., and Hearst, E. On the classification of reinforcement schedules. Proceedings of the National Academy of Sciences, 1956, 42, 563-570.
Sherman, J. G. The temporal distribution of responses on fixed interval schedules. Unpublished doctoral dissertation, Columbia University, 1959.
Shimp, C. P. The reinforcement of short interresponse times. Journal of the Experimental Analysis of Behavior, 1967, 10, 425-434.
Sidman, M. Tactics of scientific research. New York: Basic Books, 1960.
Skinner, B. F. The effect on the amount of conditioning of an interval of time before reinforcement. Journal of General Psychology, 1936, 14, 279-295.
Skinner, B. F. The behavior of organisms. New York: Appleton-Century-Crofts, 1938.
Skinner, B. F. "Superstition" in the pigeon. Journal of Experimental Psychology, 1948, 38, 168-172.
Staddon, J. E. R. Some properties of spaced responding in pigeons. Journal of the Experimental Analysis of Behavior, 1965, 8, 19-27.
Staddon, J. E. R. Attention and temporal discrimination: factors controlling responding under a cyclic-interval schedule. Journal of the Experimental Analysis of Behavior, 1967, 10, 349-359.
Stubbs, A. The discrimination of stimulus duration by pigeons. Journal of the Experimental Analysis of Behavior, 1968, 11, 223-238.
Weiss, B. and Laties, V. G. Drug effects on the temporal patterning of behavior. Federation Proceedings, 1964, 23, 801-807.
Weiss, B. and Moore, E. W. Drive level as a factor in the distribution of responses in fixed-interval reinforcement. Journal of Experimental Psychology, 1956, 52, 82-84.
Wilson, M. P. Periodic reinforcement interval and number of periodic reinforcement as parameters of response strength. Journal of Comparative and Physiological Psychology, 1954, 47, 51-56.

Received 10 June 1963.
