Test Bank for Introduction to Learning and Behavior, 4th Edition: Powell

CHAPTER 6: Operant Conditioning: Introduction

Chapter Outline
Historical Background
Thorndike’s Law of Effect
Skinner’s Selection by Consequences
Operant Conditioning
Operant Behavior
Operant Consequences: Reinforcers and Punishers
Operant Antecedents: Discriminative Stimuli
Four Types of Contingencies
Positive Reinforcement
Negative Reinforcement
Positive Punishment
Negative Punishment
Positive Reinforcement: Further Distinctions
Immediate Versus Delayed Reinforcement
Primary and Secondary Reinforcers
Intrinsic and Extrinsic Reinforcement
Natural and Contrived Reinforcers
Shaping

Explanation of Opening Scenario


This is an example of how, through being affectionate at the wrong time, one might inadvertently reinforce abusive
tendencies in others.

Dr. Dee Assignment (See Chapter 1 in this manual for a sample set of instructions.)

I. Dear Dr. Dee,


I’ve been spending a fortune to entice this girl to go out with me. My plan is that if she spends enough
time with me, she will see what a great guy I am. Is this a great plan or what?
Mr. Big Bucks

II. Dear Dr. Dee,


My boyfriend acts like such a jerk that I often have to yell at him to get him to shape up. Someone
suggested that I instead try reinforcing him when he behaves properly but that seems so artificial. I
don’t want to be in a phony relationship.
Sherry Sharptongue



Relevant concepts:
I. Mr. Big Bucks has set up a situation in which this girl is extremely unlikely to develop an intrinsic interest in him.
At least two of the conditions for undermining intrinsic interest—a tangible reinforcer and reinforcement for mere
performance of the activity—are clearly present here (251-252).

II. As noted on p. 244, the quality of a relationship is strongly determined by the ratio of positive to negative
interactions. Moreover, why does she consider the use of aversive consequences to control someone’s behavior
somehow less phony than the use of appetitive consequences? Finally, as noted on p. 254, contrived reinforcers
can often be withdrawn after the behavior has become “trapped” by the natural consequences of appropriate
behavior.

Internet Resources
“Animal Intelligence” by Edward L. Thorndike (1911): http://psychclassics.yorku.ca/Thorndike/Animal/
This treatise is Thorndike’s major work on learning processes in animals. Chapter 2 contains a description of the
famous puzzle box experiments. (From York University Classics in the History of Psychology.) See also the
introduction to this article below.

Introduction to “Animal Intelligence” Edward Lee Thorndike (1911):


http://psychclassics.yorku.ca/Thorndike/Animal/wozniak.htm
This is an introduction to Thorndike’s classic work written by Robert H. Wozniak.
(From York University Classics in the History of Psychology.)

George Romanes on animal intelligence:


http://www.pigeon.psy.tufts.edu/psych26/romanes1.htm
This site contains an excerpt from Romanes’ book “Animal Intelligence” (1888). Romanes was extreme in the
extent to which he assumed animals are capable of human-like thought and emotion. This extremism epitomizes
what Thorndike was attempting to counter with his own research. (From the Animal Cognition and Learning site at
Tufts University.)

Edward Thorndike’s criticism of Romanes’ methodology:


http://www.pigeon.psy.tufts.edu/psych26/thorn.htm
This excerpt is from Thorndike’s 1911 book. (From the Animal Cognition and Learning site at Tufts University.)

Kohler’s criticism of Thorndike:


http://www.pigeon.psy.tufts.edu/psych26/kohler1.htm
This excerpt from the book, The Mentality of Apes (1925), outlines Kohler’s criticisms of Thorndike’s puzzle box
experiments. (From the Animal Cognition and Learning site at Tufts University.)

Kohler’s Research on the Mentality of Apes plus Criticisms http://www.pigeon.psy.tufts.edu/psych26/kohler.htm


Description and photos of Kohler’s famous research on insight and planning in apes, which he believed refuted
behavioristic assumptions about learning. Follow the links at the bottom to see criticisms of Kohler’s research. See
especially the description and photos of Epstein’s demonstration of “insight” in pigeons. (From the Animal
Cognition and Learning site at Tufts University.)

Positive reinforcement tutorial


http://psych.athabascau.ca/html/prtut/reinpair.htm
A well-designed tutorial on the concept of positive reinforcement, created by Lyle Grant at Athabasca University.

Video of a shaping procedure: http://www.owu.edu/~deswartz/procedures/shaping.html


Follow this link to see a video clip of a rat being shaped to press a lever for food reinforcement. (The clip is from
Dale Swartzentruber’s online collection of videos on animal learning.)

Clicker training: http://www.clickertraining.com
Karen Pryor’s official website for clicker training. Follow the links for detailed information and enlightening
discussions.

Dog training and behavior: http://www.uwsp.edu/psych/dog/fulltext.htm#Deeley


This site, hosted by Dr. M. Plonsky of the University of Wisconsin, contains an extensive library of information
about dog training and behavior.

Animal training at Sea World: http://www.seaworld.org/animal-info/info-books/training/index.htm


This site discusses the many ways in which principles of learning are utilized in the training and care of animals at
Sea World.

Suggested Readings
Cameron, J., & Pierce, W. D. (2002). Rewards and intrinsic motivation: Resolving the controversy. New York:
Greenwood Publishing.

Deci, E. L., Koestner, R., & Ryan, R. M. (2001a). Extrinsic rewards and intrinsic motivation: Reconsidered once
again. Review of Educational Research, 71, 1-27.

Grant, L., & Evans, A. (1994). Principles of behavior analysis. New York: HarperCollins.

Pryor, K. (1975). Lads before the wind: Adventures in porpoise training. New York: Harper & Row.

Pryor, K. (1999). Don’t shoot the dog: The new art of teaching and training (Rev. ed.). New York: Bantam Books.

Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. Acton, MA: Copley.

Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals.
Psychological Review Monograph Supplement, 2, 1–109.

Answers to Quick Quiz Items


Quick Quiz A
1. consequences
2. precedes; follows
3. instrumental
Quick Quiz B
1. gradually
2. law of effect; satisfying; unsatisfying (annoying)
3. stamped in; stamped out
4. regular
5. pressing; lever; pecking; response key
6. free operant
7. reflexes; operant
Quick Quiz C
1. satisfying; annoying
2. adaptive; nonadaptive
3. response; consequence; consequence; response; discriminative; response; consequence
4. elicited; emitted
5. operants
6. class



Quick Quiz D
1. strengthen; weaken
2. follows; increases; follows; decreases
3. process; procedure (in either order)
4. reinforcement; reinforcer
5. punishment; punisher
6. effect; reward
7. reinforcer; increased
8. punisher; decreased
9. SP; punishing stimulus; SR; reinforcing stimulus; R
10. the behavior of fetching the toy
11. the child’s rude behavior
12. extinction
13. punishment; extinction

Quick Quiz E
1. discriminative stimulus; operant; consequence
2. SD
3. set the occasion; more
4. does not
5. SD; R; SR
6. antecedent; behavior; consequence
7. notice; do; get
8. discriminative stimulus; punishment; SP
9. CS; classical; operant

Quick Quiz F
1. presentation; withdrawal
2. does not; does not
3. addition; subtraction
4. increase; decrease

Quick Quiz G
1. removal; increased; negative
2. presentation; increased; positive
3. more; positive
4. escape; avoidance; negative
5. escape; avoidance

Quick Quiz H
1. presentation; decreased; positive punishment
2. removal; decreased; negative punishment
3. presentation; increased; positive reinforcement
4. presentation; decrease; positive punishment
5. removal; increased; negative reinforcement

Quick Quiz I
1. immediate
2. delayed; weak; immediate; strong
3. do not; instructions; rules

Quick Quiz J
1. primary; unconditioned
2. secondary; conditioned
3. primary; secondary
4. CS; US; SD
5. many other reinforcers
6. money and attention
7. tokens; token economy

Quick Quiz K
1. intrinsically; extrinsically
2. extrinsically; intrinsically
3. expected; tangible; merely engaging in the behavior
4. verbal; high

Quick Quiz L
1. natural; contrived
2. natural; extrinsic
3. contrived; extrinsic
4. contrived; trapped; natural consequences
5. natural

Quick Quiz M
1. new; successive approximations
2. secondary; food; primary
3. immediately; satiated

Answers to Study Question Items: See short-answer test items in the test bank.

Test Bank for Chapter 6


1. In ____ conditioning, behavior comes under the control of its consequences.
a) operant
b) classical
c) instrumental
d) both operant and instrumental
> D 223

2. In ____ conditioning, it is what comes ____ the behavior that is critical.


a) operant; after
b) classical; after
c) instrumental; before
d) both b and c are correct
> A 223 QZ

3. Goal-oriented is to automatic as ____ behavior is to ____ behavior.


a) operant; elicited
b) elicited; operant
c) conditioned; unconditioned
d) unconditioned; conditioned
> A 223



4. In the textbook discussions of operant conditioning, the term “consequence” refers to
a) how we feel about what is happening to us.
b) the change in the probability of the behavior as a result of applying the reinforcer or punisher.
c) the event that follows the behavior and is contingent upon it.
d) the rate of behavior.
> C 223

5. Reflexive is to ______ conditioning as intentional is to ______ conditioning.


a) classical; operant
b) operant; Pavlovian
c) instrumental; respondent.
d) both b and c are correct
> A 223

Historical Background
Thorndike’s Law of Effect

6. The first psychologist to systematically investigate the effect of consequences on the strength of a behavior
was
a) Skinner.
b) Pavlov.
c) Tolman.
d) Thorndike.
> D 224

7. Thorndike was motivated to conduct experiments on animal intelligence, partially due to


a) the massive popularity of experimental research on animal learning at the time.
b) his skepticism about the anecdotal approach to animal intelligence.
c) availability of new technology designed to examine neural correlates of behavior.
d) his profound belief that humans and other animals solved problems in exactly the same way.
> B 224

8. Thorndike argued that animal intelligence could only be studied using


a) systematic investigation.
b) anecdotal evidence.
c) analogy to human intelligence.
d) ethological approaches.
> A 224

9. Thorndike found that cats learned to escape from a puzzle box


a) gradually.
b) suddenly.
c) with insight.
d) both suddenly and with insight
> A 224

10. The original law of effect stated that behaviors leading to a(n) ____ are ____.
a) satisfactory state of affairs; stamped in
b) reinforcer; stamped in
c) positive reinforcer; strengthened
d) unconditioned stimulus; stamped out
> A 226

11. According to Thorndike’s law of effect, behaviors leading to a(n) ____ state of affairs are stamped in, while
behaviors leading to a(n) ____ state of affairs are stamped out.
a) annoying; satisfactory
b) satisfactory; annoying
c) irregular; regular
d) regular; irregular
> B 226

12. Freud believed that humans are motivated to seek pleasure and avoid pain. This notion accords most closely
with ______ definition of ______.
a) Thorndike’s; the law of effect
b) Skinner’s; operant conditioning
c) Thorndike’s; operant conditioning
d) Skinner’s; the law of effect
> A 226 QZ

13. With his puzzle box experiments, Thorndike discovered that learning is usually a(n) ______ process.
a) sudden
b) unpredictable
c) stressful
d) gradual
> D 226 FN

Skinner’s Selection by Consequences

14. When first setting out to investigate the behavior of animals, Skinner had originally thought that all behavior
could be explained in terms of
a) thoughts and feelings.
b) reflexes.
c) operants.
d) fixed action patterns.
> B 226

15. Skinner’s development of the operant conditioning chamber was partly motivated by his desire to find a
procedure that yielded ____ patterns of behavior.
a) inflexible
b) reflexive
c) regular
d) irregular
> C 226

16. In a standard Skinner box, a ____ earns food by ____.


a) rat; pressing a lever
b) rat; running in a wheel
c) pigeon; pressing a lever
d) pigeon; flapping its wings
> A 226

17. Skinner’s operant conditioning procedure is known as a free operant procedure because the rat
a) is put on a free feeding schedule before the experiment starts.
b) is free to enter and leave the chamber.
c) is free to move about the chamber.
d) freely controls the rate at which it responds for food.
> D 226



18. Skinner’s operant conditioning procedure became known as a(n) ____ procedure because ____.
a) instrumental; the consequences are free
b) free operant; the animal is free to respond at any rate
c) instrumental; the animal is free to enter or leave the chamber
d) adjunctive; the experimenter is free to observe the rat’s behavior
> B 226

19. In one variant of a Skinner box, a pigeon earns food by


a) flapping its wings.
b) turning circles.
c) pecking a response key.
d) pressing a lever.
> C 226

20. Rat is to ____ as pigeon is to ____.


a) lever press; key peck
b) key peck; lever press
c) turning circles; lever press
d) lever press; turning circles
> A 226

21. Which of the following most closely parallels what happens in a Skinner box?
a) You are in your apartment with nothing to do but bake cookies and eat them.
b) You are at home watching television and raiding the refrigerator.
c) You are in prison with nothing to do. Meals are served at fixed times during the day.
d) You are at work with lots to do. Meals are served at fixed times during the day.
> A 226 WWW

22. Skinner divided behaviors into two categories:


a) operant and instrumental.
b) conditioned and unconditioned.
c) primary and secondary.
d) operant and respondent.
> D 228 MD

Operant Conditioning

23. Skinner’s restatement of Thorndike’s law of effect is


a) less mentalistic.
b) more mentalistic.
c) less precise.
d) both less mentalistic and less precise.
> A 228

24. The basic components of the operant conditioning process include


a) a response that produces a certain consequence.
b) a consequence that strengthens or weakens the response.
c) a preceding stimulus that signals the availability of the consequence.
d) all of these
> D 228

25. The three components of the operant conditioning process include
a) a response that is automatically elicited by a preceding stimulus.
b) a consequence that strengthens or weakens the response.
c) a preceding stimulus that elicits the response.
d) all of these
> B 228

Operant Behavior

26. Properly speaking, operant behavior is said to be ____ by ____.


a) emitted; the organism
b) elicited; the organism
c) emitted; stimuli
d) elicited; stimuli
> A 229

27. The operant response is properly described as a(n)


a) emitted behavior.
b) contrived behavior.
c) covert behavior.
d) elicited behavior.
> A 229 FN

28. Emitted is to elicited as ____ conditioning is to ____ conditioning.


a) classical; operant
b) respondent; classical
c) operant; classical
d) instrumental; operant
> C 229 QZ

29. The behavior of lever pressing for food is said to be


a) elicited by the rat.
b) emitted by the rat.
c) elicited by the food.
d) emitted by the food.
> B 229 WWW

30. Operant behaviors are usually defined as a


a) class of behaviors that are topographically similar.
b) class of behaviors that lead to a certain consequence.
c) specific behavior that leads to a certain consequence.
d) specific behavior that leads to a class of consequences.
> B 230

31. Operant behaviors are usually defined as a ______________ because _____________.


a) class of responses; that definition makes naturalistic observation easier
b) specific response; behaviorists prefer specificity in measurement
c) class of responses; that approach incorporates all responses that lead to an identical consequence
d) specific response; cumulative recorders can only record specific responses
> C 230



32. Behaviorists have found it useful to define operant behaviors as a(n)
a) specific response.
b) covert stimulus.
c) class of responses.
d) unconditioned stimulus.
> C 230

Operant Consequences: Reinforcers and Punishers (and Extinction)

33. Which of the following conditions must be met for a response to be considered an operant?
a) Its occurrence results in the delivery of a certain consequence.
b) The consequence affects the future probability of the response.
c) The response is elicited by the antecedent stimulus.
d) a and b only
> D 231

34. Properly speaking, when we give a dog a treat for sitting on command, we are attempting to reinforce
a) the dog.
b) the behavior.
c) the command.
d) our relationship with the dog.
> B 231

35. Properly speaking, when we praise a child for following instructions, we are attempting to reinforce
a) the child.
b) the instructions.
c) the behavior of following instructions.
d) both the instructions and the behavior of following instructions.
> C 231 MD

36. From an operant conditioning perspective, chocolate is a reinforcer if it


a) strengthens the behavior that follows it.
b) strengthens the behavior that precedes it.
c) elicits salivation.
d) both strengthens the behavior that precedes it and elicits salivation.
> B 231 WWW

37. Consequence is to process as ____ is to ____.


a) reinforcer; punisher
b) reinforcement; punishment
c) punisher; punishment
d) reinforcement; reinforcer
> C 231

38. Procedure is to _____ as consequence is to ____.


a) reinforcer; reinforcement
b) reinforcement; punisher
c) punishment; reinforcement
d) reinforcer; punisher
> B 231

39. The term ____ refers to a process or procedure.
a) reinforcer
b) reinforcement
c) punisher
d) both reinforcer and punisher
> B 231

40. Suppose a rat presses a lever and receives a food pellet. As a result, it is more likely to press the lever in the
future. In this example, the food is functioning as a ____ for lever pressing.
a) reinforcer
b) discriminative stimulus
c) punisher
d) punishment
> A 232

41. A spanking is a punisher if it


a) follows a behavior.
b) precedes a behavior.
c) decreases the probability of a behavior.
d) both follows a behavior and decreases the probability of a behavior.
> D 232

42. Suppose a rat runs in a wheel and receives a food pellet. The subsequent increase in wheel running as a result
of the food delivery is an example of
a) an establishing operation.
b) reinforcement.
c) a reinforcer.
d) punishment.
> B 232

43. Sam received a traffic fine for speeding the other day. The traffic fine is a ____ for Sam’s behavior of
speeding.
a) reinforcer
b) punisher
c) conditioned stimulus
d) none of these; further information is needed to determine the answer
> D 232

44. When Leena finished her homework, her mother gave her some apple pie. This is obviously an example of
a) positive reinforcement.
b) negative reinforcement.
c) positive punishment.
d) It is impossible to know given the information provided.
> D 232

45. A dog is given a treat each time it comes when called, and as a result no longer comes when called. The
______ is an example of ______.
a) treat; negative reinforcement
b) treat; a punisher
c) decrease in behavior; a punisher
d) treat; punishment
> B 232 WWW



46. Maria gives her canary some food each time it flutters its wings. The food is a
a) punisher.
b) reinforcer.
c) discriminative stimulus.
d) none of these; further information is needed to determine the answer
> D 232 QZ

47. Properly speaking, reinforcers and punishers are defined entirely by


a) their intensity.
b) the probability of their occurrence.
c) their effect on behavior.
d) the extent to which they are perceived as pleasant versus unpleasant.
> C 232

48. Reinforcers and punishers are entirely defined by


a) their hedonic value.
b) the manner in which they influence behavior.
c) the extent to which they are appetitive or aversive.
d) both their hedonic value and the extent to which they are appetitive or aversive.
> B 232

49. Reinforcers are ____ the kinds of events that we consider pleasant.
a) often but not always
b) always
c) rarely
d) never
> A 232

50. If a mother kisses her child whenever he breaks a dish and, as a result, he breaks fewer dishes in the future, the
kissing would by definition be a
a) punisher.
b) reinforcement.
c) reinforcer.
d) punishment.
> A 232

51. An electric shock is a reinforcer if it


a) follows a behavior.
b) precedes a behavior.
c) increases the probability of a behavior.
d) both follows a behavior and increases the probability of a behavior.
> D 232

52. The withdrawal of reinforcement for a behavior is called


a) extinction.
b) inhibition.
c) dishabituation.
d) negative punishment.
> A 233

53. The dog no longer receives food for begging and therefore stops begging. This is an example of
a) blocking.
b) punishment.
c) reinforcement.
d) extinction.
> D 233

Operant Antecedents: Discriminative Stimuli

54. An SD is a stimulus that


a) increases the probability of a certain behavior.
b) signals that a reinforcer is now available for the behavior.
c) decreases the probability of a behavior.
d) both increases the probability of a certain behavior and signals that a reinforcer is now available for the
behavior.
> D 234

55. When Hai visits his parents, he whines a lot about how unappreciated he is at work. It seems likely that the
presence of his parents is ______ for whining.
a) a discriminative stimulus
b) a reinforcer
c) reinforcement
d) a conditioned stimulus
> A 234 WWW MD

56. A(n) ____ is a stimulus that “sets the occasion for” a behavior.
a) CS
b) SD
c) SR
d) SP
> B 235

57. A(n) ____ stimulus serves as a signal that a response will be followed by a reinforcer.
a) operant
b) discriminative
c) conditioned
d) appetitive
> B 235

58. A restaurant sign can be viewed as a(n) ____ for entering the restaurant and getting a hamburger.
a) SD
b) US
c) SR
d) CS
> A 235 QZ

59. A simple way of thinking about the three-term contingency is that you (in correct order)
a) notice something, get something, and do something.
b) do something, notice something, and get something.
c) get something, notice something, and do something.
d) notice something, do something, and get something.
> D 235



60. In the three-term contingency, the antecedent is the
a) reinforcer.
b) operant response.
c) discriminative stimulus.
d) conditioned stimulus.
> C 235

61. In correct order, the three-term contingency consists of


a) antecedent, consequence, and behavior.
b) antecedent, behavior, and consequence.
c) consequence; behavior, and antecedent.
d) behavior, antecedent, and consequence.
> B 235

62. A stimulus which signals that a response will be punished is a(n)


a) conditioned stimulus for punishment.
b) unconditioned stimulus for punishment.
c) negative antecedent.
d) discriminative stimulus for punishment.
> D 235

63. The statement, “Don’t you dare try it!”, would for most people be a(n)
a) discriminative stimulus for reinforcement.
b) unconditioned stimulus for fear.
c) discriminative stimulus for punishment.
d) discriminative stimulus for fear.
> C 235

64. A stimulus that signals that reinforcement will not be available is known as
a) a discriminative stimulus for punishment.
b) a discriminative stimulus for extinction.
c) SΔ.
d) both a discriminative stimulus for extinction and SΔ.
> D 236

65. As I walk up to my favorite coffee shop, I see a sign that says “Closed for Repairs.” This means that I won’t be
able to walk into the shop in order to get a latte and a muffin. In operant conditioning terms, the ‘closed’ sign is
a) a discriminative stimulus for punishment.
b) a punisher.
c) a discriminative stimulus for extinction.
d) an extinguisher.
> C 236

66. Unlike classically conditioned behavior, operant behavior is


a) typically seen as voluntary and flexible.
b) said to be elicited by the stimulus.
c) both typically seen as voluntary and flexible and said to be elicited by the stimulus.
d) neither typically seen as voluntary and flexible nor said to be elicited by the stimulus.
> A 238

67. Unlike classical conditioning, operant conditioning
a) involves an S-S-R sequence.
b) is a function of what comes before the behavior.
c) both involves an S-S-R sequence and is a function of what comes before the behavior.
d) neither involves an S-S-R sequence nor is a function of what comes before the behavior.
> D 238

68. To determine if operant conditioning is involved, the most critical question to ask is whether the occurrence of
the behavior is mostly a function of
a) the stimulus that precedes it.
b) the stimulus that follows it.
c) the person.
d) the environment.
> B 238

Four Types of Contingencies

69. A contingency of reinforcement means that


a) a response is followed by a reinforcer.
b) a reinforcer is followed by a response.
c) a response is elicited by a reinforcer.
d) a response is elicited by an SD.
> A 238

70. When combined with the terms “reinforcement” or “punishment,” the word “positive” means
a) something that is appetitive.
b) something that is subtle.
c) something is added or presented.
d) both something that is appetitive and something that is subtle.
> C 238

71. When combined with the terms reinforcement or punishment, the word “negative” means
a) something that is good.
b) something that is intense.
c) something that is unpleasant.
d) something is subtracted or withdrawn.
> D 238

72. With respect to the four types of contingencies, add is to subtract as ____ is to ____.
a) desire; hate
b) positive; negative
c) negative; positive
d) hate; desire
> B 238

73. Increase is to decrease as ____ is to ____.


a) reinforcement; punishment
b) punishment; reinforcement
c) antecedent; consequence
d) consequence; antecedent
> A 238



Positive Reinforcement

74. The term ____ refers to the presentation of a stimulus following a response which then leads to an increase in
the future strength of that response.
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> A 240

75. The term “positive reinforcement” refers to the ____ of a stimulus following a response which then leads to
a(n) ____ in the future strength of that response.
a) removal; increase
b) presentation; decrease
c) presentation; increase
d) removal; decrease
> C 240

76. The pigeon pecks a response key and receives food. As a result, the probability of key pecking increases. This
is an example of
a) positive reinforcement.
b) negative reinforcement.
c) negative punishment.
d) positive punishment.
> A 240

77. Andre praises his young daughter for being assertive, after which she becomes even more assertive. This is an
example of
a) negative reinforcement.
b) positive reinforcement.
c) negative punishment.
d) positive punishment.
> B 240

78. John yells at his dog whenever it barks. As a result, the dog begins barking even more frequently. This is an
example of
a) positive punishment.
b) negative reinforcement.
c) negative punishment.
d) positive reinforcement.
> D 240

79. Paula laughs at her child when he breaks a dish. As a result, he breaks dishes even more frequently. This is an
example of
a) negative punishment.
b) negative reinforcement.
c) positive reinforcement.
d) positive punishment.
> C 240

Negative Reinforcement

80. The term ____ refers to the removal of a stimulus following a response which then leads to an increase in the
future strength of that response.
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> B 240

81. The term “negative reinforcement” refers to the ____ of a stimulus following a response which then leads to
a(n) ____ in the future strength of that response.
a) removal; increase
b) presentation; decrease
c) presentation; increase
d) removal; decrease
> A 240

82. When I banged on the heating pipe, it stopped making a noise. The next time I heard that noise, I immediately
banged on the pipe. This seems to be an example of
a) positive reinforcement.
b) negative reinforcement.
c) positive punishment.
d) negative punishment.
> B 240 MD

83. Jamaal’s mother tells him: “If you clean your room, you won’t have to do the dishes.” What type of
contingency is she attempting to apply?
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> B 240

84. “I’ll do anything to avoid housework.” This statement speaks to the power of
a) positive reinforcement.
b) negative reinforcement.
c) positive punishment.
d) negative punishment.
> B 240 WWW

85. A(n) ____ response occurs before the aversive stimulus is presented and thereby prevents its delivery.
a) escape
b) avoidance
c) reflexive
d) primary
> B 241

86. A(n) ____ response results in the termination of an aversive stimulus.


a) escape
b) avoidance
c) reflexive
d) primary
> A 241



87. Putting on a heavy parka before going out into the cold is an example of a(n) ____ response, while putting it on
after you go outside and become cold is an example of a(n) ____ response.
a) operant; reflexive
b) avoidance; escape
c) escape; avoidance
d) reflexive; operant
> B 241

Positive Punishment

88. The term ____ refers to the presentation of a stimulus following a response which then leads to a decrease in
the future strength of that response.
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> C 242

89. The term “positive punishment” refers to the ____ of a stimulus following a response which then leads to a(n)
____ in the future strength of that response.
a) removal; increase
b) presentation; decrease
c) presentation; increase
d) removal; decrease
> B 242

90. Jim compliments his secretary on her sexy new outfit when she offers to bring him coffee one morning. She
never again offers to bring him coffee. Out of the following, this is an example of which type of process?
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> C 242 WWW

91. When Pedro punched his sister, she punched him back. He never again punched her. This seems to be an
example of what process?
a) positive punishment
b) negative reinforcement
c) positive reinforcement
d) negative punishment
> A 242 FN

92. When Amir teased his sister, she hugged him. He never again teased her. This seems to be an example of what
process?
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> C 242 QZ

93. A stimulus that can serve as a negative reinforcer can probably also serve as a(n)
a) negative punisher.
b) positive punisher.
c) positive reinforcer.
d) unconditioned reinforcer.
> B 242

94. Most people would be least likely to volunteer for an experiment on


a) positive reinforcement.
b) appetitive conditioning.
c) positive punishment.
d) negative punishment.
> C 242

Negative Punishment

95. The term ____ refers to the removal of a stimulus following a response which then leads to a decrease in the
future strength of that response.
a) positive reinforcement
b) negative reinforcement
c) positive punishment
d) negative punishment
> D 243

96. The term “negative punishment” refers to the ____ of a stimulus following a response which then leads to a(n)
____ in the future strength of that response.
a) removal; increase
b) removal; decrease
c) presentation; increase
d) presentation; decrease
> B 243

97. Melissa stayed out past her curfew and subsequently lost car privileges for a week. As a result, she never again
stayed out past her curfew. This example best illustrates the process of
a) positive reinforcement.
b) negative reinforcement.
c) positive punishment.
d) negative punishment.
> D 243 MD

98. Felix swore at his girlfriend during an argument one day, after which she wouldn’t talk to him for a week. As a
result, he became much less likely to swear at her. This is best described as an example of
a) positive reinforcement.
b) negative reinforcement.
c) positive punishment.
d) negative punishment.
> D 243



99. Miranda complains loudly every time her boyfriend watches sports. As a result of her complaining, he usually
turns off the television. He feels like their relationship is strained, but she feels like things are really going her
way. Regarding the use of punishment, what could you say about this example?
a) Miranda’s “punishing” behavior of complaining has been negatively reinforced.
b) The relationship might be better if there were more pleasant interactions, rather than aversive ones.
c) Miranda’s boyfriend is experiencing punishment for his behavior of watching sports.
d) all of these
> D 244

100. The use of punishment can be quite seductive in that its delivery is often followed by ____ for the person who
delivered it.
a) immediate positive reinforcement
b) immediate negative reinforcement
c) delayed positive punishment
d) delayed negative punishment
> B 244

And Furthermore: Four Types of Contingencies: Tricky Examples

101. When Sean doesn’t cry, he doesn’t get an extra helping of dessert. As a result, he always cries at the dinner
table. This is best interpreted as an example of
a) positive reinforcement.
b) negative punishment.
c) positive punishment.
d) extinction.
> A 245 WWW

102. “I don’t think about an upcoming exam so that I won’t get anxious.” This pattern of behavior probably evolved
as a function of what process?
a) positive reinforcement
b) extinction
c) positive punishment
d) negative punishment
> C 245 FN

103. Kyle’s mother tells him: “If you don’t clean your room, you won’t get to watch television.” This can be
classified as what type of contingency?
a) positive reinforcement
b) extinction
c) positive punishment
d) negative reinforcement
> A 245 QZ

Positive Reinforcement: Further Distinctions


Immediate versus Delayed Reinforcement

104. In general, the more ____ the reinforcer, the stronger its effect on behavior.
a) immediate
b) delayed
c) negative
d) positive
> A 246

105. It is difficult to eat a healthy diet because the reinforcers for healthy eating are often ____, while the
reinforcers for eating junk food are ____.
a) extrinsic; intrinsic
b) intrinsic; extrinsic
c) immediate; delayed
d) delayed; immediate
> D 246

106. When Courtney closes her books at the end of a study session, her mother tells her how proud she is of the
effort she puts into her classes. How might this comment by Courtney’s mother increase the likelihood that
Courtney will study tomorrow night?
a) it is a negative reinforcer, which increases the likelihood of the behavior of studying
b) it is a positive reinforcer, which increases the likelihood of the behavior of studying
c) it is an immediate reinforcer, whereas the delayed reinforcer of good grades may be insufficient to
increase the probability of studying
d) both b and c
> D 246

107. It is difficult to study on the weekend because the reinforcers for studying are often ____, while the reinforcers
for having fun are ____.
a) immediate; delayed
b) delayed; immediate
c) primary; secondary
d) secondary; primary
> B 246

Primary and Secondary Reinforcers

108. A(n) ____ reinforcer is one that has become a reinforcer because it is associated with some other reinforcer.
a) primary
b) unconditioned
c) secondary
d) both primary and unconditioned
> C 248

109. A primary reinforcer is one that


a) is innately reinforcing.
b) has become associated with another reinforcer.
c) has become associated with many other reinforcers.
d) is immediate rather than delayed.
> A 248

110. Innate is to learned as ____ reinforcer is to ____ reinforcer.


a) secondary; primary
b) primary; secondary
c) intrinsic; extrinsic
d) extrinsic; intrinsic
> B 248

111. Events that are innately reinforcing are called


a) extrinsic reinforcers.
b) primary reinforcers.
c) secondary reinforcers.
d) generalized reinforcers.
> B 248



112. A primary reinforcer is also called a(n) ____ reinforcer, while a secondary reinforcer is also called a(n) ____
reinforcer.
a) conditioned; unconditioned
b) generalized; nongeneralized
c) nongeneralized; generalized
d) unconditioned; conditioned
> D 248

113. A secondary reinforcer can also be called a(n) ____ reinforcer.


a) conditioned
b) unconditioned
c) generalized unconditioned
d) intrinsic
> A 248

114. A primary reinforcer can also be called a(n) ____ reinforcer.


a) conditioned
b) unconditioned
c) generalized conditioned
d) intrinsic
> B 248

115. Food usually functions as a(n) ____ reinforcer while a light that has been paired with food functions as a(n)
____ reinforcer.
a) generalized; discriminative
b) extrinsic; intrinsic
c) primary; secondary
d) secondary; primary
> C 248 FN

116. The rat’s home cage is strongly associated with food, water, warmth, and safety. As a result, the opportunity to
enter the home cage can likely function as a(n) ____ reinforcer.
a) primary
b) preconditioned
c) generalized
d) unconditioned
> C 249 MD

117. For a professional athlete, the noise that a cheering crowd makes is best described as a(n)
a) generalized reinforcer.
b) unconditioned reinforcer.
c) primary reinforcer.
d) contrived reinforcer.
> A 249

118. Money and social attention are common examples of ____ reinforcers.
a) primary
b) secondary
c) unconditioned
d) generalized
> D 249

119. For a student, seeing a large ‘A’ at the top of a marked paper or exam could be considered
a) a conditioned reinforcer.
b) a secondary reinforcer.
c) a generalized reinforcer.
d) all of these.
> D 249

120. Behaviors that have been strongly associated with reinforcement can themselves become ____ reinforcers.
a) primary
b) secondary
c) intrinsic
d) discriminated
> B 249

And Furthermore: Learned Industriousness

121. According to the theory of ____, hard work can sometimes function as a secondary reinforcer.
a) learned helplessness
b) acquired industriousness
c) learned industriousness
d) learned acquisitiveness
> C 250

122. Research has shown that rats that have been reinforced for emitting forceful lever presses will subsequently
a) run faster down an alleyway to obtain food.
b) run more slowly down an alleyway to obtain food.
c) show a generalized tendency to be lazy.
d) both run more slowly down an alleyway to obtain food and show a generalized tendency to be lazy.
> A 250

123. Research has shown that students who have been reinforced for solving complex math problems will
subsequently
a) write essays of lower quality.
b) write essays of higher quality.
c) show a decrease in math ability.
d) both write essays of lower quality and show a decrease in math ability.
> B 250

124. Jack works extremely hard at whatever task he is assigned. According to learned industriousness theory,
working hard is which sort of reinforcer, for Jack?
a) conditioned reinforcer
b) secondary reinforcer
c) primary reinforcer
d) both a and b are correct
> D 250

Intrinsic and Extrinsic Reinforcement

125. Behaviors performed for their own sake are said to be


a) intrinsically motivated.
b) extrinsically motivated.
c) extrinsically reinforced.
d) innately motivated.
> A 251



126. Behaviors that are motivated by some added incentive are said to be ____ motivated.
a) intrinsically
b) extrinsically
c) hedonically
d) extraneously
> B 251

127. Extrinsic rewards are likely to lower intrinsic interest in a task when they are
a) expected.
b) verbal.
c) delivered contingent upon high quality performance.
d) all of these
> A 252

128. Extrinsic rewards are likely to raise intrinsic interest when they are
a) nonverbal.
b) expected.
c) given for high quality performance.
d) both nonverbal and expected.
> C 252 QZ

129. Each time Jana learns a new piece on the piano and can play it without error, she gets to have her favorite
dessert for dinner. Chances are that Jana’s interest in playing the piano will likely
a) decrease.
b) increase.
c) remain unchanged.
d) either decrease or increase.
> B 252

130. Extrinsic rewards are less likely to damage intrinsic interest when they are
a) expected.
b) verbal.
c) delivered contingent upon mere performance of the activity.
d) all of these
> B 252

131. Shara praises her daughter each time she does a fine job on her math homework. As a result, her daughter is
likely to become
a) more interested in math.
b) less interested in math.
c) focused upon receiving praise from her mother.
d) both less interested in math and focused upon receiving praise from her mother.
> A 252

132. Fatima promises to give her son a cookie for each hour that he studies math. As a result, her son could well
become
a) more interested in math.
b) less interested in math.
c) less interested in cookies.
d) resistant to her instructions.
> B 252 MD

133. Suzie notices that her daughter Nina loves to play piano. Suzie decides to encourage her further by promising
to pay her a dollar for every extra hour of piano practice in the evening. Chances are that Nina’s intrinsic
interest in playing the piano will likely
a) decrease.
b) increase.
c) remain unchanged.
d) both increase and remain unchanged.
> A 252 WWW FN

And Furthermore: Positive Reinforcement of Artistic Appreciation

134. In Skinner’s anecdote about roommates who instilled an appreciation of art in a previously non-artistic person,
which of the following techniques did the roommates use?
a) positive reinforcement
b) application of intrinsic reinforcers
c) negative punishment
d) application of natural reinforcers
> A 253

Natural and Contrived Reinforcers

135. A reinforcer that has been deliberately arranged to modify a behavior and is not a common aspect of a certain
situation is called a(n) ______ reinforcer.
a) natural
b) secondary
c) contrived
d) secondary and contrived
> D 254

136. A(n) _____ reinforcer is one that has been deliberately arranged to modify a behavior and is not a natural
aspect of a certain situation.
a) contrived
b) extrinsic
c) intrinsic
d) both contrived and intrinsic
> A 254

137. Intrinsic reinforcers


a) are always natural reinforcers.
b) are always contrived reinforcers.
c) can be either natural or contrived reinforcers.
d) can be neither contrived nor natural reinforcers.
> A 254

138. Which of the following is true of money, as a reinforcer?


a) It is always a contrived reinforcer.
b) It can be either a natural or contrived reinforcer.
c) It can be neither a natural nor contrived reinforcer.
d) It is always an intrinsic reinforcer.
> B 254

139. Seeing a movie is a _____ reinforcer for going to a theatre.


a) contrived
b) natural
c) primary
d) negative
> B 254 QZ

140. Being paid to study is a(n) _____ reinforcer for studying.


a) natural
b) extrinsic
c) contrived
d) extrinsic and contrived
> D 254 MD

141. Which of the following is true of the use of contrived reinforcers in a clinical setting?
a) It is important to consistently maintain them within that setting.
b) The attempt will often be made to withdraw them over time.
c) It is hoped that the behavior will eventually become trapped by the natural contingencies in that environment.
d) both b and c are correct
> D 254

142. Enjoying yourself at a party is a(n) _____ reinforcer for going to the party.
a) contrived
b) natural
c) extrinsic
d) both contrived and natural
> B 254

Shaping

143. The process of reinforcing gradual approximations to a new behavior is known as


a) chaining.
b) shaping.
c) graduated reinforcement.
d) fading.
> B 255

144. Shaping is the


a) reinforcement of new operant behavior.
b) gradual reinforcement of new operant behavior.
c) reinforcement of gradual approximations to a new behavior.
d) creation of new behavior through gradual reinforcement.
> C 255

145. Which of the following is an example of shaping?


a) Reinforcing the behavior of lever pressing.
b) Reinforcing gradual approximations to lever pressing.
c) Gradual reinforcement for lever pressing.
d) Reinforcing the rat for gradual approximations to lever pressing.
> B 255 WWW

146. The advantages of using a click or whistle as a secondary reinforcer during shaping include
a) it can be delivered immediately following the correct behavior.
b) the animal will not satiate upon it.
c) both a and b
d) neither a nor b
> C 255 FN

147. The sound of a click can be an effective tool for shaping after it has been paired with ____, thereby making it a
____.
a) food; secondary reinforcer
b) shock; primary punisher
c) food; primary reinforcer
d) shock; secondary punisher
> A 255

148. At the zoo one day, you notice a zookeeper coaxing a camel into a pen by blowing a whistle. It is probably the
case that the whistle has been paired with ____, and is now functioning as a(n) ____.
a) shock; punisher
b) food; unconditioned stimulus
c) food; secondary reinforcer
d) shock; primary punisher
> C 256

149. Over time, we are likely to become more and more efficient at washing the dishes. This is mostly the result of
a) primary punishment.
b) positive punishment.
c) chaining.
d) shaping.
> D 256 QZ

150. Over time, Jim gradually becomes more and more efficient at cleaning his apartment. This improvement is
most likely an example of what type of process?
a) intermittent reinforcement
b) stimulus control
c) shaping
d) an FR schedule of reinforcement
> C 256 MD

And Furthermore: Training Ishmael

151. For a male Betta splendens, the sight of another male can act as a
a) releasing stimulus.
b) positive reinforcer.
c) both releasing stimulus and positive reinforcer.
d) neither releasing stimulus nor positive reinforcer.
> C 259

152. For a male Betta splendens, the sight of another male elicits _____________ behavior, and functions as a(n)
__________. This provides evidence that _________________.
a) aggressive; punisher; punishers are associated with aggression
b) aggressive; reinforcer; not all reinforcers are stimuli that are perceived as pleasant
c) avoidance; punisher; fish will work to avoid conflict
d) avoidance; reinforcer; this species cannot learn to avoid punishers
> B 259

153. For a male Betta splendens, potential reinforcers include


a) the sight of another male.
b) food.
c) a cold stream of water.
d) both the sight of another male and food.
> D 259



Fill-in-the-Blank Items
Most of these items are taken from or are very similar to the end-of-chapter test items in the text; the items at the end
that are marked WWW are posted on the student resource website.

1. Compared to elicited behaviors, operant behaviors seem (more/less) _______ automatic and reflexive.
> less
2. According to Thorndike’s ____________________________, behaviors that lead to a(n) __________________
state of affairs are strengthened, while behaviors that lead to a(n) ___________________ state of affairs are
weakened.
> law of effect; satisfactory; unsatisfactory

3. Operant behaviors are usually defined as a(n) ______________ of behaviors, all of which are capable of
producing a certain ___________________.
> class; consequence

4. An event is a punisher if it _______________ a behavior and the future probability of that behavior
____________________.
> follows; decreases

5. Reinforcers and punishers are defined entirely by their ___________________ on behavior.


> effect(s)

6. A discriminative stimulus is a stimulus which signals that a(n) __________________ is available. It is said to
“____________________________________________” for the behavior.
> reinforcer; set the occasion

7. When Beth tried to take the bone away from Jack (her dog), Jack bared his teeth and growled threateningly.
Beth quickly pulled her hand back. Jack growled even more threateningly the next time Beth reached for the
bone, and she again pulled her hand away. Eventually, Beth gave up and she now lets Jack chew on bones for as
long as he wants. Jack’s behavior of baring his teeth and growling served to ___________________________
Beth’s behavior of trying to take the bone away from him. Beth’s behavior of pulling her hand away served to
_______________________________ Jack’s behavior of growling.
> positively punish; negatively reinforce

8. Events that are innately reinforcing are called _________________ reinforcers; events that become reinforcers
through experience are called __________________ reinforcers.
> primary (unconditioned); secondary (conditioned)

9. Money and praise are common examples of __________________________________ reinforcers.


> generalized (or generalized secondary)

10. Sacha very much enjoys hard work, and often volunteers for projects that are quite demanding. According to
______________________________________________ theory, it is likely the case that, for Sacha, the act of
expending a lot of effort has often been ______________________________________.
> learned industriousness; positively reinforced

11. Behaviors that are performed for their own sake are said to be _________________ motivated; behaviors that
are performed in order to achieve some additional incentive are said to be ____________________ motivated.
> intrinsically; extrinsically

12. The gradual development of new operant behavior through reinforcement of successive approximations to that
behavior is called _______________________.
> shaping

13. At the zoo one day, you notice a zookeeper leading a rhinoceros into a pen by simply whistling at it. It is
probably the case that the whistle has been paired with _________, and is now functioning as a(n)
__________________________________________.
> food; secondary reinforcer

14. A major advantage of using the sound of a whistle while training dolphins is that the whistle can be presented
___________________ following the correct behavior.
> immediately
15. The three-term contingency consists of a(n) _________________, a(n) ________________, and a(n)
___________________, in that order.
> antecedent; behavior; consequence WWW

16. An event is a(n) ______________ if it follows a behavior, and the future probability of that behavior decreases.
> punisher WWW

17. Each time a student cleans his room, he is hugged by his mother. As a result, he no longer cleans his room. This
is an example of (what type of contingency) __________________________________________________.
> positive punishment WWW

Short-Answer Items
Most of these items are end-of-chapter study questions from the text; those marked WWW are additional items from
the student resource website.

1. State Thorndike’s law of effect. What is operant conditioning (as defined by Skinner), and how does this
definition differ from Thorndike’s law of effect?

The law of effect states that behaviors leading to a satisfactory state of affairs are strengthened or stamped in,
while behaviors leading to an unsatisfactory or annoying state of affairs are weakened or stamped out. Operant
conditioning is a type of learning in which the future probability of a behavior is affected by its consequences.
Skinner’s definition is more objective and avoids any inference about whether the animal is feeling satisfied or
annoyed. (226, 228-229)

2. Explain why operant behaviors are said to be “emitted,” and why they are defined as a “class” of responses.

Operant behaviors are said to be emitted by the organism to indicate that such behavior often has a more
voluntary, flexible quality to it compared to elicited behavior. An operant response is usually defined as a class
of behaviors, with all of the behaviors in that class capable of producing the consequence. Defining operants in
this way has proven fruitful in that it is easier to predict the occurrence of a class of responses than it is to
predict the exact response that will be emitted at a particular point in time. (230)



3. Define the terms reinforcer and punisher. How do these terms differ from the terms reinforcement and
punishment?

An event is a reinforcer if (1) it follows a behavior, and (2) the future probability of that behavior increases. An
event is a punisher if (1) it follows a behavior, and (2) the future probability of that behavior decreases. The
term reinforcer (punisher) refers to the specific consequence used to strengthen (weaken) a behavior. The term
reinforcement (punishment) refers to the process or procedure by which a certain consequence strengthens
(weakens) a behavior. (230-231)

4. What is a discriminative stimulus? Define the three-term contingency and diagram an example. Be sure to
include the appropriate symbol for each component in the example.

A discriminative stimulus (SD) is a stimulus in the presence of which responses are reinforced and in the
absence of which they are not reinforced.
The three-term contingency consists of the discriminative stimulus, the operant response, and the reinforcer
or punisher. (The three-term contingency can also be viewed as consisting of an antecedent event, a behavior,
and a consequence.) For example:
Tone (SD): Lever press (R) —> Food pellet (SR)
(234-235)

5. Define positive reinforcement and diagram an example. Define negative reinforcement and diagram an
example. In the examples, be sure to include the appropriate symbol for each component.

Positive reinforcement consists of the presentation of a stimulus (one that is usually considered pleasant or
rewarding) following a response which then leads to an increase in the future strength of that response. For
example:
Lever press (R) —> Food (SR)
Negative reinforcement is the removal of a stimulus (one that is usually considered unpleasant or aversive)
following a response which then leads to an increase in the future strength of that response. For example:
Open umbrella (R) —> Escape rain (SR)
(240-241)

6. Define positive punishment and diagram an example. Define negative punishment and diagram an example. In
the examples, be sure to include the appropriate symbol for each component.

Positive punishment consists of the presentation of a stimulus (one that is usually considered unpleasant or
aversive) following a response which then leads to a decrease in the future strength of that response. For
example:
Talk back to the boss (R) —> Get reprimanded (SP)
Negative punishment consists of the removal of a stimulus (one that is usually considered pleasant) following a
response which then leads to a decrease in the future strength of that response. For example:
Stay out past curfew (R) —> Lose car privileges (SP)
(242-243)

7. What are the similarities and differences between negative reinforcement and positive punishment?

They are similar in that the consequence in each case involves an aversive stimulus. They are different in that in
negative reinforcement, the aversive stimulus is removed following the behavior in order to strengthen the
behavior, while in positive punishment, the aversive stimulus is presented following the behavior in order to
decrease the behavior. (240-242)

8. How does immediacy affect the strength of a reinforcer? How does this often lead to difficulties for students in
their academic studies?

In general, the more immediate the reinforcer, the stronger its effect upon the behavior.
The reinforcers for studying are usually delayed, while the reinforcers for doing something other than
studying are more immediate. Hence, students are strongly tempted to do something other than study. (246-247)

9. Distinguish between primary and secondary reinforcers and give an example of each.

A primary reinforcer is an event that is innately reinforcing. A secondary reinforcer is an event that is
reinforcing because it has been associated with some other reinforcer. Plus examples. (248)

10. What is a generalized reinforcer? What are two examples of such reinforcers?

A generalized reinforcer is a type of secondary reinforcer that has been associated with several other
reinforcers. Plus examples (money and attention are common examples of generalized secondary reinforcers).
(249)

11. Define intrinsic and extrinsic reinforcement, and provide an example of each.

Intrinsic reinforcement is reinforcement provided by the mere act of performing the behavior. Extrinsic
reinforcement is the reinforcement provided by some consequence that is external to the behavior. Plus
examples. (251)

12. Under what three conditions does extrinsic reinforcement undermine intrinsic interest? Under what two
conditions does extrinsic reinforcement enhance intrinsic interest?

Extrinsic reinforcement can undermine intrinsic motivation when the reward is (1) expected, (2) tangible, and
(3) given for simply performing the activity (not for how well it is performed). An increase in intrinsic
motivation can occur with (1) the use of verbal rewards, such as praise, and (2) the use of tangible rewards
given for high quality performance. (252)

13. Define natural and contrived reinforcers, and provide an example of each.

Contrived reinforcers are reinforcers that have been deliberately arranged in order to modify a behavior; they
are not a typical consequence of the behavior in that setting. Natural reinforcers are reinforcers that are typically
provided for a certain behavior; that is, they are a natural consequence of the behavior within that setting. Plus
examples. (254)


14. Define shaping. What are two advantages of using a secondary reinforcer, such as a sound, as an aid to shaping?

Shaping is the gradual creation of new operant behavior through reinforcement of successive approximations to
that behavior. The benefit of using a sound as a reinforcer is that it can be presented immediately upon the
occurrence of the behavior, even if the animal is a distance away. Also, if food was presented each time as a
reinforcer, the animal would quickly satiate, at which point the food would be ineffective as a reinforcer. (255-
256)

15. Reinforcers and punishers are formally defined entirely by their effect on what? Why, in a practical sense, is
this important? Illustrate your answers with some examples. WWW

Reinforcers and punishers are defined entirely by their effect on behavior. For example, a laugh is a reinforcer
for the behavior of joke telling only to the extent that joke telling then increases. If, for some reason, joke telling
decreased as a result of the laugh, the laugh would by definition be a punisher. It is important to remember this,
because events that on the surface seem like reinforcers or punishers do not always function in that manner. For
example, teachers sometimes yell at their students for being disruptive, and as a result the students become
MORE (not less) disruptive. Although the teacher is clearly trying to punish the disruptive behavior, the yelling
is actually having the opposite effect. By definition, therefore, the yelling is a reinforcer because it is causing
the disruptive behavior to increase in frequency. (232)

16. List four of the distinctions between classical and operant conditioning that are outlined in the text. WWW

Four of the following:


Classical conditioning vs. operant conditioning:
1. Classical: Behavior typically seen as involuntary & inflexible. Operant: Behavior generally seen as voluntary & flexible.
2. Classical: Behavior said to be “elicited by the stimulus.” Operant: Behavior said to be “emitted by the organism.”
3. Classical: Typically involves innate patterns of behavior (URs). Operant: Often does not involve innate patterns of behavior.
4. Classical: Behavior is a function of what comes before it, i.e., the preceding stimulus is critical and the consequences are largely irrelevant. Operant: Behavior is a function of what comes after it, i.e., the consequence is critical and the preceding stimulus signals the availability of that consequence.
5. Classical: Conditioning involves a stimulus-stimulus-response (S-S-R) sequence. Operant: Conditioning involves a stimulus-response-stimulus (S-R-S) sequence.
(238)

17. To what extent do behaviorists prefer natural versus contrived reinforcers? What is the hoped-for outcome when
contrived reinforcers are used? What is a big advantage of natural over contrived contingencies? Illustrate your
answers with examples. WWW

Although contrived reinforcers are often seen as a hallmark of behaviorism, behaviorists strive to utilize natural
reinforcers whenever possible. When contrived reinforcers are used, the ultimate goal is to let the “natural
contingencies” take over if possible. For example, although we might initially use tokens to motivate a patient
with schizophrenia to socialize with others, our hope is that the behavior will eventually become “trapped” by
the natural consequences of socializing (e.g., smiles and pleasant comments from others) such that the tokens
can eventually be withdrawn. A big advantage of natural contingencies is that they tend to produce more
efficient behavior patterns than contrived contingencies. Although a coach might use praise to reinforce correct
throwing actions by a young quarterback, the most important factor in producing correct throws will be the
natural consequence of where the ball goes. (254)
