Myers’ Psychology for AP®, 2e


1 Myers’ Psychology for AP®, 2e
David G. Myers. PowerPoint Presentation Slides by Kent Korek, Germantown High School. Worth Publishers, © 2014. AP® is a trademark registered and/or owned by the College Board®, which was not involved in the production of, and does not endorse, this product.

2 Unit 6: Learning

3 Unit 06 - Overview
How We Learn and Classical Conditioning
Operant Conditioning
Operant Conditioning’s Applications, and Comparison to Classical Conditioning
Biology, Cognition, and Learning
Learning By Observation
Click on any of the above hyperlinks to go to that section in the presentation.

4 Module 26: How We Learn and Classical Conditioning

5

6 How Do We Learn?

7 How Do We Learn? Learning Habituation Stimulus Associative learning
Classical conditioning Operant conditioning Cognitive learning Observational learning

8 Learning the process of acquiring new and relatively enduring information or behaviors. Learning enables us to adapt to our environments, which is perhaps our most outstanding trait as human beings.

9 Basic Forms of Learning
Habituation- an organism’s decreasing response to a stimulus with repeated exposure to it. Habituation is NOT the same thing as sensory adaptation. Both involve a diminished response, but in the case of habituation, it is a form of learning, not a function of the sensory system. Essentially, through experience, you learn to react to a stimulus less.

10 Basic Forms of Learning
Cognitive Learning- the acquisition of mental information, whether by observing events, by watching others, or through language. Ex. Observational Learning- learning by observing others. Also called social learning.

11 Basic Forms of Learning Continued
Associative Learning- learning that certain events occur together. The events may be two stimuli (as in classical conditioning) or a response and its consequence (as in operant conditioning). Stimulus- any event or situation that evokes a response.

12 Conditioning Two Basic Forms
Classical Conditioning- we learn to associate two stimuli and thus to anticipate events. Operant Conditioning- We learn to associate a response (our behavior) and its consequences, thus we (and other animals) learn to repeat acts followed by good results and avoid acts followed by bad results.

13 How Do We Learn? Classical Conditioning

14 How Do We Learn? Classical Conditioning

15 How Do We Learn? Classical Conditioning

16 How Do We Learn? Classical Conditioning

17 How Do We Learn? Classical Conditioning

18 How Do We Learn? Classical Conditioning

19 How Do We Learn? Operant Conditioning

20 How Do We Learn? Operant Conditioning

21 How Do We Learn? Operant Conditioning

22 How Do We Learn? Operant Conditioning

23 Classical Conditioning

24 Classical Conditioning
Ivan Pavlov John B. Watson Behaviorism

25 Classical Conditioning
a type of learning in which one learns to link two or more stimuli and anticipate events.* First explored, very famously, by Ivan Pavlov. *the unconditioned response must be automatic.

26 Classical Conditioning Pavlov’s Experiments
Ivan Pavlov

27 Classical Conditioning Pavlov’s Experiments
Parts of Classical Conditioning Neutral Stimulus (NS) Unconditioned stimulus (US) Unconditioned response (UR) Conditioned stimulus (CS) Conditioned response (CR)

28 Parts of Classical Conditioning
Neutral Stimulus (NS)- in classical conditioning, a stimulus that elicits no response before conditioning.
Unconditioned Stimulus (US)- in classical conditioning, a stimulus that unconditionally – naturally and automatically – triggers a response (UR).
Unconditioned Response (UR)- in classical conditioning, an unlearned, naturally occurring response to the unconditioned stimulus (US), such as salivation when food is in the mouth.
Conditioned Stimulus (CS)- in classical conditioning, an originally irrelevant stimulus that, after association with an unconditioned stimulus (US), comes to trigger a conditioned response (CR).
Conditioned Response (CR)- in classical conditioning, a learned response to a previously neutral (but now conditioned) stimulus (CS).

29 Pavlov’s device for recording (a tube in the dog’s cheek collects saliva, which
is measured in a cylinder outside the chamber).

30 Classical Conditioning Pavlov’s Experiments

31 Classical Conditioning Pavlov’s Experiments

32 Classical Conditioning Pavlov’s Experiments

33 Classical Conditioning Pavlov’s Experiments
Pavlov presented a neutral stimulus (a tone) just before an unconditioned stimulus (food in mouth). The neutral stimulus then became a conditioned stimulus, producing a conditioned response.

34 Classical Conditioning

35 Classical Conditioning

36 Classical Conditioning
Psychologist Michael Tirrell recalled that his first girlfriend loved onions, so he came to associate onion breath with kissing.

37 Typical Example: A mild electric shock is used to teach a rat to flex his forepaw when he hears a tone.
US: Shock
UR: Paw flex
NS: Tone
CS: Tone
CR: Paw flex

38 Office Example In the groundbreaking TV show, The Office, there is, allegedly, a famous example of classical conditioning. Looking at it more closely, there are a few problems with this example. Why?

39 Office Example
US: Altoid?
UR: Hand extension
NS: Reboot sound?
CS: Reboot sound?
CR: Hand extension
Is the Altoid really an Unconditioned Stimulus? Is the hand extension really an Unconditioned Response?

40 Classical Conditioning Acquisition
Higher-order conditioning

41 Acquisition in classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned response. In operant conditioning, the strengthening of a reinforced response.

42 Higher Order Conditioning
a procedure in which the conditioned stimulus in one conditioning experience is paired with a new neutral stimulus, creating a second (often weaker) conditioned stimulus. For example, an animal that has learned that a tone predicts food might then learn that a light predicts the tone and begin responding to the light alone. (Also called second-order conditioning.)

43 Classical Conditioning Extinction and Spontaneous Recovery
The rising curve shows that the CR grows stronger as the NS is repeatedly paired with the US to become the CS (acquisition), then weakens as the CS is presented alone (extinction).
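The shape of that curve can also be sketched with a toy associative-strength model. This is only an illustration with arbitrary parameters (a simple learning-rate update standing in for the CR; it is not Pavlov’s data):

    # Toy sketch of acquisition and extinction (illustrative only; parameters are arbitrary).
    # During acquisition the CS is paired with the US, so associative strength rises;
    # during extinction the CS is presented alone, so the strength decays back toward zero.
    learning_rate = 0.3
    strength = 0.0            # associative strength of the CS (a stand-in for CR size)
    history = []

    for trial in range(20):   # acquisition trials: CS followed by US
        strength += learning_rate * (1.0 - strength)
        history.append(strength)

    for trial in range(20):   # extinction trials: CS alone, no US
        strength += learning_rate * (0.0 - strength)
        history.append(strength)

    for i, s in enumerate(history, 1):
        print(f"trial {i:2d}: {'#' * int(s * 40)}")

Run as a Python script, it prints a bar that lengthens quickly over the paired trials and then shrinks over the CS-alone trials, mirroring the rise and fall of the curve described above.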

44 Extinction the diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus (US) does not follow a conditioned stimulus (CS); occurs in operant conditioning when a response is no longer reinforced.

45 Spontaneous Recovery the reappearance, after a pause, of an extinguished conditioned response.

46 Classical Conditioning Extinction and Spontaneous Recovery
After a pause, the CR reappears (spontaneous recovery)

47 Classical Conditioning Generalization

48 Generalization the tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses. Ex. A dog conditioned to one tone would also respond to a new and somewhat different tone.

49 Classical Conditioning Discrimination

50 Discrimination in classical conditioning, the learned ability to distinguish between a conditioned stimulus and stimuli that do not signal an unconditioned stimulus. Ex. Guard dog vs guide dog.

51 Classical Conditioning Pavlov’s Legacy
Classical conditioning applies to other organisms Showed how to study a topic scientifically

52 Pavlov Was a Huge Influence on Watson and Behaviorism
John Watson used the ideas of Pavlov as a basis for his work, which is the foundation of behaviorism. Behaviorism- the view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most research psychologists today agree with (1) but not with (2).

53 Classical Conditioning Pavlov’s Legacy: Applications of Classical Conditioning
John Watson and Baby Albert

54 Baby Albert
US: Loud noise
UR: Fear behaviors
NS: White rat
CS: White rat
CR: Fear behaviors

55 Module 27: Operant Conditioning

56

57 Operant Conditioning

58 Operant Conditioning
Classical conditioning: respondent behavior
Operant conditioning: actions associated with consequences (operant behavior)

59 Operant Conditioning a type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher. The focus is on using positive consequences to encourage the continuance of behavior, and negative consequences to encourage the avoidance of behavior. The underlying principle is what Thorndike called the Law of Effect.

60 Skinner’s Experiments

61 Skinner’s Experiments
Edward Thorndike’s Law of Effect B.F. Skinner Behavioral technology Behavior control

62 Law of Effect Thorndike’s principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely. Edward L. Thorndike was a tremendous influence on psychology and behaviorism. Thorndike used a fish reward to entice cats to find their way out of a puzzle box. This would be a huge influence on B.F. Skinner.

63 B.F. Skinner B.F. Skinner used the ideas of Thorndike to develop his operant chamber, or Skinner box, in which he could measure the impact of reinforcement on behavior, without interference from outside sources.

64 Operant Chamber and Reinforcement
Operant Chamber- in operant conditioning research, a chamber (also known as a Skinner Box) containing a bar or key that an animal can manipulate to obtain a food or water reinforcer; attached devices record the animal’s rate of bar pressing or key pecking. Reinforcement- in operant conditioning, any event that strengthens the behavior it follows.

65 Skinner’s Experiments
Operant Chamber (Skinner Box) Reinforcement

66

67 Classical vs Operant https://www.youtube.com/watch?v=H6LEcM0E0io
Classical Conditioning- we learn to associate two stimuli and thus to anticipate events. Operant Conditioning- We learn to associate a response (our behavior) and its consequences, thus we (and other animals) learn to repeat acts followed by good results and avoid acts followed by bad results.

68 Skinner’s Experiments Shaping Behavior
Successive approximations Discriminative stimulus

69 Shaping an operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of the desired behavior. By making rewards contingent on desired behaviors, researchers and animal trainers gradually shape complex behaviors.

70 Shaping and Perception
Shaping can demonstrate how nonverbal organisms perceive. For example, when researchers can get a baby or a dog to respond to a particular stimulus and not to others, it demonstrates that the organism recognizes that stimulus. Discriminative Stimulus- in operant conditioning, a stimulus that elicits a response after association with reinforcement (in contrast to related stimuli not associated with reinforcement).

71 Positive vs Negative Reinforcement
Positive Reinforcement- increasing behaviors by presenting positive reinforcers. A positive reinforcer is any stimulus that, when presented after a response, strengthens the response. Negative Reinforcement- increases behaviors by stopping or reducing negative stimuli, such as shock. A negative reinforcer is any stimulus that, when removed after a response, strengthens the response. Note: negative reinforcement is NOT punishment.

72 Skinner’s Experiments Types of Reinforcers
Positive reinforcement Negative reinforcement

73 Skinner’s Experiments Types of Reinforcers
Positive reinforcement Negative reinforcement

74 Skinner’s Experiments Types of Reinforcers
Positive reinforcement Negative reinforcement

75 Skinner’s Experiments Types of Reinforcers
Positive reinforcement Negative reinforcement

76 Skinner’s Experiments Types of Reinforcers
Positive reinforcement Negative reinforcement

77 Primary Reinforcer vs Conditioned Reinforcer
Primary Reinforcer- an innately reinforcing stimulus, such as one that satisfies a biological need. -food, water, eliminating pain, etc. Conditioned Reinforcer- a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer. -A pleasant tone of voice, a smile, money, good grades, etc.

78 Immediate vs Delayed Reinforcers
Immediate reinforcers are those provided very soon after a desired act. Delayed reinforcers are those that do not come immediately. Animals typically need reinforcements that come within 30 seconds of the desired behavior for them to learn. An essential element of being human is the ability to delay reinforcement.

79 Skinner’s Experiments Types of Reinforcers: Primary and Secondary Reinforcers
Primary reinforcer Conditioned reinforcer (secondary reinforcer) Immediate vs delayed reinforcers

80

81 Skinner’s Experiments Reinforcement Schedules
Reinforcement Schedule- a pattern that defines how often a desired response will be reinforced.
Continuous reinforcement- reinforcing the desired response every time it occurs.
Partial (intermittent) reinforcement- reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement.
Fixed-ratio schedule- in operant conditioning, a reinforcement schedule that reinforces a response only after a specific number of responses.
Variable-ratio schedule- in operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses.

82 Reinforcement Schedules Continued
Fixed-interval schedule- in operant conditioning, a reinforcement schedule that reinforces a response only after a specific time has elapsed.
Variable-interval schedule- in operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals.
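Illustration only: the four partial schedules can be written as simple decision rules. The parameter values below (every 10th response, 60-second waits) are hypothetical, and each function merely answers whether the current response earns a reinforcer:

    import random

    def fixed_ratio(response_count, n=10):
        # reinforce only after a specific number of responses (here, every 10th response)
        return response_count > 0 and response_count % n == 0

    def variable_ratio(p=0.1):
        # reinforce after an unpredictable number of responses (on average 1 in 10)
        return random.random() < p

    def fixed_interval(seconds_since_last_reinforcer, interval=60.0):
        # reinforce the first response made after a fixed time has elapsed
        # (the caller resets its clock whenever a reinforcer is delivered)
        return seconds_since_last_reinforcer >= interval

    def make_variable_interval(mean_interval=60.0):
        # reinforce the first response after an unpredictable wait; a new required
        # wait is drawn each time a reinforcer is delivered
        required = random.expovariate(1 / mean_interval)
        def check(seconds_since_last_reinforcer):
            nonlocal required
            if seconds_since_last_reinforcer >= required:
                required = random.expovariate(1 / mean_interval)
                return True
            return False
        return check

Ratio rules depend on how many responses have been made, interval rules on how much time has passed, and the variable versions replace the fixed count or wait with an unpredictable one, which ties to the later point that unpredictable schedules produce more consistent responding.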

83 Skinner’s Experiments Reinforcement Schedules

84 Skinner’s Experiments Reinforcement Schedules

85 Skinner’s Experiments Reinforcement Schedules

86 Skinner’s Experiments Reinforcement Schedules

87 Skinner’s Experiments Reinforcement Schedules

88 Skinner’s Experiments Reinforcement Schedules

89 Skinner’s Experiments Reinforcement Schedules

90 Skinner’s Experiments Reinforcement Schedules

91 Skinner’s Experiments Reinforcement Schedules

92 Skinner’s Experiments Reinforcement Schedules

93 Previous Chart Skinner’s laboratory pigeons produced these response patterns to each of four reinforcement schedules (reinforcers are indicated by diagonal marks). For people, as for pigeons, reinforcement linked to number of responses (a ratio schedule) produces a higher response rate than reinforcement linked to amount of time elapsed (an interval schedule). But the predictability of the reward also matters. An unpredictable (variable) schedule produces more consistent responding than does a predictable (fixed) schedule.

94 Skinner’s Experiments Punishment
Punishment- an event that tends to decrease the behavior that it follows. Positive punishment Negative punishment

95 Skinner’s Experiments Punishment
Positive punishment Negative punishment

96 Skinner’s Experiments Punishment
Positive punishment Negative punishment

97 Skinner’s Experiments Punishment
Positive punishment Negative punishment

98 Skinner’s Experiments Punishment
Positive punishment Negative punishment

99 The Big Bang…Again.

100 Skinner’s Experiments Punishment
Negatives of using punishment:
Punished behavior is suppressed, not forgotten
Punishment teaches discrimination
Punishment can teach fear
Physical punishment may increase aggression

101 Skinner’s Legacy

102 Skinner’s Legacy Controversies surrounding Skinner’s Operant Conditioning

103 Skinner’s Legacy B.F. Skinner insisted that outside forces shaped behavior, and encouraged using the methods of reinforcement to produce positive results. He believed reinforcement and punishment were already shaping behavior anyway, so why not control the process and shape behavior in desirable ways? Critics felt that Skinner ignored individuality and free will, essentially stripping people of their humanity.

104 Module 28: Operant Conditioning’s Applications, and Comparison to Classical Conditioning

105

106 Application of Operant Conditioning

107 Application of Operant Conditioning
At school- Quizlet, Socrative, candy.
In sports- starting with a task that is easier to accomplish and working up; superstitions, i.e., certain socks, batter's box routines, etc.
At work- a boss passing out praise, bonuses, little competitions, etc.; noticing someone doing things right and reinforcing it.
At home- potty training, putting kids to bed, etc.
For self-improvement- getting your spouse to do things you want, conditioning yourself to study or exercise, etc.

108 Keys to Using Conditioning
State your goal in measurable terms, and announce it. Monitor how often you engage in your desired behavior. Reinforce the desired behavior. Reduce the rewards gradually.

109

110 Biofeedback A system for electronically recording, amplifying, and feeding back information regarding a subtle physiological state, such as blood pressure or muscle tension. In the 1960s, some research made the possibilities of biofeedback seem fantastic. As it turns out, research indicates that tension headaches are the problem most responsive to biofeedback.

111 Contrasting Classical and Operant Conditioning

112 Contrasting Classical and Operant Conditioning

113 Contrasting Classical and Operant Conditioning

114 Contrasting Classical and Operant Conditioning

115 Contrasting Classical and Operant Conditioning

116 Contrasting Classical and Operant Conditioning

117 Contrasting Classical and Operant Conditioning

118 Contrasting Classical and Operant Conditioning

119 Contrasting Classical and Operant Conditioning

120 Contrasting Classical and Operant Conditioning

121 Contrasting Classical and Operant Conditioning

122 Contrasting Classical and Operant Conditioning

123 Contrasting Classical and Operant Conditioning

124 Contrasting Classical and Operant Conditioning

125 Contrasting Classical and Operant Conditioning

126 Contrasting Classical and Operant Conditioning

127 Contrasting Classical and Operant Conditioning

128 Module 29: Biology, Cognition, and Learning

129 Biological Constraints on Conditioning

130 Biological Constraints on Conditioning

131 Biological Constraints on Conditioning

132 Biological Constraints on Conditioning

133 Biological Constraints on Conditioning

134 Biological Constraints on Conditioning Limits on Classical Conditioning
John Garcia (and Robert Koelling) Conditioned Taste Aversion: rats were exposed to a particular taste, sight, or sound, and then exposed to material that would make them sick. The rats would then become averse to the taste, but not to the sight or sound. Even when the rats were sickened significantly later, they would still become taste averse. This is likely because taste, not sight or sound, is the cue best suited to helping a rat (or a human) detect and then avoid toxic food. Biologically primed associations are associations that develop because we are predisposed toward them. This is an example of natural selection at work.

135 A tainted bit of sheep meat was intentionally left out for the wolves to find. When they became sick from the meat, they later avoided attacking sheep altogether.

136 Biological Constraints on Conditioning Limits on Classical Conditioning

137 Biological Constraints on Conditioning Limits on Classical Conditioning

138 Biological Constraints on Conditioning Limits on Classical Conditioning

139 Biological Constraints on Conditioning Limits on Classical Conditioning

140 Biological Constraints on Conditioning Limits on Operant Conditioning
We most readily learn and retain behaviors that reflect our biological predispositions; put another way, biological constraints predispose organisms to learn associations that are naturally adaptive. Ex. Horses

141 Cognition’s Influence on Conditioning

142 Cognition’s Influence on Conditioning Cognitive Processes and Classical Conditioning
Cognitive processes (i.e., thoughts, perceptions, and expectations) can impact classical conditioning. Animals can learn the predictability of an event. Ex. Rats and dogs given a shock after every tone, with a light only sometimes preceding the tone, respond to the tone as the reliable predictor; the light remains relatively neutral.
Latent Learning- learning that occurs but is not apparent until there is an incentive to demonstrate it.
Cognitive Map- a mental representation of the layout of one’s environment. For example, after exploring a maze, rats act as if they have learned a cognitive map of it.
Insight- a sudden realization of a problem’s solution.
Intrinsic Motivation- a desire to perform a behavior effectively for its own sake. vs Extrinsic Motivation- a desire to perform a behavior to receive promised rewards or avoid threatened punishment.

143 Cognition’s Influence on Conditioning Cognitive Processes and Operant Conditioning
Latent learning Cognitive map Insight Intrinsic motivation Extrinsic motivation

144 Influences on Conditioning

145 Influences on Conditioning

146 Influences on Conditioning

147 Influences on Conditioning

148 Influences on Conditioning

149 Learning and Personal Control

150 Learning and Personal Control
Cope Problem-focused coping Emotion-focused coping

151 Learning and Personal Control
Coping- alleviating stress using emotional, cognitive, or behavioral methods. Problem-Focused Coping- attempting to alleviate stress directly – by changing the stressor or the way we interact with that stressor. Emotion-Focused Coping- attempting to alleviate stress by avoiding or ignoring a stressor and attending to emotional needs related to one’s stress reaction.

152 Learned Helplessness The hopelessness and passive resignation an animal or human learns when unable to avoid repeated aversive events. Ex. Dogs repeatedly given shocks with no way out later acted cowed and defeated when facing a shock, even though this time they had a path to escape. Dogs that had never been given a no-way-out scenario quickly escaped every time. People who are repeatedly faced with traumatic events over which they have no control will often develop learned helplessness. What are the social implications?

153 Learning and Personal Control Learned Helplessness
Learned helplessness (Martin Seligman)

154 Learning and Personal Control Learned Helplessness
Learned helplessness (Martin Seligman)

155 Learning and Personal Control Learned Helplessness
Learned helplessness (Martin Seligman)

156 Learning and Personal Control Learned Helplessness
Learned helplessness (Martin Seligman)

157 External locus of control Internal locus of control
Learning and Personal Control Learned Helplessness: Internal Versus External Locus of Control External locus of control Internal locus of control

158 External Locus of Control- the perception that chance or outside forces beyond our personal control determine our fate. Internal Locus of Control- the perception that you control your own fate.

159 Learning and Personal Control Learned Helplessness: Depleting and Strengthening Self-Control
Self-control- the ability to control impulses and delay short-term gratification for greater long-term rewards.

160 Module 30: Learning by Observation

161

162 Mirrors and Imitation in the Brain

163 Mirrors and Imitation in the Brain
Observational learning Social learning Modeling Bandura’s Bobo Doll Experiment

164 Learning by Observation
Observational Learning- learning by observing others. Also called social learning. Modeling- the process of observing and imitating a specific behavior. Ex. Albert Bandura had kids draw a picture while in a room with an adult working with tinker toys. The adult threw a temper tantrum and beat up a large inflatable doll called Bobo. Later, when the children were confronted with adversity, they behaved in a strikingly similar way compared with children who had not been exposed.

165 Mirrors and Imitation in the Brain

166 Mirrors and Imitation in the Brain

167 Mirrors and Imitation in the Brain

168 Mirrors and Imitation in the Brain
Mirror neurons

169 Mirror Neurons frontal lobe neurons that some scientists believe fire when performing certain actions or when observing another doing so. The brain’s mirroring of another’s action may enable imitation and empathy. First observed in monkeys who had a sensor placed in their brains to show activity when they moved. When the monkey saw people moving, the sensor would show activity, even though the monkey did not move.

170 Mirrors and Imitation in the Brain
Cognitive imitation

171 Cognitive Imitation Previous Page
Monkey A watched Monkey B touch four pictures on a display screen in a certain order to gain bananas. Monkey A learned to imitate that order, even when shown the same pictures in a different configuration.

172 Applications of Observational Learning

173 Applications of Observational Learning Prosocial versus Antisocial Effects
Prosocial effects Antisocial effects

174 Applications of Observational Learning
Prosocial Behavior- positive, constructive, helpful behavior. The opposite of antisocial behavior. Ex. Gandhi, MLK, good parental role models, good teachers and coaches, ministers, etc. What problems might come up?

175 Antisocial Effects Antisocial behavior- negative, destructive behavior. It can also be imitated. Violent TV and video games?

176 The End

177 Teacher Information Types of Files Animation
This presentation has been saved as a “basic” PowerPoint file. While this file format placed a few limitations on the presentation, it ensured the file would be compatible with the many versions of PowerPoint teachers use. To add functionality to the presentation, teachers may want to save the file for their specific version of PowerPoint. Animation Once again, to ensure compatibility with all versions of PowerPoint, none of the slides are animated. To increase student interest, it is suggested teachers animate the slides wherever possible. Adding slides to this presentation Teachers are encouraged to adapt this presentation to their personal teaching style. To help keep a sense of continuity, blank slides which can be copied and pasted to a specific location in the presentation follow this “Teacher Information” section.

178 Teacher Information Unit Coding
Just as Myers’ Psychology for AP 2e is color-coded to the College Board AP Psychology Course Description (Acorn Book) Units, so are these PowerPoints. The primary background color of each slide indicates the specific textbook unit.
Psychology’s History and Approaches
Research Methods
Biological Bases of Behavior
Sensation and Perception
States of Consciousness
Learning
Cognition
Motivation, Emotion, and Stress
Developmental Psychology
Personality
Testing and Individual Differences
Abnormal Psychology
Treatment of Abnormal Behavior
Social Psychology

179 Teacher Information Hyperlink Slides - This presentation contains two types of hyperlinks. Hyperlinks can be identified by the text being underlined and a different color (usually purple).
Unit subsections hyperlinks: Immediately after the unit title and module title slide, a page can be found listing all of the unit’s subsections. While in slide show mode, clicking on any of these hyperlinks will take the user directly to the beginning of that subsection.
Bold print term hyperlinks: Every bold print term from the unit is included in this presentation as a hyperlink. While in slide show mode, clicking on any of the hyperlinks will take the user to a slide containing the formal definition of the term. Clicking on the “arrow” in the bottom left corner of the definition slide will take the user back to the original point in the presentation. These hyperlinks were included for teachers who want students to see or copy down the exact definition as stated in the text. Most teachers prefer the definitions not be included to prevent students from only “copying down what is on the screen” and not actively listening to the presentation. For teachers who continually use the Bold Print Term Hyperlinks option, please contact the author using the address on the next slide to learn a technique to expedite returning to the original point in the presentation.

180 Teacher Information Continuity slides
Throughout this presentation there are slides, usually of graphics or tables, that build on one another. These are included for three purposes. By presenting information in small chunks, students will find it easier to process and remember the concepts. By continually changing slides, students will stay interested in the presentation. To facilitate class discussion and critical thinking. Students should be encouraged to think about “what might come next” in the series of slides. Please feel free to contact me at with any questions, concerns, suggestions, etc. regarding these presentations. Kent Korek Germantown High School Germantown, WI 53022

181 Division title (red print) subdivision title (blue print)
xxx

182 Division title (red print in text) subdivision title (blue print in text)
Use this slide to add a table, chart, clip art, picture, diagram, or video clip. Delete this box when finished

183 Definition Slide = add definition here

184 Definition Slides

185 Learning = the process of acquiring new and relatively enduring information or behaviors.

186 Habituation = an organism’s decreasing response to a stimulus with repeated exposure to it.

187 Associative Learning = learning that certain events occur together. The events may be two stimuli (as in classical conditioning) or a response and its consequence (as in operant conditioning).

188 Stimulus = any event or situation that evokes a response.

189 Cognitive Learning = the acquisition of mental information, whether by observing events, by watching others, or through language

190 Classical Conditioning
= a type of learning in which one learns to link two or more stimuli and anticipate events.

191 Behaviorism = the view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most research psychologists today agree with (1) but not with (2).

192 Neutral Stimulus = in classical conditioning, a stimulus that elicits no response before conditioning.

193 Unconditioned Response (UR)
= in classical conditioning, an unlearned, naturally occurring response to the unconditioned stimulus (US), such as salivation when food is in the mouth.

194 Unconditioned Stimulus (US)
= in classical conditioning, a stimulus that unconditionally – naturally and automatically – triggers a response (UR).

195 Conditioned Response (CR)
= in classical conditioning, a learned response to a previously neutral (but now conditioned) stimulus (CS).

196 Conditioned Stimulus (CS)
= in classical conditioning, an originally irrelevant stimulus that, after association with an unconditioned stimulus (US), comes to trigger a conditioned response (CR).

197 Acquisition = in classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned response. In operant conditioning, the strengthening of a reinforced response.

198 Higher-Order Conditioning
= a procedure in which the conditioned stimulus in one conditioning experience is paired with a new neutral stimulus, creating a second (often weaker) conditioned stimulus. For example, an animal that has learned that a tone predicts food might then learn that a light predicts the tone and begin responding to the light alone. (Also called second-order conditioning.)

199 Extinction = the diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus (US) does not follow a conditioned stimulus (CS); occurs in operant conditioning when a response is no longer reinforced.

200 Spontaneous Recovery = the reappearance, after a pause, of an extinguished conditioned response.

201 Generalization = the tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses.

202 Discrimination = in classical conditioning, the learned ability to distinguish between a conditioned stimulus and stimuli that do not signal an unconditioned stimulus.

203 Operant Conditioning = a type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher.

204 Law of Effect = Thorndike’s principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely.

205 Operant Chamber = in operant conditioning research, a chamber (also known as a Skinner Box) containing a bar or key that an animal can manipulate to obtain a food or water reinforcer; attached devices record the animal’s rate of bar pressing or key pecking.

206 Reinforcement = in operant conditioning, any event that strengthens the behavior it follows.

207 Shaping = an operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of the desired behavior.

208 Discriminative Stimulus
= in operant conditioning, a stimulus that elicits a response after association with reinforcement (in contrast to related stimuli not associated with reinforcement).

209 Positive Reinforcement
= increasing behaviors by presenting positive reinforcers. A positive reinforcer is any stimulus that, when presented after a response, strengthens the response.

210 Negative Reinforcement
= increases behaviors by stopping or reducing negative stimuli, such as shock. A negative reinforcer is any stimulus that, when removed after a response, strengthens the response. Note: negative reinforcement is NOT punishment.

211 Primary Reinforcer = an innately reinforcing stimulus, such as one that satisfies a biological need.

212 Conditioned Reinforcer
= a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer.

213 Reinforcement Schedule
= a pattern that defines how often a desired response will be reinforced.

214 Continuous Reinforcement
= reinforcing the desired response every time it occurs.

215 Partial (intermittent) Reinforcement
= reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement.

216 Fixed-Ratio Schedule = in operant conditioning, a reinforcement schedule that reinforces a response only after a specific number of responses.

217 Variable-Ratio Schedule
= in operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses.

218 Fixed-Interval Schedule
= in operant conditioning, a reinforcement schedule that reinforces a response only after a specific time has elapsed.

219 Variable-Interval Schedule
= in operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals.

220 Punishment = an event that tends to decrease the behavior that it follows.

221 Biofeedback = a system for electronically recording, amplifying, and feeding back information regarding a subtle physiological state, such as blood pressure or muscle tension.

222 Respondent Behavior = behavior that occurs as an automatic response to some stimulus.

223 Operant Behavior = behavior that operates on the environment, producing consequences.

224 Cognitive Map = a mental representation of the layout of one’s environment. For example, after exploring a maze, rats act as if they have learned a cognitive map of it.

225 Latent Learning = learning that occurs but is not apparent until there is an incentive to demonstrate it.

226 Insight = a sudden realization of a problem’s solution.

227 Intrinsic Motivation = a desire to perform a behavior effectively for its own sake.

228 Extrinsic Motivation = a desire to perform a behavior to receive promised rewards or avoid threatened punishment.

229 Coping = alleviating stress using emotional, cognitive, or behavioral methods.

230 Problem-Focused Coping
= attempting to alleviate stress directly – by changing the stressor or the way we interact with that stressor.

231 Emotion-Focused Coping
= attempting to alleviate stress by avoiding or ignoring a stressor and attending to emotional needs related to one’s stress reaction.

232 Learned Helplessness = the hopelessness and passive resignation an animal or human learns when unable to avoid repeated aversive events.

233 External Locus of Control
= the perception that chance or outside forces beyond our personal control determine our fate.

234 Internal Locus of Control
= the perception that you control your own fate.

235 Self-Control = the ability to control impulses and delay short-term gratification for greater long-term rewards.

236 Observational Learning
= learning by observing others. Also called social learning.

237 Modeling = the process of observing and imitating a specific behavior.

238 Mirror Neurons = frontal lobe neurons that some scientists believe fire when performing certain actions or when observing another doing so. The brain’s mirroring of another’s action may enable imitation and empathy.

239 Prosocial Behavior = positive, constructive, helpful behavior. The opposite of antisocial behavior.

