LEARNING

 

v     Learning refers to a relatively durable change in behavior or knowledge that is due to experience.  This includes habits, preferences, skills, etc. 

v     Conditioning = learning. Whenever you see the word "conditioned," you can replace it with "learned." 

 

CLASSICAL CONDITIONING

Understanding Classical Conditioning: Pavlov's & Watson's Contributions

C.C. involves what happens before a response - the antecedents. Through classical conditioning, a stimulus will come to evoke a response that was originally evoked by a separate stimulus.

 


v     Pavlov's Experiment

Ø      Pavlov rang a bell and then immediately put meat powder on the dog's tongue, causing the dog to salivate reflexively.  The bell is a neutral stimulus (NS) - meaning it produces no response naturally; animals don't come pre-wired to respond to bells.  The meat powder is an unconditioned stimulus (UCS) - a stimulus that is responded to naturally because it's biologically significant.  The drooling is the unconditioned response (UCR) - an unlearned response.  Pavlov did multiple pairings of bell + food = salivation.  Later, he rang the bell but withheld the UCS, and the dogs still salivated: the association linked the bell with the meat powder, so bell ---> drool.  The bell became a conditioned stimulus (CS) - a previously neutral stimulus that, through this process of conditioning, began to act like a UCS in that it could elicit a response on its own.  The response that the CS elicits is called the conditioned response (CR).

 

NS + UCS -------> UCR  (during conditioning)

CS ----------> CR  (after conditioning)
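The pairing process above can be sketched as a toy simulation (my illustration, not anything from Pavlov or the book; the learning rate and the response threshold are made-up numbers):

```python
# Toy model of acquisition: each bell + food pairing nudges the bell's
# associative strength toward 1.0. The 0.3 learning rate and the 0.5
# response threshold are arbitrary illustration values.

def pair(strength, learning_rate=0.3):
    """One NS + UCS pairing; strength climbs toward its ceiling of 1.0."""
    return strength + learning_rate * (1.0 - strength)

strength = 0.0                    # the bell starts out neutral (NS)
for trial in range(10):           # repeated pairings = acquisition
    strength = pair(strength)

salivates_to_bell_alone = strength > 0.5    # the bell now acts as a CS
print(round(strength, 2), salivates_to_bell_alone)   # 0.97 True
```

After enough pairings the strength saturates near 1.0, which is why ringing the bell alone (withholding the UCS) still produces drooling.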

v     C.C. can be responsible for many emotional responses that we have, both good ones & bad ones.  Ex:  phobias - highly irrational fears.

 

Principles of Classical Conditioning

v     Acquisition - the initial learning of the response. 

v     Stimulus contiguity - the association in time between 2 events.  Generally, the closer in time the NS is to the UCS, the easier conditioning will be. 

Ø      Ex:  If Pavlov rang the bell, but didn't present the food until an hour later, the dogs probably would not have made the connection between bell & food.

v     Extinction happens when the CS (it's no longer neutral because it's eliciting a CR) is no longer followed by the UCS. 

v     Sometimes responses reappear even after a long time of extinction.  This is called spontaneous recovery.  It's a reappearance of an extinguished response after a period of nonexposure to the CS.

Ø      Even though vodka doesn't make you sick anymore, one day you smell it & feel really sick again. 

v     Sometimes after conditioning has occurred, animals may show a tendency to give the conditioned response to stimuli that are similar to the CS. This is known as stimulus generalization.

Ø      Now your sickness w/ vodka has generalized! All alcohol makes you want to hurl!

v     Stimulus discrimination is the opposite tendency.  Stimulus discrimination is when you give the CR only for the actual CS, not for similar stimuli.

Ø      No wait, you only want to hurl on the vodka.

v     Sometimes, after conditioning, the CS itself can function like a UCS: a new neutral stimulus paired with an established CS will begin to elicit the CR too.  This is a process called higher-order conditioning.

v     Practical uses of Classical conditioning - your book has some great info on page 210. READ.

 

OPERANT CONDITIONING

Unlike classical conditioning, where responses are controlled by the stimuli that precede them, in operant conditioning responses are controlled by what follows them.  Behaviors are guided by their consequences.

 

v     Behaviors that are reinforced will tend to occur again.  Behaviors that are punished will tend not to occur again.

Ø      reinforcers - a reinforcer is anything that follows a response that will increase the likelihood of that response occurring again.

Ø      punisher - a punisher is anything that follows a response that will decrease the likelihood of that response occurring again.

Ø      Some research on operant conditioning has been conducted using an operant conditioning chamber, better known as a SKINNER BOX.

v     Superstitious behavior happens when the reinforcer follows both the desired response and some other behavior occurring at about the same time.  You reinforced more than you intended to: the target behavior, but also another one.

v     When you're trying to initially condition an animal, it's best if reinforcement is immediate.  You press the lever - you get the food.

v     In shaping, you reinforce successive approximations of the desired response.  Shaping is necessary when an organism doesn't, on its own, emit the desired response.  You have to shape it or mold it.

v     In operant conditioning, extinction refers to the gradual weakening and disappearance of a response tendency because the response is no longer followed by a reinforcer.  So, once the reinforcement is gone, the response will gradually fade away.

Ø      If you didn't get paid, you probably wouldn't go to work.  Unless, of course, you go to work for the sheer love of the job.  And in that case, that's reinforcing enough.  But in the real world, we need $$$$.

v     Resistance to extinction occurs when an organism continues to make the response after the delivery of the reinforcer has stopped.

Ø      You don't drop out of school if you make one bad grade.  The Coke machine doesn't give you your soda, but you don't swear off Coke machines forever. 

Ø      People at the casino.  Enough said.

v     Operant stimulus discrimination - the ability to tell the difference between stimuli that precede behaviors leading to reinforcement & stimuli that precede no reinforcement.

v     Generalization - the tendency to respond to stimuli that are similar to the ones that were present when you got reinforced.

Ø      You can work one Coke machine, you can work them all.

v     Primary & secondary reinforcers:

Ø      Primary - biologically significant. Ex: Food

Ø      Secondary - have learned value.  Ex:  Dollar bills.

v     Schedules of Reinforcement

Ø      Continuous - reinforcement for every correct response

Ø      Partial - reinforcement for only some correct responses

§         Variable - changing

§         Fixed - set

§         Interval - time

§         Ratio - number of correct responses

·        FR (fixed ratio) - bonus for every 5th car.

·        VR (variable ratio) - selling things door to door.

·        FI (fixed interval) - the washing machine.

·        VI (variable interval) - watching for shooting stars.
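One way to sketch the four partial schedules in code (the specific numbers - a ratio of 5, a 30-second interval, a 20% payoff chance - are arbitrary stand-ins, not anything from the notes):

```python
import random

def fixed_ratio(n_responses, ratio=5):
    # FR: reinforce after every `ratio`-th response (bonus for every 5th car)
    return n_responses % ratio == 0

def variable_ratio(p=0.2):
    # VR: each response has some unpredictable chance of paying off
    # (door-to-door sales)
    return random.random() < p

def fixed_interval(elapsed, interval=30.0):
    # FI: the first response after a set amount of time is reinforced
    # (the washing machine)
    return elapsed >= interval

def variable_interval(elapsed, next_interval):
    # VI: the required wait varies unpredictably (watching for shooting stars)
    return elapsed >= next_interval

print(fixed_ratio(10))        # 10th response on an FR-5 schedule -> True
print(fixed_interval(45.0))   # 45 s elapsed on an FI-30 schedule -> True
```

Note how ratio schedules count responses while interval schedules only watch the clock - that difference is what produces their characteristic response patterns.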

 

REINFORCEMENT & PUNISHMENT

v     Anything that leads to an increase in behavior is reinforcement.

Ø      ANYTHING.  If I spit on you and your behavior continues to increase as I spit on you, then that's reinforcement.  Reinforcement doesn't have to be what most people consider to be pleasant (like praise, cookies, etc).

v     Anything that leads to a decrease in behavior is punishment. 

v     If I give you money, but your behavior slows & decreases, then it is still punishment. 

 

TWO KINDS OF REINFORCEMENT

Ø      Positive - adding something in.  Ex:  giving dog treats

Ø      Negative - taking something away.  Ex:  the alarm clock example from class.

TWO KINDS OF PUNISHMENT

Ø      Positive - adding something in.  Ex:  spanking

Ø      Negative - taking something away.  Ex:  you're grounded! No TV for you!
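The 2x2 above (add vs. remove, behavior increases vs. decreases) can be summarized in a tiny sketch using the definitions straight from these notes:

```python
# Reinforcement vs. punishment depends ONLY on the behavior's direction of
# change; positive vs. negative depends ONLY on whether a stimulus is added
# or removed.

def classify(stimulus_added, behavior_increases):
    kind = "reinforcement" if behavior_increases else "punishment"
    sign = "positive" if stimulus_added else "negative"
    return f"{sign} {kind}"

print(classify(True,  True))    # dog treat -> positive reinforcement
print(classify(False, True))    # alarm shuts off -> negative reinforcement
print(classify(True,  False))   # spanking -> positive punishment
print(classify(False, False))   # no TV -> negative punishment
```

This is why spitting on someone can still count as reinforcement: the classification looks only at what the behavior does afterward, not at whether the consequence seems pleasant.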

 

Side effects of punishment:  Please see Table 6.5 on page 219.

 

COGNITIVE-SOCIAL LEARNING (Cognitive-Behavioral Learning)

Learning is more than classical & operant conditioning.  This view recognizes the role of thought/mental processes in learning.  Learning/behavior -----> from both cognitive & behavioral factors.

 

Insight & Latent Learning:  Where is the Reinforcement?

Insight - a sudden flash of understanding.  Great example: Köhler's chimps - see page 226 of book.

Latent learning is learning that occurs without any behavioral indication at the time - you may never have driven to Target yourself, but you can tell me how to get there. 

 

 

 

Modeling

Observational learning (or modeling) - Bandura.  Observational learning means that we learn by watching & imitating others.  (How did you learn to shave?  How did you learn to work things in your household?  You grew up watching your parent(s)/guardian(s).)  Read about Bandura's experiment with the Bobo doll.

 

The necessities of observational learning:

1.  Attention - did you pay attention to the behavior as it was being shown?

2. Retention - do you remember the steps?

3.  Motor reproduction - can you actually reproduce these steps physically?

4.  Reinforcement - do you expect to be reinforced for the behavior or not?