Operant Conditioning Vocabulary

These are some basic terms that you need to familiarize yourself with in order to better grasp the material presented in the section on operant conditioning and in the lecture. The definitions are pulled from various sources, and each term is given more than one in hopes that multiple phrasings will expand your understanding. One definition may help clarify the next! Good luck!

Operant Conditioning

A form of learning in which responses come to be controlled by their consequences

Instrumental conditioning is another name for operant conditioning

Shaping

The reinforcement of closer & closer approximations of a desired response

Teaching a desired response by reinforcing a series of successive steps leading to this final response

Extinction

The gradual weakening & disappearance of a response tendency because the response is no longer followed by a reinforcer

Primary Reinforcers

Events that are inherently reinforcing because they satisfy biological needs

Stimuli that increase the probability of a response and whose value does not need to be learned, such as food, water, & sex

Secondary Reinforcers

Events that acquire reinforcing qualities through learning but are not in themselves biologically significant

Stimuli that increase the probability of a response & whose reinforcing properties are learned, such as money & material possessions

Schedule of Reinforcement

Determines which occurrences of a specific response result in the presentation of a reinforcer

The pattern or rule that determines when a response is reinforced

Continuous Reinforcement

Occurs when every instance of a designated response is reinforced

Reinforcement in which every response is reinforced

Partial Reinforcement

Occurs when a designated response is reinforced only some of the time

Reinforcement in which some, but not all, responses are reinforced

Fixed Ratio (FR) schedule

The reinforcer is given after a fixed number of nonreinforced responses

A partial schedule of reinforcement in which a subject must make a certain number of responses before being reinforced

Variable Ratio (VR) schedule

The reinforcer is given after a variable number of nonreinforced responses

A schedule of reinforcement in which the subject is reinforced after a variable number of responses

Fixed Interval (FI) schedule

The reinforcer is given for the first response that occurs after a fixed time interval has elapsed

Schedule of reinforcement in which a subject is reinforced for the first response after a specific period of time has elapsed

Variable Interval (VI) schedule

The reinforcer is given for the first response after a variable time interval has elapsed

Schedule of reinforcement in which a subject is reinforced for the first response after a variable period of time has elapsed
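If it helps to see the four partial schedules written out as explicit rules, here is a minimal Python sketch. The function names, parameters, and the probabilistic shortcut used for the variable-ratio rule are illustrative assumptions, not anything from the lecture or textbook:

```python
import random

# Each function answers one question: does THIS response earn a reinforcer?
# (Illustrative sketch only; names & structure are not from the course materials.)

def fixed_ratio(responses_since_reinforcer, n=10):
    """FR-n: reinforce after a fixed number of responses (here, every nth)."""
    return responses_since_reinforcer >= n

def variable_ratio(mean_n=10, rng=random):
    """VR-n: reinforce after a number of responses that varies around mean_n.
    Giving each response a 1/mean_n chance makes the required count vary."""
    return rng.random() < 1.0 / mean_n

def fixed_interval(seconds_since_reinforcer, interval=60):
    """FI: reinforce the first response after a fixed time has elapsed."""
    return seconds_since_reinforcer >= interval

def variable_interval(seconds_since_reinforcer, required_interval):
    """VI: reinforce the first response after a time that varies around a mean.
    required_interval would be redrawn at random after each reinforcer."""
    return seconds_since_reinforcer >= required_interval
```

Notice that the two ratio rules count responses while the two interval rules watch the clock, and "fixed" vs. "variable" is just whether the threshold stays the same from one reinforcer to the next.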

Reinforcement

Occurs when an event following a response increases an organism's tendency to make that response

Any action or event that increases the probability that a response will be repeated

Punishment

Occurs when an event following a response decreases the tendency to make that response

Any action or event that decreases the likelihood of a response being repeated

 

Please keep in mind that these definitions are provided in addition to the ones that you will find in your textbook. These extra definitions are offered in hopes that multiple phrasings will help you better understand the material.

 

And, as always, please remember that all materials presented here are purely supplemental in nature. They are not intended to supersede or substitute for the information presented in lecture & in the textbook! Thanks again & enjoy.