1.7: Behavioral Approaches
The Behavioral Perspective: A Focus on Observable Behavior
The behavioral perspective is the psychological approach that suggests that the keys to understanding development are observable behavior and external stimuli in the environment. Behaviorism is a theory of learning, and learning theories focus on how we respond to events or stimuli rather than emphasizing internal factors that motivate our actions. These theories provide an explanation of how experience can change what we do.
Behaviorism emerged early in the 20th century and became a major force in American psychology. Championed by psychologists such as John B. Watson (1878–1958) and B. F. Skinner (1904–1990), behaviorism rejected any reference to mind and viewed overt and observable behavior as the proper subject matter of psychology. Through the scientific study of behavior, it was hoped that laws of learning could be derived that would promote the prediction and control of behavior. Russian physiologist Ivan Pavlov (1849–1936) influenced early behaviorism in America. His work on conditioned learning, popularly referred to as classical conditioning, provided support for the notion that learning and behavior were controlled by events in the environment and could be explained with no reference to mind or consciousness.
Classical Conditioning and Emotional Responses
Classical conditioning theory helps us to understand how our responses to one situation become attached to new situations. For example, a smell might remind us of our childhood. If you went to a new cafe with the same smell as your elementary school cafeteria, it might evoke the feelings you had when you were in school. Or a song on the radio might remind you of a memorable evening you spent with your first true love. Or, if you hear your entire name (Isaiah Wilmington Brewer, for instance) called as you walk across the stage to get your diploma and it makes you tense because it reminds you of how your father used to use your full name when he was mad at you, then you’ve been classically conditioned.
Figure 1. Ivan Pavlov
Classical conditioning explains how we develop many of our emotional responses to people or events or our “gut level” reactions to situations. New situations may bring about an old response because the two have become connected. Attachments form in this way. Addictions are affected by classical conditioning, as anyone who’s tried to quit smoking can tell you. When you try to quit, everything that was associated with smoking makes you crave a cigarette.
Pavlov and Classical Conditioning
Ivan Pavlov (1849–1936) was a Russian physiologist interested in studying digestion (Figure 1). As he recorded the amount of salivation his laboratory dogs produced as they ate, he noticed that they actually began to salivate before the food arrived, as the researcher walked down the hall and toward the cage. “This,” he thought, “is not natural!” One would expect a dog to automatically salivate when food hit its palate, but before the food arrives? Of course, what happened is that the dogs knew that the food was coming because they had learned to associate the footsteps with the food. The key word here is “learned.”
A learned response is called a “conditioned” response. Pavlov began to experiment with this “psychic” reflex. He began to ring a bell, for instance, prior to introducing the food. Sure enough, after making this connection several times, the dogs could be made to salivate to the sound of a bell. Once the bell had become an event to which the dogs had learned to salivate, it was called a conditioned stimulus. The act of salivating to a bell was a response that had also been learned, now termed, in Pavlov’s jargon, a conditioned response. Notice that the response, salivation, is the same whether it is conditioned or unconditioned (unlearned or natural). What changed is the stimulus to which the dog salivates. One is natural (unconditioned) and one is learned (conditioned) (Figure 2).
Watson and Behaviorism
Let’s think about how classical conditioning is used on people, and not just with dogs. One of the most widespread applications of classical conditioning principles was brought to us by the psychologist John B. Watson. Watson proposed that the process of classical conditioning (based on Pavlov’s observations) was able to explain all aspects of human psychology. After doing research on animal behavior, he established the psychological school of behaviorism. This school was extremely influential in the middle of the 20th century, when B. F. Skinner developed it further.
Watson believed that most of our fears and other emotional responses are classically conditioned. He gained a good deal of popularity in the 1920s with his expert advice on parenting offered to the public. He believed that parents could be taught to help shape their children’s behavior and tried to demonstrate the power of classical conditioning with his famous experiment with a 9-month-old boy named “Little Albert.” Watson sat Albert down and introduced a variety of seemingly scary objects to him: a burning piece of newspaper, a white rat, etc. But Albert remained curious and reached for all of these things. Watson knew that one of our only inborn fears is the fear of loud noises, so he proceeded to make a loud noise each time he introduced one of Albert’s favorites, a white rat. After hearing the loud noise several times paired with the rat, Albert soon came to fear the rat and began to cry when it was introduced.
" Gaaye Mouse - Cutest White Rat Ever! " by Gaaye Mouse is licensed under CC BY-NC-ND 2.0 .
Watson filmed this experiment for posterity and used it to demonstrate that he could help parents achieve any outcomes they desired if they would only follow his advice. Watson wrote columns in newspapers and in magazines and gained a lot of popularity among parents eager to apply science to household order. Parenting advice was not the legacy Watson left us, however; where he really made his impact was in advertising. After Watson left academia, he went into the world of business and showed companies how to tie something that brings about a natural positive feeling to their products to enhance sales. Thus the union of sex and advertising!
Operant Conditioning
Now we turn to the second type of associative learning, operant conditioning. In operant conditioning, organisms learn to associate a behavior and its consequence (Table 1). A pleasant consequence makes that behavior more likely to be repeated in the future. For example, Spirit, a dolphin at the National Aquarium in Baltimore, does a flip in the air when her trainer blows a whistle. The consequence is that she gets a fish.
" The Dolphin Habitat " by Cayusa is licensed under CC BY-NC 2.0 .
Psychologist B. F. Skinner saw that classical conditioning is limited to existing behaviors that are reflexively elicited, and it doesn’t account for new behaviors such as riding a bike. He proposed a theory about how such behaviors come about. Skinner believed that behavior is motivated by the consequences we receive for the behavior: the reinforcements and punishments. His idea that learning is the result of consequences is based on the law of effect, which was first proposed by psychologist Edward Thorndike. According to the law of effect, behaviors that are followed by consequences that are satisfying to the organism are more likely to be repeated, and behaviors that are followed by unpleasant consequences are less likely to be repeated.[3] Essentially, if an organism does something that brings about a desired result, the organism is more likely to do it again. If an organism does something that does not bring about a desired result, the organism is less likely to do it again. An example of the law of effect is employment. One of the reasons (and often the main reason) we show up for work is that we get paid to do so. If we stop getting paid, we will likely stop showing up—even if we love our job.
Working with Thorndike’s law of effect as his foundation, Skinner began conducting scientific experiments on animals (mainly rats and pigeons) to determine how organisms learn through operant conditioning.[4] He placed these animals inside an operant conditioning chamber, which has come to be known as a “Skinner box” (Figure 3). A Skinner box contains a lever (for rats) or disk (for pigeons) that the animal can press or peck for a food reward via the dispenser. Speakers and lights can be associated with certain behaviors. A recorder counts the number of responses made by the animal.
Skinner believed that we learn best when our actions are reinforced. For example, a child who cleans his room and is reinforced (rewarded) with a big hug and words of praise is more likely to clean it again than a child whose deed goes unnoticed. Skinner believed that almost anything could be reinforcing. A reinforcer is anything following a behavior that makes it more likely to occur again. It can be something intrinsically rewarding (called intrinsic or primary reinforcers), such as food or praise, or it can be something that is rewarding because it can be exchanged for what one really wants (such as receiving money and using it to buy a cookie). Such reinforcers are referred to as secondary reinforcers.
Comparing Classical and Operant Conditioning
| | Classical Conditioning | Operant Conditioning |
|---|---|---|
| Conditioning approach | An unconditioned stimulus (such as food) is paired with a neutral stimulus (such as a bell). The neutral stimulus eventually becomes the conditioned stimulus, which brings about the conditioned response (salivation). | The target behavior is followed by reinforcement or punishment to either strengthen or weaken it so that the learner is more likely to exhibit the desired behavior in the future. |
| Stimulus timing | The stimulus occurs immediately before the response. | The stimulus (either reinforcement or punishment) occurs soon after the response. |
Attributions:
The above content was remixed from:
2.5: Behavioral Perspective is shared under a CC BY-SA 4.0 license and was authored, remixed, and/or curated by Laura Overstreet, Diana Lang, & Sonja Ann Miller.