Skinner Box: What Is an Operant Conditioning Chamber?
Charlotte Nickerson
Research Assistant at Harvard University
Undergraduate at Harvard University
Charlotte Nickerson is a student at Harvard University obsessed with the intersection of mental health, productivity, and design.
Saul McLeod, PhD
Editor-in-Chief for Simply Psychology
BSc (Hons) Psychology, MRes, PhD, University of Manchester
Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.
Olivia Guy-Evans, MSc
Associate Editor for Simply Psychology
BSc (Hons) Psychology, MSc Psychology of Education
Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.
The Skinner box is a chamber that isolates the subject from the external environment and contains a response mechanism, such as a lever or a button.
When the animal presses the button or lever, the box can deliver positive reinforcement of the behavior (such as food), a punishment (such as a loud noise), or a conditioned reinforcer (such as a light) that signals the upcoming reinforcement or punishment.
- The Skinner box, otherwise known as an operant conditioning chamber, is a laboratory apparatus used to study animal behavior within a compressed time frame.
- Underlying the development of the Skinner box was the concept of operant conditioning, a type of learning that occurs as a consequence of a behavior.
- The Skinner Box has often been confused with the Skinner air crib, with detrimental consequences for Skinner's public reputation.
- Commentators have drawn parallels between the Skinner box and modern advertising and game design, citing their addictive qualities and systematized rewards.
How Does It Work?
The Skinner Box is a chamber, often small, that is used to conduct operant conditioning research with animals. Within this chamber, there is usually a lever or key that an animal can operate to obtain food or water, which serve as reinforcers.
The chamber is connected to electronic equipment that records the animal’s lever pressing or key pecking, allowing for the precise quantification of behavior.
Before the works of Skinner, the namesake of the Skinner box, instrumental learning was typically studied using a maze or puzzle box.
These settings are well-suited to examining discrete trials or episodes of behavior rather than a continuous stream of behavior.
The Skinner box, meanwhile, was designed as an experimental environment better suited to examine the more natural flow of behavior in animals.
The design of the Skinner Box varies considerably depending on the species of animal enclosed within it and the experimental variables.
Nonetheless, it includes, at minimum, at least one lever, bar, or key that an animal can manipulate. Besides the reinforcer and tracker, a Skinner box can include other stimuli, such as lights, sounds, or images. In some cases, the floor of the chamber may even be electrified (Boulay, 2019).
The design of the Skinner box is intended to keep an animal from experiencing other stimuli, allowing researchers to carefully study behavior in a very controlled environment.
This allows researchers to, for example, determine which schedule of reinforcement — or relation of rewards and punishment to the reinforcer — leads to the highest rate of response in the animal being studied (Boulay, 2019).
The Reinforcer
The reinforcer is whatever the box delivers to strengthen an action. For instance, the apparatus may dispense a pellet of food when a lever is pressed a certain number of times; the pellet, rather than the lever itself, is the reinforcer (Boulay, 2019).
The Tracker/Quantifier
The tracker, meanwhile, provides quantitative data regarding the reinforcer. For example, the tracker may count the number of times that a lever is pressed or the number of electric shocks or pellets dispensed (Boulay, 2019).
Partial Reinforcement Schedules
Partial reinforcement occurs when reinforcement is only given under particular circumstances. For example, a pellet or a shock may be delivered only after an animal has pressed the lever a certain number of times.
There are several types of partial reinforcement schedules (Boulay, 2019):
- Fixed-ratio schedules, where an animal receives a pellet after pressing the lever a certain number of times.
- Variable-ratio schedules, where animals receive reinforcement after a random number of responses.
- Fixed-interval schedules, where animals are given a pellet after a designated period of time has elapsed, such as every 5 minutes.
- Variable-interval schedules, where animals receive a reinforcer after unpredictable periods of time have elapsed.
Once data has been obtained from the Skinner box, researchers can look at the rate of response depending on the schedule.
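To make the four schedule rules concrete, they can be sketched as a small simulation. This is a minimal illustration rather than laboratory software; the function names (`make_schedule`, `respond`) and the choice of uniformly distributed random intervals are our own assumptions, not part of the original apparatus.

```python
import random

def make_schedule(kind, value, rng=None):
    """Return a function respond(t) -> bool saying whether a response
    at time t (seconds) earns a reinforcer.

    kind  : 'FR', 'VR', 'FI', or 'VI'
    value : ratio requirement (FR/VR) or interval in seconds (FI/VI)
    """
    rng = rng or random.Random()
    state = {"count": 0,
             "next_time": value if kind == "FI" else rng.uniform(0, 2 * value)}

    def respond(t):
        if kind == "FR":                      # fixed-ratio: every `value`-th response
            state["count"] += 1
            if state["count"] >= value:
                state["count"] = 0
                return True
            return False
        if kind == "VR":                      # variable-ratio: each response pays off
            return rng.random() < 1 / value   # with probability 1/value (mean ratio = value)
        if kind == "FI":                      # fixed-interval: first response after
            if t >= state["next_time"]:       # `value` seconds is reinforced
                state["next_time"] = t + value
                return True
            return False
        if kind == "VI":                      # variable-interval: like FI, but the
            if t >= state["next_time"]:       # waiting time is drawn at random
                state["next_time"] = t + rng.uniform(0, 2 * value)
                return True
            return False
        raise ValueError(kind)

    return respond

# A rat pressing once per second for 60 seconds under a fixed-ratio-5 schedule:
fr5 = make_schedule("FR", 5)
pellets = sum(fr5(t) for t in range(60))  # one pellet per 5 presses -> 12 pellets
```

Running many simulated sessions like this, and varying how often the animal responds, is essentially what researchers do when comparing response rates across schedules.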
The Skinner Box in Research
Modified versions of the operant conditioning chamber, or Skinner box, are still widely used in research settings today.
Skinner developed his theory of operant conditioning by identifying four types of consequences: positive and negative reinforcement, and positive and negative punishment.
To test the effect of these outcomes, he constructed a device called the “Skinner Box,” a cage in which a rat could be placed, with a small lever (which the rat would be trained to press), a chute that would release pellets of food, and a floor which could be electrified.
For example, a hungry rat was placed in the cage. Every time it activated the lever, a food pellet fell into the food dispenser (positive reinforcement). The rat quickly learned to go straight to the lever after being placed in the box a few times.
This suggests that positive reinforcement increases the likelihood of the behavior being repeated.
In another experiment, a rat was placed in a cage in which it was subjected to an uncomfortable electrical current.
As it moved around the cage, the rat hit the lever, which immediately switched off the electrical current (negative reinforcement). The rat quickly learned to go straight to the lever after being placed in the box a few times.
This suggests that negative reinforcement increases the likelihood of the behavior being repeated.
The device allowed Skinner to deliver each of his four potential outcomes, which are:
- Positive Reinforcement: a direct reward for performing a certain behavior. For instance, the rat could be rewarded with a pellet of food for pushing the lever.
- Positive Punishment: a direct negative outcome following a particular behavior. Once the rat had been taught to press the lever, for instance, Skinner trained it to cease this behavior by electrifying the floor each time the lever was pressed.
- Negative Reinforcement: the removal of an unpleasant situation when a particular behavior is performed (thus producing a sense of relief). For instance, a mild electric current was passed through the floor of the cage and was removed when the desired behavior was performed.
- Negative Punishment: involves taking away a reward or removing a pleasant situation. In the Skinner box, for instance, the rat could be trained to stop pressing the lever by releasing food pellets at regular intervals and then withholding them when the lever was pressed.
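The four outcomes differ along just two dimensions: whether a stimulus is added or removed, and whether the behavior subsequently becomes more or less frequent. As a rough illustration of that 2×2 grid (the function `classify` and its argument names are our own, not Skinner's terminology beyond the four labels):

```python
def classify(stimulus_change, behavior_effect):
    """Map Skinner's two dimensions onto the four consequence types.

    stimulus_change : 'added' or 'removed'
    behavior_effect : 'increases' or 'decreases' (future frequency of the behavior)
    """
    table = {
        ("added",   "increases"): "positive reinforcement",  # e.g., food pellet delivered
        ("added",   "decreases"): "positive punishment",     # e.g., floor electrified
        ("removed", "increases"): "negative reinforcement",  # e.g., current switched off
        ("removed", "decreases"): "negative punishment",     # e.g., pellets withheld
    }
    return table[(stimulus_change, behavior_effect)]

# The shock-removal experiment above: a stimulus is removed and
# lever pressing becomes more frequent.
print(classify("removed", "increases"))  # negative reinforcement
```

Note that "positive" and "negative" here describe adding versus removing a stimulus, not whether the outcome is pleasant: a shock is a *positive* punishment because it is added.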
Commercial Applications
The application of operant and classical conditioning and the corresponding idea of the Skinner Box in commercial settings is widespread, particularly with regard to advertising and video games.
Advertisers use a number of techniques based on operant conditioning to influence consumer behavior, such as the variable-ratio reinforcement schedule (the so-called “slot machine effect”), which encourages viewers to keep watching a particular channel in the hope of seeing a desirable outcome (e.g., winning a prize) (Vu, 2017).
Similarly, video game designers often employ Skinnerian principles in order to keep players engaged in gameplay.
For instance, many games make use of variable-ratio schedules of reinforcement, whereby players are given rewards (e.g., points, new levels) at random intervals.
This encourages players to keep playing in the hope of receiving a reward. In addition, many games make use of Skinner’s principle of shaping, whereby players are gradually given more difficult tasks as they master the easy ones. This encourages players to persevere in the face of frustration in order to see results.
There are a number of potential problems with using operant conditioning principles in commercial settings.
First, advertisers and video game designers may inadvertently create addictive behaviors in consumers.
Second, operant conditioning is a relatively short-term phenomenon; that is, it only affects behavior while reinforcement is being given.
Once reinforcement is removed (e.g., the TV channel is changed, the game is turned off), the desired behavior is likely to disappear as well.
As such, operant conditioning techniques may backfire, fostering compulsive play rather than the engaging experiences developers hoped for (Vu, 2017).
Skinner Box Myths
In 1945, B. F. Skinner invented the air crib, a metal crib with walls and a ceiling made of removable safety glass.
The front pane of the crib was also made of safety glass, and the entire structure was meant to sit on legs so that it could be moved around easily.
The air crib was designed to create a climate-controlled, healthier environment for infants. The air crib was not commercially successful, but it did receive some attention from the media.
In particular, Time magazine ran a story about the air crib in 1947, which described it as a “baby tender” that would “give infant care a new scientific basis” (Joyce & Faye, 2010).
Widespread public misunderstanding of the device, however, resulted in the perpetuation of the myth that the air crib was a Skinner Box and that the infants placed in the crib were being conditioned.
In reality, the air crib was nothing more than a simple bassinet with some features that were meant to make it easier for parents to care for their infants.
There is no evidence that Skinner ever used the air crib to condition children, and in fact, he later said that it was never his intention to do so.
One famous myth surrounding the air crib was that Skinner's daughter, Deborah Skinner, was raised in a Skinner Box.
According to this rumor, Deborah Skinner had become mentally ill, sued her father, and died by suicide as a result of her experience. These rumors persisted until she publicly denied the stories in 2004 (Joyce & Faye, 2010).
Effectiveness
One of the most common criticisms of the Skinner box is that it does not allow animals to understand their actions.
Because behaviorism does not require that an animal understand its actions, this theory can be somewhat misleading about the degree to which an animal actually understands what it is doing (Boulay, 2019).
Another criticism of the Skinner box is that it can be quite stressful for the animals involved. The design of the Skinner box is intended to keep an animal from experiencing other stimuli, which can lead to stress and anxiety.
Finally, some critics argue that the data obtained from Skinner boxes may not be generalizable to real-world situations.
Because the environment in a Skinner box is so controlled, it may not accurately reflect how an animal would behave in an environment outside the lab.
There are very few learning environments in the real world that replicate a perfect operant conditioning environment, with a single action or sequence of actions leading to a stimulus (Boulay, 2019).
References

Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice Hall.
Dezfouli, A., & Balleine, B. W. (2012). Habits, action sequences and reinforcement learning. European Journal of Neuroscience, 35 (7), 1036-1051.
Du Boulay, B. (2019). Escape from the Skinner Box: The case for contemporary intelligent learning environments. British Journal of Educational Technology, 50 (6), 2902-2919.
Chen, C., Zhang, K. Z., Gong, X., & Lee, M. (2019). Dual mechanisms of reinforcement reward and habit in driving smartphone addiction: the role of smartphone features. Internet Research.
Dad, H., Ali, R., Janjua, M. Z. Q., Shahzad, S., & Khan, M. S. (2010). Comparison of the frequency and effectiveness of positive and negative reinforcement practices in schools. Contemporary Issues in Education Research, 3 (1), 127-136.
Diedrich, J. L. (2010). Motivating students using positive reinforcement (Doctoral dissertation).
Dozier, C. L., Foley, E. A., Goddard, K. S., & Jess, R. L. (2019). Reinforcement. The Encyclopedia of Child and Adolescent Development, 1-10.
Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
Gunter, P. L., & Coutinho, M. J. (1997). Negative reinforcement in classrooms: What we’re beginning to learn. Teacher Education and Special Education, 20 (3), 249-264.
Joyce, N., & Faye, C. (2010). Skinner Air Crib. APS Observer, 23 (7).
Kamery, R. H. (2004, July). Motivation techniques for positive reinforcement: A review. In Allied Academies International Conference. Academy of Legal, Ethical and Regulatory Issues. Proceedings (Vol. 8, No. 2, p. 91). Jordan Whitney Enterprises, Inc.
Kohler, W. (1924). The mentality of apes. London: Routledge & Kegan Paul.
Staddon, J. E., & Niv, Y. (2008). Operant conditioning. Scholarpedia, 3 (9), 2318.
Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century.
Skinner, B. F. (1948). “Superstition” in the pigeon. Journal of Experimental Psychology, 38, 168-172.
Skinner, B. F. (1951). How to teach animals. Freeman.
Skinner, B. F. (1953). Science and human behavior. New York: Macmillan.
Skinner, B. F. (1963). Operant behavior. American Psychologist, 18 (8), 503.
Smith, S., Ferguson, C. J., & Beaver, K. M. (2018). Learning to blast a way into crime, or just good clean fun? Examining aggressive play with toy weapons and its relation with crime. Criminal Behaviour and Mental Health, 28 (4), 313-323.
Staddon, J. E., & Cerutti, D. T. (2003). Operant conditioning. Annual Review of Psychology, 54 (1), 115-144.
Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2(4), i-109.
Vu, D. (2017). An Analysis of Operant Conditioning and its Relationship with Video Game Addiction.
Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158-177.