The “marshmallow test” is an often-cited study when talking about “what it takes” to be successful in life. In the early 1970s, psychologist Walter Mischel, a professor at Stanford University, set up an experiment in which preschool-aged children were given a marshmallow to enjoy now, but were told that they could have another in fifteen minutes if they were able to wait. Some of the kids ate the marshmallow right away. Others waited and were able to enjoy two marshmallows. These children were then followed into their teen years to see whether there were any differences between “those that could wait” and the ones that decided to just eat the marshmallow right away. As it turned out, for those that were able to wait, there was a significant difference in SAT scores, educational attainment, and even body-mass index (BMI). Subsequent brain imaging studies on the original participants, when they were in their 20s, even showed anatomical differences in the brains of those that waited.

Thus, the ability to self-regulate impulses from a young age was proven to be a marker of future success for all human beings.

That is, if you make certain assumptions. One assumption is that a small study conducted in one particular place can be representative of all people in any place. Another is that the observed results reflect something within the child (i.e., an ability to control impulses), as opposed to reflecting the child’s environment. Unfortunately, both of these assumptions are wrong, and therefore so are the common conclusions drawn from this famous study. Now, the marshmallow test does prove something; it just has nothing to do with the kids.

Is it normal to be WEIRD?

The first false assumption is that these marshmallow-eating kids, 32 of them in the original study, are typical kids who can appropriately represent all kids. The problem is that these kids weren’t typical; they were WEIRD. Most kids aren’t WEIRD. Most people aren’t WEIRD. Let me explain.

WEIRD is an acronym used to describe the context in which the vast majority of published research in the social sciences (psychology, sociology) is conducted. WEIRD stands for Western, Educated, Industrialized, Rich, and Democratic. The problem is that, out of the world’s population of over 7 billion people, roughly 90 percent of human beings on the planet right now don’t grow up or live in WEIRD cultures.

If we’re talking about what the most statistically “average” person is like, that “typical” person is most likely a Chinese male with no bank account, making less than $12,000 a year. He does have a cell phone, though. Another way to say that is that the most typical representation of a human being living today is not a white kid whose parents are college professors at one of the world’s most elite universities, in one of the wealthiest neighborhoods in the country, like any one of the original 32 children in the study. So that’s partly why the marshmallow test doesn’t really tell us what the researchers claim. Small sample size aside, those kids don’t represent most other kids very well.

Researchers are finally acknowledging that this WEIRD bias is a problem. The question we should ask is whether race, education, economics, or government has an impact on how a person matures into adulthood. In other words, how significant is the environment in which someone grows up? It’s actually really, really significant. I’ll be talking in depth about this in part 2 of this Impact of Environment series.

For now, let’s talk about culture, context, and perspective a bit more to unpack why WEIRD populations make for an especially problematic group to study if we are trying to understand the whole of humanity.

What do you think about me?

The most populous countries in the world by far are China and India. The United States is a distant third, followed closely by another Asian country, Indonesia. No European countries make the top ten, unless you count Russia at number nine, which is debatable. Canada, the largest white-majority country outside of Europe and the United States, has about the same number of people as the greater Tokyo area in Japan. Australia has about the same number of people as Southern California alone, roughly 1.8% of the population of China.

As an Asian American, I have lived a bicultural experience. My parents speak to me in Chinese. I speak back to them in English and Chinese. I have a large extended immigrant family, where most of my aunts and uncles grew up in Asia but have lived most of their lives in America. I was born here so my exposure to Chinese culture is through the context of my family relationships and our shared experiences. As is the case with most children of immigrants, my lived experience has given me a natural understanding that there’s more than one way to say things, more than one way of doing things, more than one way of knowing what is right and wrong.

One of the most obvious ways in which my Asian “normal” and my American “normal” come into conflict is how we see ourselves relative to those around us. The main difference is that most cultures take a “collectivist” perspective rather than an “individualistic” one. That means that, by default, most people in the world see their identity as defined by the group of people they are connected to, rather than by their own individual traits and experiences.

Not only that, having a socially prioritized perspective is also the default perspective that human beings have had for all of our history, dating back hundreds of thousands of years. It is only in the past few hundred years that Individualism has even been considered a valid alternative view, and again, only in countries influenced by Western European philosophy after the Dark Ages. It’s also worth noting that while Europe went through a cultural collapse after the fall of the Roman Empire, the rest of the world enjoyed a continuous history of cultural and technological growth and development without interruption. That means that what is “normal” in Asia, Africa, the Middle East, South America, and also parts of Northern Europe is shaped by thousands of years of continuous culture, connected to our collectivist hunter-gatherer and early agricultural roots, where people lived collaboratively, fairly, interdependently, and successfully for a long, long, long time. Going back even further, from a biological evolutionary standpoint, our human brains adapted over millions of years to thrive in a socially interdependent context, not as a bunch of smart apes living in isolation.

This historical view is important because, living in America, it can seem like Individualism is some innate human way of existing, even though it’s really an outlier, both in this moment in time and across pretty much all of human history. However, if you’ve grown up here like I have, it seems perfectly normal to aspire to be an independent, self-sufficient jack-of-all-trades. In fact, we often use this measuring stick to judge how mature a person is, or whether or not their parents “did a good job” of raising them. Being able to take care of yourself, manage your own bills, be financially independent of your parents, cook your own meals, keep your place clean, make a difference in the world: this is for many people the description of a successful adult. It is even the prerequisite for being ready to have a serious long-term relationship or to become a parent. You may even be tempted to wonder if this person was one of the kids that would have waited fifteen minutes for the second marshmallow.

And so, all these things considered, it is not surprising that, starting from a default WEIRD-normalizing bias and layering on a narrow Individualistic worldview, the outcome of the marshmallow test is interpreted and readily accepted as the result of an innate, universal human quality: self-control.

Well, here’s the rub. The marshmallow experiment has since been repeated, but this time over 900 children were studied, and the researchers made sure that the children included in the study came from different cultural, racial, and socioeconomic backgrounds. And what did they find? Nothing nearly as compelling as the narrative that came out of the original study. In fact, when the researchers adjusted the data to account for certain environmental factors, no significant difference remained between those that waited and those that didn’t. This happens all the time in social science research, where different researchers conducting the same experiments don’t get the same results. It is super problematic when the results of one small experiment become one of the foundational principles of what parents try to instill in their children.

If we aren’t being WEIRD about it, what does the marshmallow test tell us?

Flawed and biased interpretation aside, here’s what is worth thinking about in the original Stanford study. Assuming the measured outcomes were genuine, and the kids who waited really did do measurably better in terms of future achievement, what was actually being measured, if not internal self-discipline?

Well, if we shift away from an individualistic perspective, which is biased towards explaining things as products of personal strengths and shortcomings, and instead look at those kids as being influenced by their surroundings and environments, then a simple explanation emerges, one that is also consistent with why the repeated study yielded no differences between kids who waited and kids who didn’t.

The kids who waited truly believed that better things were coming to them. The kids who didn’t wait didn’t have the same conviction. And since we are talking about preschool-aged kids, it is very likely that this early worldview, in which certain kids had the audacity to expect future fulfillment, was instilled in them by their primary caretakers early in life. In other words, these were learned beliefs shaped by their environment.

“We can’t go to the aquarium today but we’ll go later, I promise.”

For a kid whose family can economically afford a stay-at-home parent, this statement is probably just a matter of convenience. She might have gone to the aquarium five times already because the family has an annual membership, so going again is a reasonable expectation. She might have even heard her dad tell her this very same thing in the past, and lo and behold, he really did take her to the aquarium a few days later. For this kid, if a researcher says you can have another marshmallow in fifteen minutes if you just wait a bit, she’s been in this situation before and has no reason to doubt that she’ll get a better deal later.

For a kid who lives in poverty, this is an issue of parental good intentions limited by economic scarcity. There are a lot of reasons there won’t be a future trip to the aquarium, and this kid may have had enough experiences in the past to know that good intentions don’t lead to kept promises.

It doesn’t have to be economic scarcity, either. For a kid with busy caregivers who are short on time or energy, a broken promise might be an unfortunate but regular experience. For a kid with an absent parent or a sick parent, neglect may be an unintentional consequence of the scarcity of adequate caregiving.

So let’s consider that these kids were not exhibiting self-control over impulses; perhaps they were making good decisions based on what they had learned to expect from the promises adults made to them.

Kids who grow up with more than enough can easily expect that their needs will be met both in the present and in the future. Their decisions are then based on what they think is better: what they can have now, or what they can have later.

Kids who grow up with barely enough, or not enough, correctly take what is in front of them, based on past experience and logical decision making, because there’s no reason to believe that promises of future fulfillment are as reliable as what can be had now. This is not a reflection of poor impulse control but rather a reflection of correctly predicting the future.

So the Stanford version of the marshmallow test told us more about the kids’ parents and their home environment than it told us about some magical innate self-discipline that certain kids are born with and that destines them for future achievement. Those future achievements could just as well be explained by saying that the kids who waited had families that could reliably provide for the kids’ present and future needs, and that instilled in their children the confidence that their parents would be reliable and support them in all their endeavors.

And this is the real truth of what it takes for kids to grow up to be healthy, successful, thriving adults, and it is the opposite of being self-oriented or self-dependent. What it takes for people to thrive is outside resources, reliable and timely help, and people who believe in you.

In part 2, I’ll expand on this further.

Subscribe to this blog or my mailing list, or follow my Facebook page, to stay notified about future articles and speaking events.

Image Credit: Shutterstock
