Are You Painting Yourself and Others Into a Corner?

The importance of holding opposing stories in our heads

Sofonisba Anguissola, “Self-Portrait at the Easel Painting a Devotional Panel,” 1556 (Wikimedia Commons)


This summer, a Vanity Fair headline about the 2019 disappearance of Jennifer Farber caught my eye, lifting the veil anew on a stranger's ghost story.

The story of Ms. Farber has haunted me from the get-go, despite the fact that I'm not especially voyeuristic in this regard. I don't watch true crime and investigative news shows like Dateline. I don't pick up People to read an exposé about a scandalous and violent event. This is not for a lack of compassion, but a desire not to become desensitized or fan the flames of sensationalism.

So, my interest and anxiety around this particular case surprised me. I was at a loss to understand what drew me to this particular tragedy.

Sadly, the facts of the case still leave us wondering what happened. In May 2019, 50-year-old Jennifer Farber disappeared from her Connecticut home. Her body has yet to be recovered. Authorities immediately suspected her husband, Fotis Dulos, with whom she'd been in bitter divorce negotiations for over two years, but armed with only circumstantial evidence they moved slowly, hoping her body would turn up. Ten months after Jennifer's disappearance, they finally arrested Dulos. While out on bail, however, he died by suicide upon learning that his bail was about to be revoked and that he would face an indictment.

Dulos left behind no information as to Jennifer's whereabouts or what had happened. The couple is survived by five children, including two sets of twins.

Press coverage almost exclusively painted her as a missing suburban mom. But reducing the 50-year-old’s life to a familiar tabloid trope missed so much of her story.

The summer of 2019 was an emotional time for me. I was living through the breakup of my own marriage, so it wasn't a stretch to see why this story struck a nerve. As a mom and a wife, I would find this kind of loss incomprehensible even in normal times. Still, I was strangely invested in Jennifer Farber’s story.

Clicking open the Vanity Fair article, I better understood. The article explores Jennifer Farber’s wider identity as a woman, much of which was left out of the expansive media coverage of her disappearance. It got me thinking about labels, yet again. Who was Jennifer Farber beyond the melodrama of her disappearance? Who was Jennifer beyond the label of suburban mom?

Earlier this year, I wrote an article challenging LinkedIn to reconsider its use of labels. At the time, the professional networking platform offered very limited options for people, especially women, to call out time away from paid employment. To LinkedIn’s credit, it subsequently added “stay-at-home mom/dad/parent” to the dropdown menus and made other adjustments that now allow job seekers to capture a nontraditional career path on their digital resume.

Notably, however, the label “stay-at-home-mom” isn’t without its critics, which goes to show the messiness of labels. Labels are simplistic shortcuts that take on an all-or-nothing meaning. We’re this, or we’re that. We wear a mask, or we don’t. All sorts of assumptions follow from that one fixed idea, usually limiting our view of the full picture. In today’s polarized political climate and amid a pandemic, it’s almost impossible to escape labels. Yet even in less extreme times, labels are everywhere.

Given the possibility of a hurtful outcome, why do we paint ourselves and others into a corner? What can we do to reduce the likelihood of causing harm?

LABELS AREN’T NECESSARILY BAD

The brain likes categories. The NPR podcast, Invisibilia, examines the unseeable forces that control human behavior and shape our ideas, beliefs, and assumptions. In 2015, the show looked at the way in which categories play an invisible role in our lives, noting that even babies start to place objects into categories from about the age of four months.

The ability to categorize allows the brain to recognize the objects around us as members of a particular category; all our knowledge of that category then guides our response to that thing. Organizing the world around us means we don’t have to start from scratch each time we encounter an object, obviously saving us a ton of time each day and adding to our quality of life. Imagine, for example, not recognizing a cup or a dog or a bus, or not knowing how to group individual letters into words and sentences. In the 1980s, a scientific study of a stroke victim who had lost the ability to categorize, but who was otherwise cognitively fine, revealed that when shown two pictures of two different trains, he could not see the connection between them.

Lumping things together is a survival skill that allows our brains to make sense of an endless flow of data.

The thing is, we can’t shut off our brain’s natural tendency to group things together. And, once we place something in a category, it’s very hard to reassign it elsewhere.

Think how profoundly the very first distinction we make shapes our lives: Is it a boy or a girl?

When this unconscious ability to sort is applied to social, racial, and gender categories, we face the danger of not moving beyond the overt qualities of the category to see people as individuals.

In other words, this gift for grouping can leave us wondering whether we’re destined to stereotype.

STEREOTYPING IS AN EVOLUTIONARY ADVANTAGE


To be clear, we all stereotype. We stereotype not due to faulty cognition but thanks to a helpful feature of the brain that allows for efficient decision-making. According to Dr. Noam Shpancer in Psychology Today:

“Our evolutionary ancestors were often called to act fast, on partial information from a small sample, in novel or risky situations. Under those conditions, the ability to form a better-than-chance prediction is an advantage. Our brain constructs general categories from which it derives predictions about category-relevant specific, and novel, situations. That trick has served us well enough to be selected into our brain’s basic repertoire. Wherever humans live, so do stereotypes. The impulse to stereotype is not a cultural innovation, like couture, but a species-wide adaptation, like color vision. Everyone does it. The powerful use stereotypes to enshrine and perpetuate their power, and the powerless use stereotypes just as much when seeking to defend or rebel against the powerful.”

In other words, the act of categorizing a person or group hasn’t always been a source of harm. Indeed, a stereotype, considered from the strictly nonjudgmental definition of “specific traits attributed to people based on group membership,” can prove useful.

As Yale psychologist Paul Bloom has noted, “You don’t ask a toddler for directions, you don’t ask a very old person to help you move a sofa, and that’s because you stereotype.”

This example not only demonstrates the benefits of stereotypes but also how stereotypes, contrary to popular belief, can be based in reality. Some recent research shows that even when applied to large groups along racial, social, or gender lines, stereotypes are often accurate. Professor Lee Jussim of Rutgers University calls it the “myth of stereotype inaccuracy,” and his research reveals that “laypeople’s beliefs about groups correspond well with what those groups are really like.” His work, and that of other modern-day social scientists, is a rebuke to an earlier generation of social scientists who, in an effort to combat oppression in the aftermath of WWII and during the civil rights and women's movements, declared stereotypes inaccurate and bad. Much of that earlier work, it turns out, rested on little to no empirical evidence.

Current scholarship puts forth the idea that many of us rely on “rational stereotyping,” in which we make informed assumptions about a person or group when we lack specific information about the individual. A stereotype is not universally valid, but it can be statistically accurate, and so it helps us assess situations and behaviors. “In situations where one has abundant, vividly clear information about an individual, the stereotype becomes completely irrelevant,” states Professor Jussim.

But, what happens when the stereotype is exploited, or when the generalization is inaccurate, or when the generalization hinders our curiosity about the individual? 

THE SINGLE STORY

In her fabulous TED talk called The Danger of a Single Story, novelist Chimamanda Ngozi Adichie addresses how our lives, our cultures, are composed of many overlapping stories. The talk is about the risk of cultural misunderstanding when a narrative presents only one perspective and is repeated over and over again. To reduce an individual to one story is to take away their humanity.

The beauty of the talk is that Adichie uses humor, storytelling, vulnerability, and a fresh cultural perspective to illustrate the universality of single stories. She offers us empathy and kind words of caution.

The single story creates stereotypes, and the problem with stereotypes is not that they are untrue, but that they are incomplete. They make one story become the only story.
— Chimamanda Adichie


So how do we stop ourselves from jumping to the single story? Is it simply a matter of awareness?

In a 2014 New York Times article about bias in the workplace, Adam Grant, organizational psychologist and TED speaker, and Facebook’s Sheryl Sandberg recount the following illustration of gender stereotyping:

A father and his son are in a car accident. The father is killed and the son is seriously injured. The son is taken to the hospital where the surgeon says, “I cannot operate, because this boy is my son.”

This popular brain teaser dates back many years, but it remains relevant today; 40 to 75 percent of people still can’t figure it out. Those who do solve it usually take a few minutes to fathom that the boy’s mother could be a surgeon. Even when we have the best of intentions, when we hear “surgeon” or “boss,” the image that pops into our minds is often male.

Gender stereotyping is rampant in the workplace, with some industries worse than others. It’s also in our schools and can start at a young age. To combat bias in the workplace, business leaders and educators have implemented extensive training and programming to raise awareness. Article after article emphasizes the importance of education. Fairygodboss contributor Natalia Marulanda states: “Since most gender bias is implicit, meaning it’s unconscious, it can be especially difficult to recognize if you don’t know what to look for, and even more so if you benefit from it. To get men involved, they first need to recognize that there is a problem to solve.”

According to some experts, awareness of bias isn’t necessarily enough, and having it pointed out could actually lead to greater stereotyping, not less. The thinking, according to two major studies, is that when people are made aware of the ubiquity of bias, they assume it’s more socially acceptable. In other words, Grant and Sandberg note: “If everyone else is biased, we don’t need to worry as much about censoring ourselves.”

The trick is to stress not just that most people have bias, but that most people actually seek to overcome their stereotypical perceptions. The studies found that discrimination disappeared with that small adjustment. It’s the difference between legitimizing the single story and reinforcing the idea that most people don’t want to discriminate, especially if there are benefits to not doing so.

Meanwhile, it helps to remember that many of our common stereotypes are badly outdated and irrelevant in today’s world. Returning to Dr. Noam Shpancer’s thoughts on evolution and stereotypes:

The stereotyping process evolved at a time when the tribe was the defining unit of identity. Today, in the epoch of the differentiated self, tribal distinctions, however accurate, may no longer provide sufficiently useful and important cues for adaptive action. Rapid social change, in other words, is rendering stereotyping superfluous, and certain previously relevant stereotypes gratuitous.

For example, male physical superiority, and the attendant stereotype, may have been sufficient to justify and support a social system of male dominance during a time when physical strength was a crucial survival and social asset. Due to socio-cultural innovation, it no longer is. The most socially powerful people around, and those most likely to survive, are no longer the most physically strong. The old stereotype that women are physically weaker is still accurate, but the right question in our new social times might be: So what?

Indeed, so what?


PAINTING OUR SELF-PORTRAIT


Mom. Writer. Traveler. Divorcée. Hiker. Runner. Reader. Friend. Introvert. Daughter. Curious. Emotional. Critical. Generous. Hard-working. Impatient. Extrovert. Perfectionist.

Savory vs. sweet? Savory.

Favorite season? Fall.

Beer or wine drinker? Neither, I prefer a cocktail.

This list is just a sample of the labels I’ve given myself. My list, like all of our lists, goes on and on. Every day, we make hundreds of micro-distinctions that feed the narrative we tell ourselves about ourselves. And just as we draw conclusions about others when we label them, all sorts of assumptions flow from the labels we give ourselves.

During Invisibilia’s episode on categories, the hosts interview strangers: Are you a dog person or a cat person? The responses are amusing, but what stands out is how adamant and enthusiastic everyone is about placing themselves in one category or the other, and how a preference for dogs vs. cats seems to hold deeper meaning about who they are as a person.

As humans, we have a powerful urge to differentiate ourselves, to declare our category or identity. Again, the brain can’t help itself.

But, self-labels and the meanings that come with them can be limiting or empowering.

For instance, I go ahead and order the french fries because somewhere along the line I’ve labeled myself fat. Or, alternatively, I order a salad because I call myself an athlete and have to get up in the morning for a long run. It’s a simplified example, but the point, according to Bryan Kramer, CEO of PureMatter and TED Talk speaker, is this:

When you believe you can’t change because of a label you’ve been ascribed or given yourself, you start to cement the assumption that you can’t do certain things. Life becomes much more stressful when you’re trying to negotiate your way through it by avoiding tasks you "can’t" do.

Since the 1940s, personality testing has become a $2 billion industry, with the Myers-Briggs test being the most popular of them all.

This type of test was originally developed in 1917 to identify soldiers prone to nervous breakdowns during enemy bombardment in World War I, and such tests later came to be used in a variety of industries as part of the hiring process. Lately, however, personality tests have come under scrutiny as a hiring tool. Among the several problems noted for hiring managers using personality testing, one Harvard Business Review study raises the question: “How can an individual’s assessment results be used to predict future job performance if there is a reasonable chance that their scores will change over time?”

This question gets at the heart of recent research by cognitive and behavioral scientists, who say that while some personality elements remain stable over time, others change in distinct ways, often for the better.

Organizational psychologist, Benjamin Hardy, pulls no punches, calling personality tests “junk science.” He states:

Interestingly, more recent research shows that 90% of people want to make changes in their personalities. As people, we want to improve ourselves. But non-scientific theories like Briggs' can lead people to believe they literally can't change, because their "core" attributes or "type" is inflexible. Hence, type-based tests that create a label can also create a fixed mindset.

Hardy also talks about becoming our future selves, which he says can only happen when we look at our present and future selves as two different people. In his Harvard Business Review article he states:

Your personality, skills, likes, and dislikes change over time — but that change isn’t out of your control. What can you do to become the version of yourself that you most want to be? Start by acknowledging the differences between your past, current, and future selves. Next, imagine your desired future self: Set goals that are as clear and specific as possible to maximize your chances of achieving them. Finally, develop (and re-develop) an identity narrative consistent with the person you want to become — and share that story with others! Your identity drives your behavior, which over time creates your personality. So start acting like the best version of yourself, and you will become that person.

For me, this is a highly motivating notion. In essence, Hardy is encouraging us to declare our future selves. To label ourselves today with labels that are aligned with who we want to become tomorrow. By doing so, we will likely achieve our goals and change our personality in the process.

Above, my personal list of labels contains both introvert and extrovert.

Is it possible to be both an introvert and extrovert?

Before doing this research on stereotypes and personality, I already felt as if I moved along the spectrum between these two personality types. As a kid, I was social with lots of friends, but also a bit shy. During my corporate career, I found myself speaking in front of huge audiences, leading meetings, and managing teams of people—all skills I never thought I was capable of because, well, I was an introvert. The Myers-Briggs test told me so, over and over again. And so did scads of other tests I took in college and during my professional life. But there I was, doing very extroverted things. On the social front, at some parties I’m a total wallflower while at others I work the room. Sometimes I crave meeting up with a group of friends to relax after a rough day; other times I want a good book under a blanket on the couch. I don’t think any of this takes away from the fact that I lean introvert, but it certainly speaks to the idea that we are capable of being lots of types and holding seemingly opposite personality attributes depending on the situation and life chapter we’re in.


PAINTING THE ENTIRE LANDSCAPE

We’ve all been on the receiving end of others ascribing us a label. At best, the label may resonate, but it likely feels incomplete. At worst, a label feels completely unfair, reducing our complex web of experiences, emotions, thoughts, and actions to a single characteristic that often implies other unfair characteristics. He’s arrogant, so he’s probably not a good friend or much of a family man.

Remembering the sting of having a label forced upon us can hopefully guide us to see the nuance in others. Knowing that we can grow and change can hopefully help us see that potential in others, rather than reducing them to a single story.

It’s not easy, though, to put down the metaphorical Sharpie and stop the labels in today’s world.

Living in such a polarized political climate, Americans seemingly can’t help themselves. David Brooks, Op-Ed columnist for The New York Times, writing in 2016 about the presidential election, said:

American politics has always been prone to single storyism — candidates reducing complex issues to simple fables. This year the problem is acute because Donald Trump and Bernie Sanders are the giants of Single Storyism. They reduce pretty much all issues to the same single story: the alien invader story.

Every problem can be solved by finding some corrupt or oppressive group to blame. If America is beset by wage stagnation it’s not because of intricate structural problems. It’s because of the criminal Mexicans sneaking across the border or it’s because of this evil entity called “the banks.”

Social media and marketing campaigns exacerbate the polarization, dishing it out at every turn. But, interestingly, recent studies show that exposure to opposing viewpoints may be making things worse, causing us to dig in our heels and reinforce our own viewpoint rather than swaying us to the other side. A Wall Street Journal article summarizes the twist as such:

The reason is probably intuitive for anyone who has the misfortune to spend an unhealthy amount of time on Facebook, Instagram, Twitter, YouTube or even cable news. (During the pandemic, that’s more of us than ever.) Because social media and Balkanized TV networks tend to highlight content with the biggest emotional punch—that is, they operate on the principle that if it’s outrageous, it’s contagious—when we’re exposed to a differing view, it often takes an extreme form, one that seems personally noxious.

Mr. Sabin-Miller and Dr. Abrams, both mathematicians, call this effect “repulsion.” In addition to the “pull” of repeatedly seeing viewpoints that reinforce our own, inside of our online echo chambers, repulsion provides a “push” away from opposing viewpoints, they argue.

And guess what? The psychological force of repulsion is stronger than the attraction to our own side of the debate.

Indeed, research demonstrates how wrongly we perceive political outgroups. For example, the average member of both parties is a middle-aged, white, nonevangelical Christian, but this isn’t who comes to mind when we think about Democrats and Republicans. Instead, we resort to partisan stereotypes in which Democrats are urban minorities and young people, and Republicans are older, wealthy, or evangelical Christians. Furthermore, approximately 11% of Democrats belong to a labor union, yet survey respondents estimated the figure at 39% on average, with Republicans guessing 44% and Democrats themselves guessing 37%. Similarly, a mere 2.2% of Republicans make more than $250,000 per year, but the average person believed that 38% of Republicans had incomes that high.

The key, David Brooks suggests, is to find a balance between the stories.

To hold opposing stories about ourselves and others in our heads, to use stories to buoy, not break, our personal and collective humanity.

Stories matter. Many stories matter. Stories have been used to dispossess and to malign, but stories can also be used to empower and to humanize. Stories can break the dignity of a people, but stories can also repair that broken dignity….When we reject the single story, when we realize that there is never a single story about any place, we regain a kind of paradise.
— Chimamanda Ngozi Adichie

***

Jennifer Farber was a mom. A suburban mom. She was also a writer, a Phi Beta Kappa graduate of Brown University, and beautiful. She was a Gen-Xer, loved music, and was witty. She liked to read the classics and was very active in the New York literary scene in the late 1990s. She was rich. She attended an MFA program at NYU and wrote four full-length plays. Some perceived her to be haughty and emotionally awkward. Once, she drove across the country by herself. She often had a boyfriend and loved her Cavalier King Charles spaniel. Jennifer married and went through a number of years during which fertility treatments dominated her life. She gave birth to five children, including two sets of twins. She struggled to balance her sense of self with motherhood. She was a divorcée.

There is so much more to the story of Jennifer Farber than suburban mom. She was a complex human being whose story should not be reduced to a marketing ploy by today’s media.

It is incumbent upon us to move beyond the headline. Beyond the label.

We must ask ourselves: Are we asking the right questions?
