One of the sponsored posts I ran on LinkedIn to promote a blog article sparked a debate I genuinely didn’t see coming: gender bias. The question was sharp and immediate: why did he (a man) choose an image that portrayed women as less competent at work?
The article I was promoting was “10 Behaviors That Make You Look Less Competent at Work (Even If You’re Not).” And, ironically, inside the article the images were split evenly between men and women. The choice to highlight a female image in the ad had a simple, data‑driven reason: the audience for that topic was overwhelmingly female. That was it. No hidden agenda. No subliminal messaging. Just targeting.

Still, the interpretation went in a completely different direction.
When I changed the image to something more neutral, a messy desk next to a tidy one, the reaction didn’t calm down. It simply shifted.
“I work like this and I work just fine.”
In other words, even an image that seems harmless to many can carry very different meanings for others — including questions like, “Why do competent women need to present themselves in a more masculine way?” or “Why does a tidy desk portray more competence?”
To be completely transparent, the images were created with the help of AI to illustrate the contrast between how we think we’re being portrayed and how others might actually perceive us.
Could it have been better? I’m sure it could — and that’s precisely what this text is about. We are all biased, even when we’re simply choosing visual representations.
We all project our own experiences, insecurities, frustrations, and assumptions onto what we see — just as I overlooked how differently others might perceive those images when I selected them.
And that’s the point: bias isn’t always intentional. Sometimes it’s simply the collision between what we meant and what someone else sees.
There’s always room to improve communication, but there will also always be constraints on what you choose to improve, because there’s that little tick‑tock in the back of our minds pushing us to rush past different angles when making a choice.
Are we doomed to be biased?
I believe we are, to varying degrees — and in this case, the bias worked in my favor by pushing engagement up. After all, the second book in the How Smart People Think series is precisely about predictably bad decisions.
Depending on where you express an opinion and how you express it, biases will surface on both sides — and even small controversies tend to get people talking.
The discussion that unfolded ended up raising a surprisingly relevant question:
Is it even possible to interpret or present a topic without any bias at all?

Put simply, a bias is the result of a mental shortcut — a fast, energy‑saving way for our brain to reach conclusions. In the context of an article or an ad, that shortcut might look like:
“Is this worth clicking? Commenting on? Reading?”
These shortcuts are what we call heuristics — quick rules of thumb.
For example: seeing a photo of a woman in an ad for an article about behaviors that make you look less competent at work, and immediately assuming the author believes women are incompetent and the article isn’t worth your time.
We say something is biased when these shortcuts — these heuristics — lead us to flawed or misguided conclusions.
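To make that shortcut concrete, here’s a minimal sketch in Python. The cues, weights, and threshold are invented for illustration; the point is only that the fast path judges the article without ever reading it.

```python
# A toy model of the "is this worth clicking?" shortcut.
# The cues, weights, and threshold below are illustrative, not real data.

def heuristic_click_decision(headline_is_provocative: bool,
                             image_triggers_emotion: bool) -> bool:
    """Fast path: judge the article from two cheap surface cues."""
    score = 0.6 * headline_is_provocative + 0.4 * image_triggers_emotion
    return score >= 0.5  # snap judgment; no content actually read

def deliberate_decision(article_text: str) -> bool:
    """Slow path: actually read and weigh the content (expensive)."""
    return "evidence" in article_text.lower()  # stand-in for real evaluation

# The shortcut answers instantly, using none of the article itself,
# which is exactly where bias can creep in.
print(heuristic_click_decision(headline_is_provocative=True,
                               image_triggers_emotion=False))  # True
```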
So far, researchers haven’t concluded that all information is inherently biased, but there is strong consensus that the way we process information tends to be biased in most situations.
So yes, it’s entirely possible that both the interpretation (“he thinks women are incompetent”) and my own reasoning when choosing the images were biased. In my case, I believe the bias came from the availability of images related to the topic and from my intention to capture attention and create impact — which, ironically, worked.
As for the women who felt offended, they have my sincere apologies.

Information processing is often biased
We love when someone agrees with us — or even seems like they might. The problem is that this agreement is often an illusion. We routinely overestimate how similar other people’s beliefs and opinions are to our own. And just as often, we apply completely different standards when evaluating the behavior of people we see as “our group” versus “the opponents.” The old “you’re either with me or against me” mindset shows up more than we’d like to admit.
Experimental research shows that even when people try to be rational, they still ignore data that contradicts their beliefs and interpret evidence selectively.

And believe it or not, even experts aren’t immune — scientists, doctors, analysts. Some would even argue that their biases are more visible.
Think of the classic example: “Don’t ask an investor who sells courses whether now is a good time to invest.”
If even journalists must decide what makes it into the news cycle and what gets left out, it becomes obvious that the information we consume has already passed through a filter — the filter of what is expected to get more views, more clicks, more comments.
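As a rough sketch of that filter (with invented stories and scores, not data from any real newsroom), imagine ranking candidate stories purely by predicted engagement, where negativity boosts the prediction:

```python
# Toy engagement filter: stories ranked by predicted clicks.
# Scores and the negativity bonus are invented for illustration.

stories = [
    {"title": "Local library expands hours",       "negativity": 0.1},
    {"title": "Politician caught in scandal",      "negativity": 0.9},
    {"title": "Study finds modest health benefit", "negativity": 0.2},
    {"title": "Economy fears grow",                "negativity": 0.8},
]

def predicted_engagement(story: dict) -> float:
    base = 1.0
    return base + 2.0 * story["negativity"]  # negativity boosts predicted clicks

# The "news cycle" only has room for the top two stories.
feed = sorted(stories, key=predicted_engagement, reverse=True)[:2]
for story in feed:
    print(story["title"])
# Both surviving stories are the negative ones: the filter biased
# the feed before any reader formed an opinion.
```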
With that in mind, I became curious and decided to lay out some of the most common biases I’ve observed in online posts that cut both ways: the creator’s framing on one side, the audience’s interpretation on the other.
1. Political posts with “high‑impact negativity”
Research shows that news sources with political leanings — across the spectrum — tend to publish content with strong negative emotional tones because negativity reliably drives engagement. The bias begins in the choice of tone and framing.
On the receiving end, audiences interpret these posts already primed to react emotionally, reinforcing their existing beliefs and sharing the content even more.
2. YouTube comments on gender and race topics
A recent study analyzing five years of YouTube comments found that videos about politics, sports, and entertainment often carry gender, racial, and ideological biases. Creators may frame topics in subtle ways that reinforce stereotypes, while viewers interpret and comment through the lens of their own cultural and cognitive biases.

3. Posts that “expose” inequalities using selective data
The same research shows that creators frequently choose data, charts, or comparisons that support their preferred narrative — for example, highlighting only the statistics that confirm a point about gender or racial inequality while ignoring relevant variables or alternative explanations.
This selective framing shapes how audiences perceive the issue, often without realizing that the comparison itself was biased from the start.
4. Threads calling out the “moral failures” of public figures
Posts exposing the behavior of celebrities or public figures are often shaped by cultural and identity‑based biases. The wording, the order in which facts are presented, and the absence of context all influence how the audience interprets the situation.
Even when the information is technically accurate, the framing can lead readers toward a particular moral judgment — and audiences, in turn, interpret the content through their own preexisting loyalties, values, and assumptions.
Where does bias come from?
From that quick, premature conclusion we jump to about something or someone.
Much of it comes from well‑known heuristics — mental shortcuts our brains use to save time and energy. They’re useful, but they often lead us to flawed conclusions.
Here are some of the most common ones:
1. Availability
We judge the likelihood of something based on how easily examples come to mind. Example: After seeing several news stories about plane crashes, someone starts believing flying is far more dangerous than it actually is.
2. Representativeness
The mind quickly compares something to a mental “prototype” and jumps to conclusions. Example: Seeing someone in a lab coat and assuming they’re a doctor — when they might be a student or technician.
3. Affect
We make decisions based on immediate feelings rather than facts. Example: “I like this brand, so the product must be good,” without evaluating its quality.
4. Recognition
If we recognize something, we tend to see it as better or more trustworthy. Example: Voting for a candidate simply because “I’ve heard their name before.”
5. Framing
The way information is presented dramatically changes how we interpret it. Example: “90% success rate” sounds better than “10% failure rate,” even though they’re identical.
6. Anchoring
The first piece of information we receive becomes the reference point for everything that follows. Example: If a product is shown first at $500, the same product at $300 feels cheap, even if it’s still expensive (see the sketch after this list).
7. Familiarity
The brain prefers what it already knows, even without a rational reason. Example: Always taking the same route to work because it “feels safer.”
8. Scarcity
When something seems limited, we assume it’s more valuable. Example: “Only a few left!” pushes people to buy impulsively.
9. Authority
We trust people we perceive as authorities — even without verifying. Example: “A specialist said…” even when that specialist may have a conflict of interest.
10. Sunk Cost
We keep investing in something simply because we’ve already invested in it. Example: Continuing a bad course because “I already paid for it.”
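Here’s the promised sketch of anchoring in action. The weighted‑average adjustment and the 0.4 anchor weight are simplifying assumptions for illustration, not an established psychological formula:

```python
# Toy anchoring model: the judged value is pulled toward the anchor.
# The 0.4 anchor weight is an assumption chosen for illustration.

def anchored_estimate(independent_estimate: float,
                      anchor: float,
                      anchor_weight: float = 0.4) -> float:
    """Blend what we'd judge on our own with the first number we saw."""
    return (1 - anchor_weight) * independent_estimate + anchor_weight * anchor

fair_value = 200.0    # what the product is "really" worth to us
asking_price = 300.0

no_anchor = anchored_estimate(fair_value, anchor=fair_value)  # 200.0
with_500  = anchored_estimate(fair_value, anchor=500.0)       # 320.0

print(f"Perceived value without an anchor: ${no_anchor:.0f}")
print(f"Perceived value after a $500 anchor: ${with_500:.0f}")
# After seeing $500 first, the $300 price tag lands below our inflated
# reference point, so it "feels cheap" even though it exceeds fair value.
```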

Shortcuts are great — until they aren’t
Mental shortcuts are incredibly useful for getting our thinking started. They’re pre‑wired, efficient, and save us a tremendous amount of time and energy in daily life. The only real danger is assuming they always reflect reality. When we place too much confidence in them, our snap judgments can become costly, whether by making us overlook real opportunities or by leading us to see opportunities where none exist.
Can we reduce bias?
Bias can be reduced, but rarely eliminated. These shortcuts exist because they make us more efficient thinkers, not because they’re inherently flawed. Still, when our shortcuts consistently lead to negative outcomes — in relationships, finances, or decision‑making in general — we tend to become more open to examining them.
Recognizing that our heuristics are steering us toward poor judgments requires conscious effort. The usual recommendation is to adopt a more analytical, almost scientific mindset:
- What assumptions am I making about X?
- What evidence am I ignoring?
- What alternative explanations exist?
It also helps to seek out diverse perspectives, consult multiple sources, and invite others to review or challenge your reasoning. In other words, treat your own thinking the way a researcher treats a hypothesis — something to be tested, not simply trusted.
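To close with one concrete version of “testing a belief”: a minimal Bayesian sketch, with all probabilities invented for illustration, showing how much confidence should drop once we stop ignoring contradicting evidence.

```python
# Toy Bayesian update: revise a belief with evidence instead of trusting it.
# All probabilities below are invented purely for illustration.

def update(prior: float, p_evidence_if_true: float,
           p_evidence_if_false: float) -> float:
    """Bayes' rule: posterior probability that the belief is true."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.80  # "my ad image reads as neutral" (a confident prior)

# Evidence we'd rather ignore: several readers report the opposite reading.
# Such reactions are far more likely if the belief is actually false.
belief = update(belief, p_evidence_if_true=0.10, p_evidence_if_false=0.60)

print(f"Posterior after contradicting evidence: {belief:.2f}")  # 0.40
```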
