- Bias and Blind Spots — An Invitation to Awareness
- How Cognitive Bias Creeps Into Code
- Search Me, O God: Naming Our Blind Spots (Ps 139:23–24)
- When Data Misleads Us: Bias in Datasets and Models
- The Logs in Our Own Eyes (Matt 7:1–5)
- Bias in AI: How to Build More Just Systems
- Learning to See as Christ Sees
- Debugging Our Thinking: Techniques for Reducing Bias
By this point in the month, one thing should be clear: bias is not an occasional intruder in technical work. It is a constant presence.
Bias does not enter systems only when something goes wrong. It enters when things feel routine. When decisions feel obvious. When assumptions go unchallenged because they have worked before.
If bias creeps into systems through human thinking, then addressing it requires more than better tools. It requires better habits of thought.
In this sense, reducing bias is less like applying a patch and more like debugging.
Debugging as a Way of Thinking
Debugging is not just a technical task; it is a mindset.
Good debugging begins with the assumption that something about our understanding is incomplete. It resists defensiveness. It treats unexpected behaviour as information rather than inconvenience.
This posture is exactly what bias reduction requires.
When we assume our thinking is correct by default, bias remains invisible. When we assume it may be flawed, learning becomes possible.
Make Assumptions Explicit
Bias thrives in implicit assumptions.
Unspoken beliefs about users, data quality, or system behaviour quietly shape decisions. Making assumptions explicit is one of the most powerful ways to surface bias.
Ask:
- What am I assuming about who will use this?
- What am I assuming about how data was generated?
- What am I assuming about what “normal” looks like?
Writing these assumptions down — in documentation, design notes, or comments — turns them from invisible drivers into testable hypotheses.
Once assumptions are visible, they can be challenged.
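One way to make this concrete is to turn written assumptions into executable checks. The sketch below is illustrative only: the field names (`country`, `age`) and thresholds are hypothetical, standing in for whatever assumptions your own documentation records.

```python
# A minimal sketch: documented assumptions expressed as testable checks.
# Field names and thresholds are hypothetical examples.

def check_assumptions(records):
    """Validate documented assumptions about incoming data.

    Assumption 1: every record has a non-empty 'country' field.
    Assumption 2: 'age' values fall within a plausible human range.
    Returns a list of violation messages; empty means all assumptions held.
    """
    violations = []
    for i, rec in enumerate(records):
        if not rec.get("country"):
            violations.append(f"record {i}: missing 'country' (assumption 1)")
        age = rec.get("age")
        if age is None or not (0 <= age <= 120):
            violations.append(f"record {i}: implausible 'age' {age!r} (assumption 2)")
    return violations
```

Once an assumption lives in a check like this, a dataset that breaks it produces a visible failure instead of a silent bias.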
Slow Down at Points of Confidence
Bias often appears where confidence is highest.
When something feels obvious, we are least likely to examine it. This is why routines are dangerous. Familiar patterns stop triggering reflection.
A useful technique is to slow down deliberately at moments of certainty:
- when a solution feels “clean”,
- when a result aligns perfectly with expectations,
- when a model performs well on headline metrics.
These are precisely the moments when confirmation bias is most active.
Asking “what would surprise me here?” can reveal blind spots quickly.
Test Beyond the Happy Path
Bias often hides in edge cases.
Systems are typically tested against expected scenarios — the happy path. Users, however, rarely behave ideally. Data rarely conforms neatly.
Actively testing atypical scenarios helps surface assumptions:
- unusual inputs,
- missing data,
- minority use cases,
- unexpected interactions.
This is not about exhaustive coverage. It is about curiosity.
Bias shrinks when systems are exposed to perspectives they were not designed around.
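The list above can be mirrored directly in a test suite. The function below is a hypothetical example (`normalise_name` is not from the text); the point is the shape of the tests: one happy-path case, then deliberate coverage of missing data, unusual input, and a non-ASCII name a narrow design might overlook.

```python
# A hedged sketch of testing beyond the happy path.
# 'normalise_name' is a hypothetical function; the edge cases mirror
# the list above: unusual inputs, missing data, minority use cases.

def normalise_name(name):
    """Normalise a user-supplied name for display."""
    if name is None:                   # missing data
        return ""
    return " ".join(name.split())      # collapse unusual whitespace

def test_beyond_happy_path():
    assert normalise_name("Ada Lovelace") == "Ada Lovelace"            # happy path
    assert normalise_name(None) == ""                                  # missing data
    assert normalise_name("") == ""                                    # empty input
    assert normalise_name("  Nguyễn   Văn  An ") == "Nguyễn Văn An"    # non-ASCII, odd spacing
```

Each edge case here is a question asked of the system: what did you assume a name looks like?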
Separate Exploration from Decision
Exploration invites creativity. Decision demands rigour.
One common source of bias is allowing exploratory insights to quietly become authoritative conclusions. A chart created to explore data becomes evidence for action without sufficient scrutiny.
Clear separation helps:
- exploratory work is labelled as such,
- decisions are backed by reproducible analysis,
- uncertainty is documented, not hidden.
This discipline does not slow progress; it prevents costly rework.
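One hypothetical way to enforce that separation in code is to label every analysis artefact and require a reproducible script before it can be promoted. The class and field names below are illustrative, not a prescribed design.

```python
# A hypothetical sketch: exploratory results cannot silently become
# decisions — promotion requires an attached reproducible analysis.

from dataclasses import dataclass
from typing import Optional

@dataclass
class AnalysisResult:
    summary: str
    status: str = "exploratory"                 # "exploratory" or "decision-ready"
    reproducible_script: Optional[str] = None   # required before promotion

def promote(result: AnalysisResult) -> AnalysisResult:
    """Promote a result to decision-ready only if it is reproducible."""
    if not result.reproducible_script:
        raise ValueError("cannot promote: no reproducible analysis attached")
    return AnalysisResult(result.summary, "decision-ready", result.reproducible_script)
```

The design choice is the default: everything starts exploratory, and becoming evidence for action is an explicit, checkable step.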
Invite External Eyes
We are poor judges of our own blind spots.
Code reviews, design critiques, and peer feedback are not bureaucratic hurdles; they are bias-reduction tools. Others see what we cannot — especially those with different backgrounds or experiences.
Crucially, this only works in cultures where questioning is safe. When dissent is punished or dismissed, bias thrives.
Psychological safety is not a “soft” concern. It is a technical requirement for trustworthy systems.
Use Checklists, Not Just Intuition
Bias is resilient because it operates automatically. Countering it requires structure.
Checklists externalise discipline. They ensure that key questions are asked even when time is tight:
- Who might this disadvantage?
- What data is missing?
- How will this be monitored over time?
- What would failure look like?
Checklists do not replace judgment. They support it.
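The questions above can even be encoded as a lightweight review gate, so sign-off is blocked until each one has a written answer. This is a minimal sketch; the mechanism (a dictionary of answers) is an assumption, not a prescribed tool.

```python
# A minimal sketch: the checklist questions above as a structured
# review gate. The answer format is an illustrative assumption.

REVIEW_CHECKLIST = [
    "Who might this disadvantage?",
    "What data is missing?",
    "How will this be monitored over time?",
    "What would failure look like?",
]

def review_complete(answers):
    """Return True only when every checklist question has a non-empty answer.

    'answers' maps each question to the reviewer's written response,
    so an unanswered question blocks sign-off rather than being skipped.
    """
    return all(answers.get(q, "").strip() for q in REVIEW_CHECKLIST)
```

The structure does the remembering; judgment still supplies the answers.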
Treat Feedback as Signal, Not Noise
User feedback that challenges assumptions is often inconvenient. It disrupts plans. It complicates narratives of success.
This makes it easy to dismiss feedback as edge cases or misunderstandings.
But feedback is often where bias first becomes visible.
Treating feedback as signal requires humility. It means resisting the urge to defend and choosing instead to listen.
Accept That Bias Reduction Is Ongoing
Bias is not eliminated. It is managed.
Reducing bias is not a milestone you reach. It is a practice you maintain. Systems evolve. Contexts change. New blind spots emerge.
This is not failure. It is reality.
What matters is responsiveness — the willingness to revisit decisions, revise assumptions, and improve over time.
Debugging Our Thinking
Debugging our thinking is demanding. It requires attention, humility, and patience. It resists the illusion of objectivity and replaces it with responsibility.
But the payoff is significant.
Systems become more resilient. Decisions become more trustworthy. Teams become more reflective. And harm is reduced — not perfectly, but meaningfully.
As this month draws to a close, the invitation is not to master bias, but to remain attentive to it. To treat thinking itself as something worth examining.
Because in technical work, as in life, the hardest bugs to find are the ones we assume cannot exist.
