
10 Techniques to Improve Your Thinking Skills


This site contains product affiliate links. We may receive a small commission if you make a purchase after clicking on one of these links.


We like to believe we’re rational thinkers. We weigh options, consider evidence and make choices based on logic, reason and experience. But the truth is a little messier – and a lot more interesting.


Every day, our decisions are quietly shaped by mental shortcuts called cognitive biases (or ‘heuristics’) and thinking errors. These aren’t signs of stupidity or laziness; they’re built-in features of the human brain. They evolved to help us make snap judgments in a complex world – a world where decision speed could be a matter of survival. Is that rustling in the bush a squirrel or a sabre-toothed tiger?


The problem is that what helped our ancestors survive life-and-death threats doesn’t always help us moderns think clearly. As Robert Greene observes in his magnum opus, The Laws of Human Nature, our biases ‘distort’ reality, causing the ‘mistakes and ineffective decisions that plague our lives.'


From relationships and money decisions to career choices and social media spats, our biases influence how we interpret and respond to internal and external stimuli – often without us noticing.


Let’s walk through some of the most common ones, how they show up in everyday life and why simply knowing about them can make a real difference to the quality of how you think.


Why our brains take shortcuts

Your brain processes an enormous amount of information every second (by some estimates it churns out over 50,000 thoughts a day). To cope with this perpetual deluge of information, it relies on shortcuts, or heuristics – rules of thumb that save time and energy. This is the ‘fast’ or ‘System 1’ thinking that the Nobel Prize-winning psychologist Daniel Kahneman describes in his book, Thinking, Fast and Slow.


Most of the time, heuristics work well enough. But under uncertainty, emotion, or pressure, these shortcuts can misfire. Instead of seeing the world as it is, we see it as our brains expect it to be. In Kahneman’s words:


"If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it."

That’s where biases and thinking errors come in.


Confirmation bias: Seeing what you want to see

‘I know what I know and the evidence I’ve seen supports my view.’


Confirmation bias is our tendency to seek out, interpret and remember information that supports what we already believe – and ignore what doesn’t.


If you think a co-worker is incompetent, you’ll notice every mistake they make and gloss over their successes. If you believe a certain diet works, you’ll focus on testimonials that praise it and dismiss studies that question it.


This bias is especially powerful online. Social media algorithms amplify confirmation bias by feeding us content we already agree with. Thus, unwittingly, we wander into ‘echo chambers’ where our beliefs grow stronger simply because they’re repeated and reinforced.


The danger isn’t having opinions – it’s mistaking familiarity for truth.


A simple way to attenuate confirmation bias is to seek out contradictory information. To do this, we can adapt a decision-making method outlined in Benjamin Franklin’s autobiography. He suggests drawing up a table with ‘pros’ on one side and ‘cons’ on the other. The adaptation is to make it a rule that, for every pro your research turns up, you also record a corresponding con. A rough sketch of the idea follows.
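
For the curious, here is a minimal sketch of that rule in Python. The class name and example entries are illustrative assumptions, not anything Franklin proposed; the point is the mechanic – the ledger refuses a new pro until a matching con has been recorded.

```python
# A minimal sketch of the adapted Franklin method: a pro/con ledger that only
# accepts a new 'pro' once a corresponding 'con' has been recorded, forcing
# you to hunt for disconfirming evidence. Names and entries are illustrative.

class ProConLedger:
    def __init__(self) -> None:
        self.pros: list[str] = []
        self.cons: list[str] = []

    def add_pro(self, reason: str) -> None:
        # Refuse a new pro while pros already outnumber cons.
        if len(self.pros) > len(self.cons):
            raise ValueError("Find a corresponding con before adding another pro.")
        self.pros.append(reason)

    def add_con(self, reason: str) -> None:
        self.cons.append(reason)


ledger = ProConLedger()
ledger.add_pro("The new job pays more.")
ledger.add_con("The commute would be twice as long.")
ledger.add_pro("Better progression prospects.")  # allowed: pros and cons are balanced
```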


Availability heuristic: When what’s memorable feels common

‘This is obviously the case because the available data confirms it.’


The availability heuristic is when we judge how common or likely something is based on how easily examples come to mind. Here are a couple of easy-to-picture examples.


Reports of plane crashes feel terrifying because they’re vivid and heavily covered in the news. Such reports are often followed by a dip in air travel. Yet flying is statistically far safer than most other forms of transport – the chance of a commercial plane in the developed world coming down is about one in eleven million.


A friend’s messy breakup might make relationships feel doomed, even if most couples you know are doing fine. And the doom and gloom story of the split will get you questioning your happy relationship.


Our brains confuse memorable with frequent.


The availability heuristic can distort how we assess risk and probability – often pushing us to worry about dramatic but unlikely events while ignoring quieter, more common dangers.


Anchoring bias: The first number sticks

‘I arrived at that price estimate for the property based on experience.’


Anchoring bias happens when we rely too heavily on the first piece of information we receive. In certain circumstances this makes sense – for example, using the mile markers on road signs to judge how much fuel a long journey will need. However, studies have shown that even irrelevant information can distort our judgements.


If you see a jacket originally priced at £300 and marked down to £150, it feels like a bargain – even if £150 is still overpriced. The original number becomes the anchor against which the current number is compared. Marketers know we fall foul of this bias, which is why the tactic is used so widely.


Anchors influence salary negotiations, real estate prices, first impressions and even judicial decisions. Once set, they’re surprisingly hard to shake, even when better information comes along. Here’s an example.


In the book Noise, the authors outline a study showing how anchoring can affect the sentences judges hand down for the same crime. Before being presented with a fictional case study of a criminal conviction, judges were discreetly given a high or low number. They were then asked to read the case and decide on a sentence. The researchers found that most judges anchored their sentences on the number they had been given – high numbers corresponded closely with long sentences, and low numbers with short ones.


Hindsight bias: “I knew it all along”

‘I thought it would happen that way – and guess what, it did!’


After something happens, it often feels obvious – like we knew it would turn out that way. Scores of people reported that they predicted the 2008 financial crash. But conveniently they only broadcast their predictive prowess after the event. ‘Oh, I saw it coming. It was obvious because so many people were overleveraged on their mortgages.’


That’s hindsight bias: the belief that you foresaw an outcome once the facts are in.


When a relationship ends, we suddenly see all the warning signs. When a stock crashes, the risks feel painfully clear. This illusion of predictability makes us overestimate our foresight and underestimate uncertainty. In addition, it can strengthen false confidence which breeds two other biases – overconfidence and the Dunning-Kruger effect.


The problem? It prevents learning. If outcomes always seem obvious in retrospect, we don’t seriously examine how limited our knowledge was at the time – or how much luck played a role. Why bother investing the time and effort in understanding the factors that culminated in the outcome when your suspicions are always confirmed?


Overconfidence bias: When confidence outruns accuracy

‘I don’t need to check the facts because I know all about it.’


Overconfidence bias is the tendency to overestimate our abilities, knowledge or judgment. As was previously mentioned, hindsight bias can inflate overconfidence, which in turn can lead us to succumb to the Dunning-Kruger effect.


Surveys consistently show that most people believe they’re above-average drivers – and, not without a sense of irony, respondents with a string of motoring offences often rate themselves among the best. Many assume they’re better decision-makers than their peers. This isn’t arrogance – it’s human nature.


Confidence feels good, but unchecked overconfidence leads to risky decisions, poor planning and resistance to feedback. It can make us skip preparation, ignore advice or double down when we should pause. An example is the person who refuses to reform a detrimental lifestyle habit because they believe that they are exempt from the risks.


Ironically, the more complex the task, the more likely overconfidence becomes.


Fundamental attribution error: Blaming the person, not the situation

'He failed because he lacked skill. When I failed it was because of a lack of support.'


When someone else messes up, we tend to blame their character. When we mess up, we blame circumstances. When someone is promoted, it’s because they were buttering up management. When we get promoted, it's because of our intelligence and work ethic.


This is the fundamental attribution error: explaining other people’s behaviour by their character while overlooking the situation they were in.


If someone cuts you off in traffic, they’re reckless and antisocial. If you cut someone off, you were late or distracted. We underestimate how much context shapes behaviour – stress, incentives, constraints – and overestimate the role of personality.


Falling foul of the fundamental attribution error fuels misunderstanding, erroneous judgment and conflict. That final adverse outcome – conflict – is particularly prevalent in workplaces and relationships. One reason is that misattributing the cause of an outcome (assuming a colleague was promoted only because they’ve been buttering up management) can foster resentment.


It also makes empathy harder, because we don’t see the full picture behind others’ actions. A simple strategy to break down this bias is to suspend judgement. Another is to compile a list of possible causes: that colleague, as well as being unusually friendly to managers of late, has also been putting in long hours and did lead that important project.


Survivorship bias: Learning only from winners

‘I could write the next bestseller. Just look at how many there are!’


Survivorship bias occurs when we focus only on successes and ignore failures that didn’t make it into view. This is elegantly expressed by the ‘long tail’ – a graph with a sharp upward spike followed by a long flat tail. In the context of, say, book sales, what this shows is a small number of super-successes and a large number of failures.


We constantly hear about successful entrepreneurs who made fortunes but not the thousands (even millions) whose startups failed. (In The Lean Startup, we are reminded that some 99% of businesses do not survive the first three years.) We read about bestselling authors, not the rejected manuscripts or self-published books that make no profit. We see influencers living glamorous lives, not those who burned out and disappeared.


Survivorship bias creates distorted expectations. ‘If that book has sold a million copies, so could my manuscript.’ Success starts to look more common – and failure more personal – than it really is. This leads us into a few false assumptions: 1) success is more common than it actually is, 2) successful outcomes are purely a product of hard work, grit and genius, and 3) if my project failed, it must mean I’m a failure. A few corrections are required.


First, success in any domain – art, business, sport – is rare. Here’s a sobering example. According to one industry insider, of the roughly two million titles published each year, fewer than 1% are successful – that is, sell more than a few thousand copies. And those figures don’t include rejected manuscripts or the manuscripts that remain under the mattress.


Second, ever more attention is being drawn to the role Lady Luck plays in successful outcomes. Nassim Nicholas Taleb highlights this in his book, Fooled by Randomness. ‘Survivorship biases,’ he says, arise ‘from the fact that we see only winners’, which distorts our ‘view of the odds’ while casually concealing ‘the fact that luck is most frequently the reason for extreme success.’


Third, just because your project didn’t succeed doesn’t mean you’re a failure. As cheesy as it sounds, you’re a winner for having a go – as Ralph Waldo Emerson purportedly said, “Our greatest glory is not in never failing, but in rising up every time we fail.” So next time you experience a setback, remember that failure is the handmaiden of future success.


The lesson isn’t “don’t aim high,” but “don’t confuse visibility with probability.”


Sunk cost fallacy: Throwing good money after bad

‘I’m not selling that stock because it cost me good money!’


Picture this. You’re at the movies, popcorn box in hand, litre of sugar water nestled between your legs, watching that flick everyone’s been raving about. The only problem is, after 30 minutes you’ve not only polished off the popcorn, you’re also bored to breaking point. You have two options: 1) suffer through the remaining 90 minutes and waste even more of your precious time, or 2) get up and walk out, knowing that you paid for that seat.


Chances are, like most people, you’ll take option one and doggedly see the film through to the closing credits. After all, isn’t it illogical to walk away from something you’ve paid good money for? Actually, no – contrary to common intuition, walking out is the logical choice.


The sunk cost fallacy is our tendency to continue something because of what we’ve already invested (such as £15 on a movie ticket), rather than what makes sense going forward (not wasting any more time).


Summed up in the colloquialism ‘throwing good money after bad’, examples of sunk costs include:


  • staying in a job you despise because you’ve been there for donkey’s years

  • finishing a terrible book because you’re a quarter through

  • continuing a failing relationship because of shared history

  • refusing to sell a falling stock because it was the darling of your portfolio


Past investments are emotionally powerful, and we can’t help forming strong attachments to them. But they’re irrelevant to future outcomes. Economists and psychologists tell us that every new decision should be based on current costs and benefits, not guilt over the past.


Refusing to leave that banal movie because you paid £15 for the ticket will cost you the remaining time plus the displeasure of watching a substandard film. You’ll also incur an opportunity cost: while you remain in your seat, you’re denied the chance to do something more enjoyable.


When these ‘hidden’ costs are factored into the total price, the sunk cost fallacy can become very expensive. (Taleb tells of an investor who, thirteen million dollars up, lost it all because he couldn’t part with his precious stock.)
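
To make the forward-looking logic concrete, here is a minimal sketch in Python. The ‘utility’ values are illustrative assumptions, not figures from the article; the point is simply that the £15 ticket appears nowhere in the comparison, because it is spent whether you stay or leave.

```python
# A minimal sketch of sunk-cost-free reasoning: the ticket price is excluded
# from the decision because it is already paid either way. The utility values
# below are illustrative assumptions, not figures from the article.

TICKET_PRICE = 15       # sunk: paid whether you stay or leave
STAY_UTILITY = -1       # 90 more minutes of a film you're not enjoying
LEAVE_UTILITY = +3      # 90 minutes spent on something you'd rather do

def best_choice() -> str:
    # Only future costs and benefits enter the decision.
    return "leave" if LEAVE_UTILITY > STAY_UTILITY else "stay"

print(best_choice())  # -> "leave", regardless of what the ticket cost
```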


Hard though it is, we must know when to cut our losses – and the earlier the better.


Dunning–Kruger effect: Not knowing what you don’t know

‘You’re wrong. The single reason why the economy crashed in 2008 was greed.’


The Dunning–Kruger effect describes a strange phenomenon: people with limited knowledge often overestimate their competence, while experts tend to underestimate theirs.


When you don’t know much about a topic, it’s hard to recognise your own gaps. You quickly grasp the basics – already more than most people you meet know – and confidence soon swells. But for those who delve deeper into the subject, a counterintuitive relationship emerges: as knowledge grows, uncertainty increases and confidence drops.


This effect explains why loud opinions often come from the least informed – and why genuine experts speak with nuance and caution, or settle for silent reflection. But as Rolf Dobelli observes in The Art of Thinking Clearly, even professionals are susceptible to overconfidence: ‘If asked to forecast oil prices in five years’ time, an economics professor will be as wide of the mark as a zookeeper will.’


The Dunning-Kruger effect can be further exacerbated by confirmation bias: selecting information that confirms your belief or filtering feedback so that you are left with only positive insights.


This self-reinforcing feedback loop fuels overconfidence, which impedes learning and development. Those who are overconfident in their beliefs or skill set are less likely to push beyond their current ability, and so may never enter those expansive vistas of insight that humble even the most knowledgeable scholars.


Awareness doesn’t instantly fix this bias, but it encourages intellectual humility, which is a powerful corrective.


Status quo bias: The comfort of staying the same

‘I’m sticking to what I know, what works, what feels right.’


We’re mostly creatures of habit who viscerally hate change. We prefer things to remain as they are, even when change would likely be beneficial. For many, the announcement of a ‘structural change’ at work, or returning home to a ‘for sale’ sign on the neighbour’s front lawn, can induce sickening anxiety. Our inherent predilection for permanence is called status quo bias.


Change feels risky, and adapting to new circumstances takes mental and physical effort. Familiar problems feel safer than unfamiliar solutions, and well-trodden familiarity offers the path of least resistance. So we stick with default settings, suboptimal heuristics, outdated habits and unsatisfying routines – not because they’re best, but because they’re known. As the saying goes, better the devil you know than the devil you don’t.


This bias quietly shapes careers, finances, relationships and personal growth. It’s why change often requires more energy than stagnation, even when stagnation is costly. For example, the person who believes that depositing money into a savings account is the best way to prepare for retirement may be missing out on a small fortune. A Fidelity International article compares the growth of £1,000 in a cash savings account with £1,000 in a global share fund. After twenty-five years, the cash savings would be worth around £1,700, whereas the global share fund would be worth around £5,000. That may not sound like a great deal, but the growth is roughly 5.7 times more (£4,000 of gains versus £700). Consider how much of a difference that would make over a 40-year horizon in an account topped up with regular deposits.
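
For readers who like to see the arithmetic, here is a minimal sketch of that compounding comparison in Python. The annual rates are illustrative assumptions back-calculated from the £1,700 and £5,000 figures above; they are not rates quoted in the Fidelity article.

```python
# A minimal sketch of the compounding comparison above. The ~2.15% and ~6.65%
# annual rates are illustrative assumptions chosen so that £1,000 grows to
# roughly £1,700 and £5,000 over 25 years; they are not quoted figures.

def compound(principal: float, annual_rate: float, years: int) -> float:
    """Value of a lump sum after compounding annually at a fixed rate."""
    return principal * (1 + annual_rate) ** years

cash = compound(1_000, 0.0215, 25)    # ≈ £1,700 in a cash savings account
shares = compound(1_000, 0.0665, 25)  # ≈ £5,000 in a global share fund

print(f"Cash after 25 years:   £{cash:,.0f}")
print(f"Shares after 25 years: £{shares:,.0f}")
print(f"Growth ratio: {(shares - 1_000) / (cash - 1_000):.1f}x")  # ≈ 5.7x
```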


But breaking free from status quo bias doesn’t require a paradigm-shifting change in perspective. Making small modifications to our daily round can serve as a form of status quo-busting training, and the same method can be applied to any area of life where we have grown stale (such as emotional intelligence). Challenge your cherished beliefs by seeking out contradictory information. Ultimately, the goal is to keep rolling forward – because rolling stones gather no moss.


Why improving your thinking skills matters

Biases and thinking errors don’t just affect big life decisions. They shape everyday judgments – how we interpret conversations, evaluate ourselves and understand others.


Left unchecked, they can:

  • Distort reality

  • Fuel conflict and polarisation

  • Lead to poor long-term decisions

  • Undermine learning and growth


But here’s the good news: awareness helps.


You don’t eliminate biases – you manage them. The goal isn’t perfect rationality; it’s better judgment.


How to improve your thinking skills

You won’t outsmart your brain, but you can slow it down:


  • Pause before deciding, especially when emotions are high

  • Actively seek disconfirming information

  • Ask what you might be missing

  • Separate past costs from future choices

  • Assume context matters for others – and for you


Even small shifts in how you question your own thinking can compound into better decisions over time.



Final thought

Biases and thinking errors aren’t personal flaws. They’re universal. The most dangerous ones are the ones we don’t notice.


The moment you start spotting them – in yourself, not just others – you gain something powerful: the ability to step back, reconsider and choose more deliberately.


And in a world full of noise, that’s a quiet superpower.



Learn how to improve your thinking | Selected reading list

Thinking, Fast and Slow – Daniel Kahneman

This important book explains why we do what we do – why we make errors of judgement and self-defeating decisions. Kahneman, winner of the Nobel Prize, distils over 40 years of scientific research and scholarly investigation into showing not only why we make irrational choices but, more importantly, how to correct them.


Rationality – Steven Pinker

In Rationality, Pinker dispels the assumption that humans are inherently irrational cavemen (and cavewomen) with outdated brains in a technological age. In this highly accessible book, he outlines a range of cognitive tools – logic, critical thinking, probability, causal inference and decision-making under uncertainty – that we can apply in everyday life.


The Art of Thinking Clearly – Rolf Dobelli

The Art of Thinking Clearly covers a staggering range of teachings, from why you should not accept a free drink to why you should keep a diary. Simple and straightforward in its composition, it will provide you with a roadmap to smarter decision-making.


Noise – Kahneman, Sibony, Sunstein

Noise explains how and why humans are so susceptible to noise (divergent judgement formation) and bias in decision-making. We are all prone to making 'suboptimal' judgements. Mostly, poor decisions are harmless; sometimes, however, they can have a disastrous impact. With a few simple remedies (such as understanding our biases), Noise explores what we can do to make better ones.


The Laws of Human Nature – Robert Greene

We are social animals. Our very lives, happiness and success depend on our relationships with people. Knowing why people do what they do is the most important skill we can possess, without which our other talents can only take us so far.


Predictably Irrational – Dan Ariely

Predictably Irrational exposes our strange, often counterintuitive behaviour. It demonstrates how irrationality routinely undermines rational thought, and that the reason for this is embedded in the very structure of our minds.



About Dr Laura Allen –

A Chartered Psychologist and Integrative Therapist, Dr Allen specialises in a broad range of therapeutic methods. She is a published author of numerous research papers and interactive courses in the field of psychology. Dr Allen works one-to-one with clients and supervises other practitioners. She is also a proud member of the British Psychological Society assessment team supporting psychologists in training.


