Now we move on from hard, technical skills to the soft skills related to personality and personal attributes. But just because something is a soft skill doesn't mean it can't be learned and improved. Any soft skill can be honed. And hone we must, because as more and more practical tasks become automated and given over to machines, certain soft skills will become increasingly valuable in the workplace.
For me, critical thinking is right up there as one of the most vital soft skills to cultivate for future success. In this era of fake news and social media bubbles—and indeed, with the sheer volume of information that we're now bombarded with on a daily basis—the ability to look at evidence, evaluate the trustworthiness of a source, and think clearly is becoming more important than ever. Critical thinking is often confused with being critical or negative, but that's a misconception. Critical thinking is about objectivity. It's about having an open, inquisitive mind—something that all employers find valuable.
Critical thinking isn't just important for career success (or academic success, for that matter). If you lack the ability to think critically, you may be more vulnerable to things like manipulation, fraud, and fake news in your everyday life. Critical thinking is an important life skill, then—and not just something you get credits for at university or a throwaway statement that looks good on your CV.
We all think, right? As humans, we can't help but think. (If you've ever tried to meditate, you'll know how hard it is to “turn off” your thoughts.) But there's a big difference between the kind of thinking we do routinely and thinking critically. Because our everyday thinking—the kind of thinking we do without, well, thinking about it—is hampered by things like incomplete information, personal opinions, assumptions, biases, and even prejudices. Not all thought is high-quality thought, in other words. This is why we need critical thinking.
Critical thinking essentially means thinking objectively. It means analyzing issues or situations based on evidence (rather than personal opinions, biases, and so on) so that we can build a thorough understanding of what's really going on. And from this place of thorough understanding, we can make better decisions and solve problems. To put it another way, critical thinking is Mister Spock coolly applying reason and logic to an unfolding situation, while Bones is in the background spluttering out the first emotion-laden thought that comes into his head.
Delving a little deeper, critical thinking comprises several processes:
In a nutshell, I like to think of critical thinking as active, independent thinking—as opposed to passively swallowing information and taking it at face value. Importantly, this is a skill that can be cultivated and improved over time, meaning we can all take steps to become better critical thinkers. And, given some of the challenges in the modern workplace and the world at large, it's vital we all do just that.
There are some troubling trends at play in society today—trends that pose a threat to critical thinking and, at the same time, make critical thinking more important than ever. At the end of the chapter, we'll look at ways to overcome these roadblocks and improve critical thinking, but for now let's focus on the trends themselves.
Bias obviously isn't a trend. Humans have always had biases (and, yes, we all have them). But trends such as increasing polarization and social media filter bubbles can make it harder to recognize and break free of biases. That's why it's worth dwelling on the subject of bias, and how it affects thought.
No matter how rational and logical we think we are, the truth is we're all under the influence of cognitive biases—biases that shape our beliefs, thoughts, and decisions. Indeed, a good many of us will have experienced cognitive bias in action. Some biases are glaringly obvious, such as gender bias and stereotyping, but others can be incredibly subtle and hard to spot. For example:
These are just a few of the biases that can have a powerful impact on our everyday thought and decision-making. There are many more. On top of these cognitive biases, there's often bias in the data we use to make decisions (see Chapter 2), meaning the data may be skewed towards a particular group, or exclude or underrepresent others. All of which means we need to work hard to spot bias in our own thinking, and in the information we're presented with.
The world seems to grow smaller every day as the gaps between our different cultures narrow (think of British teens soaking up South Korean pop music, or Americans celebrating Saint Patrick's Day). And yet, at the same time, we seem to be getting more, not less, divided. This is the polarization trend, where people are divided into sharply opposing camps, with little or no ability to identify with the other side. Think of the Brexit versus Remain debate, or the apparent chasm between Democrat and Republican voters in the US.
One 2018 study perfectly demonstrates polarization in action. In it, people were asked to estimate the percentage of Democrats who are Black, atheist or agnostic, and gay, lesbian or bisexual, and the percentage of Republicans who are evangelical, 65 or older, Southern, and earning more than $250K a year—essentially playing on stereotypes for both parties. Democrats believed that 44 percent of Republicans earned more than $250K a year (the true figure is 2 percent), and Republicans believed that 38 percent of Democrats were gay, lesbian, or bisexual (when it's really around 6 percent).1 In other words, misconceptions ran high when describing the “other” party. And the more political information people consumed, the more they were mistaken about the other side.
How can this happen? Fundamentally, the way we consume information has changed. The internet has made it possible to access whatever information we want, and yet the sheer vastness and choice of information has only widened gaps in society, particularly when it comes to politics (although political polarization isn't the only kind of polarization). As Ezra Klein puts it in his book Why We're Polarized, “Greater choices let the devotees learn more and the uninterested learn less.”
Based on this, you might think that the cure for polarization is consuming a wider variety of information, including info from the “other side,” but that doesn't seem to help. One study had Twitter users who identified as either Democrat or Republican follow a bot that shared tweets from authoritative figures and organizations on the other side. The participants were regularly surveyed about their views during the month-long experiment, and these surveys showed that respondents became more polarized, not less, after being exposed to opposing voices.2 Republicans expressed substantially more conservative views after the month was up, and Democrats became slightly more liberal.
So what can we do about this? The first step is to recognize that we're all prone to polarization, and not just political polarization. If you've ever clicked on a Buzzfeed headline along the lines of 22 Signs You're a '90s Child or 33 Things Only People with IBS Will Understand, you'll know how alluring it is to feel part of a “tribe.” This is why we need to exercise critical thinking skills in all walks of life. By thinking critically, we can spot the information that attempts to position different groups as “other,” learn to question our assumptions, and apply logic to the information we consume on a daily basis.
One reason for the increasing polarization in our world is the fact that so many of us get our news from social media—especially younger people. More than half of teens, for example, say they get their news from Instagram, Facebook, and Twitter.3
The trouble with social media is that it's designed to keep us coming back for more, and as such these platforms repeatedly serve up information that they know we'll like, based on our existing interests and beliefs. (By the way, I highly recommend you watch The Social Dilemma on Netflix. It shows with astonishing clarity how social media apps are designed to maximize our attention.) So if you like and share anti-vaccination content on Facebook, for example, the platform will show you more and more of the same content. The danger is we can end up thinking the world is exactly as we see it online because we're never presented with information that challenges our beliefs.
This is what's commonly known as a “filter bubble”—a state of information isolation caused by algorithms feeding us content that we agree with, based on our previous behavior. In the filter bubble, news that we dislike or disagree with is automatically filtered out, and this can create an “echo chamber” effect, where our perception of reality can become distorted because we're only ever exposed to views that mirror our own.
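To make the filter-bubble mechanism concrete, here is a deliberately naive toy sketch. The post titles, topics, and scoring method are all invented for illustration—real recommender systems are vastly more sophisticated—but it captures the core dynamic: content is ranked purely by similarity to what you've engaged with before, so material that challenges your views never makes the cut.

```python
# Toy illustration of a "filter bubble": a ranker that scores posts purely
# by overlap with the topics a user has already engaged with. Topics the
# user has never liked sink to the bottom and, in a short feed, never appear.

def rank_feed(past_likes, candidates, feed_size=3):
    """Rank candidate posts by topic overlap with the user's liked topics."""
    liked_topics = set(past_likes)
    scored = sorted(
        candidates,
        key=lambda post: len(liked_topics & set(post["topics"])),
        reverse=True,
    )
    return [post["title"] for post in scored[:feed_size]]

posts = [
    {"title": "Vaccines: what the evidence says", "topics": {"health", "science"}},
    {"title": "Why vaccines are a scam",          "topics": {"health", "conspiracy"}},
    {"title": "More conspiracy content",          "topics": {"conspiracy"}},
    {"title": "Local sports roundup",             "topics": {"sports"}},
]

# A user whose history is all "conspiracy" content gets a feed of more of
# the same; the evidence-based post never breaks through.
feed = rank_feed(past_likes=["conspiracy"], candidates=posts, feed_size=2)
print(feed)  # → ['Why vaccines are a scam', 'More conspiracy content']
```

The feedback loop is the point: the more you engage with one kind of content, the higher it ranks, and the less you ever see of anything else.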
Critical thinking helps by encouraging us to step beyond the filter bubble and echo chamber. Someone who thinks critically is able to stop and ask themselves, “Am I seeing the world as it really is, or am I only seeing a narrow slice that echoes my own beliefs?”
There's also the fact that what we're seeing on social media may not even be true! Misinformation spreads like wildfire online, and two-thirds of US adults say they've come across false information on social media. If, like me, you're surprised that figure isn't higher, consider the sobering statistic that 56 percent of Facebook users can't actually recognize fake news that aligns with their beliefs.4
Misinformation was particularly rife during the pandemic, prompting the World Health Organization to use the term “infodemic” to describe the rapid spread of misleading or fabricated news.5 As an example, according to a 2020 NPR/Ipsos poll, 40 percent of Americans believe COVID-19 was created in a lab in China, even though there is no evidence for this.6
While some of the misinformation online isn't necessarily a deliberate attempt to deceive people, we can't ignore the impact of malicious disinformation, designed to sow confusion and chaos. (Note the distinction between misinformation and disinformation. Both present untrue, inaccurate, or misleading information, but the difference lies in the intent. Misinformation refers to false or out-of-context information that's presented as fact, regardless of an intent to deceive people, while disinformation is a type of misinformation that intentionally and maliciously attempts to deceive people. To put it in regular terms, misinformation might be your Aunt Jan sharing an article on Facebook that she believed to be true, but you later find out to be false; while disinformation might be a political party or nation-state deliberately setting out to deceive or mislead people with false, incomplete or out-of-date information.)
To be clear, I'm not just talking about countries like Russia and China intentionally spreading disinformation; during the pandemic, President Trump repeatedly touted unproven medical treatments, contradicted his own government scientists, and even retweeted conspiracy theories by celebrities Diamond and Silk, whose own Twitter account was locked for spreading false information about the pandemic.7 Trump ended up being permanently banned from Twitter, but undeterred, he was busy announcing his own social media app TRUTH Social just as I was writing this chapter. What really troubles me is that Trump's plans include a video-on-demand service that will feature entertainment, news, and podcasts.8 This planned progression from social media to content creation is likely to make the whole polarization and echo chamber problem even worse, and make it harder for people to understand that other views exist—yet another reason why we need critical thinking skills.
Deepfakes are where artificial intelligence and deep learning techniques are used to create fake images, video, or audio—often with disturbingly realistic results. If you're not sure you've seen a deepfake, watch the YouTube video “Spot on Al Pacino impression by Bill Hader” and see the comedian's face eerily morph into a young Al Pacino as he mimics the actor's voice. Or check out @deeptomcruise on TikTok, an account devoted entirely to Tom Cruise deepfakes, in which you can watch fake Tom Cruise performing magic tricks or doing banal stuff like washing his hands.
A lot of deepfakes are pornographic (for example, mapping female celebrity faces onto porn stars' bodies), but they can also be used to embarrass public figures and even disrupt elections. Facebook took steps to ban deepfake videos that were aimed at misleading viewers in the run-up to the 2020 US election (basically, the videos were designed to make people think politicians had said words they did not actually say).9 Other deepfakes are aimed at scamming people out of money. In one example, the boss of a UK subsidiary of a German energy company authorized the payment of £200,000 into a Hungarian bank account after falling victim to a fake phone call from the German head of the company, featuring what the company's insurers suspect was deepfake audio.10
Deepfake technology isn't inherently bad. Remember that digital avatar of me that I mentioned in Chapter 1? It's made possible thanks to deepfake technology. And in the future, as we spend more and more time in the metaverse (Chapter 1), the ability to have a super-realistic digital version of ourselves will mean you can hang out with friends online in a more immersive way, and even conduct meetings from the comfort of your sofa, with your digital self dressed in smart clothes, while the real you lounges comfortably in pajamas! But the flip side is that people could hijack our digital selves very convincingly, potentially with devastating results.
For me, the even bigger concern is that deepfakes will further undermine trust in what we see, which could of course be used to certain people's advantage. President Trump, for example, has reportedly suggested that the infamous audio of him boasting about grabbing women by the genitals was not real, despite admitting in 2016 that “I said it, I was wrong and I apologize.”11 In other words, as deepfakes become more of a problem, and become even more convincing, there's a distinct danger that political leaders will be able to claim things that really happened—things that we actually saw or heard—never really happened!
Bottom line, it'll become increasingly difficult to tell the real from the not-real, which means we all need to think more critically about the content we consume.
It should be clear by now that individuals and organizations without critical thinking skills are at a disadvantage in today's world, so let's briefly explore some practical ways to strengthen them.
In essence, critical thinking means not taking information at face value. In practice, this means you should:
Working with a mentor can help you practice these skills in your working life—at the very least, it can be helpful to have someone pull you up short when you're responding to information from a place of emotion or assumption. Also, do check out online learning platforms such as Udemy and Coursera for helpful courses on general critical thinking skills, as well as courses on specific subjects like cognitive biases.
I'd highly recommend building critical thinking training into your soft skills training programs—and ensuring critical thinking is covered in relevant technical training (especially data literacy).
In this chapter, we've learned:
One of the many benefits of critical thinking is that it enables you to make better decisions. This leads us very neatly on to the next future skill: decision-making.