I recently read a brilliant book called The New Breed, in which the author, Kate Darling, argues that our relationship with robots should not come from a place of fear, but should be informed by our interactions with animals. According to Darling, instead of thinking in terms of humans versus machines, it's much more helpful to think in terms of our relationship with domesticated animals: as in, we're the master (for want of a better word), but they make our lives better and easier. It's an interesting counterpoint to the usual depressing predictions that robot overlords will replace us and take all our jobs.
I mention this because it's never been more important to approach technology with a positive attitude, because times are changing faster than we ever could have imagined. (And I say that as a futurist who has built an entire career around technology trends. Even I am staggered by the accelerated pace of transformation.) In a world that's constantly changing, digital skills will quickly grow stale and need refreshing. Continual learning will become the norm. And in this ever-shifting landscape, a positive mindset—by which I mean excitement about the possibilities that new technologies bring, and a willingness to learn about them—is what will separate the successful from the not-so-successful. That's why, in this chapter, I aim to spark your excitement for a digital-driven future, a future in which digital literacy skills will become hard currency in the workplace.
In short, digital literacy refers to the digital skills needed to learn, work, and navigate everyday life in our increasingly digital world. It means being able to interact with technologies with ease and having confidence in your digital skills—from basic digital skills to more advanced capabilities. So we're talking about skills such as these:
The digital transformation is probably the biggest transformation most of us have seen in our working lives. All of my work is now digital, from routine admin tasks, to creating and sharing content, to consulting with clients, to giving educational seminars. The digitization of work massively accelerated during the COVID pandemic, of course, but I expect it to continue to accelerate. The transformation will become more dramatic. And this transformation will apply across all sectors, even traditionally people-centric sectors like hospitality, education, and healthcare.
No industry will be left untouched by the digital transformation. And this means everyone's jobs will change, to one degree or another. Everyday tasks and communications will increasingly involve digital tools. Learning (whether workplace learning or full-time education) will increasingly harness digital tools. Intelligent machines—which could encompass robots, software, AIs, sensors, and who knows what else in future—will increasingly become part of every workplace, from factories to law firms.
Let's take AI, one of the biggest technology trends that we'll cover in this chapter, as an example. I believe AI is going to augment almost every job that humans do. Here's a cool example from my own business. I've been working with a company called Synthesia to create a digital me. Yes, by recording me in front of a green screen, they've been able to create a realistic digital Bernard that can say anything using my voice—all I have to do is type out what I want the digital Bernard to say and away he goes! This means I can turn one of my articles into a video of me without having to step in front of the camera (something that has enormous potential for growing my YouTube channel). There's even the potential to create videos in other languages with ease.
It's all possible thanks to AI. And pretty soon, a huge variety of occupations will use AI tools to get the job done more efficiently. Architects, for example, will be able to feed a design brief and specifications into AI-driven software and the AI will effortlessly come up with the most efficient designs for the architect to choose from. Or marketers will be able to generate rich content at the touch of a button. Or security guards will be able to analyze masses of security footage for suspicious activity, in real time. It's already happening. Just think of the rise of customer service chatbots—yet another example of AI at work.
This doesn't mean we all need to retrain as software developers or become AI experts. But it does mean we all need to be comfortable around technology tools, and develop the skills to work alongside them. With this in mind, I believe everyone should be asking themselves two key questions:
And looking beyond the workplace, there's no denying that digital technology is irreversibly integrated into our everyday lives. When did you last use a paper map while driving somewhere new? Or write a letter to someone? Or search for a business in one of those big, heavy local directories? My guess is it's been a while (or never, for many of my younger readers). Chances are you reach for a device when you want to find out something, communicate, navigate an unfamiliar city, or whatever. Even these everyday, familiar tasks will change rapidly as AI (and other technologies) evolve. It's therefore vital that we build a society that's comfortable, confident, and capable with technology if we want to thrive. And that requires some investment—both at a government and organizational level to equip people with the skills for success, and at an individual level, to engage with this brave new world and commit to becoming lifelong learners.
There's much work to do. According to one survey, 75 percent of employees think their job will become more digitally demanding within five years, yet a fifth of businesses have no digital skills strategy in place.1 People are at risk of falling behind, in other words, due to a lack of digital literacy.
UK think tank the Learning and Work Institute makes a more urgent case and says the UK is heading towards a “catastrophic” digital skills shortage.2 And the picture is equally troubling on the other side of the pond, where a third of US workers lack digital skills—and this despite the fact that 82 percent of middle-skills jobs (jobs that require less than a bachelor's degree while still paying a living wage) are described as “digitally intensive.”3 Something's got to change. And a big part of the solution lies in all of us embracing essential digital literacy skills.
For me, there are two levels of skills needed. First, there are the basic skills that we all need in order to use technology effectively for everyday tasks, and then there are the next-level skills that I believe are key to thriving in the workplace. We'll get into both levels in this section but ultimately, whether I'm talking about the basics or more advanced stuff, all of these skills are about being able to use technology to solve problems, communicate with others, access and share information, make work (and life) easier, and drive success.
The UK government has an essential digital skills framework that serves as a useful definition of the foundational digital skills everyone needs to navigate 21st-century life with ease. These include things like:
This may all sound pretty basic, especially if you're used to working in an office where digital tools have become integrated with most tasks. But consider this: a 2018 report found that more than 11 million people in the UK (21 percent of the population), and 10 percent of working adults, lack some or all of these basic digital skills.4
The framework also sets out additional essential skills for work, including:
Given the rapid digitization of work, I'd also argue that basic digital literacy now goes beyond turning on devices, using technology to communicate, and the like. So, to the above lists, I would also add the following as essential basic skills that we all need:
Now let's get into the next-level skills. If the basics are what we need to be able to navigate everyday life and do a job competently, these next-level skills are what we need to really excel in the workplace. These are the skills that will make you more valuable, and will help to “future-proof” your career (if anything can truly be future-proofed in this age of breakneck advancement).
Bear with me here, because I'm going to delve into some technical stuff like machine learning and the “metaverse.” You may be asking yourself, “Do I really need to know about this?” And the answer is yes, absolutely you do. True, you don't need to understand it to the level of a software developer, for example, but you need to have a simple grasp of how AI and other related technologies will impact life and work.
(As an aside, in the future, we probably won't even need humans to have computer programming skills, because AI will be designing software for us. OpenAI, the AI research company co-founded by Elon Musk, has already developed an AI called GPT-3 that can generate computer code based on someone simply describing what they want the software to do, which effectively means anyone could create their own software. GPT-3 can also write articles and other content pretty much as well as human writers, even in the style of particular writers, but that's another story!)
By a mile, the number-one trend everyone needs to understand is AI and machine learning. In this book, I'll often use AI as a catchall term to encompass artificial intelligence and machine learning, but that's not strictly accurate. They're not quite the same thing.
Whether it's based on machine learning or more complex deep learning, AI (to use the catchall term) is essentially about using data to make more accurate predictions and better decisions—predictions about which factory machines are likely to break down, for example, or which customers are most likely to ditch your company's product or service in the next year, and decisions such as the most efficient transit route for goods, or which leads the sales team should focus their resources on this month. That's the core of AI and its potential: more accurate predictions and better decisions, made possible by intelligent machines.
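To make that core idea concrete, here is a deliberately tiny sketch of "learning from data to make predictions." It uses a 1-nearest-neighbour classifier and entirely invented sensor readings (the temperatures, vibration levels, and labels are made up for illustration); real predictive-maintenance systems are far more sophisticated, but the principle is the same: past data informs the prediction for a new case.

```python
import math

# Invented historical readings: (temperature_C, vibration_mm_s) -> outcome.
history = [
    ((65.0, 2.1), "ok"),
    ((70.0, 2.4), "ok"),
    ((88.0, 7.9), "failed"),
    ((92.0, 8.4), "failed"),
]

def predict(reading):
    """Predict the outcome of the single closest past reading (1-NN)."""
    def distance(past):
        (temp, vib), _label = past
        return math.hypot(temp - reading[0], vib - reading[1])
    _closest, label = min(history, key=distance)
    return label

print(predict((90.0, 8.0)))  # near the past failures -> "failed"
print(predict((67.0, 2.2)))  # near the healthy readings -> "ok"
```

The more (and better) historical data such a system sees, the more accurate its predictions become, which is why data is so central to everything that follows in this chapter.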
There's no doubt in my mind that AI will become the most transformative technology humanity has ever developed. As Google CEO Sundar Pichai puts it, AI's impact will be even greater than that of fire or electricity. The full scale of AI's potential is difficult to wrap our heads around, but let's briefly explore some of the biggest areas in which I expect to see AI breakthroughs in the very near future.
Firstly, we'll see more blended, augmented workforces. Forget those fears of robots replacing human workers. While it's true that some jobs will change due to AI, and some will be lost altogether, the majority of workplaces will become blended environments where humans work alongside intelligent machines, and businesses can get the best out of both machine and human intelligence. In the very near future, more and more of us will find ourselves working alongside intelligent tools on an everyday basis.
Then there's better language modeling. AI allows machines and devices to understand human speech, respond to spoken requests, and even generate content. This will only become more powerful in the near future. Remember GPT-3, the AI developed by OpenAI? The company is already working on a successor, GPT-4, which should be even more powerful, in theory giving it the potential to hold conversations that are indistinguishable from human conversation.
AI in cybersecurity is another area to watch. AI is playing a greater role in cybersecurity, by learning to recognize those behaviors that may signal nefarious intentions. I expect this to be a huge focus of AI going forward. (Read more on cybersecurity in Chapter 4.)
AI will also be the lynchpin of the metaverse—a virtual world, like the internet, that we can essentially live in. More on this mind-boggling idea is coming up later in the chapter.
We will also see advances in “low-code” and “no-code” AI. Much as you can use online drag-and-drop platforms such as Squarespace to create your own website even if you don't have any web design experience, low-code and no-code AI will allow people to create their own AI systems using easy, plug-and-play interfaces. This will do wonders for “democratizing” AI and making it more accessible to the masses.
Finally, there's creative AI. Here, we'll increasingly see AI being used for routine creative tasks, such as coming up with headlines and photo captions for articles, or designing infographics—and even for not-so-routine creative tasks, such as writing articles and creating art. Co-creation, where human creativity is enhanced with AI tools, is another key area to watch. (More on this in Chapter 8.)
It's also important to understand that AI, and the wider digitization of our world, will also impact other technology trends, such as 3D printing (indeed, it's already possible to print pretty much anything, from houses to food), or gene editing and synthetic biology. As an example of AI's wider impact, it's now possible to run digital trials of new drugs and vaccines, thereby speeding up development time.
In this way, it's important to consider AI not just as a standalone technology trend, but as an intrinsic part of a wider technology revolution.
AI wouldn't be possible without data. It's data that allows intelligent machines to spot patterns and make predictions. Therefore, another essential digital literacy skill is being data literate. I talk more about data literacy in Chapter 2, but in very simple terms, data literacy means being able to read and use data effectively. I'm not talking about becoming a data analyst here. Rather, data literacy means being able to access, interpret, and extract useful insights from whatever data is needed to do your job well and make decisions.
But data literacy also means understanding that data underpins so many other technology trends, particularly those based on AI. It's data that allows Alexa to understand your spoken requests (the technical term for which is natural language processing) and reply to you in natural speech (known as natural language generation). Data also allows machines to "see," such as in autonomous vehicles, which use cameras and sensors to understand what's going on around the vehicle and act accordingly. This ability of machines to see and interpret visual data is known as machine vision. Then there's robotic process automation, in which software robots are deployed to carry out repetitive tasks, such as scheduling appointments or processing credit card applications. There's also quantum computing—basically, super-fast computers that are capable of carrying out tasks that traditional computers would never be able to manage. None of this would be possible without data.
Data is also connected to 5G, in the sense that better, faster telecommunications networks will allow us to carry out more data-heavy tasks on the fly, wherever we are in the world. This in turn links to cloud computing, because, with data stored in the cloud and better, faster networks, we'll be able to access data stored in the cloud from anywhere. But 5G networks will also enable more edge computing, where data is processed on devices rather than in the cloud.
The proliferation of data has also given us smart, well, everything, from smartphones to smart homes and even smart cities. This will only continue as everything in our lives becomes smarter, from our fridges and vacuum cleaners to our workplaces.
Could the future of the internet be us living in the internet rather than just looking at it? That's the idea behind the metaverse concept, and it's the next big digital trend after AI. Mark Zuckerberg has said building a metaverse is something he was interested in before he ever dreamed of Facebook. But what is a metaverse? It's the term for a persistent, shared, virtual 3D world, in which more and more activities—working, gaming, going to a concert, shopping, hanging out with friends, and more—take place in a virtual, not physical, environment. “Shared” is a key word there, since the metaverse is all about creating a shared, immersive experience where people can collaborate and interact as though they were in the same physical space. The metaverse doesn't need to be limited to one platform, but there does need to be a shared, continuous experience. So you could move from an immersive Virtual Reality (VR) environment to a 2D application on your phone, but the key thing is there's continuity between the activities and environments. Having your own individual digital avatar—a digital you—that represents you across different experiences will be a key feature of the metaverse.
The idea of humans being permanently plugged into machines, experiencing an immersive digital reality, naturally raises unfavorable comparisons with The Matrix. And of course there are moral and ethical challenges to consider, such as the potential for anonymous trolls to stalk us across immersive digital spaces. But, if you think about it, the metaverse is a concept that humanity has been naturally building towards since the emergence of the internet, social media, shared digital environments such as Second Life, and virtual and augmented reality. In other words, more and more of our everyday activities are already taking place in a digital environment—something that the pandemic only accelerated—and the metaverse could be the next logical step on that journey.
If this sounds far-fetched, consider the ever-popular Fortnite game as an example. The game has begun hosting virtual concerts on the platform, attended by millions of players who can watch artists like Ariana Grande perform a set within the game instead of in real life. This is a sign of what's to come, if the metaverse concept comes to fruition.
Even more, let's say, vintage artists are getting in on the trend for immersive, digital experiences. Swedish supergroup Abba has worked with visual effects company Industrial Light & Magic to create digital versions of themselves in their prime—creating virtual copies of the foursome that behave accurately in every way, right down to every dance movement. And this de-aged virtual Abba (ABBAtars, as they've been dubbed) will be taking to the stage in London in 2022 in a digital performance named Abba Voyage. Fans will go to a purpose-built, physical venue in London, but they'll be treated to virtual avatars, depicting the group as they were in 1979. It's an intriguing glimpse at the future of entertainment, one that could potentially see Elvis resurrected for concerts, and who knows what else.
But realistically, how close are we to developing a metaverse? Companies like Facebook and Microsoft that are exploring this area mostly position the metaverse as an aspirational thing to aim for. So it's not like it's just around the corner. For now, the metaverse is more of a concept for making existing online environments more immersive and even more deeply integrated with our lives—for instance, by merging virtual reality with social media, something that Facebook has said it hopes to do within the next five years.
Looking further ahead, living inside the internet will be possible thanks to new, more immersive devices and hardware. For example, instead of using chunky VR headsets to enjoy a virtual experience, we'll be able to put on a pair of comfortable smart glasses—and beyond that, potentially wear smart contact lenses. The line between the real world and the digital world will become all the more blurred as we find new ways to plug into digital experiences. (By the way, if you're interested in reading more about this topic, check out my book Extended Reality in Practice.)
Clearly social media and virtual reality are key stepping-stones on the way to unlocking the metaverse. But, in my opinion, another important stepping-stone comes in the form of Omniverse, developed by gaming and AI pioneers Nvidia. Omniverse is a simulation and collaboration platform that runs physically realistic virtual worlds and connects to other digital platforms. At this point, it's mostly designed to coordinate remote teams and give them digital environments that recreate the experiences of working together in real, shared workspaces—including fully animated avatars created from webcam feeds. Tools like this could revolutionize the nature of work as more and more of us switch to remote working. But it's also easy to see how a platform like this could be extended to all kinds of nonwork experiences, such as having a virtual quiz night with friends.
Let's explore some useful starting points for boosting your digital literacy.
The first step is to understand where you're currently at in terms of digital literacy. Start with the basic skills outlined in this chapter and assess whether you have the knowledge needed to use technology effectively in everyday life. Depending on where you are in the world, there should be some useful government or institutional learning resources to help you pick up essential digital skills. A great example in the UK comes from the Open University, which has a free course called “Digital Skills: Succeeding in a Digital World,” designed to help people develop the confidence and skills for life online.
When it comes to the next-level technology trends, such as AI and the metaverse, let me stress again that we won't all have to become tech experts to succeed. Rather, you need to be aware of these technologies, and consider how they might impact your work and life. How you keep abreast of these trends will depend on your preferred way of learning new things. There are resources like WIRED magazine, and the GeekSpeak podcast. Or there's tons of accessible information on YouTube, and a whole host of free online courses on particular topics, designed for every level from beginner to pro.
There's also my own website, bernardmarr.com, which provides a wealth of info on all manner of technology trends, plus practical case studies that show how businesses are already using these tech tools to drive success. And don't forget to check out my YouTube channel for video content, from short and accessible videos to deep dives on certain topics. Just search my name on the platform, and hit subscribe.
And, of course, do encourage your employer to invest in digital literacy training and support. This will be a harder task at some companies than others, but try to sell the positive benefits that come with enhanced digital skills—including improved productivity and performance.
Ultimately, though, the very best thing you can do for your digital literacy skills is to think of yourself as a “lifelong learner.” (Read more about continual learning as an essential skill in Chapter 18.) And the second-best thing you can do is to approach new technologies with a positive mindset. Because, yes, the digital transformation will change many people's jobs, and will lead to many millions of jobs becoming obsolete. But the number of new jobs being created by the digital revolution will outnumber those lost. To put this in numbers, the World Economic Forum estimates that 85 million jobs may be displaced by 2025 as the division of labor shifts from humans to machines, but, crucially, 97 million new roles will emerge that are more suited to this new division of labor.5
Employers will also need to take steps to ensure their workforce has the digital skills necessary for success. The starting point is to understand the current state of digital literacy in the organization, and identify any existing gaps and training needs. Good questions to ask include:
Employee surveys (and, potentially, digital tests) will help you understand the current state of digital literacy. Only then can you put together a digital learning program and provide ongoing support—harnessing both online learning and on-the-job training as appropriate. But remember, a key part of boosting digital literacy in the workplace is emphasizing time and time again why it's valuable for every employee. After all, change can be scary, so it's important to cultivate a positive attitude to technology and dispel those negative stereotypes of robots coming for people's jobs. Couple this with an organizational culture that values lifelong learning and you're in a great position to face the rapid transformation coming our way.
Let's finish up this chapter with some final key takeaways.
The massive digital acceleration we're experiencing wouldn't be possible without data. So let's turn to the essential data-related skills needed to thrive in the 21st-century workplace.