Deloitte: How to Prevent AI from Taking Your Job

An interview with David Mallon, managing director of Deloitte Consulting

Deborah Yao, Editor

February 26, 2024

6 Min Read

Workers, it can be said, have a love-hate relationship with AI: it can make the routine parts of their jobs easier, but it also has the potential to replace them.

AI Business recently sat down with David Mallon, managing director at Deloitte Consulting, to discuss the impact of AI on jobs and ways in which white-collar employees can avoid being replaced by AI.

The following is the edited transcript of that conversation:

You’ve seen the headlines: Companies are cutting jobs because of AI’s increased efficiency, or because they want to pivot toward an AI-ready workforce. How can you protect your job from being taken by AI?

AI is causing no small amount of anxiety on the part of workers and teams and leaders. They are wondering, ‘Is it going to affect me in some way?’ But it is also causing a great deal of wonder. …

Rank-and-file workers are, broadly speaking, more excited by AI than afraid of it, and are certainly keen on the idea that they could hand off to an algorithm a lot of the mundane parts of their jobs that none of us wants to do.

But since we are not going to outdo the algorithms (on tasks they can do for us), we should look for opportunities to upskill ourselves and our people in some of the AI tools of the trade, like prompt engineering and so forth.

But more to the point, we need people who are curious, who are good at divergent thinking, who are good at building and sustaining human relationships. … Once AI has transformed a particular job, what's left over is what we call human capabilities. Others call them soft skills or power skills.

I've got a 13-year-old and a 15-year-old. If I were telling them what they should focus on as they start their careers, it would be the things that are distinctly human: being able to tell a story, to find the connections between things, to ask the right questions … and to build relationships.

We tend to value the things that take a lot of work to create. It should not be surprising that today there is an explosion of interest in the things that artists (make), and in luxury items whose creation has a distinctly human element.

If you don't want to be disrupted by AI but also don't necessarily want to work in the tech world, that's fine. Then gravitate to professions where there is a distinctly human element. Be curious, be open. Find ways to stretch your imagination.

The best way to avoid being disrupted by AI is to figure out how to use AI to reinvent what you do. If you're on the forefront of re-authoring your own job, you've increased the likelihood that you're going to be just fine no matter how this plays out.

What types of jobs or tasks are more likely to be replaced by AI? How can one pivot to avoid it?

What we're experiencing right now with generative AI has been true throughout the history of automation, and particularly computer-enabled or digital automation. We continue to find ways to take the rote, mundane, step-by-step work out of a process and have tools, whether physical robots or algorithms, do those steps for us.

It's not so much that any particular job is going away; it's that the jobs that are there (are changing). Take the middle manager. It's not a role that's going away, but it is absolutely a role that's distinctly changing. …

We're getting lots of new tools and systems and data that can tell us very interesting things, and we can actually be better at that role of manager. But it is absolutely a very different role now. … (It will be more of) coaching, development, building relationships, building bridges between teams. That's where we'll spend our energies instead of on things like generating that weekly report, because that's where tools like AI are stepping in for us.

On the flip side, how should companies be thinking about their workforce in the age of AI?

Some are trying to create spaces for their workforces to play with these new technologies, but safe ones, rather than using them directly with clients, where something going wrong could cause an issue. Why not create sandbox versions of these tools where the workforce can experiment with them and use them to figure out how their own roles are going to change? Or come up with new ideas a leader in the organization may not have thought of?

We don't want to view our workforce as a cost that these tools are going to somehow reduce. We want to see our workforce as a source of curiosity, relationship-building, divergent thinking and so forth. And the combination of this volume of humans who can bring those human capabilities to bear, together with these new tools, is always going to be worth more than just using the technologies by themselves, without the human element.

A big retailer (IKEA) did this with its call center staff. Rather than automating to reduce headcount, it brought these new technologies in and changed how those call center agents dealt with customers on policy and process. (Using chatbots to handle more routine issues) freed up their time.

They've now upskilled call center agents to focus more on helping customers with design, and they are putting tools like AI in their hands to make them better on-demand interior designers. They're also using those agents as a source of new ideas. Yes, I can bring these tools in and they might save me some money, but these tools plus the humans I have create exponential value.

In your most recent report on workforce trends, you mentioned the importance of human sustainability, meaning employers should have policies that keep their workforce happy. Do companies practice it?

Seventy-nine percent of executives say that organizations have a responsibility to create value for workers as human beings. That's a big number. But only 43% of workers feel like their organizations have left them better off. Also, only 29% of companies or executives think they have a clear understanding of how to go about this. That's the challenge.

Of course, if we're public companies, we have a responsibility to our investors, our shareholders. But what does that responsibility look like? Is it just a quarterly number? Or is it more than that?

We don't necessarily agree with the notion that it has to be either (serving your shareholders or workers). If an organization does focus on the broader notion of human sustainability, we believe that business outcomes will also be higher. We believe that investing in human outcomes is an investment in business outcomes, and that these are not necessarily mutually exclusive.

What do you think the workforce of the future will look like?

Twenty years from now (AI technologies will be) just there. We talked about digital natives. These will be GenAI natives; they'll just be used to it. … I use these technologies to go farther, faster, and do better than I could have (on my own). Maybe there's more time for you and me to have a conversation, to be human.

In the future, we would have prepped for this call with a bot doing most of the back-and-forth data sharing, so we could focus just on the conversation we wanted to have, and then another bot would take it and write most of your story for you.

You're going to see more and more examples of that.


About the Author(s)

Deborah Yao

Editor

Deborah Yao runs the day-to-day operations of AI Business. She is a Stanford grad who has worked at Amazon, Wharton School and Associated Press.
