Unlocking the Wave of AI and Human Connection with Steve Cockram

Subscribe

Free Coaching Call

Need some quick advice? Jump on a call with me, and I’ll provide some insight and action. This is NOT a sales call where I try to get you to hire me. Promise!

Click here to schedule a call. 

Episode Overview:

In this insightful episode, we explore the profound impact of artificial intelligence on leadership, relationships, and our sense of humanity. Steve Cockram shares his journey from coaching giants to leading AI-driven transformation, emphasizing the enduring value of relational intelligence. This conversation offers practical insights for leaders facing unprecedented change and the importance of staying rooted in authentic human connection.

Additional Resources:

* Website
* Website
* Website
* LinkedIn
* Instagram
* X (Twitter)

Timestamps:
00:00:00 – Intro & Introduction of Steve J Cockram
00:06:54 – Steve’s Background and Career Transition
00:08:02 – Leadership & Cultural Reset at Endava
00:17:28 – The Challenges of Corporate Transformation
00:27:36 – Navigating AI in Leadership & Business
00:32:11 – Leadership Under Pressure
00:34:56 – Reflections on Humanity & AI Impact
00:38:35 – The Role of AI in Shaping Future Leadership
00:45:21 – Emotional Intelligence & AI: Can AI Replace Human Connection?
00:55:02 – The Importance of Relationships in Leadership
01:05:30 – Final Takeaways and Wrap-Up

Steve Cockram (00:02.00)
And that was, that’s where it began. And they literally said, one of them said, “Come and be our Chief Performance Officer.”

Because in the end, what we’re doing is we are trying to build a new way of doing business in the world where client relationships will be far more about partnerships.

There’s the variable in human connection, which is people won’t always agree with you. People will have had a bad day. People are dealing with their own sense of opportunities and beliefs. And it’s that infinite dynamic variation that I just don’t believe you can program.

Skot Waldron (00:40.00)
When I’m not hosting Unlocked, I’m speaking at events all over the world. I’m helping leaders and I’m helping teams communicate better. I’m helping them build trust faster and actually enjoy working together. I’ve spoken for companies like The Home Depot. I’ve spoken at national architectural firms. I’ve spoken for pharmaceutical company offsites. I’ve spoken at associations, you name it.

Over 1,800 people have reviewed me at this point, and 99% of them say they got some value. That’s pretty awesome. Even the caterers have thanked me. And if they are thanking me, and they’ve heard a lot of talks and they’re busy doing their jobs, that’s saying something. If you’re an event planner looking for a speaker who’s really easy to work with (trust me, I want to be the last thing you’re worried about on event day; I’m going to take care of you), and who actually delivers value for your audience, value they’re going to use on Monday morning when they return to the office, then let’s talk.

I have a very special guest on the show today, and that is because Steve Cockram is not only full of fun things to talk about, but he’s had a huge impact on my life. Maybe knowingly, maybe unknowingly, I’m not sure. Well, he knows now because I’ve told him. I’ve been talking to Steve and working with him in various capacities. He and Jeremie Kubicek are the founders of GiANT Worldwide, and I am a GiANT consultant. I use their platform and tools to educate and empower leaders to lead better, help teams work better together, communicate more efficiently, and build better trust with each other. GiANT is the foundation of a lot of what I use to do that. So, I’m appreciative to Steve, and to Jeremie, his partner, for helping me on my journey to doing that as well.

But today the conversation is going to be on leadership, on emotional intelligence, and on how we build relationships. The first half of the conversation is a lot about Steve’s story and how he came to where he is. Then we really get into AI: how it’s working its way into our world, and how it intersects with leadership and emotional intelligence. We go back and forth on some thoughts there, and I’m open to hearing yours. So, leave your thoughts in the comments. I really want to hear what you think about what we’re talking about.

Steve Cockram is now the Chief Performance Officer at Endava, where he leads a strategic Performance function embedded within the company’s most significant AI-driven client partnerships. He’s a best-selling author and co-creator of the 5 Voices leadership system. He spent over two decades advising CEOs and executive teams on building high-performance cultures. And he’s worked with over a thousand different leaders in various capacities. Well, probably more than that now at this point.

He co-founded GiANT Worldwide with Jeremie Kubicek and has helped develop leaders for the digital age. And the digital age is even more upon us than it was when he and Jeremie started GiANT back in the day. AI is a big part of what we’re going to talk about.

So, I’m excited about this conversation. I hope you are too. It’s a longer one, but that is because Steve has so much to say about this and I have so much to ask about this. So, I hope you enjoy it. Here we go.

(04:22.00)
Steve, it is an honor to have you on the show, man. Not only because I know you’re going to drop the wisdom for my people, but I personally just get some hangout time with you, which I will never pass up on.

Steve Cockram (04:36.00)
Thank you, Skot. American introductions are always the most embarrassing because you can only ever disappoint after the buildup they give you. So, thank you for that.

Skot Waldron (04:43.00)
Well, hold on. What is a British introduction like? I mean, are you like setting the bar really low? Is that what you’re doing?

Steve Cockram (04:50.00)
No, no. We kind of build you up, build you up and just before you come on, we cut you off at the knees. So, you know, that’s usually the way forward.

Skot Waldron (04:58.19)
Okay, that’s fair. I guess I’ll avoid the British shows then.

Steve Cockram (05:03.00)
There you go.

Skot Waldron (05:04.00)
So, I can continue to feel good about myself when I get on the American ones.

Steve Cockram (05:07.296)
Or our usual trick with the Americans is we actually tell you how great you are, but we’re telling it with a level of cynicism that you don’t pick up. We realize we’re taking the mick, you don’t realize, and you celebrate how wonderful what we said about you is. That’s the other British way of doing it. None of which are particularly attractive, which is why I say I’m an American in exile in Britain with a British passport.

Skot Waldron (05:27.00)
That is brilliant. Yeah, man. You’ve had a little bit of that American stuff rub off on you. I know, I know it has.

Steve Cockram (05:35.00)
I’ve lived there for 5 years, and I’ve spent probably a minimum of 12 weeks a year for the last 10 years in America, Skot. So, it’s funny that I’m coming to you live from my London office at long last, having lived here for the last 12 years.

Skot Waldron (05:47.00)
I know and your view does not suck. You gave me a little picture of that. That was awesome. Yeah, man, let everybody see it. It almost looks like space world to me. Like I always look out there like, is that even real?

Steve Cockram (05:59.00)
That’s The Shard and that’s The Walkie Talkie. Now, of course, if you go to New York, you’ve got millions of skyscrapers. We’ve just got a few as always, but there we go. It’s real. That’s where I am.

Skot Waldron (06:10.00)
I know it’s real because of how gloomy your skies are, so I know that you’re not faking something.

Steve Cockram (06:14.00)
Yeah, you can’t, you’d never fake a blue sky, would you really?

Skot Waldron (06:18.00)
No, I wouldn’t, exactly. I would know that’s fake. Hey, let’s talk about this. So, we’re going to get into some of your history and where you’ve been and what you’ve done. But this new position for you, this new chapter in your life, is really exciting, I think, from my outside perspective, to see what you’re going to do there. I mean, let’s talk about, first of all, your title. Did you just make that up? Did you come to them and be like, “Hey, you know what? Do you have a Chief Performance Officer?” And they’re like, “No, Steve, that sounds really smart. We need one of those.”

Steve Cockram (06:54.00)
Endava was actually our first ever client at GiANT. So, when Jeremie lived over here, John was a friend of mine, and I’d never sell anything to my friends because, you know, why would you do that? Jeremie took John out for a curry and sold the concept of something called Culture Builder, because we had no idea what we were doing at the time. Jeremie did it for about 5 weeks and realized that he was telling a whole load of cynical Brits, “you’re going to love this,” as you’ll know, a Connector talking to Pioneers. And in the end he said, “Steve, you may need to look after this account.”

So, 12 years on, I’m still trying to honor that promise. So, I know them well. It’s not a company we don’t know. The CEO asked me in May, would I come and do a two-day offsite cultural reset? And I said, “no.” He said, “what do you mean, no?” I said, “well, there are certain individuals. I’ve told you for a while, you’re never going to change the culture while you have certain individuals there who are toxic to it and incompetent.” And he went, “well, you know, I think you’re being over harsh.” And I said, “well, you don’t pay me to blow smoke up your backside, you’ve got plenty of people to do that for you.”

Anyway, long story short, those changes were eventually made. And then in July, I did a two-day cultural reset for this company. And at the end of it, over a few bottles of wine in the evening, they literally said, “fair enough, you’ve done a great job terrifying us. We realize we have a broken culture. We actually need to build a new business model for the AI world as IT services. And we need to build a leadership model, because the one we had wasn’t strong enough to help us behave ourselves under the pressure.”

So, then they said these immortal words, Skot, which we can probably bleep out on your podcast. They go, “well, Steve, why don’t you put your effing money where your mouth is? Rather than try and coach us how to do it, you’ve just told us how Herculean the lift’s going to be. We’re going to be putting two new engines on an airplane while flying business as usual, or trying to. Why don’t you put your money where your mouth is and come and be part of the team for a season and actually help lead the change?” And that’s where it began. And one of them literally said, “Come and be our Chief Performance Officer.”

Because in the end, what we’re doing is we are trying to build a new way of doing business in the world where client relationships will be far more about partnerships. Where we will basically offer to be the partner to some of the largest companies in the world as they navigate their own AI business transformation. And we think the superpowers that you have and GiANT and all the things we’ve benefited from, if you could just sit at the intersection of our client, big client relationships, so people are signing 5-year deals, minimum 100 million spend a year, if you could help those two separate teams become one team, and actually understand what true partnership might look like, not just we’re going to buy something from you, you deliver it, or we beat you if you don’t.

How do we do partnership in a world that none of us have been to before? How do we help you lean into that future? How do we share the productivity gains? How do we partner on products? How do we sell for each other? He said, if you could do that, Steve, for 10 of our largest global clients, and maybe where some of them have got a little bit tired and the relationships are not what they were, could you do partnership retreats for two days? Maybe seven people over here from Paysafe and seven from Endava, or seven from MasterCard, or whatever it might be. You’re the best we know at translating people to each other. And ultimately, if those partnerships perform, you’re worth more than we pay you.

So, that’s where it began, Skot. And they said, obviously, you being around is a bit like having a grownup in the room. It’s a bit difficult for us as an exec team because you know where all the bodies are buried. You know all our insecurities, and yet, 12 years on, we still trust you enough to believe that we have a better chance of navigating this incredible transition with you with us. And they all went to bat for it, Skot. This is funny, because imagine persuading a board that we’re about to appoint another white middle-aged male, to a white middle-aged male exec team, for a role that no one’s ever heard of, with no concrete deliverables. That’s where we are.

So, 4 months on from when that happened, I’m now an executive of a publicly traded company that’s listed on the NASDAQ. We’ve just done our earnings call today. So, I have learned more in 4 months about how publicly traded companies work and all the shenanigans that go with that.

So, that’s a very long answer to your question, but I hope it gives a little bit of context. I’m still chairman of GiANT, I’m still chairman of Workplace, I’m still on the board. They literally said to me, “Steve, you can do whatever you want, but we’d like you to come and be part of this. Keep a couple of consulting clients, whatever you want to do, but we just think we have a better chance of winning with you here.” Which has been amazing. So, there you go.

Skot Waldron (11:47.00)
I mean, it’s incredible. When you think about going and building this thing with Jeremie Kubicek, how y’all built GiANT as entrepreneurs and founders into what it is today, still being part of that, and then getting your first, like, big boy job, right? In a corporate space, on the NASDAQ. It’s just such a transition.

But what’s really cool is that Endava saw that in you. I mean, you’ve coached these guys. You have worked with their leadership team and their people, and you know where the bodies are buried and what their culture problems are. And how cool of an opportunity for them to be able to see, hey, this guy can help us with the performance side of what we’re doing, because performance is people. And if we can help our people thrive, who better to bring on to help do that than Steve? So, super cool.

Steve Cockram (12:49.00)
So, there you go, it’s perfect for an extrovert, you see, Skot. Everywhere I go in Endava, people ask, what is a Chief Performance Officer? Because we differentiate: it’s not people. We have a people officer. And obviously I work incredibly closely with Letty, because they’ve used GiANT to frame out so much of their leadership development over the years.

But I said to her at the time, “Letty, be honest,” you know, Veritaserum, which is white wine. I said, “Letty, if I took this role, would you be okay with it?” Because I’m a big character, bigger than ever with the amount of corporate eating and drinking I’m doing. But I said, “I don’t want you to just feel somehow obliged. I really want to know what you think.” And she literally went, if you joined, it’d be the best thing that ever happened to this company. And I’m thinking, well, okay, you’ve had a few glasses of wine.

She said, “I never believed it was even possible that somebody who’s been as successful as you would even consider coming and being employed.” She said, “I would love it. It would be amazing. Would you invest in me?” So that was one of those ones as well, where I go, politics aside. What I realized, Skot, is that GiANT was one of the purest cultures I’ve ever been part of. I mean, the love we had for each other, and the honor with which we lived our values. Now I’m literally in the middle of the Square Mile of London, where money, sex and power are real.

And I’ve had, if you think about it, quite a sheltered life: I was a teacher, a pastor, a coach to pastors, then GiANT. This is me literally in the middle of a quite alien world. But ironically, by me being me, everyone seems to think that somehow I’m a cross between their father, their pastor, and a grownup who really doesn’t carry a huge amount of insecurity. It’s not like I’m trying to prove myself to everybody. For me, it’s just an amazing group of people. I feel like I’ve been allowed to have a 12,000-person, almost-church to lead, ironically. And they all come to me for advice. You know, I’m probably one of the oldest here. They’re all young, cool, hip tech trendies, and I most certainly am not, I’m afraid. But hey, it’s amazing sometimes how we get surprised by the things that we do in our life. This has been a big surprise for me, but I’m enjoying the challenge, and I’m learning a lot about AI that I’d never have learned any other way, I’m sure.

Skot Waldron (15:17.00)
Yeah, and I think that in itself, bringing you in... I always talked about this too. When I ran my design agency, companies would say, “hey, do you specialize in doing design work and brand strategy for education institutions?” And we were like, “nope.” And they were like, great, we want you, because all these other companies that do that just give us the same stuff. It’s just a rubber-stamped process and solution. I mean, sure, it’s less of an on-ramp because they already understand our industry and we don’t have to educate you. But the product we get at the end is kind of the same.

So, I almost see that being the opportunity for you: bringing in an outsider, somebody who’s not in this world, who hasn’t been indoctrinated with the corporate world, and getting that outside perspective on all of this. I mean, granted, you know Endava very well, but this world’s a bit different, you know.

Steve Cockram (16:21.00)
Well, I’m constantly surprised, Skot, by the level of favor that I receive, and people being almost overly generous in their gratitude that I’ve come. You know, an IPO is a big moment: you ring the bell, the share price is $25, and it went all the way to $170. There are an awful lot of people who have been hugely successful, shall we say.

But the AI disruption and post-COVID has taken it all the way back down to $7. Well, that’s a dilution of people’s value that is significant. And I think there’s a humility that comes with that. And in a sense, the fact that I would join probably at the lowest point of the share price ever... most people don’t do things like that, because historically, most companies that go up and down that way don’t recover. Not necessarily in the way that I think this one will. And so, for me, it’s like, okay, I’m just being obedient and feeling incredibly privileged to be here and part of a team.

So, one of the biggest leadership lessons for me has been this: historically, I’ve only ever played four-ball better-ball golf or been the player-coach for a basketball team. And now I’m playing rugby. Okay, what do I mean by that? Well, I’m a very high-express, control-school person, which will come as no surprise to anyone who knows me. I might seem very jovial, nice and kind, but I usually like to be in control of the outcome, whatever sport we’re playing or whatever business. So, think of four-ball better-ball golf: if you’re my partner, I love it if you can play well. But ultimately, I can still control the outcome of the game by how I play.

In a basketball game where I’m the player-coach, I’ve hand-selected probably 5 or 6 people that I think are world class at what they do. And if we really get into trouble, I can call time out, gather everyone around, get the little whiteboard out, draw it up and go, Skot, you’re free. You’re going to take this shot to win us the game. So, I don’t have quite the same control as I do in a game of golf, but I do have a lot of control over who takes the shot and who goes where.

Rugby, okay, is a game for 23 players, 15 on the pitch at any one time, where everything is about the sublimation of ego for the good of the whole. No one individual in a rugby team, and I’ll come onto American football in a moment, so don’t worry, no one individual in a rugby team can win the game on their own. In fact, what they show is that if you have three star players in a rugby team and the other 12 are bang average, the three get injured because they try to do too much to carry the team.

Now, so if you think about it, American football is rugby on steroids, where you’re not even on the pitch with the same players. So, if you think of going, you know, am I offense, am I defense, am I special teams, whatever it might be, that’s probably more like a company with multiple brands inside it. We may be part of the same holding company, but we’re not one team.

Endava is one business delivering one set of financials, but the reality is no one individual in the team can actually do it themselves. And that’s been a real challenge for me, because I’m used to being the star of the show, for one. Obviously, I say that very humbly. And I’m not the star of the show here. I’ve always been responsible for revenue, and I ultimately can’t shift the revenue numbers here directly. Me choosing to be part of a team and play a small part is a learning I hadn’t realized I’d never had. I’ve coached people, obviously, but actually feeling the powerlessness at times of going, I actually have to trust the people that I’m with.

And in rugby, they tell me there are certain positions where you are often not seen by the crowds, but the team know that because you play the role you play, everyone actually gets a chance to play better. Rather than being the star players on the wing, or, you know, the fly-half that scores the points, this is the first time in my life I’ve been in one of those unseen roles that actually makes the whole team better. In rugby, they call me the number 6, you know, which is the blindside flanker. Obviously, Americans are loving this right now, Skot, I can see that.

Skot Waldron (21:10.000)
I don’t... wait, rug-what?

Steve Cockram (21:12.00)
Rugby. I think you boys play it. But that’s it, I’m always trying to share what I’m learning, and then I try to make it into a story and a visual that people can understand. So, I have incredible admiration now, more so than I’ve ever had before, Skot, for anyone who’s prepared to be the CEO of a rugby team. I cannot even imagine what it’s like to be the CEO of an American football team. How do you influence the culture when there are so many other people that you have to lead in that place?

So, you know, I’m not sure whether I’ll ever be allowed to play American football. That’s a metaphor rather than for real, by the way. But I go, it’s complicated enough trying to lead a 12,000-person organization in 61 locations around the world when actually you don’t have all the levers of power. So maybe that’s a humbling experience for me, but probably something I’m learning quite a lot at the moment about.

Skot Waldron (22:07.00)
And intimidating, like, do you feel that you can make an impact on a 12,000-person organization when you came from a company like GiANT, where there are not 12,000 people, right? I mean, how do you go into a situation like that under the title of Chief Performance Officer? That’s a label there. Like, performance. We want performance, right? How do you address that?

Steve Cockram (22:41.134)
Well, I’m working at it. I think one of the key things is that we recodified a leadership model, because I said to the guys, “look, if it’s not codified, it can’t be multiplied. But here’s the problem. If we codify it but don’t model it and reinforce it in the way we promote and the way we reward people, it won’t live.”

So, I think that was one of the key things, Skot, where from the offsite, I actually said to them, “what are we prepared to commit to? If we codify it, we’re going to have to model it, guys. And if you really want me to come and do this, one of my roles is to make sure that the executive team is the healthiest team in the organization. And you haven’t been. You’re naturally siloed by your personalities. There are INTPs everywhere, creative Pioneers.”

So, I think that’s a big thing. If we’re going to say this is what we’re committing to for the whole, then every leader, every team in Endava is going to carry this DNA piece. That will make a difference, because I know unless things have executive sponsorship, they don’t happen. So, it’s been really interesting for me being on an executive team where, if I ask for anything, everybody drops everything to do it. If I say I’d like to meet someone, they cancel their meetings.

I mean, that makes me feel like, “whoa, am I really that important?” But the corporate world respects authority, almost like the military. They might bitch about it, and they might say, you got the job instead of me. But the reality is that there’s a much greater sense of: we follow. So, can I make a difference? We’ll find out. I genuinely believe that in the era we’re moving into, partnership and relational partnerships with clients are going to be different from the transactions of the previous era. Historically, you might have the two CEOs maybe doing the deal over a game of golf and signing it in the clubhouse afterwards. But then what usually happened is both parties had a henchman or henchwoman whose goal was to make sure we got whatever we needed out of this deal, the margins or whatever it might be.

This is saying: how do the whole team of people on one side of the table and the other actually become one team in their own right? And if you think about it, we spent a lot of time, in GiANT’s work that I know you’ve done quite a bit of, on how we create high-performance teams. But I never actually thought about how you create a high-performance, one-team mentality between two separate companies with competing P&Ls.

So, as a Pioneer, it appealed to my sense of: I’ve not done this before. And I’m not sure anyone else has really done this before, because people haven’t needed to. So, if I look at the partnerships being signed at the moment by GiANT... sorry, by Endava (schizophrenic, forgive me, Skot). Some of these people are signing 5-year deals with us because they want to be partners with us, because they trust us. But if you think at the speed that technology is moving, who knows what’s going to be happening in a year’s time, let alone 5 years’ time.

And the idea is that every person actually has to buy into this partnership model, even when one side needs more of the margin than the other. And I think that for me is an interesting frontier: I get to bring all the things that I am, that we do, and that I’ve got a team for, and actually try to do something that hasn’t been done before. If it works, if we can deliver it, and I’m convinced we can, imagine what it does when you have a reputation for being the partner everyone wants to work with. Not only do we sign a commercial agreement for 5 years, they call it the Bible, which fascinates me, but then after 2 to 3 months of the initial engagement, we take them away on a partnership retreat, where we do deep dives with everyone before they come, so we know how they’re wired. We do synergy and conflict. We have honest conversations: what are your fears coming into this? If this thing doesn’t work, why is it not going to work? What would winning mean for both parties? And we put it all together into a partnership playbook. In the same way they have quarterly business reviews in every big account, we will have a quarterly performance review for the partnership. And then every year we’ll go away together, celebrate what’s been, and agree what’s ahead.

Well, I think that becomes very enchanting, because people go, you actually are committed to partnership at a deep level. Endava has always been incredibly trusted. They’ve never really done marketing, but everybody believes that if there’s a problem, Endava will tell them about it. And that level of trust is probably the most important commodity going into a world where everyone’s going to have to navigate the AI transformation. It doesn’t matter which sphere of culture; you can be in education and go last, but whether you’re in business, in health, in faith, in nonprofits, every single leader and team is going to have to navigate the AI transformation for their sphere of culture. And the best analogy, when I put it on a screen, is a surfer at Nazaré trying to surf a 100-foot wave. And I go, this is what it’s going to feel like. You are not going to be able to swim over the top of it and pretend you’re miles out at sea. You’re either going to surf this wave out the other side, where, you know, they’re waiting with the jet skis, or you’re going to get crushed by it.

That’s the level of disruption we’re dealing with. And the real question then is, if you think back to what Jeremie and I wrote about the Sherpa being the trusted leader, we’re not using the vocabulary, but ultimately what we’re saying is: you’re going to have to find a Sherpa who you believe has made that transition before and is willing to partner with you to make sure you get to the other side. That’s our proposition. And in a sense, what I’m doing with performance dovetails pretty powerfully into this sense of saying, we will be the Sherpa, we will be your trusted partner. Why would you trust us? Well, we are navigating and surfing the 100-foot wave right now. IT services is probably the sector that’s been disrupted first, and most powerfully, by artificial intelligence and all the things that go with agentic AI.

So, what we’re learning is what we’re going to be sharing with, and how we’ll partner with, the people who come with us. And this is where you Americans are both a joy and a curse, Skot. We’ve just won, basically, the role of technology partner to build the alternative to SWIFT for the world. No one is sure now that we can trust America not to turn off all our payments whenever they feel like it. But imagine: of all the technology companies in the world, Endava was chosen to be the architect of the new payments gateway system for the world.

These guys are incredible. The engineers are incredible, but they’re far too humble. You know, it’s the ultimate triumph of substance over style. They’ve never marketed themselves before because they’ve never needed to. So, you know, me telling stories and bragging on other people is part of what I do. But this is one of those ones where, as an entrepreneur, I get to go out and tell the world what we do, knowing that the back end is sensational.

It’s the front end they struggle with, because in the past people just went, you’re so good, can we do some more? And now we’re actually having to be in the room with the CEOs and the global leaders of politics and ask, what’s keeping you awake at night? And how do we help you navigate that journey? AI is creating a way of moving from ideation to value creation at a speed that we’ve never known before. But everyone’s having to learn, and even old dogs like me are having to do AI training every day to make sure that at least I’m AI native at some level. Anyway, I think more as a Mudblood than a pureblood, Skot, but you know what that means if you’re a Harry Potter fan.

Skot Waldron (30:51.00)
Yeah, I mean, I think that as we look at AI and look at what humans were responsible for, the productivity, the task management of producing the thing, AI is now doing that or taking on a bigger role in it. And we are becoming the decision makers, the responsible ones, the values-based people in that world who need to hold on to the humanity of what some of this work actually is and requires. And that dives a lot into how we interact as humans.

I mean, it’s emotional intelligence we talk about. I mean, that’s where you come from; in the GiANT community of coaches and consultants we call you the personality Yoda, right? Just having come from that knowledge base of people’s wiring and how we interact as humans, with each other. Because AI increases our capability. But in leadership, how do you think AI is going to play a role? Is it going to expose leadership weaknesses faster? Or is it going to reshape leadership and how we communicate and lead? I mean, how do you think that’s going to play out?

Steve Cockram (32:11.00)
Gosh, I mean, if I had a crystal ball, Skot, this would be very valuable. Here’s what I’d say: that skills break down under pressure is a universal truth that everyone agrees with. If you’re an elite athlete, it doesn’t matter how elite you are, at some point you become stressed and fatigued and skills break down. I think what this is doing is putting a level of stress and pressure on top of the usual business-as-usual responsibilities, because in every business, every organization, every single sphere of culture right now, all the boards are asking the executive teams, what are we doing with AI? How do we position ourselves?

So, what that’s doing is putting pressure into the relational dynamics of team systems. It’s also, I think, putting pressure on people’s sense of, well, what does this mean for my job? What does this mean for my future? And you will know, mainly because I spent a long time predicting it, what happens to different human beings under different forms of stress, from moderate stress to extreme stress. And I think you’ll watch all of those things played out in many teams, because of the pace of change and the sense of, I don’t really understand this. Most of the people who lead large organizations are, I would say, my age, but the skill sets of the world that is coming are usually owned by those at the younger end of the continuum.

So, there’s fear. You know, you’d be amazed how many mid-50s executives have been consulting their 401k or their investment pot to go, if I got fired today, would I be okay? Fear usually creates behaviors which are not particularly attractive in leaders. So, my view would be to go, you have to know yourself to lead yourself. If you don’t know as a leader where you go under stress and what it looks like and how you deal with that, you’re probably going to struggle. And if you have a team where you don’t understand the dynamics that are going to get manifested as stress and pressure kick in, you’re going to be driven off course when it happens.

So, in a sense, I would go, teams are still the most valuable unit of creating sustainable change. We’re going to have to surf it together. So, I think leadership skills will be tested like never before. And in a sense, the way leaders lead through this probably matters more than they really understand right now. I mean, Lencioni was helpful here. He said, no organization will ever be healthier than the health of its number one team.

So, you can’t outsource this. You can’t outsource dealing with AI. You can’t outsource a culture that’s unhealthy. You actually have to learn to embrace it and lead it. And that usually requires dealing with your own insecurities and fears first. So, the universal truth, if I’m honest, Skot: I just think that AI as that 100-foot wave is creating an adrenaline rush. I mean, I cannot imagine surfing a wave that high; a 20-foot wave would terrify me. But to think we’re going to have to actually ride it or die is stress-inducing for most people, particularly when we go, I’m not really sure I’m a big wave surfer, Steve. Because that’s what it feels like to most people at the moment, trying to think, what does it mean for my job, let alone all the people we feel responsible for as leaders?

I feel acutely responsible. Gosh, we’ve got 12,000 people around the world, some of them in places where Endava has been part of societal change, you know, Moldova and Krishna, places where at certain points we are the largest taxpayer of any company, creating jobs in ways that have been transformational. How do we take that responsibility seriously? I think that’s part of the reason why I love where I am at the moment, because we can’t guarantee everyone will have the same outcome, but we can at least commit to equality of opportunity.

And I think that’s the bit I would say to people in their fear. You have to own your fear. You have to own your insecurities. You have to do the work on you, because there’s nothing more attractive in a crisis than someone who appears not to be reacting badly. I think that may be part of the reason why they like me here, Skot, because in a sense, I’ve got nothing to hide, nothing to prove, nothing to lose, really. You know, sometimes it’s just being a grown-up in the room and choosing to help other people and remain calm under pressure. That will probably be one of the most valuable things leaders can be; it’s probably as much about what we are as what we do at the moment, because the people who are going to solve these problems are probably not around the exact table.

Skot Waldron (37:06.00)
This is true. I mean, they’re out there somewhere. But you said something earlier about, you know, the older generations that are there: they have the experience, they have the wisdom, they have the process, they built the things, generally. And then you have the younger people. I mean, my daughter and my son go to an AI-based high school. It’s centered on it; they use AI all the time to help build out the structures for what they’re doing. And they look at AI and roll their eyes. They’re just kind of like, that’s just what we do. It’s just ingrained into their world.

And as we think about leadership, because the younger individuals, they’re not necessarily leaders yet. They’re still learning the trade. They’re still learning work, and professionalism, and culture, and how to grow in their careers. What do you think is a really important leadership trait that will matter 5, 10 years down the road, when AI is going to be full-blown interfaces all the time? You know, we’re going to have robots working side by side with us, doing all of our stuff like we do now in some instances. What leadership trait is going to matter?

Steve Cockram (38:35.00)
This may seem self-serving, but I actually believe that the ability to establish, maintain, and develop long-term relationships and trust will become more valuable, not less valuable. I think most people have known it’s true. Some have paid lip service to it and hoped their intellect would get them through with the answers. I think that piece of human connection and being someone that people enjoy being with matters. I mean, it won’t be long, by the way, before we question whether anything that appears in the digital space is real. So, I see you on the screen, but is it really Skot Waldron? I’m receiving an email and a video message from you. Is that really you?

There’ll come a time when actually probably we will only really trust when we’re together. So, I think we will actually end up more relationally connected physically than perhaps even post-COVID we thought we’d be. So, what’s the most important business skill? I would say it’s to be somebody who knows how to connect with people, knows how to build influence with people, knows how to be stable and trustworthy as a partner in whatever you’re doing and adding value to those around you. None of that is rocket science, Skot.

I think it’s pretty much the same as what we’ve always done. I’m just saying, I think that relational influence and that network of influence will become more important than ever before, because in a sense, we’re not going to be differentiating based on, you know, my agents will be just as good as your agents. The question is ultimately, who do I trust if it goes wrong? Who do I want to be in the foxhole with? Who do I believe will be a man of integrity? Who will ultimately choose to prefer me over themselves at times in the deal? That’s, I think, historically what most successful business people will tell you was the truth. I just think it’s going to have to happen at far deeper levels, because ultimately we’re never going to be able to compete in the IQ space, even if we ever could.

So, that would be my top tip. Self-awareness is the foundation of others-awareness, of influence, of building relational trust and not blowing it, of being men and women of integrity. We’ve had a period of time, and this may be a bit more controversial, so you can edit this out if you want, when it was like, judge me as a leader on what I achieve, not who I am. And I just think there’s a generational reaction going, whether it was MeToo, or whether it was Epstein, or whatever it is, do you know, I’m not prepared to separate your character from your competencies. You can be the most competent person in the world. Jeffrey Epstein was phenomenally competent. I mean, as a manipulator of people, this guy was right up there. And a lot of people trusted him in a way that they’re probably now regretting, thinking about what he got away with.

So, for me, I would say to leaders as well: I think the younger generation are looking for your competence to match your character. And most people find it easier to be competent than to allow their character to be the thing that people see, because character is why we trust people.

If you think of influence, you know, the influence model is character. Do I trust you? Are you a person of integrity?

Chemistry is, do I connect with you? Do I like being with you?

Competency is, are you competent? Do you have a proven track record of competence? And are you confident in it?

And then credibility. Do you have the ability to understand the uniqueness of my challenge and apply your competency and character to help me solve the problem I have on my side of the aisle? That’s where relational influence and impact come from, and they exist beyond transaction. So, that’s not a new answer from me, but it’s probably one that I would say is even more important now than it’s ever been.

Skot Waldron (42:37.00)
So, is there a specific behavior in there that you think that AI cannot replace?

Steve Cockram (42:44.00)
I mean, yes. I believe that we as human beings were made for relationships and that we find our deeper sense of meaning and purpose inside relational connection. I remember a psychologist friend of mine asking me, Steve, do you know why everyone’s getting so weary doing Zoom calls during lockdown? Now, I know some parts of America never locked down at all, but we did.

And I ended up having to keep Zoom calls to 45 minutes, not back-to-back, because I just got so weary. And he said, “Do you know why?” Because human beings, when they see someone, there’s a part of them which reacts viscerally to the sense of wanting to be with that person. And then what they feel emotionally connects with their intellect, which tells them, yeah, I may be seeing Skot, but I’m not really with Skot, because Skot’s on the other side of the world. And then, he said, there’s something that is kind of pained in us, because we were made to be together, not just to be on a screen. And it really struck me. That’s why we all got fatigue from being on Zoom. Because ultimately, we can manage relationships remotely, but we don’t usually get to establish them that way. And we certainly don’t get to build them without some kind of physical connection of being with each other. That was a really interesting learning for me.

So, you know, you and I have been together over the years, so when we see each other, we connect to the time when we were physically together. But if we just exist on Zoom or on Teams for a period of time, we will become less connected. We actually need the rhythm of physically being with each other, because our body reads something visceral about that connection of humanity. And whether you attach a spiritual or a psychological significance to that, it is just true that human beings need each other. And loneliness, ironically, is an epidemic in our part of the world right now. We’ve never been more connected, but we’ve never been more lonely.

And that’s really, you know, why they are going to ban social media for under-16s. Because in a sense, they’re learning that kids are losing the ability to be with each other. They would rather text each other than even call. I mean, text probably dates me, doesn’t it? You know, Snapchat or whatever it is they’re using. But do you see the difference? They’re losing the ability to do the thing that I believe human beings were made for. And that is something we have to guard against as leaders. Because with remote teams, by the way, if you’re the leader, you need people with you regularly, even if you don’t want it.

Skot Waldron (45:21.00)
Tell me about, so I heard a story recently. There are some new restaurants, I think they’re in New York, they must be in New York, just for what they’re doing. But they’re actually building tables and some structure around you, and you can set up different devices around the table to have interactions with different people, whether it’s a screen or whether it’s AI or whatever.

And so, I’m interested in that. When you’re talking about human connection and our need for it, your argument is that AI won’t be able to replace that human connection. But there are a lot of people using AI for connection, because maybe they are lonely, they feel an element of depression, and they can pour into their AI and it gives back and helps them feel like they’re the best thing in the world.

Because it agrees with everything we say and everything we do and makes us feel like our ideas are a million bucks. Right?

Steve Cockram (46:27.00)
So, if you think of robotics with AI, is it going to end up actually caring for a lot of our old people who are alone? Absolutely. So, the difference between what is good and what is best matters to me. If AI is able to become a companion that cares for people and gives a context for the loneliness piece to be dealt with, part of me goes, yes, I get it. But if that’s what we hold up as best, we’re going to miss something fundamental. Because I truly believe it works in the same way that the Zoom connection feels amazing when I see you, and then I feel a loss when I realize you’re not really there.

I think that will still apply to anything that AI does. Because there’s the variable in human connection, which is that people won’t always agree with you. People will have had a bad day. People are dealing with their own sense of opportunities and beliefs. And it’s the infinite dynamic variation that I just don’t believe you can replicate. I mean, you can program it, but I still don’t believe it will be the same. So, am I going to be glad for robotics and companions? And, I don’t know about you, but I turned off social media for January as a fast, apart from LinkedIn. I’ve not turned it back on, because I don’t need an AI companion who tells me she can meet my every need and will never say no, which seems to be on every piece of feed. And I’m going, I’m sure it’s incredibly attractive, the thought of having somebody who is willing to basically meet all my needs. But I’m not sure that’s necessarily the best thing for human beings to interact with.

I actually quite like the fact that I have to deal with my selfishness, the fact that I want my own way. And I would say for most men, as you grow up, you go through stages of learning to deal with how selfish you are. You know, when you get married, in my case, that was a moment when not everything was dependent upon what I wanted. Throw in children, that’s another variable. Then you have caring for parents who need you in older age. And the ultimate one is having people live with you who are not part of your initial culture. All of them challenge our intrinsic selfishness and individualism in a way that I don’t believe AI ever will, because AI is programmed to meet my needs.

Rather than that, I think for a lot of us, it’s more blessed to give than to receive. Our character gets formed in those moments when we have to make choices that don’t always benefit us in the initial sense of, well, what would my pleasure choice be? Or what would my need be? No, I’m going to choose to drive my kids to a game again, or I’m going to go and visit my elderly parents rather than play golf, or I’m going to choose to do X, whatever it might be.

So, I think it’s in the choice of sacrifice that we often develop the character which makes us attractive for people to be with. I mean, can they program AI to be like that? Probably. But I don’t think that’s what human beings want in their AI companion. They almost want someone who reinforces what they want to hear or meets their individual needs. I mean, I probably sound like I’m speaking from a bygone generation, but I still intrinsically believe there is something about human interaction and connection which forms us in adversity, Skot, that I’m not convinced you get with what AI is promising to be for me or for other people.

Skot Waldron (50:02.00)
And, you know, we talk a lot about emotional intelligence in our culture world and our leadership and what we try to do. And I love how you said that idea of self-awareness being the foundation of others-awareness, which is emotional intelligence and how we can read other people. And we talk about artificial intelligence. Do you believe that there is, I don’t even know what this means, I just thought of it while we were talking, artificial emotional intelligence? Does that exist?

Steve Cockram (50:34.00)
I mean, I think the fact we use “artificial” is quite helpful, because it’s not real. So, even emotional intelligence, I think, is easier to learn than relational intelligence. Relational intelligence is probably the phrase I’d use to go, how do I build relationships with human beings who are also dealing with their own stuff on a daily basis?

So, I always say to leaders, guys, if you think leadership and developing other people is easy, you’ve never really done it. Because I’m a variable, you’re a variable, and on any given day, different dynamics are going on in our nurture and our choices.

So, learning how to calibrate support and challenge for people is so difficult. How do you train an AI to replicate the variables of the ups and downs of being a human being? Can you teach them what it means to go through loss, or to experience what it means to have cancer, or to be afraid, all the things which go with the territory of being a human being? I’m not sure we’re ever going to believe that a machine feels that sense of loss or that sense of joy, the sense of a newborn baby versus the death of someone we loved or a terminal diagnosis, and what that does to us as human beings. Even if they know the words, I just don’t believe they have the relational intelligence for us to believe it’s real, because that’s not what they’re dealing with. That’s an initial response. I don’t know what you think about that.

Skot Waldron (52:14.00)
Well, I mean, I think I’m on board with you a bit. So, there’s a Johnny Depp movie. I forgot what it was called. Do you remember what it was called? It was where his intelligence, his consciousness, was put into this computer. Of course, it was a Johnny Depp movie, right? He was working on this with his wife, and he died, he got some illness, but they took his consciousness, his brain, his soul, I guess, and put it into this machine. And she was able to interact with it and hear his voice. And at some point, it got so smart that it created a physical manifestation of him. His wife could not discern between the real him and this artificial him, and it became this connection.

Now, you talk about this idea that right now AI can’t know what it’s like to have a newborn baby or to experience death or cancer or whatnot. But do you think at some point we will be able to take the consciousness of a human and put it into a machine where it will be able to kind of understand what that feels like?

Steve Cockram (53:46.00)
I think we can train them to understand anything. And the one thing they’re good at is language. So, can they say the right words? Yeah. Will it feel the same engaging and interacting with it? You know, I’m probably talking about the difference not between bad and good, but between what is good and what is best. So, for an elderly person who misses their partner of 50 years, to have someone who looks like them, sounds like them, and cares for them, I mean, go for it. It might get 80% of the way there. But the question is, will it ever feel the same as the person they knew was another human being? I don’t know.

I mean, I think you can get close, but I’m not convinced. You know, if I lost Helen, would I like a permanent reminder of her, something that thought like her and cared like her? Absolutely. But I’m not sure it would feel the same. And that’s where, I guess, I come from a different set of views: the uniqueness and the sanctity of humanity as the pinnacle of creation, and that somehow we were made for connection with the one who made us and those around us.

There have been a lot of films like that over the years. Weird Science was a particular favorite of mine as a 15-year-old, and the thought of being able to produce a Kelly LeBrock, or whatever it was, that would be whatever I needed is deeply attractive. And I always say to people, if you think historically, the two things that have always driven technological innovation are warfare and porn. They were the only things that made money on the internet for a long time.

So, I can see all kinds of ways that people are going to monetize the combination of these two things. Is it going to be helpful for humanity? I don’t know. If people end up dependent upon a relationship with a robot that has the personality and the wiring and the ability, is that actually going to be good for us? I don’t know, if it stops us getting out and actually being real and vulnerable with people who challenge us to be selfless, not selfish. I think that’s the bit for me. It’s like, who’s going to hang out with a selfish robot?

Skot Waldron (56:15.00)
I don’t know. It’s caused me to think a bit. OK. Because what if, right now, I told everybody that’s listening that this entire conversation was AI-generated? What would it do to their soul? How would they feel? Would they feel betrayed, or would they be like, that’s cool? I think that is what we’re missing, what we’ll lose when we won’t know the difference between what’s real and what’s fake. I think for people right now, especially the generations that are bridging that gap between reality and AI, when we read text, read books, knowing it was written by a human feels different for us.

Now, will it be that way in 20 years? Will they care? I mean, I don’t know. But that humanity, that connection, the feeling of, that was written by AI, there’s something there that to me feels like it’s lost. Like, ugh, it feels a little lost.

Steve Cockram (57:29.00)
I mean, if people knew that we were just, you know, avatars, and the AI made it up. It’s a really good question. I mean, it depends what you’re trying to do. If you’re trying to learn something, then I guess the source of learning doesn’t matter. But if the reason you’re choosing to learn is because, in a sense, you’re choosing to connect with the heart of the person that lies behind that learning, with all the brokenness and all the travails that come with that, I think it would lose something.

And I think that goes back to the bit I said at the beginning, that there’s going to come a time when people question what is real. Because, you know, I finally stopped getting sent for more phishing training. We’re obviously monitoring, we’re working with some incredibly sensitive information, and they always say that basically the human beings are the weak points in security. They’re constantly sending out these cyber tests to see if you click on things. You know, like, hey guys, they’re going to do an app now for entrance into the building, you need to click here to book your slot to come down and get your... and it looks so real.

And so, I’ve finally become more cynical about everything. Anything that’s in quarantine, I never open. Anything that looks too good to be true, I never touch. I’m learning to check the emails, make sure they’re legit. But here’s the difference: we’re going to get to a point where the question is what is real, unless I can hug you and I’m physically with you.

At some point I’m going to always question what is real. Because, to your point, we could easily be two avatars. But if I’m in the room with you and I can see you and I can touch you, I feel like I have a greater sense of connection. Will that mean something? I think it will. And I think that’s, you know, a reason not to give up being together. It’s interesting, back in London there are more people in offices now than before COVID, because we missed each other. And I think that’s probably encouraging to me as a human being, even though I quite like the efficiency of not having to commute every day.

Skot Waldron (59:36.00)
Tell some of your English mates to come over and buy some of our empty buildings because we have plenty of them over here.

Steve Cockram (59:44.00)
Well, the trouble is you have space. So, remember, you know, we’re dealing with one city. There are three million people working in the square mile where I am right now, and the infrastructure has been built over the years to sustain that. But we’ve redesigned the spaces here. So, I’ve got my own cafe bar on the 13th floor, which, you know, people who follow me on LinkedIn often take pictures of and go, that’s my office up there.

And I go, if you ever want to come in and have a coffee with me, come on a Tuesday, because it’s pastries, or sweet treats on a Thursday. And we’ve got a third space that people come and work from or sit in. But it’s actually built that way because people don’t tend to come in anymore and just sit in cubicles and get on with their work. If we’re coming in to be together, how do we collaborate in that space? How do we actually recognize that there’s something very special happening? Because not everyone’s coming in every day anymore. Well, unless you’re trying to get rid of people, and then you just say it’s five days a week or you leave, which, by the way, is a very cheap way of culling your workforce and getting rid of people without having to pay redundancy in the UK. But that’s what it is.

So, for me, I’m an optimist, Skot. I always get accused of being far too optimistic. So, the utopian or dystopian world in which we’ll live, I mean, who knows? But what I do know is that humanity is humanity is humanity. And when I read the stories of thousands of years ago, they still resonate deeply with who I am.

And I still believe we were made to function as human beings in extended families, not necessarily by blood, but we exist most effectively as human beings in extended families. That’s the best context for raising children. And it’s a place that challenges our selfishness and our individualism. And that’s probably the zeitgeist of our age, that we really are very individualistic. And we are very much, here’s my truth, what’s yours? You can have your truth, as long as you don’t impose it on me.

That, I believe, is almost counter-cultural to the way we were made, which is why we’re seeing so much brokenness.

Skot Waldron (01:01:44.00)
Now we’ve been down too many rabbit holes. I don’t know if I hosted this show very well, but it’s been fun. I mean, man, it’s just fun to hear. You’re just different in the way that you share your ideas and the way that you’ve gained your perspective. You’ve interacted with so many, I mean, thousands of leaders over the course of your career with GiANT and now Endava, and the wisdom you’ve gained from that is really interesting, because I can hear pieces of it coming from so much experience.

Steve Cockram (01:02:22.00)
You’re very kind to me. I always say that when you get to my age, you usually need people to ask questions to access the unconscious competence. But here’s the difference, Skot: I don’t know all the answers for right now. If you listen, what I’m really doing is trying to pull from the things I believe to be intrinsically true and then extrapolate that into how I deal with a reality which is changing more rapidly than any of us have ever known.

So, I’m always trying to anchor the truth in a bigger truth, which doesn’t make me postmodern, by the way. For me, I find a great deal of comfort in knowing that I sit inside a bigger story, rather than it being completely random chaos where we’re all heading towards some dystopia. You know, human beings have an incredible capacity for good and an incredible capacity for evil. Human history teaches us that.

So, I don’t think you can disinvent technology once it’s been created. The moment they created the nuclear bomb, you couldn’t uninvent that stuff. We’ve had to manage how we try to navigate the complexities of that, from the Cuban Missile Crisis to Iran today. I think the same will apply with AI. What has the capacity for great good can also be used for great harm.

Facial recognition and gait recognition technology mean that Big Brother is always there now. So, you know, I’m glad I don’t live in parts of the world where the ability to protest is now gone, because they know who you are whether you’ve got a mask on or not. I don’t think I’d want to live in that country, but that’s AI. If you think of the technology that allows people to do that, it’s incredible, but that doesn’t necessarily mean I want to live with it.

So, somehow humanity always finds a way, hopefully, to rise and collaborate for our mutual benefit. But the liberal idea that somehow we’re all moving towards a global democracy where we all love each other and, you know, share the resources more evenly? I mean, where we are right now, the world is more frightening and more polarized and, in a sense, more fearful for people. AI probably will help with some of it, and by the look of it, it will also cause great disparity between those who have and those who have not.

So, you know, we’ll have to come back in a couple of years and see what happens. A hundred-foot wave is a hundred-foot wave. Learning to surf it and navigate it is not a simple thing to do. Anyone who says this is easy has not really grasped it yet.

Skot Waldron (01:05:04.00)
The optimist in you, though, believes that good will triumph. The good in people will triumph.

Steve Cockram (01:05:13.00)
Well, I think my confidence is probably more in the one who created all things. In a sense, there are times when I look at it and go, this is just a shit show. But ultimately, that’s what I’m saying: if you believe there is a purpose behind humanity and creation, that it is not some random collection of atoms and molecules colliding, then there is a creator, and there is purpose.

And we ultimately find our place and security based on who we are and our identity. It’s really hard to navigate this world right now if you have nothing other than, you know, "I’m going to make of it what I will." So, that’s probably the most important thing for me: going, I’m not God, which is going to come as a huge surprise to most people... not. I am so finite, and I don’t understand everything.

But that doesn’t mean I haven’t found a place of peace in it. And I think that’s the bit I always come back to: I don’t have to have all the answers, but I quite enjoy the exploration of trying to work things out, like I did with all the personality stuff. How are people unique and different, and how predictable does that become, so I can actually help people? I think the same applies to AI: if we learn how to harness it for good, and we never forget that our humanity is core to who we were made to be, then we will be more likely to succeed than if we believe it’s always about me as an individual getting what I need. Instead, how do I choose to use whatever spare capacity of time, talent, or treasure I have? How can I do that for a purpose? And how do I recognize that my character grows more in sacrifice than it does in pleasure?

Individualism often leads to selfishness, which leads us to a choice that says, I will always choose what is most comfortable and most pleasurable for me. But that’s not usually the leadership that people remember.

Skot Waldron (01:07:13.00)
It definitely is not. Well, they remember it in a bad way. They remember it in the sense of, most of the time it comes out as, "I know how I’m not going to be when I become a leader," right? It’s those people, right? And that’s what we don’t want to see. I’m just going to tell you, man, I will take that clip of you admitting you’re not God, and I’m going to send it to Helen. Just that clip. I’m sure that’s also probably here.

Steve Cockram (01:07:40.00)
She’ll probably laugh for an hour. Thank you for that, Skot. That was really helpful. It’s great to connect, Skot.

Skot Waldron (01:07:45.00)
It’s been so, so good catching up with you, talking to you, and just hearing the thoughts you share. It’s been fun, man. I really appreciate it. I know you’ve done some good here for us today, and good luck over there at Endava with the impact you’re looking to make. I appreciate you.

Steve Cockram (01:08:08.00)
Thank you, Skot.

Skot Waldron (01:08:13.00)
Are you feeling some pressure? Are you feeling like you see a giant 100-foot wave coming at you and wondering what to do about it? Wondering if you are able to surf a 100-foot wave, or how you’re going to navigate it when it crashes? We’re all going through this right now, trying to navigate it, plan for it, understand it. It’s something maybe we haven’t ever encountered before in this way.

I will say technology is always advancing. Many of us have endured technological advances and change in our culture and in our workplace, and this is yet another one. Now, this is a huge one, and I believe it will in fact define a generation. But how we move through it well is really important. And I think the pressure is what’s really important to think about: how our fear is manifesting itself in our behaviors as leaders, and what we’re doing about that.

AI is putting pressure on relational dynamics and how we interact with each other as it introduces itself into our workplaces, our workflows, and our products and services. That’s a big deal. Connection with people is ultimately what Steve was preaching. Our humanity, understanding that when I can be in a room with you, touch you, and hug you, that is a manifestation of your humanity and reality to me. When we can start to really understand that and embrace it, I think we’ll hopefully create more opportunities for that engagement with each other. That ability to, I’d say, up-level our empathy, our understanding, our curiosity, and our hope in each other and in humanity as a whole.

If you want to find out more information about me, or check out the show notes with links to the things referenced in this episode, visit skotwaldron.com. And lastly, I’m asking for a little bit of love, just a little bit. So please take a moment to follow and rate the show. The algorithm likes that; it helps me get the word out. I really appreciate it.

Thank you. And until next time, stay Unlocked.