Best of LinkedIn: World Economic Forum Annual Meeting 2026
Show notes
We curate the most relevant posts about Strategy, M&A & Investments on LinkedIn and regularly share key takeaways.
This edition offers an extensive overview of the World Economic Forum 2026 in Davos, which focuses on a central theme of purposeful dialogue to address global instability and technological shifts. High-level discussions emphasize moving Artificial Intelligence from conceptual hype to operational execution, with a strong focus on responsible governance, human-centric systems, and workforce upskilling. Business and political leaders are also prioritising geopolitical resilience, climate action, and the urgent need for sustainable energy and food system transformations. Economic optimism remains high due to global market resilience, yet participants highlight the necessity of collaborative leadership to navigate trade fragmentation and social inequality. Ultimately, the gathering serves as a platform for interdisciplinary partnerships aimed at fostering long-term prosperity through transparency and accountability.
This podcast was created via Google Notebook LM.
Show transcript
00:00:00: provided by Thomas Allgaier and Frennis, based on the most relevant LinkedIn posts about WEF annual meeting, twenty twenty six.
00:00:07: Frennis specializes in B to B market research for strategy and consulting teams with a focus on tech
00:00:13: and ICT.
00:00:14: Welcome back to the deep dive.
00:00:16: Today, we are heading up the mountain.
00:00:18: We're going to strip back all the noise and really look at the strategy and consulting trends coming out of Davos, twenty twenty six.
00:00:24: And before we dive into the big topics and believe me, there are some heavy ones, we should probably set the scene a little.
00:00:30: Yeah,
00:00:30: the vibe was...
00:00:32: different.
00:00:32: It really was.
00:00:33: If you're picturing that snowy postcard, well, Joan Guy called it a no snow Davos.
00:00:38: It was literally warm.
00:00:40: And that feels like a pretty good metaphor for the mood in the room.
00:00:42: It does.
00:00:43: We went through, I mean, tons of posts from people on the ground, consultants, CEOs, you name it.
00:00:48: And the feeling wasn't optimistic or pessimistic.
00:00:50: It was just pragmatic.
00:00:52: Right.
00:00:52: The official theme was a spirit of dialogue.
00:00:55: But the real subtext, I mean, the thing people were actually talking about was execution and resilience.
00:00:59: It feels like the age of the big visionary keynote is sort of dead.
00:01:04: Nobody's there for the promise of twenty thirty anymore.
00:01:07: They're just trying to get through twenty twenty six.
00:01:09: Exactly.
00:01:09: The patience for fluff has just evaporated.
00:01:13: With all the geopolitical tension, which we'll get to and the economic pressures, the whole conversation has moved from what if to how now.
00:01:21: It's about operational reality.
00:01:23: Okay,
00:01:23: so we've got a lot to cover.
00:01:25: We've broken it down into four main themes.
00:01:27: We'll hit AI shift from hype to action, the very tense global landscape, the physical cost of all this tech, and finally, how leadership has to change.
00:01:36: And we have to start with AI.
00:01:38: It's the obvious one.
00:01:39: But this isn't the same AI talk from a couple years ago.
00:01:41: The hype cycle is definitely over.
00:01:43: This is theme one.
00:01:45: From hype
00:01:45: to agentic execution.
00:01:47: And that word, agentic, doing a lot of work.
00:01:50: For anyone listening who's hearing that buzzword in every pitch deck, what does it actually mean here?
00:01:54: Well,
00:01:54: think about it like this.
00:01:55: We used to have chatbots.
00:01:56: You ask, it answers, it's passive.
00:01:58: It waits for you.
00:01:59: Right.
00:02:00: Agentic AI has agency.
00:02:02: It can take action, it can execute tasks, it can complete workflows all on its own.
00:02:07: It's the difference between, you know, a tool and an employee.
00:02:11: And Vivian Way had a really... sharp take on this.
00:02:14: She said the big story isn't large language models anymore.
00:02:17: In fact, she pretty much called them commodities.
00:02:20: Which is a huge market shift.
00:02:21: A few years ago, the model was the product.
00:02:24: Now it's a utility like electricity.
00:02:26: Way's point is:
00:02:27: the real story is AI operationalization, embedding it into the actual workflow
00:02:33: so it does something.
00:02:34: She gave a great example that really makes it click.
00:02:36: She talked about EVA, the WEF's own AI concierge.
00:02:40: Right, and this wasn't just some FAQ bot.
00:02:43: EVA handled seventy-five percent of attendee requests, autonomously.
00:02:47: And we're not talking simple questions.
00:02:49: We're talking about rescheduling meetings between high-level people, coordinating calendars, delivering briefs.
00:02:54: That's the agentic part.
00:02:55: It didn't just tell you the meeting moved, it moved it.
00:02:57: It went into the calendar, found a new slot, checked the other person, and confirmed it.
00:03:02: That's a live system at scale.
00:03:03: Precisely.
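That reschedule loop, find a slot, book it on both calendars, confirm, can be sketched in a few lines. This is purely illustrative: the Calendar class, the working-hours range, and the escalation path are my assumptions, not EVA's actual API or logic.

```python
from dataclasses import dataclass, field


@dataclass
class Calendar:
    busy: set = field(default_factory=set)  # working hours (9-17) already booked


def reschedule(a: Calendar, b: Calendar, old_hour: int):
    """Agentic step: don't just report the conflict, find and book a new slot."""
    for h in range(9, 18):
        if h != old_hour and h not in a.busy and h not in b.busy:
            # Book the new slot on both calendars, then release the old one.
            a.busy.add(h)
            b.busy.add(h)
            a.busy.discard(old_hour)
            b.busy.discard(old_hour)
            return h  # confirmed for both attendees
    return None  # no shared slot: escalate to the human "agent manager"


host, guest = Calendar({9, 10, 11}), Calendar({9, 13})
print(reschedule(host, guest, old_hour=10))  # -> 12, first hour free for both
```

The `return None` branch is the part Oakley's governance point is about: the exceptions the agent can't resolve are exactly what the human supervisor exists to handle.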
00:03:05: But, and this is a critical but that came up, just because the tech can do it doesn't mean companies are ready.
00:03:11: Dr.
00:03:12: Florian Mueller raised a really interesting flag about what he calls the micro productivity trap.
00:03:18: This concept really stood out to me.
00:03:20: It's the idea that we're using this revolutionary tech to do the same old things.
00:03:25: just a little bit faster.
00:03:26: It's the classic mistake.
00:03:28: Mueller says most companies are patting themselves on the back because AI writes their emails twenty percent faster.
00:03:36: That's micro productivity.
00:03:36: If your process is broken, doing it faster just means you fail faster.
00:03:39: You're just rearranging the deck chairs on the Titanic.
00:03:42: While your competitor is building a speedboat, that's the trap.
00:03:45: You feel productive, but you aren't creating new value.
00:03:48: Mihir Shuklu was on the same page, right?
00:03:50: He said, we need a real commitment to applied AI.
00:03:53: Yeah, his point was that if you're still just running pilots in twenty twenty six, you're already behind.
00:03:57: The sandbox phase is over.
00:03:59: It's time to redesign entire value chains around these agents.
00:04:03: OK, but this raises the massive workforce question.
00:04:06: If EVA is doing seventy five percent of the work of a coordinator.
00:04:10: What happens to the coordinator?
00:04:11: The
00:04:11: rules are just changing, fundamentally.
00:04:14: Gina Varju Brewer from SAP framed it as human AI power couples.
00:04:19: Sounds a little like a reality show, but I see the point.
00:04:21: It's functional.
00:04:22: It's about AI handling the speed and scale and humans providing judgment and context.
00:04:28: Vivian Way actually gave this new job a name, Agent Managers.
00:04:31: Agent Managers.
00:04:32: So your job is to manage the bot that does the job.
00:04:35: Basically,
00:04:36: yeah.
00:04:36: You supervise, you check the output, you handle the exceptions.
00:04:39: But this creates a huge new problem, which Helen Oakley flagged.
00:04:43: The governance
00:04:44: gap.
00:04:44: A massive gap.
00:04:45: Oakley said confidence in AI is high, you know, seventy to eighty percent of execs get it.
00:04:50: But there is zero clarity on ownership.
00:04:52: If an agent makes a bad trade or denies a loan,
00:04:54: who's responsible?
00:04:55: Is it the manager?
00:04:57: The CIO?
00:04:58: The vendor?
00:04:58: Nobody knows.
00:04:59: And Oakley's point is you can't scale what you can't govern.
00:05:02: If you don't know who goes to jail when it goes wrong, you can't roll it out.
00:05:05: It becomes a legal problem, not a tech one.
00:05:07: And you definitely can't roll it out if you're feeding it bad data.
00:05:12: Patrick O'Pongrats threw some cold water on a lot of these AI strategies.
00:05:16: Necessary
00:05:17: cold water, I think.
00:05:18: He pointed out that so many strategies fail because they're built on batch data.
00:05:23: Which, for anyone not deep in tech, just means old data.
00:05:26: From yesterday or last week.
00:05:27: Exactly.
00:05:28: And Pongrats used a brilliant analogy.
00:05:30: He said running an AI agent on batch data is like driving a Formula One car with a map from last Tuesday.
00:05:36: You're going two hundred miles an hour.
00:05:38: but you're steering based on where the curve used to be.
00:05:40: A recipe for a crash.
00:05:41: You need real time data.
00:05:43: Avinash Vashistha took it even further.
00:05:45: He said, we're in the era of the AI native enterprise.
00:05:47: He came up with a new metric too, intelligence velocity.
00:05:50: Which
00:05:51: I guarantee you is going to be in every consulting deck by next quarter.
00:05:54: The old model was cost per seat.
00:05:56: The new one is intelligence velocity.
00:05:58: How fast can you turn data into a decision?
00:06:00: Speed is everything.
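One hedged way to make "how fast can you turn data into a decision" concrete: measure the lag from data arrival to decision and invert it. The formula and the function name below are my illustration of the idea, not Vashistha's actual definition of the metric.

```python
from statistics import median


def intelligence_velocity(events):
    """events: (data_arrival_ts, decision_ts) pairs in seconds.
    Returns decisions per hour implied by the median data-to-decision lag."""
    lag = median(decision - arrival for arrival, decision in events)
    return 3600.0 / lag if lag > 0 else float("inf")


batch_team = [(0, 86_400), (0, 90_000)]       # decides a day after the data lands
realtime_team = [(0, 60), (0, 120), (0, 90)]  # decides within minutes

print(intelligence_velocity(batch_team))     # well under one decision per hour
print(intelligence_velocity(realtime_team))  # 40.0 decisions per hour
```

Under this framing, Pongrats's "map from last Tuesday" problem shows up directly as a huge lag term, and therefore a near-zero velocity.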
00:06:02: So that's the internal picture.
00:06:03: Agentic AI, a need for speed.
00:06:06: a governance mess, but this is all happening while outside the world is getting, well, pretty aggressive.
00:06:13: Which is our theme two, the geopolitical landscape.
00:06:16: The Global Risks Report literally calls this the age of competition.
00:06:20: And you could feel it in Davos.
00:06:22: Yeah.
00:06:22: Klaus Schmeinsberg used the word angst to describe the mood.
00:06:26: Angst is perfect.
00:06:27: Yeah.
00:06:27: There was real anxiety about the US president's arrival, and this wasn't just about diplomacy, it was about the business environment.
00:06:34: John Stackhouse noted the US is now actively shaping the agenda, not just participating.
00:06:39: And the pivot is away from stakeholder capitalism.
00:06:42: And toward aggressive growth and AI-driven disruption.
00:06:46: The gloves are off.
00:06:47: Ornick L, who goes by Mimee, pointed out that economic power is now just being used as a weapon.
00:06:52: It's not just about tariffs anymore.
00:06:54: It's about fragmenting trade using market access as a political tool.
00:06:59: Julio Romo connected this to corporate finance in a fascinating way.
00:07:03: He said, reputation is now a cost of capital.
00:07:06: That's a huge insight for any CFO listening.
00:07:08: He's saying your geopolitical position directly affects your borrowing costs.
00:07:12: Market access is political.
00:07:13: If you're seen on the wrong side of a dispute, your risk premium shoots up.
00:07:16: So you can't just look at the P&L.
00:07:18: You have to look at the map.
00:07:19: Pierre-Marie called it building a geopolitical muscle.
00:07:23: He says companies need to embed this kind of risk analysis into their core strategy.
00:07:27: It's not a PR issue anymore.
00:07:29: It's a survival skill.
00:07:31: But there's a more invisible layer to this competition.
00:07:34: Angelika Sher Regina and Tyler S brought up securing cognitive space.
00:07:40: This is the really scary stuff.
00:07:41: Tyler S noted that cyber insecurity is number six on the global risks list, but it's not just about hacking a power grid anymore.
00:07:49: It's about hacking the public conversation.
00:07:51: AI enabled misinformation.
00:07:53: At a scale, humans just can't counter.
00:07:56: Sherry Jean's point is we have to defend against it.
00:07:58: It's a new domain of warfare.
00:08:00: If you can destabilize a country by flooding it with lies, you don't need missiles.
00:08:04: It's terrifying.
00:08:05: AI agents running our companies and hostile AI is hacking our reality.
00:08:09: And this is the big but.
00:08:11: None of this digital world exists without the physical world.
00:08:14: And that brings us to theme three.
00:08:16: Theme three.
00:08:18: Energy, industry, and planetary boundaries.
00:08:21: The cloud isn't actually in the sky, is it?
00:08:23: It's in a data center.
00:08:24: A
00:08:24: very, very hungry data center.
00:08:26: Benoit Bego had some sobering numbers.
00:08:28: We have over eleven thousand of them now.
00:08:32: Drawing more than sixty gigawatts of power.
00:08:34: and the water use.
00:08:35: That's what got me.
00:08:36: He said semiconductor manufacturing uses over a trillion liters of fresh water a year.
00:08:41: One trillion liters.
00:08:44: In a world with water shortages, this is the collision.
00:08:47: You can't have infinite digital growth on a finite planet.
00:08:51: The physics wins eventually.
00:08:52: We'll eat.
00:08:53: cheetah connected this to the grid.
00:08:55: He said electricity demand in the Middle East and Africa is set to triple.
00:08:59: And the grids just aren't ready.
00:09:01: He's saying we have to use AI to make them more efficient, but that takes huge investment.
00:09:04: It's a chicken and egg problem.
00:09:06: And Kim Hedegard pointed to the money bottleneck.
00:09:08: He said we have to de-risk first-of-a-kind projects.
00:09:12: You hear that term a lot in heavy industry.
00:09:15: First of a kind.
00:09:16: Banks hate it.
00:09:17: Too much risk.
00:09:18: Hedegard is saying we need to make these big green infrastructure projects bankable, turn dialogue into delivery.
00:09:24: It feels
00:09:24: like the only way out is collaboration.
00:09:26: Michael Camerilango had a really nice line for this.
00:09:28: Great places don't compete, they complete.
00:09:30: Complete.
00:09:31: I like that.
00:09:32: Like organs in a body.
00:09:33: The heart doesn't compete with the lungs.
00:09:35: He's arguing cities and industries need to integrate.
00:09:38: Waste heat from a data center warms the local houses, that kind of thing.
00:09:42: But
00:09:42: Anne Chevrier pointed out a bitter irony here.
00:09:45: She did.
00:09:46: It's the paradox of our time.
00:09:48: Innovation cooperation is rising.
00:09:50: We're sharing open source AI.
00:09:52: But peace and security cooperation is collapsing.
00:09:56: We're building a shared digital brain while we build walls around our countries.
00:10:00: That's
00:10:00: the tension right there.
00:10:02: Technically integrated, politically disintegrated.
00:10:04: Okay, so let's pull this together.
00:10:06: We've got AI that demands a new way of working, a world that demands a new way of strategizing, and a planet that demands a new way of building. Who on Earth can lead through all this?
00:10:16: And that's our final theme.
00:10:18: Theme four.
00:10:19: leadership and the human element.
00:10:20: The consensus seems to be that the hero CEO model is broken.
00:10:23: Neha Cabra had a great analogy.
00:10:25: She said we should think of modern CEOs less like geniuses and more like elite athletes.
00:10:30: It makes perfect sense.
00:10:31: It's not just about talent on game day.
00:10:33: It's about training cycles, nutrition, and, you know, recovery.
00:10:37: And recovery is the part corporate culture always ignores.
00:10:41: We celebrate burnout.
00:10:42: Cabra says that's dangerous.
00:10:44: She talks about decision hygiene.
00:10:46: An exhausted leader is a noisy leader.
00:10:48: They over control.
00:10:49: They create panic.
00:10:50: You don't need intensity in a crisis.
00:10:52: You need endurance.
00:10:53: You also need decency.
00:10:55: Shelly Zales talked about FQ, the female quotient, but she had another one.
00:10:59: DQ.
00:11:00: The decency quotient.
00:11:02: We've got IQ, EQ, now DQ.
00:11:03: Is that just another buzzword?
00:11:05: I don't think so.
00:11:06: Zales argues decency and resilience are strategic assets.
00:11:10: In a world this anxious, a leader without decency just won't keep their best people.
00:11:15: It's a retention tool.
00:11:16: And Rachel Kednack added a critical point about who is actually building all this stuff.
00:11:20: She warned that the rooms where AI is built don't look like the world that AI affects.
00:11:25: Which
00:11:25: goes right back to risk.
00:11:27: If you have a homogenous team building your AI, you are baking their blind spots right into the code.
00:11:32: Diversity isn't just nice to have, it's a quality control mechanism.
00:11:36: So for someone listening, maybe not a CEO yet, what's the practical advice to navigate all this?
00:11:40: Triennel Chef had the most practical, unglamorous tip of all.
00:11:45: Relentless follow-up.
00:11:46: It's not sexy, is it?
00:11:47: But she says it's what separates the talkers from the doers.
00:11:51: Execution compounds.
00:11:53: In a world full of big ideas, the person who actually sends the email and closes the loop is the one who gets things done.
00:11:59: We've all felt that.
00:12:00: It's true.
00:12:01: And
00:12:01: finally, Alina Bull completely flipped the script on networking.
00:12:05: We always think, what can I get?
00:12:07: Right.
00:12:08: Who do I need to meet in this room?
00:12:09: Exactly.
00:12:10: Bull says, stop that.
00:12:11: Ask, what do I bring to this room?
00:12:13: Focus on contribution.
00:12:14: That's a much healthier and... Frankly, a more powerful way to operate.
00:12:17: It is.
00:12:18: And in a world where an AI can get you any information, your unique human contribution is the only thing left that has real value.
00:12:26: OK, so let's try to recap.
00:12:27: We've gone from the what of AI to the messy operational how.
00:12:32: We've mapped a world where economics is a weapon and our attention is a battleground.
00:12:37: We've hit the physical limits of our digital dreams.
00:12:39: And we've learned that leadership is now an endurance sport that requires decency.
00:12:44: It's a lot.
00:12:45: But if I had to boil it down, it's this.
00:12:48: Reality is back.
00:12:49: The years of hype are over.
00:12:51: Twenty-twenty-six is about the hard, granular work of making things work in a world that's breaking apart.
00:12:57: Which leaves me with one final thought for you to chew on.
00:13:00: We talked about agent managers and human AI power couples.
00:13:04: But if an AI like EVA is already handling seventy-five percent of your tasks,
00:13:09: I think I see where you're going with this.
00:13:10: Are we just training the AI to do our old jobs?
00:13:13: Or are we being forced to evolve into the kind of strategic human leaders that an AI simply can't replace?
00:13:20: Because if your job is just the tasks, the clock is ticking very, very loudly.
00:13:24: That's the question that should drive everything we do from here on out.
00:13:27: If you enjoyed this episode, new episodes drop every two weeks.
00:13:30: Also check out our other editions on private equity, venture capital, strategy and consulting and M&A.
00:13:36: Thanks for listening and don't forget to subscribe.