Maintaining Coherence: How to Anchor Your Organization in the AI Shift

I've been thinking about this problem for seventeen years.

The earliest version of it arrived at Palantir, in the years when the company was still figuring out what it was. The animating idea — the thing that ran underneath the products, the culture, the arguments about what actually mattered — was the intersection. Not what machines could do. Not what humans could do. What happened at the place where they met.

That idea turned out to be right. It is more consequential now than anyone in those rooms imagined it would be.


The wrong side to focus on

Most of the conversation about AI focuses on the machine side. Capability timelines. Model benchmarks. What AI can do today and what it will be able to do by the end of next year. These are real questions — the technology is genuinely remarkable and the velocity of change is genuinely disorienting.

But they're the wrong entry point.

The more important question — the one that determines whether any of this actually changes anything for a specific organization — is what happens at the intersection. Where AI capability meets a real company, with real people, real culture, real history, real incentives, and real anxiety. That is where every AI strategy is ultimately tested. That is where the work either works or doesn't.


The terrain

The intersection is not a metaphor. It is a specific, navigable set of conditions.

When AI capability enters a human organization, it doesn't land in a vacuum. It lands in an environment shaped — over years, sometimes decades — by how people think about their work, what they believe their contributions are worth, and what they expect from the relationship between effort and outcome. That environment determines whether the technology actually changes anything, whether adoption is real or performed, whether the strategy holds under pressure or falls apart when it meets the organization it was built for.

The organizations that treat this as an implementation detail — we'll handle the people side after we've made the technology decisions — consistently underperform the ones that treat it as the primary design constraint. Not because the technology doesn't matter. Because technology without an honest read on the human terrain it's landing in is strategy built on assumptions that may not hold.

The AI decisions and the human decisions are not separable. They are the same decision.


The questions nobody says out loud

There is a specific set of questions that leadership teams navigate without surfacing. They sit in individual leaders as private concerns rather than shared problems. They shape how people absorb information and make decisions, and ultimately whether they move at all.

What is my expertise worth now? The knowledge and capability I built over years — that I used as the basis for my decisions and the source of my judgment — how much of it survives this shift? What does leadership look like when the work that supported it is being reorganized around different requirements?

These are not rhetorical questions. They are live, uncomfortable, and largely unaddressed. Most leadership teams are navigating AI while holding a private version of these questions alongside all of the public ones. The public conversations are about strategy. The private ones are about identity. And the private conversations are doing at least as much work.


The hardest version

The hardest version of this problem is not organizational. It is personal.

The organizational question — how do we structure our response, what should we invest in, how do we align the team — is real and important. But it is, in some ways, tractable. It can be approached through analysis, sequencing, and deliberate design.

The personal question does not yield to analysis in the same way. What does it mean for a leader's relationship to their own competence, to their own identity in the work, when the capabilities that defined them are being reorganized around different requirements? That question requires something different. Not more information. Not a better framework. A willingness to sit with genuine uncertainty about what comes next — and to lead from that place rather than waiting for it to resolve.

I don't separate the organizational question from the leadership question. They are the same problem.


Where the work is

This is why the work I do is focused on the intersection — not just on the technology, and not just on the organization, but on the place where they meet.

Helping a leadership team understand what AI is doing doesn't mean briefing them on capability timelines. It means helping them build a clear picture of what those capabilities mean for their specific business, their specific people, their specific moment — and working through what that implies for how they lead.

The technology is real. The force is real. The question is whether your organization — and the people inside it — can absorb it without losing coherence. Without freezing. Without moving in the wrong direction before they've built a clear enough picture of the terrain.

That has been the question at the intersection since the beginning. The stakes are higher now. The work is the same.


If this is where you are, that's where the work begins.