Permission and Purpose: AI Adoption Is Cultural Change, Not Just Technical Integration
AI adoption isn’t about tools. It’s about culture. The orgs that win are the ones where leadership models the behavior, systems support learning, and teams have permission to experiment and purpose to point the way. Move early. Move with care.

Bringing AI into an organization and sparking native adoption isn’t a tooling problem—it’s a culture shift. Many companies are full of smart people who have gotten where they are by keeping their eye on the ball and executing like hell. But in times of disruption, being able to win in the old way can make it harder to transition to the new.
It's hard for people to stop doing what's been working for something that might work. It's hard for people to risk failure in a new way that may not be perceived as credible. It's hard for people to embrace change.
As I've been helping people with this transition, I've noticed how often they ask for practical tips on using new tools, when what they actually need is cultural change. As executive coach Tarikh Korula put it, people need permission and purpose. I love the alliteration, but more than that, I love the clarity. It's a simple framing that unlocks action.
Leadership Must Go First
The clearest signal that something matters inside a company isn’t a strategy memo—it’s how leadership behaves.
At Shopify, CEO Tobi Lütke doesn’t just endorse AI adoption. He lives it. He builds with agents, talks about it in weekly forums, and made usage a baseline expectation across the company:
"Stagnation is almost certain, and stagnation is slow-motion failure."
—Tobi Lütke, CEO of Shopify
At Gumroad, CEO Sahil Lavingia made it personal. He launched a challenge to his team:
"We did this competition where we split $33,000 between whoever could write more AI-powered code than me over a month."
—Sahil Lavingia
That’s not symbolism. That’s a cultural shift being modeled from the top—visibly and competitively.
🟩 CASE STUDY: Duolingo’s AI Mandate
Luis von Ahn, CEO of Duolingo, recently declared the company “AI-first” in an all-hands email. But the memo wasn’t about technology—it was about changing how the org operates.
He laid out hiring and performance policies that hinge on AI adoption. He called for fundamental rewrites of systems “designed for humans.” And critically, he balanced urgency with support:
“We’re going to support you with more training, mentorship, and tooling for AI in your function.”
This is what transformation looks like when it’s taken seriously. Not an experiment. A new operating model.
Start With the Emotional Layer
Inside most organizations, AI adoption begins in the shadows. People are curious but cautious. They're unsure whether trying new tools will be encouraged—or quietly penalized.
In some creative or legacy environments, there are also real concerns about how AI has been built, who’s been displaced, and what it means for the future of the craft. These are legitimate. And ignoring them just builds resistance.
The cultural shift starts by naming the discomfort. Not minimizing it. Not fast-forwarding to features. People need to know this is a conversation about them, not just about the tools.
Build Systems That Scale Learning
A handful of early adopters playing with ChatGPT in a sandbox won't create a resilient org. If AI fluency is going to scale, learning needs to become visible and repeatable.
Ask yourself:
- Is there a space to share experiments?
- Are failure stories as welcome as success stories?
- Do incentives reflect curiosity—or only output?
Shopify bakes AI into its review cycles. Gumroad is restructuring its stack to maximize AI-native development. Duolingo ties headcount justification to whether teams can automate parts of their work. These aren’t “AI pilots.” These are operating system rewrites.
Becoming AI-Native Is the Real Goal
This isn’t about tools. It’s about trajectory.
Organizations that win this shift will treat AI fluency the way we used to treat digital literacy. As the baseline—not the edge case.
That takes:
- Permission to try (and fail)
- Purpose tied to the mission
- Consistent modeling from leadership
AI isn’t the new tech stack. It’s an opportunity to actually live the values of learning and growth we all claim to believe in. Living your values is hard. And the orgs that do it early will outpace the ones still waiting for the playbook.
Everyone will eventually adopt these tools.
The real question is whether you want to lead from the front—or spend the next few years catching up.
Yes, early adoption carries risk. But the rewards—for individuals and organizations—are disproportionate right now.
Move. And move with care.