Your AI center of excellence should work itself out of a job

Most AI centers of excellence become permanent bureaucratic bottlenecks that slow adoption instead of accelerating it. The smart ones are designed to dissolve as AI capability spreads throughout your organization, measuring success by how quickly they become unnecessary.

Key takeaways

  • CoEs should be temporary - The goal is building distributed AI capability throughout your organization, not creating a permanent special team
  • Bureaucracy kills adoption - Centers that evolve into approval committees and gatekeepers slow AI implementation rather than accelerate it
  • Design for planned obsolescence - Set a dissolution timeline and measure success by how quickly AI becomes embedded in normal work
  • Focus on knowledge transfer - Your CoE should teach, document, and support rather than own and control AI initiatives

AI centers of excellence are meant to be temporary. Almost nobody treats them that way.

Permanence is the wrong assumption. The best ones are built to disappear.

At Tallyfy, we’ve watched this pattern play out with workflow automation for years. When customers create centralized process teams that eventually dissolve, those implementations succeed. When the team stays permanent, process thinking stays trapped in one department instead of spreading where it needs to go.

Same thing happens with AI. Every time.

Why CoEs become bureaucratic dead ends

I came across research on center of excellence effectiveness that genuinely frustrated me. CoEs aren’t viewed as adding value. They’re seen as bureaucratic auditors policing the organization under the guise of promoting best practices.

That’s the trap. You start with good intentions: centralize AI expertise, share knowledge, establish standards. Then something shifts, usually quietly.

The CoE becomes a bottleneck. Teams need approval to experiment. The approval process gets longer. Politics creep in. The structure that was supposed to speed up AI adoption now slows it down.

The majority of challenges in AI rollout relate to people and processes, not technical issues. CoEs that turn into approval committees make those people problems worse. Not better.

Mid-size companies can’t afford this. You don’t have the overhead budget for a permanent AI coordination layer that doesn’t generate direct value. Every dollar needs to count.

The data is blunt about this. Only about 6% of organizations qualify as AI high performers seeing significant EBIT impact, while the remaining 94% are using AI but not transforming with it. A permanent center of excellence often makes this worse. You create another silo trying to coordinate the silos.

What a good CoE actually looks like

Think of an AI center of excellence as scaffolding, not the foundation.

Scaffolding supports construction. Once the building stands, you pull it down. Same logic applies here. The CoE supports AI capability development. Once that capability lives throughout your organization, the CoE should go away.

Practically, that means focusing on four things:

Knowledge transfer, not knowledge hoarding. Every project includes training for the business team. Documentation lives in their systems, not yours. They own the capability when you’re done.

Standard development without enforcement. Create templates, frameworks, guidelines. Make them available. Don’t make teams ask permission to deviate. The data backs this up: centralized decision making driven by politics measurably slows organizational growth.

Problem-solving support, not problem solving. When teams hit walls, help them find solutions. Don’t take over and solve it for them. That distinction matters more than most people realize.

Success pattern identification. You see what works across multiple teams and share it. But let each team adapt those patterns to their own context.

Notice what’s absent? Control. Approval. Gatekeeping. Those things emerge when CoEs become permanent fixtures with turf to protect.

Designing for planned obsolescence

Most organizations never define what success actually looks like for their AI center of excellence. That’s where things go wrong.

Almost all GenAI pilots fail to achieve rapid revenue acceleration. Permanent CoEs often contribute to this by accumulating pilots rather than building distributed capability. More launches, less real adoption.

Success isn’t the number of AI projects launched. Not models deployed or cost savings generated. Those are project metrics, not organizational ones.

Real success is: we don’t need this team anymore.

Set a timeline. 18 months works for most mid-size companies. That’s enough time to run multiple AI initiatives, build capability across several departments, and establish working patterns that don’t depend on the CoE.

Then measure capability transfer. Can business teams identify AI opportunities without you? Can they evaluate vendors independently? Do they know how to structure pilots and measure results properly? RSM’s workforce research shows most organizations still have not redesigned roles based on AI capabilities. That’s the gap your CoE should close.

Small and mid-sized organizations stand to gain significantly from establishing an AI CoE, but only if that CoE builds capability rather than dependency.

Track the inverse metric: how often do teams come to you for help? High dependency at the start is fine. Six months in, it should drop. By month 12, teams should only escalate genuinely complex problems. By month 18, they shouldn’t need you at all.
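The dependency curve described above needs almost no tooling to track. A minimal sketch, assuming you can export help requests from your ticketing system (the function name, team names, and counts below are all hypothetical, for illustration only):

```python
from collections import Counter

# Hypothetical escalation log exported from your ticketing system:
# (months_since_CoE_launch, requesting_team) pairs.
escalations = [
    (1, "finance"), (1, "ops"), (1, "finance"), (1, "sales"),
    (6, "finance"), (6, "ops"),
    (12, "ops"),
]

def escalations_per_month(log):
    """Count CoE help requests per month since launch, oldest first."""
    return dict(sorted(Counter(month for month, _ in log).items()))

# A falling count over time is the signal you want; a flat or rising
# count means you are building dependency, not capability.
print(escalations_per_month(escalations))  # {1: 4, 6: 2, 12: 1}
```

Review the trend quarterly. The absolute numbers matter less than the direction: by month 18, the count for recent months should be at or near zero.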

That’s when you dissolve the CoE. Successfully.

The practical structure for mid-size companies

You don’t need a big team. Three to five people, maximum.

One person who understands AI technology deeply. Not someone who reads about it. Someone who has built things, debugged models, knows where implementations typically break down.

One person who understands your business operations. They know the processes, the pain points, the politics. They can translate AI capabilities into something that actually matters for the business.

One person focused on knowledge management. Documentation, training materials, playbooks. Everything the CoE learns gets captured in a form others can actually use.

That’s the core. Add specialists temporarily as needed. Data engineer for a specific project. Change management support for a major rollout. But keep the core small.

Where does this team sit? Not in IT. Not buried in a business unit. In a mid-size company, it should report directly to the COO or CEO. The CoE needs organizational authority to work across departments without getting trapped in any single silo’s priorities.

The critical part most people skip: rotate people through the CoE. Six-month rotations. Business people come in, learn AI. AI people go back to business units with real context. This prevents the knowledge concentration that kills capability transfer. Two-thirds of workers say their organization has not been proactive in AI training and upskilling. Changing that ratio for your company is the CoE’s primary job.

Building capability that lasts

The activities matter more than the org chart.

Every AI project should include embedded training. The business team learns by doing, with CoE members coaching. Not the CoE doing the work while the business team watches and takes notes.

Create templates and frameworks, but make them forkable. Teams should be able to copy, modify, and make them their own. You want proliferation, not rigid standardization.

Run regular knowledge-sharing sessions, but make them peer-to-peer. The CoE facilitates. Business teams present their learnings to other business teams. This builds the muscle for ongoing knowledge transfer after the CoE is gone.

Document everything in the business team’s tools. Not in the CoE’s repository. If they’re still coming back to your documentation system after you’ve dissolved, you’ve failed.

Build a network, not a hierarchy. Connect people working on similar problems across departments. They’ll support each other long after the CoE ends. AI success depends more on how organizations integrate tools into workflows than on the technology itself. Companies succeed when they decentralize implementation authority but retain accountability.

I think this pattern works because it respects how knowledge actually spreads in organizations. I’ve seen it at Tallyfy repeatedly. Customers who build process thinking into their teams rather than centralizing it in a permanent department scale faster and sustain improvements longer.

One exception worth noting: if AI is your core business. If you’re building AI products, running AI services, or competing primarily on AI capabilities, then a permanent CoE probably makes sense. You need ongoing coordination of a strategic capability. That’s different.

But for most mid-size companies, AI is a tool. A powerful one, but still a tool for running your actual business better. Tools shouldn’t require permanent coordination committees.

The sign you’ve succeeded? Two years after launching your AI center of excellence, nobody remembers it existed. AI projects just happen. Teams evaluate and implement AI tools as part of normal work. Knowledge spreads through the networks you built.

That’s what real capability transfer looks like. Not a permanent team maintaining the knowledge. Knowledge so well distributed that the team becomes unnecessary.

Plan for that from day one.

About the Author

Amit Kothari is an experienced consultant, advisor, coach, and educator specializing in AI and operations for executives and their companies. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.

Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.