Your AI cannot think straight when your data lives in four different ERPs

Mid-size companies almost always run multiple ERP systems after acquisitions and organic growth. AI needs unified context to be useful, but your data sits in silos that were never designed to talk to each other. MCP servers offer a different integration pattern: connect each system to the AI layer instead of connecting them to each other.

If you remember nothing else:

  • Most mid-size companies run 2-4 ERP systems because of acquisitions, organic growth, and departmental preferences. This is normal. Pretending you will consolidate to one system someday is not a strategy.
  • AI needs unified context to reason well, but your data lives in silos with different schemas, naming conventions, and update cycles. Connecting systems to each other creates exponential complexity.
  • MCP (Model Context Protocol) lets you connect each system to the AI layer instead of connecting them to each other. One MCP server per system, one reasoning layer across all of them.
  • Start with read-only queries before attempting any write-back. AI amplifies your data quality problems, and you do not want confident garbage flowing back into production systems.

Nobody plans to run four ERP systems. It just happens.

You acquire a company that runs SAP. Your original finance team lives in NetSuite. The warehouse picked Dynamics 365 three years ago because someone on the team had experience with it. And marketing operates out of a custom system nobody fully understands anymore.

NetSuite’s research confirms this is not unusual. Nearly half of ERP implementations fail on the first attempt, and roughly 30% take longer than projected. But here is the thing people miss: even “successful” ERP deployments tend to calcify into silos. Departments customize them, integrate them with local tools, and build workflows around their quirks. Over time, each system becomes its own small kingdom.

Now someone in the C-suite says “we need AI” and the data reality hits you.

Why multi-ERP environments are the norm, not the exception

The fantasy of a single unified ERP died somewhere around the third acquisition. I don’t say this with judgment. Growing companies acquire other companies, and those companies already have systems. Ripping and replacing ERPs during an acquisition is expensive, risky, and typically falls off the priority list within six months.

Research from Forrester shows that companies using fragmented data management systems face a 20-30% increase in operational costs from redundant data management alone. That’s the tax you pay for running parallel systems. But the alternative (a multi-year ERP consolidation project) carries its own brutal cost. 47% of ERP implementations experience budget overruns, averaging 35% over plan. So you do the rational thing: you keep running the systems you have.

This creates a specific problem when AI enters the picture. AI is only as good as the context you give it. Ask it a question about customer profitability and it needs data from your CRM, your billing system, your ERP, and maybe your warehouse management system. If those systems don’t talk to each other, the AI is working with partial information. Partial information produces partial answers. Worse, it produces confident partial answers.

Three integration patterns and why the first two usually fail

There are really three ways to connect multiple business systems, and most companies try the first two before reluctantly discovering the third.

Point-to-point connectors. You build direct connections between each system pair. ERP A talks to ERP B. ERP B talks to the CRM. The CRM talks to ERP C. This works when you have two or three connections. With five systems, you need up to ten connections. With ten systems, forty-five. The math is a formula for maintenance nightmares. Every time one system updates its API, multiple connectors break.
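The pairwise wiring count grows as n(n-1)/2, which is easy to sanity-check:

```python
def point_to_point_links(n: int) -> int:
    """Number of direct connectors needed to wire n systems pairwise: n(n-1)/2."""
    return n * (n - 1) // 2

# 5 systems need 10 connectors; 10 systems need 45.
print(point_to_point_links(5), point_to_point_links(10))
```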

Middleware and iPaaS. A hub-and-spoke model where every system connects to a central platform (MuleSoft, Boomi, Workato, or similar). Gartner estimates that nearly 66% of enterprises are using or plan to use iPaaS solutions. This is better. Five systems need five connections instead of ten. But it still assumes you want to synchronize data between systems. That means mapping fields, resolving conflicts, handling duplicates, and maintaining transformation logic as each system evolves. It works, but the ongoing maintenance burden is real.

AI-native integration via MCP. This is the pattern that genuinely excites me. Instead of making your ERPs talk to each other, you connect each one to the AI layer. The AI becomes the reasoning engine that queries across all systems without requiring them to synchronize with each other.

How MCP changes the integration calculus

Model Context Protocol is an open standard that Anthropic released in late 2024. People call it “USB-C for AI,” and that analogy holds up surprisingly well. Just as USB-C lets any device connect to any peripheral through one standard port, MCP lets any AI model connect to any data source through one standard protocol.

The architecture looks like this: you build one MCP server for each business system. One for SAP. One for NetSuite. One for Dynamics. One for your custom warehouse thing. Each MCP server knows how to read (and optionally write) data from its system. Then Claude, or whatever AI model you choose, connects to all of them simultaneously and reasons across the combined data.
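To make the shape of this concrete, here is a deliberately simplified sketch of the per-system server pattern in plain Python. This is not the real MCP SDK (which handles transport, schemas, and client negotiation); the class name, tool names, and stubbed data are all illustrative. The point is the structure: each server wraps exactly one system and exposes named query tools, and servers never reference each other.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ErpToolServer:
    """Hypothetical stand-in for an MCP server: wraps one ERP and
    exposes named query tools. Real MCP SDKs add transport and
    JSON schema handling on top of this basic shape."""
    system: str
    tools: dict[str, Callable[..., object]] = field(default_factory=dict)

    def tool(self, name: str):
        """Decorator that registers a function as a named tool."""
        def register(fn):
            self.tools[name] = fn
            return fn
        return register

    def call(self, name: str, **kwargs):
        return self.tools[name](**kwargs)

# One server per system; they never talk to each other directly.
sap = ErpToolServer("SAP")

@sap.tool("list_open_pos")
def list_open_pos(vendor: str) -> list[dict]:
    # In reality this would call the SAP API; stubbed for illustration.
    return [{"po": "4500012345", "vendor": vendor, "status": "open"}]

print(sap.call("list_open_pos", vendor="Acme Corp"))
```

The AI client holds connections to several of these servers at once and decides which tools to call for a given question.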

This is fundamentally different from traditional integration. Microsoft’s Dynamics 365 team has already built an MCP server that adapts dynamically to business needs, allowing AI agents to query and interact with ERP data through natural language. CData’s enterprise MCP implementation guide points out that a single MCP server layer can serve ChatGPT, Claude, Copilot, and other AI clients simultaneously, regardless of which model is making the call.

Why does this matter? Because you skip the hardest part of traditional integration: mapping data between systems. Each MCP server translates its own system’s data into a format the AI can understand. The AI handles the cross-referencing at query time. Your SAP server does not need to know anything about your NetSuite server. They never talk to each other directly.

For companies exploring how AI queries business data in practice, the user experience side of this equation matters just as much as the architecture. The best integration in the world fails if people can’t figure out how to ask it questions.

Data quality is still the thing that will wreck you

Here is where I need to temper the enthusiasm. MCP solves the connectivity problem beautifully. It does not solve the data quality problem at all.

When your customer is “Acme Corp” in SAP, “ACME Corporation” in NetSuite, and “Acme Corp.” in Dynamics, the AI is going to struggle. It might treat them as three different customers. It might merge them incorrectly. It might confidently give you a revenue figure that is wildly wrong because it double-counted or missed an entity.
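A crude normalization pass illustrates both the problem and why it is only a partial fix. This sketch lowercases, strips punctuation, and drops common legal suffixes; real entity resolution needs far more (fuzzy matching, curated alias tables, human review), but even this catches the obvious variants:

```python
import re

def normalize_name(name: str) -> str:
    """Crude company-name key: lowercase, strip punctuation, drop
    common legal suffixes. Illustrative only; production entity
    resolution needs fuzzy matching and curated alias tables."""
    n = re.sub(r"[^\w\s]", "", name.lower())
    n = re.sub(r"\b(corp|corporation|inc|llc|ltd|co)\b", "", n)
    return " ".join(n.split())

variants = ["Acme Corp", "ACME Corporation", "Acme Corp."]
print({normalize_name(v) for v in variants})  # one key: {'acme'}
```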

Industry data shows that 84% of all system integration projects fail or partially fail. Most of those failures trace back to data quality, not technology. AI makes this worse, not better. A traditional report that pulls bad data at least looks obviously wrong. Duplicated rows, missing fields, you can spot them. AI takes that same bad data and produces a polished, confident, completely wrong answer.

I have written about this pattern before. The hidden costs of building AI systems that query business data almost always come down to underestimating what it takes to get your data into shape. The AI part is actually the easy part. Making sure it has clean, consistent data to reason over is the real work.

This connects directly to access control, too. When an AI layer can query across multiple ERPs simultaneously, you need to think carefully about who gets access to what data and how those boundaries are enforced. An MCP server that connects to your HR system and your financial system at the same time is powerful. It is also a security surface that did not exist before.

Start read-only and stay there longer than you think

The single most important piece of practical advice I can offer: start with read-only access and resist the urge to move past it quickly.

Here is the staging approach that works. Phase one: build MCP servers that can only read from each system. Let the AI query, cross-reference, and generate reports. This gives you enormous value immediately. Your CFO can ask “what is our total exposure to customers in the manufacturing sector across all divisions?” and get an answer that previously required three people and a week of spreadsheet work.

Phase two: validate relentlessly. Compare AI-generated answers against manual checks. Find where the data conflicts live. Build a map of entity resolution problems (the Acme Corp / ACME Corporation / Acme Corp. issue). This phase typically reveals data quality problems that existed for years but nobody noticed because nobody was trying to query across systems before.

Phase three, and only after you trust the read layer: allow write-back to one system at a time. Maybe the AI can create draft purchase orders in your procurement system based on inventory data from the warehouse system. But that draft goes through human review before it becomes real. Microsoft’s guidance on securing AI agents emphasizes enforcing least privilege, and this matters doubly when agents interact with production ERP data.
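The phased approach can be enforced in code rather than policy. A minimal sketch, assuming a tool registry like the one MCP servers expose: every tool is registered as read or write, and write tools stay blocked until explicitly enabled (phase three). All names and stub data here are hypothetical.

```python
class PhaseGate:
    """Hypothetical guard around a tool registry: tools are tagged
    read or write, and write tools raise until explicitly enabled."""

    def __init__(self):
        self._tools = {}            # name -> (fn, is_write)
        self._writes_enabled = False

    def register(self, name, fn, *, write=False):
        self._tools[name] = (fn, write)

    def enable_writes(self):
        """Flip only after the read layer has been validated."""
        self._writes_enabled = True

    def call(self, name, **kwargs):
        fn, is_write = self._tools[name]
        if is_write and not self._writes_enabled:
            raise PermissionError(f"{name} is write-gated: read-only phase")
        return fn(**kwargs)

gate = PhaseGate()
gate.register("get_inventory", lambda sku: {"sku": sku, "on_hand": 42})
gate.register("create_draft_po", lambda sku, qty: {"draft": True}, write=True)

print(gate.call("get_inventory", sku="A-100"))   # reads always allowed
try:
    gate.call("create_draft_po", sku="A-100", qty=5)
except PermissionError as e:
    print(e)                                     # writes blocked until phase three
```

Making the gate a code-level default means nobody can quietly skip the validation phase by calling a write tool early.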

Most companies I’ve observed spend about three months in phase one and rush to phase three. The ones that succeed spend six months in phases one and two before even considering writes. The patience pays off.

To make this concrete, a realistic MCP architecture for a mid-size company with three ERP systems might look like this.

Three MCP servers, each running as a lightweight service. The SAP MCP server exposes tools for querying purchase orders, vendor records, and inventory. The NetSuite MCP server exposes tools for financial data, customer records, and billing. The Dynamics MCP server handles warehouse operations, shipping, and logistics.

Claude (or another AI model) connects to all three as MCP clients. When someone asks “which vendors have outstanding invoices over 90 days and pending shipments?”, the model queries the SAP server for vendor and PO data, the NetSuite server for invoice aging, and the Dynamics server for shipment status. It then cross-references the results and presents a unified answer.

No data moved between systems. No synchronization jobs. No transformation pipelines. The AI does the cross-referencing at query time.
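That query-time cross-referencing looks roughly like this. The per-system results below are mocked; in practice they come back from three separate MCP servers, and the field names are illustrative assumptions:

```python
# Mocked results from three separate servers; fields are illustrative.
sap_vendors = [{"vendor": "acme", "open_pos": 3}]
netsuite_aging = [{"vendor": "acme", "days_overdue": 104},
                  {"vendor": "globex", "days_overdue": 12}]
dynamics_shipments = [{"vendor": "acme", "pending_shipments": 2}]

# Cross-reference at query time: no data is synchronized between systems.
overdue = {r["vendor"] for r in netsuite_aging if r["days_overdue"] > 90}
pending = {r["vendor"] for r in dynamics_shipments if r["pending_shipments"] > 0}
flagged = [v for v in sap_vendors if v["vendor"] in overdue & pending]
print(flagged)  # [{'vendor': 'acme', 'open_pos': 3}]
```

The join lives in the reasoning layer, not in a pipeline, which is exactly why nothing had to be mapped or synchronized beforehand.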

The latest MCP roadmap has introduced formal governance processes, authorization frameworks, and support for long-running asynchronous operations. This protocol is maturing fast. The specification now includes the security and compliance features that enterprises need before deploying to production.

Is this the answer for every company? No. If you genuinely have a path to ERP consolidation and the budget and organizational will to execute it, that might be the better long-term play. But for the majority of mid-size companies where consolidation is a fantasy that lives permanently on the three-year roadmap, MCP offers something different. Not a way to unify your systems, but a way to reason across them without pretending they are one thing.

That distinction matters more than most people realize.

Worth discussing for your situation? Reach out.

About the Author

Amit Kothari is an experienced consultant, advisor, coach, and educator specializing in AI and operations for executives and their companies. With 25+ years of experience and as the founder of Tallyfy (raised $3.6m), he helps mid-size companies identify, plan, and implement practical AI solutions that actually work. Originally British and now based in St. Louis, MO, Amit combines deep technical expertise with real-world business understanding.

Disclaimer: The content in this article represents personal opinions based on extensive research and practical experience. While every effort has been made to ensure accuracy through data analysis and source verification, this should not be considered professional advice. Always consult with qualified professionals for decisions specific to your situation.