TL;DR: Most manufacturers are investing in AI in manufacturing before their data is ready to support it. Without a governed, cataloged, and integrated data foundation, AI projects fail regardless of the technology chosen. Fix the data first. Everything else follows.

Here is an uncomfortable truth about AI in manufacturing that no one at your last digital transformation summit wanted to say out loud: most manufacturers are not ready for it. Not even close. Sorry! Despite the breathless press releases, the pilot projects, and the seven-figure consulting engagements, AI in manufacturing is failing not because the technology is wrong, but because the data underneath it is fragmented, ungoverned, and fundamentally untrustworthy. This is not new, and the result is predictable.

According to research from MIT, 95% of companies are realizing no measurable P&L impact from their GenAI initiatives. In manufacturing, where data complexity is compounded by decades of legacy systems, siloed operations, and acquisition-driven IT sprawl, the failure rate is arguably worse.

The uncomfortable reality: AI is only as smart as the data feeding it

Let me be blunt. Generative AI, predictive maintenance algorithms, and digital twins are not magic. They are mathematical models that consume data and produce outputs. If your data is inconsistent, incomplete, duplicated, or locked in departmental silos (and in most manufacturing organizations it is), then the outputs of those models will range from untrusted and unreliable to dangerous.

IndustryWeek coverage of Industry 4.0 adoption reflects a pattern that most manufacturing leaders will recognize: companies rush into technology investments without first addressing the foundational data infrastructure those technologies depend on. The script is always the same: a company invests in a shiny AI platform, connects it to a few data sources, gets disappointing results, and then blames the technology. The technology was never the problem.

“…every dollar spent on AI is a gamble. And not the kind with favorable odds.”

The data governance gap no one wants to talk about

Ask a manufacturing CEO about their AI strategy and you will hear about machine learning, computer vision, and autonomous quality inspection. Ask that same CEO about their data governance strategy, who owns the data, how it is defined, where it flows, whether it is accurate, and you will likely get a blank stare. This is the disconnect that is costing the industry billions. IBM’s Institute for Business Value found that over 25% of organizations lose more than $5 million a year due to poor data quality and that 7% reported losses of $25 million or more.

In manufacturing, where a single bad data point can cascade through supply chain planning, production scheduling, and quality management, the true cost is far higher. Yet data governance remains chronically underfunded. It is not glamorous. It does not generate headlines. It does not excite boards of directors the way an AI-powered “factory of the future” demo does. And that is exactly why so many of those demos never scale beyond the pilot. We have seen this movie before: the IoT era’s pilot purgatory.

Stop buying AI tools and start investing in data foundations

Here is where I will lose some of my peers in the technology industry: manufacturers should impose a moratorium on new AI investments until they can answer three fundamental questions. Sorry again! First, do you have a comprehensive catalog of your data assets across the enterprise? Second, do you have defined data ownership, business glossaries, and quality standards that are enforced, not just documented? Third, can you trace the lineage of any data element from source to consumption?

If the answer to any of these is no, then every dollar spent on AI is a gamble. And not the kind with favorable odds. This is not a theoretical argument. A 2024 survey by MES vendor Rockwell Automation and Sapio Research found that 95% of manufacturers are either evaluating or implementing AI and smart manufacturing technologies, but fewer than 25% have the data infrastructure to support them at scale.1

The gap between ambition and readiness is staggering, and the consequences are real: delayed ROI, eroded executive confidence, and a growing pile of abandoned pilot projects that quietly get swept under the rug during the next quarterly earnings call.

“Discipline first. Technology second.”

What AI readiness in manufacturing actually looks like

AI readiness in manufacturing is not about having the most advanced algorithm. It means having a governed, cataloged, and continuously monitored data estate that any analytical system can consume with confidence.

That estate must be able to feed any analytical workload, whether a simple dashboard or a complex deep-learning model. It means investing in metadata management so that every data asset has a known owner, a clear definition, and a documented lineage. It means implementing data quality programs that continuously monitor, score, and remediate issues before they propagate downstream. And it means building a data marketplace where engineers, analysts, and data scientists can discover, evaluate, and access trusted datasets without submitting a help desk ticket and waiting three weeks.
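To make the “monitor, score, and remediate” idea concrete, here is a minimal sketch of a data quality scoring check. It is purely illustrative: the rule set, the field names, and the equal weighting are assumptions for the example, not a reference to any specific governance tool.

```python
# Minimal sketch of data quality scoring (illustrative only).
# The two rules and their equal weighting are assumptions, not a standard.

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def uniqueness(records, field):
    """Fraction of non-null values of `field` that are distinct."""
    values = [r.get(field) for r in records if r.get(field) is not None]
    if not values:
        return 0.0
    return len(set(values)) / len(values)

def quality_score(records, key_field):
    """Average the rule scores into a single 0..1 quality score."""
    return (completeness(records, key_field) + uniqueness(records, key_field)) / 2

# Hypothetical sample: a parts table with one duplicate key and one gap.
parts = [
    {"part_id": "A-100", "plant": "Toledo"},
    {"part_id": "A-100", "plant": "Toledo"},   # duplicate part_id
    {"part_id": "A-101", "plant": ""},         # missing plant value
]

score = quality_score(parts, "part_id")
print(f"quality score: {score:.2f}")  # flag datasets below a chosen threshold
```

In a real program these checks would run continuously against production pipelines and trigger remediation workflows; the point of the sketch is only that a quality score is ordinary, inspectable logic, not magic.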

None of this is revolutionary. It is foundational. And it is exactly the work that gets skipped in the rush to deploy AI.

One more thing I want to be clear about: the manufacturers that get this right do not do it by assembling a patchwork of disconnected point solutions. They do not buy a standalone data catalog here, a separate quality tool there, and a lineage tracker somewhere else. The reason those investments fail is the same reason AI investments fail: the pieces do not talk to each other, and the result is more fragmentation, not less.

Real AI readiness requires catalog, lineage, quality, and governance to function as a connected, integrated system, where a change in data quality automatically updates trust scores in the catalog, where lineage traces every transformation from source to model input, and where business and technical users share a single governed view of what the data means. That kind of integration is not a luxury. It is the minimum viable foundation for AI that actually delivers.
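The integration described above, where a quality result immediately updates a trust rating in the catalog, can be sketched as follows. The catalog structure, dataset names, and trust tiers here are assumptions invented for the example.

```python
# Sketch of quality results propagating into catalog trust labels
# (illustrative; the tiers and thresholds below are assumptions).

catalog = {
    "plant.production_orders": {"owner": "mfg-data-team", "trust": "unknown"},
    "erp.bill_of_materials":   {"owner": "erp-team",      "trust": "unknown"},
}

def record_quality(dataset, score):
    """Store a quality score and immediately refresh the trust label,
    so catalog consumers never see a stale rating."""
    entry = catalog[dataset]
    entry["quality_score"] = score
    if score >= 0.95:
        entry["trust"] = "certified"
    elif score >= 0.80:
        entry["trust"] = "usable-with-caution"
    else:
        entry["trust"] = "do-not-use"

record_quality("plant.production_orders", 0.97)
record_quality("erp.bill_of_materials", 0.72)
print(catalog["plant.production_orders"]["trust"])  # certified
print(catalog["erp.bill_of_materials"]["trust"])    # do-not-use
```

The design point is the coupling: trust is derived from quality in one step, so there is no window in which the catalog advertises a dataset its quality engine has already flagged.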

“The question is no longer whether AI will reshape manufacturing. It will. The question is whether your data is ready for it.”

The choice in front of manufacturing leaders

Manufacturing is at an inflection point. The companies that will win with AI over the next decade are not the ones buying the most sophisticated tools today, nor the ones spending billions on AI strategy consulting. They are the ones doing the unglamorous, disciplined work of getting their data house in order. They are investing in data catalogs, governance frameworks, quality engines, and integration platforms. They are treating data as the strategic asset it is, not as a byproduct of operations.

IndustryWeek contributor and manufacturing strategist Larry Fast has written about how operational discipline separates the best manufacturers from the rest. The same principle applies to data. Discipline first. Technology second. And for the leaders willing to embrace that sequence, AI will not just deliver on its promise; it will transform what their factories are capable of becoming.

The question is no longer whether AI will reshape manufacturing. It will. The question is whether your data is ready for it. For most of you reading this, the honest answer is: not yet. And that honesty, not another pilot project, is where the real transformation begins.

Before you approve the next AI budget line, ask whether your data catalog is complete, your lineage is traceable, and your quality standards are enforced. If the answer to any of those is no, you know where to start.



Sources

1. Rockwell Automation and Sapio Research, “9th Annual State of Smart Manufacturing Report,” 2024.

Stephan M. Liozu is Chief Value Officer at Quest Software with 15+ years as a pricing thought leader specializing in value-based pricing and pricing transformations. He holds a Ph.D. in Management from Case Western Reserve University, an M.S. in Innovation Management from Toulouse School of Management, and an MBA in Marketing from Cleveland State University. Stephan is a Certified Pricing Professional and has authored 16 books including Organizing the Pricing Function (2025) and Value-based Pricing: 12 Lessons to Make your Transformation Successful (2024). He serves on the Advisory Board of the Professional Pricing Society and advises Quantide Growth Partners, Zilliant Inc., and LeveragePoint Innovations. Based in Phoenix, AZ, he practices Krav Maga and follows Stade Toulousain rugby. Learn more at stephanliozu.com.