Manufacturing has become the most data-intensive industry in the global economy. Factories generate massive streams of information from machines, sensors, inspection systems, supply chains and workforce activity.
Industry estimates suggest manufacturing produces roughly 1,812 petabytes of data annually, more than the finance, telecom or retail sectors. Meanwhile, a single smart factory can generate more than 5 petabytes of operational data per week through connected equipment and industrial IoT systems.
And that volume is accelerating. Nearly half of manufacturing leaders report that the amount of data their organizations collect has doubled in just the past two years, underscoring how quickly the industrial landscape is shifting toward data-centric operations.
Yet the real question is not how much data factories generate. It is how effectively they turn that data into insight.
Automation Alley convened leaders from industry, academia and technology to explore a central question:
How can manufacturers transform raw operational data into industrial intelligence that improves performance, resilience and competitiveness?
The answer, it turns out, has little to do with technology; the tools already exist. The friction lies in how data is structured, interpreted and acted upon inside the modern factory. What emerged is a set of systemic challenges and opportunities that together define the next phase of industrial evolution.
Challenges
Data Silos
For all the progress made under the banner of Industry 4.0, most manufacturing environments remain fundamentally fragmented. Data is abundant, but it is rarely unified.
Engineering data lives in CAD and PLM systems. Production data is generated through MES platforms and machine controllers. Quality data is captured in inspection systems. Supply chain data sits within ERP environments. Each system is optimized for its purpose, but not for interoperability.
The result is a patchwork of information that resists consolidation.
This fragmentation is not accidental. It is the byproduct of decades of incremental technology adoption, where systems were layered in response to specific needs rather than designed as part of a cohesive data architecture.
Even when integration is technically feasible, organizational realities intervene. Departments often operate with distinct priorities, metrics and ownership structures. Data becomes territorial.
At its core, this is also a structural misalignment of ownership. Tolerances, data structures and production realities often sit in different parts of the organization, disconnected from one another.
“GD&T [geometric dimensioning and tolerancing] is a data structure,” said Steve Bannasch, director of business development at Metrologic DCS. “Tolerance is owned by one group and the data structure is owned by another. And when you look on the floor, nothing ever matches.”
The consequence is a lack of visibility across the full production lifecycle.
A defect discovered at final inspection may trace back to a design tolerance, a supplier inconsistency or a process variation on the shop floor. Without integrated data, identifying that root cause requires manual investigation, slowing response times and increasing costs.
Too often, quality teams are left to bridge the gap after the fact.
“When you try to do cross analysis, the quality group is the one trying to identify the issues,” Bannasch said. “Every quality group needs to be elevated to the top so they own the whole product cycle.”
Breaking down silos requires more than connecting systems. It demands a shift toward shared data models and governance structures that prioritize enterprise-level insight over departmental optimization.
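To make that idea concrete, here is a minimal sketch of what a shared data model can look like in practice: records from separate engineering, production and quality systems joined on a common part identifier, so an out-of-tolerance part can be traced across silos in one pass. All system names, fields and values below are hypothetical, not a reference to any specific vendor platform.

```python
# Minimal sketch of a shared data model: records from separate systems
# (engineering/PLM, production/MES, quality/inspection) joined on a
# common part identifier. All names, fields and values are hypothetical.

plm_records = {  # engineering: design tolerances per part number
    "PN-1001": {"feature": "bore_diameter_mm", "nominal": 25.00, "tolerance": 0.05},
}

mes_records = [  # production: which machine ran which serial number
    {"serial": "SN-0001", "part_number": "PN-1001", "machine": "CNC-07", "shift": "A"},
    {"serial": "SN-0002", "part_number": "PN-1001", "machine": "CNC-09", "shift": "B"},
]

inspection_records = [  # quality: measured values per serial number
    {"serial": "SN-0001", "feature": "bore_diameter_mm", "measured": 25.02},
    {"serial": "SN-0002", "feature": "bore_diameter_mm", "measured": 25.09},
]

def trace_defects():
    """Join the three sources on serial/part number and flag out-of-tolerance parts."""
    by_serial = {r["serial"]: r for r in mes_records}
    for insp in inspection_records:
        prod = by_serial[insp["serial"]]
        design = plm_records[prod["part_number"]]
        deviation = abs(insp["measured"] - design["nominal"])
        if deviation > design["tolerance"]:
            print(f"{insp['serial']}: {insp['feature']} off by {deviation:.3f} mm "
                  f"(made on {prod['machine']}, shift {prod['shift']})")

trace_defects()  # -> SN-0002: bore_diameter_mm off by 0.090 mm (made on CNC-09, shift B)
```

Once the join exists, the root-cause question at final inspection becomes a query rather than a manual investigation.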
Unquantifiable ROI
Data initiatives often struggle to gain traction for a simple reason: their value is difficult to quantify in advance.
Traditional capital investments in manufacturing come with clear expectations. A new machine increases throughput. A tooling upgrade improves precision. The return can be modeled with reasonable confidence.
Data investments are different.
The value of improved visibility, predictive capability or faster decision-making is inherently probabilistic. It is realized over time, often in the form of avoided disruptions rather than immediate gains.
This creates hesitation. Organizations delay investment until a problem becomes visible and measurable.
“There needs to be prioritization of what will be the most important data to collect instead of trying to collect all the data,” said Tina Hurite, vice president of operations at MMTC. “It is just too much.”
This instinct to wait for clarity often results in reactive decision-making.
By the time ROI becomes obvious, the cost has already been incurred in the form of downtime, scrap or lost throughput.
Yet the upside of acting earlier is well documented. Advanced analytics and predictive maintenance programs can reduce machine downtime by 30 to 50 percent, lower maintenance costs by as much as 10 to 40 percent and extend equipment life by 20 to 40 percent.
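As a rough illustration of what those ranges mean in dollar terms, consider the back-of-the-envelope calculation below. The baseline figures are hypothetical assumptions chosen only to show the arithmetic; even the conservative end of the published ranges adds up quickly.

```python
# Back-of-the-envelope ROI sketch using the conservative end of the
# published ranges. All baseline figures are hypothetical assumptions.

downtime_hours_per_year = 500         # assumed unplanned downtime
cost_per_downtime_hour = 10_000       # assumed cost of a lost production hour
annual_maintenance_spend = 1_200_000  # assumed maintenance budget

downtime_reduction = 0.30    # low end of the 30-50 percent range
maintenance_reduction = 0.10 # low end of the 10-40 percent range

savings = (downtime_hours_per_year * cost_per_downtime_hour * downtime_reduction
           + annual_maintenance_spend * maintenance_reduction)
print(f"Estimated annual savings: ${savings:,.0f}")  # -> $1,620,000
```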
The challenge is that these gains are rarely realized until after implementation.
A more effective approach is incremental.
“Prioritize what the heavy hitters are, collect that data, then identify solutions that can help incrementally,” Hurite said. “After that, identify the next batch you want to focus on. It’s done one piece at a time.”
This shift reframes ROI not as a single event, but as a compounding capability.
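One way to start that compounding loop is a simple Pareto pass over known loss categories before instrumenting anything: rank the losses, instrument the few categories that cover most of the cost, and defer the rest to the next batch. The categories and figures in the sketch below are hypothetical; the technique is the point.

```python
# Sketch of a "heavy hitters" pass: rank known loss categories by annual
# cost and pick the few that cover most of the loss before collecting
# any new data. Categories and figures are hypothetical.

annual_losses = {
    "unplanned_downtime": 850_000,
    "scrap_and_rework":   420_000,
    "changeover_time":    180_000,
    "inspection_labor":    90_000,
    "energy_waste":        60_000,
}

total = sum(annual_losses.values())
running = 0.0
for category, cost in sorted(annual_losses.items(), key=lambda kv: -kv[1]):
    running += cost
    print(f"{category:20s} ${cost:>9,}  cumulative {running / total:5.1%}")
    if running / total >= 0.80:
        break  # instrument these first; revisit the rest in the next batch
```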
The Latency Gap
In many factories, data is generated in real time but decisions are not.
This gap between data creation and action represents one of the most significant constraints on industrial intelligence.
Production systems capture vast amounts of information continuously. Sensors monitor temperature, vibration and throughput. Machines log performance metrics.
Yet much of this data is processed after the fact.
Reports are compiled at the end of shifts. Performance is reviewed in weekly meetings. Insights are extracted retrospectively, when the opportunity to influence the outcome has already passed.
This delay reduces the practical value of data.
Closing the latency gap requires moving intelligence closer to the point of operation.
In one implementation, continuous monitoring of machine data combined with real-time dashboards increased overall equipment effectiveness (OEE) by 11 percent, simply by enabling faster, more informed decisions.
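OEE itself is just the product of three ratios (availability, performance and quality), so a live dashboard needs only a handful of counters per machine. The sketch below works through the math with hypothetical shift data.

```python
# Overall equipment effectiveness (OEE) = availability x performance x quality.
# A live dashboard only needs these counters per machine; the figures
# below are hypothetical shift data.

planned_minutes = 480     # scheduled production time
downtime_minutes = 45     # unplanned stops
ideal_cycle_seconds = 30  # design cycle time per part
parts_produced = 800
parts_good = 784

availability = (planned_minutes - downtime_minutes) / planned_minutes
performance = (parts_produced * ideal_cycle_seconds / 60) / (planned_minutes - downtime_minutes)
quality = parts_good / parts_produced

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")
```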
But speed alone is not enough. Trust and usability must follow.
“It is about trust,” said Jason Hamp, enterprise AI technologist at Lenovo/Nvidia. “You can’t just create knowledge. You need to pair the older people with shop knowledge and digital natives together.”
Without that alignment, even real-time insights risk going unused.
The Manufacturing Language Barrier
Data, by itself, does not create understanding.
Manufacturing environments are shaped by deep domain expertise. Engineers, operators, quality specialists and executives each interpret information through different frameworks.
This creates a translation problem between digital systems and physical processes.
Even the most advanced models must be grounded in real-world context.
“There needs to be humans involved in AI data decisions,” Hamp said. “It can be very insightful and draw conclusions, but you need someone who has the experience to know if what it says is true.”
At the same time, much of the most valuable knowledge in manufacturing is tacit. It resides in the experience of seasoned workers who understand how processes behave under varying conditions.
“How do we collect data and collect the tribal knowledge of the shop floor?” Bannasch said. “Even when you collect data from machines, you still have this large missing part of what makes it all work together.”
Bridging this gap requires a convergence of perspectives.
Analytics must be grounded in operational reality, and expertise must be translated into structured, usable data.
The goal is not to replace human knowledge, but to scale it.
Opportunities
Consistency Is More Important Than Accuracy
Precision has long been a defining characteristic of manufacturing.
In the context of data, however, consistency often delivers more value than absolute accuracy.
Highly accurate data that is inconsistent or incomplete is difficult to analyze. Consistent data, even with minor imperfections, enables comparability and trend analysis.
This shift is closely tied to how manufacturing systems operate in practice.
“When it comes to automation, it needs to be repeatable and in the same spot,” said Gary Krus, vice president of business development at Hirotec.
Repeatability, not perfection, is what enables control.
Manufacturers are increasingly prioritizing standardized data collection and stable measurement environments, creating a more reliable foundation for analytics and decision-making.
Consistency creates continuity. Continuity enables insight.
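A small sketch shows why: a consistently collected measurement series supports control limits and drift detection even when individual readings are noisy, something inconsistent or gap-ridden data cannot offer. The measurement values below are hypothetical.

```python
# Sketch: a consistently collected series supports drift detection even
# with measurement noise. Values are hypothetical bore diameters in mm.

from statistics import mean, stdev

measurements = [25.01, 24.99, 25.02, 25.00, 25.01, 25.03, 25.05, 25.06, 25.08]

baseline = measurements[:5]  # establish control limits from a stable window
center, sigma = mean(baseline), stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma

for i, m in enumerate(measurements[5:], start=5):
    if not lower <= m <= upper:
        print(f"sample {i}: {m} outside control limits ({lower:.3f}-{upper:.3f})")
```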
Minimizing Mistakes Through Serialized Production
One of the most tangible ways data is reshaping manufacturing is through serialization.
By assigning unique identifiers to individual parts and tracking them throughout the production process, manufacturers create a continuous digital thread.
This allows errors to be traced back to their source with far greater precision.
“If you want to create a quality product, you have to understand what you are collecting from end to end,” Krus said.
That end-to-end visibility is what transforms quality from a reactive function into a proactive one.
Serialization enables manufacturers to identify patterns, isolate variables and intervene earlier in the process. Over time, this reduces defects, lowers scrap rates and improves throughput.
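In software terms, the digital thread can be as simple as an append-only event history keyed by serial number. The sketch below illustrates the idea; the stations, fields and values are hypothetical.

```python
# Sketch of a digital thread: an append-only event history keyed by a
# part's unique serial number. Stations and fields are hypothetical.

from collections import defaultdict

thread = defaultdict(list)  # serial number -> ordered list of process events

def record(serial, station, **data):
    thread[serial].append({"station": station, **data})

record("SN-0451", "stamping",   die="D-12", press_tonnage=410)
record("SN-0451", "welding",    cell="W-03", current_amps=182)
record("SN-0451", "inspection", result="fail", defect="weld_porosity")

# A defect found at inspection can be traced back through every upstream step:
for event in thread["SN-0451"]:
    print(event)
```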
It also reinforces a broader shift toward integrated thinking.
“We talked about data vault silos,” Krus said. “How do I collect data and interlink all the areas from design to production?”
That question sits at the center of modern manufacturing strategy.
Novel Ideas Supercharged with AI
Artificial intelligence is becoming a catalyst not just for efficiency, but for entirely new ways of working.
In production environments, predictive maintenance alone can reduce unexpected equipment failures by up to 70 percent while increasing asset availability by 5 to 15 percent.
More broadly, manufacturers deploying advanced analytics are seeing 30 to 50 percent reductions in downtime, 10 to 30 percent increases in throughput and 15 to 30 percent improvements in labor productivity.
But the path to those outcomes is rarely linear.
“If I look at AI, I’m going to take it small and look at specific areas, and try to grow it,” Bannasch said. “I do not want to look at it forcefully.”
This measured approach reflects a broader shift toward practical adoption.
“Start with the problem at hand,” Hamp said. “Start with the resources you have.”
That mindset is shaping how AI is deployed across manufacturing.
Rather than attempting large-scale transformation all at once, organizations are targeting specific use cases, building internal confidence and expanding over time.
The long-term potential, however, extends even further.
“I think the focus will eventually turn to small language models, bespoke to the industry they are meant for,” Hamp added. “That’s where you’ll start seeing these giant leaps.”
These systems, trained on domain-specific data and informed by real-world expertise, have the potential to bridge the gap between data and decision-making in entirely new ways.
From Data to Industrial Intelligence
The manufacturing sector is not short on data. It’s in the process of learning how to use it.
The challenges are structural. Fragmented systems limit visibility. Uncertain ROI slows investment. Delayed decision-making reduces impact. Misaligned perspectives hinder adoption.
But these challenges are not permanent. They are transitional.
As manufacturers standardize data, close the latency gap, align expertise and rethink how value is measured, a different model begins to take shape.
“Get ready to adapt; fail fast, but keep your eye on the ball and keep pivoting,” Hamp said.
That may be the defining characteristic of this moment.
The shift underway is not about collecting more information. It is about creating systems that can interpret and act on it in meaningful ways.
Factories are becoming more connected, but connectivity alone is not the goal.
The objective is industrial intelligence.
And for manufacturers that can bridge the gap between data and action, the reward is not incremental improvement. It is a step change in how industrial performance is achieved.
