Across the manufacturing and industrial sectors, a data and AI-driven revolution has begun. It’s happening in virtually every function, but its impact is particularly pronounced in production and operations. This new revolution builds on the foundation laid by Industry 4.0 — looking beyond the new data-driven technologies and capabilities it introduced, and shifting toward pragmatic implementations, with a focus on delivering tangible ROI and robust governance.
In this article, we’ll explore the key trends shaping this revolution and delve into specific examples to illustrate the changing landscape.
Trend #1: AI at the edge
As technologies like IoT sensors and smart devices have been embedded in manufacturing and industrial processes, companies have seen immense growth in the volume and variety of data available to them. That data has a huge range of powerful use cases — from enabling proactive and preventative maintenance to supporting safety on factory floors.
But, to enable real-time use cases in production environments, that data must be converted into actionable insight very quickly. As a result, many organizations are pushing computation closer to the edge — enabling data to be processed immediately, right where it’s collected.
Consider a complex, multi-stage production line, for example. Instead of sending massive amounts of raw sensor data to the cloud, sophisticated AI models can be deployed directly on edge devices attached to each asset or machine. These models analyze progress, individual steps and quality-relevant data in near real-time to detect anomalies and predict potential failures before they occur. This not only reduces latency and bandwidth costs but also allows for autonomous control, optimizing performance at that step in the production line based on real-time conditions.
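To make the idea concrete, here is a minimal sketch of on-device anomaly detection. A real edge deployment would run a trained model, but a rolling z-score over recent sensor readings illustrates the core pattern: data is scored immediately, where it is collected, and only the anomaly signal needs to leave the device. The sensor values and thresholds are illustrative.

```python
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    """Flags readings that deviate sharply from a rolling baseline."""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.readings = deque(maxlen=window)  # recent history only
        self.threshold = threshold            # z-score cut-off

    def observe(self, value: float) -> bool:
        """Return True if the new reading looks anomalous vs. recent history."""
        is_anomaly = False
        if len(self.readings) >= 10:  # wait for a minimal baseline
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                is_anomaly = True
        self.readings.append(value)
        return is_anomaly

# Steady vibration readings establish the baseline; a spike is flagged
detector = EdgeAnomalyDetector()
for reading in [20.0, 20.1, 19.9, 20.2, 20.0] * 4:
    detector.observe(reading)
print(detector.observe(35.0))  # → True
```

Because only the boolean verdict (or a summary of it) needs to travel upstream, bandwidth and latency costs stay low, exactly the trade-off described above.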
While powerful, these edge deployments bring new challenges with them — most notably in how they’re managed. Computing at the edge means maintaining CI/CD pipelines that deploy to a huge fleet of edge devices, rather than to a small number of core systems. That in turn impacts data pipelines, security and authorization processes, operational monitoring, and much more.
Adopting AI at the edge creates a need to orchestrate numerous agents and federate learning across them to train models on decentralized data, all without compromising on data security. For most organizations, this demands a major shift in how they manage MLOps and data integrations.
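The federated learning idea can be sketched in a few lines. In this simplified federated averaging loop, each site trains on its own data and shares only model parameters with the coordinator — raw data never leaves the edge. The one-parameter model (y ≈ w·x) and the site datasets are purely illustrative.

```python
# Minimal federated averaging sketch: sites share parameters, not data.
def local_update(w: float, data: list, lr: float = 0.1) -> float:
    """One gradient-descent step on a site's local least-squares loss."""
    grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
    return w - lr * grad

def federated_round(global_w: float, site_datasets: list) -> float:
    """Average local updates, weighted by each site's data volume."""
    updates = [(local_update(global_w, d), len(d)) for d in site_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Hypothetical per-site (input, target) sensor pairs; true relation y ≈ 2x
sites = [
    [(1.0, 2.0), (2.0, 4.1)],
    [(1.5, 3.0), (3.0, 5.9), (2.0, 4.0)],
]
w = 0.0
for _ in range(50):
    w = federated_round(w, sites)  # w converges toward 2.0
```

Production systems add secure aggregation, retries and device orchestration on top of this loop — which is exactly why the MLOps shift described above is so significant.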
Trend #2: The connected factory is finally realized
Progress toward the seamlessly connected factory environments promised by Industry 4.0 has been widely hampered by fragmented legacy systems. In many cases, the sheer complexity and cost of modernizing those systems has prevented organizations from building better-integrated environments that enable AI at the edge. But a growing number of organizations have come to realize that bridging the gaps between silos can be almost as effective as replacing the legacy systems themselves.
Imagine a manufacturer with decades-old programmable logic controllers (PLCs) automating production lines, alongside newer equipment such as distributed control systems (DCS) or even advanced process control (APC). Instead of ripping and replacing everything, the organization could leverage API gateways and message brokers to extract data from both old and new systems.
Fortunately, there are many frameworks, capabilities and practices that can enable this today. We’ve seen growing adoption of standardized protocols like OPC UA. Many organizations are opting to build dedicated data virtualization layers. And Unified Namespaces and even Catena-X-style Asset Administration Shells are helping to generalize and streamline metadata formats. This allows analysts and engineers to query data from any system as if it were a single, unified database, enabling holistic analytics or even digital twins.
For example, it’s now possible to correlate real-time sensor data with historical maintenance records to predict equipment failures and optimize maintenance schedules, improving overall equipment effectiveness (OEE).
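A toy version of that correlation can be shown with an in-memory database standing in for the unified query layer. Sensor readings (say, from a process historian) and maintenance records (say, from an ERP) land in one queryable store; the table names, columns and values below are illustrative, not a real schema.

```python
import sqlite3

# In-memory stand-in for a data virtualization layer
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sensor_readings (asset_id TEXT, ts TEXT, vibration REAL);
    CREATE TABLE maintenance_log (asset_id TEXT, ts TEXT, action TEXT);
""")
conn.executemany("INSERT INTO sensor_readings VALUES (?, ?, ?)", [
    ("pump-01", "2025-01-10", 0.8),
    ("pump-01", "2025-02-10", 2.4),   # rising vibration before failure
    ("pump-02", "2025-01-10", 0.7),
])
conn.executemany("INSERT INTO maintenance_log VALUES (?, ?, ?)", [
    ("pump-01", "2025-02-12", "bearing replaced"),
])

# Correlate: which assets showed high vibration shortly before maintenance?
rows = conn.execute("""
    SELECT s.asset_id, s.vibration, m.action
    FROM sensor_readings s
    JOIN maintenance_log m
      ON s.asset_id = m.asset_id AND s.ts < m.ts
    WHERE s.vibration > 2.0
""").fetchall()
print(rows)  # → [('pump-01', 2.4, 'bearing replaced')]
```

Once both sources answer the same query interface, patterns like "elevated vibration precedes bearing failures" become a simple join rather than a cross-system integration project.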
Trend #3: The shift to continuous modernization of isolated legacy systems
Organizations don’t need to choose between integrating data silos and wider legacy modernization. Modernizing legacy systems isn’t a one-time project. It’s a continuous journey — one which can begin with relatively simple integrations between siloed systems.
For example, a manufacturer might have a custom-built ERP system that's difficult to maintain but still needs continuous care, modernization and upgrades. Instead of a complete overhaul, it could adopt a microservices architecture, gradually decomposing the monolithic system into smaller, independent services.
Throughout that journey, the manufacturer might choose to ‘buy’ a specialized module for inventory management from a COTS vendor and integrate it through APIs. For complex, highly customized processes, it might decide to ‘build’ a new module in-house, leveraging modern development tools and cloud platforms. The challenge is to strike the right balance between maintaining what was already ‘built’, ‘building’ new capabilities in-house, and ‘buying’ and customizing what the market already offers.
The key is a pragmatic approach focused on incremental improvements, prioritizing the functionalities and changes that deliver the highest ROI. This helps create an ‘agile factory’ capable of adapting quickly to changing market demands.
Trend #4: Enabling data-driven efficiency
A pragmatic, incremental approach to modernization that prioritizes changes that deliver the highest ROI should also align well with the market and economic conditions organizations are set to experience throughout the rest of 2025.
Economic downturns often necessitate a focus on immediate profitability. A manufacturer facing declining orders might shift its focus from blue-sky research projects to initiatives that deliver quick wins. During times like these, organizations typically leverage data analytics to identify bottlenecks in the production process, optimize resource allocation and reduce waste.
For example, an organization might use machine learning to predict demand fluctuations and adjust production schedules accordingly, minimizing inventory costs. It might also analyze customer data to identify high-value segments and personalize marketing campaigns, maximizing sales effectiveness.
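The forecasting-to-scheduling link can be sketched very simply. Here a naive moving-average forecast stands in for the machine learning models described above; the demand figures, safety stock and function names are all hypothetical.

```python
from statistics import mean

def forecast_demand(history: list, window: int = 3) -> float:
    """Moving-average forecast — a stand-in for a trained demand model."""
    return mean(history[-window:])

def plan_production(history: list, on_hand: int, safety_stock: int = 10) -> int:
    """Schedule just enough production to cover forecast plus safety stock."""
    needed = forecast_demand(history) + safety_stock - on_hand
    return max(0, round(needed))

monthly_orders = [120, 135, 128, 140, 150, 145]  # hypothetical demand history
print(plan_production(monthly_orders, on_hand=60))  # → 95
```

Swapping the moving average for a proper forecasting model changes the accuracy, not the structure: the forecast still feeds directly into how much to produce, which is where the inventory savings come from.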
Even in a recession, data remains a critical asset for driving efficiency and navigating challenging economic conditions. So, while moonshots might be less likely to get financing, investment in building the core capabilities required to enable data-driven efficiency should increase.
Trend #5: Rising governance and trust demands
In just a few years, data governance has gone from being an afterthought for many manufacturers to a foundational principle of their operations.
Consequently, we’re seeing more organizations implement centralized data governance platforms that enable teams to enforce data access policies, track data lineage, and streamline metadata management.
Without clearly defined roles and responsibilities for data stewardship, and guardrails that ensure all data-related activities comply with regulatory requirements, there is no sustainable way to enable transformation at scale. So we can expect to see many more organizations invest in data governance platforms, automated data discovery and classification tools, embedded data lineage, and robust access control mechanisms.
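At its simplest, the access-control guardrail a governance platform centralizes looks like a policy table consulted before any data is served. The roles and dataset names below are illustrative only.

```python
# Minimal sketch of role-based data access enforcement.
# Roles and dataset names are hypothetical.
POLICIES = {
    "quality_engineer": {"sensor_readings", "defect_reports"},
    "analyst": {"sensor_readings"},
}

def can_access(role: str, dataset: str) -> bool:
    """Check a role's entitlements before serving any data."""
    return dataset in POLICIES.get(role, set())

print(can_access("analyst", "defect_reports"))  # → False
```

Real platforms layer attribute-based rules, lineage-aware restrictions and audit logging on top, but the principle is the same: the policy lives in one governed place, not scattered across individual systems.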
Trend #6: Production-ready AI
The era of one-off modeling in notebooks is giving way to a more mature and disciplined approach to data science and generative AI. And as manufacturers’ approach matures, the AI capabilities they build are becoming increasingly production-ready and capable of delivering significant business and operational value.
For example, imagine a manufacturing company developing a predictive maintenance model for its equipment. The company builds a robust MLOps pipeline to automate model training, deployment and monitoring. It uses feature stores to manage and reuse pre-engineered features, accelerating model development. And it leverages cloud-based data processing platforms to handle the massive datasets required for training complex AI models. This helps AI models scale beyond experimental prototypes and evolve into production-ready tools that drive real business value.
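The feature-store idea in that pipeline can be illustrated in miniature: features are defined and registered once, then reused consistently by any training run that needs them. The class, feature names and computations below are a simplified sketch, not a real feature store API.

```python
# Toy feature store: register a feature definition once, reuse it everywhere.
class FeatureStore:
    def __init__(self):
        self._features = {}

    def register(self, name, compute_fn):
        """Store a named, reusable feature definition."""
        self._features[name] = compute_fn

    def get(self, name, raw_readings):
        """Compute a registered feature from raw sensor readings."""
        return self._features[name](raw_readings)

store = FeatureStore()
store.register("vibration_rms", lambda xs: (sum(x * x for x in xs) / len(xs)) ** 0.5)
store.register("temp_delta", lambda xs: max(xs) - min(xs))

# Every training pipeline now uses the same, consistently defined features
readings = [3.0, 4.0]
features = {n: store.get(n, readings) for n in ("vibration_rms", "temp_delta")}
```

The payoff is consistency: when "vibration_rms" means the same thing in every model, retraining, monitoring and debugging all get simpler — which is much of what makes AI production-ready rather than experimental.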
Emerging open-source LLMs and cheaper commercial LLMs are making production-ready AI a practical, attainable reality for thousands of organizations. Once they begin to see AI’s true value creation potential, they’ll be able to evolve even further, moving toward cutting-edge capabilities like agentic AI.
The pragmatic data and AI revolution has already begun
Each of these trends marks a major step forward in how manufacturing and industrial organizations are adopting AI and evolving their data-driven capabilities. Together, they’re enabling a true revolution in how data and AI are used across the industry — one characterized by pragmatic modernization, efficient execution, robust governance and a continuous focus on value creation.
If they can respond to all six trends in the right way, manufacturers and other industrial organizations will be well positioned to seize this once-in-a-generation opportunity and lead the data and AI revolution in their industry.
To find out how Thoughtworks can support your organization at every stage of its data and AI journey, visit thoughtworks.com/what-we-do/scaling-ai or contact us today.
Disclaimer: The statements and opinions expressed in this article are those of the author(s) and do not necessarily reflect the positions of Thoughtworks.