Leaders of an overseas manufacturing plant had spent millions of dollars instrumenting sensors across their factory yet struggled to identify its bottleneck machines. They called PTC Inc. in search of a better way to find performance slowdowns and mitigate remaining bottlenecks.
“They said, ‘We have all this data,’” explained Chris MacDonald, head of artificial intelligence (AI) and analytics at PTC. “‘We know this machine is the bottleneck and the pacemaker. We cannot figure out a way to predictably understand when this machine is going to cause a performance problem.’ We asked, ‘Who is the most experienced technician who services this machine? Let us watch him.’ We found out that when he heard something, he did proactive maintenance on the machine. The most important thing in predicting and diagnosing failure in this machine was not tens of millions of dollars in sensors, but a microphone. They didn’t talk to the guy who had maintained the machine for decades. They didn’t have a standard way of understanding where the bottleneck was. When they did identify a bottleneck, they did not have appropriate context to make meaning of everything they had sensorized on the asset, which could have fixed the problem faster and cheaper.”
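The acoustic cue the technician relied on can be approximated in software. The sketch below is purely illustrative (not PTC's method): it flags windows of microphone samples whose root-mean-square energy jumps well above a baseline learned from normal operation. The function names, window size and threshold are assumptions for the example.

```python
# Illustrative sketch: flag acoustic anomalies in microphone samples by
# comparing each window's RMS energy to a baseline from normal operation.
# Names, thresholds and the synthetic signal are invented for this example.
import math

def rms(window):
    """Root-mean-square energy of one window of audio samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def find_anomalous_windows(samples, window_size, threshold_ratio=2.0):
    """Return indices of windows whose RMS exceeds threshold_ratio x baseline.

    The baseline is the median window RMS, a crude stand-in for 'normal'
    machine sound.
    """
    windows = [samples[i:i + window_size]
               for i in range(0, len(samples) - window_size + 1, window_size)]
    energies = [rms(w) for w in windows]
    baseline = sorted(energies)[len(energies) // 2]  # median RMS
    return [i for i, e in enumerate(energies) if e > threshold_ratio * baseline]

# A quiet hum in most windows, with a loud rattle in the middle of the signal.
signal = [0.1, -0.1] * 20 + [1.5, -1.4, 1.6, -1.5] * 5 + [0.1, -0.1] * 20
print(find_anomalous_windows(signal, window_size=10))  # → [4, 5]
```

A production system would use spectral features and a trained model rather than a raw energy threshold, but the principle (listen for deviation from a learned "normal") is the same.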
According to MacDonald, in the past, people who wanted to predict failure had a lot of data about how an asset was behaving but not enough information about whether and when it failed.
“The failure was not captured in the data. There’s an adage: ‘There’s no sensor for failure.’ Customers need tools that are easy to use and scalable. We are building toward a future where we can start gaining advanced insights from more advanced analytics tools. People want insights, not data.”
When optimizing performance, data alone is not the answer. Sensors alone are not the answer. One-offs are not the answer. The answer lies in data analytics tools that are (finally) evolving from addressing one-off problems to facing down enterprise-scale challenges. The machine that appears to be causing the slowdown may not be the true culprit.
By taking a factory-wide approach that encompasses people, processes and all operations, manufacturers can get a better read on any bottlenecks.
“That one asset you’re looking at: is that really the bottleneck?” MacDonald continued. “If you’ve over optimized an asset and that’s not the bottleneck, you’ve prioritized something that’s not driving losses in production. What’s driving alerts and alarms? Maybe it’s not the machines. Maybe it’s work orders, how the machine is being used. Maybe I need to look into another asset because that asset is the problem in the situation.”
Optimizing data insight means moving beyond error codes and drilling down to learn what else is happening in the factory at the same time.
“When that error code is understood in the broader context, it allows you to answer the larger question: How does that error code fit into the semantics of the data to accurately reflect the performance of the factory? Look at the performance of the entire factory, look at performance against targets, and identify the most impactful issues across the factory. What’s driving loss and quality across different lines? There’s a better way of doing this, a more enterprise-wide way focused not on improvements at only one plant or process, but what are the consistencies among all the plants and processes,” he said.
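The enterprise-wide view MacDonald describes amounts to aggregating lost time by reason across every plant and line, then ranking the reasons so the most impactful issues surface first. The sketch below shows one minimal way to do that; the records, plants and reason codes are invented for the example.

```python
# Illustrative sketch: a simple Pareto ranking of lost time by reason code,
# aggregated across plants and lines. All data here is invented.
from collections import defaultdict

loss_events = [
    {"plant": "A", "line": 1, "reason": "changeover", "minutes": 45},
    {"plant": "A", "line": 2, "reason": "jam",        "minutes": 30},
    {"plant": "B", "line": 1, "reason": "changeover", "minutes": 60},
    {"plant": "B", "line": 1, "reason": "rework",     "minutes": 25},
    {"plant": "B", "line": 2, "reason": "jam",        "minutes": 10},
]

def rank_loss_reasons(events):
    """Total lost minutes per reason, sorted worst-first."""
    totals = defaultdict(int)
    for e in events:
        totals[e["reason"]] += e["minutes"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(rank_loss_reasons(loss_events))
# → [('changeover', 105), ('jam', 40), ('rework', 25)]
```

Note that changeover dominates only when the plants are viewed together; no single line would reveal it as the top issue, which is the point of the enterprise-wide view.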
Manufacturing companies are seeking data analytics tools that boost operational excellence, improve sustainability and drive value across the entire experience of their customers, said Charles Phillips, global marketing lead for manufacturing at SAS.
“To profitably deliver the right products at the right time requires analytics that maximize quality, uptime and yield, and optimize supply chains, including on-time delivery, fill rate and inventory cost,” Phillips said. “The emergence of sustainability commitments is also pushing manufacturers to identify, track, correct and manage drivers of emissions and waste. Here too, analytics can play a central role.”
By implementing digital capabilities to enhance data analytics, manufacturers can more efficiently monitor, process, identify and address bottlenecks, said Lindsey Berckman, principal at Deloitte Consulting LLP and smart manufacturing leader in Deloitte’s industrial manufacturing sector. “By having access to the real-time machine, tool, personnel and material data, decisions and improvements can be made almost instantly,” Berckman said. “The combination can lead to improvements to the user experience and even quick pivots away from processes altogether if necessary.”
With quicker pivots in mind, manufacturers using advanced data analytics can make more thoughtful decisions on what stages of the value chain to improve, resulting in smarter supply chain and factory floor investments, Berckman added.
Advanced analytics, such as AI, machine learning, internet of things (IoT) analytics, computer vision and digital twin technologies, are unlocking new capabilities for manufacturing companies. According to Phillips, manufacturers who have already adopted them are seeing big improvements in quality, efficiency and decision making. Phillips cited results from SAS customers: Georgia-Pacific LLC used analytics to improve overall equipment efficiency by 10%.
A senior manager at electronics component manufacturer Murata Manufacturing Co. noted that a 1% yield gain from analytics results in $50 million in savings.
Lockheed Martin Corp. harnessed IoT analytics to deliver 2,000 hours of downtime saved in six months. “Historically, manufacturers waited until an incident occurred, then reacted with root-cause analysis investigations,” Phillips said. “They sought to determine what happened, when it happened, what drivers—human, process and technology—were involved, and what the ultimate root cause was. That is why you see recalls and other after-the-fact outcomes. Advanced analytics like AI, machine learning and IoT analytics enable manufacturers to shift from reacting to predicting. This minimizes the overall impact of each incident—be it quality, safety or uptime. There is real momentum in the industry as additional manufacturers adopt advanced analytics to become more resilient and data driven.”
While manufacturers already collect enterprise data, Deloitte’s 2023 Manufacturing Outlook report found the real advantage comes from converting this data into true insights and action, Berckman said. Manufacturers who invested in and adopted emerging technologies have shown greater resilience. Key tools include IoT, edge computing, digital twins, vision systems and generative AI, Berckman added.
“As we saw following the onset of the pandemic, many companies who did not invest in enhanced technologies were forced to play catch up—and some are still doing so today,” she said. “For organizations looking to increase their digital maturity, implementing advanced data analytics into existing systems is one of the ways they can facilitate increased agility.”
With the addition of AI, emerging technology is now accessible to the large number of companies that do not have in-house data analysts. That’s according to Rahul Garg, VP of industrial machinery and SMB programs at Siemens Digital Industries Software.
“They have manufacturing engineers who are looking at how the manufacturing process should be set up and they have manufacturing operators making sure jobs move forward,” Garg said. “They don’t have people who can look at the data and do the data analytics, especially when we start looking at MES software and IoT and looking at questions like: ‘How do you improve operational efficiency? How do you improve maintenance?’ All of that requires a better understanding of that data. Data analysts have become a critical resource group.” Garg went on to explain that by working with expert resources at Siemens, who bring in AI tools including MindSphere and Opcenter, a manufacturer can get the benefits without having to hire in-house data analysts.
These tools have also evolved to succeed without depending solely on large data analytics teams, according to MacDonald.
“Five years ago, if you had the right organizational mentality—the willingness to innovate, fail fast and recognize the value of data—those things (enterprise level insight) were possible but it was heavily dependent on culture,” MacDonald said.
While many large companies have dedicated software and analytics teams, most manufacturers do not. Increasingly, PTC and other companies are designing tools (such as PTC’s Digital Performance Management, or DPM, built on ThingWorx) that identify, analyze and improve bottlenecks—little or no advanced knowledge required, MacDonald said.
“Manufacturers may have smart teams doing analytics from one-off activities where one person or team finds ways to use derived data for continuous improvement related to one asset or process,” he said. “The challenge is to scale those improvements across multiple machines, across multiple data sources and build standard tools to get data in shape for continuous analytics. Companies like PTC have tried to capture that notion of operational insights from data, enabling the ability to automatically enrich and transform data captured in tools like DPM and inject powerful data insights out of the box. Companies are starting to desire or expect their data to be put into a form through use of an application that provides solutions and increasingly advanced insights,” MacDonald said.
Applications today are becoming more embedded in the production environment so that users can easily view progress, forecast completion against targets, and better assign reasons for lost time. “Advanced diagnostic algorithms identify the scenarios generating the greatest time lost, automatically,” MacDonald said. “If you start thinking about data shape consistently across factories, across the whole enterprise, then you have the ability to identify the most impactful issues and solutions. Once you start tying in other applications and systems in use, you can systematically address things like problematic work instructions that drive rework and scrap, you can start bringing data together to understand what about these work instructions might be optimizing performance but decreasing quality.”
As workers at all levels begin to use these tools, user experience (UX) is also becoming increasingly important. The software or app doesn’t need to be pretty; it just needs to be understood. “If it looks like an ugly spreadsheet and people understand it, they’re going to use it,” MacDonald said.
According to Phillips, the mantra at SAS has been “analytics for everyone, everywhere” or “analytics for the people.” The user interfaces of advanced analytics remove much of the complexity from the experience. These advanced analytics deliver insights that lead to better, faster decisions and improved value for customers.
“From a data-democratization standpoint—which is a fancy way of saying all team members regardless of experience can gain value from analytics—manufacturers gain flexibility and agility, can make better decisions faster and can deploy these tools closer to the asset,” Phillips said. “As the industry continues to experience increased attrition and a shift from older to younger workers, there is a great opportunity for institutional knowledge transfer and onboarding.”
The UX of data analytics tools is constantly evolving, putting visualization and decision making at the user’s fingertips even when they’re not physically on the shop floor. “There is no reason our shop floor apps cannot look more like the consumer apps we use on a daily basis,” Berckman said. “For example, if I can trigger an order from a food delivery service that assesses my past history to recommend my most selected restaurants based on my current physical location, why can’t a similar activity be positioned for the shop floor? Based on my position in the shop floor and my job function, the application could recommend the next parts or tooling I may need and trigger a delivery from the warehouse direct to my location. The key is to make the UX of the app highly intuitive, allowing for easy adoption and reducing the need for the worker to get distracted from their current task.”
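The part-recommendation idea Berckman imagines can be sketched as a frequency lookup over past consumption at a given station and role. This is a hedged toy version; the history records, station and role names are all invented, and a real system would pull live data from MES and warehouse systems.

```python
# Hedged sketch of a shop-floor parts recommender: suggest the parts most
# often consumed at a worker's station and role. All data here is invented.
from collections import Counter

usage_history = [
    ("cell-3", "assembler", "gasket-kit"),
    ("cell-3", "assembler", "gasket-kit"),
    ("cell-3", "assembler", "impeller"),
    ("cell-3", "tester",    "probe-tip"),
    ("cell-7", "assembler", "bearing"),
]

def recommend_parts(station, role, history, top_n=2):
    """Most frequently used parts for this station/role, most common first."""
    counts = Counter(part for s, r, part in history if s == station and r == role)
    return [part for part, _ in counts.most_common(top_n)]

print(recommend_parts("cell-3", "assembler", usage_history))
# → ['gasket-kit', 'impeller']
```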
According to Garg, plant simulation technology now enables companies to make digital twins of their plants and simulate how new technology will work before making changes. “Once you have digital twins, you are able to get a good understanding of supply chain function, material flow, labor requirements, machine requirements, throughput requirements,” Garg said. “By taking these steps earlier in the process, companies can create their plan for manufacturing, how they will, for example, coordinate the 40 steps to assemble a pump, to get the highest throughput possible.” Digital twins—not just of a single asset but of an entire process—are increasingly important, MacDonald said, but not yet at the optimal level of use.
“Digital twins are really good at understanding where a given asset fits within the process as long as the digital twins are cognizant of what comes before and after. We are seeing more digital twins, but not necessarily yet at the appropriate context of scale that I find beneficial.”
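At its simplest, the throughput question a plant simulation answers reduces to this: in a serial line, the slowest station (the pacemaker) dictates steady-state output. The sketch below shows that minimal calculation with hypothetical station names and cycle times; real digital twins model far more, including buffers, failures and material flow.

```python
# Minimal stand-in for plant simulation: steady-state throughput of a serial
# line with no buffers or failures is set by its slowest station.
# Station names and cycle times are hypothetical.
def line_throughput(cycle_times_sec):
    """Return (bottleneck station, units per hour) for a serial line."""
    bottleneck = max(cycle_times_sec, key=cycle_times_sec.get)
    return bottleneck, 3600.0 / cycle_times_sec[bottleneck]

stations = {"machining": 40.0, "assembly": 55.0, "test": 30.0, "pack": 20.0}
name, uph = line_throughput(stations)
print(name, round(uph, 1))  # → assembly 65.5
```

Speeding up any station other than "assembly" leaves throughput unchanged, which is exactly the over-optimization trap MacDonald warns about.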
Challenges and needs remain. Overall, manufacturers often need good insight on how and when to start digital transformation. Manufacturers also need help assembling data from a variety of sources, such as PLM, ERP and MES, into a form required for impactful analysis.
“Machine learning is only as good as the value of the data,” MacDonald said. “When you start building an application-driven way of employing insights into production facilities, optimizing the performance of people, assets and processes, that’s where and when data is in a shape where more advanced analytics can take hold and scale.”
A recent report by RevGen Partners, a business and technology consulting firm based in Denver, Colo., reveals the top challenges in getting the most out of data to drive business goals. According to RevGen’s Pero Dalkovski, vice president of data and technology, and Ian Foley, director of analytics and insights, “It’s no secret that the relentless proliferation and complexity of data generated every second is changing the way you do business and how you interact with your customers. The challenge comes in figuring out how to harness the power of that data to maximize value to your organization.”
The RevGen report went on to say that “successfully bridging the gap between data inundation and business value requires a comprehensive approach grounded in the tight alignment of business strategy, data and analytics strategy, and tactical execution.”
In addition to a clear vision, alignment of people, processes and technology to leverage data effectively and efficiently is critical.
A successful data culture requires addressing data literacy throughout the organization and enabling employees to make data central to their actions and decisions. According to Dalkovski and Foley, to achieve real business value, manufacturers need to extract and operationalize insights from data to enable action. “Getting the most from your data, so that you can navigate the seas of digital disruption facing your organization, can be overwhelming. But you don’t have to go it alone. The right partner will empower you to harness the full power of your data assets, maximize the value to your organization and set you up for sustainable success.”