When a new vehicle rolls off an assembly line, some automakers use virtual sets of eyes to do final inspection to ensure quality and make sure the product is free of scratches, dings, and leaks.
Behind those “eyes” are artificial intelligence (A.I.) and its subset, machine learning (ML). These technologies are also deployed in automated computer imaging technology from UVeye, Tel Aviv-Yafo, Israel. In addition to A.I., the company’s proprietary algorithms, cloud architecture and sensor fusion perform complete vehicle checks within seconds and point out defects.
“The machine learning and A.I. are already trained,” said David Oren, chief strategy officer at the company, noting that the ML algorithm was developed with 1 trillion images. “However, the operator has to manually correct any errors.”
In manufacturing, UVeye’s deep learning-driven inspection-as-a-service unified platform has the potential to perform in-line inspection as well, but the startup is focusing now on its end-of-line quality check, said Oren.
As the use of A.I. grows in manufacturing, it contributes to higher-quality parts through vision inspection systems like UVeye’s that alert an operator to issues. Some solutions can send a signal to a machine to stop when a problem is detected.
Yet to be overcome, though, are piecemeal software products that collect and analyze data from only part of a line; heterogeneous data formats; and technologies from disparate vendors that are difficult to integrate, said Dean Phillips, innovation strategist and sales engineer, Link Electric & Safety Control Co., Nashville, Tenn.
“Probably the most difficult part is integrating” such varied products together, Phillips said. “Right now the biggest challenge I see out there is that they’re all standalone systems. They’re not fully encompassing the scope yet. That’s not to say it’s not coming but right now that’s one of the biggest challenges.”
Predictronics Corp., Cincinnati, offers an example of how its predictive quality solution helped an oven manufacturer. As rolls of sheet steel inched through a forming machine, the oven manufacturer had been puzzling over a quality problem with the process. The machine’s job is to press out the ribs that prop up a rack as it slides into an oven cavity, and the ribs had started to crack during forming. To find out what was wrong and fix the problem, the manufacturer called Predictronics.
The data scientists at Predictronics analyzed the relevant information from the machine and used a heat map to establish a relationship between the rib cracking and process temperature. They showed that the cracking was more likely to occur at lower system heat, below the temperature range in which the formed sheet steel would remain intact.
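Predictronics’ actual data and model are not public, but the idea behind that heat-map analysis, relating a process parameter to a defect rate, can be sketched with fabricated numbers:

```python
# Illustrative sketch only: bin a process parameter (forming temperature)
# and compute the defect (crack) rate per bin. All data below is fabricated.
from collections import defaultdict

# (temperature in degrees C, cracked?) observations -- invented for illustration
observations = [
    (140, True), (145, True), (150, True), (155, False),
    (160, False), (165, False), (170, False), (148, True),
    (152, False), (158, False), (143, True), (168, False),
]

def crack_rate_by_bin(data, bin_width=10):
    """Group observations into temperature bins and compute the crack rate."""
    counts = defaultdict(lambda: [0, 0])  # bin -> [cracks, total]
    for temp, cracked in data:
        b = (temp // bin_width) * bin_width
        counts[b][0] += int(cracked)
        counts[b][1] += 1
    return {b: cracks / total for b, (cracks, total) in sorted(counts.items())}

rates = crack_rate_by_bin(observations)
for b, rate in rates.items():
    print(f"{b}-{b + 10} C: {rate:.0%} cracked")
```

Plotted as a heat map, bins like these would make a low-temperature cracking band visually obvious.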
In one deployment of the Predictronics solution, the oven manufacturer was able to detect issues with 88 percent precision (the percentage of predicted cracks that were actually cracks) and 66 percent recall (the percentage of actual cracks that were predicted).
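Precision and recall, as defined parenthetically above, can be computed from prediction/label pairs. The counts below are illustrative and do not reproduce the 88/66 percent figures:

```python
def precision_recall(predicted, actual):
    """Compute precision and recall for binary crack predictions.

    precision = true positives / all predicted cracks
    recall    = true positives / all actual cracks
    """
    tp = sum(1 for p, a in zip(predicted, actual) if p and a)
    fp = sum(1 for p, a in zip(predicted, actual) if p and not a)
    fn = sum(1 for p, a in zip(predicted, actual) if not p and a)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Fabricated example: 4 predicted cracks, 3 of them real; 5 actual cracks total
predicted = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
actual    = [1, 1, 1, 0, 1, 1, 0, 0, 0, 0]
p, r = precision_recall(predicted, actual)
print(f"precision={p:.0%} recall={r:.0%}")  # → precision=75% recall=60%
```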
“When we work with a customer on predictive quality, we’ll analyze their process data and develop a health model to measure those parameters over time and then recognize when a potentially problematic trend is occurring and be able to give some sort of forewarning,” said Patrick Brown, Predictronics’ chief financial officer. “We can help you avoid doing so much testing or so much visual inspection of your products and infer those products are quality based on the process parameters.”
The process also helps reduce scrap, production time wasted on parts that are already bad, and warranty claims brought on by poor quality.
Monitoring process parameters, whether position, flow rate, pressure, temperature, or something else, usually means looking for departures from normal behavior. The anomaly could be an increase in one parameter’s magnitude over its typical value, or a change in the relationship between two parameters.
“Let’s say temperature increases with pressure typically but all of a sudden temperature is rising but pressure’s going down,” said David Siegel, Predictronics’ chief technology officer. “So it could be trends beyond its typical value but also changes in correlations like the pressure and temperature example. By finding the relationships and monitoring the process over time those anomalies can be related to issues in the process that result in bad quality.”
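Siegel’s pressure-temperature example, a correlation that flips sign, can be caught with a rolling correlation check. The signals below are fabricated, and the window size is arbitrary:

```python
import statistics

def rolling_correlation(xs, ys, window):
    """Pearson correlation of two signals over a sliding window."""
    corrs = []
    for i in range(len(xs) - window + 1):
        wx, wy = xs[i:i + window], ys[i:i + window]
        mx, my = statistics.fmean(wx), statistics.fmean(wy)
        cov = sum((a - mx) * (b - my) for a, b in zip(wx, wy))
        sx = sum((a - mx) ** 2 for a in wx) ** 0.5
        sy = sum((b - my) ** 2 for b in wy) ** 0.5
        corrs.append(cov / (sx * sy) if sx and sy else 0.0)
    return corrs

# Fabricated signals: pressure tracks temperature at first, then inverts
temperature = [20, 21, 22, 23, 24, 25, 26, 27, 28, 29]
pressure    = [5.0, 5.2, 5.4, 5.6, 5.8, 5.6, 5.4, 5.2, 5.0, 4.8]

for i, c in enumerate(rolling_correlation(temperature, pressure, window=4)):
    flag = "  <-- correlation flipped" if c < 0 else ""
    print(f"window {i}: corr={c:+.2f}{flag}")
```

In practice the threshold would be learned from healthy-process data rather than hard-coded at zero.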
Predictronics’ approach is to do a criticality analysis. For predictive quality, they’d be looking for which machines have the most scrap or quality-related issues. They work hand in hand with the customer to understand from an engineering and manufacturing perspective which processes are the most problematic. It can even be a mixture of relying on the customer’s experience and using data.
“Some customers may only measure quality at the end of the line,” Siegel said. “In that case you have to rely on their experience in terms of which process may be the most important in terms of quality.”
For a new line or one with no issues, there are at least a couple of approaches.
If it’s a new line, the data scientists would look for a similar line or one with similar processes for comparison and then leverage the experience of those operators on what issues may have been the most problematic.
“Or it could be just based on the maturity of your IoT system,” said Siegel. “Which one has the most data? That may be another good place to start because you’ll have enough transparency to monitor that process more accurately than other parts of a new line where you may not have much data from the process to monitor.”
Or, the decision could be based on domain knowledge. For example, some processes have a lot of variability to them. Even with machines producing data and scientists analyzing it for better results, domain knowledge still counts for a lot.
“We have a customer who knew exactly what the parameter should look like,” said Brown. “They knew exactly what thresholds to set and where to set them and what parameters to look at—they were that intimate with the process. So having somebody like that from the customer on the team is really valuable.”
Predictronics is trying to build domain knowledge into its models as an older, more experienced generation ages out of manufacturing and a new generation catches up.
In the case of a new line, one approach is to apply A.I. where there might be the most opportunity. Is there a chance a part of the process can be instrumented?
“There have been cases in the past where the machine or process does not allow itself to be instrumented in terms of how the manufacturing process is designed,” said Edzel Lapira, Predictronics’ CEO.
For data from disparate systems, Lapira said there are different ways to integrate it. “The easiest one is database integration,” he said. “There are different protocols that would allow one to force data from a data source to a data lake. For machine tools there are protocols like OPC-UA and MTConnect that are already available so you can get data from a machine to either applications like ours or different dashboarding.”
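MTConnect, for instance, serves a machine’s current state as an XML document over HTTP (typically from an agent’s /current endpoint). A minimal sketch of flattening such a response into rows for a data lake follows; note that real responses are namespaced MTConnectStreams documents, and the fragment below is simplified for illustration:

```python
# Hedged sketch: parse a simplified MTConnect-style response and flatten the
# sample values. Device names and data item IDs below are invented.
import xml.etree.ElementTree as ET

sample_response = """
<MTConnectStreams>
  <Streams>
    <DeviceStream name="cnc-01">
      <ComponentStream component="Linear" name="X">
        <Samples>
          <Position dataItemId="xpos" timestamp="2024-01-01T00:00:00Z">12.5</Position>
          <Load dataItemId="xload" timestamp="2024-01-01T00:00:00Z">37.2</Load>
        </Samples>
      </ComponentStream>
    </DeviceStream>
  </Streams>
</MTConnectStreams>
"""

def extract_samples(xml_text):
    """Flatten sample values into (device, item, value) rows for a data lake."""
    root = ET.fromstring(xml_text)
    rows = []
    for device in root.iter("DeviceStream"):
        for samples in device.iter("Samples"):
            for item in samples:
                rows.append((device.get("name"), item.get("dataItemId"),
                             float(item.text)))
    return rows

print(extract_samples(sample_response))
# → [('cnc-01', 'xpos', 12.5), ('cnc-01', 'xload', 37.2)]
```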
BC Machining LLC, Brasstown, Ohio, a metal fabricator that uses CNC machines and technology such as data collection, machine learning and edge computing, was producing far too much scrap. To hit production targets, the company ran its machines at 200 percent capacity and had plenty of broken endmills to add to the scrap pile.
“We would often lose a third of our shift’s worth of parts, not to mention spending at least an hour sorting through parts to identify the scrap,” said Mike Driskell, BC Machining’s manufacturing engineer, in a case study.
For help, the company turned to the adaptive tool monitoring solution from MachineMetrics, Northampton, Mass.
“It’s adaptive in that a change is occurring on the machine to prevent scrap without operator interaction,” said Bill Bither, co-founder and CEO of MachineMetrics. “This has saved our customers hundreds of thousands of dollars and enabled them to run lights out.”
MachineMetrics’ A.I.-driven software was trained to predict, diagnose and prevent machine tool failures by autonomously implementing a feedhold on BC Machining’s STAR machines, stopping the CNCs when part failure was imminent. That way, the endmill could be changed proactively, before the machine produced a bad part that had to be tossed.
“Since using the MachineMetrics’ predictive tool breakage technology, that waste has been eliminated,” Driskell said. “The savings at our Swiss turn machines has been monumental to say the least.”
MachineMetrics is exploring with some customers the use of a robotic arm to change out a damaged or worn tool instead of an internal tool changer. Affordable technology for automatic changeout is beginning to emerge, though for now it works only with specific types of machines, Bither said.
“Our customers are really interested in this but we’re kind of on the edge whether it’s worth it or not,” he said. “Right now it might be worth it if you have a two-year contract because the price is so high. We see that as something that will be common in the future.”
MachineMetrics’ tool monitoring solution was able to identify the signals on BC Machining’s Star SR-20 CNCs before catastrophic tool failures occurred. It also detected a predictable pattern. The software was able to indicate with near perfect accuracy when a machine tool would likely fail.
Above all, BC Machining is producing quality parts. “I believe the primary use case for predicting problems on machines is quality,” Bither said. “There’s also predictive maintenance but I see detecting issues with quality as often the higher value proposition.”
Part of the savings BC Machining realized is from using endmills to their full life and not changing them as often, which has the added benefit of increasing machine uptime. “Most manufacturers change tools based on the number of parts run,” Bither said. “That’s what we see across many of our customers, that they’re throwing away tools that have a lot of life left, maybe 50 percent or more.”
He explained that MachineMetrics’ tool monitoring software works with data captured at very high frequencies of 1,000–10,000 times a second. Typically machine monitoring systems will pull data at 1 Hz, or one time per second, he said.
“A thousand times per second is notable as it requires more capabilities, processing, and analytics,” Bither said.
His company’s solution collects data from the various motors on a machine, then normalizes the information to a core component they call “cutting torque.”
“And the cutting torque enables us to see the wear of a CNC cutting tool,” said Bither. “And what that does, that data item we’ve made accessible for CNC machines, it enables us to determine whether there’s any anomaly in the machining process and detect cracks in the tool, wear in the tool and any issues that might be seen in the cutting operation.
“We can either predict a failure in some cases, where we’re starting to see the load on that tool or some anomalies in that data to indicate there’s a problem, or we can immediately determine when a part’s produced if there’s a problem where that might be a scrap part.”
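MachineMetrics’ normalization to “cutting torque” is proprietary, but the general pattern Bither describes, combining motor loads into one signal and flagging deviations from a healthy baseline, can be sketched. The weights, thresholds and data below are all illustrative, not the company’s method:

```python
import statistics

def torque_proxy(spindle_load, axis_loads):
    """Combine motor loads into a single cutting-torque-like signal.
    The weighting is illustrative; the real normalization is proprietary."""
    return spindle_load + 0.5 * sum(axis_loads)

def flag_anomalies(signal, baseline_n=20, sigmas=3.0):
    """Flag samples deviating from the healthy baseline by > sigmas std devs."""
    base = signal[:baseline_n]
    mu, sd = statistics.fmean(base), statistics.pstdev(base)
    return [i for i, v in enumerate(signal) if abs(v - mu) > sigmas * sd]

# Fabricated high-rate samples: steady cutting, then a load spike
signal = [torque_proxy(10.0 + 0.1 * (i % 5), [2.0, 1.5]) for i in range(40)]
signal += [torque_proxy(16.0, [2.0, 1.5])]  # sudden spike in spindle load
print(flag_anomalies(signal))  # → [40]
```

At the 1,000-plus samples per second Bither cites, a check like this would run at the edge, next to the machine, rather than in the cloud.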
“We started with precision metal manufacturing so we’ve gone very deep into building algorithms around it that can be applied to the thousand machines that we’re connected to. But we’ve also made it easy to connect to metal fabrication equipment, welding machines, plastic injection molding, really any type of discrete manufacturing equipment,” said Bither.
The company also makes the data it collects available, so a customer with a team of data scientists or manufacturing engineers can use it to examine the line as a whole.
“That’s where you can build additional capabilities on top of what we offer out of the box,” Bither said. “We provide a platform for them to deploy those algorithms right to the edge where essentially what they’re doing is taking the data we’ve captured for them and they’re enriching that data, processing it in some way and then the results of that are being sent to our platform where you can build up workflows to notify maintenance or quality or even instruct the machine to stop.”
Manufacturers shopping around for an A.I. solution provider should do their due diligence to find the right fit. It helps to know the right questions to ask, said the experts at Predictronics:
Where have your services been used before? Have they been used for predictive quality applications in manufacturing?
Has your solution demonstrated a clear improvement on the business operations in terms of a reduction in scrap and improvement in quality?
Does your team have industrial domain knowledge?
Does your solution work with both sensor/process data and quality data, and integrate both data sources?
Does your solution use machine learning? Does this include both unsupervised and supervised machine learning models?
How much data does it require for training the model?
Does it need just data from a healthy process or data from when the process was not healthy and producing scrap?
Does your solution require quality data from all parts, or can it work if quality is only measured for a subset of the parts?