Manufacturing Engineering: Data mining and Big Data are hot topics. Your company develops process mining software; how does it differ from data mining?
Alex Rinke: Data mining has traditionally been KPI [key performance indicator] oriented, focused on identifying patterns to predict future trends and analyzing data to create actionable insights. Businesses use data mining to draw conclusions and solve specific problems, but process mining takes an entirely different approach. Process mining technology leverages event logs to create a visual reconstruction of any process in its ‘as-is’ state, and suddenly instead of combing through data sets to find a relevant pattern, users can actually see how a process is running in real time. We like to describe the difference between data mining and process mining as shining a flashlight in the dark versus flipping on a light switch; both are useful approaches, but one is clearly superior for full visibility.
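The core idea Rinke describes — reconstructing a process's "as-is" flow from event logs — can be illustrated with a minimal sketch. The event log, case IDs, and activity names below are hypothetical, and real process-mining tools such as Celonis ingest logs from ERP systems and do far more; this only shows the basic directly-follows discovery step:

```python
# Minimal sketch of process discovery from an event log (hypothetical data).
from collections import Counter, defaultdict

# Each event: (case ID, activity, timestamp)
event_log = [
    ("order-1", "Create PO", 1), ("order-1", "Approve PO", 2), ("order-1", "Receive Goods", 3),
    ("order-2", "Create PO", 1), ("order-2", "Receive Goods", 2), ("order-2", "Approve PO", 3),
    ("order-3", "Create PO", 1), ("order-3", "Approve PO", 2), ("order-3", "Receive Goods", 3),
]

# Group events by case and order them by timestamp.
cases = defaultdict(list)
for case_id, activity, ts in event_log:
    cases[case_id].append((ts, activity))

# Count "directly-follows" transitions across all cases; these edges form
# the discovered process map, and rare edges expose process deviations.
transitions = Counter()
for steps in cases.values():
    ordered = [activity for _, activity in sorted(steps)]
    for a, b in zip(ordered, ordered[1:]):
        transitions[(a, b)] += 1

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n} case(s)")
```

Here the second order reveals a deviation (goods received before approval) that would be invisible in an aggregate KPI but stands out immediately in the reconstructed process map.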
ME: What specifically does process mining offer discrete manufacturing operations?
Rinke: One critical advantage of process mining is that it is fundamentally applicable to any process, and that includes manufacturing operations. If there are event logs being created, process mining can paint a clearer picture of how operations are actually running. That being said, discrete manufacturing is among the industries that stand to gain the most insight from process mining, and Siemens, ABB, and 3M are only a few of the customers relying heavily on process mining to improve the flexibility and efficiency of their operations.
Common use cases that we encounter in the manufacturing space are improving schedule adherence, monitoring automation, modeling capacity, and reducing waste, and process mining has proven invaluable in all of these areas. For instance, ABB, one of the largest and most complex manufacturing businesses in the world, is currently using process mining for a variety of processes, from its purchase-to-pay process to its production processes. Employees from the ABB plant in Hanau, Germany, used to extract evaluations from their SAP systems several times a day, import them into Excel, and use complex formulas to analyze and understand processes. Today, the relevant production and assembly team leaders at ABB receive an e-mail first thing in the morning that outlines the previous day’s production variants, throughput times, and number of rejections. And it’s important to note that we aren’t just talking about KPIs here—the plant’s full ecosystem of processes is immediately visible with process mining, making discovery of inefficiencies a breeze. It can be difficult to solve a problem that you’re aware of, but it’s nearly impossible to solve a problem you haven’t yet discovered.
ME: How does the Celonis Proactive Insights process mining engine work?
Rinke: Celonis PI is a very exciting addition to our core technology, with enormous promise to contribute continuous value to our customers. Essentially, the machine learning algorithms that we’ve layered onto Celonis take prior use cases and build upon them, acting as an automated business consultant and recommending improvements. The automated pattern-recognition capabilities mean that if Celonis has encountered similar inefficiencies in the past, it will be able to feed back relevant solutions to overcome those inefficiencies. If we imagine Celonis as an MRI for business processes, then PI is like an automated doctor interpreting the results of that MRI scan.
ME: Which manufacturers use this technology, and how are they deploying it?
Rinke: ABB, 3M, and Siemens are a few Celonis customers, and they’ve deployed process mining technology across a variety of processes and made it available to a huge variety of employees. Siemens, for instance, has chosen to empower its staff by adopting a broad deployment; several thousand Siemens employees use Celonis every day to get an understanding of exactly what’s happening in their processes. Material sourcing, vendor management, production and assembly, shared services—each of these processes is made transparent and easier to understand when the proverbial light switch is flipped on.
ME: What manufacturing industries can best leverage process mining?
Rinke: The beauty of process mining is that it’s completely relevant for all industries, manufacturing or otherwise. We’ve seen procurement departments completely transformed and made more efficient, and factories made smarter by achieving full data transparency. The trend right now is towards ‘Industry 4.0’ and self-optimizing smart factories, and one of the core facets of this trend is information transparency—Celonis Process Mining is an out-of-the-box solution for achieving that level of transparency.
ME: What’s the future for manufacturing operations using machine learning and AI bundled into your process mining data analytics?
Rinke: The potential is limitless for AI within process mining software, because the more information the system aggregates across more and more use cases, the smarter Celonis becomes. As with all machine learning and AI applications, the system only gets better at identifying patterns as more data is fed in. Instead of relying on complex manual analysis of processes, we expect Celonis PI to return instant results. Consider a bottlenecked production process that is causing delays in a factory—and imagine that you had no idea where the bottleneck was happening, or what its repercussions were. Celonis can easily identify the source of the bottleneck, and PI will make recommendations on how to effectively mitigate its root cause. It’s an exciting time for manufacturers in any industry.
ERP developer Epicor Software Corp. (Austin, TX) announced Oct. 5 that Joe Cowan, Epicor president and CEO, would retire at the end of October. The Epicor board of directors has appointed Stephen Murphy, former president of OpenText (Waterloo, Ontario), as Epicor CEO.
Before joining Epicor, Murphy was president of OpenText, a $2-billion developer of enterprise information management (EIM) software. Murphy’s career spans more than 20 years in the technology sector, including sales and operations leadership positions at Oracle, Sun Microsystems, and Manugistics, as well as manufacturing and distribution experience leading global logistics and supply chain strategy and major ERP implementations with Accenture and Procter & Gamble. Murphy holds an MBA from Harvard Business School and a Bachelor of Science in Mechanical Engineering from the University of California, Davis.
When Stratolaunch Systems Corp. (Seattle) rolled out its all-composite Stratolaunch aircraft to prep for ground testing in the Mojave Desert this spring, the gigantic airplane showed just how far the design and manufacturing of composite materials have progressed in recent years. Last month, the first phase of engine testing on the aircraft’s six Pratt & Whitney turbofan engines was completed.
The world’s largest aircraft by wingspan—wider than a football field is long—is almost entirely fabricated from composite materials, which provide light weight, high stiffness and strength characteristics that are increasingly in demand in aerospace, automotive, sports, medical and industrial fields. Collier Research’s (Newport News, VA) HyperSizer optimization software was used extensively by manufacturer Scaled Composites to optimize the aircraft’s composite fuselage and wing structure.
HyperSizer, the first software package to be commercialized out of NASA, has been employed on a wide variety of aerospace and other industry projects fabricated with composite or metallic materials. The software automatically performs design, stress analysis and sizing optimization, typically reducing the weight of structures by 20-40%.
“To ensure the most efficient use of materials in an all-composite structure of any size requires effective employment of design and manufacturing optimization tools from the very earliest stages,” said Collier Research President Craig Collier.
The Stratolaunch aircraft is the brainchild of Stratolaunch Systems Corp. founder Paul G. Allen. It has two fuselages connected by a giant single wing and is powered by six engines that will enable it to take off from a runway carrying a payload of up to 550,000 lb (247,500 kg). At the cruising altitude of a commercial airliner, the Stratolaunch air-launch platform will release the space launch vehicle payload and return to the airport for reuse. The first launch demonstration is anticipated to take place as early as 2019.
For the massive Stratolaunch wing, deflection limits were a significant factor to be taken into account. The panels of the dual fuselages were sized for strength, stability and honeycomb sandwich failure modes. By using HyperSizer, the stress team had access to a comprehensive set of automated failure analyses that includes rapid free-body analysis; discrete laminate sizing; ply-based composite failure analysis; honeycomb sandwich analysis methods such as wrinkling, core shear, flatwise tension and intracell dimpling; and a scripting API to push in loads from Excel spreadsheets.
Collier is seeing continuing evolution in the integration of the toolsets used for composites design and manufacturing. “HyperSizer software can provide insight into how producible a structure is and whether there might be any manufacturing issues,” Craig Collier said. “It can incorporate laminate fabrication preferences into early-stage design thinking; ease of manufacturing is becoming a major influence on strength design of laminate structures.”
Product line engineering (PLE) developer BigLever Software (Austin, TX) and Method Park (Pittsburgh), a supplier of engineering process management, have developed the new feature-based PLE Process Framework, which provides an off-the-shelf template of best practices that have enabled PLE successes.
The new framework, which is currently available, combines Method Park’s Stages Process Management System with BigLever’s three-tiered PLE methodology to allow companies to accelerate transition to PLE practice and achieve cross-functional alignment throughout the enterprise. Companies can use the process framework to optimize their PLE operations by improving communication and collaboration across software, electrical, and mechanical domains and avoiding the pitfalls of ad-hoc and one-off approaches. BigLever has incorporated the new framework as a key part of the company’s holistic onePLE solution.
Feature-based PLE dramatically simplifies the creation, delivery, maintenance, and evolution of a product line portfolio by using a shared set of engineering assets, a managed set of features, and an efficient means for automating production of the product line. The new process framework provides a fully customizable Concept of Operations (ConOps) template that lays out the organizational structure and puts that structure into motion by clearly defining the organizational roles, responsibilities, and processes needed to operate effectively under the PLE paradigm.
Boothroyd Dewhurst Inc. (Wakefield, RI), developer of Design for Manufacture and Assembly (DFMA) software, has released its updated DFM Concurrent Costing Version 3.0. Deployed as a cost-analysis tool for engineering and procurement teams, the latest software allows manufacturers to move beyond “price” models, based largely on past bids, to industrial cost models grounded in scientific test data and studies. The result is a highly reliable “should cost” view of the product that offers insight into hidden cost drivers and ways to optimize both design and production, according to the company.
DFM 3.0 allows OEMs and their suppliers to explore bids in a neutral framework where machine types, speeds, processing sequences and optimum levels of automation are discussed. This better-informed environment is said to encourage supplier suggestions and deeper, integrated partnerships built around expertise, best-cost practices and shared goals.
The differences between traditional price models and data-driven cost models can be significant, affecting decisions about what regions or countries a product is moved to for manufacturing. Properly designed and costed products are more likely to stay at their original manufacturing location and near existing resources. OEMs and suppliers can collaborate around DFM software to address these and other strategic issues. Done early in design or during prototyping, DFM analysis reduces time-to-market, impacts direct and indirect costs, and helps optimize product functionality. It can be used by individuals or teams in making trade-off decisions to lower costs.
Highlights of the latest version include simplified geometry calculators, now incorporated into the software’s main response panels so that users are guided more effectively through a DFM cost analysis of their parts.
The default manufacturing operations and user-based process libraries have been streamlined and a new Test View panel has been added to all formula windows. Development of customized operations and user processes is faster and easier, according to the company. The overall look and feel of DFM software has been updated to provide a more cohesive user experience between Design for Assembly (DFA) and DFM should costing. Performance of the DFA/DFM software link has been improved for more seamless data integration between software packages.