
Industry 4.0 in a Multi-Protocol Universe

By Ilene Wolff, Contributing Editor, SME Media
Jonathan Wise, CESMII

It may be the thorniest part of implementing smart manufacturing and Industry 4.0: How to bring existing systems into the world of machine data production, collection and analysis?

With a lack of standardization in software protocols for operational and informational technologies, it may seem impossible to get legacy and new factory equipment to work together seamlessly, producing streams of standardized, actionable data.

“The reality is we’re going to have all of these protocols, both the new, great ones and the older, primitive ones, for a long time,” said Jonathan Wise, vice president of technology at CESMII, the Smart Manufacturing Institute.

Russell Waddell, MTConnect Institute

Wise’s comment came as he moderated a group of experts during the recent Smart Manufacturing Interoperability and Harmonization Panel, part of the online CESMII Smart Manufacturing Summit held to show what it means to democratize smart manufacturing.

Panelist Russell Waddell, managing director of the MTConnect Institute, had defined the terms earlier in the discussion: “The protocol is how you pass data from place to place, system to system, whereas the [information] model is what’s actually in that data,” he noted, pointing out that the MTConnect standard offers a semantic vocabulary for manufacturing equipment to provide structured, contextualized data in a non-proprietary format.
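To make that distinction concrete, here is a minimal, illustrative Python sketch (not drawn from the panel): the same spindle-speed reading serialized twice, once as a bare number and once wrapped in the kind of contextual fields a semantic model such as MTConnect provides. The device identifier and field names are assumptions for illustration, and any transport protocol could carry either payload.

```python
# Illustrative only: the transport protocol (MQTT, HTTP, OPC UA, ...) just
# moves bytes; the information model is what tells the receiver what those
# bytes mean. Field names below are loosely MTConnect-flavored, not an
# exact schema.
import json
from datetime import datetime, timezone

raw_reading = 7200.0  # a value pulled from a controller register

# Without a model: the receiver has to guess source, units, and meaning.
unmodeled = json.dumps({"val": raw_reading})

# With a semantic model: structured, contextualized, vendor-neutral data.
modeled = json.dumps({
    "deviceUuid": "demo-lathe-01",          # hypothetical identifier
    "dataItem": "RotaryVelocity",           # what is being measured
    "subType": "ACTUAL",
    "units": "REVOLUTION/MINUTE",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "value": raw_reading,
})

print(unmodeled)
print(modeled)
```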

“How would you guide manufacturers that probably are looking to retrofit brownfield [already existing] systems so they can take advantage of some smart manufacturing or, in other cases, may be looking to buy new machines?” Wise asked the experts. “What should they be looking for in the greenfields and to make sure that they’re well-positioned to move forward into Industry 4.0? And then how do we—the folks trying to create information value out of these data sources—manage a multi-protocol universe?”

Stefan Hoppe, OPC Foundation

For brownfield scenarios, OPC Foundation President and Executive Director Stefan Hoppe said, there is “definitely” a “huge ecosystem, with gateways talking to the upper-layer OPC UA [a machine communication protocol], including MQTT transport [a protocol that transports messages between devices] and so on, and in the lower level, using what they had before.” For new, greenfield scenarios, nearly all machine vendors have integrated OPC UA into their designs and devices. “Then you have information modeling from the source directly out of the box,” said Hoppe.
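As a sketch of what that greenfield, out-of-the-box case can look like, the snippet below reads one value from an OPC UA server using the open-source python-opcua client library. The endpoint URL and NodeId are placeholders; on a real machine, the node identifiers and their semantics come from the vendor’s address space and the relevant companion specification.

```python
# A minimal sketch, assuming a machine that exposes an OPC UA server.
# The endpoint and NodeId below are placeholders, not real addresses.
from opcua import Client  # pip install opcua (python-opcua)

client = Client("opc.tcp://machine.example:4840")  # hypothetical endpoint
client.connect()
try:
    # NodeIds and browse names are defined by the machine's information
    # model (e.g., a companion specification), not invented by the client.
    node = client.get_node("ns=2;s=Machine.Spindle.RotaryVelocity")
    print("spindle speed:", node.get_value())
finally:
    client.disconnect()
```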

Erich Barnstedt, chief architect, standards and consortia, Azure IoT at Microsoft, weighed in: “Quite frankly, legacy equipment is going to be with us forever. We need to make it easy with the right tools to map that legacy equipment to a standards-based, open information model. Do that process once, and from then on use the standardized information model and find an automatic mechanism to also send that information model to cloud analytic systems or on-premise analytic systems.

“It doesn’t have to necessarily be cloud-based, but analytic software is where the value-add and Industry 4.0 is, in terms of gaining new insights that you couldn’t do before. So, making that connectivity and data modeling mapping phase as painless as possible—that’s what we owe our customers.”
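The sketch below is a rough illustration of that “map once, then reuse” idea, under stated assumptions: the legacy tag names and field names are hypothetical, the dictionary stands in for a published information model, and the print call stands in for a cloud or on-premise analytics sink.

```python
# Illustrative only: translate legacy tags into a standardized information
# model once, then every downstream consumer (cloud or on-premise analytics)
# works against the same model. Tag names and fields are hypothetical.

LEGACY_TO_STANDARD = {
    "PLC1.DB10.W4": ("RotaryVelocity", "REVOLUTION/MINUTE"),
    "PLC1.DB10.W6": ("Temperature", "CELSIUS"),
}

def normalize(tag, raw_value):
    """Map one legacy reading into the standardized model."""
    name, units = LEGACY_TO_STANDARD[tag]
    return {"dataItem": name, "units": units, "value": raw_value}

def send_to_analytics(record):
    """Stand-in for forwarding to an analytics system."""
    print("forwarding", record)

# The mapping is defined once; after that, the raw protocol that produced
# the value no longer matters to anything downstream.
send_to_analytics(normalize("PLC1.DB10.W4", 7200.0))
send_to_analytics(normalize("PLC1.DB10.W6", 41.5))
```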

Erich Barnstedt, Microsoft

No single protocol needs to win out, Barnstedt added. Manufacturers need to know how to get their data into a format they can use going forward, independent of any one vendor, and then create the right tools for the job to make it a seamless experience, he said.

The key to interoperability is the information models, Wise stated.

“If you’re a manufacturer with older equipment, you are not blocked by that. We can adapt and implement the kind of bits needed to connect to modern information systems. But along the way, we’re going to need to normalize or standardize that data.

“And if you find yourself creating your own information model, if you think that you have the best way to describe a particular machine, stop. Experts have worked on that particular problem for a decade or more. That’s where we need a commonality. That’s where we need to adopt some common approaches, some common language.”
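One lightweight way to act on that advice is to check proposed data-item names against a published vocabulary before defining anything custom. The sketch below uses a tiny, hand-picked sample of MTConnect-style data-item types purely for illustration; a real check would consult the full published standard or the relevant OPC UA companion specification.

```python
# Illustrative only: prefer an existing, standardized data-item definition
# over inventing a new name. The set below is a small sample, not the
# complete vocabulary of any standard.

STANDARD_DATA_ITEMS = {
    "Availability", "Execution", "RotaryVelocity",
    "Temperature", "PathFeedrate", "Load",
}

def review_names(proposed):
    for name in proposed:
        if name in STANDARD_DATA_ITEMS:
            print(f"{name}: standard definition exists; reuse it")
        else:
            print(f"{name}: no match in this sample; check the published "
                  "models before defining a custom item")

review_names(["RotaryVelocity", "SpindleRPM_Custom"])
```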

Andreas Faath, VDMA

To help get to a common language, OPC UA information models will be published in the UA Cloud Library, which will also include existing information models from VDMA Verlag, an engineering association based in Germany, according to panelist Andreas Faath, managing director of the European group.

A global production language based on OPC UA, which VDMA is developing, may join these tools in the library in the future.

The open-source interoperability tools referenced by the panel (OPC UA, MQTT, the MTConnect companion specification) are already published on GitHub. “Everything that you need is there, and if you need help finding those steps, that’s what CESMII is here for; not to pick sides in a protocol war, but to help you find the steps from where you are to where you want to go,” said Faath.
