AI’s learning ability is moving to the compute edge. And manufacturers will be major beneficiaries.
I realized the usefulness of being able to run AI on limited processing power while working with NASA in 2010. My Boston University colleagues and I created a small brain emulation to control a Mars rover-like device with AI that needed to be capable of running at the edge and learning new things independently of any computing power available on Earth. Issues like data bottlenecks and latency made it vital to explore different breeds of AI. Traditional Deep Neural Networks (DNNs) were (and still are) incapable of learning without huge amounts of processing power, data and time.
We created a new breed of AI: Lifelong Deep Neural Network (Lifelong-DNN). It can learn throughout its lifetime, not just prior to deployment. Little did we know that this AI would become more useful on Earth, in manufacturing, than on Mars.
Mother Nature gave us “portable brains” that enable us to learn quickly and continuously. So it’s only natural that this AI paradigm would be important beyond robotics, as more devices, sensors and data enter our lives.
This is particularly true in manufacturing. IMA Group, for example, has factories filled with cameras, IoT devices and sensors that help in the design and manufacture of automation equipment for the processing and packaging of pharmaceuticals, cosmetics, food and beverages. But with these devices comes a huge stream of data in need of processing in situ.
The Industry 4.0 movement is predicated on the ability of manufacturers like IMA to leverage and mine huge amounts of data produced in modern industrial settings—a process that utilizes edge learning and technologies originally designed for Mars exploration.
Take quality-control cameras: IMA’s machines are equipped with dozens of cameras processing 60 frames per second and inspecting thousands of products per hour. Just as it was impossible to ship all of the raw Mars rover data to Earth for processing, it would make no sense for IMA’s cameras to ship tons of useless frames to a centralized cloud for AI processing. Only the most important frames need to go, after processing at the edge.
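The filtering idea can be sketched in a few lines. This is purely illustrative, not IMA's actual pipeline: `score_frame`, the pixel-based anomaly score and the threshold value are all invented stand-ins for a real on-device defect model.

```python
# Hypothetical edge filter: score each frame locally and upload only
# the frames that cross an alert threshold. The score here is a crude
# stand-in (fraction of near-saturated pixels), not a real defect model.
ALERT_THRESHOLD = 0.8

def score_frame(frame):
    # Fraction of pixel values outside the expected mid-range.
    bad = sum(1 for px in frame if px < 10 or px > 245)
    return bad / len(frame)

def frames_to_upload(frames):
    # Keep only the frames worth sending to the cloud.
    return [f for f in frames if score_frame(f) >= ALERT_THRESHOLD]

clean = [128] * 100        # mid-range pixels: score 0.0
defective = [255] * 100    # saturated pixels: score 1.0
print(len(frames_to_upload([clean, defective, clean])))  # 1
```

Of three captured frames, only the one that looks defective would leave the device; the rest are discarded at the edge.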
Making AI algorithms run on small edge computers has an important subtlety. There are at least two processes at play: inference, the "predictions" generated by an edge device, and edge learning, that is, using newly acquired information to change, improve, correct and refine the edge AI.
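The two processes can be made concrete with a toy model. The sketch below uses a nearest-class-mean classifier, a deliberately simple stand-in (the class name and methods are invented, not any real Lifelong-DNN API): `predict` is inference, and `update` is edge learning, folding each new labeled sample into the model without a retraining pass.

```python
# Illustrative nearest-class-mean model separating the two processes:
# predict() = inference, update() = edge learning (incremental refinement).
class EdgeModel:
    def __init__(self):
        self.means = {}  # label -> (feature sums, sample count)

    def predict(self, x):
        # Inference: return the label whose running mean is closest to x.
        def dist(label):
            s, n = self.means[label]
            return sum((xi - si / n) ** 2 for xi, si in zip(x, s))
        return min(self.means, key=dist)

    def update(self, x, label):
        # Edge learning: fold the new sample into that label's mean.
        s, n = self.means.get(label, ([0.0] * len(x), 0))
        self.means[label] = ([si + xi for si, xi in zip(s, x)], n + 1)

m = EdgeModel()
m.update([0.0, 0.0], "good")
m.update([1.0, 1.0], "defect")
print(m.predict([0.9, 0.8]))  # defect
```

The point is the division of labor: inference is cheap and constant-time, while each learning step touches only the running statistics for one label, which is why both can fit on a small edge computer.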
One barrier to smart factory adoption is AI’s inflexibility and lack of adaptability. AI algorithms can be robust if all the data is captured in the training process beforehand. But that is not how the industrial world works. Training traditional DNNs requires a ton of data, computing power and time, making them unsuitable for fast learning and reconfiguration. Approaches like Lifelong-DNN let AI-powered edge computers understand the data coming to them and adapt.
For a manufacturer like IMA, edge learning lets its cameras learn new product types and defects in a real-world scenario where new items are constantly introduced and previously unseen defects show up on the production floor. There is no need to "reprogram" the AI from scratch.
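Here is a minimal sketch of what "no reprogramming" can mean in practice. It is not IMA's system or the Lifelong-DNN itself; the class names and feature vectors are invented. A simple nearest-neighbor inspector gains a brand-new defect class from a single enrolled example, with no retraining pass over old data, whereas a traditional DNN with a fixed output layer would need to be retrained.

```python
# Hypothetical incremental inspector: enrolling one example of a new
# class makes that class immediately available at inference time.
from collections import defaultdict

class IncrementalInspector:
    def __init__(self):
        self.examples = defaultdict(list)  # class -> feature vectors

    def add_example(self, cls, features):
        # One-shot enrollment: just store the example.
        self.examples[cls].append(features)

    def classify(self, features):
        # 1-nearest-neighbor over everything enrolled so far.
        def best_dist(cls):
            return min(sum((a - b) ** 2 for a, b in zip(features, v))
                       for v in self.examples[cls])
        return min(self.examples, key=best_dist)

insp = IncrementalInspector()
insp.add_example("cap_ok", [1.0, 0.0])
insp.add_example("cap_cracked", [0.0, 1.0])
# A previously unseen defect appears on the production line:
insp.add_example("cap_missing", [0.0, 0.0])
print(insp.classify([0.1, 0.1]))  # cap_missing
```

The new "cap_missing" class is recognized the moment its first example is enrolled, which is the behavior the paragraph above describes, captured in miniature.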
AI that learns at the edge can finally serve its purpose: shifting intelligence to where it is needed, at speeds, latencies and costs that make it affordable. Moreover, it helps manufacturers like IMA stay competitive on the global stage.