My original intention for this column was to discuss a phrase getting a lot of buzz lately, artificial intelligence (AI). By any measure, interest in AI is expanding exponentially, both in the number of articles one can read on the subject and, according to Google Trends, the number of searches for those articles.
There are other forms of AI, such as design and shape optimization, generative design, and predictive analytics—to name a prominent few—and one might get the impression that intelligent workers should feel threatened. Some experts say intelligence of the artificial kind may well replace us all.
I don’t think it will. I do think it will aid us greatly, enhancing rather than replacing us.
To understand why, consider that what people are calling AI is just mathematics. Yep, you heard right, it’s just digits and numbers combined with some amazingly complex functions and calculations. I have some experience here: earlier in my career I dabbled in this area, programming a simple neural net, one of the basics behind many forms of AI finding wider acceptance today. I did this to better understand how to use AI in image analysis, and I came to realize a few things.
The first was that the form of AI that uses vast amounts of data to replicate “human” action is just a form of statistical modeling and regression analysis—albeit an incredibly extreme form. Data, lots of it, is needed to train the algorithm. The same is true, using different mathematics, for optimization or generative design. AI is just very complex mathematics packaged in an easier-to-use form.
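To make the point concrete, here is a minimal sketch of what a single artificial neuron does (this is an illustration, not the code I wrote years ago): it learns the logical AND function by nudging three numbers—two weights and a bias—toward values that reduce its prediction error. That is the entire "intelligence": repeated arithmetic, essentially logistic regression.

```python
import math

def sigmoid(x):
    # Squash any number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Training data: the logical AND function
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w1, w2, b = 0.0, 0.0, 0.0  # weights and bias, initialized to zero
lr = 0.5                   # learning rate

# Gradient descent: adjust the numbers to shrink the error
for _ in range(5000):
    for (x1, x2), target in data:
        pred = sigmoid(w1 * x1 + w2 * x2 + b)
        err = pred - target      # how wrong the neuron is
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b  -= lr * err

# After training, the neuron reproduces AND
for (x1, x2), target in data:
    print((x1, x2), "->", round(sigmoid(w1 * x1 + w2 * x2 + b)))
```

A deep neural net is this same idea stacked and repeated millions of times over vastly more data, which is why so much of it is training data and computing power rather than anything resembling understanding.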
Another realization was how long we had to wait to get an OK answer 20 years ago. This is in stark contrast with the ease with which much more difficult problems are being solved today thanks to powerful computing. Cheap, ubiquitous sensors provide vast data sets. You don’t have to program AI functions either, like I did. You can buy or rent models, like “deep neural nets.” IBM’s Watson and the MathWorks Neural Network Toolbox are just two examples. You provide the data and get your own, unique AI solution. Today, AI is making robots on the shop floor more versatile and helping engineers design parts through optimization.
But mathematics, sensors, and data all remain tools to be shaped by the desires of engineers and workers. They might perform individual tasks better than humans, but which tasks to perform and why will always remain in the realm of humans. Artificial intelligence is empty of purpose. That will always be our job.
Finally, fancy mathematics such as AI in its various forms is not always the right tool. This is where we get to the “old” part of this column’s title. This hit home when I was researching an article for this month’s ME on optical comparators, a metrology technology whose roots date back to the 1920s. Although most models today are updated with rudimentary digital electronics, the basic idea remains the same—magnify a shadow of a part on a large screen and have a human make a measurement. You find these machines everywhere in manufacturing, with varying degrees of digital improvement to the basic idea, up to and including machine vision. One of my sources told me his company’s service recently refurbished a 50-year-old model.
It is this coexistence of futuristic technology like AI with the tried-and-true that is the ongoing reality of manufacturing and engineering. Perhaps some of this is simple reluctance on the part of engineers to fully embrace new technology. As the same source told me, he sometimes needs to emphasize that newer technologies, even in optical comparators, are available. Still, technologies like optical comparators and micrometers and calipers, and the humans who use them, are not likely to be fully replaced by anything fancier.