
Vision and Robotics: Happy Together

By Bruce Morey, Senior Technical Editor, SME Media

The next “dynamic duo” may not involve humans at all. “Machine vision and robots make for a perfect marriage,” stated Klas Bengtsson, global product manager, vision systems for ABB Robotics (Auburn Hills, MI). This is not new. Vision and robotics have gone hand in hand for years. But unlike other marriages, this one is thriving as machine vision is expanding in capability and finding new applications. The “perfect” marriage of the past involved guiding the robot. But new applications—like children from a good marriage—now include inspecting parts, reading bar or QR codes for traceability, and finding new ways to pack and unpack parts.

ABB’s YuMi collaborative robot, or cobot, features vision as an integral part of each system delivered, with cameras in the hands for maximum flexibility.

It all boils down to the flexibility machine vision provides. “Vision has helped ABB enhance the capabilities of many robotic applications,” he said.

The need for flexibly guiding robots has been driven by increasing product diversity and the need to shorten lead times in many industries. Why is flexibility so important? “If you hardcode the robotic automation, the minute you change a product or something in your production line you must change the program,” Bengtsson said. This also means changing precision fixturing or the way to place a part so that a hardcoded robot guidance program can reliably pick, polish or otherwise operate on the part. Costs add up.

Machine vision helps contain those costs by allowing the robot to “see” variations in parts and orientation. 3D scanning sensors are opening even more new applications. Said Bengtsson, “The question 15 years ago was ‘Why use machine vision?’ The question today is ‘Why not?’”

Humans and Robots Collaborate

Rethink Robotics (Boston), founded in 2008, builds collaborative robots, or cobots. Its flagship product is Sawyer. “A cobot like our Sawyer robot is best used performing repetitive, mundane tasks that are human scale,” said Mike Fair, product manager. This means working with products or movements that replicate what a human does when the work is numbingly repetitive and mundane.

“Applications like this include packaging, machine tending, line loading and unloading, inspection, and printed circuit board testing,” he said. Material handling for CNC machines is another application, replacing a human who loads a CNC machine, cycles it, waits for the machining task to finish, and then unloads it.

Rethink Robotics includes a camera with each of its Sawyer robots, and easy-to-use programming in its Intera Robot Programming package.

Rethink believes in machine vision so much that it embeds a camera in the wrist of each Sawyer. “The embedded vision system [makes] our cobot faster and easier to deploy, removing hassle and costly integration time of typical machine vision systems. We add software to make it easy to deploy [those] vision systems,” said Fair. At the ATX West 2018 trade show, he provided a demonstration proving that even a journalist can program a simple task in a few minutes. In conjunction with coded stickers that look like QR codes, the software helps the robot position itself to adapt to variability in the workcell. This also facilitates moving Sawyer robots from task to task and location to location.
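To make the idea concrete, here is a minimal sketch of marker-based workcell localization, assuming OpenCV 4.7+ and its ArUco fiducials rather than Rethink’s actual markers, which are proprietary. Detecting a coded sticker in the wrist camera’s image and solving for its pose tells the robot where it sits relative to a known reference in the workcell; the marker size and camera intrinsics below are placeholder values.

    import cv2
    import numpy as np

    MARKER_SIZE = 0.05  # marker edge length in meters (assumed)

    # Camera intrinsics from a prior calibration (placeholder values).
    K = np.array([[900.0, 0.0, 640.0],
                  [0.0, 900.0, 360.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)  # assume negligible lens distortion

    def locate_marker(image):
        """Return (rvec, tvec) of the first detected marker relative to
        the camera, or None if no marker is visible."""
        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
        detector = cv2.aruco.ArucoDetector(dictionary)
        corners, ids, _ = detector.detectMarkers(image)
        if ids is None:
            return None
        # Marker corner positions in its own frame, clockwise from top-left.
        half = MARKER_SIZE / 2.0
        obj_pts = np.array([[-half, half, 0], [half, half, 0],
                            [half, -half, 0], [-half, -half, 0]],
                           dtype=np.float32)
        ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(-1, 2),
                                      K, dist)
        return (rvec, tvec) if ok else None

Once the marker pose is known, the same rigid transform can be applied to every taught waypoint, which is what lets a relocated cobot resume work without reteaching.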

Universal Robots USA Inc. (UR; Ann Arbor, MI) is also dedicated to collaborative robots. Its three robots, ranging in payload from 3 to 10 kg, operate within an “ecosystem” of third-party providers. It offers a free developer program called Universal Robots+ to enable third-party providers of end effectors, vision systems, and other peripherals to develop add-ons that plug-and-play with its cobots.

Collaborative robots are easily moved by cart from station to station, and when equipped with vision can operate more flexibly.

The company’s website shows six vision options within this ecosystem, suited for pick and place, CNC machine automation, packaging and palletizing, assembly, screw driving, and machine tending. Both 2D and 3D scanners are offered, along with several robot motion control and simulation software packages.

Integration, Ease of Use

For machine vision to be useful, it must be accessible, an important element of UR’s ecosystem. For example, one of its third-party packages, Universal Metrology Automation from 3D Infotech, integrates motion control systems, 3D scanners, and inspection software into a configurable, automated metrology solution. Other programs provide online or offline programming, robotic simulation software, and monitoring capabilities.

“Today there is much better integration with robots and vision systems,” said Rob Antonides, president of Apex Motion Control (Surrey, BC). Apex is an integrator that has taken full advantage of UR’s ecosystem approach. “We used to have to hack together vision systems with robot controllers, but today they are more seamless plug-and-play, over EtherNet/IP or any kind of networking. Twenty years ago, it used to take me a week to set up a vision-guided robotic system and the communications; today it is 15–20 minutes.”

Machine vision has proven especially useful for Apex. Tasks it has automated include a variety of baking operations, including decorating cakes. “Vision systems allow you to deal with inconsistencies and less-than-perfect presentation and positioning of product. We map the surface of a cake with a 3D sensor so we can decorate it quickly and easily,” said Antonides.
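As a rough illustration of that surface-mapping step (a sketch under assumed parameters, not Apex’s actual software), the following turns a height map from a 3D sensor into a serpentine tool path that holds a decorating nozzle at a fixed standoff above the measured surface:

    import numpy as np

    def raster_path(height_map, pixel_pitch_mm, standoff_mm=5.0, row_step=10):
        """Sweep the surface row by row, keeping the nozzle a fixed
        standoff above the measured height at each point."""
        waypoints = []
        rows, cols = height_map.shape
        for r in range(0, rows, row_step):
            # Serpentine sweep: alternate direction to minimize travel.
            span = range(cols) if (r // row_step) % 2 == 0 \
                else range(cols - 1, -1, -1)
            for c in span:
                x = c * pixel_pitch_mm
                y = r * pixel_pitch_mm
                z = height_map[r, c] + standoff_mm
                waypoints.append((x, y, z))
        return waypoints

    # Example: a flat 100 x 100 mm surface sampled at 1 mm per pixel.
    surface = np.zeros((100, 100))
    path = raster_path(surface, pixel_pitch_mm=1.0)

A real cake is anything but flat, which is exactly why the z coordinate comes from the sensor rather than from a fixture drawing.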

FANUC based its design for the new (green) compact collaborative robot series on its LR Mate series of mini material handling robots. The robots are ideal for small part sorting and assembly, inspection, machine tending and part delivery.

FANUC also emphasizes easy integration. “We released a brand-new controller at the end of last year, featuring prominent functions for integrated robotic vision,” explained David Dechow, staff engineer-intelligent robotics/machine vision for FANUC America Corp. (Rochester Hills, MI). “At FANUC, all of our vision systems are now embedded in the controller.” FANUC was one of the first robot companies to include vision for guiding robots, offering it as an option some 30 years ago.

Dechow also observed that machine vision companies have concentrated on developing easier interfaces with robots. “The vision suppliers are noticeably supporting this market,” he stated. “The heavy lifting in getting collaboration to work is on the vision side. The robot is expecting a point. The trick is using machine vision to find the right point, merging the coordinate system of the machine vision system with the coordinate system of the robot.” He agreed with others interviewed for this article that machine vision provides incomparable flexibility in many automation tasks. Today, the cost/benefit ratio is beginning to favor vision over precision fixturing and hardcoding, he said.
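For the common planar case, that coordinate merge can be as simple as a homography fitted between a few points the camera sees and the robot positions recorded at those same points. The sketch below assumes a fixed overhead camera; all calibration values are illustrative.

    import cv2
    import numpy as np

    # Pixel locations of four reference features found by the camera...
    pixels = np.array([[100, 120], [980, 110], [990, 700], [95, 710]],
                      dtype=np.float32)
    # ...and the robot XY positions (mm) recorded by jogging to each one.
    robot_xy = np.array([[200, 50], [400, 50], [400, 250], [200, 250]],
                        dtype=np.float32)

    # Homography mapping the image plane to the robot's work plane.
    H, _ = cv2.findHomography(pixels, robot_xy)

    def pixel_to_robot(u, v):
        """Map an image point to robot work-plane coordinates (mm)."""
        p = H @ np.array([u, v, 1.0])
        return p[0] / p[2], p[1] / p[2]

    # A part located by vision at pixel (540, 400) becomes a robot target.
    print(pixel_to_robot(540, 400))

Full 3D hand-eye calibration is more involved, but the principle is the same: a one-time calibration turns every vision result into a point the robot understands.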

AI: Posing and Discrimination

A discussion of vision, data, and robots naturally leads to the question of how useful artificial intelligence (AI) would be for sighted robots. Dechow noted a catch: AI is not particularly applicable to robotic guidance. “AI learning is exceptionally good at categorization and differentiation. What it is not good at is discrete analysis,” he explained. A robot needs to move to a discrete point, and when combined with metrology systems, it must measure discrete geometric quantities. “There’s nothing to learn,” he said.

But it doesn’t end there. Even if AI is not applicable to guidance, there are robotic tasks where it offers value. “For example, identifying and classifying an object against its background to determine the object the robot is to operate on, combined with robotic guidance, would be a good hybrid application,” he said. Another is pick and grip optimization. “Finding the ideal orientation of the arm in space and the approach path falls into a learning domain rather than [being] deterministic. There are many orientations that are both good and bad,” he explained. “AI is especially useful where parts are random and not homogeneous in placement and size.”

Peter Cavallo of Denso Products and Services Americas Inc. (Long Beach, CA) agreed that the substantial increase in vision system capability is both real and has opened new opportunities for robotic applications. “In its simplest form, [vision] finds a part or object and operates on it. In its most complex form, which we are seeing now, it can see things, interpret, recognize, and see the part for what it is—even in different configurations or positions—without moving the part or the robot.”

Denso robot with Canon structured white-light sensor.

He sees 3D scanning as an especially important development, making previously difficult tasks such as bin picking straightforward. Denso’s latest system uses a structured light vision system to see in 3D with a CMOS detector.

His view of applying AI to robotics is based on a wider system context. “As we move into the next generation of robots, we are starting to get into things like deep learning, where vision becomes part of an entire system,” he explained. “At the Tokyo robot show, we demonstrated deep learning in a system with 28 degrees of freedom that included two robots with hands that put together a salad.” Why should industrial users be impressed? The robot selected lettuce, fruit, and vegetables using only the knowledge provided beforehand. “We did not program the robot; instead, we told it what we wanted,” he said. Think of the number of unstructured tasks on the shop floor that would be ideal for such automation.

Not Just for the Big Guys

Machine vision, integration, software, even AI may make one think that this level of robotic automation is only for Fortune 500 companies. That’s not the case. “It is highly applicable to small and medium-sized businesses because vision today is reasonable in cost. What used to be $50,000–$100,000 is now less than $5000. There is no reason to be shy,” Cavallo said.

Rick Brookshire, product manager for Epson America Inc. (Carson, CA), agreed that robotic automation is ideal for small-to-medium enterprises (SMEs). “In addition to larger customers that purchase 50, 100 or even 1000 robots, we have many smaller customers that purchase one, two, or three robots,” he said. Epson specializes in smaller robots optimized for high-precision tasks, growing its robot business from its own needs in building high-precision timepieces. “All of our robots are available with integrated vision guidance, a feature that has been growing in acceptance. It is also becoming more reliable. People are more willing to use it,” he said.

Epson G6-SCARA robot with camera.

Like others interviewed for this story, he believes simplicity is vital, especially in meeting the needs of SMEs. The Epson Vision Guide program uses an object-based, point-and-click GUI, allowing developers to quickly build applications, according to the company. And while the ideal vision-guided application involves parts that cannot be positioned within a reasonable tolerance or presented consistently, the precision that Epson robots provide opens another domain of use.

Tolerances are getting tighter. Parts are getting smaller. Precision placement is getting ever more difficult. “That is where precision robots such as Epson’s can help,” he said. “This is happening in smaller companies more and more. And these robots are getting so easy to use that many of our customers do the integration on their own.” The robots even place screws less than 0.7 mm in diameter. As the workforce ages, this may become a critical factor.

James Cooper, vice president of sales for Applied Manufacturing Technologies (AMT; Orion, MI), has a unique perspective based on his experience in the automotive manufacturing industry. AMT is a system integrator specializing in vision-guided robot systems. Multiple part numbers within complex assemblies, such as transmission cases, are becoming prevalent. A transmission case is a large object with many holes and complex surfaces, ideal for vision but expensive for precision hard fixturing. “Lot sizes are also getting smaller with more frequent changeovers. That is why vision-guided robot systems are becoming so critical in many applications.” He also observed the growth of 3D vision systems, especially in bin and tote picking applications.

It goes beyond guidance. “Quality inspection is an ideal application, looking for the presence or absence of holes and other features, or determining if a component has been properly assembled,” he said. Another rich area is metrology inspection. “We built a system with our partner Hexagon that is based on a collaborative robot that inspects a customer’s product. This used to be done on a CMM in a separate temperature-controlled room. For many years manufacturers wanted to do it on the production floor. It is now possible to have a completely portable system, with no guarding, ideal for first article inspection applications, right at the production machine,” he said.

He agreed that these applications are growing in SMEs. He also noted that these operations are struggling to find skilled labor, driving the need for more automation.

ABB has also developed a robotic inspection system through a collaboration with NUB3D (which it subsequently acquired), a leading innovator of digital, 3D inspection and quality-control solutions. The turnkey system consists of a 3D white-light scanning sensor mounted to the arm of an ABB robot.

Stäubli Corp. (Duncan, SC) is a robot company that uses machine vision from a variety of third-party vendors to build vision-guided robot applications. But it also offers another use of vision for industrial robots that need human interaction: a virtual safety fence around the robot.

Stäubli TX2-60L with area-sensing camera.

As explained by Olivier Cremoux, business development manager, North America robotics for Stäubli, man/robot collaboration can be binned into five stages. The lowest, stage 1, common in most industrial robots, is no direct contact, where any interaction between man and machine is dangerous. “The robot and operator are separated by hard fences,” he said. The highest, stage 5, is where contact is desired and required, for example when an operator hand-guides the robot to carry or position a heavy tool, or to perform other simultaneous motions with it. There are stages in between where some contact is desired at different levels, and where the robot often needs to stop and let the human do something. “This in-between represents 95% of collaborative applications,” he said.

In these intermediate stages, a 3D scanning vision system that scans outwards from the workcell can detect a human approaching. Within a certain distance, it signals the workcell to slow the robot down to prevent contact and possible injury. As the human approaches even closer, it stops the cell entirely. This allows the human to perform an action, such as inspection or placement, then move away, allowing the cell to automatically restart.
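In outline, that zone logic reduces to distance thresholds, as in the minimal sketch below. The threshold values are illustrative only; real installations use safety-rated scanners and controllers and follow standards such as ISO/TS 15066.

    SLOW_ZONE_M = 2.0  # approaching human: reduce speed (assumed threshold)
    STOP_ZONE_M = 0.8  # human within reach: stop the cell (assumed threshold)

    def cell_command(nearest_human_distance_m):
        """Map the closest detected human distance to a cell speed command."""
        if nearest_human_distance_m < STOP_ZONE_M:
            return "STOP"        # let the operator work; restart on exit
        if nearest_human_distance_m < SLOW_ZONE_M:
            return "SLOW"        # reduced speed to prevent injurious contact
        return "FULL_SPEED"

    # Example readings from a 3D area scanner (meters).
    for d in (3.5, 1.5, 0.5, 3.0):
        print(d, "->", cell_command(d))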

HNJ Solutions Inc. (Carson, CA) is an integration company founded to provide vision systems for manufacturing, including with robots, according to President Greg McEntyre. While acknowledging that vision systems are indeed becoming easier to use and install, he says there remains a class of applications where integrators like HNJ are needed. “In complex applications with multiple colors, sizes, orientations, and general uncertainty, it is important to know how cameras work and why they work to install a good system,” he said.

He emphasized how 3D cameras are expanding relevant applications, such as bin picking. “Today we are even importing the CAD model of our objects so that the vision system can understand in 3D the object it needs to pick or operate on,” he said. “Cameras give complete freedom.”
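As an illustration of that CAD-matching idea (a sketch using the open-source Open3D library, not HNJ’s specific toolchain), the following aligns a point cloud sampled from a part’s CAD mesh to a 3D camera scan with ICP; the recovered transformation is the part’s pose for the robot to pick. File names and the initial guess are placeholders, and a real pipeline would seed ICP with a global registration step.

    import numpy as np
    import open3d as o3d

    # Sample a point cloud from the part's CAD mesh (hypothetical file).
    mesh = o3d.io.read_triangle_mesh("part_model.stl")
    model = mesh.sample_points_uniformly(number_of_points=5000)

    # Point cloud captured by the 3D camera (hypothetical file).
    scan = o3d.io.read_point_cloud("bin_scan.pcd")

    # Refine an initial pose estimate with point-to-point ICP.
    init = np.eye(4)  # identity guess; global registration would supply this
    result = o3d.pipelines.registration.registration_icp(
        model, scan, max_correspondence_distance=0.01, init=init,
        estimation_method=o3d.pipelines.registration
            .TransformationEstimationPointToPoint())

    print(result.transformation)  # 4x4 part pose for the robot to pick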

What advice might he offer, as an integrator of complex vision and robotics? “While it may not be as easy as it looks, I would say just jump in,” he said. “There is so much useful hardware that vision guidance and vision applications are becoming ever more useful.” In other words, just do it.
