Automation remains a focus in manufacturing—for all the obvious reasons—and robot vendors continue to introduce impressive new capabilities. Many would echo Zach Spencer, robotics automation manager at Methods Machine Tools Inc., Sudbury, Mass., when he says the “big, exciting thing right now is collaborative robots. Cobots are innovative automation systems that open up new machining options and increase your return on investment. Some customers have paid for cobots in less than three months.”
Spencer added that cobots are popular in large part because of the intuitive way they can be programmed. “Instead of having to go through the teach pendant to program the robot to move to a specific point, you can manually drag the robot and create a set point in the teach pendant. Then you drag the robot to the next position and designate set point two. That’s one of the better technologies with collaborative robots.”
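The drag-to-teach workflow Spencer describes — hand-guide the arm to a position, record a set point, repeat — boils down to collecting waypoints and replaying them in order. The sketch below is purely illustrative (the class and method names are not any vendor's API); a real cobot controller would read the joint angles from the arm's encoders when the operator presses "record."

```python
# Illustrative sketch of drag-to-teach waypoint recording.
# All names are hypothetical; a real teach pendant reads joint
# angles from the arm's encoders at the moment of recording.

class TeachPendant:
    """Collects set points as the operator hand-guides the arm."""

    def __init__(self):
        self.waypoints = []

    def record_set_point(self, joint_angles):
        """Store the arm's current joint angles as the next set point."""
        self.waypoints.append(tuple(joint_angles))
        return len(self.waypoints)  # set point number shown to the operator

    def build_program(self):
        """Turn the recorded set points into an ordered move list."""
        return [("move_to", wp) for wp in self.waypoints]


pendant = TeachPendant()
pendant.record_set_point([0.0, -90.0, 90.0, 0.0, 90.0, 0.0])   # set point 1
pendant.record_set_point([30.0, -75.0, 80.0, 0.0, 90.0, 0.0])  # set point 2
program = pendant.build_program()
print(len(program))  # 2
```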
Cobot makers have also revolutionized automation by facilitating integration with third-party accessories, including special grippers, vision systems, welding heads, and operational software. Ann Arbor, Mich.-based Universal Robots is a leader in this area, with over 300 products in its UR+ program and hundreds more in the pipeline. Senior Manager of Applications Joe Campbell calls UR+ an “app store for Universal Robots. We give the developers deep access [to our operating system], so they can develop software. And then we validate and test it and certify it.”
Advanced Bin Picking
The ability of a robot to pick up parts from a bin, even if they’re randomly oriented, isn’t new. But vendors like Universal Robots make such systems easier to implement. Historically, Campbell explained, you’d have to spend weeks writing code to cover all the different scenarios that arise in such situations. But UR’s ActiNav solution already incorporates “a large chunk of code that was developed over a long period of time that strips away all the programming that’s traditionally required to define how you’re going to get from the pickup location to the put down location. … With ActiNav, you teach it the part … teach it the drop off location, … define the bin, which is basically touching it in four points, … and define any other barriers in the space. And then the system will completely plan its own robot path and trajectory to get from whatever pickup point it identifies to a placement point, with no custom programming.”
Campbell added that “teaching the part” to ActiNav is as simple as importing a CAD file of the workpiece and then designating the surfaces where the cobot should grip it. The part doesn’t have to be simple; it just has to have “surfaces that allow you to get an accurate pick. It’s very straightforward. I’ve watched guys take a brand new part that the system has never worked on before and set it up in about two hours.” What’s more, Campbell said, unlike the e-commerce solutions often associated with random bin picking, ActiNav is precise enough to auto-load a machine tool.
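The setup sequence Campbell walks through — teach the part from a CAD file, teach the drop-off, define the bin by touching four points, define any barriers, then let the system plan its own path — amounts to a configure-then-plan flow. The following is a schematic sketch only; all names are invented, and the real system does the motion planning internally with no custom programming.

```python
# Schematic sketch of the bin-picking setup flow described above.
# Field names and file names are illustrative, not ActiNav's interface.

from dataclasses import dataclass, field

@dataclass
class BinPickSetup:
    part_model: str = None                 # CAD file of the workpiece
    grip_surfaces: list = field(default_factory=list)
    drop_off: tuple = None                 # taught placement pose
    bin_corners: list = field(default_factory=list)  # four touched points
    barriers: list = field(default_factory=list)     # other obstacles

    def is_complete(self):
        """All four setup steps done; planning can take over from here."""
        return (self.part_model is not None
                and bool(self.grip_surfaces)
                and self.drop_off is not None
                and len(self.bin_corners) == 4)


setup = BinPickSetup()
setup.part_model = "workpiece.step"                      # teach the part
setup.grip_surfaces = ["flange_face", "shank_od"]        # where to grip
setup.drop_off = (450.0, 120.0, 300.0)                   # teach drop-off
setup.bin_corners = [(0, 0), (600, 0), (600, 400), (0, 400)]  # touch 4 points
print(setup.is_complete())  # True
```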
As you might have guessed, ActiNav requires a 3D vision system in order to orient the cobot for each pick-up. In this case, it’s a sensor from UR+ partner Photoneo, Bratislava, Slovakia. The unit is mounted above the bin and generates a detailed image almost instantly, said Campbell. (The system captures 3.2 million 3D points per scan.) Decision and load time is likewise fast, and Campbell sees jobs with a cycle time of roughly 30 seconds as the ideal fit for the product. That pace is too fast for one operator to tend multiple machines, yet comfortably within ActiNav’s processing time; on cycles much shorter than 30 seconds, however, ActiNav would slow the machine. Of course, there are other situations in which you’d gladly accept a slower cycle time, such as automating a night shift that would otherwise be unproductive.
Smart bin picking isn’t limited to cobots, and Spencer of Methods Machine Tools provided a real-world example in which FANUC robots grab forged aluminum torch bodies out of a bin and load them into a FANUC RoboDrill for machining. The key is FANUC’s iRVision system, which creates a 3D point cloud of the objects in the bin. “And that information is related to the robot so it knows which angle to approach the part at to be able to pick it out accurately.”
Spencer pointed out that forging produces a parting line where the two dies meet, and the line isn’t always in the same spot. The complex, contoured features of the part presented another challenge. But like ActiNav, “you upload a 3D model into the software, and you tell it this is the shape we’re looking for,” Spencer explained. “FANUC’s software accounts for variations in size, and it’s excellent at letting you set limits on those variations.” The Midwest manufacturer of these torch bodies went from hand-loading giant rotary transfer machines to an automated cell of four RoboDrills serviced by two robots. As a result, they now produce the parts “with improved throughput, more uptime, and a much lower scrap rate than they did on the transfer machines.”
A more recent example Spencer related is auto-loading large axles for construction equipment from wooden skids into heat treatment and then into a machine. The axles differ in length, diameter, and weight (from about 200 to 500 lb), and the lot sizes are small. Even the height of the wooden skids varies, Spencer added. Methods created a system that enabled the operator to simply move a 4 × 4' (1.22 × 1.22-m) pallet of diverse axles—all standing vertically, flange side down—into the cell with a forklift. Then FANUC software and 2D vision cameras, oriented for a side view, determine how far down the grippers need to be positioned to grab each part, while an overhead camera determines the correct position on the floor (in X and Y). Spencer said the vertical positioning is “the big technology improvement. … Actually being able to take a picture with a robot and take a measurement off that picture [is a big improvement.]”
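“Taking a measurement off a picture,” as Spencer puts it, reduces to calibrating a pixel-to-millimeter scale from a target of known size and then converting pixel distances in the image into real-world distances. The sketch below shows only that core idea; the calibration numbers and axle dimensions are invented for illustration, and FANUC's actual software does considerably more.

```python
# Minimal sketch of measuring off a 2D camera image: calibrate a
# scale from a target of known size, then convert pixel distances
# to millimeters. All numbers are illustrative only.

def calibrate_scale(known_length_mm, measured_pixels):
    """Millimeters per pixel, from a calibration target of known size."""
    return known_length_mm / measured_pixels

def pixels_to_mm(pixel_distance, mm_per_pixel):
    """Convert a distance measured in the image to real-world units."""
    return pixel_distance * mm_per_pixel

# Calibration: a 100 mm reference bar spans 500 pixels in the side view.
scale = calibrate_scale(100.0, 500)          # 0.2 mm per pixel

# The top of an axle appears 1850 pixels below the image reference line,
# which tells the robot how far down to position the grippers.
gripper_drop_mm = pixels_to_mm(1850, scale)  # 370.0 mm
print(gripper_drop_mm)
```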
Other Vision System Improvements
From Campbell’s perspective, ActiNav is an example of how today’s robotics “hide incredibly complex” technology from the user. In a similar vein, Mika Laitinen, solution sales director for Fastems Oy AB, Tampere, Finland, and West Chester, Ohio, talked about how the task of capturing X and Y coordinates with a camera is seemingly simple, but might actually require advanced imaging algorithms. For example, he pointed to U.S. camera manufacturer Cognex, whose latest smart camera release uses “neural network calculation for certain imaging algorithms.” He added that such systems turn a formerly difficult task like optical character reading into a generic, robust capability. The computing capacity of these systems is improving such that “artificial intelligence is not a separate topic from robotics. … Nowadays, artificial intelligence is increasingly embedded in smart sensors and smart camera systems.”
David Bruce, engineering manager for the general industry and automotive segment, FANUC America Corp., Rochester Hills, Mich., said the company’s iRVision systems have built-in artificial intelligence (AI) error proofing “without any additional hardware.” iRVision features robotic guidance based on 2D or 3D machine vision, and “FANUC’s highly reliable robot controller” handles both the robot motion and “the vision processing, including the AI Error Proofing function.”
“Since iRVision does not use a PC or smart camera, it does not negatively impact the reliability of a workcell,” Bruce continued. “By providing multiple examples of good parts and bad parts, the AI Error Proofing tool differentiates between the two during production runs. During setup, the operator can present multiple examples of workpieces and classify them into two categories—good and bad. Once the operator classifies the images, the AI Error Proofing feature automatically classifies the parts during production runs.”
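The workflow Bruce describes — show the system labeled examples of good and bad parts, then let it classify on its own during production — is, at its core, supervised binary classification from examples. The toy version below uses a nearest-centroid rule on a numeric feature vector purely to illustrate the idea; it is emphatically not FANUC's algorithm, and the feature values are invented.

```python
# Toy illustration of example-based good/bad classification using
# nearest centroids on a feature vector. FANUC's AI Error Proofing
# uses its own methods; this sketch only shows the general idea.

def centroid(samples):
    """Component-wise mean of a list of equal-length feature tuples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def sq_dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train(good_examples, bad_examples):
    """Setup phase: operator presents classified examples of each kind."""
    return centroid(good_examples), centroid(bad_examples)

def classify(features, good_c, bad_c):
    """Production phase: assign the nearer class to a new part."""
    return "good" if sq_dist(features, good_c) <= sq_dist(features, bad_c) else "bad"


# Features might be, e.g., (edge count, mean brightness) from an image.
good = [(10, 0.80), (11, 0.78), (9, 0.82)]
bad = [(4, 0.40), (5, 0.45), (3, 0.38)]
gc, bc = train(good, bad)
print(classify((10, 0.79), gc, bc))  # good
print(classify((4, 0.42), gc, bc))   # bad
```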
Fastems’ Laitinen surmised that LiDAR (light detection and ranging) technology will eventually supplant stereo vision systems (i.e., using multiple cameras to measure in X, Y, and Z). “2D cameras are small, handy, and easy to put anywhere you want. You can hide them in the robot wrist and get all the information. But if you want to use 3D vision, the equipment is much bigger, and might include additional laser profile scanners or whatever technology is used to measure the third coordinate.”
Conversely, he said, LiDAR units look no different than 2D cameras, yet capture 3D data. (Laitinen specifically referred to sensors from SICK AG, Waldkirch, Germany, and Minneapolis, Minn.) “Using LiDAR technology to measure the X, Y, and Z coordinates with only one camera would be some sort of revolution,” as Laitinen put it. Fastems hasn’t fielded such a system yet, but has undertaken feasibility studies with several customers.
New Robotic Metrology Capabilities
Using a robot to tend a CMM or measuring station is neither surprising nor new, but using a robot’s grippers to actually perform a measurement is. Campbell said New Scale Robotics, Victor, N.Y., a UR+ vendor, has developed a gripper that serves as a high-precision caliper, in addition to its pick-and-place function. And it’s accurate to within 2.5 µm.
Campbell said users “can either pick up the part and record the measurement while the part is grasped, or they can use the gripper to mic over the part, testing multiple locations before they acquire it, pick it up and move it. It’s a great example of embedding the metrology process right in the middle of the manufacturing process.”
New Scale Robotics, a division of New Scale Technologies, pairs the high-precision gripper with a UR3e cobot from Universal Robots to create the Q-Span Workstation. One customer, OptiPro Systems, Ontario, N.Y., uses the Q-Span for 100 percent in-process inspection of the optical glass cylinders coming out of its OptiSonic grinding machine. Parts that pass inspection go on to a CMM for final validation. Not only does this eliminate the manual checks the company previously performed, the Q-Span is also four times more accurate than manual calipers, resulting in tighter control over the manufacturing process and reduced scrap, according to the manufacturer. Campbell said the New Scale grippers also accomplish in one cycle what takes a human six steps. Plus, OptiPro can handle a variety of parts with the same set of grippers.
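An in-process check like OptiPro's ultimately comes down to comparing each gripper reading against the drawing tolerance and routing the part accordingly. The sketch below shows that decision logic; the nominal dimension and tolerance are invented (only the roughly 2.5 µm gripper accuracy comes from the article).

```python
# Sketch of in-process inspection with a caliper-style gripper:
# measure at several locations, pass the part only if every reading
# is within tolerance. Dimensions are invented for illustration;
# the gripper itself is accurate to about 0.0025 mm (2.5 um).

NOMINAL_MM = 25.000
TOLERANCE_MM = 0.010  # +/- band around nominal

def inspect(readings_mm, nominal=NOMINAL_MM, tol=TOLERANCE_MM):
    """Return 'pass' if all readings fall within nominal +/- tol."""
    if all(abs(r - nominal) <= tol for r in readings_mm):
        return "pass"   # route onward (e.g., to CMM final validation)
    return "fail"       # route to rework or scrap

print(inspect([25.003, 24.996, 25.001]))  # pass
print(inspect([25.003, 25.014, 25.001]))  # fail
```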
Data From the Robot
Laitinen observed that collecting servo data on the motion and health of the robot also offers two important benefits. The first is the ability to study trends for predictive maintenance, FANUC’s ZDT (zero down time) program being a notable example. The second is QC related, “collecting application-specific process data and distributing it to the upper level control software.” This applies to applications in which the robot is acting upon the workpiece, as opposed to tending another machine.
Laitinen said the aerospace industry offers good examples of both using robots in this way—e.g., finishing and linishing (a finishing technique that smooths or flattens metal) turbine components—and of using robot servo data to control the process. “People in the aerospace industry understand that rather than making parts and checking the quality afterwards, it’s better to make an effort to control the process,” he said. That approach builds an understanding of how the part is manufactured, which performance variables are key, and how those values vary during production. Done right, manufacturers use this data to make real-time corrections or, if necessary, automatically take a robot offline and reroute the work to other units before any bad parts are produced.
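This "control the process, not the part" idea — watch a key variable in real time, correct small drift, and pull a robot offline before it produces scrap — can be sketched as a simple control-limit monitor. The variable, thresholds, and actions below are illustrative assumptions, not Fastems' implementation.

```python
# Illustrative process monitor on a key variable from robot servo
# data: small deviations trigger a correction; a hard violation
# takes the robot offline so work can be rerouted to other units.

def monitor(samples, target, warn_band, stop_band):
    """Return one action per sample: 'ok', 'correct', or 'offline'."""
    actions = []
    for value in samples:
        deviation = abs(value - target)
        if deviation > stop_band:
            actions.append("offline")   # reroute work before bad parts
        elif deviation > warn_band:
            actions.append("correct")   # apply a real-time correction
        else:
            actions.append("ok")
    return actions


# Hypothetical contact-force readings (N) during a linishing pass.
forces = [20.1, 20.4, 21.2, 23.5]
print(monitor(forces, target=20.0, warn_band=1.0, stop_band=3.0))
# ['ok', 'ok', 'correct', 'offline']
```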
“That has been the success story of Fastems in the aerospace industry. Our control software can adapt to these kinds of changes in the production line without stopping production.” He added that the COVID-19 pandemic has forced the company to find customers outside of commercial aviation, and it is bringing this philosophy about “measuring the process and not the part” to the new customers.
It’s long been the case that some automated lines require one robot to hand a part over to another. And in welding, it’s sometimes the case that one multi-axis robot holds the part while another moves the torch, with both robots under the command of a single control. But it’s relatively rare for several robots to be moving the same part. Fastems recently installed just such a system in the U.S. The parts are giant rings of up to 2.4 m in diameter, with varying thicknesses and weights. Laitinen explained that lifting them with one robot would have required a very large and expensive three-point gripper, not to mention a very big robot, “and accessibility would have been very difficult.
“So instead, we use two robots on the same long linear track [which parallels the production line of machine tools, washing stations, etc.]. The robots move like they are one two-armed robot,” he said. “This is controlled by a single robot controller and both robots are instructed from one program.” All the pick-and-place tasks are done with simple two-finger grippers that grab the part from both ends, lift it up simultaneously, and move it.