
Robots in the Auto Plant: What Are They Up to Now?

 

With easier programming than ever before, today’s robots leap into the future with new abilities to see, feel and execute different tasks

 

By Ed Sinkora
Contributing Editor

In its latest commercial for the Cadillac SRX, General Motors shows a bevy of robotic arms happily dancing around the car as part of its summer sales event. Robotics have certainly come a long way from GM’s multibillion-dollar attempt to go lights-out in the '80s, when robots infamously painted each other and welded doors shut.

But while fixed automation is prevalent throughout the global automotive industry, robotics—or flexible automation systems—have actually been a mixed success in this sector, and automakers have shown caution in expanding their role.

While robotics are now a given in some applications, such as the once-challenging welding and painting lines, the large, high-volume automakers have been reluctant to expand their use elsewhere, even as robotics companies see new opportunities in how easily flexible, collaborative robotic systems can now be implemented. Fact is, robotics have changed in exactly the areas that caused so much pain in the past.

ABB’s Integrated Force Control provides this robot with real-time feedback that enables it to maintain a constant force between the abrasive wheel and the workpiece. ABB said customers report that robots equipped with such sensors do a better, more consistent job than a human can, owing to the robot’s ability to hold a tighter tolerance on the applied force while maintaining proper part dimensions.


Machine Vision Cuts the Cost of Part Pick-Up

The automotive industry is filled with random and semi-random pick-up tasks: Raw castings that vary slightly in size. Connecting rods in a heap. Car seats on a conveyor. The list is endless.

Before, if you wanted a robot to pick up these components, you either relied on specialized tooling or used a human to orient each item uniformly. Both approaches are expensive.

“Now,” said FANUC’s David Dechow, staff engineer-Intelligent Robotics/Machine Vision, “your robot can automatically adjust its pickup move using machine vision. For example, a robot with machine vision can pick up raw, unfinished connecting rods from a bin, in random orientation, for subsequent machining processes.

“Or it could be as far down the line as picking up a seat to be placed in a car for final assembly. The ability to pick up something like a seat and know exactly where it is in space, correcting for gripping and positioning errors before the grip, is a big plus that lets you make the incoming part flow more flexible and less fixed. We’ll see more and more applications that require picking up and placing randomly oriented parts, and that will only be accomplished with vision-enabled robots.”
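The arithmetic behind that correction is worth sketching. Below is a minimal, hypothetical example (not FANUC code) of applying a 2D vision result to a taught grasp: the pick pose is shifted by the same rigid motion the camera says the part underwent relative to its reference position.

```python
import numpy as np

def pose_to_matrix(x, y, theta_deg):
    """Build a 2D homogeneous transform from a position and a rotation."""
    t = np.radians(theta_deg)
    return np.array([[np.cos(t), -np.sin(t), x],
                     [np.sin(t),  np.cos(t), y],
                     [0.0,        0.0,       1.0]])

# Grasp pose taught with the part sitting in its reference position (mm, deg).
T_grasp_ref = pose_to_matrix(250.0, 100.0, 0.0)
# Part pose at teach time, and the pose the camera just reported.
T_part_ref = pose_to_matrix(240.0, 90.0, 0.0)
T_part_new = pose_to_matrix(255.3, 97.1, 12.5)

# Apply the part's displacement to the taught grasp:
# T_grasp_new = T_part_new * inv(T_part_ref) * T_grasp_ref
T_grasp_new = T_part_new @ np.linalg.inv(T_part_ref) @ T_grasp_ref
print(np.round(T_grasp_new, 3))
```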
FANUC’s iRVision 3D Area Sensor utilizes stereo cameras and an advanced illumination system that uses binary coded structured light to create a three-dimensional scene. The technique does not require moving the sensor or the parts to analyze the bin contents, which contributes to its efficiency on the plant floor.
Tougher challenges like bin picking or handling a car seat require advanced 3D sensing, which has “really come to the forefront in the last few years,” according to Dechow. “FANUC offers two types of 3D scanning. One is a laser-line-based 3D sensor. The other is a very high-end scanning sensor that produces a full 3D point cloud of a group of objects that might be in a bin or lying on a table. It gives the robot real-world 3D locations at which to pick the part.

“The position we take is that we live in a 3D world and the robot operates in 3D space. So we’re working hard to make the robot not just mechanically capable, but also visually capable, in that 3D space.”
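As a rough illustration of what a point cloud buys the robot, here is a hypothetical sketch (not FANUC’s algorithm) of choosing a pick candidate from bin data: take the least-buried part, estimate the local surface normal from its neighbors, and approach along that normal.

```python
import numpy as np

def top_pick_candidate(cloud, radius=10.0):
    """Pick the least-buried point in the bin and estimate the local
    surface normal from its neighborhood (direction of least spread)."""
    top = cloud[np.argmax(cloud[:, 2])]                  # highest Z in the bin
    nbrs = cloud[np.linalg.norm(cloud - top, axis=1) < radius]
    _, _, vt = np.linalg.svd(nbrs - nbrs.mean(axis=0))   # local plane fit
    normal = vt[-1]
    if normal[2] < 0:                                    # point out of the bin
        normal = -normal
    return top, normal                                   # pick point + approach axis

cloud = np.random.rand(5000, 3) * [400.0, 400.0, 150.0] # stand-in for scan data (mm)
point, approach = top_pick_candidate(cloud)
```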

FANUC feels so strongly about this that every robot they ship now is “capable of vision with both 2D and 3D guidance, and even inspection, right out of the box.” That doesn’t mean every robot has a camera on it, or needs to. But in Dechow’s view, every robot needs to be visually capable, because changes to a factory’s production requirements will often require robots to be reassigned to applications that require the capability.


Some Robots Prefer to Feel Their Way

Machine vision isn’t the only way to give a robot flexibility in part pickup and placement. New advances in tactile sensing also give robots the ability to adjust their motions in real time. For example, ABB’s new Integrated Force Control technology makes it possible to robotically assemble parts with tight tolerances without requiring highly accurate and expensive fixtures.

According to ABB’s North American manager of technology and support, Nick Hunt, such a “force-controlled robot” can apply search patterns—while using feedback from the force sensor—to find the correct position. Like machine vision, this cuts the cost of tooling and reduces the risk of assembly failures or damaged parts.
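A common version of such a search is an outward spiral. The sketch below is hypothetical; the robot methods stand in for whatever force-control instructions a real controller exposes. The peg is pressed down with a set force while stepping along the spiral, and entry into the hole shows up as a sudden collapse of the resisting force.

```python
import math

def spiral_search(robot, pitch=0.5, step_deg=10.0, max_radius=5.0,
                  insert_force=20.0):
    """Step a peg outward along an Archimedean spiral while pressing down
    with a controlled force; stop when the peg drops into the hole."""
    theta = 0.0
    while True:
        r = pitch * theta / (2.0 * math.pi)       # spiral radius, mm
        if r > max_radius:
            raise RuntimeError("hole not found within search radius")
        robot.move_offset(r * math.cos(theta), r * math.sin(theta))
        robot.push_down(insert_force)             # press with controlled force, N
        if robot.force_z() < 0.2 * insert_force:  # resistance collapsed: we're in
            return r, theta
        theta += math.radians(step_deg)
```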

ABB’s Integrated Force Control also can be used to improve robotic machining applications such as grinding, polishing, deburring and deflashing. One package feature allows a robot to grind, polish or buff parts while maintaining a constant force between the tool and the workpiece. Another feature enables a robot to deburr or deflash parting lines and part surfaces at a controlled speed, slowing down when it encounters excessive burrs or casting flash.
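In control terms, both features reduce to a loop closed around the force sensor. Here is a hypothetical sketch of one control cycle; the robot calls are stand-ins, not ABB’s API, and the gains are invented.

```python
def deburr_tick(robot, target_force=30.0, nominal_speed=50.0,
                kp=0.8, min_speed=5.0):
    """One control cycle: hold a constant normal force against the tool and
    slow the feed when a heavy burr drives the measured force up."""
    measured = robot.force_normal()               # N, from the wrist sensor
    robot.adjust_pressure(kp * (target_force - measured))  # proportional correction
    # Heavier burr -> higher force -> slower feed, never below a floor speed.
    overload = max(0.0, measured / target_force - 1.0)
    robot.set_feed_speed(max(min_speed, nominal_speed * (1.0 - overload)))
```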


More Economical Approaches to Flexible Part Placement

Force/torque sensors and machine vision are not always the most cost-effective solution, however.

Hunt points to two approaches that can provide the necessary flexibility at lower cost. The first uses what ABB calls “SoftMove.” Hunt explains that even in applications in which a highly repeatable robot places parts in a predictable spot, a nonsensing robot relies on the accuracy of the robot end-of-arm tooling and other fixtures. But specifying tooling with tight tolerances gets expensive in both machining and engineering costs, and the tooling is subject to wear in any case.

“So,” asked Hunt, “why not just guide the robot into position using a tapered pin and hole and ease the tooling tolerances a bit? That’s what ‘SoftMove’ was initially designed for. It’s a Cartesian soft-servo, a kind of software spring. Allowing the robot to comply in a certain plane or direction allows the process to dictate position, instead of the other way around—not perfect for all applications but more than you might think.”
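A software spring of this kind can be approximated in a few lines. The sketch below illustrates the idea only; it is not ABB’s implementation, and the stiffness and limits are invented.

```python
def soft_servo(commanded, contact_force, stiffness=5.0,
               comply_axis=(0.0, 0.0, 1.0), max_deflect=3.0):
    """Software spring: deflect the commanded position along one compliant
    axis in proportion to contact force (Hooke's law, x = F/k), to a limit."""
    deflect = contact_force / stiffness            # mm of give
    deflect = max(-max_deflect, min(max_deflect, deflect))
    return tuple(c + deflect * a for c, a in zip(commanded, comply_axis))

# A 10 N reaction deflects the tool 2 mm along Z as the tapered pin centers.
print(soft_servo((500.0, 200.0, 80.0), contact_force=10.0))
```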
 An ABB Force Sensor on a robot’s “wrist.”
Hunt said some situations allow a second “software solution” to the problem. “Let’s say we need to align a robot to the plane of some large assembly sitting out there in a fixture, or perhaps being held by a robot or robots. Let’s say we need to weld another part onto this larger assembly but it landed a little cockeyed in the fixture. All we have to do is poke a robot-held tapered pin into three GDT (geometric dimensioning and tolerancing) holes on that part and we’ve established a complete frame including translation and rotation.”

A Ford Motor Co. subassembly supplier gave ABB just such a case: locating the position and Z rotation of a large assembly. ABB programmed the robot to move toward the frame and stop the moment it made contact, without damaging the part. (The ability to programmatically alter the force applied to a part for location purposes is a very important aspect of SoftMove, according to Hunt.) As one edge of the robot’s plate rested on the part, the rest of the plate would come down onto it, establishing the skew of the plane. The robot would then slide across the plane until it hit the corner. The system had now determined the location of the part in X-Y-Z coordinates, including its orientation: the pose. That was everything needed to weld parts onto the frame, all done in software. This software-only method of locating a part frame works because, as Hunt puts it, “The robot is in all actuality one really big sensor working in reverse. We simply exploit that attribute.”
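The geometry behind Hunt’s three-hole trick is standard and easy to verify: the first touched point sets the origin, the first two fix the X direction, and all three fix the plane and therefore the Z axis. A minimal sketch:

```python
import numpy as np

def frame_from_three_points(p0, p1, p2):
    """Build a complete part frame (translation + rotation) from three
    probed points: p0 is the origin, p0->p1 the X axis, the plane gives Z."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = (p1 - p0) / np.linalg.norm(p1 - p0)
    z = np.cross(p1 - p0, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)                    # completes the right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T                              # 4x4 homogeneous pose of the part

# Three points probed on a part sitting slightly cockeyed in its fixture.
print(frame_from_three_points([0, 0, 0], [100, 5, 2], [3, 80, 1]))
```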
 


Working Virtually Hand-in-Hand with Your Robot

Collaborative robots, which work in proximity to humans without guards, are a hot topic today. Yet there is skittishness about the idea and there are few installations in the top-tier auto sector. That might soon change. KUKA Systems just introduced a lightweight robot specifically to serve as an auto worker’s “third hand,” relieving him of awkward, uncomfortable jobs like inserting rubber plugs into a vehicle body or fastening screws, even within the confined space of a vehicle’s interior.

KUKA’s new LBR iiwa lightweight robot works right next to humans to take over uncomfortable, cramped jobs like plugging holes in a wheel well.

KUKA calls the lightweight robot LBR iiwa. (LBR for “Leichtbauroboter,” German for lightweight robot, and “iiwa” for “intelligent industrial work assistant.”)

With seven axes modeled on the human arm, it features high-performance collision detection and integrated joint torque sensors in all axes, so it can handle delicate joining processes. It rides on a mobile platform called the KUKA flexFELLOW, so an operator can place the system near the job, start the programmed operation, and move on to other nearby tasks while the robot works.

 

Let’s say the job is inserting rubber plugs into the holes in a car body’s wheel well. If the operator didn’t place the platform in exactly the position expected by the program, the robot is able to calibrate the exact position and adjust its movements to find the holes by touching the workpiece and “feeling” the resistance with its torque sensors. Combining several points yields a complete “picture,” enabling the program to run. Vision sensing is not required. 
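Collecting those touch points amounts to a guarded move repeated a few times. A hypothetical sketch (the robot calls are stand-ins for a real controller’s interface, and the thresholds are invented):

```python
def probe_until_contact(robot, direction, speed=2.0, torque_limit=1.5):
    """Creep along `direction` until the integrated joint torque sensors
    register contact, then record the tool position as one touch point."""
    robot.start_move(direction, speed)            # slow guarded approach, mm/s
    while max(abs(t) for t in robot.external_torques()) < torque_limit:
        pass                                      # poll the torque sensors
    robot.stop()
    return robot.tool_position()                  # one point of the "picture"

# Touch several features, then fit the workpiece frame to the collected points.
```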

ABB’s RobotStudio can program and optimize a robot’s moves on a PC, even from an office, including machine tending (left) or detailed cutting operations (right). Because RobotStudio includes the Cognex program suite and tools, it can even set up a machine vision system.

It’s a compact system, but KUKA said it’s the first and only lightweight robot with a payload capacity over 22 lb (10 kg). It’s available in versions rated for payloads of 15 and 31 lb (7 and 14 kg).


It’s Getting Much Easier to Set Up a Robot

Robots can be a challenge to set up, as the early horror stories at GM attest. FANUC’s Dechow is a 30-year veteran of machine vision and a strong proponent of the technology, but even he admits that vision systems earned a well-deserved reputation for being difficult to implement. That is changing: not only are vision systems better integrated and easier to set up, they now aid in setting up the robot itself.

Dechow said, “Before, an operator had to position the robot’s vision system over a calibration grid and then, often with complex moves, get the robot to touch off on the grid, or otherwise measure and correlate the grid to world space and then share that space information with the camera. Then the camera would have to take a picture of the grid, all through a step-by-step process dependent on the operator. With FANUC’s automated calibration, the operator only needs to get the robot to the right spot and command it to do the operation. The robot itself then shows the calibration grid or the calibration article to the camera and automatically performs a calibration with respect to the camera.

“Even more exciting, for fairly simple tasks we’re introducing a no-calibration capability within the FANUC guidance suite. In these cases the robot will automatically discover its real-world relationship with the object using the vision camera without using any calibration grid or operator intervention. It will calibrate itself relative to a target part automatically and know the real-world relationship of that part and where it has to be guided to pick that part up—even for parts in random orientation without fixed tooling.”
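FANUC’s implementation is proprietary, but the grid procedure Dechow describes automating is essentially the classic camera calibration found in open-source tools. Here is a sketch using OpenCV, with hypothetical image file names, of recovering camera intrinsics from several views of a grid; pairing each view with the robot pose that presented it is what ultimately ties camera space to robot space.

```python
import cv2
import numpy as np

ROWS, COLS, SQUARE = 7, 10, 25.0          # inner grid corners; mm per square

# Grid corner coordinates in the grid's own plane (Z = 0).
obj = np.zeros((ROWS * COLS, 3), np.float32)
obj[:, :2] = np.mgrid[0:COLS, 0:ROWS].T.reshape(-1, 2) * SQUARE

objpoints, imgpoints = [], []
for path in ["grid_pose_%02d.png" % i for i in range(10)]:   # hypothetical files
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (COLS, ROWS))
    if found:
        objpoints.append(obj)
        imgpoints.append(corners)

# Camera matrix and lens distortion, solved from all the grid views.
_, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None)
```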


Wouldn’t You Rather Set Up in Your Office?

Easing the burden of calibration is one thing, but it still leaves the bigger task of programming the robot. Trying to program, or just debug a program, on the shop floor ties up the space, the robot, the fixtures, maybe a camera, and more. ABB’s RobotStudio lets you program a robot and vision system at your desk. Nick Hunt explains: “RobotStudio gives you full 3D simulation of the robot and its moves. You can import your solid file into RobotStudio and it can recommend the optimum path without interference, then generate that path, complete with the actual acceleration and deceleration for very accurate cycle time.

“We also worked closely with Cognex,” Hunt says, “one of the industry leaders in machine vision, to incorporate their program suite and tools into RobotStudio. You sit in your office with the same machine vision camera you’re going to use, an actual part or a photograph of the part, and software. Then you teach the vision system and debug or improve your application almost as if you were out on the floor. The robot you’re watching on your RobotStudio screen moves in exactly the same way with exactly the same cycle time as what you’ll see on the shop floor. You’re working in a virtual world, producing a program you can run in the real one. 

“The only thing you’re not certain of is how well calibrated the real world is to what you have in your simulation package. Once back out on the floor, if you calibrate properly, you’re going to get pretty much what you developed in your simulation. The time savings are huge.”
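Those acceleration and deceleration figures are exactly what determine cycle time. A minimal sketch of the trapezoidal-profile arithmetic a simulator applies to each move:

```python
def move_time(distance, v_max, accel):
    """Cycle time of one move under a trapezoidal velocity profile."""
    d_ramp = v_max ** 2 / accel              # distance spent speeding up + braking
    if distance >= d_ramp:                   # trapezoid: a cruise phase exists
        return 2.0 * v_max / accel + (distance - d_ramp) / v_max
    return 2.0 * (distance / accel) ** 0.5   # triangle: v_max is never reached

print(move_time(0.8, 1.5, 3.0))   # 0.8 m at 1.5 m/s and 3 m/s^2 -> ~1.03 s
```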

Oddly, Hunt adds that many people don’t take advantage of this, preferring to take their new toys out on the floor to do the setup there. That’s a waste that management should be keen to eliminate.


Different Machine Vision Integration Philosophies Yield Different Benefits

Engineers face a basic question in fielding any complex product: Which subsystems should you design and build yourself and which should you source externally and integrate? FANUC elected to field their own machine vision product, which they call iRVision. FANUC’s Dechow said, “iRVision cameras connect directly to the FANUC controller. There is no third-party computing device or external processing going on.” He credits this “seamless integration” with iRVision’s ease of use, its unique ability to perform automated calibration, and its particularly efficient and accurate high-end functions like 3D sensing, “because the vision system always knows where the robot is and the robot always knows what the vision system is seeing in near real-time.”

ABB took the opposite tack. As Hunt puts it, “We’re not a vision company. We’re a robot company. We’re a motion control company. That’s what we do well. We get the robot TCP [tool center point] from point A to point B as quickly as possible with minimum path deviation and a lot of engineering tools to play with to execute logic along the way, or to interrupt it or even subscribe to sensor and other variable data. We don’t want to spend a lot of time developing vision algorithms. We don’t need to. There are many companies who specialize in doing that and they do it quite well. We’ll use what they have and know that we are using the best vision technology industry has to offer.”

This focus on motion control yields faster robots, based on ABB’s frequent benchmark testing. For example, tests of their IRB2600 robot have shown 25% shorter cycle times than comparable competitor models, and the typical improvement is 10% right out of the box. Their patented motion control algorithm is “as secret as the formula for Coca-Cola,” but Hunt reveals that one trick is to constantly maximize acceleration of the critical joint rather than just maximizing speed of the dominant joint. (Remember: acceleration is a vector, while speed is a scalar.) Hunt said the latter “method alone without a real-time kinematic model will waste torque and consume more power with no added benefit,” whereas ABB’s robots often use less power to move the same load.
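A small worked example shows why the distinction matters. On a short, acceleration-limited move the joint never reaches its top speed, so raising the speed limit accomplishes nothing, while raising the critical joint’s acceleration directly shortens the move. The numbers below are illustrative, not ABB data.

```python
def tri_time(d, a):
    """Time of an acceleration-limited (triangular-profile) joint move."""
    return 2.0 * (abs(d) / a) ** 0.5   # accelerate halfway, brake the rest

# A short 0.3 rad assembly move on the critical joint:
print(tri_time(0.3, 4.0))   # a = 4 rad/s^2 -> ~0.55 s
print(tri_time(0.3, 6.0))   # a = 6 rad/s^2 -> ~0.45 s
# Raising the joint's top speed leaves both times unchanged: it is never reached.
```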

However, since there is no standard protocol for communicating between robots and sensors, ABB and other companies that take this approach must develop interfaces. In some cases, they can build on those interfaces to offer fully integrated, turnkey solutions. ABB offers such a package in partnership with Cognex, called Integrated Vision, and estimates that a user saves about 25% of the usual setup time by not having to develop the vision system communication drivers.
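What that driver work looks like in miniature: the sketch below is entirely hypothetical (which is the point, since no standard protocol exists), assuming a vision sensor that emits line-delimited JSON over TCP at a made-up address; real sensors each speak their own dialect.

```python
import json
import socket

def read_vision_result(host="192.168.1.50", port=9876):
    """Fetch one detection from a (hypothetical) vision sensor that sends
    line-delimited JSON, and return the pose fields the robot side needs."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        line = sock.makefile().readline()    # one detection result per line
    msg = json.loads(line)                   # e.g. {"x": ..., "y": ..., "rz": ...}
    return msg["x"], msg["y"], msg["rz"]     # mm, mm, degrees
```

ME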

 

This article was first published in the September 2014 edition of Manufacturing Engineering magazine.

