Marriage of Robotics and Machine Vision Illustrates Next-Generation Automation at IRVS and Conference
by Winn Hardin, Contributing Editor
Robotic Industries Association Posted 05/16/2003
From concept to design to production and shipping, automation decisions are made that affect quality at every step. Robots, machine vision and metrology automation ... any one of these choices can help elevate yield, savings and quality. This year's International Robots & Vision Show (IRVS) and Conference takes a snapshot of developments in all three technologies, while drawing insightful relationships that illustrate the convergence of vision, robots and metrology.
The IRVS and Conference addresses this convergence of automation technologies by evaluating issues that fall all along the system-development curve, from specification to implementation. Tips from some of the industry's most experienced professionals will illustrate ways to make your systems more effective and your operations more efficient. Whether you're a vendor of general-purpose machine vision systems, turnkey robotic work cells, or automated metrology solutions, an end user, or an integrator, the IRVS and Conference presents a wealth of integration solutions. Conference presentations, tutorials and workshops will show you how to integrate diverse systems such as vision and robotics, from specification to implementation.
Start at the Beginning: Specifications
A successful project that uses robots and vision – whether for separate inspection and material-handling systems or for integrated robotic-guidance applications – must start with a thorough understanding of the application and how that application will dictate the system's specification. David Dechow, Project Manager at Aptura Machine Vision Solutions (Wixom, MI), and Ronald Potter, Director of Robotic Technologies, Factory Automation Systems Inc. (Atlanta, GA), will discuss the importance of a good spec and the roles of end users and integrators during a dual-track Tutorial on Monday, June 2, starting at 3:30 p.m.
"A potential user of machine vision technology needs to understand the technological capabilities of various approaches as they relate to the application, and make sure that those capabilities are fully implemented in the system," said Dechow. "Those of us who have been in the industry for a long time in both vision and robotics always highlight the importance of a good system specification, but it's always surprising how infrequently that's competently or successfully done."
The importance of the specification goes beyond the vision and robot subsystems to include all systems that interact with either vision or robots, including conveyors, PLCs, IT networks, etc. Potter, who will focus more on the robotic side of the vision/robot integration question, will discuss the emergence of soft PLCs as a tool to handle systems that involve robotic guidance to moving parts on conveyors and other complex manufacturing cells.
"Normally, we give the PLC master control and use the robot and vision systems as slaves to the PLC because of manufacturing's familiarity with PLCs and, just as importantly, because there are a lot of things going on concurrently – conveyors bringing parts, sorting, and so on. More and more robot manufacturers are using soft PLCs on their robot controllers that give the interface a 'PLC' look. In this case, the PLC becomes the focal point for the system and the robot is just one part of the overall system," Potter said.
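The master/slave pattern Potter describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the classes stand in for fieldbus-connected devices, and the poses are invented.

```python
# Hypothetical sketch of a PLC-master cell: a PLC-style master owns the
# sequence, while the vision system and robot act as slave devices.

class VisionSystem:
    """Slave device: returns a part pose when triggered by the master."""
    def locate_part(self):
        # A real system would trigger an image acquisition here;
        # we return a fixed (x, y, roll) pose for illustration.
        return (120.5, 44.2, 3.1)

class Robot:
    """Slave device: executes moves commanded by the master."""
    def __init__(self):
        self.log = []
    def move_to(self, pose):
        self.log.append(pose)

class MasterPLC:
    """Master: coordinates the concurrent cell activity in one place."""
    def __init__(self, vision, robot):
        self.vision = vision
        self.robot = robot
    def cycle(self):
        pose = self.vision.locate_part()  # step 1: ask vision for the part pose
        self.robot.move_to(pose)          # step 2: command the robot to it
        return pose

plc = MasterPLC(VisionSystem(), Robot())
print(plc.cycle())  # -> (120.5, 44.2, 3.1)
```

The design point is the one Potter makes: the robot and vision system never talk to each other directly; all sequencing lives in the master, so adding a conveyor or sorter only changes the master's cycle.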
Robotic Guidance Puts Vision to the Test
Robotic guidance – or combining robots with automated inspection systems or automated optical-metrology systems – requires choosing between technologies, with careful consideration given to the vision system's capabilities. According to Dechow, "The key integration points always start with the front end – the imaging system, lighting, acquisition, etc. It's one of the areas in which less-experienced integrators or end users spend way too little time, but it's the area that requires the most care and consideration."
On Wednesday morning, June 4, Babak Habibi, President of Braintech Inc. (North Vancouver, BC, Canada), a company specializing in robot guidance, will provide an excellent primer on various 2D- and 3D-vision approaches. This Session 12 primer, "New Developments in Vision for Robots," will include a discussion of their strengths and weaknesses.
Habibi said he will provide updated reviews of the pros and cons of manual part-loading systems; precision or hard fixtures for loading and handling; standard 2D-vision systems, which can yield X, Y and roll coordinates in a single plane; 2.5D systems, which use overall part size and shape distortion to provide a Z offset; and enhanced 3D-vision systems, which provide full six-axis coordinate data for robotic guidance. A discussion of 3D vision with single- and multi-camera systems that use standard white, structured white, or coherent-light illumination will also be included. Both the photogrammetric approach, which keys on geometric relationships between fixed features on the part, and stereoscopic vision, which uses multiple cameras and matching operators, will be covered.
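The 2.5D idea above follows from the pinhole-camera relation: a part's apparent size in the image shrinks in proportion to its distance, so a single calibrated reference measurement yields a Z estimate from one camera. This is an illustrative sketch of that geometry, not Braintech's algorithm; all numbers are invented.

```python
# Illustrative 2.5D range estimate from apparent part size.
# Pinhole model: apparent size is inversely proportional to distance,
# so Z = Z_ref * (w_ref / w_measured).

def estimate_z(apparent_width_px, ref_width_px, ref_z_mm):
    """Estimate part distance from its apparent width in pixels,
    given a reference width measured at a known distance."""
    return ref_z_mm * (ref_width_px / apparent_width_px)

# Calibrated at 500 mm the part spans 200 px; if it now spans only
# 160 px, it has moved farther from the camera:
print(estimate_z(160.0, 200.0, 500.0))  # -> 625.0 (mm)
```

Combined with the X, Y and roll from a standard 2D result, this size-derived Z is what turns a 2D system into the "2.5D" category Habibi describes.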
Editor's note: For more information on 3D vision, see the Automated Imaging Association's (AIA) May 2003 article, "Bringing Machine Vision to a 3D World – Big Time!"
Habibi will be one of several presenters highlighting the importance of calibrating the vision system to maintain a single global coordinate system between vision and robot controllers, especially in applications that require high accuracy. "If you're picking up loose product – like a bag on a pallet – then calibration might not be an issue, but if you need to position better than a few millimeters, calibration becomes a big deal," explains Ed Roney, Manager of Vision Product and Application Development at FANUC Robotics America, Inc. (Rochester Hills, MI), and chairman of the "New Developments in Vision for Robotics" session.
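What that shared global coordinate system means in practice can be sketched as a planar rigid transform: calibration determines a rotation, translation and pixel scale that map vision coordinates into the robot's frame. The parameter values below are invented for illustration.

```python
import math

# Minimal sketch of a calibration result in 2D: a rotation, translation
# and scale that map camera pixels into robot millimetres, so vision and
# robot controllers agree on one global coordinate system.

def camera_to_robot(px, py, theta_rad, tx_mm, ty_mm, mm_per_px):
    """Map a pixel coordinate (px, py) into the robot frame using
    calibration parameters: rotation theta, translation (tx, ty),
    and pixel scale mm_per_px."""
    x_mm = px * mm_per_px
    y_mm = py * mm_per_px
    rx = x_mm * math.cos(theta_rad) - y_mm * math.sin(theta_rad) + tx_mm
    ry = x_mm * math.sin(theta_rad) + y_mm * math.cos(theta_rad) + ty_mm
    return (rx, ry)

# Camera rotated 90 degrees relative to the robot, offset 100 mm in X,
# at 0.5 mm per pixel: a part at pixel (100, 0) lands at roughly
# (100 mm, 50 mm) in robot coordinates.
print(camera_to_robot(100, 0, math.pi / 2, 100.0, 0.0, 0.5))
```

Roney's point follows directly: if any of these parameters drifts – say the camera is bumped – every commanded robot pose inherits the error, which is why sub-millimeter applications make calibration "a big deal."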
For high-accuracy applications, calibration is not only critical, but can be a time-consuming process. John Keating, Product Marketing Manager of Vision Systems at Cognex Corp., will discuss new vision modules specifically geared towards ‘‘calibration on the fly,’‘ reducing downtime and improving production turnaround times.
Calibration robustness assessment
This graph shows the alignment-repeatability error of a 3D, vision-guided, part-handling system. The data was collected by repeatedly measuring the location of a cylinder head at a static location. The camera was removed and reinstalled after trial 40, the system was recalibrated using the automatic function, and the test proceeded. The performance of the system was not negatively impacted by the calibration operation; results for X, Y, roll, pitch and yaw are similar. Courtesy of Braintech Inc.
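The check described in the caption boils down to comparing the spread of repeated pose measurements before and after the recalibration. A hedged sketch of that comparison, with fabricated error readings in place of Braintech's actual data:

```python
import statistics

# Illustrative repeatability check: compare measurement spread before
# and after a camera reinstall + automatic recalibration.
# The alignment-error readings (mm) below are fabricated.

before = [0.12, 0.10, 0.11, 0.13, 0.12]  # subset of trials 1-40
after  = [0.11, 0.12, 0.10, 0.12, 0.13]  # trials after recalibration

def repeatability(samples):
    """Repeatability expressed as the sample standard deviation."""
    return statistics.stdev(samples)

print(repeatability(before), repeatability(after))
# Similar spreads (and similar means) indicate the recalibration
# did not degrade the system's performance.
```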
Lastly, the importance of high-speed communication between robot, vision system, and PLC will be a focus of the Robot/Vision session. "In addition to calibration, communications is a critical hardware issue when it comes to robot guidance, especially in packaging where things are happening very fast. You're going to use the quickest means possible, which for us has been Ethernet to this point. You have to make your software as efficient as possible without sending needless data between the vision and robot," Roney said.
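One way to read Roney's advice about not sending needless data: ship only the guidance result, not images or verbose text. A hypothetical sketch (not FANUC's protocol) that packs a part ID and an (x, y, roll) offset into a fixed 16-byte binary message suitable for an Ethernet link:

```python
import struct

# Hypothetical lean guidance message: instead of shipping images or
# verbose strings between vision and robot, send just the result --
# a uint32 part ID plus three float32 offsets, 16 bytes total.

POSE_FMT = "<Ifff"  # little-endian: part id, x (mm), y (mm), roll (deg)

def pack_pose(part_id, x_mm, y_mm, roll_deg):
    """Encode one guidance result as a fixed-size binary message."""
    return struct.pack(POSE_FMT, part_id, x_mm, y_mm, roll_deg)

def unpack_pose(msg):
    """Decode a guidance message back into its fields."""
    return struct.unpack(POSE_FMT, msg)

msg = pack_pose(7, 120.5, 44.25, 3.5)
print(len(msg))          # -> 16 bytes on the wire
print(unpack_pose(msg))  # -> (7, 120.5, 44.25, 3.5)
```

At packaging-line rates, a fixed small message like this keeps per-cycle latency dominated by the network round trip rather than by serialization or payload size.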
Machine vision industry analyst Nello Zuech estimates that robot guidance makes up only two to three percent of the total worldwide automated-vision market. That market is likely to grow significantly in the short term as more vendors make software and hardware that specifically address the needs of this unique application.
"Automakers have been adamant that machine vision and robotics be used in an environment that benefits the process. Pure inspection is not as desirable when it comes to integrating machine vision and robotics. Much of the inspection responsibility has been pushed down to the Tier I and Tier II suppliers," said Aptura's Dechow. "Machine vision is used to enhance the process, whether it's guiding the robot or allowing the robot to go back and correct mistakes. That's an important point that comes up when analyzing the application: Does it have a payback in improving the process?"