
Robotics Industry Insights

Robotic Bin Picking – The Holy Grail in Sight

by Tanya M. Anandan, Contributing Editor
Robotic Industries Association

Monotonous tasks such as unloading a bin one part at a time into a machine, bulk parts sorting, and order fulfillment are labor-intensive. Maybe even dangerous if the parts or operations are heavy, sharp, or otherwise hazardous. For years, bin picking robots have been tackling these tedious jobs. Yet there are still so many applications to be realized. The manufacturing and warehouse automation worlds are anxiously watching.

While more capable than ever, robotic bin picking still has its limitations. We haven’t quite grasped that Holy Grail – random bin picking with robots. But there have been tremendous strides. Empowered by advanced vision technology, software, and gripping solutions, robots are finding their way in uncharted territory.

So why is robotic random bin picking notoriously difficult? It’s about accuracy. While robots are applauded for their repeatability, random bin picking requires accuracy in the face of chaos. The robot has to locate a part in free space, in an unstructured environment where the parts keep shifting positions and orientations every time a part is removed from the bin. That requires a delicate balance between robotic dexterity, machine vision, software, computing power to crunch all the data in real time, and a grasping solution to extract the parts from the bin. A tall order, but not impossible.

A heat treat machine tending cell uses a 3D vision-guided robot equipped with a dual-head end effector to locate and pick randomly stacked automotive parts from a large bin. (Courtesy of Motion Controls Robotics, Inc.)

“There is a ton of hype in the marketplace about bin picking and what it can do and what it can’t do,” says David Dechow, Staff Engineer-Intelligent Robotics/Machine Vision at FANUC America Corporation in Rochester Hills, Michigan. “The reality is that machine vision is suitable, robust, and reliable for a subset of that entire world of things we would like it to do, whether it be inspection, 2D guidance, or 3D guidance. From my hands-on experience over a number of years, bin picking is in the same category now. There is real-world capability, but it is still a subset of the whole world of things we would love it to do.”

Dechow worked closely with the late Adil Shafi, a visionary in the field of vision guided robotics who is credited with early advancements in bin picking. Many of his innovations still influence ongoing development in these areas.
Shafi predicted that robotic random bin picking would become mainstream by 2020. Many of his contemporaries believe he was on the right track. Subsets of bin picking are already commonplace.

“I would say that random bin picking is already approaching mainstream,” says Dechow. “I think a lot of what Adil did both philosophically and hands-on was driving that. He was certainly a missionary for the technology. The remaining applications that pose specific challenges will be solved in the near future.”

There are three main types of bin picking: structured, semi-structured, and random bin picking. Each presents an increasing level of application complexity, cost and cycle time. (We make a distinction here between parts contained in a bin or tote of some kind, versus a pile of parts without a container. More on this later and why it’s a critical variable.)

Structured – Parts are positioned or stacked in the bin in an organized, predictable pattern, so they can be easily imaged and picked.

Semi-Structured – Parts are positioned in the bin with some organization and predictability to help aid imaging and picking.

Random – Parts are in totally random positions in a bin, including different orientations, overlapping, and even entangled, further complicating the imaging and picking functions.

Within these three subsets, special considerations apply based on the characteristics of the parts or items being picked and how they present themselves in different resting states in the bin.

“Structured and semi-structured bin picking are often very easy to do and can be implemented quickly and easily with most of the technologies on the marketplace,” says Dechow. “When we talk about the Holy Grail, I think more of the random environment, the overlapping, the entangled, and multiple types of parts in a bin.”

He says structured bin picking can often be done with 2D vision. “When you think of bin picking, everybody thinks of 3D imaging and 3D analysis. In reality, a certain subset of the bin picking world can be done with just 2D imaging and 2D analysis.”

Success – Geometrically Symmetric
Part characteristics often determine whether a part is suitable for bin picking. Dechow says they fall into a spectrum from a “sure thing” to challenging applications.
“There are parts that are suitable both in their geometric structure and in their presentation in a random bin,” says Dechow. “On that far scale of parts that are an absolute guaranteed success are larger, geometrically non-complex parts where there are few differences in the part relative to its random resting state.” (Wherever a part can drop down or be lying in the bin is one of its random resting states.)

He references this video of a bin picking demo for high-speed order fulfillment, courtesy of FANUC.

“That is a mixed-part bin, but the parts are very similar in their geometric construction. They’re larger and the geometry is not confusing in terms of its symmetry. Even though a box could get on its end, it is still a relatively consistent and planar surface that can be analyzed. A lot of applications fall into that category. This type of mixed-product order filling is done a lot and very successfully, because the types of products you find in warehousing and distribution tend to be that kind of product (geometrically symmetric).
“They don’t have a lot of strange features on them,” he continues. “They are usually not too heavy, and in all of their random resting states they have some sort of sufficient planar surface that can be identified as a successful pick point. That type of 3D bin picking is today’s real-world success story.”
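Dechow’s notion of a “sufficient planar surface” that can serve as a pick point can be sketched as a simple window scan over a depth map: accept the highest window whose heights are nearly flat. This is a toy illustration under assumed names and thresholds, not FANUC’s actual vision pipeline.

```python
# Toy planar-pick-point search over a depth map (illustrative only).
def find_planar_pick(depth, win=2, flatness=1.0):
    """Return (row, col, mean_height) of the highest near-flat window,
    or None. `depth` is a 2D list of heights (larger = closer to camera)."""
    rows, cols = len(depth), len(depth[0])
    best = None
    for r in range(rows - win + 1):
        for c in range(cols - win + 1):
            patch = [depth[r + i][c + j] for i in range(win) for j in range(win)]
            mean = sum(patch) / len(patch)
            spread = max(patch) - min(patch)  # crude flatness measure
            if spread <= flatness and (best is None or mean > best[2]):
                best = (r, c, mean)
    return best

# Example scene: a flat box top (height ~50) beside jumbled parts.
depth = [
    [50, 50, 10, 30],
    [50, 50, 22,  5],
    [ 8, 14, 40, 41],
    [ 9, 12, 40, 41],
]
```

Production systems work on full 3D point clouds with plane fitting (e.g., RANSAC) rather than a grid scan, but the selection principle — flat enough, and on top of the pile — is the same.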

Dechow says it makes little difference if the items have graphics or text printed on them as you would see in the real world. Most 3D point-cloud generation is independent of the graphical markings on an object. (More on order fulfillment later in this discussion.)

“It doesn’t have to be nice, clean warehousing parts either,” says Dechow. “Think of large cylindrical billets that are often handled in factory environments for machining, forging or heat treating, something that might be 20 or 30 pounds. But again, it’s a consistent geometric shape when they lie in their various resting states and they present only a few geometric surfaces. Very easy to pick up with a robot and very easy to grip.”

This video demonstrates bin picking heavy steel billets.

Aided by a 3D area sensor to locate random parts in a bin, the robot plans its next pick. (Courtesy of Motion Controls Robotics, Inc.)

Moderate Success – Some Complexity
Along that spectrum from easily picked to more challenging bin picking applications are parts with more complexity, but with easily distinguishable features such as the (pictured) donut-shaped part.

This robotic random bin picking application uses a FANUC iRVision 3D Area Sensor and two robots to pick stamped automotive parts from a bin and tend a heat treating machine. Motion Controls Robotics, an RIA Certified Robot Integrator, designed and installed the cell for a Tier 1 supplier.

Check out the video to see the robotic random bin picking operation in action.

The bin picking FANUC robot is equipped with a dual-head, multifunction end effector that uses a Magswitch magnetic gripper to redistribute parts in the bin for easier picking, and a SCHUNK two-finger gripper to pick the parts and drop them into the chute. A sensor on the chute communicates the orientation of each part to the second robot’s controller, indicating whether the flange is facing up or down, so the robot knows how to grab the part. Then the robot picks the part from the chute in the correct orientation and inserts it into a heat treating machine. That same robot picks up the finished part and sets it on a takeaway conveyor for transportation to the next operation.
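At its core, the chute handoff described above is a small decision table: the sensor reports flange-up or flange-down, and the second robot selects a matching grasp. A minimal sketch with hypothetical program names — the integrator’s real interface is not public:

```python
# Hypothetical mapping from chute-sensor reading to grasp program.
GRASP_FOR_ORIENTATION = {
    "flange_up": "grip_from_top",
    "flange_down": "grip_from_side",
}

def choose_grasp(sensor_reading):
    """Map a chute-sensor reading to a grasp program for the tending robot."""
    try:
        return GRASP_FOR_ORIENTATION[sensor_reading]
    except KeyError:
        # Unknown reading: stop and flag an operator rather than guess.
        return "fault_request_operator"
```

The fault branch matters in practice: a mis-read orientation that is grasped anyway can crash the tool into the heat treating machine, so unknown readings should halt the cycle.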

The integrator reports trying multiple end-of-arm tool designs and vision processes before arriving at the current solution. Not unusual for random bin picking applications. The 3D technologies still require skilled integration and the more complex the parts, the more difficult the integration task.

Also in this middle range of the spectrum are more complex structures, explains FANUC’s Dechow.

“Think of an object like a crankshaft that is generally circular, but is long and has a lot of geometric structure to it. From one end it might present itself as a cylinder. From the other end, it might be a small diameter rod. It’s a long object, so it might be overlapped by other parts. A 30-pound part suddenly becomes a 50- or 60-pound part when there is another part lying on top of it.

“There’s a whole class of those types of parts, which you find in the heavy industry and factory automation world of manufacturing,” he continues. “These types of parts are often delivered in bins. You see one part of the component at the top surface of the bin, which is where the image is taken, but the rest of the part is like an iceberg. Most of it is below the surface, so it’s difficult to understand the best gripping technique for it.”

Still Challenging – Shingled, Packaged, Deformable
Then there are those parts that are particularly difficult to image with machine vision, even with 3D imaging techniques.

“For example, shingled parts, which are parts that are thin but perhaps broad and wide, that are shingled and lying on top of each other,” explains Dechow. “These kinds of parts make it very difficult to discern individual parts because the height variation is so thin and the geometry of the parts makes the scene extremely confusing.

“Parts that are packaged, parts that are in plastic baggies, or soft (deformable) parts. Those parts become very difficult to differentiate in a bin picking application.”

“There’s a never say ‘no’ attitude,” he adds. “We want to do our best to push the technology to the limits, and we do quite often.”

Random Bin Picking with 2D Vision
One company that continues to push the envelope is JR Automation Technologies, LLC, in Holland, Michigan. This RIA Certified Robot Integrator is one of a small select group of test sites for FANUC vision guided robotics technology.

“We get to feel out the new technologies in their beta stage and then we’re asked to evaluate them and provide feedback,” says Tyler McCoy, Controls Engineering Manager at JR Automation. “We’ve been doing this for many years now. We were a testbed for the original FANUC iRVision.”

McCoy says they integrate a lot of structured and semi-structured bin picking applications, and consider them rather mainstream. Random bin picking installations, however, are still few and far between.

“We recently implemented a random bin picking solution for an automotive seat back assembly cell,” says McCoy. “This involved a main U-shaped frame that looks like a seat back, two tubes that get welded onto that frame where you put the headrest, and then another part called the trim frame bracket, which runs across the middle of the frame to reinforce the cushion when installed in the car.

“It was an assembly and welding cell,” he explains. “Our task was to take all these parts from bulk, assemble them into weld fixtures and put them on a four-position indexing table. Then we had two robots welding the components together.”

He says the main U-shaped frames were loaded onto a gravity rack for an easy feeding solution to the weld process. The two headrest tubes were small enough for a bowl feeder.

“But this trim frame bracket is about 16 inches long, 9 inches wide, and about 0.75 inches in depth. It’s this long sheet metal stamping. There was no real good feeding technology available,” says McCoy. “So we evaluated it for 3D area scanning, but we found that the low-profile geometry of the part made it difficult to get good contrast or good 3D distinction within the pile. The 3D bin picking just wasn’t quite there yet for this application.” (This would fall into the category of thin, shingled parts and the more challenging range of the spectrum.)

“First and foremost, best practices for bin picking start with evaluating the parts,” says McCoy. “In this application we found that the low-profile parts tended to lock together. We also found that the part naturally could sit in the bin in a number of orientations, limiting our ability to target a picking strategy based on part datum.

“So we implemented a solution where we have a robot picking the brackets out of the bin. Using a non-contact displacement sensor to probe the height of the bin, we create a virtual topographical map. We pick up three or four parts at a time with a compliant end-of-arm tool with magnets, picking from the high areas to the low areas, and monitoring as the parts fall. The parts are placed into a recirculating flex-feed system that uses a series of conveyors to singulate the parts, and then a 2D camera and a second robot to pick those parts. With the combination of using our own software, a compliant end-of-arm tool, and some general picking strategies, we were able to get the parts out of the bin and then reacquire and singulate them into a more conventional flex-feed application.”
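McCoy’s “picking from the high areas to the low areas” strategy amounts to sorting a probed height map tallest-first. A minimal sketch with an illustrative grid — this is not JR Automation’s actual software:

```python
# Toy "pick high to low" ordering over a probed height map.
def pick_order(height_map):
    """Return grid cells sorted tallest-first; the bin-picking robot
    would attempt magnetic picks in this order."""
    cells = [
        (h, (r, c))
        for r, row in enumerate(height_map)
        for c, h in enumerate(row)
    ]
    cells.sort(key=lambda cell: cell[0], reverse=True)
    return [pos for _, pos in cells]

# Illustrative 2x2 topographical map from a displacement-sensor scan.
heights = [
    [3.0, 7.5],
    [1.2, 5.0],
]
# Tallest stack first: (0, 1), then (1, 1), (0, 0), (1, 0).
```

Working the tall regions first reduces the chance of the magnet dragging against neighboring parts, which is why the cell also monitors the parts as they fall.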

The cell incorporates six robots in total. One robot to do the bin picking, another to pull the singulated brackets from the flex-feed conveyors, a third robot to pick the main frame and the headrest tubes from the bowl feeder, two welding robots, and an unload robot.

“The current system gives our customer about a 2-1/2 hour buffer per bin on a 20-second cycle machine,” says McCoy. “We did this system with two bins, so that they don’t have to wait for the bin to empty to replace it. The robot will empty one bin and then they have 2-1/2 hours of the robot working on bin number 2 before somebody has to replace bin number 1.”

Robotic random bin picking and part loading system uses 3D vision guided robots with magnetic grippers to locate and pick parts for a heat treating operation. (Courtesy of Midwest Engineered Systems Inc.)

Before JR automated the process, the seat back parts had to be manually loaded into the weld fixtures. McCoy says this is the second system of its kind they’ve integrated for this customer. The first system has been in production for about six months.

Sensors, Software and EOAT
Random bin picking requires a convergence of technologies, particularly three main components that raise the robot’s intelligence: sensors, software, and end-of-arm tooling. Development in all three areas is moving us ever closer to that elusive prize.

“One of the big ones is the hardware,” says McCoy. “That just continues to get more cost-effective, with higher resolution and faster processing times. The ability to generate 3D point data is becoming more industrial hardened. Companies like Keyence and LMI are releasing very interesting products right now. It seems like everybody is racing to get 3D point clouds into the industrial space.

“More importantly is the software component,” he continues. “There are a lot of cool companies right now independent of the robot manufacturers that are trying to say that this is nothing more than a race for software. You have companies like Recognition Robotics that are doing really cool six degrees of freedom systems with 2D cameras.” (We covered Recognition Robotics’ random picking system in Intelligent Robots: A Feast for the Senses.)

“I think all the hardware manufacturers are making huge strides in the resolution of 3D point cloud generating hardware,” says McCoy. “There are a lot of clever software developers out there that have taken advantage of it. I do feel like we’re at the point where those two technologies are converging at the right time. I’m excited to see what will be out there by 2020.”

Check out this case study and video by RIA Certified Robot Integrator Midwest Engineered Systems for the (pictured) random bin picking and part loading system for a heat treating operation.

Interference Avoidance
A critical part of the software development for bin picking, especially for random bin picking environments, is the set of algorithms that prevents the robot and its end-of-arm tool from colliding with the sides of the bin and other objects.

“That’s a huge part of bin picking that most people don’t think about,” says FANUC’s Dechow. “In particular, our Interference Avoidance feature is a really elegant solution. You’re sending the robot to a kinematic pose that you have no prior knowledge about. That robot can be literally trying to drive through the wall of a bin to get to a part.

“It’s a very interesting task to have software that’s smart enough to make sure the robot never gets itself near a wall,” he continues. “But the more complex task is to have the robot decide where it wants to go and if it’s going somewhere that might cause any interference, it can automatically change the approach path of the robot in all axes of freedom to move the robot towards that part and grip it without touching anything. That’s the high end of Interference Avoidance.”

FANUC’s Interference Avoidance feature is all software-based and comes standard with the 3D Area Sensor. Other robot manufacturers and software developers have similar algorithms, often called obstacle avoidance.

“You model the end joint of the robot, you model the tooling on the end of the robot, and you model anything within the gripping area that could interfere with the task,” explains Dechow. “As the robot gets sent to totally unknown random points, the software calculates the desired target and changes the approach and gripping points. Like a part leaning up against the wall of the bin, or a part at a particular angle where the robot would have to drive it into the wall in order to get the part. It’s an extremely elegant side of bin picking that not a lot of people think about.”
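In one dimension, the clamping idea behind such avoidance can be sketched as follows. Real systems like the one Dechow describes reason over full kinematic poses in all axes of freedom; this toy version only shifts an approach point away from two modeled walls, and all geometry here is assumed:

```python
# Toy 1D interference check: keep a box-modeled tool off the bin walls.
TOOL_HALF_WIDTH = 2.0          # modeled end-effector half-width, arbitrary units
BIN_MIN, BIN_MAX = 0.0, 10.0   # inner wall positions along one axis

def plan_approach(target_x):
    """Return an approach x-coordinate that keeps the tool clear of both walls."""
    lo = BIN_MIN + TOOL_HALF_WIDTH
    hi = BIN_MAX - TOOL_HALF_WIDTH
    # Clamp: shift the approach inward instead of driving through a wall.
    return min(max(target_x, lo), hi)
```

A part leaning against a wall, for example, gets an approach point pulled toward the bin center; the real software additionally re-angles the tool so the grip still succeeds.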

AI for Bin Picking
For one software developer, that’s all they think about. Born out of the labs at NASA, where researchers were preparing humanoid robots for the unknowns of space exploration, Universal Robotics has been on a mission. They’re in a race for accuracy.

A six-axis vision guided robot moves randomly stacked items from one bin to another with speed and ease, regardless of part orientation and bin depth. (Courtesy of Universal Robotics, Inc.)

We first introduced you to Universal Robotics in 2013, when the company released its unlimited depalletization system for recognizing random boxes of any shape, size or orientation. Now the company is taking its sophisticated algorithms and applying them to random bin picking.

“Picking from a bin is all about accuracy, because every pick requires a different path plan,” says David Peters, Chief Executive Officer and cofounder of Universal Robotics, Inc., in Nashville, Tennessee. His brother and cofounder, Alan, is the Chief Technology Officer and the scientist behind the algorithms developed in cooperation with NASA and Vanderbilt University. David brought his background in the motion picture business to the private equity funding pursuits of the brothers’ venture.

“The single biggest thing that’s happened with the technology in the last couple of years is we’ve worked towards 100 percent accuracy in transactions,” says Peters. “Underneath the hood, the algorithms are getting more sophisticated. The learning function is getting broader. All of those things around the artificial intelligence continue to expand based on the application. If you think about how we started out with moving boxes and we’ve moved into bin picking, and now the bin picking has begun to expand to where we are capable of handling thousands of SKUs.”

Universal Robotics’ Random Bin Picking software solution has two integral parts in its modular application platform. Spatial Vision handles traditional vision processing and communication between devices, manages sensor data, and drives control of robot functions and calibration. Neocortex combines its real-time machine learning, or artificial intelligence, with the 3D vision results generated by Spatial Vision.

“We’re using parallel processing,” says Peters. “That gives us the capacity to take large data sets, and in milliseconds, analyze what they are, understand what’s happening, and then drive behavior in the machine to whatever the transaction requires. We’re at a point where we can compete with human speeds in the supply chain.”

Order Fulfillment
One supply chain solution involves a partnership between Universal Robotics, R/X Automation Solutions, and Yaskawa Motoman to provide a turnkey solution for robotic pharmacy order fulfillment systems.

“We’ve put Neocortex on some industrial robots for order filling, which until this point has required manual activity,” says Peters. “We’ve built triple redundancy into the system to absolutely 100 percent guarantee that what gets picked is the drug that was ordered.”
This video demo of random bin picking for pharmaceutical order fulfillment shows how quickly the stacked items are picked, regardless of shape, size, orientation, and their position in the bin.

“R/X Automation’s customers are the big direct-to-customer pharma companies,” explains Peters. “In their transactions they are supplying the entire 100 percent validated automation solution that’s required for the customer, and then we’re handling the work cell component. Historically, this is where the person would be in the supply chain. We’re providing the turnkey solution for that particular cell, including the actuation, the industrial robots, the sensors, the controls, the processing, and the safety that’s required for that machine to operate in that work cell.”

Structured light sensor and sophisticated software map the positions of random items in a bin. (Courtesy of Universal Robotics, Inc.)

Universal Robotics software is agnostic to robot, actuation, and sensor. The system typically uses off-the-shelf industrial-grade structured light sensors coupled with pairs of cameras for stereoscopic vision.
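Stereoscopic depth from a camera pair follows the standard triangulation relation z = f·b/d, where f is the focal length in pixels, b the baseline between the two cameras, and d the matched-pixel disparity. A minimal sketch with illustrative numbers:

```python
# Standard stereo triangulation: depth = focal_length * baseline / disparity.
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Triangulated depth in meters for one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (no match / point at infinity)")
    return f_px * baseline_m / disparity_px

# A 700 px focal length, 0.10 m baseline, and 50 px disparity give ~1.4 m.
```

The relation also explains why nearby bins image well: disparity shrinks with distance, so depth resolution is best close to the cameras, right where the parts are.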

“We can link together as many sensors as required for the task,” says Peters. “So if the customer says I need you to read a bar code, or a serial number, or I need you to do an OCR read of this object before it’s put in this box, we just put the sensor either on the robot arm or next to the robot.

“Because we work with any sensor, as sensors improve, we can fold that right into existing systems. The latest version of the Intel i7 processor just came out. It has exquisite efficiencies. Guess what, I can pull that right into my solution. That gives me an advantage in being able to handle more sophistication, speed, and processing.”

Universal Robotics software resides on a PC. “We have white-labeled what is essentially the equivalent of a gamer machine,” explains Peters. “It’s a very fast, high-end PC. Spatial Vision always resides on that machine in the cell. Neocortex can reside on that machine, in a server room in a factory, or can reside in the cloud.

“We load into Spatial Vision the model of the robot, the CAD, and it will automatically pull an understanding of how that machine actuates and moves through space. (The same with the end effector.) It then uses that as part of its drivers that are making the decision on how to do the obstacle avoidance.”

E-commerce order fulfillment is a big growth area for robotic bin picking.

“Walmart, Costco, Kmart, Target, they all have online ordering,” says Peters. “Those tasks have until now required a human once it’s in the pack station. That process of packing outbound boxes is a really good application for what’s happening with bin picking.”

He says the items can be any geometry. But picking up a bottle of detergent versus a tube of lip balm is very different. The challenge then becomes the end effector.

Grasping Challenges
Many believe smart gripping technology is the final challenge we must surmount to move random bin picking into the mainstream. FANUC’s Dechow sees great potential for gripping solutions.

“I find that more often than not we can extract the required data from a 3D or 2D scene, but the question becomes how do we get something into the bin and around the object to successfully pick it up given the data we have available?

“The problem is that one gripper often doesn’t solve every random resting state of a part that’s in a bin,” says Dechow. “Certainly the imaging components are critical in determining what part can be imaged, but the end effector becomes one of the last components on the critical path in determining whether or not the part can be picked.”

Soft-actuating gripper picks fresh produce from a bin using 3D vision guided robotics. (Courtesy of Soft Robotics Inc.)

Adaptive gripping technologies such as Soft Robotics’ flexible grippers are poised for the challenge. Check out this video of the soft-actuating gripper bin picking random e-commerce items with the aid of Universal Robotics Spatial Vision software (3:17 into the footage).

Peters says just like they can dial in the CAD for conventional end effectors, they can do the same for these squishy grippers.

“You know its inherent reach capabilities and the range of variables for how each actuated finger moves,” he says. “When you put the CAD in, you can also set those variables.”

Now, check out this video of an electroadhesion robotic gripper picking boxes from bins and shelving compartments.

Various companies are working on standardizing bin picking technology for a variety of industrial and order fulfillment applications. This makes it easier and more cost-effective for small and medium-sized companies to implement robotic random bin picking. This video shows some off-the-shelf solutions.

Even Google’s moonshot chasers are getting into the act. Watch a team of robots equipped with adaptive grippers use deep learning to pick random e-commerce items.
Groundbreaking research like this has far-reaching implications for not only that apex of robotic random bin picking, but also other complex robotic path planning and manipulation tasks. Adil Shafi would be proud. The Holy Grail is almost in our grasp as 2020 comes into view.


Originally published by RIA via www.robotics.org on 03/17/2016
