Robotics Industry Insights
The Power of Cloud Robotics
by Tanya M. Anandan, Contributing Editor
Robotic Industries Association Posted 09/27/2019
Cloud technologies have transformed the way people live and perform daily tasks, and now the cloud has come to robotics.
“The cloud” is a term for software and services hosted on a global network of servers and delivered over the Internet, allowing users to access information on virtually any device. Those servers store and manage data, run applications, and deliver content.
These applications are now seeing massive growth in the robotics space, as cloud robotics enables the use of cloud computing, cloud storage, and other technologies built on centralized infrastructure and the benefits of shared services.
The emerging trend is already being implemented in robotics on the move in logistics, healthcare, mining, agriculture, construction and more, and is enabling supervised autonomy and machine learning. Cloud robotics allows for higher levels of human-robot interaction and learning, and is contributing to the digital transformation of companies. As with any developing technology, enabling cloud systems raises questions about safety and security, and companies are quick to address these concerns.
Cloud technologies are enabling more powerful robot solutions: solutions that can offload computationally heavy tasks, support deeper cognitive collaboration, and greatly increase the data available to share with other machines and humans.
Offload Computation-Heavy Processes
The story of cloud robotics begins with robot developers, says Roger Barga, GM Robotics & Autonomous Services at Amazon Web Services (AWS) in Seattle, Washington. Robot developers are no longer limited by the amount of processing power and software they can cram on board their robots. One of the primary advantages of cloud robotics is the extra computing power it provides for expanded robot capabilities.
“Many of the algorithms developers would like to run won’t fit in the memory footprint,” says Barga. “Developers can now partition work by what should run on the robot and push the more computationally heavy tasks that can tolerate latency over to the cloud.”
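The partitioning Barga describes can be pictured as a simple placement decision: latency-critical work stays on the robot, while heavy work that can tolerate a network round trip goes to the cloud. The sketch below is illustrative only; the names (`Task`, `place_task`) and the memory budget are hypothetical, not part of any AWS API.

```python
# Hypothetical sketch of robot/cloud task partitioning: keep
# latency-critical work on board, offload heavy, latency-tolerant work.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    memory_mb: int          # estimated working-set size
    latency_tolerant: bool  # can this task wait on a network round trip?

ONBOARD_MEMORY_BUDGET_MB = 512  # assumed onboard headroom

def place_task(task: Task) -> str:
    """Decide where a task should execute."""
    if task.latency_tolerant and task.memory_mb > ONBOARD_MEMORY_BUDGET_MB:
        return "cloud"   # heavy and tolerant of latency: offload it
    return "robot"       # safety and control loops stay on board

print(place_task(Task("obstacle_stop", 32, False)))         # robot
print(place_task(Task("fleet_path_planning", 4096, True)))  # cloud
```

The key property, echoed later in the article, is that anything with a hard real-time deadline never crosses the network.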
AWS is one of the world’s most comprehensive and broadly adopted cloud platforms. Millions of users, including the fastest-growing startups, largest enterprises and leading government agencies, trust the Amazon Cloud to power their network infrastructure.
Other leading cloud platforms, Microsoft Azure and Google Cloud, also have notable rosters. All three support cloud robotics. Innovative robotics companies are increasingly seeking out cloud providers to support their internet-connected robots. Many are pursuing a multi-cloud approach, picking and choosing services from different providers. Others are exploring private, on-premises cloud options. More service options and the typical pay-as-you-go pricing model for cloud services make it easier for both large multinationals and startups to access only the computing power, storage and application resources they need.
Fetch Robotics will tell you their cloud robotics platform provides the only autonomous mobile robot (AMR) solution for material handling and data collection that deploys in hours versus weeks, and it doesn’t require immediate integration to show value. They call it “on-demand automation.”
A cloud software platform efficiently manages a fleet of cart-carrying autonomous mobile robots at this third-party logistics warehouse for an automotive manufacturer. (Courtesy of Fetch Robotics, Inc.)
“We’re one of the only companies that have all of our systems supported in the cloud,” says Melonee Wise, CEO of Fetch Robotics in San Jose, California. “You don’t have to install infrastructure. You don’t have to install stickers or QR codes. You don’t have to install IT equipment or dedicated Wi-Fi. You don’t need WMS integration to get started, but can certainly add in later once you have proven value out of your workflows.
“You can just unbox the robot, connect it to your internal Wi-Fi, and the robot will start working on day one.” Wise says return on investment is also quick at only three to six months.
Fetch Robotics material handling AMRs include two large-payload platforms that transport pallet loads of up to 500 kg or 1500 kg. Part and parcel transport platforms come in a variety of configurations, including one model that transports wheeled carts, one with a modular conveyor top, and another platform with an integrated touchscreen and adjustable shelving. Data collection AMRs automate inventory counting by tracking RFID tags on products and bins in warehouses and factories.
Established in 2014, the startup just closed a $49.5 million Series C funding round in August and has raised nearly $100 million in total venture funding. Customers include third-party logistics providers DHL Group, Universal Logistics Holdings, Ryder System and other 3PLs, plus manufacturers in the automotive, aerospace and electronics industries.
Over-the-Air Updates, Upgrades, and Management
FetchCore Cloud Software handles all of the robot management. Scheduling the robots, pushing over-the-air software updates, and communicating with third-party devices, such as conveyors, automatic doors, elevators and hand scanners, are all managed in the cloud. Beyond supporting all of the different AMR configurations, the software supports large numbers of robots.
“We can deploy 1 to 100 robots without any difference, even more than 100,” says Wise. “The scale doesn’t matter. It gives us a big advantage because typically if you want to scale up a system with a lot of robots, you have to add more servers. For us, since we’re in the cloud, it’s done automatically.”
Regardless of payload, or whether it’s a cart-transporting robot or one with a conveyor top, the software is the same. Wise says the software knows the capabilities of each of the robots, so tasks are allocated to the robots dynamically based on those individual capabilities.
For example, when a warehouse operator picks a bunch of goods to a cart and then deposits the cart at the end of an aisle, they typically scan a barcode. This triggers the system to request a robot with the cart transport capability to come pick up the cart. The different robots communicate with Fetch's cloud-based software and the correct type of robot is dispatched to autonomously move that cart from one location to another. At any given moment, the system knows where each robot is and what it’s doing, and chooses the closest or most opportune robot for each task.
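The capability-based dispatch just described can be sketched in a few lines: filter the fleet to idle robots with the required capability, then send the nearest one. This is a hypothetical illustration; the names (`Robot`, `dispatch`) and the fleet data are invented, not Fetch's actual schema.

```python
# Illustrative capability-based dispatch: pick the nearest idle robot
# that can perform the requested task.
from dataclasses import dataclass
from math import hypot

@dataclass
class Robot:
    rid: str
    pos: tuple            # (x, y) position in the facility
    capabilities: set     # e.g. {"cart_transport"}
    busy: bool = False

def dispatch(robots, capability, pickup):
    """Return the closest idle robot with the capability, or None."""
    candidates = [r for r in robots
                  if capability in r.capabilities and not r.busy]
    if not candidates:
        return None  # no eligible robot; the request would queue
    return min(candidates,
               key=lambda r: hypot(r.pos[0] - pickup[0],
                                   r.pos[1] - pickup[1]))

fleet = [Robot("amr1", (0, 0), {"cart_transport"}),
         Robot("amr2", (5, 5), {"conveyor_top"}),
         Robot("amr3", (2, 1), {"cart_transport"})]
print(dispatch(fleet, "cart_transport", (3, 1)).rid)  # amr3
```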
Using the cloud-based web interface, a warehouse manager can designate what Wise calls “the rules of the road” for their facility by creating robot-executable workflows. An example of a workflow might be for a robot to visit these five stations every 15 minutes, or when someone scans this barcode or presses that button, go do this task.
“It gives our customers the ability to create whatever they want the robot to do,” says Wise. “But they don’t have to be physically with the robot, and the workflows can be transmitted to hundreds of robots over the cloud.”
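A "rules of the road" workflow of the kind Wise describes could be expressed as plain data that a drag-and-drop interface generates: a trigger plus an action. The field names below are purely illustrative, not FetchCore's actual format.

```python
# Hypothetical declarative workflows: a schedule-driven patrol and a
# barcode-triggered cart pickup, matched against incoming events.
workflows = [
    {"name": "patrol_stations",
     "trigger": {"type": "schedule", "every_minutes": 15},
     "action": {"type": "visit", "stations": ["S1", "S2", "S3", "S4", "S5"]}},
    {"name": "cart_pickup",
     "trigger": {"type": "barcode_scan", "location": "aisle_7_end"},
     "action": {"type": "transport_cart", "dropoff": "staging_area"}},
]

def matching_workflows(event):
    """Return workflows whose trigger matches an incoming event."""
    return [w for w in workflows
            if w["trigger"]["type"] == event["type"]
            and all(event.get(k) == v
                    for k, v in w["trigger"].items() if k != "type")]

evt = {"type": "barcode_scan", "location": "aisle_7_end"}
print([w["name"] for w in matching_workflows(evt)])  # ['cart_pickup']
```

Because the workflow is just data, the cloud can push it to hundreds of robots at once, which is the point of the quote above.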
Watch Fetch Robotics’ Wise demonstrate this cloud-based user interface. For ease of use, it’s all drag and drop, requiring no robot programming knowledge on the user’s part.
These robots may have their heads in the cloud, but that doesn’t mean there’s no brain on board. All of the robots are autonomous units, each with their own intelligence.
Mobile robots, especially those in the intralogistics space, must navigate warehouses, factories and distribution centers while safely maneuvering around workers, forklifts and other equipment. This requires a fusion of sensors and software packed on board the robot.
To safely navigate the space, the robot must first map its environment. That data can be pushed to the cloud for processing and building the map, which can then be transmitted back down to the robot for local navigation. The same map is shared with other robots in the facility via the cloud.
If a robot encounters a problem it can’t resolve on its own, a warehouse manager can assist the robot via the cloud.
“In the user interface, you can click on the map and ask the robot to go there. You can also tell the robot where it is on the map,” says Wise. “That’s very unique to cloud-based systems.”
This level of human-robot assistance is called supervised autonomy.
“Being in the cloud gives us a lot of power for supporting the platforms,” says Wise. “We know everything about the robots all the time. We can support customers globally 24/7.”
Fetch Robotics partnered with Ricoh USA two years ago. You may know Ricoh as a printer and copier company, but they also offer a suite of enterprise support services.
“They have a call center team that supports all of our robots around the clock,” says Wise. “Our support team can react to any customer issues immediately. That’s really important to us, because we’re deployed in 19 different countries.”
A Note on Safety: Supervised autonomy should not be confused with teleoperation, which implies full control.
Roller-top autonomous robot communicates via the cloud with third-party devices to facilitate conveyor-to-conveyor material transport. (Courtesy of Fetch Robotics, Inc.)
“We typically don’t give the user full control of the robot at any one time,” explains Wise. “They have to be working in conjunction with the autonomous system. Supervised autonomy is giving a robot a task, not direct commands. We don’t tell the AMR how to get there and at what speed, merely where to go.”
The robot will figure out the optimal, safe route, avoiding obstacles and yielding to humans. Wise says they don’t want people to have complete control of the robot, for safety reasons. Human error could spell the difference between driving the robot into a wall, or worse, a person.
“The function of the AMR in terms of its safety comes from the software on the robot. It’s not in the cloud,” says Wise. “We don’t care about latency issues, because we don’t give robots real-time commands. We give them task-level commands. Getting a task command three seconds late doesn’t matter. Getting a stop command three seconds late really matters.”
Plus One Robotics builds perception software and integrated solutions for each picking (piece picking), case packing, and parcel induction applications. Their quick deployment solution consists of a robotic arm with base and gripper, their robot-agnostic AI-enabled perception software, lights and cameras, and Yonder, a cloud-based remote robot management system.
“We provide the eyes and the brain,” says Dan Grollman, Ph.D., Senior Engineer at Plus One Robotics, a San Antonio-based company. Grollman leads the startup’s satellite office in Boulder, Colorado. “We sell to customers in the logistics space who want their valuable humans to do more interesting work and leave the drudgery to robots.”
That’s why the company slogan is “Robots work. People Rule.” This is where cognitive collaboration comes in.
Cloud software platform for logistics applications enables people and warehouse robots to work collaboratively and efficiently in parcel handling and each picking. (Courtesy of Plus One Robotics)
Through the cloud, Yonder empowers a human supervisor, called the crew chief, to assist the robot whenever it encounters a situation in which it’s not sure what to do next. For example, the item or parcel may be askew and difficult to grasp or perceive properly. The robotic system will provide the crew chief with the information they need (camera images, etc.) so they can make the required decision.
“The crew chief clicks on a pick point (on screen) and provides any additional information needed,” explains Grollman. “That data gets sent back down to the robot, and the robot continues autonomously.
“Our goal is to provide the crew chief with enough situation awareness that they can make an appropriate decision to assist the robot,” he continues. “The robot is in one location and the human could be any place else. Doesn’t matter what time zone, what part of the world. That interaction between the robot and the crew chief all takes place in the cloud.”
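The escalation loop Grollman describes amounts to a confidence check: pick autonomously when the perception system is sure, otherwise send the evidence to a remote human and wait for a decision. The sketch below is hypothetical; `Perception`, `choose_pick`, and the threshold are illustrative stand-ins, not Yonder's API.

```python
# Illustrative supervised-autonomy loop: escalate low-confidence picks
# to a remote crew chief, then continue autonomously.
from dataclasses import dataclass
from typing import Tuple

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for autonomous picking

@dataclass
class Perception:
    pick_point: Tuple[float, float]  # best onboard guess
    confidence: float                # 0.0 to 1.0

def choose_pick(perception: Perception, ask_crew_chief):
    """Pick autonomously when confident; otherwise escalate to a human."""
    if perception.confidence >= CONFIDENCE_THRESHOLD:
        return perception.pick_point
    # In the real system this would send camera images to the remote
    # crew chief over the cloud and block on their on-screen click.
    return ask_crew_chief(perception)

human_click = lambda p: (0.12, 0.48)  # the crew chief's chosen point
print(choose_pick(Perception((0.1, 0.5), 0.95), human_click))  # (0.1, 0.5)
print(choose_pick(Perception((0.1, 0.5), 0.40), human_click))  # (0.12, 0.48)
```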
The crew chief can be customer personnel or a Plus One team member. The latter is more common.
“We have a median response time of under 10 seconds, from robot request to a human response,” says Grollman. “Our goal is to get it lower. Still, if you compare that to a human having to walk over to a robot, even if it’s across the room, it’s a huge timesaver. The nice thing about the remote crew chief is that one person can service many different robots.”
The remote robot management solution can connect a crew chief with an unlimited number of robots across the cloud.
“The reason we call it cognitive collaboration is because it’s remote decision assistance,” says Grollman. “It’s at the level of decision-making and not at the level of direct control.
Watch Plus One CEO and Cofounder Erik Nieves explain how supervised autonomy, or cognitive collaboration, is that “missing middle” of automation.
“We’re not trying to do 100 percent autonomy right out of the gate,” says Grollman. “Yonder is a great tool to get people up and running and in production quickly, and it develops from there.”
Data Collection and Sharing for Insights
Big data is a welcome byproduct of all that extra computing power the cloud affords to robots. While the robots go about their picking tasks or transport duties, or any other robotic job, they’re moving through our physical world on an all-digital highway. Just by the nature of their digital service lives, robots are collecting massive amounts of data about their movements, the environments around them, process variables and outcomes, and even what we humans may be up to at any given moment.
Fetch Robotics’ cloud platform provides telemetry and analytics. If a customer wants data about their different workflows (how long each one took and how far the robot traveled, for example), that information is readily available.
“It’s not just what the robot did, but also what the robot observed,” says Wise. “If you have 20 robots, they can see almost an entire 300,000-square-foot facility at one time. They can build a model of the warehouse. We can show where people walk over time, or where forklifts drive over time. We can show a warehouse manager how congested their facility is. That’s part of our advanced data analytics. It’s an algorithm we have on our robots and we use that to provide extra value to the customer. It’s something they can opt into.”
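The congestion analytics Wise mentions can be thought of as aggregating every robot's observations of people and forklifts into an occupancy grid over time. This is a made-up, minimal illustration of that idea; the cell size and data are assumptions.

```python
# Illustrative congestion heatmap: bin observed positions of people and
# forklifts (reported by robots) into coarse grid cells.
from collections import Counter

CELL_M = 5  # 5-meter grid cells (assumed)

def to_cell(x, y):
    """Map a facility position in meters to a grid cell."""
    return (int(x // CELL_M), int(y // CELL_M))

observations = [  # (x, y) positions of detected people/forklifts
    (2.0, 3.0), (3.5, 4.0), (12.0, 3.0), (2.5, 2.0), (2.9, 4.9),
]

heatmap = Counter(to_cell(x, y) for x, y in observations)
print(heatmap.most_common(1))  # [((0, 0), 4)]  <- the most congested cell
```

The most-visited cells point a warehouse manager at the choke points in the facility.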
Wise says they’re only collecting and using a small fraction of the data coming off their robots right now. The robots generate 30 to 60 gigabytes of data a day. She would like to move all of that data to the cloud, but current network technology doesn’t support it. The promised arrival of 5G may or may not be the end-all, be-all solution.
AWS’ GM of Robotics & Autonomous Services says the reality is that most software runs locally on robots. It’s the data that comes off of the robots that’s valuable to send up to the cloud.
Cloud-connected robotic walker for patients with Parkinson’s and other musculoskeletal conditions automatically maneuvers around obstacles and responds to voice commands. (Courtesy of Amazon Web Services)
“One of our customers is Robot Care Systems. Lea™ is their walker robot,” says Barga. “Lea has 72 sensors. All of the onboard processing power is used to keep the patient safe. They stream the data off of Lea and are building patient health dashboards that doctors can monitor to see how active a patient has been and if they become unstable when walking with Lea.
“They have a digital exhaust that gives them insights,” he says. “They are working right now to build predictive models. For example, is someone progressing the way we think they should? Is their stability regressing? They did not have an update mechanism before. Now, with the cloud, they can do over-the-air updates to Lea.”
AWS RoboMaker is a service that makes it easy to develop, test and deploy intelligent robotics applications at scale. RoboMaker extends the most widely used open-source robotics software framework, the Robot Operating System (ROS), with connectivity to cloud services. The service provides a robotics application development environment, a robotics simulation service to accelerate application testing, and a robot fleet management service.
AWS also joins the growing fan club advocating open-source software for industrial automation, which the Robotic Industries Association highlighted earlier this year in ROS-Industrial for Real-World Solutions. Barga says AWS is in the final stages of joining the ROS-I Consortium. AWS is also a founding member of the technical steering committee for ROS 2, the latest version of the open-source software. He’s quick to point out that Microsoft, Intel, Bosch and Toyota Research Institute have also joined Amazon and other heavy hitters on the committee.
“No other cloud provider has the breadth and diversity of services we have from storage to analytics, to focused services for IoT analytics,” says Barga. “All of our cloud service extensions we have released as open source. All of our contributions to ROS we have released as open source software. Fleet management is uniquely ours, but we’re very transparent on what we’ve built upon. It’s the magic we add behind the scenes that sets AWS apart.”
AWS customers use computer simulation to validate that their applications are running correctly before they allow their developers to check in any new code.
“For them as a robotics company to have racks of machines sitting around to run simulations every time a developer checks in code, that’s a perfect example of how to use our cloud service that runs simulation-as-a-service,” says Barga.
More Powerful Solutions
Amazon AI services add voice commands to the robotic walker’s capabilities. The Lex speech recognition and Polly speech generation services allow patients to talk to the robot and the robot to respond. Users can change the language, and the services support multiple languages.
“If you’re deploying a robot in Japan, we have the Japanese versions of the Polly and Lex cloud services,” says Barga. “That’s also how people interact with Lea. When they summon Lea from across the room, that voice command is picked up by a microphone on Lea and sent up to a cloud service that transcribes it into text. It understands what the customer is saying and then sends commands back to Lea.
“We’re not trying to build everything ourselves. Instead, we’re trying to help make it easy for roboticists to add power to their robot through AWS services.”
Watch AWS’ Barga demonstrate the power of the RoboMaker service at the ROS-Industrial Conference in December 2018.
Plus One uses the cloud for continuous improvement through machine learning, Grollman explains.
“We use machine learning in two places. One is in the onboard system. During autonomous operation, we use a machine learning system to improve the autonomy. The second place is every time a human interacts with the robot. That is a teaching point. We can use that information to improve the system’s own capability and confidence, so perhaps next time it doesn’t need to ask for help.”
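The second learning loop Grollman describes treats every human intervention as a labeled training example. A minimal, hypothetical sketch of that harvesting step (names like `record_intervention` are invented for illustration):

```python
# Illustrative teaching-point capture: each crew-chief assist becomes a
# (sensor input, human decision) pair for later model training.
training_examples = []

def record_intervention(sensor_snapshot, human_decision):
    """Store a labeled example harvested from a human assist."""
    training_examples.append({"input": sensor_snapshot,
                              "label": human_decision})

record_intervention({"image": "frame_42.png", "confidence": 0.41},
                    {"pick_point": (0.12, 0.48)})
print(len(training_examples))  # 1
```

Over time, retraining on these pairs is what lets the system "perhaps next time not need to ask for help."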
He says the goal is algorithmic adaptation, versus a software developer analyzing the data and writing new code or updating the system.
Their remote robot management solution also includes an API for publishing performance data to third-party software. As the system runs, it publishes statistics on its activities, which can be analyzed to identify any bottlenecks.
“That’s one of the nice things about cloud robotics, even apart from Yonder and the crew chief, just having that data exposed and people having access to it, is a reason to put your robot on the cloud.”
Wise says Fetch Robotics is very focused on the long-term vision of doing machine learning with the data from the robots.
“We build models. Those models are trained on the data we get from the robots and then we use that to improve the system constantly. We also use that data to provide insights to the customer. We can actually take large datasets and make the robots better.”
She cites an example. It’s lunchtime at a warehouse facility and two drivers park their forklifts at the ends of an aisle, completely blocking access in or out.
“We have an alert that looks for when a robot drives back and forth between two points several times in a short amount of time. We have a whole bunch of detectors that look for weird behaviors or very specific behaviors in the robots. Then the robots alert us when there is a problem.”
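A detector like the one Wise describes can be reduced to a pattern check on a robot's recent waypoint trace: repeated alternation between the same two points suggests a blocked route. This is a hypothetical reconstruction, not Fetch's actual algorithm.

```python
# Illustrative "back and forth" detector: flag a robot that bounces
# between the same two waypoints several times in a row.
def is_oscillating(waypoints, min_repeats=3):
    """True if the trace ends in >= min_repeats A-B alternations."""
    if len(waypoints) < 2 * min_repeats:
        return False
    a, b = waypoints[-1], waypoints[-2]
    if a == b:
        return False  # not alternating between two distinct points
    tail = waypoints[-2 * min_repeats:]
    return all(p == (a if i % 2 else b) for i, p in enumerate(tail))

trace = ["aisle_3_north", "aisle_3_south"] * 3  # blocked aisle pattern
print(is_oscillating(trace))                        # True
print(is_oscillating(["dock", "aisle_1", "dock"]))  # False
```

In production such a check would also be windowed by time, so only rapid oscillations raise an alert.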
Machine learning is compute-intensive. All of that training runs in the cloud.
AWS customer Woodside Energy, an Australian oil and gas company, was moving materials from one side of their operation to another. Rather than expending valuable human capital to drive a transport vehicle, they deployed ground mobility robots in the field.
As the robots carried equipment from point A to point B, they would pass the company’s transparent glass reservoirs that hold oil for lubricating equipment. Woodside decided to put a camera on the robot to record the oil level and used AWS machine learning services to label the image data collected by the robot-mounted camera. Engineers labeled shots of the reservoirs at different percentages full, for example, 60 percent full or 20 percent full. From those labels they built a training set, and the resulting machine learning model now runs on the robots.
Each time the camera sees one of the oil reservoirs, it runs the machine learning model. If the oil level is below a certain percentage, it sends a notification back to a Woodside manager for a maintenance request to fill the reservoir.
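The monitoring step just described boils down to a threshold check on the model's output. The sketch below is illustrative; `estimate_fill_percent` and `notify_maintenance` are hypothetical stand-ins for the deployed model and the cloud notification service, and the threshold is assumed.

```python
# Illustrative reservoir check: run the level-estimation model on each
# camera frame and notify maintenance when the level is low.
LOW_LEVEL_THRESHOLD = 30  # percent full; assumed cutoff

def check_reservoir(image, estimate_fill_percent, notify_maintenance):
    """Estimate the fill level and alert if it is below threshold."""
    level = estimate_fill_percent(image)
    if level < LOW_LEVEL_THRESHOLD:
        notify_maintenance(f"Reservoir low: {level}% full")
    return level

alerts = []
level = check_reservoir("frame_001.jpg",
                        estimate_fill_percent=lambda img: 20,
                        notify_maintenance=alerts.append)
print(level, alerts)  # 20 ['Reservoir low: 20% full']
```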
“That digital transformation is helping a robot automate a task, but it’s also sending valuable data back so you can make your business run more efficiently,” says Barga. “That’s a story people don’t typically consider when they start thinking about robots and automation. When they do, they realize they need to have a long-term plan for the services they’ll use to ingest the data and the kind of analytics they can build around it to run their business better. Then things start getting really interesting.”
Last Word: Security
We’ll give Barga the last word on security, a critical subject especially when it comes to internet-connected robots. His AWS team, in collaboration with other roboticists, built the first security threat model for ROS 2.
“You have to secure the entire code chain from what runs when the operating system is running on the robot, all the way to your software that’s running on the robot. A number of companies have picked up this security model and used it to evaluate the security posture of their robot and its software stack.”
AWS IoT Greengrass software hails from Amazon’s IoT team. “Think of it as a big vault running on your robot,” says Barga. “Unless you have the combination to access the vault, you can’t do anything to the software. We’ve locked down your robot so nobody can install software, nobody can communicate with it and nobody can get data off of it.”
Cloud technology and its impact on robotics is interesting, indeed. Companies worldwide are just starting to embrace the Internet of Robotic Things, and it will be fascinating to see what comes down the pipeline next. These robots may have their heads in the cloud, but their eyes are on the prize: productivity, greater efficiency, better performance, easier lives for humans, and robots and humans connected.
Originally published by RIA via www.robotics.org on 09/27/2019