Your robots just aren’t smart enough
KUKA’s Andy Chang shares his vision for the factory of the future, and the critical transition from robot whisperers to AI-driven communications
My friend the robot? Human to machine (H2M) and machine to human (M2H) communications will enable simple, secure control of robots on the factory floor with the communications channels workers already know and use.
“It’s never about the technology. It’s always about the people.”
By Ken Herron
This is not a story about robots.
This is a story about how global manufacturing — powered by the Internet of Things (IoT) and Artificial Intelligence (AI), and driven by consumers’ rapidly rising expectations — is changing, and how a 119-year-old German automation company is leading this change from proudly weird Austin, Texas.
On the show floor of this year’s SXSW, in full view of LBR iiwa, a lightweight robot (pictured) built for human-robot collaboration, and KR AGILUS sixx, a six-axis robot that operates at high working speeds and has a real knack for flipping water bottles (the cowboy hat is not a standard accessory), I sat down with Andy Chang, KUKA’s Director of Product Marketing, Americas. In his role, Chang focuses on introducing cloud, web, and mobile technologies that augment traditional robotics operating technologies. I asked Andy to share his vision for the factory of the future. I also asked him what we need to do to get there faster. His answer? We need smart, connected, mobile robots — matrix production — with the global people and manufacturing community to support them.
Today’s manufacturing robots just aren’t that smart.
Today’s robots perform a single function. They’re fixed in one place. And they’re unable to tell if there’s an obstacle in their paths, let alone modify their actions based on differences in the materials they’re operating on, such as those caused by changes in raw products or environmental conditions. There was a long period of time when this didn’t matter. Product life-cycles — for cars, planes, appliances, and other items — ran 7-10 years. Optimum production efficiency meant massive “warehouses” filled with specialized workstations.
Shift back to today. Can you imagine having a seven-year-old iPhone? That’s the problem: neither can today’s consumers, and this change has affected more than how Apple, Samsung, and their smartphone peers manufacture their products. Retooling, whether for a new product or for even a minor change in an existing product, requires a significant investment in infrastructure. Even worse, it’s slow, which means shutting down production lines until retooling is complete.
By their very nature, OEMs are cautious and risk-averse. It’s worth noting that while the even more conservative and risk-averse financial services industry has fully embraced cloud technologies as they have matured and become more secure, manufacturers are still wary, making hyper-customization impossible. Smartphone manufacturers have had to evolve to enable the economical mass production of one, but they are the exception, not the rule.
Andy sees robots being able to perform 10-20 different processes, from welding, gluing, and drilling to cutting, screwing, and more. He sees robots being able to change their “personalities” on demand, giving factories the ability to reorient and retool their shop floors on demand. Leaving behind their fixed functions, robots will be able to automatically and continually self-adjust, self-orient, and self-train based on the needs they themselves identify. The retirement of a generation of “robot whisperers” who understand their charges so well that they can anticipate costly failures and maintenance makes it all the more urgent to have robots that can accurately predict their own needs for parts, supplies, and adjustments.
Today’s robots struggle to talk both to each other and to humans. Thanks to the exponential growth of smart home devices, the average consumer now has more connected products at home than the average worker does in a factory. Workers dream of being able to message and “chat” with their robots as easily as they do with the appliances and electronics in their homes.
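As a thought experiment, an H2M channel can be as thin as a parser that turns a worker’s chat message into a structured robot command. The verbs, command names, and message format below are invented for illustration and do not reflect any real KUKA interface:

```python
import re

# Hypothetical chat-to-command vocabulary; these phrases and command
# names are invented for illustration, not any real robot API.
COMMANDS = {
    "status": "REPORT_STATUS",
    "stop": "EMERGENCY_STOP",
}

def parse_chat(message):
    """Turn a worker's chat message into a (command, argument) pair."""
    text = message.lower().strip()
    # Positional commands carry an argument, e.g. "move to station 3".
    match = re.search(r"move to station (\d+)", text)
    if match:
        return ("MOVE_TO_STATION", int(match.group(1)))
    # Simple keyword commands carry no argument.
    for phrase, command in COMMANDS.items():
        if phrase in text:
            return (command, None)
    return ("UNKNOWN", None)
```

A worker typing “Move to station 3” would yield `("MOVE_TO_STATION", 3)` for a controller to act on, using the same chat channel they already use with colleagues.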
Manufacturing robots are now being loaded with sensors and the computational power to manage, analyze, and store the big data they generate on every one of their tools, joints, and actuators. Components are now talking to each other within and across robots, between robots and other machines, and between robots and their operators and owners. Robots’ self-awareness comes from that computational power combined with artificial intelligence. The end goal, according to Andy? Automation that will enable your grandmother to train a robot more easily than she can train her grandkids to tie their shoes.
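A toy model of that component-to-component chatter: an in-process publish/subscribe bus standing in for the factory messaging layer, with a maintenance service listening to joint telemetry. The topic names, payloads, and alert threshold are invented for illustration:

```python
from collections import defaultdict

class Bus:
    """Minimal in-process publish/subscribe bus standing in for a
    factory messaging layer (e.g., an MQTT broker)."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

bus = Bus()
alerts = []

# A maintenance service listens for joint telemetry from a robot
# (topic name and 80 C threshold are illustrative)...
bus.subscribe("robot1/joint3/temperature",
              lambda reading: alerts.append(reading) if reading > 80.0 else None)

# ...and the joint publishes its sensor readings as it works.
bus.publish("robot1/joint3/temperature", 72.5)  # normal, ignored
bus.publish("robot1/joint3/temperature", 85.1)  # too hot, alert raised
```

The design choice here is that publishers and subscribers never reference each other directly, only topics, which is what lets components talk within and across robots without being rewired for each new listener.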
Mobility requires power. Whether it takes the form of innovative new battery technologies, wireless power and charging, or both (if roads can wirelessly charge electric cars (http://inhabitat.com/israel-to-test-electric-roads-that-wirelessly-charge-vehicles-as-they-drive/), factory floors that power electric robots won’t be far off), Andy sees manufacturing working together as a community to overcome the challenges and obstacles presented by current battery technologies.
To get from where we are today to the factory of the future, Andy sees the industry needing to take three steps. The first is connectedness (see OPC Unified Architecture: https://en.wikipedia.org/wiki/OPC_Unified_Architecture); it’s the foundation for everything else in a smart factory. The second is the ability to extract insights from big data to learn about the individual manufacturing processes themselves. And the third is optimization: replacing the robot whisperers by predicting failure based on each machine’s actual operating time and working conditions.
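The optimization step can be sketched as a wear model that replaces the whisperer’s intuition with arithmetic: accumulate wear as a function of actual operating hours and load, then estimate how long the machine can keep running before scheduled service. The wear limit, load factors, and linear wear model below are all invented for illustration:

```python
def remaining_hours(log, wear_limit=1000.0):
    """Estimate operating hours left before scheduled service.

    `log` is a list of (hours, load_factor) sessions; heavier loads
    wear the machine faster. The linear model and the constants are
    illustrative, not a real maintenance algorithm.
    """
    if not log:
        return wear_limit
    wear = sum(hours * load for hours, load in log)
    total_hours = sum(hours for hours, _ in log)
    avg_rate = wear / total_hours  # wear accrued per operating hour so far
    return max(0.0, (wear_limit - wear) / avg_rate)

# A machine run hard wears out sooner than one run gently,
# even after the same number of clock hours.
hard = remaining_hours([(400, 1.5)])    # heavy loads: wear accrues fast
gentle = remaining_hours([(400, 0.5)])  # light loads: wear accrues slowly
```

This is exactly the shift from calendar-based maintenance (“service every six months”) to condition-based maintenance: two machines with identical clock time get different service dates because their actual working conditions differed.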
Tapping into both Austin’s talent and culture, Andy Chang is leading KUKA to lead the industry.
— Toby Ruckert (@tobyruckert) March 12, 2017
How do you get a robot to flip water bottles?
“What is easy for humans is hard for robots.”
First, you bring together a group of very smart people (having a whole lot of water bottles helps too).
Helmuth Radrich, KUKA’s Senior Software Engineer, shares the behind-the-scenes story: “When humans perform the bottle flip, they use a very complex motion, at least from a robot’s point of view. We couldn’t get anywhere near that with the robot’s motion, so what the robot lacked in technique and finesse (given the extremely short prep time for this year’s SXSW), we made up for with the robot’s speed and accuracy.”
Helmuth continued, “Because this particular robot doesn’t have any force (torque) sensors to measure the amount of water in each bottle, we filled every bottle with the same amount of water, so we could then calibrate the robot. To handle varying amounts of water, our flip models would have needed to be much more sophisticated.”
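Helmuth’s calibration point can be made concrete with back-of-the-envelope physics. For a bottle tossed straight up to land after exactly one flip, the spin rate must match the hang time, and the hang time follows from the launch speed. This simplified model (no air drag, no sloshing water) is ours, not KUKA’s, but it shows why every bottle had to behave identically:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def spin_for_one_flip(launch_speed):
    """Angular velocity (rad/s) needed for exactly one full rotation
    during the flight of a bottle tossed straight up at `launch_speed`
    (m/s). Ignores air drag and water slosh; illustrative only."""
    hang_time = 2 * launch_speed / G  # time going up plus time falling back
    return 2 * math.pi / hang_time    # one full turn spread over that time
```

A 3 m/s toss hangs for about 0.61 s and needs roughly 10.3 rad/s of spin; change the water level and the bottle’s moment of inertia changes, so the pre-computed spin no longer produces exactly one rotation, which is why every bottle was filled identically before calibration.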
According to Helmuth, the biggest challenge the KUKA Austin team found was the variability in the bottles themselves. “Robots thrive on predictability, but what happens when the ‘thin and squishy’ water bottles used for bottle flipping deform after every throw? To make matters even more complex, because of the water bottles’ injection molding process, bottles of the same brand and size vary tremendously – including wall thickness, weight, elasticity, and other factors – based on the specific machine they’re made on (as they’re produced for cost and speed, not uniformity). Our solution? We found the machine that made the best-performing water bottles and then pre-tested each bottle’s individual performance. The bottles you saw KR AGILUS sixx flipping at SXSW were the best performing bottles from that single bottling machine’s run.”
The bottom line is that your robot’s performance is a direct result of the smart team of people working behind it!