October 26, 2022

By Keith Shaw

It’s no secret that Amazon’s ability to ship goods quickly is partly a result of its use of robots, dating back to the company’s acquisition of Kiva Systems in 2012. That deal kickstarted the mobile robot revolution, which continues today as robot makers aim to put a variety of robots into warehouses, distribution centers and fulfillment centers for both online and traditional retailers.

Interestingly, the Amazon robots in use today are more like automated guided vehicles, navigating the warehouse by following a grid of encoded markers. Human workers are largely kept separate from these robots, much as large industrial robots are caged off away from people. They are unlike the more freely moving autonomous mobile robots (AMRs) offered by companies such as Locus Robotics, 6 River Systems and Zebra (Fetch). But that is now changing.

Earlier this week, in a corporate blog post, Amazon’s John Roach described how the company is testing a new class of robots that use artificial intelligence and computer vision “to roam freely throughout the fulfillment center.” In other words, Amazon is testing AMRs.

“This is the first instance of AI being used in autonomous mobility at Amazon,” said Siddhartha Srinivasa, director of Amazon Robotics AI. As my good friend John McClane says in Die Hard, "Welcome to the party, pal!"

The post explains that Amazon’s new robots include “semantic understanding,” defined as the ability to understand the 3D structure of their world “in a way that distinguishes each object in it and with knowledge about how each object behaves. With this understanding updated in real-time, the robots can safely navigate cluttered, dynamic environments.” To me, this sounds a lot like the obstacle avoidance and 3D vision technology that has been implemented in AMRs and other self-driving vehicles for years, but maybe I’m missing something.
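To make that distinction concrete, here’s a minimal, hypothetical sketch of my own (not Amazon’s code; the object labels and clearance numbers are made up) showing the difference between treating every detection as an anonymous blob and giving each tracked object a semantic label plus a crude motion prediction:

```python
# Toy illustration (not Amazon's system): a "semantic" obstacle check that knows
# what each object is and roughly where it is headed.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str        # e.g. "person", "cart", "pallet" (made-up labels)
    position: tuple   # (x, y) in meters
    velocity: tuple   # (vx, vy) in meters per second

def predict_position(obj: TrackedObject, dt: float) -> tuple:
    """Constant-velocity prediction; real systems use far richer motion models."""
    return (obj.position[0] + obj.velocity[0] * dt,
            obj.position[1] + obj.velocity[1] * dt)

def clearance_required(obj: TrackedObject) -> float:
    """The semantic part: keep more distance from a person than from a pallet."""
    return {"person": 1.5, "cart": 0.75, "pallet": 0.3}.get(obj.label, 1.0)

def is_waypoint_safe(waypoint: tuple, objects: list, dt: float) -> bool:
    """Check a candidate waypoint against where each object is *predicted* to be."""
    for obj in objects:
        px, py = predict_position(obj, dt)
        dist = ((waypoint[0] - px) ** 2 + (waypoint[1] - py) ** 2) ** 0.5
        if dist < clearance_required(obj):
            return False
    return True

# A person walking across the aisle forces a wider berth than a static pallet.
scene = [TrackedObject("person", (2.0, 0.0), (0.0, 0.5)),
         TrackedObject("pallet", (4.0, 1.0), (0.0, 0.0))]
print(is_waypoint_safe((2.0, 0.6), scene, dt=1.0))   # False: the person will be right there
print(is_waypoint_safe((4.0, 1.4), scene, dt=1.0))   # True: the pallet is static and needs little clearance
```

The point of the toy is the lookup in clearance_required(): once the robot knows which blob is a person and which is a pallet, it can plan around where each one is likely to be, not just where it is right now.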

The robots are being deployed in a few fulfillment centers, where they perform a narrow set of tasks, such as helping move items that are too long, too wide or otherwise unable to fit in the pods or on the conveyor belts used in Amazon’s fulfillment centers. For those items, employees have had to fall back on largely manual processes that might also involve pulleys and forklifts.

To be honest, I’m not sure whether what Amazon is doing here is any more breathtaking or innovative than what other robotics companies are accomplishing or deploying in other warehouses, since I’m not privy to what happens every day in a fulfillment center or distribution center. But Amazon being Amazon, the company of course believes it is on the cutting edge. “We are writing the book of robotics at Amazon,” said Srinivasa in the post, while also noting that it’s an ongoing process.

One interesting part of the post is the discussion of techniques for how the robots interact with human workers, such as how they can signal their next move without resorting to bright lights and loud sounds. The robots are doing this through “imitation learning,” in which they watch how people move around each other and learn to imitate that behavior. Zebra/Fetch is doing similar work to help robots avoid the situation humans face all the time: you’re walking down a hallway, someone is coming straight at you, and you have to do the whole “do I move left or right?” dance with that person. Perhaps the robots will figure out a better way to do this than humans can. Or they can always just move to the right.
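For anyone curious what “imitation learning” looks like at its simplest, here’s another hypothetical toy of mine (not Amazon’s or Zebra’s code): behavioral cloning, where you log which way people actually dodged in past hallway encounters and have the robot copy the majority choice in similar situations, with a cautious fallback for anything it hasn’t seen:

```python
# Toy behavioral cloning (not Amazon's or Zebra's code): learn hallway-dodging
# behavior from logged human encounters, then imitate the majority choice.
from collections import Counter, defaultdict

# Hypothetical logged encounters: (approach direction, other person's pace) -> what the human did.
demonstrations = [
    (("head_on", "walking"), "veer_right"),
    (("head_on", "walking"), "veer_right"),
    (("head_on", "walking"), "veer_left"),
    (("head_on", "running"), "slow_down"),
    (("from_left", "walking"), "veer_right"),
    (("from_right", "walking"), "veer_left"),
]

def train_policy(demos):
    """Count which action people chose most often in each situation."""
    counts = defaultdict(Counter)
    for situation, action in demos:
        counts[situation][action] += 1
    return {situation: actions.most_common(1)[0][0]
            for situation, actions in counts.items()}

def act(situation, policy, default="slow_down"):
    """Imitate the demonstrated behavior; fall back to something cautious if unseen."""
    return policy.get(situation, default)

policy = train_policy(demonstrations)
print(act(("head_on", "walking"), policy))      # "veer_right" -- the majority human choice
print(act(("from_behind", "running"), policy))  # "slow_down" -- never demonstrated, so be cautious
```

Real systems learn from far richer sensor data and continuous trajectories, but the idea is the same: the policy comes from watching people, not from a hand-written rulebook (or from bright lights and loud sounds).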

In terms of the bright-lights-and-sounds conundrum, my suggestion is that they just add a human voice that blurts out of the robot’s speakers, “HEY! I’m walkin’ here!” in a very distinct Brooklyn or Bronx accent.

All kidding aside, it will be interesting to see what Amazon does next in terms of AMR deployment. The company has invested a LOT of money in its Kiva-based mobile robot system, which brings products directly to human workers at packing stations. I can't see Amazon ripping those robots out and replacing them with a more Locus-like system, where AMRs travel to the shelves and human pickers place items into the totes they carry. More likely, fulfillment centers will see a combination of robots performing specific tasks depending on the items ordered, or Amazon will deploy AMRs in fulfillment centers that haven't yet adopted robots (a brownfield scenario).

If you’re interested in learning more about Amazon’s new robots, check out the blog post here.

Keith Shaw is the managing editor of Robotics-World. His opinions (and bad jokes) are his own.