
One day a robot may ask, 'Paper or plastic?'

Researchers from Stanford University create an autonomous checkout clerk capable of scanning and bagging your items in real time.

The steps involved during the autonomous checkout process. Not pictured: store manager HAL-9000 tending to the tills. Stanford AI Lab

Willow Garage's $400,000 open-source PR2 personal robot is a jack of all trades, dabbling in household chores, bartending, and even playing pool.

Now, according to the IEEE Spectrum Automaton blog, a group of researchers from Stanford's AI Lab is looking at putting it to work as a cashier in the retail store of the future.

PR2: "I can do anything!" Willow Garage

The primary goal of the research is to find a way for a robot to sort, grasp, and identify objects with minimal programming. To make this work, a 3D sensor on the PR2 captures a single frame, and from the raw depth data in that one frame alone the robot works out how to pick up an item.
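To give a rough feel for the idea, here is a toy sketch of picking a grasp point from a single depth frame. This is purely illustrative and not the paper's actual algorithm: it just selects the pixel of the nearest surface patch as the grasp target, whereas the real system searches for full grasp candidates.

```python
import numpy as np

def find_grasp_point(depth):
    """Pick a grasp point from one depth frame: the pixel of the closest
    surface (a stand-in for the paper's full grasp-candidate search)."""
    # Treat zeros as missing depth readings and ignore them.
    valid = np.where(depth > 0, depth, np.inf)
    row, col = np.unravel_index(np.argmin(valid), valid.shape)
    return int(row), int(col), float(depth[row, col])

# Toy 4x4 depth frame (meters); the item's nearest surface is at (1, 2).
frame = np.array([
    [0.9, 0.9, 0.9, 0.9],
    [0.9, 0.8, 0.6, 0.9],
    [0.9, 0.8, 0.7, 0.9],
    [0.9, 0.9, 0.9, 0.9],
])
print(find_grasp_point(frame))
```

The single-frame constraint is the interesting part: the robot does not get to walk around the object and build a full 3D model first.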

The team coined the phrase "autonomous checkout clerk," and the name fits: the robot locates the bar code by spinning the object in its hands, reads the numeric code, and then puts the item in a bag. No training or object-specific modeling is required, and the researchers report a grasping success rate of 91.6 percent across 100 varied items.
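The checkout loop the article describes can be sketched in a few lines. Everything here is hypothetical: the `checkout` function, the `read_barcode` callback, and the item format are all invented for illustration, not the researchers' code.

```python
def checkout(items, read_barcode, max_turns=8):
    """Grasp each item, spin it until a barcode is visible, then bag it.
    Returns (scanned codes, items whose code never appeared)."""
    scanned, unscanned = [], []
    for item in items:
        code = None
        for turn in range(max_turns):
            # None until the barcode faces the scanner on some rotation.
            code = read_barcode(item, turn)
            if code is not None:
                break
        if code is not None:
            scanned.append(code)   # item goes in the bag
        else:
            unscanned.append(item) # set aside for a human clerk
    return scanned, unscanned

# Fake items: the barcode becomes visible at rotation "face".
items = [{"name": "cereal", "code": "0123", "face": 2},
         {"name": "soup",   "code": "9876", "face": 0}]

def read_barcode(item, turn):
    return item["code"] if turn == item["face"] else None

scanned, unscanned = checkout(items, read_barcode)
print(scanned, unscanned)
```

The spin-until-visible loop is why the process is so slow in practice: the robot may need several full rotations before the code ever faces the camera.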

Just don't let it handle the eggs. As you can see in the video below, the integrity of the item is in question once it leaves the robot's mechanical hand.

The project was presented in a paper titled "Grasping with Application to an Autonomous Checkout Robot" (PDF) at the IEEE International Conference on Robotics and Automation. The research team consists of Ellen Klingbeil, Deepak Rao, Blake Carpenter, Varun Ganapathi, Andrew Y. Ng, and Oussama Khatib. Of course, the process is dreadfully slow in real time, but it's still a fascinating accomplishment.