Alphabet X's new Everyday Robot project wants to build robots that can learn – The Verge

Today, Alphabet's X moonshot division (formerly known as Google X) unveiled the Everyday Robot project, whose goal is to develop a general-purpose learning robot. The idea is that its robots could use cameras and complex machine learning algorithms to see and learn from the world around them without needing to be coded for every individual movement.

The team is testing robots that can help out in workplace environments, though right now, these early robots are focused on learning how to sort trash. Here's what one of them looks like; it reminds me of a very tall, one-armed Wall-E (ironic, given what the robots are tasked to do):

Here's a GIF of a robot sorting a recyclable can from a compost pile to a recycling pile. This is wild; check out how the arm actually grasps the can:

The concept of grasping something comes pretty easily to most humans, but it's a very challenging thing to teach a robot, and Everyday Robot's robots get their practice in both the physical world and the virtual world. In a tour of X's offices, Wired described how a playpen of nearly 30 of the robots (supervised by humans) spends its daytime hours sorting trash into trays for compost, landfill, and recycling. At night, Everyday Robot has virtual robots practice grabbing things in simulated buildings, according to Wired. That simulated data is then combined with the real-world data, and the combined dataset is pushed to the robots in a system update every week or two.
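
To make that pipeline a little more concrete, here's a minimal, purely illustrative Python sketch of the kind of loop Wired describes: real grasp attempts logged during the day, a much larger batch of simulated attempts generated overnight, the two pooled for training, and the result shipped out as a periodic update. Every name, number, and the stand-in "training" step below is a placeholder assumption, not X's actual code or data.

```python
import random
from dataclasses import dataclass

@dataclass
class GraspAttempt:
    """One recorded grasp: what the robot saw, what it did, whether it worked."""
    image_features: list   # stand-in for camera input
    action: list           # stand-in for arm/gripper command
    success: bool
    source: str            # "real" or "sim"

def collect_real_attempts(n):
    """Daytime: physical robots sort trash and log every grasp attempt (made-up data here)."""
    return [GraspAttempt([random.random()] * 4, [random.random()] * 3,
                         random.random() < 0.7, "real") for _ in range(n)]

def collect_sim_attempts(n):
    """Overnight: simulated robots practice grasping in virtual buildings (also made-up)."""
    return [GraspAttempt([random.random()] * 4, [random.random()] * 3,
                         random.random() < 0.9, "sim") for _ in range(n)]

def train_policy(dataset):
    """Placeholder for retraining a model on the pooled real + simulated data."""
    success_rate = sum(a.success for a in dataset) / len(dataset)
    return {"version": None, "observed_success_rate": round(success_rate, 3)}

def release_cycle(cycle_number):
    """One update cycle: gather both data sources, retrain, push to the fleet."""
    real = collect_real_attempts(1_000)    # a day's worth of physical sorting
    sim = collect_sim_attempts(100_000)    # far cheaper to generate in simulation
    policy = train_policy(real + sim)
    policy["version"] = cycle_number
    print(f"Pushing policy v{cycle_number} to robots: {policy}")

if __name__ == "__main__":
    # The article says updates go out roughly every week or two; here we just run three cycles.
    for cycle in range(1, 4):
        release_cycle(cycle)
```

The point of combining the two sources is that simulation is cheap and plentiful while real-world attempts are slow and scarce, so the simulated practice fills in volume and the physical data keeps the model honest.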

With all that practice, X says the robots are actually getting pretty good at sorting, apparently putting less than 5 percent of trash in the wrong place (X's humans put 20 percent of trash in the wrong pile, according to X).

That doesn't mean they're remotely ready to replace human janitors, though. Wired observed one robot grasping at thin air instead of the bowl in front of it, then attempting to put down the bowl it hadn't actually picked up. Another lost one of its fingers during the demo. Engineers also told Wired that, at one point, some robots weren't moving through a building because some types of light caused their sensors to hallucinate holes in the floor.

There are whole startups dedicated to the problem of teaching a robot how to grasp, such as Embodied Intelligence and the nonprofit OpenAI. And Google, also owned by Alphabet, has done research into grasping; check out this 2016 video of some Google-made robot arms trying to grab differently sized objects:

But progress is being made beyond the work X and Google are doing. For example, Boston Dynamics (formerly owned by Google) released this video in 2018 of its SpotMini robot grabbing a doorknob to open a door for a friend:

And research from Google published this March showed off a robot that could pick up objects and, over time, learn the best way to throw a specific shape:

Despite all this research, Google and Alphabet have a troubled history with robotics. Google's last serious attempt at robotics work started in 2013 in a division led by Android co-founder Andy Rubin. Though that division made some high-profile acquisitions, including Boston Dynamics, nothing concrete came from it, and Rubin departed Google in 2014 following allegations of sexual harassment. Google is apparently dipping its toes back into robotics, though, based on a report from March of this year, and its new robots are also learning how to grab. But it seems Google's work is separate from the Everyday Robot project.

Everyday Robot lead Hans Peter Brondmo told Wired that he hopes to one day make a robot that can assist the elderly. But he also acknowledged that something like that might be a few years out, so for now, it seems the robots will keep getting better at sorting trash.
