{"id":196009,"date":"2017-06-01T22:39:17","date_gmt":"2017-06-02T02:39:17","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/the-next-big-leap-in-ai-could-come-from-warehouse-robots-the-verge\/"},"modified":"2017-06-01T22:39:17","modified_gmt":"2017-06-02T02:39:17","slug":"the-next-big-leap-in-ai-could-come-from-warehouse-robots-the-verge","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/the-next-big-leap-in-ai-could-come-from-warehouse-robots-the-verge\/","title":{"rendered":"The next big leap in AI could come from warehouse robots &#8211; The Verge"},"content":{"rendered":"<p><p>Ask Geordie Rose and Suzanne Gildert, co-founders of the startup Kindred, about their company's philosophy, and they'll describe a bold vision of the future: machines with human-level intelligence. Rose says these will be perhaps the most transformative inventions in history, and they aren't far away. More intriguing than this prediction is Kindred's proposed path for achieving it. Unlike some of the most cash-flush corporations in Silicon Valley, Kindred is focusing not on chatbots or game-playing programs, but on automating physical robots.<\/p>\n<p>Gildert, a physicist who conceived Kindred in 2013 while working with Rose at quantum computing company D-Wave, thinks giving AI a physical body is the only way to make real progress toward a true thinking machine. "If you want to build intelligence that conceptually thinks in the same way a human does, it needs to have a similar sensorimotor system as humans do," Gildert says. The trick to achieving this, she thinks, is to train robots by having them collaborate with humans in the physical world. Rose, who co-founded D-Wave in 1999, stepped back from his role as chief technology officer to work on Kindred with Gildert.
<\/p>\n<p>Kindred wants to train robots by having them collaborate with humans in the physical world<\/p>\n<p>The first step toward their new shared goal is an industrial warehouse robot called the Orb. It's a robotic arm that sits inside a hexagonal glass encasement, equipped with a bevy of sensors to help it see, feel, and even hear its surroundings. The arm is operated using a mix of human control and automated software. Because so many warehouse workers today spend a significant amount of time sorting products and scanning barcodes, Kindred developed a robotic arm that can do some of these steps automatically. Meanwhile, humans step in when needed to manually operate the robot to perform tasks that are difficult for machines, like gripping a single product from a cluster of different items.<\/p>\n<p>Workers can even operate the arm remotely using an off-the-shelf HTC Vive headset and virtual reality motion controllers. It turns out that VR is great for gathering data on depth and other information humans intuitively use to grasp objects.<\/p>\n<p>Kindred is now focused on getting its finished Orb into warehouses, where it can begin learning at an accelerated pace by sorting vastly different products and observing human operators. Because the company gathers data every time a human uses the Orb, engineers are able to improve its software over time using techniques such as reinforcement learning, which improves software through repetition. Down the line, the Orb should slowly take over more responsibility and, ideally, learn to perform new tasks.<\/p>\n<p>But Kindred's ultimate goal is much more ambitious. It may sound counterintuitive, but Rose and Gildert think warehouses are the perfect place to start on the path toward human-level artificial intelligence.
Because the US shipping marketplace is already rife with single-purpose robots, thanks in part to Amazon, there are plenty of opportunities for humans to train AI. Finding, handling, and sorting products while maneuvering in a fast-moving environment is a data gold mine for building robots that can operate in the real world.<\/p>\n<p>Rose and Gildert believe the next generation of AI won't be in the form of a disembodied voice living in our phones. Rather, they believe the greatest strides will come from programs running inside a physical robot that can gain knowledge about the world and itself from the ground up, like a human infant does from birth.<\/p>\n<p>Kindred is working toward what's known as artificial general intelligence, or software capable of performing any task a human being can do. Artificial general intelligence, or AGI, is sometimes referred to as "strong" or "full" AI because it exists in contrast to AI programs, like DeepMind's AlphaGo system, with very specific applications. Other more conventional forms of "weak" or "narrow" AI include the underlying software behind Netflix and Amazon recommendations, Snapchat camera effects that rely on facial recognition, and Google's fast and accurate language translations.<\/p>\n<p>These algorithms are developed by applying deep learning techniques to large-scale neural networks until they can, say, differentiate between an image of a dog and a cat. They perform one task, or perhaps many in some cases, far better than humans can. But they are extremely limited and don't learn or adapt the way humans do. The software that recognizes a sunset can't predict whether you'll like a Netflix movie or translate a sentence into Japanese. Right now, you can't ask AlphaGo to face off in chess: it doesn't know the rules and wouldn't know how to begin learning them.
<\/p>\n<p>Kindred thinks our physical body is intrinsic to the secrets of human cognition<\/p>\n<p>This is the fundamental challenge of AGI: how to create an intelligent system, the kind we know only from science fiction, that can truly learn on its own without needing to be fed thousands of examples and trained over the course of weeks or months.<\/p>\n<p>The biggest names in AI research, like DeepMind, are focused on game-playing because it seems to be the most viable path forward. After all, if you can teach software to play Pong, perhaps it can take the lessons learned and apply them to Breakout? This applied knowledge approach, which mimics the way a human player can quickly intuit the rules of a new game, has proven promising.<\/p>\n<p>For instance, AlphaGo Master, DeepMind's latest Go system that just bested world champion Ke Jie, now effectively teaches itself how to play better. "One of the things we're most excited about is not just that it can play Go better, but we hope that this'll actually lead to technologies that are more generally applicable to other challenging domains," DeepMind co-founder and CEO Demis Hassabis said at the event last week.<\/p>\n<p>Yet for Kindred's founders, the quest to crack the secret of human cognition can't be separated from our physical bodies. "Our founding belief was that in order to make real progress toward the original objectives of AI, you needed to start by grounding your ideas in the physical world," Rose says. "And that means robots, and robots with sensors that can look around, touch, hear the world that surrounds them."<\/p>\n<p>This body-first approach to AI is based on a theory called embodied cognition, which suggests that the interplay between our brain, body, and the physical world is what produces elements of consciousness and the ability to reason.
(A fun exercise here is thinking about how many common metaphors have physical underpinnings, like thinking of affection as "warmth" or something inconceivable as being "over your head.") Without understanding how the brain developed to control the body and guide functions like locomotion and visual processing, the theory goes, we may never be able to reproduce it artificially.<\/p>\n<p>The body-first approach to AI is based on a theory called embodied cognition<\/p>\n<p>Other than Kindred, work on AI and embodied cognition mostly happens in the research divisions of large tech companies and academia. For example, Pieter Abbeel, who leads development on the Berkeley Robot for the Elimination of Tedious Tasks (BRETT), aims to create robots that can learn much like young children do.<\/p>\n<p>By giving its robot sensory abilities and motor functions and then using AI training techniques, the BRETT team devised a way for it to acquire knowledge and physical skills much faster than with standard programming, and with the flexibility to keep learning. Much like how babies are constantly adjusting their behavior when attempting something new, BRETT also approaches unique problems, fails at first, and then adjusts over repeated attempts and under new constraints. Abbeel's team even uses children's toys to test BRETT's aptitude for problem solving.<\/p>\n<p>OpenAI, the nonprofit funded by SpaceX and Tesla CEO Elon Musk, is working on both general-purpose game-playing algorithms and robotics, under the notion that both avenues are complementary. Helping the team is Abbeel, who is on leave from Berkeley to help OpenAI make progress fusing AI learnings with modern robotics.
"The interesting thing about robotics is that it forces us to deal with the actual data we would want an intelligent agent to deal with," says Josh Tobin, a graduate student at Berkeley who works on robotics at OpenAI.<\/p>\n<p>Applying AI to real-world tasks like picking up objects and stacking blocks involves tackling a whole suite of new problems, Tobin says, like managing unfamiliar textures and replicating minute motor movements. Solving them is necessary if we're ever to deploy intelligent robots beyond factory floors.<\/p>\n<p>Wojciech Zaremba, who leads OpenAI's robotics work, says that a holy grail of sorts would be a general-purpose robot powered by AI that can learn a new task, scrambling eggs, for instance, by watching someone do it just once. This is why OpenAI is working on teaching robots new skills that are first demonstrated by a human in a simulated VR environment, much like a video game, where it's much easier and less costly to produce and collect data.<\/p>\n<p>"You could imagine that, as a final outcome, if it's doable, you have files online of recordings of various tasks," Zaremba says. "And then if you want the robot to replicate this behavior, you just download the file."<\/p>\n<p>When I first operated the Orb, on an April afternoon in Kindred's San Francisco warehouse space, a group of six or so engineers were scattered about testing the robotic arms with various pink-colored bins of products: vitamin bottles, soft plastic cylinders of Lysol cleaning wipes, rolls of paper towels.<\/p>\n<p>The Orb is designed to help sort these objects in a large heap inside its glass container, while the arm sits affixed to the roof of the container. First, an operator wearing a VR headset moves the arm to a desired object, lowers the gripper, and adjusts the two clamps until a firm grip is established. Then the human can simply let go.
Kindred has already automated the process of lifting the object in the air, scanning the barcode, and sorting it into the necessary bin.<\/p>\n<p>Using the Orb resembles operating a video game version of a toy claw machine<\/p>\n<p>"In any gigantic warehouse, people have to walk around and pick up things," says George Babu, Kindred's chief product officer. "The most efficient way to do that is to pick up a whole bunch of different things at the same time. Those go to someplace where you have them separated. Our robot does that job in the middle." The idea is that warehouse workers can dump a bunch of products into the Orb, while a remote operator works with the robot to sort them.<\/p>\n<p>Amazon is working on something similar, and the company now holds an annual picking challenge to spur development of industrial robots that are capable of handling and sorting physical items. Kindred is quick to recognize Amazon's prowess in this department. "In the fulfillment world, Amazon uses a different set of approaches than all of the other fulfillment provisioners. They have the scale, the scope, and the know-how to implement end-to-end systems that are very effective at what they do," Rose says. But he thinks Amazon is likely to keep this technology to itself. "The advancements that Amazon makes toward doing this job well don't benefit all of their competitors."<\/p>\n<p>Kindred's system, on the other hand, is designed to integrate into existing warehouse tools. Last month, Kindred finished its first deployable devices, and it created "more demand than we anticipated," according to Jim Liefer, Kindred's chief operating officer, though he won't disclose any initial customers.<\/p>\n<p>I was surprised when using the Orb, with a Vive headset, by just how much it resembles a video game.
Think of a toy claw machine, where the second the clamp touches down on an object, the automated process takes over and the arm springs to life with an uncanny jerkiness. It makes sense, considering Kindred built its depth-sensing system using the game engine Unity.<\/p>\n<p>Kindred imagines future versions of the Orb being affixed to sliding rails or bipedal roaming robots<\/p>\n<p>Max Bennett, Kindred's robotics product manager, says that the process is designed so that human warehouse workers can operate multiple Orbs simultaneously, gripping objects and letting the software take the reins before cycling to the next setup. Kindred imagines future versions of the robotic arm being affixed to sliding overhead rails or maybe even to bipedal robots that roam the floor. There is also a point at which the Vive is no longer necessary. "Nobody's going to want to use a VR headset all day," Bennett tells me, suggesting that an Xbox controller or even just a computer mouse will do in the future.<\/p>\n<p>As for how the Orb might impact jobs, Babu says there will be a need for human labor for quite some time. He's partly right: Amazon hired 100,000 workers in the last year alone, and plans to hire 100,000 more this year, mostly in warehouse and other fulfillment roles. But systems like the Orb raise the possibility that fewer jobs will be needed as the work becomes more a matter of assisting and operating robots.<\/p>\n<p>"My view is that the humans will all move on to different work in the stream," Babu says.<\/p>\n<p>Still, Forrester Research predicts that automation will result in 25 million jobs lost over the next decade, with only 15 million new jobs created. The end goals of automation have always been to reduce costs and improve efficiency, and that will inevitably mean the disappearance of certain types of labor.
<\/p>\n<p>Kindred is unique in the AI field not just for its robotics focus, but also because it's diving head first into the industrial world with a commercial product. Many of the big tech companies working on AI are doing so with huge research organizations, like Facebook AI Research and Google Brain. These teams are filled with academics and engineers who work on abstract problems that then help inform real software features that get deployed to millions of consumers.<\/p>\n<p>Kindred, as a startup, can't afford this approach. "Day one we said: We're going to find a big market. We're going to build a wildly successful product for that initial market, and build a business by executing along that path, first with one vertical and then maybe others," Rose explains. He adds that his experience with D-Wave, which raised more than $150 million over the course of more than a decade just to release its first product, inspired him to seek out a different approach to tackling big-picture problems.<\/p>\n<p>Gildert and Rose don't want to rely solely on venture capital funding to build Kindred<\/p>\n<p>"You have this quandary that doing it right is going to take a long time, on the order of decades. How do you sustain that organization for that length of time without all the negative side effects of raising a lot of rounds of VC?" Rose says. "The answer is that you have to create a real business that is cash-flow positive very early." Kindred has raised $15 million in funding thus far from Eclipse, GV, Data Collective, and a number of other investors. But Rose stresses that the company's focus is to become profitable with the Orb, and that this will help it in its main objective.<\/p>\n<p>That objective, since the beginning, has been human-level AI with a focus on what Gildert calls "in-body cognition," or the type of thought processes that only arise from giving AI a physical shell.
"Intelligence absent a body is not what we think it means," she says. "Intelligence with a body brings to it a number of constraints that are not there when you think about intelligence in a virtual environment. We certainly don't believe you can build a chatbot without a human-like body and expect it to pass [for a human]."<\/p>\n<p>"Brains evolved to control bodies," Rose adds. "And all these things that we think about as being the beautiful stuff that comes from cognition, they're all side effects of this."<\/p>\n<p>See the rest here: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/www.theverge.com\/2017\/6\/1\/15703146\/kindred-orb-robot-ai-startup-warehouse-automation\" title=\"The next big leap in AI could come from warehouse robots - The Verge\">The next big leap in AI could come from warehouse robots - The Verge<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Ask Geordie Rose and Suzanne Gildert, co-founders of the startup Kindred, about their company's philosophy, and they'll describe a bold vision of the future: machines with human-level intelligence. 
<a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/the-next-big-leap-in-ai-could-come-from-warehouse-robots-the-verge\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":4,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187743],"tags":[],"class_list":["post-196009","post","type-post","status-publish","format-standard","hentry","category-ai"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/196009"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=196009"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/196009\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=196009"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=196009"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=196009"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}