{"id":206294,"date":"2017-02-08T15:57:47","date_gmt":"2017-02-08T20:57:47","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/robot-cars-can-teach-themselves-how-to-drive-in-virtual-worlds-singularity-hub.php"},"modified":"2017-02-08T15:57:47","modified_gmt":"2017-02-08T20:57:47","slug":"robot-cars-can-teach-themselves-how-to-drive-in-virtual-worlds-singularity-hub","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/singularity\/robot-cars-can-teach-themselves-how-to-drive-in-virtual-worlds-singularity-hub.php","title":{"rendered":"Robot Cars Can Teach Themselves How to Drive in Virtual Worlds &#8211; Singularity Hub"},"content":{"rendered":"<p><p>    Over the holidays, I went for a drive with a Tesla. With, not    in, because the car was doing the driving.  <\/p>\n<p>    Hearing about autonomous vehicles is one thing; experiencing it    was something entirely different. When the parked Model S    calmly drove itself out of the garage, I stood gaping in awe,    completely mind-blown.  <\/p>\n<p>    If this years     Consumer Electronics Show is any indication, self-driving    cars are zooming into our lives, fast and furious. Aspects of    automation are already in useTeslas Autopilot, for example,    allows cars to control steering, braking and switching lanes.    Elon    Musk, CEO of Tesla, has     gone so far as to pledge that by 2018, you will be able to    summon your car from across the countryand itll drive itself    to you.  <\/p>\n<p>    So far, the track record for autonomous vehicles has been    fairly impressive. According to a report from the National    Highway Traffic Safety Administration, Teslas crash rate        dropped by about 40% after turning on their    first-generation Autopilot system. 
This week, with the introduction of the second-generation system to newer cars equipped with the necessary hardware, Musk is aiming to cut the number of accidents by another whopping 50 percent.<\/p>\n<p>But when self-driving cars mess up, we take note. Last year, a Tesla vehicle slammed into a white truck while Autopilot was engaged, apparently confusing the truck with the bright white sky, resulting in the company's first fatality.<\/p>\n<p>So think about this: would you entrust your life to a robotic machine?<\/p>\n<p>For anyone to even start contemplating yes, the cars have to be remarkably safe: fully competent in day-to-day driving, and able to handle any emergency traffic throws their way.<\/p>\n<p>Unfortunately, those emergency edge cases also happen to be the hardest problems to solve.<\/p>\n<p>To interact with the world, autonomous cars are equipped with a myriad of sensors. Google's button-nosed Waymo car, for example, relies on GPS to broadly map out its surroundings, then captures further detail using its cameras, radar and laser sensors.<\/p>\n<p>These data are then fed into software that figures out what actions to take next.<\/p>\n<p>As with any kind of learning, the more scenarios the software is exposed to, the better the self-driving car learns.<\/p>\n<p>Getting that data is a two-step process: first, the car has to drive thousands of hours to record its surroundings, which are used as raw data to build 3D maps. That's why Google has been steadily taking its cars out on field trips, some two million miles to date, with engineers babysitting the robocars to flag interesting data and potentially take over if needed.<\/p>\n<p>This is followed by thousands of hours of labeling, that is, manually annotating the maps to point out roads, vehicles, pedestrians and other objects.
Only then can researchers feed the dataset, the so-called labeled data, into the software for it to start learning the basics of a traffic scene.<\/p>\n<p>The strategy works, but it's agonizingly slow and tedious, and the amount of experience the cars get is limited. Since emergencies tend to fall into the category of the unusual and unexpected, it may take millions of miles before the car encounters the dangerous edge cases that test its software, and those encounters, of course, put both car and human at risk.<\/p>\n<p>An alternative, increasingly popular approach is to bring the world to the car.<\/p>\n<p>Recently, Princeton researchers Ari Seff and Jianxiong Xiao realized that instead of manually collecting maps, they could tap into a readily available repertoire of open-source 3D maps such as Google Street View and OpenStreetMap. Although these maps are messy and in some cases have bizarre distortions, they offer a vast amount of raw data that could be used to construct datasets for training autonomous vehicles.<\/p>\n<p>Manually labeling that data is out of the question, so the team built a system that can automatically extract road features, for example, how many lanes there are, whether there's a bike lane, what the speed limit is and whether the road is a one-way street.<\/p>\n<p>Using a powerful technique called deep learning, the team trained their AI on 150,000 Street View panoramas until it could confidently discard artifacts and correctly label any given street attribute. The AI performed so well that it matched humans on a variety of labeling tasks, but at much higher speed.<\/p>\n<p>\"The automated labeling pipeline introduced here requires no human intervention, allowing it to scale with these large-scale databases and maps,\" the authors concluded.<\/p>\n<p>With further improvement, the system could take over the labor-intensive job of labeling data.
In turn, more data means more learning for autonomous cars and potentially much faster progress.<\/p>\n<p>\"This would be a big win for self-driving technology,\" says Dr. John Leonard, a professor at MIT specializing in mapping and automated driving.<\/p>\n<p>Other researchers are eschewing the real world altogether, instead turning to hyper-realistic gaming worlds such as Grand Theft Auto V.<\/p>\n<p>For those not in the know, GTA V lets gamers drive around the convoluted roads of a city roughly one-fifth the size of Los Angeles. It's an incredibly rich world: the game boasts 257 types of vehicles and 7 types of bikes, all based on real-world models. The game also simulates half a dozen kinds of weather conditions, in all giving players access to a huge range of scenarios.<\/p>\n<p>It's a total data jackpot. And researchers are noticing.<\/p>\n<p>In a study published in mid-2016, Intel Labs teamed up with German engineers to explore the possibility of mining GTA V for labeled data. By looking at any road scene in the game, their system learned to classify different objects in the road (cars, pedestrians, sidewalks and so on), thus generating huge amounts of labeled data that can then be fed to self-driving cars.<\/p>\n<p>Of course, datasets extracted from games may not necessarily reflect the real world. So a team from the University of Michigan trained two algorithms to detect vehicles, one using data from GTA V and the other using real-world images, and pitted them against each other.<\/p>\n<p>The result? The game-trained algorithm performed just as well as the one trained on real-life images, although it needed about 100 times more training data to reach the performance of the real-world algorithm. That's not a problem, since generating images in games is quick and easy.<\/p>\n<p>But it's not just about datasets.
GTA V and other hyper-realistic virtual worlds also allow engineers to test their cars in uncommon but highly dangerous scenarios that they may one day encounter.<\/p>\n<p>In virtual worlds, AIs can tackle a variety of traffic hazards (sliding on ice, hitting a wall, avoiding a deer) without worry. And if the cars learn how to deal with these edge cases in simulations, they may have a higher chance of surviving one in real life.<\/p>\n<p>So far, none of the above systems have been tested on physical self-driving cars.<\/p>\n<p>But with the race toward full autonomy proceeding at breakneck speed, it's easy to see companies incorporating these systems to give themselves an edge.<\/p>\n<p>Perhaps more significant is that these virtual worlds represent a subtle shift toward the democratization of self-driving technology. Most of them are open source, in that anyone can hop on board to create and test their own AI solutions for autonomous cars.<\/p>\n<p>And who knows, maybe the next big step toward full autonomy won't be made inside Tesla, Waymo or any other tech giant.<\/p>\n<p>It could come from that smart kid next door.<\/p>\n<p>Image Credit: Shutterstock<\/p>\n<p>See the original post here:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/singularityhub.com\/2017\/02\/08\/robot-cars-can-teach-themselves-how-to-drive-in-virtual-worlds\/\" title=\"Robot Cars Can Teach Themselves How to Drive in Virtual Worlds - Singularity Hub\">Robot Cars Can Teach Themselves How to Drive in Virtual Worlds - Singularity Hub<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Over the holidays, I went for a drive with a Tesla. With, not in, because the car was doing the driving. 
<a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/singularity\/robot-cars-can-teach-themselves-how-to-drive-in-virtual-worlds-singularity-hub.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[431648],"tags":[],"class_list":["post-206294","post","type-post","status-publish","format-standard","hentry","category-singularity"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/206294"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=206294"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/206294\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=206294"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=206294"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=206294"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}