{"id":234867,"date":"2017-08-15T17:44:32","date_gmt":"2017-08-15T21:44:32","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/beyond-hal-how-artificial-intelligence-is-changing-space-systems-spacenews.php"},"modified":"2017-08-15T17:44:32","modified_gmt":"2017-08-15T21:44:32","slug":"beyond-hal-how-artificial-intelligence-is-changing-space-systems-spacenews","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/beyond-hal-how-artificial-intelligence-is-changing-space-systems-spacenews.php","title":{"rendered":"Beyond HAL: How artificial intelligence is changing space systems &#8211; SpaceNews"},"content":{"rendered":"<p><p>  This computer-generated view depicts part of Mars at the boundary  between darkness and daylight, with an area including Gale Crater  beginning to catch morning light. Curiosity was delivered in 2012  to Gale crater, a 155-kilometer-wide crater that contains a  record of environmental changes in its sedimentary rock. Credit:  NASA JPL-CALTECH<\/p>\n<p>    Thisarticleoriginally appeared in the July 3,    2017 issue of SpaceNews magazine.  <\/p>\n<p>    Mars 2020 is an ambitious mission. NASA plans    to gather 20 rock cores and soil samples within 1.25 Mars    years, or about 28 Earth months a task that would be    impossible without artificial intelligence because the rover    would waste too much time waiting for instructions.  <\/p>\n<p>    It currently takes the Mars Science Laboratory team at NASAs    Jet Propulsion Laboratory eight hours to plan daily activities    for the Curiosity rover before sending instructions through    NASAs over-subscribed Deep Space Network. Program managers    tell the rover when to wake up, how long to warm up its    instruments and how to steer clear of rocks that damage its    already beat-up wheels.  <\/p>\n<p>    Mars 2020 will need far more autonomy. 
“Missions are paced by the number of times the ground is in the loop,” said Jennifer Trosper, Mars Science Laboratory mission manager. “The more the rover can do on its own, the more it can get done.”<\/p>\n<p>The $2.4 billion Mars 2020 mission is just one example of NASA's increasing reliance on artificial intelligence, although the term itself makes some people uneasy. Many NASA scientists and engineers prefer to talk about machine learning and autonomy rather than artificial intelligence, a broad term that in the space community sometimes evokes images of HAL 9000, the fictional computer introduced in Arthur C. Clarke's “2001: A Space Odyssey.”<\/p>\n<p>To be clear, NASA is not trying to create HAL. Instead, engineers are developing software and algorithms to meet the specific requirements of missions.<\/p>\n<p>“Work we are doing today focuses not so much on general intelligence but on trying to allow systems to be more independent, more self-reliant, more autonomous,” said Kelly Fong, the NASA Ames Research Center's senior scientist for autonomous systems and director of the Intelligent Robotics Group.<\/p>\n<p>For human spaceflight, that means giving astronauts software to help them respond to unexpected events ranging from equipment failure to medical emergencies. A medical support tool, for example, combines data mining with reasoning and learning algorithms to help astronauts on multi-month missions to Mars handle everything from routine care to ailments or injuries “without having to talk to a roomful of flight controllers shadowing them all the time,” Fong said.<\/p>\n<p>Through robotic Mars missions, NASA is demonstrating increasingly capable rovers. NASA's Mars Exploration Rovers, Spirit and Opportunity, could do very little on their own when they bounced onto the red planet in 2004, although they have gained some autonomy through software upgrades. 
Curiosity, by comparison, is far more capable.<\/p>\n<p>Last year, Curiosity began using software called Autonomous Exploration for Gathering Increased Science that combines computer vision with machine learning to select rocks and soil samples to investigate based on criteria determined by scientists. The rover can zap targets with its ChemCam laser, analyze the gases that burn off, package the data with images and send them to Earth.<\/p>\n<p>“Scientists on the mission have been excited about this because in the past they had to look at images, pick targets, send up commands and wait for data,” said Kiri Wagstaff, a researcher in JPL's Machine Learning and Instrument Autonomy Group.<\/p>\n<p>Although data can travel between Earth and Mars in 10 to 30 minutes, mission controllers can only send and receive data during their allotted time on the Deep Space Network.<\/p>\n<p>“Even if the rover could talk to us 24\/7, we wouldn't be listening,” Wagstaff said. “We only listen to it in a 10-minute window once or twice a day because the Deep Space Network is busy listening to Cassini, Voyager, Pioneer, New Horizons and every other mission out there.”<\/p>\n<p>The Mars 2020 rover is designed to make better use of limited communications with mission managers by doing more on its own. It will wake itself up and heat instruments to their proper temperatures before working through a list of mandatory activities plus additional chores it can perform if it has enough battery power remaining.<\/p>\n<p>“Ideally, we want to say, ‘This area is of interest to us. We want images of objects and context from the instruments. Call us when you've got all that and we will use the information to get a sample,’” Trosper said. 
<\/p>\n<p>NASA isn't there yet, but Mars 2020 takes the agency in that direction with software to enable the rover to drive from point to point through Martian terrain while avoiding obstacles. “It's the kind of basic skill toddlers learn, not to run into things, but it's a good skill,” Fong said. “That type of autonomy is increasingly being added to our space systems. Going forward, I see us adding more and more of these intelligent skills.”<\/p>\n<p>Future missions like NASA's Europa Clipper will need robust artificial intelligence to look for plumes rising from a subsurface ocean and cracks in the moon's icy surface caused by hydrothermal vents. When scientists can't predict when or where they will make discoveries, they need artificial intelligence “to watch for things, notice them, capture data and send it back to us,” Wagstaff said.<\/p>\n<p>As the Europa Clipper's instruments collect data, the spacecraft's onboard processor will need to assign priorities to the observations and downlink the most interesting ones to Earth, Wagstaff said. “We always can collect more data than we can transmit.”<\/p>\n<p>That is particularly true of missions beyond Mars; at Mars, NASA orbiters can relay data. Missions to Europa or Saturn's moon Enceladus also will experience communication delays because of the distance.<\/p>\n<p>NASA has developed software on Earth-observation satellites that could be used in future missions to ocean worlds. The Intelligent Payload Experiment cubesat launched in 2013 relied on machine learning to analyze images and highlight anything that stood out from its surroundings.<\/p>\n<p>“It has its eyes open to look for anything that doesn't match what we expect or anything that stands out as being different,” Wagstaff said. “We can't predict what we are going to find. We don't want to miss something just because we haven't trained instruments to look for it.” 
<\/p>\n<p>A proposed future mission to bore through Europa's ice to investigate whether life exists in an ocean below would require even more onboard intelligence. NASA probably would design software to look for inconsistencies in chemical composition or temperature. “That would keep you from having to say what life would look like, what it would be eating and its energy source,” Wagstaff said.<\/p>\n<p>Before engineers send hardware or software into space, they test it extensively in analogous environments on Earth. Engineers test Mars missions in the desert. The best analog for Europa missions may be glacial deposits in the Arctic.<\/p>\n<p>“We are acutely aware of risk mitigation because we are dealing with spacecraft that cost hundreds of millions or even billions of dollars,” Wagstaff said. “Everything we do is thoroughly tested, for years in some cases, before it is ever put on the spacecraft.”<\/p>\n<p>AI at the controls<\/p>\n<p>The capsules SpaceX and Boeing are building to ferry astronauts between Earth and the International Space Station are designed to operate autonomously from the minute they launch, through the demanding task of docking and on their return trip.<\/p>\n<p>NASA crews will spend far less time learning to operate the spacecraft than preparing to conduct microgravity research and maintain the orbiting outpost, said Chris Ferguson, the former space shuttle commander who directs crew and mission operations for Boeing's CST-100 Starliner program.<\/p>\n<p>“It provides a lot of relief in the training timeframe. They don't have to learn everything. They just have to learn the important things and how to realize when the vehicle is not doing something it's supposed to be doing,” Ferguson told SpaceNews.<\/p>\n<p>Starliner flight crews will train to monitor the progress of the spaceship. 
If something goes wrong, they will know how to take control manually and work with the ground crew to fix the problem, he added.<\/p>\n<p>NASA insisted on that high degree of autonomy, in part, to ensure the crew capsules could serve as lifeboats in case of emergencies.<\/p>\n<p>“If there's a bad day up there and the crew needed to come home quickly, they could pop into the vehicle with very little preparation, close the hatch and set a sequence of events into play that would get them home very quickly,” Ferguson said.<\/p>\n<p>In many ways, Starliner's autonomy in flight is similar to an airplane's. Whether on commercial airplanes or spacecraft, “everyone is beginning to realize pilots are turning into systems monitors more than active participants,” Ferguson said.<\/p>\n<p>When Starliner docks with the space station, the crew will be monitoring sophisticated sensors and image processors. Boeing relies on cameras, infrared imagers and Laser Detection and Ranging sensors that create three-dimensional maps of the approach. A central processor will determine which sensor is more likely to be accurate and will weight the data accordingly to ensure that two vehicles that were previously traveling quickly relative to one another come into contact at about four centimeters per second.<\/p>\n<p>In spite of the complexity, astronauts will view displays that look similar to the ones airplane pilots see on instrument landing systems, Ferguson said. 
<\/p>\n<p>Go here to read the rest:<\/p>\n<p><a target=\"_blank\" href=\"http:\/\/spacenews.com\/beyond-hal-how-artificial-intelligence-is-changing-space-systems\/\" title=\"Beyond HAL: How artificial intelligence is changing space systems - SpaceNews\">Beyond HAL: How artificial intelligence is changing space systems - SpaceNews<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>This computer-generated view depicts part of Mars at the boundary between darkness and daylight, with an area including Gale Crater beginning to catch morning light. Curiosity was delivered in 2012 to Gale Crater, a 155-kilometer-wide crater that contains a record of environmental changes in its sedimentary rock. Credit: NASA JPL-CALTECH This article originally appeared in the July 3, 2017 issue of SpaceNews magazine. <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/artificial-intelligence\/beyond-hal-how-artificial-intelligence-is-changing-space-systems-spacenews.php\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[13],"tags":[],"class_list":["post-234867","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/234867"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=234867"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/234867\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=234867"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=234867"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=234867"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}