{"id":215186,"date":"2017-03-11T03:26:07","date_gmt":"2017-03-11T08:26:07","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/the-future-of-human-centered-robotics-electronic-design.php"},"modified":"2017-03-11T03:26:07","modified_gmt":"2017-03-11T08:26:07","slug":"the-future-of-human-centered-robotics-electronic-design","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/robotics\/the-future-of-human-centered-robotics-electronic-design.php","title":{"rendered":"The Future of Human-Centered Robotics &#8211; Electronic Design"},"content":{"rendered":"<p><p>Luis Sentis will lead a session, \"A Developer's Primer for Coding Human Behavior in Bots,\" at SXSW on Sunday, March 12, 2017.<\/p>\n<p>Human-centered robotics holds a special place in the robotics field because these machines both mimic human sensing and cognitive behavior and are designed to assist humans for safety and productivity. To explore human-centered robotics is to explore human beings and how we sense the world, analyze complex and often conflicting information, and act upon our findings, modifying perception, understanding, and action as new information becomes available.<\/p>\n<p>Such machines could be of great practical benefit to humans on long space flights to Mars, for instance, or as human proxies in hazardous environments such as a chemical spill, or even in ordinary circumstances like education or elder care.<\/p>\n<p>Obviously, creating human-centered robots poses many challenges in conception, design, and the hardware and software that support them. My own work in this burgeoning field focuses on designing high-performance series elastic actuators for robotic movements, embedded control systems, motion planning for dynamic locomotion, and real-time optimal control for human-centered robots.
<\/p>\n<\/p>\n<p>Once we have a platform for human-centered robotics, and once we can create the hardware, software, and logic to drive it, we can turn to its real-world applications, which are many.<\/p>\n<p>Most readers probably have only a passing acquaintance with human-centered robotics, so allow me to use this brief blog to introduce a few ideas about the topic and its challenges.<\/p>\n<p>Humanoid and Human-Centered<\/p>\n<p>Since perhaps the 1950s, television and the movies have often portrayed humanoid robots (robots that take roughly human form), entertaining us with how closely they mimic humans or by how far they fall short. Sometimes, in a dramatic plot turn, a humanoid robot becomes malevolent or uncontrollable by humans.<\/p>\n<p>I prefer the term \"human-centered robot,\" because it most closely describes my field of endeavor: how to create a robot that is focused on assisting a human being, sometimes guided by a human, but also learning on its own what action or behavior would be most helpful to that human.<\/p>\n<p>In my view, we do not yet have sufficient evidence to say that humanoid robots are most effective when interacting with humans. They may well be, but we do not have definitive data on the question.<\/p>\n<p>However, it appears anecdotally true that humanoid robots fire the human imagination, and that has its benefits. In addition to their portrayal in popular media, I have found that humanoid robots draw the most, well, human interest. Soon after we created one named Dreamer in 2013, in the Human Centered Robotics Lab (HCRL) at the Cockrell School of Engineering at the University of Texas at Austin, we began receiving more attention from curious students, engineers, investors and, wouldn't you know it, movie producers. (Dreamer eventually had a bit part in Transformers 4 in 2014.)
If humanoid robots help draw attention and interest to human-centered robotics, so be it.<\/p>\n<\/p>\n<p>Applications and Productivity<\/p>\n<p>The more important aspect of this field is how to create human-centered robots that sense their surroundings and either respond to human directions or intuit what actions would best serve their human counterpart.<\/p>\n<p>I've mentioned the possible robotic applications of space travel, perhaps as a companion to astronauts on a space walk, as a human proxy in hazardous environments, or as a caregiver to an elderly person. In each case, the notion of productivity is different.<\/p>\n<p>If you think of productivity for robotics generally in a manufacturing setting, it can be measured in terms of hours of work performed and profits earned. But on a long space journey to Mars, productivity will be measured instead in terms of the astronauts' enhanced safety and ability to accomplish difficult tasks. In a hazmat spill, productivity might be measured in terms of human lives saved. In elder care, how well did a robot perform in changing bandages or applying ointment to a sore, preserving the person's health?<\/p>\n<p>Robot Knows Best<\/p>\n<p>Another quest in human-centered robotics is to create a robot's ability not just to predict human behavior, but to perform what I call intervention. Whatever its level of complexity, can we build a robot with logic that assesses a situation for optimal actions, whether directed by a human or not? This translates to a robot's ability to say to itself, \"The human is operating the system in such a way. We could do better if I compute a hypothesis about what would be best for the human and intervene with that particular behavior.\"<\/p>\n<p>This ability requires pairing social cognitive theories with mathematics.
And I have found that advances are possible for what I call \"self-efficacy,\" which is basically the self-confidence to achieve a certain behavior.<\/p>\n<p>At this point, self-efficacy can be achieved in very simple scenarios. One potential application is to use a human-centered robot to motivate students to solve problems by sensing and reacting to a student's level of engagement, then producing an interaction that motivates the student and enhances learning. I hope to demonstrate this and give attendees a chance to code such behavior in a human-centered robot at SXSW.<\/p>\n<p>Touch and Whole-Body Sensing<\/p>\n<p>One major way in which humans interact is through touch. We place a hand on a shoulder or grasp someone's forearm to gain their attention. Robots, particularly humanoid ones with mobility, are likely to be large and quick, so touch becomes an important element in the safety of their human counterparts. We do not want a robot that runs into an astronaut on a space walk or pins someone to a wall. Thus, we are developing what I call whole-body sensing. Though some in this field are pursuing something known as \"sensory skin,\" at the HCRL we have taken a more economical approach to minimize the amount of electronics needed.<\/p>\n<\/p>\n<p>We use a distributed sensor array on the robot's surface, but the sensors number in the dozens, not the thousands employed in sensory skin. To compensate, we combine different sensing modalities internal to the robot, such as accelerometers, which aid stabilization, and vibration sensors that enable the machine to triangulate information about what's happening in the immediate environment. This enables the robot to respond to human touch, but within the context of other information it is receiving from its environment. We call this whole-body contact awareness, a combination of internal and external sensing and awareness.
<\/p>\n<p>Spin-offs<\/p>\n<p>I hope this mere glimpse into the world of human-centered robotics piques your curiosity. It may serve to attract those who wish to work in the field. But the general public should also understand that advances in this field will eventually make their way into human-centered robotics in our homes, our businesses, manufacturing, agriculture, smart cities, the Internet of Things, you name it. We'll have systems someday (we already do, with limited abilities) that sense human behaviors and intervene to produce optimal conditions based on an understanding of what's best for the people involved in a particular situation.<\/p>\n<p>Today, we have smart thermostats that learn our preferences for heating and cooling our homes. Tomorrow, we may have human-centered robotic systems that optimize our cities.<\/p>\n<p>Luis Sentis is an Assistant Professor in the Department of Aerospace Engineering at the University of Texas (UT) at Austin. He also leads UT Austin's Human Centered Robotics Laboratory and is co-founder of Apptronik Systems Inc., a contractor for NASA's Johnson Space Center.<\/p>\n<\/p>\n<p>See more here:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"http:\/\/electronicdesign.com\/robotics\/future-human-centered-robotics\" title=\"The Future of Human-Centered Robotics - Electronic Design\">The Future of Human-Centered Robotics - Electronic Design<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Luis Sentis will lead a session, \"A Developer's Primer for Coding Human Behavior in Bots,\" at SXSW on Sunday, March 12, 2017.
Human-centered robotics holds a special place in the robotics field because these machines both mimic human sensing and cognitive behavior and are designed to assist humans for safety and productivity <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/robotics\/the-future-of-human-centered-robotics-electronic-design.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[431594],"tags":[],"class_list":["post-215186","post","type-post","status-publish","format-standard","hentry","category-robotics"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/215186"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=215186"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/215186\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=215186"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=215186"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/
tags?post=215186"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}