{"id":236662,"date":"2017-08-21T19:32:13","date_gmt":"2017-08-21T23:32:13","guid":{"rendered":"http:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/uncategorized\/this-3d-printed-robotic-arm-is-built-for-sign-language-techcrunch-techcrunch.php"},"modified":"2017-08-21T19:32:13","modified_gmt":"2017-08-21T23:32:13","slug":"this-3d-printed-robotic-arm-is-built-for-sign-language-techcrunch-techcrunch","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/robotics\/this-3d-printed-robotic-arm-is-built-for-sign-language-techcrunch-techcrunch.php","title":{"rendered":"This 3D-printed robotic arm is built for sign language | TechCrunch &#8211; TechCrunch"},"content":{"rendered":"<p>While we usually see robotics applied to industrial or research applications, there are plenty of ways they could help in everyday life as well: an autonomous guide for blind people, for instance, or a kitchen bot that helps disabled folks cook. Or, and this one is real, a robot arm that can perform rudimentary sign language.<\/p>\n<p>It's part of a master's thesis from grad students at the University of Antwerp who wanted to address the needs of the deaf and hearing-impaired. In classrooms, courts and at home, these people often need interpreters, who aren't always available.<\/p>\n<p>Their solution is Antwerp's Sign Language Actuating Node, or ASLAN. It's a robotic hand and forearm that can perform sign language letters and numbers. It was designed from scratch and built from 25 3D-printed parts, with 16 servos controlled by an Arduino board. It's taught gestures using a special glove, and the team is looking into recognizing them through a webcam as well.<\/p>\n<p>Right now, it's just the one hand, so obviously two-hand gestures and the cues from facial expressions that enrich sign language aren't possible yet. 
But a second coordinating hand and an emotive robotic face are the next two projects the team aims to tackle.<\/p>\n<p>The idea is not to replace interpreters, whose nuance can hardly be replicated, but to make sure that there is always an option for anyone worldwide who requires sign language service. It also could be used to help teach sign language: a robot doesn't get tired of repeating a gesture for you to learn.<\/p>\n<p>Why not just use a virtual hand? Good question. An app or even a speech-to-text program would accomplish many of the same things. But it's hard to think less of the ASLAN project; taking an assistive technology off the screen and putting it in the real world, where it can be interacted with, viewed from many angles, and otherwise share the physical space of the people it helps, is a commendable goal.<\/p>\n<p>ASLAN was created by Guy Fierens, Stijn Huys and Jasper Slaets. It's still in prototype form, but once it's finalized, the designs will be open-sourced.<\/p>\n<p>Read this article: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/techcrunch.com\/2017\/08\/18\/this-3d-printed-robotic-arm-is-built-for-sign-language\/\" title=\"This 3D-printed robotic arm is built for sign language | TechCrunch - TechCrunch\">This 3D-printed robotic arm is built for sign language | TechCrunch - TechCrunch<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>While we usually see robotics applied to industrial or research applications, there are plenty of ways they could help in everyday life as well: an autonomous guide for blind people, for instance, or a kitchen bot that helps disabled folks cook. Or, and this one is real, a robot arm that can perform rudimentary sign language. 
It's part of a master's thesis from grad students at the University of Antwerp who wanted to address the needs of the deaf and hearing-impaired <a href=\"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/robotics\/this-3d-printed-robotic-arm-is-built-for-sign-language-techcrunch-techcrunch.php\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"limit_modified_date":"","last_modified_date":"","_lmt_disableupdate":"","_lmt_disable":"","footnotes":""},"categories":[431594],"tags":[],"class_list":["post-236662","post","type-post","status-publish","format-standard","hentry","category-robotics"],"modified_by":null,"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/236662"}],"collection":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/comments?post=236662"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/posts\/236662\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/media?parent=236662"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/categories?post=236662"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/futurist-transhuman-news-blog\/wp-json\/wp\/v2\/tags?post=236662"}],"curies"
:[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}