{"id":213074,"date":"2017-08-22T23:58:40","date_gmt":"2017-08-23T03:58:40","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/even-artificial-intelligence-is-sexist-glamour\/"},"modified":"2017-08-22T23:58:40","modified_gmt":"2017-08-23T03:58:40","slug":"even-artificial-intelligence-is-sexist-glamour","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/artificial-intelligence\/even-artificial-intelligence-is-sexist-glamour\/","title":{"rendered":"Even Artificial Intelligence Is Sexist &#8211; Glamour"},"content":{"rendered":"<p>Sexism is so deeply ingrained in the way we think about the world that we've actually passed it on to our computers, according to a new University of Virginia (UVA) and University of Washington study. Artificial intelligence (AI) is more likely to label people who are cooking, shopping, and cleaning as women, and people who are playing sports, coaching, and shooting as men.<\/p>\n<p>UVA computer science professor Vicente Ordóñez got the idea for the experiment when he noticed that his image-recognition software was associating photos of kitchens with women. After training software using two photo collections that researchers use to create image-recognition software, including one supported by Facebook and Microsoft, he and his colleagues found that not only do these collections contain gender bias, but they also amplify that bias when they pass it on to the software. The program these photo sets produced actually labeled a man a \"woman\" because he was standing by a stove.<\/p>\n<p>This isn't the only evidence we have that technology contains biases. In addition to image-recognition software, software that analyzes writing and speech also reflects hidden assumptions about gender, according to a study published earlier this year in <em>Science<\/em>. 
The researchers analyzed how computers interpreted words from Google News and an 840-billion-word data set used by computer scientists, and they found that machines linked <em>male<\/em> and <em>man<\/em> with STEM fields and <em>woman<\/em> and <em>female<\/em> with chores. The problem wasn't just with gender either: stereotypically white names were more likely to be associated with positive words like <em>happy<\/em> and <em>gift<\/em>. Another study published last summer found that when software based on Google News was asked, \"Man is to computer programmer as woman is to X,\" it responded with \"homemaker.\"<\/p>\n<p>Of course, computers don't make up these associations out of nowhere. They're reflecting our own biases back to us. But when they pick those beliefs up, the biases can take on a life of their own. The snafu that Google's image software made in 2015, when it mislabeled black people as gorillas, demonstrates this. Google image searches are another example: search for <em>hand<\/em> and you get mostly white ones, while <em>girl<\/em> yields sexy photos and <em>boy<\/em> yields kids.<\/p>\n<p>This tendency becomes even more problematic when AI is used to create robots that interact with people. Mark Yatskar, a researcher at the Allen Institute for Artificial Intelligence and an author of the new study, told <em>Wired<\/em> he could imagine a scenario where a robot asks a woman if she wants help with the dishes while handing a man a beer. \"This could work to not only reinforce existing social biases but actually make them worse,\" he said.<\/p>\n<p>The way artificial intelligence identifies words and images is based on the way people use them, so in order to promote a more egalitarian world, engineers would have to intervene in the creation of the software. And that's a possibility many are considering. 
Eric Horvitz, director of Microsoft Research, told <em>Wired<\/em> that Microsoft has a committee for this. \"I and Microsoft as a whole celebrate efforts identifying and addressing bias and gaps in data sets and systems created out of them,\" he said. \"It's a really important question: when should we change reality to make our systems perform in an aspirational way?\"<\/p>\n<p>Follow this link:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/www.glamour.com\/story\/even-artificial-intelligence-is-sexist\" title=\"Even Artificial Intelligence Is Sexist - Glamour\">Even Artificial Intelligence Is Sexist - Glamour<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Sexism is so deeply ingrained in the way we think about the world that we've actually passed it on to our computers, according to a new University of Virginia (UVA) and University of Washington study. Artificial intelligence (AI) is more likely to label people who are cooking, shopping, and cleaning as women, and people who are playing sports, coaching, and shooting as men <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/artificial-intelligence\/even-artificial-intelligence-is-sexist-glamour\/\">Continue reading <span 
class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187742],"tags":[],"class_list":["post-213074","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/213074"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=213074"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/213074\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=213074"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=213074"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=213074"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}