{"id":204761,"date":"2017-07-10T20:20:19","date_gmt":"2017-07-11T00:20:19","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/if-youre-not-a-white-male-artificial-intelligences-use-in-healthcare-could-be-dangerous-quartz\/"},"modified":"2017-07-10T20:20:19","modified_gmt":"2017-07-11T00:20:19","slug":"if-youre-not-a-white-male-artificial-intelligences-use-in-healthcare-could-be-dangerous-quartz","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/if-youre-not-a-white-male-artificial-intelligences-use-in-healthcare-could-be-dangerous-quartz\/","title":{"rendered":"If you&#8217;re not a white male, artificial intelligence&#8217;s use in healthcare could be dangerous &#8211; Quartz"},"content":{"rendered":"<p>Healthcare inequalities are systemic and closely intertwined with social inequalities. In the US, black men and women can expect to live a decade less than their white counterparts, and are also much more likely to die from heart disease, various types of cancer, and stroke. Rates of diabetes in Hispanic Americans are around 30% higher than in whites. Gay, lesbian, and bisexual adults are twice as likely to suffer from mental-health problems. Access to and quality of healthcare are similarly dismal when it comes to diversity, cutting starkly across racial, social, and economic divides.<\/p>\n<p>If developed and used sensitively, artificial intelligence systems could go a long way toward mitigating these inequalities by removing human bias. A careless approach, however, could make the situation worse.<\/p>\n<p>AI has the potential to revolutionize healthcare, ushering in an age of personalized, accessible, and lower-cost medicine for all. But there's also a very real risk that those same technologies will perpetuate existing healthcare inequalities.
A large part of this risk comes from existing biases in healthcare data.<\/p>\n<p>AI's transformative potential comes from its ability to interrogate, parse, and analyze vast amounts of data. From this information, AI systems can find patterns and links that would previously have required great expertise or time from human doctors. For this reason, AI is particularly useful in diagnostics, in creating personalized treatment plans, and even in helping doctors keep up to date with the latest medical research.<\/p>\n<p>But this use of data risks exacerbating existing inequalities. Data coming from randomized controlled trials are often riddled with bias. The highly selective nature of trials systemically disfavors women, the elderly, and those with medical conditions beyond the ones being studied; pregnant women are often excluded entirely. AI systems are trained to make decisions using this skewed data, and their results will therefore reflect the biases it contains. This is especially concerning when it comes to medical data, which weighs heavily in favor of white men.<\/p>\n<p>The consequences of this oversight are pernicious. Women are far more likely than men to suffer deleterious side effects from medication. Pregnant women get sick, but the consequences of taking many medications while pregnant are chronically understudied or, worse yet, entirely unknown. Women are far less likely to receive the correct treatment for heart attacks because their symptoms do not match typical (read: male) symptoms.
<\/p>\n<p>If evidence-based medicine is already far less evidence-based for anybody who is not a white male, how can the use of this unmodified data do anything other than unwittingly perpetuate this inequality? If we want to use AI to facilitate a more personalized medicine for all, it would help if we could first provide medicine that works for half the population.<\/p>\n<p>The effects of this data can be even more insidious. AI systems often function as black boxes, meaning even the technologists who build them cannot tell how an AI came to its conclusion. This can make it particularly hard to identify any inequality, bias, or discrimination feeding into a particular decision. The inability to access the medical data on which a system was trained (whether to protect patients' privacy or because the data is not in the public domain) exacerbates this. Even with access to that data, the often proprietary nature of AI systems means interrogation would likely be impossible. By masking these sources of bias, an AI system could consolidate and deepen the already systemic inequalities in healthcare, all while making them harder to notice and challenge. Invariably, the result will be a system of medicine that is unfairly stacked against certain members of society.<\/p>\n<p>This is especially true of less-connected communities. There is already an unhealthy digital divide in which poorer and older members of society don't have access to the digital technologies that can be used to improve healthcare. This also means they're not producing the data that comes with their use, and as this chasm grows, the system will be stacked against older and poorer patients even more heavily than it currently is. Even if they were to gain access to these technologies in the next decade, it would be too late: the systems will already be calibrated for younger, more urban bodies.
<\/p>\n<p>If we don't closely monitor AI's use in healthcare, there's a risk it will perpetuate existing biases and inequalities by building systems on data that systemically fails to account for anyone who is not white and male. At its core, this is not a problem with AI but a broader problem with medical research and healthcare inequalities as a whole. But if these biases aren't accounted for in future technological models, we will continue to build an even more uneven healthcare system than the one we have today.<\/p>\n<p>You can follow Rob on Twitter. Learn how to write for Quartz Ideas. We welcome your comments at <a href=\"mailto:ideas@qz.com\">ideas@qz.com<\/a>.<\/p>\n<p>Continue reading here:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"https:\/\/qz.com\/1023448\/if-youre-not-a-white-male-artificial-intelligences-use-in-healthcare-could-be-dangerous\/\" title=\"If you're not a white male, artificial intelligence's use in healthcare could be dangerous - Quartz\">If you're not a white male, artificial intelligence's use in healthcare could be dangerous - Quartz<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> Healthcare inequalities are systemic and closely intertwined with social inequalities. 
In the US, black men and women can be expected to live a decade less than their white counterparts, and are also much more likely to die from heart disease, various types of cancer, and stroke <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/if-youre-not-a-white-male-artificial-intelligences-use-in-healthcare-could-be-dangerous-quartz\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187743],"tags":[],"class_list":["post-204761","post","type-post","status-publish","format-standard","hentry","category-ai"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/204761"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=204761"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/204761\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=204761"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=204761"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/w
p\/v2\/tags?post=204761"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}