{"id":182481,"date":"2017-03-09T03:22:39","date_gmt":"2017-03-09T08:22:39","guid":{"rendered":"http:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/artificial-intelligence-for-cars-may-drive-future-of-healthcare-healthline\/"},"modified":"2017-03-09T03:22:39","modified_gmt":"2017-03-09T08:22:39","slug":"artificial-intelligence-for-cars-may-drive-future-of-healthcare-healthline","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/artificial-intelligence\/artificial-intelligence-for-cars-may-drive-future-of-healthcare-healthline\/","title":{"rendered":"Artificial Intelligence for Cars May Drive Future of Healthcare &#8211; Healthline"},"content":{"rendered":"<p><p>    The same artificial intelligence that may soon drive your new    car is being adapted to help drive interventional radiology    care for patients.  <\/p>\n<p>    Researchers at the University of California, Los Angeles    (UCLA), have used advanced artificial intelligence, also called    machine learning, to create a chatbot or Virtual    Interventional Radiologist (VIR).  <\/p>\n<p>    This device communicates automatically with a patients    physicians and can quickly offer evidence-based answers to    frequently asked questions.  <\/p>\n<p>    The scientists will present their research today at the Society    of Interventional Radiologys 2017 annual scientific    meeting in Washington, D.C.  <\/p>\n<p>    This breakthrough will allow clinicians to give patients    real-time information on interventional radiology procedures as    well as planning the next step of their treatment.  <\/p>\n<p>    Dr. Edward W. Lee, assistant professor of radiology at UCLAs    David Geffen School of Medicine, and one of the authors of the    study, said he and his colleagues theorized they could use    artificial intelligence in low-cost, automated ways to improve    patient care.  
<\/p>\n<p>\"The fundamental technology that has made self-driving cars possible is deep learning, a type of artificial intelligence modeled after the connections in the human brain,\" Dr. Kevin Seals, resident physician in diagnostic radiology at UCLA Health and a study co-author, said in a Healthline interview.<\/p>\n<p>Seals, who programmed the VIR, said advanced computers and the human brain have a number of similarities.<\/p>\n<p>\"Using deep learning, computers are now essentially as good as humans at identifying particular objects, making it possible for self-driving cars to see and appropriately navigate their environment,\" he said.<\/p>\n<p>This same technology can allow computers to understand complex text inputs such as medical questions from healthcare professionals, he added. \"By implementing deep learning using the IBM Watson cognitive technology and Natural Language Processing, we are able to make our virtual interventional radiologist smart enough to understand questions from physicians and respond in a smart, useful way.\"<\/p>\n<p>Think of it as an initial, superfast layer of information gathering that can be used prior to taking the time to contact an actual human diagnostic or interventional radiologist, Seals said.<\/p>\n<p>The user simply texts a question to the virtual radiologist, which in many cases provides an excellent, evidence-based response more or less instantaneously, he said.<\/p>\n<p>He noted that if the patient doesn't receive a helpful response, they are rapidly referred to a human radiologist.<\/p>\n<p>\"Tools such as our chatbot are particularly important in the current clinical environment, which focuses on quality metrics and follows evidence-based clinical guidelines that are proven to help patients,\" he said.
<\/p>\n<p>Seals said a team of academic radiologists curated the information provided in the application from the radiology literature, and it is rigorously scientific and evidence-based.<\/p>\n<p>\"We hope that using the application will encourage cutting-edge patient management that results in improved patient care and significantly benefits our patients,\" he added.<\/p>\n<p>\"It can be thought of as texting with a virtual representation of a human radiologist that offers a significant chunk of the functionality of speaking with an actual human radiologist,\" Seals said.<\/p>\n<p>When the non-radiologist clinician texts a question to the VIR, deep learning is used to understand that message and respond in an intelligent manner.<\/p>\n<p>\"We get a lot of questions that are fairly readily automated,\" Seals said, \"such as, 'I am worried that my patient has a blood clot in their lungs. What is the best type of imaging to perform to make the diagnosis?' The chatbot can respond to questions like this in a supersmart, evidence-based way.\"<\/p>\n<p>Sample responses, he said, can include instructive images (for example, a flowchart that shows a clinical algorithm), response text messages, and subprograms within the application, such as a calculator to determine a patient's Wells score, a metric doctors use to guide clinical management.<\/p>\n<p>The VIR application resembles an online customer service chat.<\/p>\n<p>To create a crucial foundation of knowledge, the researchers fed the app more than 2,000 data points that simulated the common inquiries interventional radiologists receive when they meet with patients.<\/p>\n<p>When a referring clinician asks a question, the extensive knowledge base of the app allows it to respond instantly with the best answer.
<\/p>\n<p>The various forms of responses can include websites, infographics, and custom programs.<\/p>\n<p>If the VIR determines that an answer requires a human response, the program will provide contact information for a human interventional radiologist.<\/p>\n<p>The app learns as clinicians use it, and each scenario makes the VIR smarter and more powerful, Seals said.<\/p>\n<p>The nature of chatbot communications should protect patient privacy.<\/p>\n<p>\"Confidentiality is critically important in the world of modern technology and something we take very seriously,\" Seals said.<\/p>\n<p>He added that the application was created and programmed by physicians with extensive HIPAA (Health Insurance Portability and Accountability Act of 1996) training.<\/p>\n<p>\"We are able to avoid these issues because users ask questions in a general and anonymous manner,\" Seals said. \"Protected health information is never needed to use the application, nor is it relevant to its function.\"<\/p>\n<p>All users, professional healthcare providers such as physicians and nurses, must agree not to include any specific protected patient information in their texts to the chatbot, he added.<\/p>\n<p>\"None of the diverse functionality within the application requires specific patient information,\" Seals said.<\/p>\n<p>This new technology represents the fastest and easiest way for clinicians to get the information they need in the hospital, starting with radiology and eventually expanding to other specialties such as neurosurgery and cardiology, Seals said.<\/p>\n<p>\"Our technology can power any type of physician chatbot,\" he explained. 
\"Currently, there are information silos of sorts that exist between various specialists in the hospital, and there is no good tool for rapidly sharing information between these silos. It is often slow and difficult to get a busy radiologist on the phone, which inconveniences clinicians and delays patient care.\"<\/p>\n<p>Other clinicians at the UCLA David Geffen School of Medicine are testing the chatbot, and Seals and Lee say their technology is fully functional now.<\/p>\n<p>\"We are refining it and perfecting it so it can thrive in a wide release,\" Seals said.<\/p>\n<p>Seals' engineering and software background allowed him to perform the necessary programming for the as-yet-unfunded research project. He said he and his colleagues will seek funding as they expand.<\/p>\n<p>This breakthrough technology will debut soon.<\/p>\n<p>The VIR will be made available in about one month to all clinicians at the UCLA Ronald Reagan Medical Center. Further use at UCLA will help the team refine the chatbot for wider release.<\/p>\n<p>The VIR could also become a free app.<\/p>\n<p>\"We are exploring potential models for releasing the application,\" Seals said. \"It may very well be a free tool we release to assist our clinician colleagues, as we are academic radiologists focused on sharing knowledge and improving clinical medicine.\"<\/p>\n<p>The researchers described the importance of the VIR in a summary of their findings: \"Improved artificial intelligence through deep learning has the potential to fundamentally transform our society, from automated image analysis to the creation of self-driving cars.\"
<\/p>\n<p>View original post here: <\/p>\n<p><a target=\"_blank\" rel=\"nofollow\" href=\"http:\/\/www.healthline.com\/health-news\/artificial-intelligence-car-radiology\" title=\"Artificial Intelligence for Cars May Drive Future of Healthcare - Healthline\">Artificial Intelligence for Cars May Drive Future of Healthcare - Healthline<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p> The same artificial intelligence that may soon drive your new car is being adapted to help drive interventional radiology care for patients. Researchers at the University of California, Los Angeles (UCLA), have used advanced artificial intelligence, also called machine learning, to create a chatbot or Virtual Interventional Radiologist (VIR).  <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/artificial-intelligence\/artificial-intelligence-for-cars-may-drive-future-of-healthcare-healthline\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":5,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187742],"tags":[],"class_list":["post-182481","post","type-post","status-publish","format-standard","hentry","category-artificial-intelligence"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/182481"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?po
st=182481"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/182481\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=182481"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=182481"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=182481"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}