{"id":1122505,"date":"2024-02-26T00:18:51","date_gmt":"2024-02-26T05:18:51","guid":{"rendered":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/uncategorized\/google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis-the-verge\/"},"modified":"2024-02-26T00:18:51","modified_gmt":"2024-02-26T05:18:51","slug":"google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis-the-verge","status":"publish","type":"post","link":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis-the-verge\/","title":{"rendered":"Google explains Gemini&#8217;s embarrassing AI pictures of diverse Nazis &#8211; The Verge"},"content":{"rendered":"<p>Google has issued an explanation for the embarrassing and wrong images generated by its Gemini AI tool. In a blog post on Friday, Google says its model produced inaccurate historical images due to tuning issues. The Verge and others caught Gemini generating images of racially diverse Nazis and US Founding Fathers earlier this week.<\/p>\n<p>&#8220;Our tuning to ensure that Gemini showed a range of people failed to account for cases that should clearly not show a range,&#8221; Prabhakar Raghavan, Google&#8217;s senior vice president, writes in the post. &#8220;And second, over time, the model became way more cautious than we intended and refused to answer certain prompts entirely, wrongly interpreting some very anodyne prompts as sensitive.&#8221;<\/p>\n<p>This led Gemini AI to overcompensate in some cases, like what we saw with the images of the racially diverse Nazis. It also caused Gemini to become over-conservative, refusing to generate specific images of &#8220;a Black person&#8221; or &#8220;a white person&#8221; when prompted.<\/p>\n<p>In the blog post, Raghavan says Google is sorry the feature &#8220;didn&#8217;t work well.&#8221; 
He also notes that Google wants Gemini to &#8220;work well for everyone&#8221; and that means getting depictions of different types of people (including different ethnicities) when you ask for images of &#8220;football players&#8221; or &#8220;someone walking a dog.&#8221; But, he says:<\/p>\n<p>However, if you prompt Gemini for images of a specific type of person, such as &#8220;a Black teacher in a classroom,&#8221; or &#8220;a white veterinarian with a dog,&#8221; or people in particular cultural or historical contexts, you should absolutely get a response that accurately reflects what you ask for.<\/p>\n<p>Raghavan says Google is going to continue testing Gemini AI&#8217;s image-generation abilities and &#8220;work to improve it significantly&#8221; before reenabling it. &#8220;As we&#8217;ve said from the beginning, hallucinations are a known challenge with all LLMs [large language models]; there are instances where the AI just gets things wrong,&#8221; Raghavan notes. &#8220;This is something that we&#8217;re constantly working on improving.&#8221;<\/p>\n<p>See the original post here:<\/p>\n<p><a target=\"_blank\" rel=\"nofollow noopener\" href=\"https:\/\/www.theverge.com\/2024\/2\/23\/24081309\/google-gemini-embarrassing-ai-pictures-diverse-nazi\" title=\"Google explains Gemini's embarrassing AI pictures of diverse Nazis - The Verge\">Google explains Gemini's embarrassing AI pictures of diverse Nazis - The Verge<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Google has issued an explanation for the embarrassing and wrong images generated by its Gemini AI tool. In a blog post on Friday, Google says its model produced inaccurate historical images due to tuning issues. 
The Verge and others caught Gemini generating images of racially diverse Nazis and US Founding Fathers earlier this week. <a href=\"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/ai\/google-explains-geminis-embarrassing-ai-pictures-of-diverse-nazis-the-verge\/\">Continue reading <span class=\"meta-nav\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[187743],"tags":[],"class_list":["post-1122505","post","type-post","status-publish","format-standard","hentry","category-ai"],"_links":{"self":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/1122505"}],"collection":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/comments?post=1122505"}],"version-history":[{"count":0,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/posts\/1122505\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/media?parent=1122505"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/categories?post=1122505"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.euvolution.com\/prometheism-transhumanism-posthumanism\/wp-json\/wp\/v2\/tags?post=1122505"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}