Technology has seeped into every nook and cranny of the human experience. Throughout this digital shift, one universal language has propelled society forward and made everyday life easier: code.
Computer science is an increasingly important area of study as the digital and physical worlds blend together like a beautifully choreographed dance. But like all things that humans touch, the mathematical sequences that create our digital existence are marred by the systems of inequality that thrive in the physical world.
Each person, regardless of their background, carries a set of subconscious biases. The prejudice each person holds can surface in both superficial and significant ways, and when an engineer creates an algorithm for a new artificial intelligence system, those biases will inevitably affect the outcome.
This is not to say that humans are deliberately ingraining their biases into these systems, but traces of their preferences break through in the data that is fed into these new creations.
Take, for instance, the AI software ImageNet.
In September, thousands of people uploaded their photos to a website called ImageNet Roulette, which used the AI software to analyze a person's face and describe what it saw.
This seemingly amusing game churned out a plethora of responses, from "nerd" to "nonsmoker." However, when Tabong Kima, a 24-year-old African American man, uploaded his smiling photo, the software labeled him an "offender" and "wrongdoer," according to The New York Times.
To add insult to injury, the software labeled the man pictured with him, another person of color, a "spree killer."
At first glance, some may write off this flawed social media trend as unimportant in the grand scheme of things, but that is far from the truth.
ImageNet is one of many data sets that tech giants, start-ups and academic labs have used extensively to train new forms of artificial intelligence. This means that any flaws in this one data set, such as the racist labeling in Kima's case, have already spread far and wide throughout the digital realm.
As engineers increasingly drift toward the production of AI software, with the goal of lifting tedious responsibilities from the shoulders of busy individuals, it is important to ensure that the footprint of systemic inequality that has historically permeated the world does not find its way into the powerful realm of software.
Software has progressively shaped each person's ability to thrive in the world. Whether someone is applying for a credit card or a job, there is less and less of a hands-on approach when these applications are reviewed.
This past week, Apple came under fire for its credit card's allegedly sexist algorithm.
The conversation first surfaced when David Hansson, a prominent software engineer, tweeted about the issues he and his wife were having with Apple's credit card.
"The @AppleCard is such a f--- sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit she does. No appeals work," Hansson tweeted.
Not long after, Apple co-founder Steve Wozniak weighed in, stating that he and his wife had experienced a similar issue with the Apple credit card.
To put it simply, a black box algorithm is a system whose inputs and outputs can be viewed by an observer, but without any knowledge of how the internal system works. This means that although the system produces an output, no one knows how it arrived at that result.
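As an illustration of the concept only, here is a minimal, hypothetical sketch; none of the names, weights or numbers below come from Apple's or Goldman Sachs' actual system. The caller can see the application data going in and a limit coming out, but the scoring logic in between is hidden.

```python
# Hypothetical sketch of the "black box" idea: inputs and outputs are
# visible, but the decision logic is private and unexplained.

class BlackBoxCreditModel:
    """An observer sees apply() go in and a limit come out -- nothing else."""

    def __init__(self):
        # Internal weights are private; in a deployed system this could be
        # a trained model that even its operators cannot fully explain.
        self._weights = {"income": 0.08, "years_of_history": 350.0}

    def apply(self, income, years_of_history):
        # The only visible contract: application data in, credit limit out.
        score = (income * self._weights["income"]
                 + years_of_history * self._weights["years_of_history"])
        return round(score, 2)

model = BlackBoxCreditModel()
print(model.apply(income=90_000, years_of_history=12))  # a limit, but no "why"
```

Two applicants comparing results, as Hansson and his wife did, can see that the outputs differ, but have no way to see which internal weights drove the difference.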
In response to the deeply unsettling prospect of Apple limiting users in a way that hints at sexist black box practices, Sen. Ron Wyden (D-Oregon) tweeted, "The risks of unaccountable, black box algorithms shouldn't be underestimated. As companies increasingly rely on algorithms to handle life-changing decisions and outcomes, federal regulators must do their part to stamp out discrimination before it's written into code."
Although Goldman Sachs, the financial company that handles credit limits for the Apple Card, has denied using black box algorithms, an even more important issue has surfaced from its response.
Even if black box algorithms are not in place, and a credit card application does not explicitly ask for the applicant's gender, refined machine-learning algorithms can still infer gender from the information they are fed. That inference can then factor into credit limits.
For instance, the machines could learn that applicants who hold credit cards at a particular women's clothing store are a bad financial risk. The system could then assign lower credit limits to those who carry these cards, resulting in women receiving lower limits than men, according to Forbes.
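The Forbes example can be sketched in a few lines. Everything here is invented for illustration: the data rows, the store-card flag and the crude limit formula are hypothetical stand-ins for what a trained model might absorb from biased historical data. Gender never appears as an input, yet the gender-correlated feature carries the bias through anyway.

```python
# Hypothetical sketch of "proxy discrimination": gender is never an input,
# but a feature correlated with gender (a store-card flag) imports the bias.

# Invented training rows: (has_store_x_card, defaulted). Suppose the
# historical record shows more defaults among store-X cardholders.
rows = [(1, 1), (1, 1), (1, 0), (0, 0), (0, 0), (0, 1)]

def default_rate(card_flag):
    # Fraction of past applicants with this flag who defaulted.
    matching = [defaulted for flag, defaulted in rows if flag == card_flag]
    return sum(matching) / len(matching)

def credit_limit(card_flag, base=10_000):
    # A naive "model": scale a base limit by the observed default rate.
    return round(base * (1 - default_rate(card_flag)))

print(credit_limit(1))  # store-X cardholders receive a lower limit
print(credit_limit(0))
```

If the store's customers are overwhelmingly women, the flag acts as a stand-in for gender, and the lower limits fall on women without the model ever "seeing" gender at all.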
Another instance of gender-biased algorithms occurred in 2018, after Amazon engineers created an AI engine whose sole purpose was to vet over 100 resumes and help choose the top candidates for hiring.
The tech giant realized that the engine was not rating women software engineer applicants fairly, because the resume patterns the engine had been taught to replicate reflected the stark gender gap within the tech industry.
In a male-dominated industry, Amazon's system taught itself that male applicants were preferable to female ones.
Even after the engineers reprogrammed the system to ignore explicitly gendered words, like "women's," the system still picked up on implicitly gendered words and used them to rate applicants.
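A toy sketch shows why scrubbing explicit terms falls short. The tokens, weights and scoring function below are invented for illustration, not Amazon's actual system, but they show how words that merely correlate with gender in biased training data can reproduce the penalty that removing "women's" was supposed to eliminate.

```python
# Hypothetical sketch: removing explicit gendered words does not remove
# learned penalties attached to words that correlate with gender.

EXPLICIT = {"women's", "woman", "female"}

# Invented "learned" weights: tokens that, in biased historical data,
# co-occurred with rejected resumes get negative scores.
learned_weights = {"softball": -0.4, "netball": -0.5, "chess": +0.2}

def scrub(text):
    # Remove explicitly gendered words, as the engineers did.
    return " ".join(t for t in text.lower().split() if t not in EXPLICIT)

def score(text):
    # Rate the scrubbed resume line with the learned token weights.
    return round(sum(learned_weights.get(t, 0.0) for t in scrub(text).split()), 2)

print(score("captain of the women's softball team"))  # penalty survives scrubbing
print(score("captain of the chess team"))
```

Scrubbing deletes "women's," but the correlated token "softball" still carries the learned penalty, so the biased ranking persists.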
As these new systems are trained to learn from historical decisions made by humans, it must come as no surprise that the race and gender-based inequality that has plagued society for so long has now found a new home within the digital realm.
Discrimination is entangled in our private lives; that's just the truth. The tango of privilege continues to strut across all facets of human existence, and as that experience dives deeper into the world of artificial intelligence, engineers must ensure they are not deepening the discrimination that minority communities have historically faced.
"We're all beginning to understand better that algorithms are only as good as the data that gets packed into them," said Sen. Elizabeth Warren (D-MA) in an interview with Bloomberg News. "And if a lot of discriminatory data gets packed in, in other words, if that's how the world works, and the algorithm is doing nothing but sucking out information about how the world works, then the discrimination is perpetuated."
There's no easy fix to this problem. How bias affects the livelihood of individuals, and how to combat it fairly, has long been a question for social scientists and philosophers. Extending that question into technology, where concepts have to be defined in mathematical terms, illustrates the hard work that must be done to create a truly fair digital environment.
While fixing these computing errors will require an extreme amount of trial and error, it's the responsibility of software engineers to ensure that these new technologies do not cause more harm and discrimination toward people.
See the article here:
Artificial intelligence engines have implicit biases - The Daily Titan