Algorithms and Prejudice
- Palace Jones
- The Campus Echo
- November 19, 2018
Ah, algorithms. A conundrum steadily plaguing the modern mind. It comes down to a “chicken and the egg” problem: we created these systems, but they learn from how we continuously interact with them, shaping our search results based on the most popular hits as well as our own personal search histories. It is hard to be objective in our web searches when the algorithm itself has become biased, which lends weight to the statement that “Google is racist.” That may sound like a loaded claim, but the facts support it. Safiya Umoja Noble, a communications professor at the University of Southern California, wrote an entire book on the topic, Algorithms of Oppression: How Search Engines Reinforce Racism, exposing the biases built into Google search.
Noble first became interested in the topic when she searched Google for photos of black girls, hoping to find encouraging images for her daughters, but was instead met with pornographic photos of black women. When she went on to search for photos of Latina and Asian women, she received the same kind of results. Google has since addressed the problem, but the theme continues to appear throughout Google searches.
A New Scientist article, “Discriminating algorithms: 5 times AI showed prejudice,” presents another example of algorithmic bias. A predictive policing program called PredPol, whose purpose was to help reduce human bias in policing, was applied to drug-crime data and instead repeatedly pointed police toward neighborhoods with large racial minority populations, regardless of those neighborhoods’ actual crime rates.
Google has also played a role in racially motivated tragedies in the United States. One example is the case of Dylann Roof. Roof used Google to search for “black-on-white crime,” which led him to white supremacist websites that filled his mind with the propaganda that pushed him toward the 2015 massacre at a Charleston, South Carolina church.
We as users can also alter the results that appear when we search for a topic or a name on Google. One example is the racist image that became attached to former First Lady Michelle Obama: for a time, searching for photos of her would surface a doctored image depicting her as a monkey. The fact that individuals can manipulate Google’s search results in this way is a problem. It is a power that can negatively impact our society and the growing generations that rely on search engines like Google to complete their work.
Noble’s work has helped push Google to make its search results less racist and biased, but as a society we can do more. There is a lack of diversity in the technology field, especially among the companies that shape the internet. More minorities working on AI at companies like Google would help build a more objective search experience. As we demand more for ourselves and expect better of the internet, we can begin to climb toward a more equal playing field.