24 thoughts on “Reading for Wednesday October 25th”

  1. In Safiya Umoja Noble’s Algorithms of Oppression, Noble details how neoliberal capitalist interests combine with pre-existing societal biases to produce discriminatory search results for users. As Noble explains, “the dominant notion of search results as being both ‘objective’ and ‘popular’ makes it seem as if misogynist or racist search results are a mirror of the collective” (Noble 36). On the same page, the author explains that the general public tends to trust search results and often cannot distinguish them from sponsored content.

    I believe this is an important point to emphasize because it relates to pre-existing epistemological concerns about which groups of people and which methodologies are seen as capable of producing knowledge. If people continue to believe that search results are generally objective and popular, this can shape their beliefs about how knowledge is formed and reinforce existing beliefs about what is true. Through search engines, people may perceive a correlation between objectivity and popularity, which may encourage them to treat information they perceive as popular as also being credible. In the case of the racist and sexist search results examined by Noble, the perceived popularity of such information can convince a user that it is what everyone else thinks, or an accurate depiction of reality. After all, some argued that Donald Trump was a credible political candidate in 2016 because “he says what everyone is thinking,” a claim Google search results might be used to support (https://www.bbc.com/news/election-us-2016-36253275). Google search results can lend credibility to information that is entirely divorced from reality, like racist and sexist search results, shifting people’s criteria for what makes information credible.

  2. At first, when reading the first chapter of this book, I was skeptical about how much of a role Google played in shaping search results. I thought the rankings were mostly a product of user behavior: the more people visited a site, the higher its link would climb, as a convenience for Internet users. But boy oh boy was I semi-wrong. Perhaps I thought this because of the innocent persona Google has crafted, which entices us to trust the search results it brings up. I think this book does a great job of exploring why generated search results matter: they perpetuate racist and sexist ideologies. Nowadays, I think it’s even more important that the information offered by a huge corporation like Google does not perpetuate white supremacy. Because everyone now uses Google to learn, this well-trusted website should not prioritize making money over addressing harmful stereotypes. What we see and accept as models of society matters. For example, when I was younger, Disney Channel did not have many shows featuring Black people; they were typically best friends or supporting characters, while the rest of the Disney shows focused on white-presenting characters doing amazing stuff and living the main character’s life. In that respect, I think it was easier for little white children to see themselves being successful and living fun lives. But since there weren’t many shows of Black kids living peacefully and finding success, it was harder for me to craft an image of what my success looked like. So the more Google pushes these harmful false narratives, the more it prevents us from seeing other cultures in holistic ways.

  3. Noble’s analysis of Google’s search results provides a clear illustration of how technology is not neutral and reflects societal biases. It is alarming that a simple search for “black girls” can surface dehumanizing and stereotypical content, underscoring the need for more diversity in tech and better regulation. The role of commercial interests influencing access to information, replacing public resources like libraries, is concerning. It’s a stark reminder that we must critically engage with these platforms and advocate for more inclusive and equitable digital spaces. This analysis also highlights the need for alternatives to the current search engines, which continue to marginalize and misrepresent certain groups. It’s clear that true tech diversity and inclusion are crucial to prevent such oppressive outcomes.

  4. Google’s search algorithm, which we read about today, adds to a list of technologies and algorithms that disproportionately benefit the elite. In the first chapter of Algorithms of Oppression by Safiya Umoja Noble, this is summarized by the observation that “Despite the widespread beliefs in the Internet as a democratic space where people have the power to dynamically participate as equals, the Internet is in fact organized to the benefit of powerful elites” (p. 34). Again in these readings we approach a “welp, it’s not human” point where the impact of human influence in these algorithms is ignored. Chapter two takes this up: not only are minorities underrepresented at Google and almost every big tech company, but “jobs that could employ the expertise of people who understand the ramifications of racist and sexist stereotyping… are nonexistent” there (p. 70). The reading points out some clear and obvious steps to at least alleviate the failures of equal representation in the technology workforce. The common excuse mentioned above is Google’s way of continuing to ignore critical solutions to the algorithms’ problems.

    As we read, the lack of intersectional awareness in these algorithms leads to disgusting results. I ran Google searches today for both “Jews” and “Black Girls” and saw that the results described in the book are no longer the top-appearing links. However, the fact that these results once rose to the top signals to me that no matter how ‘far’ Google search has come, its history of problems reminds us of the inequities apparent in big tech to this day.

  5. This reading was super interesting and informative. I really appreciated the examples given throughout the chapters. Most of them are absolutely surreal. When I was younger, I remember Google being more Wild West than it is now, with the occasional inappropriate link or image finding its way on the screen during innocent/unrelated searches. I was shocked to see just how easy it was to be loaded with pornographic, misogynistic, hypersexualizing content from completely neutral and unloaded words such as “black girls.” I can only imagine potential situations where, as others have said, small children sought to engage with content that placed people like them at the center and were assaulted with sexual content instead of finding what they were looking for.

    I also appreciated the sociological elements introduced to explain digital mirroring of racial hierarchies present in America, especially about the inability to be “raceless” in a society that bases position heavily on race and the dichotomy of black versus white. It reminded me of frustrations biracial people have expressed to me about being forcibly confined to one single box because of how people interpret their appearance and the rigid sociological boundaries and roles our society has based on race.

    I liked how the idea of intent was eschewed as well. It truly isn’t important what was “intended” by Google engineers when they let such search results get past them. What truly matters is that they are responsible for what their systems do and the harm they can cause.

  6. Today’s readings discussed how the internet perpetuates the suppression of marginalized groups online. When it comes to search results, companies can pay to be placed in the top position for a specific query. However, the average user usually cannot distinguish a sponsored top result from an organic one. The top result is widely accepted as the most credible, although this is not always the case. The internet pushes what is most popular to the top, not what is most important, which is extremely detrimental to marginalized groups.

    To discuss Google specifically: although it began as a simple search engine that promoted credible, fast results, it has morphed into the biggest advertising company in the world. It has changed from a company that prioritizes the betterment of people’s search for information to one that prioritizes money above all else, to the detriment of the quality of the information that gets circulated. This raises concerns about the impact on users who rely on Google for information. With so much information available on the internet, it’s crucial to have a search engine that can filter out the noise and provide trustworthy information.

  7. The analysis of Google’s search results by Noble illustrates the non-neutrality of technology and its reflection of social biases. It raises concerns about the need for diversity in tech, better regulation, and the influence of commercial interests on information access.
    While Noble raised certain issues in the book, the situation is not as dire as she suggests. For instance, she cited a search result for “Black girls” as evidence of social biases, but when I did the same search, the result did not align with her argument. While it’s tempting to say that Google prioritizes profits and disregards marginalized groups, it is evident that the company is making efforts to improve its practices.
    Instead of building an entirely new search engine, our focus should be on addressing the problems in the existing one.

  8. Today’s reading was genuinely informative, highlighting once again the prevalence of race and gender biases within big tech corporations. I distinctly recall watching a video that explained one of the examples provided in the reading, specifically the one regarding the results displayed when searching for ‘three teenagers.’ However, I was unaware of the extent of hyper-sexualization, particularly concerning women of color. One point Noble made that stood out to me was when she posed the question of responsibility for these results. Initially, it may seem straightforward to blame the search engine, but Noble’s in-depth analysis reveals that the roots of these issues trace back to the times of slavery and involve various sources.
    Another aspect of the reading that surprised me was Google’s response to the search term ‘Jew’ in 2005. This example illustrates how Google, instead of actively seeking solutions to these problems, chose to place the burden on its users. It also raises the question of why Google addressed this particular issue publicly but remained silent about the hyper-sexualization of women. Finally, the reading underscores the importance of diversity in the development of these powerful tools.

  9. Moderation is always a hot-button issue when it comes to the internet, and this reading offers a convincing argument for better moderation. The various inappropriate results, whether adult, sexist, or racist, are disturbing, considering the compounding factors that exist in today’s society. Trust in search engines is intrinsically tied to laziness and a shrinking attention span for responsibly consuming content on the internet. We, as users of the internet, are often in search of an answer, and we are accustomed to instant gratification, so we want it quick. And Google (and other search engine makers) know this, which is why search engines are increasingly designed to scratch this itch. Think about what happens when you Google something now: a text excerpt usually pops up with some highlighted text similar to what you searched, usually drawn from one of the top results, and I’m positive many people in their haste will take whatever this excerpt says at face value. I’ve done it; it’s a natural response for users of the internet these days. But this aspect of internet usage makes it incredibly dangerous for those who might be confronted with inaccurate—or worse—damaging material. Additionally, engaging with this material makes it easier to fall into the rabbit hole of misinformation that permeates the internet, even if it may not be the top search result.
    Some other lingering concerns I have about search engines are the inclusion of AI in search engines like Bing—which we have previously established is prone to presenting a false sense of confidence in incorrect answers—and the fact that, despite Google’s apparent efforts to moderate its platform more effectively, it can offer an easy portal to other sites that are not moderated at a comparable level, let alone at all. 4chan, Reddit, Discord, Quora, etc. are all accessible with simple Google searches, and often may be the sources that rise to the surface of Google search results. It is not hard to imagine a user Googling a question, being directed to a message board or social networking site, and then being springboarded into a wealth of harmful and inaccurate information.

  10. I think it is interesting how ingrained Google and search engines in general now are in the way we process information. Growing up with Google at my fingertips, I have always been somewhat aware of its issues and pitfalls. We are taught in research classes how to tailor our search wording and how to avoid certain misinformation, but in everyday interaction there are also unspoken rules about which terms are associated with porn and how different sites get pushed to the top. It sometimes comes up in conversation, but I have never really thought about Google’s issues as something that could be fixed. The reading points out the many ways, from picture results to search suggestions, that Google harms minorities and enshrines systems of power, and while I on some level knew Google had these issues, I am not sure why I never critically considered the impacts. I always thought of these as merely depictions of a greater cultural problem, without looking critically at Google’s own impact. I think Google does a good job of pushing the blame off of itself and onto its users, but after all we have discussed in this class, we cannot leave the makers of such an algorithm without blame.

  11. The first chapter delves into an essential aspect of technology ethics, i.e., the role of search engine algorithms in perpetuating societal biases. It questions the neutrality of algorithms, emphasizing the socio-technical nature of these computational processes. The search results, as the author argues, are not merely reflections of societal opinions but are also shaped by commercial interests, advertising, and algorithmic prioritizations. In terms of practical implications, the biases present in search results can significantly impact various aspects of daily life. For instance, when individuals search for information, the biased results can reinforce stereotypical ideas, indirectly shaping people’s beliefs, attitudes, and decisions in various fields such as education, employment, and social interactions. The second chapter discusses the partnership between Google and Black Girls Code, an organization dedicated to mentoring African-American girls interested in computer programming, as part of Google’s diversity initiative. However, Noble critiques this partnership, asserting that it does not address the core issues of racist and sexist biases in the tech industry. The author reflects on historical algorithmic biases in Google searches, citing how searches for “black girls” previously yielded hypersexualized and offensive content. Despite the presence of organizations like Black Girls Code, these underlying biases within technology and its designs persist, partly due to the lack of diversity and inclusive perspectives in the tech industry.

  12. Search engines, particularly Google, are not neutral but are influenced by broader social, cultural, and economic contexts. These influences often reinforce systemic racism and discrimination and perpetuate biased social and cultural norms. These observations led Noble to coin the term “algorithmic oppression” and to explain how search engines amplify harmful stereotypes and misinformation. I see this term applying to a lot of previous readings, such as those on LLMs, where one reading mentioned how a chatbot on Twitter ended up becoming racist and unsuitable for the public due to its influence from society and the data it gathered. This is similar to search engines, as they too are influenced by the people who search for certain keywords. Noble also points this out as she reveals how search engines prioritize commercial interests over social justice or accuracy. Although search engines are not doing this intentionally, it shows that the people who created search engines and LLMs that learn from their environments did not consider the different possibilities that could occur. Through this ignorance of such concerns, certain groups of people are misrepresented and stereotyped. I feel these concepts also tie in with the idea of efficiency over actual consideration: people constantly pursue the best way to build such systems with the general public’s interests in mind, but with no concern for the minorities in the situation. So many of the concepts we have learned are intertwined that I find it both interesting and concerning to think about how we could tear this problem down one piece at a time.

  13. In a completely unsurprising twist, given our experience with Google in this class so far, it is still evil. I was particularly floored by its putting the onus on users to search not “Jew” but “Jewish” or “Judaism,” rather than filtering out or de-optimizing the SEO of pages of white supremacist rhetoric. In doing so, Google essentially assumed that hate is the norm: that while it “doesn’t endorse it,” more people than not who search that term (a common name for a Jewish person) are seeking the hateful content. Similarly, I found it very interesting, in an insidious kind of way, how Google artificially affiliated pornographic themes with searches about women of color, particularly Black women, as this assumption of their bodies as objects continues to perpetuate the logics of slavery and, relatedly, white domination.
    All of this leaves a great distaste in my mouth for Google, and potentially for search engines broadly, but that is very difficult to reconcile with an increasingly digital world. I am not owed answers, but I would love to know what, if anything, I can do as a scientist or as a user to avoid contributing to this cycle of abuse.

  14. I went through and did all of the Google searches that the author did in chapter 1. It seems we are in a better place in that regard; I made sure SafeSearch was off and everything. That does not mean Google is completely fixed, but at least some of these issues have been addressed. It seemed like a nice positive note to share, since we don’t get many positive notes in this class.

    Regardless, it is cool that this book called search algorithms into question and challenged our beliefs about our own level of influence on the system. I knew that advertisers had an influence, of course, but I also expected people’s clicks to be more influential than they came off in this reading. It just goes to show that Google is still a corporation with profit motives; its public image is not representative. The whimsical nature of Google presented in the film The Internship might exist in the office in real life, I can’t know, but its impact on society lacks whimsy.

  15. For me, a big issue underlying these chapters is that Google doesn’t have much incentive to act. There are people who want to find racist and sexist content, and Google makes more money the more people it can draw in. Users whose desired content is pushed out of the way by that garbage will just rephrase their search, and will likely be trained eventually to do so by default, but a neo-Nazi who can’t find their Nazi memorabilia will go use another search engine, and Google loses money. The same goes for fair representation of minority groups, which by definition are less numerous and/or powerful and thus less “valuable” as customers. Unless there’s a shift toward actually holding companies significantly monetarily accountable for actions with societal externalities, they won’t do anything.
    I also just wanted to say that the TV analogy from chapter 1 actually kind of illustrates what the author says the UN campaign did wrong anyway. The TV content being racist is analogous to the search results being sexist, but the critique of the UN campaign is that it fails to acknowledge that the underlying infrastructure is biased too. The equivalent in the analogy would be the fact that TVs and photographic technology have been unable to display the level of detail in dark skin that they can in lighter skin, which has led to underrepresentation and homogenization of people of color. But the underlying technology is exactly what the analogy says people don’t need to understand. It’s a minor thing, and that analogy isn’t super important, but I thought it was interesting that it seemed to be an unintentional acknowledgement of how difficult and unattainable it is to expect the layperson to have the time to understand how a lot of the tech around us works.

  16. The racist and antisemitic search results discussed by the author were honestly kind of surprising to me. I was aware of phenomena such as the overrepresentation of white faces when searching “beautiful,” for example, but the influx of pornography when simply searching “black girls” was new to me. This is obviously appalling, and extremely confusing: why were pornographic results surfaced without user prompting, from search strings completely devoid of sexual markers? I was also curious about the statistics on how many people said they were comfortable using search algorithms when asked, but then balked at privacy concerns they had probably already been exposed to. In this case, I do think greater tech literacy would be good, but when it comes to the search results themselves, the responsibility lies with computer scientists and algorithm writers to consider the ethical ramifications of our work.

  17. It is interesting to think about how results should appear; the most relevant result is not always the most accurate. It is surprising that different keywords can produce such different Google results. I also hadn’t registered that it is a problem that searches for professors overwhelmingly depict white men. The lack of role models, and the stereotypical biases that search engines propagate through page ranking, could destroy young people’s interest in considering certain careers, and would also intensify the identity-bias problems that occur these days. As mentioned in the reading (page 39), the technology does not account for the complexities of human intervention involved in vetting information and does not pay attention to the relative weight of certain types of problems.

  18. These chapters were incredibly interesting to me. Google parroting the biases of our offline world is apparent and unsurprising, but it’s still deeply uncomfortable to read about. It’s incredibly dangerous to believe that what search engines show us is objective and representative of the real world, yet to an extent many of us do. Noble quotes Alex Halavais, who states that search engines have become “an object of faith,” and I agree with this sentiment. For a lot of us, search engines are the primary way we interact with media. Much of the time it is difficult to gauge what is trustworthy or not, or sponsored or not. I often catch myself taking the first few search results as objective without really processing where they come from. Instead of the internet being an equalizer, giving power to marginalized groups, it has only become a tool for reinforcing biased viewpoints.

    These problems were especially apparent to me as a kid. When I first started exploring the internet on my own, it wasn’t hard to find incredibly damaging content. What was usually on the front page of YouTube in the late 2000s/early 2010s was incredibly racist, demeaning content. As a half-white kid, I never saw white men represented in degrading ways due to their race or gender. But as a half-Asian kid, watching front-page videos representing Asian men with harmful stereotypes and poor impressions led me to believe that Asians were largely seen as punchlines by the American public. A number of people have brought up that Google has improved in recent years, but it’s still not hard to see how the internet displays our biases. While real progress has been made within search engines, the largest media platforms on the internet, like Facebook and Twitter, still display harmful, biased content regularly.

  19. In the chapters by Noble, we once again see harmful results produced by algorithms being taken without question by the public. Even when there is an awareness that the top search results aren’t the most credible or popular, people seem to think there must be some semblance of credibility for them to appear on Google in the first place; this ultimately acts as another way to reinforce prejudices. Also present at Google is a lack of accountability in determining search results, falling back on the reasoning that computer-generated results—results the company even acknowledges are harmful—mean the company is not at fault. It’s disheartening that the only time they’ll filter out harmful results is when governments have laws in place, and worrying how easily search engines can choose to display or hide results while many users remain unaware.

  20. In Noble’s book, she acutely breaks down the biases inherent in Google search that allow pornographic results to top searches for “black girls,” for instance. I thought this quote on the intent of technologists was particularly important in this context:

    “Many people say to me, ‘But tech companies don’t mean to be racist; that’s not their intent.’ Intent is not particularly important. Outcomes and results are important.” (Noble 90).

    I think this unimportance of intent is important to keep in mind, since techno-optimists often prioritize results to show how their product succeeds, and intent to cover for when it fails. The general public, however, don’t know how Google Search is implemented and don’t care or think about it. They only see the results, so it’s critical that engineers examine their product and their own biases before carelessly releasing software.

  21. It is interesting how one of the first solutions mentioned or proposed to the misrepresentation of minority groups in commercial search engines is to encourage them to become programmers and fix the systems themselves instead of educating programmers of majority groups. As much as it is necessary to have such an increased representation of voices and to resolve gender inequality in relevant industries, it is sad to think that we’re still at a point where minority groups have to take matters into their own hands and actively fight for their own rights. In order to have this radically addressed, we need to incorporate more ethics, sociology and humanities courses in technology curricula. Although it has been pointed out several times in the reading how challenging this might be, I believe it is a necessary step, especially as we are also moving towards more interdisciplinary education in undergraduate programs.

    I thought the discussion about the intent versus the outcome of these products was also really thought-provoking. The reading briefly talked about impact testing, which goes back to one of the points brought up in readings and class discussions a few weeks ago. All algorithms and products should be carefully tested on the groups of people most likely to be negatively impacted. Without this level of thorough and comprehensive analysis of how users of different identity backgrounds are affected in reality, these technological products should not be commercially sold or advertised. In response to the claim that outcomes are what matters and that Google should be responsible for the racial and ethnic biases that show up in its search results, I was wondering how much control the company actually has over those results, and whether the fact that these algorithms have drastically improved can be attributed to engineers tweaking the logic or fixing glitches behind the scenes, or to something else. For example, I just did some quick Google searches on some of the keywords in the readings, and the results that I looked at seem much more diverse and free of bias. However, most of the images that came up were from top-searched websites, which made me wonder if the improvement in search results is partly due to every company, organization, and website becoming more socially aware of these historical biases and attempting to create more inclusive brand images.

  22. Linking to our earlier discussions of search engines as oracles and the downfalls of that idea, Noble shows us how Google search results do not necessarily yield the most credible information for a given search, but rather the information that benefits “neoliberal capital and social elites in the United States”. Noble also shows how the mathwashing and purported objectivity of such technologies, which often deal with the classification and ranking of social information in relation to social terms, lead to the opposing belief (or installed ideology) that these search engines reflect social reality. While society is flawed, Noble’s argument that search engines, with the cultural power they hold, actually perpetuate these flaws because it is economically beneficial to do so implicates the algorithms, their creators, and their true beneficiaries in a way that blaming the organization of society does not. It exposes the proliferation of social power structures as deliberate.

  23. I despise the way Google tries to invade every aspect of our lives; in many ways it has become impossible to live a normal life without making numerous concessions to Google. I have tried to de-Google myself as much as I can: I use Qwant (a search engine I highly recommend for others looking to take more control over their personal information), Firefox instead of Chrome (which also has privacy issues, though more private browsers tend not to be supported by many websites and programs), and Protonmail instead of Gmail. And yet I still use YouTube extensively, because there aren’t really any widely used video-sharing alternatives other than TikTok, which doesn’t have videos longer than 10 minutes and also collects a lot of user data. I don’t have options for cloud-based collaborative file editing other than Google Drive, at least not any that the average person already has an account for. Even for personal use, my options for manipulating spreadsheets are basically limited to Google Sheets and Excel. I even have a Google Pixel, because I hardly have a choice: my cellphone has become an integral part of my daily life and ability to function and communicate with others, and since I am still financially dependent on my parents, the alternatives are limited by the financially viable cellphone plans available to my family. The average person’s over-reliance on Google is highly coercive and in many cases institutionally endorsed and sanctioned, as Noble points out (36). This makes the biases endemic to Google’s algorithms and practices all the more insidious.

  24. I thought it was really sad/scary that people were dismissive of initial warnings/wariness of the potential harm of technology and algorithms as early as 2005. Regarding the situation around the word “Jew” (versus search results for “Jews” and “Jewish people”), Google’s resistance towards restructuring their algorithm and instead calling on users to alter their personal habits was interesting. I think holding ourselves accountable for the language we use to speak about marginalized groups is important, but it’s weird to think Google recognized a problem with their algorithm and took little sustainable action to rectify it.

    Both of these chapters critically engage with race-based hyper-sexualization. A lot of the language search engines use to justify these egregious results focuses on the intentions of the tool, as if that absolves them of the nefarious ways people have used it. Instead, Google should be dynamically responding to instances of abuse of its technology and rethinking the ways it approaches coding these tools.



The views and opinions expressed on individual web pages are strictly those of their authors and are not official statements of Grinnell College. Copyright Statement.