Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Chapter 3 (AE-Chapter3.pdf)
Invisible Women: Data Bias in a World Designed for Men, Chapter 1: Can Snow-Clearing Be Sexist? (ebsco link)
As discussed in a history class at Grinnell, the nature of governance has changed significantly since World War II. After the war, the United States and a number of European governments started state welfare programs. For example, the United Kingdom socialized medicine through the National Health Service, and the United States created the Social Security Administration, Medicare, and Medicaid. To distribute welfare benefits and determine eligibility, these countries began collecting more information on income, employment, and health status. People began to conceive of the state as responsible for its citizens, including responsibility for distributing some form of welfare. However, these changing attitudes about the state's role in society required governments to collect information on their citizens: a government cannot determine welfare spending if it does not collect the information needed to determine eligibility. That said, government collection of information can be subject to abuse.
As Virginia Eubanks explains in Automating Inequality, government collection of information for welfare benefits, like low-income housing, allows for the criminalization of unhoused people, creates privacy concerns, and delays the distribution of benefits. In the digital age, organizations created a coordinated entry system in which information is collected from unhoused people and used to distribute housing benefits (p. 85). An algorithm analyzes the collected information and scores applicants on how urgent their case for housing is. In automating the process of distributing welfare benefits, the organization created a standard for what an unhoused person with urgent needs looks like (p. 93). As discussed in previous class periods, these algorithms can be biased and can encourage people in desperate situations to game the system. However, I would like to discuss the issue of government surveillance. In the modern era, governments tend to provide people with welfare benefits, but this can expose groups of lower socioeconomic status to state surveillance. This information can be a powerful weapon against those of lower socioeconomic status, yet it also enables welfare benefits, making this a difficult balance.
According to these readings, when governments “improve” public spaces such as housing projects and city layouts, they usually fail to consider many of the needs of the people they are building for, and/or they make it difficult for those people to inhabit the spaces.
In Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Eubanks discusses how people in need of housing are surveyed: they answer a series of questions, hand over a myriad of personal information, and are then given a score that determines what housing they will receive, if they receive any at all. However, this gathering of information seems to enable increased policing and incarceration of these people rather than give them the opportunity to improve their situation by being housed. If they read the more in-depth description of who receives their information, they learn that one of the institutions receiving it is the LAPD. The more low-risk a candidate is, the more likely they are to be housed. But since they are constantly surveilled (they even have to specify where they will be throughout the day), they are more likely to be arrested, which raises their risk.
Invisible Women: Data Bias in a World Designed for Men discusses housing inequality through a different lens, focusing mainly on how housing and roads are not usually built with women's needs and common duties in mind. Male contractors usually fail to consider why a woman would need an elevator to her floor (strollers, groceries, etc.), why women would want the kitchen at the heart of the home (better family management), or why sidewalks should be even and smooth for wheels (pushing strollers or trolleys). If these considerations were given more attention, they would benefit women the most, but they would benefit men as well; once that is realized, there may be a possibility of it becoming common practice.
In Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Eubanks describes how algorithms built on the VI-SPDAT survey underpin the shelter selection system in Los Angeles. Although it helps the government decide who should get shelter, the outcome depends heavily on the judgment of different survey takers and how familiar they are with the homeless community. One homeless woman mentions that she feels bad that she got housing while many women in the same conditions have not been able to get it in the three years since. This is shaped by the bias of the interviewer, since they are the ones who set the score.
It also leads to the problem that some people who are relatively healthy have far fewer opportunities to return to a normal life, since they are not in the priority group for getting shelter. They need to work even harder to bear the cost of living but are still not able to get housing. It is a pity to read the story of a bachelor's degree holder with relatively good job experience who cannot find a job and has to stay in a shelter for many years.
The readings for class today first delved into the use of big data in finding housing for the homeless population of Skid Row. However, as we have often seen, access to big data and the promise of technological efficiency can leave room for harm. What was particularly tough about this reading was how data is being collected from those with no other place to turn. The individuals of Skid Row have no choice but to submit as much personal information as they can to the VI-SPDAT for the sake of a chance at somewhere to live. Through the stories told in the chapter, it becomes clear that the VI-SPDAT survey has much room for improvement. In my eyes, and I am sure in the eyes of many in our class, counting prison as housing, and thus reducing your VI-SPDAT score, seems inherently problematic. Such a system effectively punishes getting arrested, even though the people of Skid Row live in an environment subject to frequent and overbearing police surveillance. While data collection was shown to be a source of good in supporting the HHH reforms, the wide collection of data from individuals with little knowledge or power, in exchange for only a slim possibility of housing, seems extremely corrupt to me.
The second reading for today's class, Invisible Women: Data Bias in a World Designed for Men by Caroline Criado Perez, exposed the underlying forms of patriarchy in our society. In particular, it discusses how houses, buildings, and roads, among other things, were designed for men and with only them in mind. Some truly staggering figures were brought up in this reading, including how “women's unpaid care work contributes $10 trillion to annual global GDP” (Perez, p. 42), giving evidence of how women are underpaid in a capitalistic world.
The chapter may have only discussed infrastructure, but the nature of a world designed for men stretches beyond that. Childcare in particular is a burden that is often assumed to be, and often is, taken on by women. The system designed for men begins at childbirth, with the lack of adequate paid maternity leave in most of the US. As we read today, this system continues through almost all aspects of childcare duties. Whether internalized or not, we live in a world that has been designed for men in both infrastructure and society.
Algorithms are only as good as the data they use, and as mentioned in Automating Inequality, data is only as good as its collector. We've talked a lot about this point in class, but it is always interesting to hear new facets of this discussion, especially as we move further into data-driven daily life. With homelessness, while the effort to distinguish who needs housing most is rooted in social good, it presents a depressing reality: housing is a universal need, and in constructing an algorithm to delineate risk, we are bound to leave some deserving people behind. Again this conjures up the eternal conflict between doing marginally better and asking whether it should be thought of as an improvement at all when there are still massive flaws in the system.
In the “Can Snow-Clearing be Sexist?” article, the notion of incomplete data comes into play: data that may have certain implications is used in public settings but may lack important distinctions, such as gender. This is another source of algorithmic inaccuracy, since the data may suggest a certain pattern outwardly while numerous intersectional analyses are missing, and thus whatever solution is proposed from that data is lacking.
Both excerpts highlight how systemic biases and lack of data can lead to the exclusion of marginalized groups in public systems and services. The first excerpt on gender bias in urban planning demonstrates how spaces like public transit, parks, and housing are often designed based on male experiences and needs. Things like zoning laws, public toilet allocation, and transit routes cater to traditional male roles and patterns of movement through cities. As a result, women’s needs for safety, accessibility, and care work accommodation are overlooked, restricting their access and convenience.
Similarly, the LA homelessness services excerpt shows how a supposedly objective coordinated entry system ended up perpetuating existing biases and barriers against the homeless population it aimed to serve. Because there was little affordable housing to match people to, it became more a surveillance tool than an effective solution. Extensive personal data collection also raised serious privacy issues given police access. While well-intentioned, both examples show systems based on limited perspectives and priorities that end up discriminatorily impacting disadvantaged groups.
I found the excerpts eye-opening in demonstrating the pervasiveness of male-centered design and hidden biases in public systems. Things like zoning codes, transit routes, and homelessness prioritization algorithms seem technical and neutral but in fact encode certain assumptions and value judgments. To create truly equitable, accessible public services, diverse lived experiences and qualitative narratives, not just quantitative data, need to be considered in design processes. Systems must be constantly re-evaluated to examine who is being underserved and how policies and environments can be reshaped to meet the needs of all citizens, especially the most vulnerable.
These readings are on different cases of potentially well-intentioned government interventions that failed to truly meet the needs of the people they serve. The first one, on Skid Row, includes the observation that homelessness is not a systems engineering issue, it's a carpentry issue. The second one talks about a lot of things; the favela relocation initiative in Brazil is what I'm going to focus on, specifically because it is contrasted with more successful projects in Vienna, ones that considered the real perspectives of the people they were serving. The Viennese homes were built in a way that specifically met the needs of those housed there, whereas in Brazil they erected buildings and tossed people into them without considering what it would be like to live there. This applies to the homelessness issue on Skid Row in that the social programs there are well-intentioned and are trying to collect the information necessary to help the people most in need first. But there is no consideration of what the effect of collecting and distributing that data might be, or of what to do with all of the other people who are not YET rated as highly vulnerable.
The reading delves deep into the sociopolitical evolution of the Skid Row neighborhood, providing insights into its historical transformation, the systemic issues, and the technologies applied to address the challenges of homelessness. The core technology discussed is the coordinated entry system, which can be seen as a large-scale data management and matchmaking service for unhoused individuals. While it serves a functional purpose, it simultaneously raises concerns about privacy, ethical treatment of vulnerable populations, and the potential for data misuse or misinterpretation. The “Match.com of homeless services” analogy suggests an algorithmic approach to pairing individuals with services. Behind this seemingly simple system lies a myriad of computational challenges – accurate data collection, efficient and fair allocation algorithms, and safeguards to protect sensitive personal information. It brings to the forefront the larger conversation about how technology is being used to address complex societal problems and whether it is, at times, a superficial solution to deeper-rooted issues. In our daily lives, we’ve seen similar tensions arise, from personalized advertising based on our online behaviors to facial recognition technologies employed in public spaces. As technology becomes increasingly intertwined with our lives, understanding and navigating these ethical dilemmas will become a daily challenge for many of us.
Both readings argue that the lack of data on marginalized groups puts them at a disadvantage.
The first reading highlights the challenges faced by the homeless population and the efforts to address their needs through the VI-SPDAT system. It emphasizes the struggles and personal stories of individuals experiencing homelessness, their interactions with social services, and the complexities of the VI-SPDAT survey used to assess their vulnerability.
The second reading discusses gender disparities in urban transport planning and housing policies. It highlights how traditional transport planning often neglects the specific needs of women, who tend to travel encumbered by shopping, children, and other responsibilities. It emphasizes the importance of considering gender-specific needs and collecting sex-disaggregated data in urban planning and policy-making.
It is impossible to establish a system that covers everyone's needs, but policy makers should still consider the needs of marginalized groups.
In reading the Automating Inequality chapter, the court cases were what appealed to my academic interests, but the whole chapter was interesting and saddening. Having to present yourself as in dire-enough straits to need housing but not “risky” enough to need supervision. A single impounded car putting you on the streets, and needing to surrender your cellphone or pay for a credit check to even have a chance at housing. Judges essentially blackmailing you to get out of their jurisdiction so you're someone else's problem. It's not controversial to say that America's social safety net is a failure, but I think it's striking just how much of a failure it is. I also wonder what it would be like if the Prosperity Gospel had never been so popularized, since I think this enduring association of money with moral character (which definitely existed before then too) prevents people from having the same concerns about abandonment that they do about the tax rate on the rich. Since the popularly presented thinking, “someday I might be a millionaire and I wouldn't want the government taking all my money,” is powered by a view that I'm a moral person and so I'll get what's coming to me, the thinking that “someday I might be unhoused and I wouldn't want the government to abandon me” has much less strength, since everyone tends to believe that they're a good (or at least not bad) person and that bad things don't usually happen to good people (while also underestimating the number of things out of our control). Regardless, I was struck by Monique's quote that “you get tired of being mentally, physically, and emotionally beat the hell down… There's three ways to go if you don't get housed: jail, institutions, or death.” It really pisses me off when I hear people (politicians or family members) discuss poverty as an issue of laziness, when it's so completely draining to even exist when society is actively fighting against your presence, much less if you're a person of color, where that's often the case even when you have housing and wealth, or if you don't speak English and your only hope lies in stacks of forms you can't understand.
I find it very interesting to point out the problems of algorithms and how little they focus on marginalized groups. Even when we talked last week about how algorithms do not judge based on race, we still realized that the data we put into these algorithms is what makes them pay too little attention to marginalized groups, since we mainly have data on the bigger and “important” groups of society. By doing this, we ignore several other groups in the name of being efficient and helpful. However, in being efficient, we are not considering other groups that could be involved, or even individuals who need help more than others. These inequalities make us ask: are algorithms really the solution to our problems? If we are just feeding them our own judgment and data, wouldn't the algorithms be just as bad? By pointing out these problems and being aware of them, I believe that we as computer scientists could be a bit better at solving these issues without having to rely on the power of technology and let algorithms do the work, especially when the algorithm is deeply unequal.
I think that the use of data to find weaknesses in how snowplowing affects gendered travel was a fascinating and useful application of algorithms and data. I find that a representation of data that visualizes bias often goes a long way toward fixing problems. The VI-SPDAT score, on the other hand, seemed to bring up a concerning use of data. Ultimately, we did not need to collect the data to know that there was a housing access and affordability issue in LA. Yet even after collecting the data and understanding the magnitude of the issue, the only solution has been to allocate a little more money and put the most “needy” into housing. Is there a point to doing all of this work if you do not actually act upon it? How is it possible that you can create initiatives and pour money into this kind of work, but then, when it comes time to actually fix the issue by providing more housing, the community with the wealth, power, and privilege is able to block it from happening? It makes my stomach churn to realize the steps the community has gone through not only to remove housing in the area, but to pretend to focus on and help individuals affected by historic unit destruction by giving the homeless access to a housing lottery with no intention of providing for all those suffering.
Both readings for today discussed the use of algorithms to analyze data about social problems without prioritizing the perspective of the marginalized groups who are most affected by these issues. The first reading, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor” by Eubanks, gives a detailed explanation of the housing problem the city of LA is facing. She discusses the use of the VI-SPDAT survey, which serves as a tool to determine which individuals are more likely to get housing. Something that I found really interesting was when Blasi said, “But homelessness is not a systems engineering problem. It's a carpentry problem.” I think that sometimes the approach to solving these social problems relies on technology when the basic solution is rather straightforward and intuitive. The problem goes beyond the practical solution; it involves many variables and factors, in this case both in the city of LA and among all the homeless individuals striving to get housing. For example, besides the housing issue, the criminalization of the poor is another social problem involved in this whole situation. Similarly, in “Invisible Women: Data Bias in a World Designed for Men,” Chapter 1: “Can Snow-Clearing be Sexist?”, we can see how public transportation planning in many countries doesn't take women and pedestrians into account when thinking about solutions for traffic and mobility.
I loved the chapter “Can Snow-Clearing be Sexist?” because it highlights the issue with subtle sexism/misogyny. I tend to think that almost everything is political, so when I read the chapter title I was considering issues like “time poverty” as the article phrases it (though I definitely did not have a name for this phenomenon like the chapter did) and the implicit labor of motherhood. This article unpacks a seemingly innocent joke and reveals the systemic harm of transportation infrastructure and cultural norms.
Spencer makes a great point that I think we've been poking at in class discussions. There is so much conversation around the ability of technology to advance humanity. But even when the VI-SPDAT score quantitatively categorized vulnerability among homeless people, there was a lack of legislative and structural change. What is the point of these tools producing “proof” if we're not even using them to improve conditions for homeless people and respond to the housing crisis?
It actually makes me think of the “You are not expected to understand this” reading we did last week. Within the field of CS there is a lot of emphasis on innovation and pioneering something incredible as a means of carving out a favorable reputation for one's own “competency,” and it seems that the more obscure and opaque an algorithm is, the more celebrated it is (as long as it “works”). But this very idea of something “working” does not consider whether it is 1) accessible to others and 2) has practical applications outside the vacuum of CS.
The Automating Inequality reading reminds me of other programs and algorithms that sound like they're going to do great things for society and help people, but then end up being used by the wrong people to cause harm. Similar to the other algorithms we've studied, the VI-SPDAT was created to help reduce homelessness and to help homeless people ranked with high need get temporary or permanent housing according to their needs. However, one major downfall of this algorithm is that it takes a lot of personal information from the applicant and leaves them open to law-enforcement scrutiny. Another downfall is that it counts prison as a housing solution, which seems very unfair. One point I did like from this reading is the question: if homelessness is like a disease or a natural disaster, why not focus on triage solutions? I completely agree, because why aggravate the situation when you can expect it to come and plan accordingly? Remember that we're all just humans wanting a sense of security and home.
The Automating Inequality reading discussed a well-intentioned but poorly executed initiative to find housing for the homeless population of Skid Row. Through decades of removal of low-income housing in the area, Skid Row has become a place where many find themselves unhoused. The area has mission/emergency shelter beds, as well as SRO housing for those struggling with mental health and/or addiction, but the number of people needing housing far exceeds the beds available, leaving 3,000-5,000 people on the streets every night. An algorithm, called VI-SPDAT (Vulnerability Index-Service Prioritization Decision Assistance Tool), was created to help appropriately allocate these beds and housing units to those in the greatest need. It prioritizes those who are at greatest risk of ending up dead, in an emergency room, or in a mental hospital, and offers them housing first. It does this through a series of questions, some rather personal, ranging from whether they have had recent thoughts of hurting themselves or others to their Social Security numbers. This may be justified by the fact that participation is voluntary. However, the way data is collected and the way decisions are made still fail to meet the needs of those the system is nominally devised to help.

Several barriers to even filling out the VI-SPDAT are illuminated by the story of Gary. Firstly, filling out the VI-SPDAT often requires a fair bit of travel: Gary had to travel 17 miles to an office where he could fill it out, and though this was accomplishable through public transit, unhoused folks without access to vehicles face a much more uphill battle in this department. Upon filling out the forms, he then learned that getting on the waiting list for one of the places he was otherwise eligible for required a 3-5 year verifiable rental history, something many unhoused folks understandably lack, considering they are unhoused. For those who have been unhoused the longest, this can be an additional barrier, because any history they do have may become harder to verify as time goes on, leaving them more likely to be trapped in that position. Furthermore, the rating system is rather opaque: one woman interviewed who did receive housing noted that she was never made aware of her score and to this day does not know why, three years later, people she knew in materially very similar circumstances have still not been given housing. Though this was not explicitly discussed, the need for a Social Security number also disadvantages unhoused folks without US documentation.

With all these things combined, even though the VI-SPDAT went out with a noble mission, it fails because an algorithm can only be as good as its data, and data on unhoused populations that privileges having a somewhat recent formal housing history, stable access to transportation, and legal US residency is unlikely to actually identify those most in need. This fundamental principle, that data which underrepresents certain (likely marginalized) groups is bad data, applies not just to VI-SPDAT but to all algorithms with any sort of human-facing outcomes. Good intentions do not cancel out inequitable results, and inequitable results are inevitable when the data does not facilitate full knowledge of a population, and in fact leaves out its most vulnerable members.
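Purely as an illustration of the score-then-rank logic described above, here is a minimal hypothetical sketch in Python. It is not the actual VI-SPDAT instrument; every field name, weight, and cutoff below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Hypothetical fields, loosely inspired by the kinds of questions described in the chapter
    name: str
    years_unhoused: float
    er_visits_last_year: int
    reported_self_harm: bool
    jail_counted_as_housing: bool  # the controversial "prison counts as housing" idea

def vulnerability_score(a: Applicant) -> int:
    """Toy scoring rule: more indicators of risk yield a higher score.
    All weights and caps are illustrative only."""
    score = 0
    score += min(int(a.years_unhoused), 5)      # cap the contribution of chronicity
    score += min(a.er_visits_last_year, 4)      # frequent ER use signals acute risk
    score += 3 if a.reported_self_harm else 0
    if a.jail_counted_as_housing:
        score -= 2                              # counting jail as "housing" lowers apparent need
    return max(score, 0)

def prioritize(applicants: list[Applicant], beds_available: int) -> list[Applicant]:
    """Rank by score and offer the limited beds to the highest-scoring applicants."""
    ranked = sorted(applicants, key=vulnerability_score, reverse=True)
    return ranked[:beds_available]

if __name__ == "__main__":
    pool = [
        Applicant("A", 6, 3, True, False),
        Applicant("B", 2, 0, False, False),
        Applicant("C", 4, 2, False, True),
    ]
    for person in prioritize(pool, beds_available=1):
        print(person.name, vulnerability_score(person))
```

Even in a toy version like this, the design choices, such as which questions carry weight and whether jail counts as "housing," determine who lands above the bed cutoff, which is exactly the concern several of the posts here raise.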
The VI-SPDAT process essentially makes homeless people demonstrate extreme vulnerability by sharing intimate details about themselves, making private information accessible to those who can hurt them, in exchange for a sliver of a chance of obtaining housing if an algorithm deems them worthy. Regardless of how much data is collected, the sentiment of what the reading calls “professional middle-class apathy,” in combination with policies that erase the communities people can actually afford to live in, prevents any improvement in living conditions for homeless people.
So many people in LA say they want to provide more housing and resources for homeless people, yet refuse to allow that housing to be built in their own communities and don't want to fund any increase in resources. By outsourcing the allocation of housing and resources to coordinated entry/VI-SPDAT, people can distance themselves from the human impacts of inaction (as the reading said), whereas those actively working to find people housing, and the people who need it, face a multitude of difficulties because of this system.
The article about snow-clearing reminded me of the curb-cut effect: by redesigning transport systems that are not gender-neutral and taking into account the effects of infrastructure on women's needs, we can create infrastructure that benefits everyone.
I think both readings do a very effective job of showing the importance of transparency and intersectional thinking when it comes to data collection and analysis. “Can Snow-Clearing be Sexist?” shows how every aspect of our society is influenced, either minimally or greatly, by our internal biases and history. Nothing can be brushed off as unimportant or irrelevant when analyzing data during decision-making. Snow-clearing schedules in cities all over the world greatly favor male-dominated highway commutes while bike paths and walkways are not prioritized. But when you take the concerns of marginalized groups into consideration, everyone benefits.
Sadly, like others have mentioned, these data-driven approaches to combating issues in our society don't fundamentally change how these insufficient systems work. The fact that pedestrians, who were largely women, were considered more often in snow-clearing schedules doesn't change the fact that nearly every aspect of society is also gendered, and that many countries are still shifting toward a largely inefficient car-dominated transportation structure. The fact that the VI-SPDAT was more effective than other coordinated entry systems doesn't change the reality of police harassment of houseless people and the diminishing amount of affordable housing. This isn't to say data-driven approaches are futile, but rather than being treated as THE solution to these issues, they should be used as evidence of the need to change these systems fundamentally. Real, permanent work can't be done without acknowledging the issues that cannot be solved with just numbers.
While reading both articles (honestly this is not very related to algorithms, but rather to biased design that is taken as the norm), I kept thinking about the design of cities back home and whom they favour (in terms of zoning, infrastructure, and even elevation). The idea of inequity being reproduced physically through policy and design decisions is one I have not spent much time thinking about. As the article on snow clearing states, the transportation infrastructure within cities particularly reinforces the importance of what we deem productive work. In this conception, any work that does not bring money into the home is given less importance, relegated to a realm of relative inefficiency, and necessarily requires more effort to travel to and from. The idea of “time poverty” was also one I hadn't encountered before, so that was super cool.
Thinking about the zoning of Skid Row while reading the social services article, I was also thinking about how zoning policy reflects and has magnified inequalities back home, specifically in Mumbai. There are certain parts of South Bombay that are protected because they contain UNESCO World Heritage sites, so large parts of the city cannot have new high-rises built to accommodate its burgeoning population. These parts of the city also happen to be among the richest. Invariably, the land that is then bought and repurposed by builders into apartments is land where slums have been erected. The richer regions of the city (which is actually almost all of South Bombay in comparison to the region called the Suburbs) are also higher in elevation. Mumbai has extremely heavy rainfall during the monsoon, and lower-lying regions, a large swathe of the Suburbs specifically, are extremely susceptible to flooding during the monsoon months. So the properties (and wealth therein) of richer people in South Bombay remain intact while the monsoon ravages the wealth of those living in low-lying areas, who are disproportionately of lower socioeconomic strata.

One more really peculiar thing I have noticed, although I don't think it is a direct product of policy, is that newly built higher-end apartments in Mumbai tend not to have basement parking. Instead, the first several floors of the buildings are reserved for resident and guest parking. I was talking to my parents, and here is our theory. As mentioned earlier, newer buildings are being built on land that was home to poorer people. The government requires developers to give these people housing, so they build what is referred to as a vertical slum for them to reside in on some part of the land they purchased. These vertical slums are extremely cramped, and there are genuine sanitation and health concerns in their designs; however, they serve the purpose of clearing the land for the development of the apartments. The parking floors are then designated so that all residential floors start above the top floor of the nearby vertical slums, so that residents do not have to look into those homes (this is so classist and wild, and yet it is just accepted as normal). There is literally a vertical stratification of class.
The coordinated entry system is another algorithmic band-aid solution to a problem that goes far deeper than the assignment of resources. I think it's a really positive thing that people at highest risk receive priority for housing, but this sort of system only works when you have the capacity to handle everyone who needs help. The coordinated entry system is better at prioritizing, sure, but for unhoused people who are lower or middle risk, there is functionally no difference between the system using the VI-SPDAT and the old system in which they had to put their name on a giant nebulous waiting list. It's frustrating that even here there are obvious pitfalls that don't really align with the interests of groups trying to help the unhoused. There is a definite path to a cycle between prison and homelessness encouraged by the algorithm's treatment of prison as housing or shelter. I have to wonder how such a consideration got past activists and officials who clearly have a passion for helping the homeless.
I appreciated that this reading also offered some insight into concepts of modern-day surveillance. It's interesting how little now has to be done to make yourself a target for increased surveillance. As discussed in the reading, old targets had done much to make themselves known or to connect themselves to controversial ideas or movements. Now, since everything we do is tracked and followed down to the finest detail, surveillance organizations become passively aware of such activity without even having to do the legwork. Even the increased surveillance involves much less work, as it's simply closer tracking and investigation of data being collected passively. I imagine this greatly increases the effective output of surveillance bodies without much more effort.
Chapter 3 of Automated Inequality was shocking and disturbing for me to read, partly because once every few minutes my mind kept going back to the list of questions asked to evaluate housing needs. These questions are very personal; yet, the homeless feel compelled to give some answers just for the sake of having a roof over their head. It was evident through some of the conversations in the chapter how desperate they felt and how willing they were to divulge any information asked for just to have a place to stay. Since they also have to sign an agreement allowing for these responses to be sent to any homeless service providers, it’s highly possible that the data can be misused for purposes that do not necessarily benefit them in any way. The fact that these providers have access to such a database seems very corrupt and obviously raises a lot of privacy concerns. It also feels to me like they are taking advantage of people who don’t really have a choice in many ways.
Furthermore, the data gathered to evaluate risks and housing needs are not necessarily processed in an ethical way by the algorithm. It seems extremely unfair that a person with a college degree and a wealth of job experience who just happened to suffer from an economic recession in his industry was regarded as low risk and undeserving of housing assistance. This is equivalent to cases in college admissions where students from a middle-class background cannot afford to pay tuition but are also not “financially disadvantaged” enough to receive sufficient financial aid to fund their education. Although the coordinated entry system was created with the best of intentions, there remain many flaws in the logic of the algorithm and many ethical concerns about how certain groups of deserving people can be left out of the equation. There is also not enough action taken on the government's part to support the system, just as expressed in the reading: “But in the absence of sufficient public investment in building or repurposing housing, coordinated entry is a system for managing homelessness, not solving it.”
The Automating Inequality reading continued the theme of algorithms and data collection about people and demographic statistics as a possible site for discrimination and for cutting off access to necessities like housing.
The piece from Invisible Women was an interesting example of how ingrained gender is in our society, and of the levels at which inequality can be identified. The idea that a certain pattern of snow clearing would impact men and women differently doesn't even seem too far-fetched to me, but I can understand how it might raise eyebrows for those who are not used to this type of feminist inquiry. I am curious about how tech regulation and lawmaking in the US connect to this idea.