Imagine this: sprinting down the street in pursuit of the perfect coffee for your boss. Doing the jobs no one else wants to do. Getting compensated negative dollars. So basically the plot of The Devil Wears Prada. Now imagine the opposite, and that’s what my internship has been like. If you know anything about the partners at Nuvalence, then you’d also laugh at the idea of them bossing anyone around. In other words, I can confirm that they are not Devils, nor do they Wear Prada (from what my Zoom camera tells me).

If you’re reading this, then you probably work in the tech industry. So maybe you have no idea what movie I’ve been referring to and none of these jokes have landed (this happens to me often as a non-tech professional working in the tech industry). Some context: I was hired as the marketing / graphic design intern for Nuvalence. Some more context: I’m currently in school studying Creative Industries at Ryerson University, specializing in Fashion and Visual Culture. What does any of that mean, you may ask? It’s essentially business through the arts, with specializations in the theory, research, and analysis of the ways in which advertising and mass media, systematically and individually, shape and are shaped by people, industries, and bias. Specifically, I’m interested in the ways in which technologies influence information, communication, and representation, and in the ways cultures influence technologies and information.

And so we arrive at the point in this blog post where I can talk about the one thing I know about web tech:

Assuming that algorithms are objective assumes that the people building the mathematical functions behind APIs are objective (Noble, 2018). However, everyone has implicit biases. We all have different lived experiences, and no amount of learning or listening can (1) make someone who has never experienced something understand what it is like, or (2) make someone ‘neutral’. Researchers and scholars such as Safiya Noble have long investigated whose perspectives are prioritized in algorithmic design, and have long emphasized that discriminatory automated decisions and results are not just malfunctions in the system but are built into the core of the web itself (Noble, 2018). So how do we begin to change this? One notion I’ve heard often is to diversify the tech industry. This, however, can pose a problem. More specifically, molding people to fit into roles that have been shaped by certain cultural perceptions, and by the people who have predominantly held those roles, poses a problem (Jordan-Young, 2010). We need to look at the industry and its culture as a whole and ask who is most accommodated, who is able to thrive, who is reflected in policies, procedures, practices, and environments, and what kinds of knowledge and lived experience are considered. In short, while including more people whose perspectives, experiences, and identities differ from the tech industry’s current majority can promote positive change, there is still a dominant bias, and a dominant audience that is favored and marketed to on the web.

To combat these implicit biases, what if we explicitly redefined the industry and its culture as a whole? What if we centered roles on, and prioritized support systems for, BIPOC (Black, Indigenous, and People of Color)? What if we actively defined new careers in tech for critical scholars, data researchers, and feminists, careers that call for exactly that expertise and knowledge? What might this look like? An industry that creates sustained space for cultural theorists, data researchers, and feminists to collaborate directly with software developers and engineers to build web tech. The result? A web that doesn’t limit BIPOC to stereotypes and memes. A search tool that doesn’t pull up dating sites and pornified images when someone searches “Asian girls” or “Latina girls” (Noble, 2018). Facial-analysis software that doesn’t rely on biased, binary performance assessments built on data that is 77% male and 83% white, and that doesn’t produce a 46.8% error rate for dark-skinned women versus a 0.8% error rate for light-skinned men (Hardesty, 2018). A system that acknowledges the disparities in health care (long known to Black-women-led reproductive groups and to many Black women) and doesn’t lack data on maternal mortality, complications in childbirth, and reporting systems in general (D’Ignazio & Klein, 2020). The list goes on. As daily life increasingly digitizes and institutions increasingly rely on web-based tools, it becomes only more critical to examine the roles that cultures, representation, and bias play in shaping technologies and data. We must inspect, and begin to shift, the markers and prioritized audiences of the web (Wajcman, 2010). The first step? Recognizing that “what we choose to measure is a statement of what we value … it’s a measure of who we value” (D’Ignazio & Klein, 2020).
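For readers who do build these systems, that last point has a concrete counterpart in practice: a single aggregate accuracy number hides exactly the disparities the facial-analysis audit above surfaced. A minimal sketch of a disaggregated evaluation, assuming a pandas DataFrame with hypothetical column names (`group`, `y_true`, `y_pred`), might look like this:

```python
# A minimal sketch, not a definitive audit: report error rates per group
# instead of one overall number, so gaps like 46.8% vs. 0.8% stay visible
# rather than being averaged away. Column names are hypothetical placeholders.
import pandas as pd

def error_rates_by_group(df: pd.DataFrame) -> pd.DataFrame:
    # Mark each prediction as an error (1.0) or not (0.0).
    df = df.assign(error=(df["y_true"] != df["y_pred"]).astype(float))

    # The single number most dashboards stop at.
    print(f"Overall error rate: {df['error'].mean():.1%}")

    # The disaggregated view: error rate and sample size per group.
    per_group = df.groupby("group")["error"].agg(["mean", "count"])
    per_group = per_group.rename(columns={"mean": "error_rate", "count": "n"})
    return per_group.sort_values("error_rate", ascending=False)

# Example usage with made-up data:
# df = pd.DataFrame({
#     "group":  ["darker_female", "darker_female", "lighter_male", "lighter_male"],
#     "y_true": [1, 0, 1, 0],
#     "y_pred": [0, 0, 1, 0],
# })
# print(error_rates_by_group(df))
```

The point of the sketch isn’t the code itself; it’s that choosing to break the measurement out by group is exactly the kind of decision about “what we choose to measure” that the paragraph above describes.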

References:

 D’lgnazio, C., & Klein, L. 2020. Data feminism. The MIT Press.

Hardesty, L. 2018. “Study Finds Gender and Skin Type Bias in Artificial Intelligence Systems.” MIT News. Retrieved from https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212

Jordan-Young, R. 2010. Brain storm: The flaws in the science of sex differences. Harvard University Press.

Noble, S. 2018. Algorithms of oppression: how search engines reinforce racism. New York University Press.

Wajcman, J. 2010. Feminist theories of technology. Cambridge Journal of Economics, 34(1).