Queering Digital Technologies: Rethinking Gender in the Digital Age
- BHATTACHARYA ADRITA AMIT 2333119
- May 4
- 7 min read
It is hard to know for certain what the twenty-first century will be remembered for. Will it be known as the era of breakneck technological advancement, or as the period when global cultures began to radically rethink and blur the boundaries of gendered categorisation, or something else entirely? Modern institutions would have us believe that STEM, being highly “rational” and “empirical,” is insulated from ideology — and that it simply happens to be the case that most positions of authority in technological innovation are occupied by men, or those upholding masculinist ideals.
Yet, like most revolutionary occurrences throughout history, technology carries the potential to disrupt and challenge existing social hierarchies. The world wars saw women entering the workforce in unprecedented numbers, challenging long-held beliefs about gendered divisions of labour. The Industrial Revolution, too, laid the groundwork for the emergence of first-wave feminist movements. At the same time, these moments in human history also reinforced gender norms in subtle and lasting ways — evidenced by the fact that gender continues to be a powerful, structuring force in contemporary society.

Today, it remains to be seen how technology will ultimately reshape the institution of gender, but its influence is undeniable. Gender has long resisted neat definitions; its meaning has, throughout time, been fluid and contested. Gendered norms, however, have a chokehold on society and culture that is arguably unmatched by any other force. Contemporary scholars, following Judith Butler’s seminal interventions in gender theory, have increasingly disentangled the concept of gender from biological sex - although this disconnection remains highly contentious in mainstream discourse.
A truly genderless society may well be an unattainable ideal, given how deeply gender is woven into the fabric of social organisation. So long as that remains true, it is inevitable that the technologies we build - especially those trained on human data - will reflect and reinforce the biases embedded within the societies that produce them. Increasingly, we have come to find that systems and models uphold the gendered status quo in ways both subtle and explicit. The infamous case of Amazon’s AI recruiting tool, which systematically favoured male candidates by learning from historically biased employment data, offers a stark example of how such technological biases translate into real-world harm.
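To make the mechanism concrete, consider a deliberately tiny sketch of how such bias is learned - a hypothetical toy reconstruction in Python using scikit-learn, not Amazon’s actual system or data. When past hiring decisions are biased, a model trained on them picks up gendered words as negative signals:

```python
# Toy illustration (hypothetical data): a classifier trained on biased
# historical hiring outcomes learns the word "women" as a negative signal.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "java developer chess club captain",
    "java developer women's chess club captain",
    "python engineer hackathon winner",
    "python engineer women's coding society hackathon winner",
]
hired = [1, 0, 1, 0]  # historically biased outcomes used as training labels

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The token "women" receives a negative weight, so any CV containing it
# is scored lower - the same behaviour reported in Amazon's case.
print(model.coef_[0][vectoriser.vocabulary_["women"]])  # negative value
```

Nothing in the code “decides” to discriminate; the bias arrives entirely through the training labels, which is precisely what makes it so easy to miss.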
In this context, it becomes all the more necessary to embed gender and queerness sensitisation into technological curricula, helping to ensure that future technological solutions are socially responsible and just.
The Man and the Machine?
In her groundbreaking work, “A Cyborg Manifesto”, Donna Haraway proposed that the future of feminism might not lie in returning to ‘nature’, but in embracing the cyborg - a hybrid of machine and organism, boundary-blurring, and deeply unsettling to institutions based on binaries. Haraway’s cyborg offered a radical vision of the posthuman, making the subversive claim that the human is not fixed, and neither, therefore, are human-made categorisations of gender. It refuses the strict classifications between human and machine, organic and synthetic, male and female.

In critical theory circles, an understanding of the queer goes beyond identity. Queer Theory questions the very categorisations that one has come to take for granted - male and female on a rudimentary level, normal and deviant in the larger scheme of concepts. It challenges the notion that there is only one “correct” way to present, to live in a body, to desire, to relate to technology, or to exist in society. In this sense, it can become a tool to examine systems - including but not limited to technological ones - that are built to enforce conformity in seemingly harmless ways. It can help us ask several important questions: Who is this system designed for? Whom does it exclude?
Foundational structures may seem innocuous to the casual observer if not viewed critically through the lens of marginalisation, but they can be value-laden, with ideologies built into them. Take the example of Robert Moses, the urban planner behind many of New York’s parkways, working decades before the Civil Rights Movement. According to the well-known account, his overpasses were designed to be too low to let buses (which were largely used by people of colour) pass under them, thereby contributing to the race-based segregation of the city.
As technology develops, it invariably becomes an inextricable part of human lives. Inequitable access to, and non-inclusion in, technological frameworks can therefore lead to disparities in human competencies.
Decoding Encoded Gender
Despite ongoing debates about the nature and validity of gender as a social construct, it is typically encoded in databases as a boolean value. That is to say, it can only take two possible inputs - true or false, 0 or 1, male or female - and it is often mandatory to fill in. This may not seem to be the case, especially when one considers that many online forms now offer more than two options for gender identification, sometimes even an open text field. However, as Rena Bivens’ 2017 article shows, the seemingly progressive 2014 software update that gave Facebook users up to 58 identification options did not change the underlying logic: binary gender is too important to Facebook’s advertising clients. Ultimately, in the backend, non-binary users are reclassified into binary categories for the sake of simplification.
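As a minimal sketch of the pattern Bivens describes - hypothetical, but representative of countless real schemas - consider a user table in which gender is a mandatory boolean:

```python
# Hypothetical schema illustrating the common pattern: gender stored as a
# non-nullable boolean, forcing every user into exactly one of two values.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        id        INTEGER PRIMARY KEY,
        name      TEXT NOT NULL,
        is_female BOOLEAN NOT NULL  -- true or false only, and mandatory
    )
""")
```

Whatever options the frontend displays, a backend built like this has no valid value for a non-binary user: they must be reclassified as 0 or 1 before the row can even be saved.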
Katta Spiel’s article in the ACM Digital Library recommends the following steps to make digital infrastructure more inclusive:
Protect gendered information: Gender should be considered personal and private information.
Minimise gendered data: Disclosure should not be mandatory and should instead be on a need-to-know basis.
Refactor existing databases: To accommodate and account for gender fluidity and non-conformity (see the sketch after this list).
Educate upcoming developers on gender: For thoughtful and sensitive modelling of gender (where it is necessary at all).
Update existing textbooks: STEM courses often treat the gender binary as the default, thereby reifying it.
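A hypothetical refactoring along these lines - one sketch among many possible designs - might make gender optional and self-described, and store it separately from core account data so that access can be limited to a need-to-know basis:

```python
# Hypothetical refactored schema: gender is optional, free-text,
# and kept in a separate, access-restricted table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL             -- no gender column in core data
    );
    CREATE TABLE gender_info (         -- separate table: need-to-know access
        user_id          INTEGER REFERENCES users(id),
        self_description TEXT,         -- optional free text, not an enum
        pronouns         TEXT          -- e.g. 'they/them'; may be NULL
    );
""")
```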
A better understanding of gender can help sensitise STEM students and equip them to interrogate the assumptions behind the algorithms they train, the data they collect, and the interfaces they design. Importantly, this requires more than just adding some gender categories to a dropdown list. It requires a paradigm shift that
Does not treat gender as a sorting or targeting tool, commodifying or instrumentalising it
Remains aware of how gender operates as a structuring force in real life, entailing biases that systems should actively counteract with protections and safeguards.
The Tyranny of the Normal
Paisley Currah and Tara Mulqueen’s study of how airport security and biometrics respond to transgender bodies - bodies that defy the expectations of binary gender on which security systems are often built - reveals key insights about the invasive nature of treating gender as a “known known” in the Rumsfeld Matrix.
In 2010, the TSA began employing advanced imaging technology in full-body scanners, which flagged atypically gendered bodies as potentially carrying threatening hidden devices. Transgender people were scrutinised, interrogated, and thoroughly investigated if their body scan did not align with their gender presentation. This happened because the security system in question was built on the naturalisation of binary gender as a key immutable characteristic determining what “normal” bodies should look like.
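A deliberately simplified, hypothetical sketch of this logic - not the TSA’s actual software - makes the structural problem visible: the operator selects one of exactly two body templates, and any deviation from the selected template is flagged as an anomaly:

```python
# Hypothetical illustration: only two templates exist, so a body that
# fits neither cleanly is flagged no matter which one is selected.
TEMPLATES = {
    "male":   {"chest": 0.2, "hips": 0.3},
    "female": {"chest": 0.6, "hips": 0.6},
}

def flag_anomalies(scan: dict[str, float], selected: str,
                   tolerance: float = 0.15) -> list[str]:
    """Return the body regions that deviate from the binary template."""
    template = TEMPLATES[selected]  # no option outside "male"/"female"
    return [region for region, value in scan.items()
            if abs(value - template[region]) > tolerance]

print(flag_anomalies({"chest": 0.6, "hips": 0.3}, "male"))    # ['chest']
print(flag_anomalies({"chest": 0.6, "hips": 0.3}, "female"))  # ['hips']
```

The system cannot represent such a body as unremarkable; being flagged is built into the data model itself.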

This normalisation and naturalisation of a certain kind of human body has historically impacted not only gender non-conforming people but also cisgender women. Throughout much of medical history, the male body was treated as the default standard of health, anatomy, and physiology, while female bodies were seen as variations or even deviations from this norm, framing women as incomplete or defective men. Even well past the rise of modern medicine, clinical trials, drug development, and anatomical studies overwhelmingly focused on male bodies. This led to serious consequences: conditions like heart disease were studied and diagnosed based on how they appeared in men, and pain reported by women was often dismissed as “emotional” or “psychosomatic”. History thus teaches us that treating one particular type of body as the norm can lead to systemic lapses in extremely critical areas.
Even today, despite increasing efforts to include female bodies in medical research, there remains a tendency to treat certain physiological patterns as natural and others as unhealthy. If health technologies are built on data drawn from a narrow demographic, the consequences can be far-reaching. For instance, many period-tracking apps that claim to offer predictive insights and pre-diagnoses are based not on reliable clinical trials but on a single, idealised hormonal cycle. Users whose symptoms do not align with this model - such as those on birth control medication, or those with naturally irregular cycles - may be flagged for conditions like PCOS without their medical context being accurately considered. Attempting to standardise bodily functions without accounting for variability can therefore replicate and maintain the problems that have long plagued medical history - and, in broader applications of data-driven technologies, analogous problems of cultural representation.
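A hypothetical sketch of this failure mode - not any specific app’s algorithm - shows how a predictor hard-coded around a single “ideal” cycle turns ordinary variation into a red flag:

```python
# Hypothetical illustration: a tracker built around one idealised
# 28-day cycle flags sustained deviation as potential pathology.
IDEAL_CYCLE_DAYS = 28
TOLERANCE_DAYS = 3

def naive_pcos_flag(cycle_lengths: list[int]) -> bool:
    """Flag the user if most recorded cycles deviate from the ideal."""
    irregular = [days for days in cycle_lengths
                 if abs(days - IDEAL_CYCLE_DAYS) > TOLERANCE_DAYS]
    return len(irregular) > len(cycle_lengths) / 2

# A user on birth control, or with naturally irregular cycles, is
# flagged even though nothing may be clinically wrong:
print(naive_pcos_flag([35, 24, 38, 31, 22]))  # True
```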
Fighting Fire With (Classi)fier
The question therefore is this: should technology work to reflect and aid in the perpetuation of biases inherent to human society? Or should it, instead, be used to challenge and dismantle societal tendencies that actively marginalise and harm people? We see many technological entities that are based on the former. We also see that it is entirely possible to use technological tools to do the latter.
Shhor AI, founded by Aindriya Barua, is an NLP-based system developed to tackle casteist and queerphobic hate speech in India’s complex digital context. It responds to the lack of tools equipped to handle low-resource vernacular languages and the unique challenges of detecting and comprehending online communication in India. Hate speech in the Indian context often appears in code-mixed varieties and uses strategic substitutions to slip past traditional moderation systems. It is frequently multimodal as well, moving beyond text to images, memes, and videos. Shhor AI is trained on real, crowd-sourced instances of such evasive online hate, allowing it to recognise and respond to the culturally and linguistically unique nuances of hate speech in India.
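To give a sense of the general shape of such a pipeline - this is an illustrative sketch only, not Shhor AI’s actual code, and the model name below is hypothetical - one might fine-tune a multilingual transformer on crowd-sourced, code-mixed examples and serve it through a standard text-classification interface:

```python
# Illustrative sketch using the Hugging Face transformers library.
# "your-org/code-mixed-hate-detector" is a hypothetical checkpoint.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="your-org/code-mixed-hate-detector",
)

# Romanised Hindi-English code-mixing with a character substitution
# ("kh@tra") - the kind of evasive input traditional moderation misses.
print(classifier("yeh log toh society ke liye kh@tra hai"))
# e.g. [{'label': 'hate', 'score': 0.93}]
```

The crucial ingredient is not the architecture but the training data: real, community-sourced examples of how hate actually circulates in Indian digital spaces.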

In Conclusion
It is important to remember that the foundations of the technological world were not always so rigidly normative. Ada Lovelace, often regarded as the world’s first computer programmer, imagined the potential of machines to move beyond mere numerical calculation. And in the early history of computing, programming work was considered clerical and was thus deemed “women’s work”.
Future technologists must equip themselves not merely with technical skills but with the critical attitudes demanded of the architects of a new, digital world - attitudes that can effectively address the flaws of contemporary human society instead of merely transferring them onto a new medium. The technologies of tomorrow must emerge from the needs of today’s margins, so that the world we build acknowledges and engages with the full complexity of human experience.