Humans in the Loop

 


TASK 1: AI, Bias & Epistemic Representation

Introduction: Technology and Indigenous Knowledge


Prompt: Critically analyse how Humans in the Loop represents the relationship between technology (AI) and human knowledge, examining algorithmic bias as culturally situated and epistemic hierarchies within technological systems.


Humans in the Loop (2024), directed by Aranya Sahay, is not simply another film about artificial intelligence. Rather, it functions as a philosophical exploration of whose knowledge is considered legitimate, whose voices are ignored, and how power operates through technologies that claim to be neutral and objective. The film is set in Jharkhand, a region closely associated with Adivasi communities, and follows the story of Nehma, an Oraon tribal woman who works as a data annotator for AI systems. Through her experiences, the narrative reveals what scholars call epistemic injustice—a situation where certain forms of knowledge are excluded from structures that determine authority and expertise. In this way, the film presents a conflict between different knowledge systems and exposes the ideological assumptions hidden within AI technologies.


As Alonso (2026) notes in discussions of contemporary AI narratives in cinema, stories about artificial intelligence often reflect deeper cultural ideas about progress, rationality, and technological development. What makes Humans in the Loop distinctive is that it reveals these assumptions through the everyday life of a woman who faces multiple, intersecting forms of marginalization along the lines of gender, indigeneity, class, and geography.


Algorithmic Bias as a Cultural Phenomenon


In the film, a major conflict arises when Nehma recognizes a gap that cannot simply be fixed by improving data or adjusting algorithms. This gap exists between the rigid logic of the AI system she works for and the complex, interconnected understanding of nature held by her Oraon community. While labeling images of plants, animals, and landscapes using fixed algorithmic categories, she repeatedly encounters things that cannot easily be reduced to simple labels.


For instance, a plant that holds medicinal, spiritual, and ecological value within her community must be reduced to a single scientific classification in the system. Similarly, a forest boundary defined by collective memory, seasonal patterns, and lived experience is transformed into a rigid digital coordinate. Through these examples, the film suggests that algorithmic bias should not be seen merely as a technical problem that can be solved through better programming. Instead, it is presented as a culturally shaped process, where computational systems reflect particular philosophical assumptions about what kinds of knowledge are valid and which ones are ignored.


Epistemic Hierarchies: Determining Valid Knowledge


Aranya Sahay develops the theme of knowledge and authority with subtlety throughout the film. Nehma is not portrayed as powerless or naïve. Instead, she appears thoughtful and aware of the limitations of the AI system she works within. In several scenes, the camera focuses on moments when she pauses before labeling images. These pauses are not shown as confusion but as thoughtful hesitation. They indicate her awareness that the categories imposed by the system cannot fully capture the meanings embedded in her lived experiences. In this sense, her hesitation becomes a quiet form of resistance based on knowledge rather than emotion.


The design of the data-annotation centre can also be interpreted through theoretical ideas about ideology and representation, often associated with scholars such as Stuart Hall. The workplace appears highly standardized: glowing computer screens, identical interfaces, workers separated by headphones, and the repetitive rhythm of typing. This carefully designed environment reflects what film scholars David Bordwell and Kristin Thompson describe as mise-en-scène, where visual elements work together to produce meaning. The sterile environment reinforces the AI system’s claim to objectivity.


However, the film repeatedly interrupts this technological environment with scenes from the forest and village, spaces filled with sound, texture, and cultural memory. Interpreted through the theoretical ideas of Gilles Deleuze, particularly his concept of the movement-image, this editing pattern creates a contrast between two epistemological worlds: the simplified, standardized logic of algorithms and the complex, interconnected knowledge of indigenous life.


The Film as Ideological Critique


One of the most powerful aspects of Humans in the Loop is that it refuses to offer a clear solution. Nehma does not fix the system by persuading her supervisors, nor does she overthrow it through technological expertise. By avoiding this typical narrative resolution, the film rejects the common liberal-humanist storyline often seen in mainstream AI films, where an individual hero reforms technology from within.


Instead, the conflict remains unresolved. The divide between indigenous knowledge systems and algorithmic classification continues to exist, leaving viewers with a sense of discomfort. This open ending reflects the real situation faced by many marginalized communities who depend on the global digital economy for income while simultaneously experiencing the marginalization of their cultural knowledge.


Some reviewers, including writers from The Indian Express (2025), describe the film as presenting a clash between artificial intelligence and traditional belief systems. However, the film suggests something deeper: the imbalance between these knowledge systems is not temporary but structurally embedded in the technological frameworks that dominate contemporary digital culture.


Conclusion


The significance of Humans in the Loop lies in its rejection of the idea that algorithmic bias is simply a technical error. By grounding the narrative in the life of an Adivasi woman whose ecological knowledge is repeatedly sidelined by the AI system she helps maintain, the film argues that bias is a predictable outcome of cultural and ideological structures that determine which forms of knowledge are recognized as authoritative.


When interpreted through Apparatus Theory, the film becomes a powerful ideological critique, not only of artificial intelligence as a technological tool but also of the deeper epistemological hierarchies that shape modern technological systems. As Barad (2026) notes, the film reveals what digital capitalism often hides: the invisible labour that sustains AI systems, the cultural compromises demanded of marginalized communities, and the epistemic inequalities embedded within the foundations of contemporary technological society.


TASK 2: Labour & the Politics of Cinematic Visibility

Introduction: Revealing Hidden Labour


Prompt: Examine how the film visualizes invisible labour and what it suggests about labour under digital capitalism, including how its visual language represents labelling work and the emotional experience of labour.



One defining feature of digital capitalism is its ability to hide the labour that supports it. Behind AI technologies such as recommendation systems, image recognition tools, and language models exists a large network of workers who perform tasks like data labeling, content moderation, and algorithm training. Much of this work is carried out in the Global South, often by women and economically marginalized groups. Despite its importance, this labour remains largely invisible behind the smooth interfaces of modern technologies.


Humans in the Loop, directed by Aranya Sahay, directly challenges this invisibility. One of the film’s main political goals is to expose the labour that digital capitalism normally conceals.


Visual Representation of Labour


The data-annotation centre in Jharkhand is presented through a carefully constructed visual environment. The workspace appears structured and minimal: rows of computers, identical chairs, workers wearing headphones, and the constant rhythm of keyboard typing.


This visual uniformity reflects the image that the global technology industry tries to project—efficient, organized, and neutral. At the same time, it subtly reveals the depersonalization involved in digital labour. Workers appear standardized, almost interchangeable. Through this design, the film shows how the architecture of the workplace itself helps sustain the illusion of technological neutrality while hiding the human effort behind AI systems.


Emotional Labour in Digital Work


The film also highlights the emotional dimension of data labeling. Nehma’s work requires constant judgment and interpretation. She must repeatedly choose between her own knowledge and the categories required by the system.


Using the concept of emotional labour, originally developed by sociologist Arlie Hochschild, the film shows how workers must regulate their feelings as part of their job. Through close-up shots of Nehma’s face, the film captures subtle emotional reactions as she confronts categories that do not match her understanding of the world.


Rather than dramatic expressions, the film uses quiet and restrained performances to show the emotional burden of this work. These moments reveal that data annotation is not simply mechanical labour but an activity that involves ethical and psychological tension.


Labour, Class, and Global Digital Capitalism


Nehma’s job also reflects broader class inequalities within digital capitalism. The data-annotation centre appears as a small node within a global economic network. The clients who assign the work are never shown directly. Instead, they appear only through digital instructions and productivity metrics on computer screens.


This absence is deliberate. It reflects the real structure of the data-annotation industry, where workers in regions like Jharkhand or parts of Africa are connected to major technology hubs such as Silicon Valley through complex subcontracting systems.


These networks make it difficult to identify responsibility and obscure the relationships between labour and profit. As a result, the workers who generate value remain distant from those who benefit from it.


Empathy, Critique, and Transformation


The film simultaneously invites empathy and critical reflection. Viewers develop emotional connections with Nehma through scenes showing her family life and responsibilities as a mother. These personal moments humanize digital labour, which is often discussed only in abstract economic terms.


At the same time, the film encourages viewers to question the inequalities embedded in digital capitalism. Rather than providing a simple solution, the narrative remains open-ended, forcing audiences to confront the contradictions between technological progress and human cost.


TASK 3: Film Form, Structure & Digital Culture

Introduction: Film Form as Meaning



Prompt: Analyze how film form and cinematic devices (camera techniques, editing, sequencing, sound) convey philosophical concerns about digital culture and human-AI interaction.

In film theory, form is not simply decoration. Scholars such as David Bordwell and Kristin Thompson argue that cinematic techniques—camera movement, editing, framing, and sound—actively create meaning.


In Humans in the Loop, these formal elements work together to produce a philosophical reflection on digital culture and human–AI interaction.


Mise-en-Scène: Two Contrasting Worlds


The film visually contrasts two spaces: the forest and the data-annotation centre. The forest scenes are rich with natural colours, textures, and layered compositions. People appear connected to the environment.


In contrast, the annotation centre is dominated by artificial lighting and glowing computer screens. Workers appear isolated and absorbed into the technological system.


Through this visual contrast, the film presents two different ways of understanding knowledge: one based on relational ecological experience, and the other based on algorithmic classification.


Cinematography and Knowledge


The film’s cinematography also reflects these contrasting epistemologies. Forest scenes often use handheld cameras that move gently with the characters, creating a sense of exploration and openness.


In contrast, the annotation centre is filmed with static camera positions and rigid framing. This visual rigidity mirrors the fixed logic of algorithmic systems.


Editing and Temporal Structure


Editing plays a crucial role in the film’s argument. Scenes of the forest are frequently followed by scenes of digital labeling. This contrast shows how complex lived realities are reduced to simplified digital categories.


The pacing also differs. Forest scenes unfold slowly, allowing viewers to observe details, while workplace scenes move quickly, reflecting the speed and pressure of digital labour.


Sound Design


Sound design further reinforces this contrast. The forest is filled with layered natural sounds such as birds, water, and wind. The annotation centre, however, is dominated by mechanical noises like typing and computer hums.


At key moments, near silence accompanies Nehma’s hesitation, emphasizing the limitations of algorithmic systems in recognizing her knowledge.


Conclusion: Aesthetic Critique of Digital Culture


Through its careful use of cinematic form—visual design, camera movement, editing, and sound—Humans in the Loop presents a powerful critique of digital culture. The film demonstrates that AI systems are not neutral reflections of reality but culturally shaped structures that prioritize certain forms of knowledge while excluding others.


By revealing these hidden assumptions, the film encourages viewers to question the technological narratives that dominate contemporary society.



Alonso, David V. "Imagining AI Futures in Mainstream Cinema: Socio-Technical Narratives and Social Imaginaries." AI & Society, 2026, https://doi.org/10.1007/s00146-026-02880-7.


Anjum, Nootan. "Aranya Sahay's Humans in the Loop and the Politics of AI Data Labelling." The Federal, 2026, thefederal.com/films/aranya-sahay-humans-in-the-loop-oscar-adivasi-data-labelling-jharkhand-ai-tribal-216946.


Apparatus: Film, Media and Digital Cultures of Central and Eastern Europe. Wikipedia, Wikimedia Foundation, accessed 15 Feb. 2026, en.wikipedia.org/wiki/Apparatus_(journal).


Barad, Dilip. "Humans in the Loop: Exploring AI, Labour and Digital Culture." Blog post, Jan. 2026, blog.dilipbarad.com/2026/01/humans-in-loop-film-review-exploringai.html.


Bazin, André. What Is Cinema? Vol. 1, University of California Press, 1967.


Bordwell, David, and Kristin Thompson. Film Art: An Introduction. 12th ed., McGraw-Hill Education, 2019.


Cave, Stephen, et al. "Shuri in the Sea of Dudes: The Cultural Construction of the AI Engineer in Popular Film, 1920–2020." Feminist AI: Critical Perspectives on Algorithms, Data, and Intelligent Machines, Oxford University Press, 2023, pp. 65–82, https://doi.org/10.1093/oso/9780192889898.003.0005.


D'souza, Sahir Avik. "'Humans in the Loop': A Thoughtful Film About the Human Intelligence Behind AI." The Quint, 5 Sept. 2025, thequint.com/entertainment/bollywood/humans-in-the-loop-review-ai-theatrical-release.


Deleuze, Gilles. Cinema 1: The Movement-Image. Translated by Hugh Tomlinson and Barbara Habberjam, University of Minnesota Press, 1986.


"Film Theory." The Year's Work in Critical and Cultural Theory, 2025,https://doi.org/10.1093/ywcct/mbaf004.


Frías, Carlos L. "The Paradox of Artificial Intelligence in Cinema." Cultura Digital, vol. 2, no. 1, 2024, pp. 5–25, https://doi.org/10.23882/cdig.240999.


Göker, Deniz. "Human-Like Artificial Intelligence in Indian Cinema: Cultural Narratives, Ethical Dimensions, and Posthuman Perspectives." International Journal of Cultural and Social Studies, vol. 11, no. 2, 2025, pp. 1–10, https://doi.org/10.46442/intjcss.1799907.


Haris, M. J., et al. "Identifying Gender Bias in Blockbuster Movies through the Lens of Machine Learning." Humanities and Social Sciences Communications, vol. 10, 2023, p. 94, https://doi.org/10.1057/s41599-023-01576-3.


"Humans in the Loop (Film)." Wikipedia, Wikimedia Foundation, retrieved 15 Feb. 2026, en.wikipedia.org/wiki/Humans_in_the_Loop_(film) .


Indian Express Editorial. "Humans in the Loop Explores How AI Clashes with Traditional Belief Systems." The Indian Express, 3 May 2025, indianexpress.com/article/express-sunday-eye/humans-in-the-loop-explores-how-ai-clashes-with-traditional-belief-systems-9980634/.


McDonald, Kevin. Film Theory: The Basics. 2nd ed., Routledge, 2023.


Mehrotra, Karishma. "Human Touch." Fifty Two, 2022, fiftytwo.in/story/human-touch/.


Sahay, Aranya, director. Humans in the Loop. Storiculture, 2024.

