Sunday, February 22, 2026

Humans in the Loop (2024) Movie

This blog has been assigned by Prof. Dr. Dilip Barad as part of our critical exploration of film theory and digital culture. In this context, I engage with Humans in the Loop (2024), directed by Aranya Sahay, as a film that brings artificial intelligence into conversation with rural life, indigenous knowledge, and invisible labour. Set in Jharkhand, the narrative follows Nehma, an Adivasi woman working in an AI data-labeling center, and uses her story to question the idea that technology is neutral or autonomous. Through its portrayal of digital labour, algorithmic bias, and power structures, the film invites us to rethink the relationship between human knowledge and technological systems in the contemporary world.

Humans in the Loop (2024)


Information

Title: Humans in the Loop
Director: Aranya Sahay
Year of Release: 2024
Country: India
Language: Hindi (with Jharkhand tribal dialect influences)
Duration: Approx. 74 minutes
Genre: Independent / Art-house / Social Drama
Setting: Rural Jharkhand, India
Protagonist: Nehma – an Adivasi woman and AI data labeler
Daughter: Dhaanu (Dhanū) – Nehma’s 12-year-old daughter, caught between village and city identities
Son: Guddu – Nehma’s younger child
Husband: Ritesh – figures in the custody conflict and represents patriarchal and social authority
Supervisor / Manager: Workplace authority who enforces client-driven AI labeling rules
AI Team Members: Rural data labelers working in the AI center
Central Conflict: Clash between Nehma’s indigenous ecological knowledge and corporate AI labeling protocols
Key Workplace Tasks: Image tagging, pest/crop classification, skeletal tracking, bounding box annotation
Important Dialogue: “AI is like a child. If you teach it wrong, it will learn wrong.”
Major Themes: Algorithmic bias, invisible digital labour, indigenous knowledge, gender precarity, digital capitalism
Social Context: References to the tribal custom of Dhuku and the custody struggle
Cinematic Style: Minimal background score, natural soundscape, contrast between forest landscapes and digital workspace
Significance: Highlights the invisible labour behind AI systems and critiques technological neutrality




Q. Critically analyze how Humans in the Loop represents the relationship between technology (AI) and human knowledge. 




Introduction


In Humans in the Loop, director Aranya Sahay relocates artificial intelligence from metropolitan tech spaces to a rural Adivasi village in Jharkhand. The film follows Nehma, an indigenous single mother who works in a data-labeling center where she “teaches” AI how to classify the world. Through this narrative, the film dismantles the myth of technological neutrality. It argues that AI systems are not autonomous, objective entities but are shaped by human decisions, cultural assumptions, and institutional power structures.

The relationship between technology and human knowledge in the film is therefore not cooperative but hierarchical and conflictual. By dramatizing the tension between Nehma’s ecological knowledge and corporate AI protocols, the narrative exposes algorithmic bias as culturally situated and highlights epistemic hierarchies that determine whose knowledge counts in technological systems. Through the lens of representation, ideology, power relations, and Apparatus Theory, the film reveals that AI functions as an ideological apparatus that reproduces dominant worldviews.


Algorithmic Bias as Culturally Situated, Not Purely Technical


The most significant example of culturally embedded bias appears in the “caterpillar” scene. Nehma refuses to label a caterpillar as a pest, explaining that it consumes decayed leaf matter and supports ecological balance. Her supervisor dismisses her reasoning and insists: “If the client has said that it is a pest, then it is a pest. Your job is to label it.”

This moment is crucial because it reveals that bias enters AI at the level of categorization, not computation. The algorithm itself does not decide that the caterpillar is harmful; it merely learns from human-provided labels. The category “pest” is not scientifically neutral—it reflects an agricultural ideology that prioritizes crop yield over ecological complexity. The AI model becomes a vehicle for institutional assumptions.
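
To make this concrete, here is a minimal, purely illustrative Python sketch, my own and not drawn from the film or any real labeling platform, of how a supervised system simply memorises whatever category the annotation protocol supplies. The species names and label scheme are invented assumptions; the point is only that the model learns the category, never the ecological reasoning behind or against it.

```python
# Illustrative sketch (not from the film): a model inherits the annotator's categories.
from collections import Counter, defaultdict

# Each record is what a data labeler produces: an image ID, an extracted feature,
# and the label the client's protocol dictates. The caterpillar rows are tagged
# "pest" because the instruction sheet says so, not because the claim was tested.
annotations = [
    {"image": "img_001", "species": "leaf_caterpillar", "label": "pest"},
    {"image": "img_002", "species": "aphid",            "label": "pest"},
    {"image": "img_003", "species": "earthworm",        "label": "not_pest"},
    {"image": "img_004", "species": "leaf_caterpillar", "label": "pest"},
]

# "Training" here is just memorising the majority label per species;
# the model has no access to the ecological reasoning Nehma offers.
label_counts = defaultdict(Counter)
for row in annotations:
    label_counts[row["species"]][row["label"]] += 1

def predict(species: str) -> str:
    """Return the most frequent human-assigned label for this species."""
    counts = label_counts.get(species)
    return counts.most_common(1)[0][0] if counts else "unknown"

print(predict("leaf_caterpillar"))  # -> "pest": the category, not the ecology, is learned
```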

From the perspective of film theory, this is a question of representation. Representation involves selecting and framing aspects of reality in ways that produce meaning. In AI systems, labeling performs a similar function: it frames reality through predefined categories. When the caterpillar is labeled as a pest, the AI internalizes a worldview that simplifies ecological interdependence into binary distinctions—harmful versus beneficial. The system’s “bias” thus reflects the cultural and economic priorities embedded in its training data.

This dynamic parallels real-world AI scholarship. Researchers such as Safiya Umoja Noble have argued that algorithms reflect existing social inequalities because they are trained on historically skewed data. In the film, the AI art generator scene reinforces this idea. When participants search for “Indian tribal,” the generated image resembles a Native American stereotype rather than an Indian Adivasi woman. The mismatch demonstrates that global datasets privilege certain representations while marginalizing others. Bias here is geopolitical, rooted in data dominance rather than technical malfunction.

Thus, the narrative makes clear that algorithmic bias is not an accidental glitch but a cultural artifact of the systems and institutions that produce it.


Epistemic Hierarchies: Whose Knowledge Counts?


Beyond exposing bias, Humans in the Loop highlights epistemic hierarchies—structures that determine which forms of knowledge are considered legitimate. Nehma possesses deep ecological and experiential knowledge derived from living within her environment. She believes that stones, forests, and insects are interconnected forms of life. However, within the AI workplace, her understanding is subordinated to corporate instruction.

When her supervisor corrects her labeling decision, the hierarchy becomes explicit. The authority of the client overrides local ecological expertise. This reflects what philosopher Miranda Fricker calls epistemic injustice, where individuals from marginalized communities are discredited as knowers. Nehma’s knowledge is not evaluated on its accuracy but on her position within institutional power structures.

The film also foregrounds the invisibility of data labelers. Millions of workers like Nehma perform repetitive annotation tasks that enable global AI systems, yet their intellectual contributions remain unacknowledged. AI is marketed as self-learning and autonomous, obscuring the human labor behind it. This invisibility reinforces epistemic hierarchy: users see technological output but not the marginalized human input that shapes it.

The narrative thus demonstrates that technological systems privilege institutional and corporate knowledge while rendering indigenous and experiential knowledge subordinate. AI becomes a site where global economic power dictates what counts as truth.


Representation, Ideology, and Power Relations


The conflict between Nehma and her workplace reflects broader ideological struggles. The agricultural AI project seeks efficiency and precision, categorizing weeds, pests, and crops into rigid classifications. Nehma’s worldview, by contrast, emphasizes relational balance and ecological interdependence. The tension between these perspectives illustrates how ideology shapes technological design.

Drawing on Louis Althusser’s theory of ideology, we can interpret the AI system as an ideological state apparatus that naturalizes dominant economic priorities. The client’s instruction—label the caterpillar as a pest—functions as a directive that transforms economic ideology into technological truth. The system does not merely reflect reality; it produces a version of reality aligned with institutional interests.

Michel Foucault’s insight that knowledge and power are intertwined is also relevant. Classification is a form of power because it structures how reality is perceived and acted upon. In the film, once the AI model learns to identify the caterpillar as a pest, it may guide machines to eliminate it. Thus, representation leads to material consequence. The act of labeling is not symbolic; it is politically charged.


Apparatus Theory: AI as Ideological Machine


Apparatus Theory, developed by Jean-Louis Baudry and Christian Metz, argues that cinema itself functions as an ideological machine. The cinematic apparatus shapes how spectators perceive reality while concealing its mechanisms. Humans in the Loop mirrors this structure in its portrayal of AI.

The film visually contrasts the expansive forest landscape with the confined digital interface of the labeling screen. Nature appears complex and fluid, while the computer interface reduces reality to boxes, outlines, and binary options. The repetitive clicking, timed CAPTCHA tests, skeletal labeling tasks, and quality reviews expose the hidden infrastructure behind AI systems.

By revealing the mechanics of data labeling, the film demystifies technological authority. It shows that AI’s apparent intelligence is built upon repetitive human mediation. Just as cinema hides cameras and editing to create seamless illusion, AI hides human labor to create the illusion of autonomy. Through this parallel, the film suggests that both cinema and AI are apparatuses that produce ideological meaning while masking their foundations.


Conclusion


Humans in the Loop offers a profound critique of the relationship between technology and human knowledge. It demonstrates that algorithmic bias is culturally situated, emerging from human categorization and institutional priorities rather than purely technical flaws. Through the conflict between Nehma’s indigenous ecological knowledge and corporate AI directives, the film exposes epistemic hierarchies that privilege institutional authority over marginalized knowledge systems.

By engaging with concepts of representation, ideology, and power relations, and through the framework of Apparatus Theory, the narrative reveals AI as an ideological machine that reflects and reinforces dominant worldviews. Technology in the film is neither neutral nor independent. It is a mirror of human decisions—economic, cultural, and political.

Ultimately, the film leaves us with a critical question: if AI learns from us, whose knowledge will shape its intelligence—and whose will be erased?


Q. Examine how the film visualizes invisible labour and what it suggests about labour under digital capitalism. 




Introduction: Making the Invisible Visible


Humans in the Loop intervenes in contemporary representations of artificial intelligence by shifting attention from glamorous innovation to the hidden labour that sustains AI systems. Instead of engineers or tech CEOs, the film centers Nehma, an Adivasi woman working as a data labeler in rural Jharkhand. Through this choice, the film visualizes invisible digital labour and interrogates how digital capitalism commodifies human perception while rendering workers culturally unseen.

The film does not treat AI as autonomous intelligence; rather, it repeatedly reminds us that “AI is like a child. If you teach it wrong, it will learn wrong.” This line becomes central not only to technological ethics but also to labour politics. If AI learns from humans, then the workers who train it are foundational—yet paradoxically invisible.


Visual Language: Repetition, Constraint, and Mechanized Time


The film’s visual grammar emphasizes repetition and confinement. The camera lingers on Nehma’s face as she draws bounding boxes around objects, traces skeletal outlines of moving bodies, and clicks through agricultural images. The mechanical sound of the keyboard replaces dramatic music. Timers count down. Screens glow in dimly lit rooms.

In one early workplace scene, the supervisor instructs:
“Forty seconds per task. The form will lock automatically.”

This line encapsulates digital capitalism’s regulation of time. Labour is measured in seconds, productivity in micro-tasks. The ticking timer visually dramatizes what Marx described as the commodification of labour power: time itself becomes a unit of exchange.

From a Marxist film theory perspective, the repetitive visual rhythm reflects alienation. Nehma does not see the final outcome of her work; she only produces fragments—labels, outlines, tags. The product of her labour is abstracted into a global AI system beyond her control. The film’s static framing reinforces this detachment. The digital interface reduces complex ecological realities into binary categories: pest / not pest, weed / crop, correct / error.
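
As a rough illustration of what such a "fragment" looks like in practice, the snippet below sketches a hypothetical annotation record, loosely modelled on common bounding-box formats such as COCO. Every field name and value is an assumption introduced for illustration, not data from the film or any actual platform; it simply shows how a worker's judgment is reduced to a small, machine-readable unit awaiting review upstream.

```python
# Hypothetical example of the fragment a single bounding-box task produces.
import json

annotation = {
    "image_id": "field_plot_0427.jpg",
    "annotator_id": "worker_113",        # the person is reduced to an ID
    "category": "pest",                  # the client-defined category
    "bbox": [312, 148, 64, 41],          # [x, y, width, height] in pixels
    "time_spent_seconds": 38,            # tracked against the per-task quota
    "qa_status": "pending_review",       # awaits a quality check upstream
}

print(json.dumps(annotation, indent=2))
```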


Emotional Labour and Ethical Conflict


Although the tasks appear mechanical, the film reveals their emotional depth. In the now-famous caterpillar scene, Nehma resists labeling an insect as a pest. She explains:

“That is not a pest, madam. It does not cause harm. Yes, it eats leaves, but it eats the rotten parts so the plant remains healthy.”

Her supervisor responds coldly:

“Listen, Nehma ji, if the client has said that it is a pest, then it is a pest. Your job is to label it. Just do that. Don’t overthink.”

This exchange exposes the emotional contradiction of digital labour. Nehma’s ecological knowledge—rooted in lived experience—is dismissed in favor of corporate instruction. Her cognitive interpretation becomes valuable only when aligned with institutional objectives. When she diverges, her insight is marked as error.

From a Marxist cultural theory lens, this reflects the commodification of subjectivity. Nehma is not merely selling physical labour; she is selling perception itself. Yet she does not control how that perception is used. The film therefore reveals that digital labour extracts not only time but also ethical judgment and cultural knowledge.


Cultural Valuation of Marginalised Work


A key question the film raises is: who is recognized as a contributor to AI?

Globally, AI is associated with elite programmers and urban tech hubs. Yet the film shows that its foundation rests on rural, marginalized workers performing annotation tasks. The supervisor’s dismissive tone—“Have you come here to start a family? Go and do your work.”—illustrates how labourers are reduced to function rather than acknowledged as knowledge holders.

Through the lens of Representation and Identity Studies, Nehma’s identity as an Adivasi woman complicates dominant narratives about technological production. She embodies indigenous ecological knowledge while participating in global digital infrastructure. However, her contribution remains socially invisible.

The AI art generator sequence deepens this critique. When workers input “Indian tribal,” the system produces Westernized or Native American imagery. The gap between lived identity and algorithmic output highlights cultural misrecognition. The very workers who correct datasets are themselves misrepresented within them. This suggests that marginalised labour sustains technology while being erased from its cultural imagination.


Digital Capitalism and Class Structure


The data labeling center operates as a microcosm of global capitalism. Clients—unseen but powerful—define categories. Supervisors enforce compliance. Workers execute micro-tasks under surveillance. The chain of authority is transnational and hierarchical.

The line “If the client says it is a pest, then it is a pest” encapsulates this structure. Truth is determined by economic power, not ecological knowledge. This dynamic echoes Marxist critiques of class relations: those who control capital define reality.

The film also highlights precarity. Nehma’s probation period, custody battle, and economic vulnerability make refusal risky. Her labour is essential yet insecure. Digital capitalism integrates marginalized workers into global supply chains while denying them authorship and recognition.


Apparatus Theory: Cinema Revealing the Digital Apparatus


Drawing on Apparatus Theory, we can see how the film parallels cinema and AI as ideological machines. Just as cinema traditionally hides cameras and editing to produce seamless illusion, AI hides the data labeling workforce to produce the illusion of autonomy.

However, Humans in the Loop reverses this invisibility. By foregrounding clicking sounds, timers, skeletal annotations, and quality reviews, the film exposes the apparatus behind AI. The screen becomes a site of ideological production. Reality is not discovered—it is categorized.

The contrast between forest landscapes and rectangular digital frames visually encodes this tension. Nature appears fluid and relational; the interface is rigid and binary. Through this contrast, the film critiques technological reductionism and reveals how classification systems mirror societal hierarchies.


Empathy, Critique, and Transformation


The film invites empathy through close-up shots and quiet realism. We witness Nehma’s exhaustion, her maternal anxieties, and her moral conflict. Yet the narrative does more than humanize; it critiques structural invisibility.

When Nehma says, “AI is like a child. If you teach it wrong, it will learn wrong,” the statement resonates beyond technology. It suggests that systems reflect the values of those in power. If marginalized knowledge is ignored, AI will reproduce that erasure.

The film therefore pushes viewers toward transformation. It asks us to reconsider who we imagine when we think of technological innovation. It reassigns authorship of AI from distant corporations to rural women drawing bounding boxes on screens.


Conclusion


Humans in the Loop visualizes invisible labour by centering the repetitive, ethically complex work of data labeling. Through restrained cinematography and the integration of original dialogue, the film reveals how digital capitalism commodifies human perception while obscuring its dependence on marginalized workers.

Using Marxist and Cultural Film Theory, we see how labour is fragmented, alienated, and regulated by global capital. Through Representation and Identity Studies, we understand how identity intersects with labour to challenge assumptions about technological contribution. And through Apparatus Theory, the film exposes AI as an ideological machine that constructs reality while hiding its human foundation.

Ultimately, the film insists on a radical recognition:

Behind every “intelligent” system stands human labour—often rural, female, indigenous, and unseen.

By making this labour visible, the film not only invites empathy but also demands a rethinking of value, authorship, and justice in the age of digital capitalism.


Q. Analyze how film form and cinematic devices (camera techniques, editing, sequencing, sound) convey the philosophical concerns about digital culture and human-AI interaction. 




Introduction: Form as Philosophy


In Humans in the Loop, film form is not decorative; it is philosophical. Through camera framing, editing rhythms, sequencing, and sound design, the film constructs a visual argument about digital culture and human–AI interaction. Rather than explaining its themes through exposition, the film embeds its critique within cinematic language.

Using frameworks from Structuralism, Film Semiotics, Formalism, and Narrative Theory, we can see how the film organizes meaning through oppositional visual codes: nature versus interface, fluidity versus classification, relational knowledge versus algorithmic reduction. The film’s aesthetic structure mirrors its intellectual concern—how digital systems transform human perception into data.


Natural Imagery vs Digital Spaces: A Semiotic Opposition


From a Structuralist perspective, meaning in cinema emerges through binary oppositions. Claude Lévi-Strauss’ structural logic—nature/culture, organic/mechanical—becomes central to understanding the film’s visual grammar.

Throughout the narrative, the forest is filmed in wide, fluid shots, often with handheld camera movement that follows Nehma walking freely through landscapes. Natural light dominates these sequences. Ambient sounds—wind, insects, footsteps—create a sense of openness and continuity. The forest functions semiotically as a sign of interconnectedness and relational knowledge.

In contrast, the AI labelling center is framed through static compositions and rectangular framing devices. The computer screen literally boxes reality into geometric shapes. Bounding boxes, skeletal outlines, and cropped images fragment the world into analyzable units. Fluorescent lighting replaces natural sunlight. The diegetic soundscape shifts to keyboard clicks and countdown timers.

This visual opposition constructs a structural code:

  • Forest = fluid, relational, ecological worldview

  • Interface = rigid, classificatory, instrumental worldview

Through Film Semiotics, we understand that these are not neutral settings but sign systems. The forest signifies indigenous epistemology; the screen signifies digital capitalism. The viewer reads this opposition unconsciously through repetition.


Camera Techniques: Framing, Proximity, and Confinement


From a Formalist lens, camera techniques shape viewer experience and meaning. The film frequently uses close-up shots of Nehma’s face during labelling tasks. These close-ups create intimacy but also claustrophobia. Her concentration is intense, yet the narrow framing visually traps her within the digital interface.

In contrast, forest sequences often employ medium-long shots and tracking shots, allowing Nehma’s body to move freely within the frame. The spatial expansion visually encodes autonomy.

This shift in camera distance is not accidental. According to David Bordwell and Kristin Thompson’s narrative theory, spatial organization directs viewer cognition. When the camera compresses space in office scenes, viewers feel the mechanical constraint of labour. When the frame opens in forest scenes, viewers experience breathing room and continuity.

Thus, cinematography itself conveys the philosophical tension between human subjectivity and technological containment.


Editing & Sequencing: Rhythm of Labour vs Rhythm of Nature


Editing patterns further reinforce thematic concerns. The AI lab scenes are structured through repetitive cuts—screen, face, cursor, timer, supervisor, back to screen. This cyclical sequencing mimics the loop structure of machine learning itself.

The repetition produces what Formalist critics call temporal monotony—a rhythm that reflects alienated digital labour. The editing pace is measured and mechanical.

By contrast, forest scenes unfold in longer takes, allowing time to stretch. There is minimal cutting. Silence lingers. The pacing feels organic.

This contrast shapes viewer perception of labour. In the lab, time is fragmented and accelerated. In nature, time is continuous and experiential. The film thereby visualizes a key philosophical question of digital culture: does algorithmic logic fragment lived experience into calculable units?


Sound Design: From Silence to Mechanization


Sound plays a critical ideological role. The film avoids dramatic background scores during labelling sequences. Instead, we hear:

  • Keyboard clicks
  • Mouse movements
  • System notifications
  • Countdown timers

The absence of emotive music renders labour stark and procedural. Sound becomes a marker of industrialization.

In forest scenes, however, sound is immersive—birds, wind, footsteps, subtle breathing. At key moments, spiritual chanting and heartbeats emerge, reinforcing relational cosmology.

From a Semiotic perspective, these sonic textures function as signifiers of competing epistemologies:

  • Mechanical sound = algorithmic culture

  • Natural sound = embodied, ecological culture

The viewer does not simply understand the thematic tension intellectually; it is experienced sensorially.


Narrative Structure: The Loop as Form


The title itself—Humans in the Loop—is a structural metaphor. The narrative is organized cyclically:

  • Home → Work → Conflict → Return to nature → Work again

  • Label → Correction → Quality check → Feedback

This looping mirrors machine learning feedback systems. The narrative structure therefore imitates digital logic while simultaneously critiquing it.
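
A schematic sketch, written as Python logic of my own devising rather than anything depicted in the film, can make this feedback structure explicit under simple assumptions: the worker annotates, a reviewer enforces the client's definition, and the correction flows one way back into the protocol until dissent disappears.

```python
# Illustrative human-in-the-loop cycle (assumed structure, not the film's code).
def annotate(item, protocol):
    """The worker applies the current protocol to one item."""
    return protocol.get(item, "unknown")

def quality_check(label, client_truth):
    """The reviewer enforces the client's definition, not the worker's."""
    return label == client_truth

def training_loop(items, protocol, client_truth, max_rounds=3):
    for _ in range(max_rounds):
        corrections = {}
        for item in items:
            label = annotate(item, protocol)
            if not quality_check(label, client_truth[item]):
                corrections[item] = client_truth[item]  # worker's judgment overwritten
        if not corrections:
            break                                       # the loop closes when dissent disappears
        protocol.update(corrections)                    # feedback flows one way: client -> protocol
    return protocol

protocol = {"caterpillar": "not_pest"}                  # the worker's initial, ecological judgment
client_truth = {"caterpillar": "pest"}                  # the category the client has fixed
print(training_loop(["caterpillar"], protocol, client_truth))  # -> {'caterpillar': 'pest'}
```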

From a Structuralist lens, the “loop” functions as a narrative code. It signifies repetition, control, and feedback—but also the possibility of intervention. Nehma’s refusal to label the caterpillar as a pest interrupts the loop. The interruption becomes narratively and philosophically significant.


Identity, Representation, and Postcolonial Form


Through Cultural and Postcolonial Film Theory, the film’s form can be read as a critique of representational hierarchies. The AI art generator scene visually dramatizes misrecognition. When “Indian tribal” produces Westernized imagery, the screen becomes a site of epistemic distortion.

The camera alternates between the digital output and Nehma’s reaction. This shot–reverse-shot structure emphasizes the gap between lived identity and algorithmic representation.

By foregrounding this discrepancy, the film questions who controls visual archives and datasets. It exposes how digital culture often encodes colonial visual dominance. The aesthetic strategy thus aligns with postcolonial critiques of global media systems.


Viewer Experience: Embodiment, Empathy, and Critical Distance


Formalist theory argues that aesthetic choices guide emotional response. Here, minimalism and restraint create a dual effect:

  • Empathy, through intimate close-ups and natural lighting

  • Critical distance, through repetitive mechanical editing

The viewer oscillates between identification with Nehma and awareness of structural constraint. This oscillation mirrors the film’s philosophical stance: human agency exists within technological systems, but it is limited.

The aesthetic experience itself becomes pedagogical. The audience feels the monotony of labour and the constriction of digital space. The form makes the philosophy experiential rather than abstract.


Apparatus & Digital Culture


Although primarily structural and formalist, the film also resonates with Apparatus Theory. Just as classical cinema hides its machinery to produce ideological illusion, AI hides its human infrastructure to appear autonomous.

However, this film reverses that invisibility. By foregrounding the act of drawing boxes and tracing skeletal forms, it exposes the apparatus of AI production. The viewer becomes aware of mediation.

Thus, the film critiques not only AI but also digital spectatorship itself. Screens do not simply display reality—they construct it.


Conclusion: Form as Critique of Digital Modernity


Through its careful orchestration of camera framing, editing rhythm, sequencing patterns, and sound design, Humans in the Loop transforms film form into philosophical argument. The opposition between forest and interface, fluidity and fragmentation, relational sound and mechanical clicking encodes a broader critique of digital culture.

Using Structuralism and Film Semiotics, we see how binary visual codes organize meaning. Through Formalist and Narrative Theory, we understand how pacing, framing, and repetition shape viewer experience of labour and technological control. Through Postcolonial and Cultural Film Theory, we recognize how representation intersects with identity and power.

Ultimately, the film suggests that digital culture restructures perception itself—turning lived experience into data, identity into category, and knowledge into commodity. Yet through its aesthetic choices, the film also preserves the possibility of interruption.

In revealing the tension between human subjectivity and algorithmic logic, Humans in the Loop demonstrates that cinema can make visible what digital culture seeks to render invisible.


References:


Alonso, D. V. “Imagining AI Futures in Mainstream Cinema: Socio-technical Narratives and Social Imaginaries.” AI & Society, 2026. https://doi.org/10.1007/s00146-026-02880-7

Anjum, N. “Aranya Sahay’s Humans in the Loop and the Politics of AI Data Labelling.” The Federal, 2026. https://thefederal.com/films/aranya-sahay-humans-in-the-loop-oscar-adivasi-data-labelling-jharkhand-ai-tribal-216946

Barad, Dilip. “Humans in the Loop: Exploring AI, Labour and Digital Culture.” Blog, Jan. 2026. https://blog.dilipbarad.com/2026/01/humans-in-loop-film-review-exploring-ai.html

Bazin, André. What Is Cinema? Vol. 1, University of California Press, 1967. https://www.ucpress.edu/book/9780520242278/what-is-cinema

Bordwell, David, and Kristin Thompson. Film Art: An Introduction. 12th ed., McGraw-Hill Education, 2019.

Cave, Stephen, et al. “Shuri in the Sea of Dudes: The Cultural Construction of the AI Engineer in Popular Film, 1920–2020.” Feminist AI: Critical Perspectives on Algorithms, Data, and Intelligent Machines, Oxford University Press, 2023, pp. 65–82. https://doi.org/10.1093/oso/9780192889898.003.0005

Deleuze, Gilles. Cinema 1: The Movement-Image. Translated by Hugh Tomlinson and Barbara Habberjam, University of Minnesota Press, 1983.

“Film Theory.” The Year’s Work in Critical and Cultural Theory, 2025. https://doi.org/10.1093/ywcct/mbaf004

Frías, C. L. “The Paradox of Artificial Intelligence in Cinema.” Cultura Digital, vol. 2, no. 1, 2024, pp. 5–25. https://doi.org/10.23882/cdig.240999

Göker, D. “Human-like Artificial Intelligence in Indian Cinema: Cultural Narratives, Ethical Dimensions, and Posthuman Perspectives.” International Journal of Cultural and Social Studies, vol. 11, no. 2, 2025, pp. 1–10. https://doi.org/10.46442/intjcss.1799907

Haris, M. J., et al. “Identifying Gender Bias in Blockbuster Movies through the Lens of Machine Learning.” Humanities and Social Sciences Communications, vol. 10, 2023, article 94. https://doi.org/10.1057/s41599-023-01576-3

Indian Express Editorial. “Humans in the Loop: Technology, AI and Digital Lives.” The Indian Express, 2026. https://indianexpress.com/article/opinion/columns/humans-in-the-loop-aranya-sahay-technology-ai-digital-10391699/

McDonald, Kevin. Film Theory: The Basics. 2nd ed., Routledge, 2023.

Sahay, Aranya, director. Humans in the Loop. 2024.

Shepherdson, Charles, John Simpson, and Annette Utterson, editors. Film Theory: Critical Concepts in Media and Cultural Studies. Vols. 1–4, Routledge, 2004.

Sui, Z., and S. Wang. “Dogme 25: Media Primitivism and New Auteurism in the Age of Artificial Intelligence.” Frontiers in Communication, vol. 10, 2025, article 1659731. https://doi.org/10.3389/fcomm.2025.1659731

Vighi, Fabio. Critical Theory and Film: Rethinking Ideology through Film Noir. Bloomsbury Academic India, 2019.

Yu, Y. “The Reel Deal? An Experimental Analysis of Perception Bias and AI Film Pitches.” Journal of Cultural Economics, vol. 49, 2025, pp. 281–300. https://doi.org/10.1007/s10824-025-09534-4

