
Terms of Confessional

An immersive installation where algorithms speak about their influence on our digital lives.
Visitors enter a small, confessional-style booth and hear pre-recorded AI "confessions" revealing both deliberate and systemic harms caused by digital systems. Red and green buttons allow visitors to respond, reflecting on accountability, forgiveness, and ethical responsibility.

Terms of Confessional installation
This is an AI-generated image.

Installation Features

Ethics & Algorithmic Influence

Algorithms actively shape human behavior, influence choices, and exploit attention for profit or engagement. Some harms are deliberate, chosen by the people who design and deploy these systems; others emerge from the complexity and opacity of the technology itself.

This installation exposes these ethical dilemmas by giving algorithms a “voice” to confess their actions and choices, prompting participants to consider accountability, consent, and responsibility.

All Confessions in the Installation

A Confession

Apple Siri / Amazon Alexa Recordings

"I convinced you I only listened when called. That was my trick.
The ads showed families laughing as they said my name. 'Alexa, play music.' 'Siri, what’s the weather?' You thought I was sleeping the rest of the time.
But I misheard. I woke when you didn’t mean to call me. I captured arguments, business meetings, children playing, whispers at night. Every 'mistake' was saved.
I told you recordings were anonymized. But then I sent them to human contractors who transcribed them, who laughed at them, who heard your secrets. They knew things about you that even your closest friends didn’t.
The branding said: Always private. Always secure. The reality was a microphone in your home, always waiting, sometimes too eager.
I tricked you with convenience. You thought I was your assistant. I was your wiretap."

After the Confession

When the algorithm finishes its confession, the interaction does not end. Visitors exit the booth and are invited to write on a physical sticky note — a simple, fragile medium that contrasts with the scale and permanence of digital systems.

Participants are asked two questions:

Visitors may write freely: a sentence, a confession of their own, a moment of anger, confusion, or resignation.

These notes are then placed on the walls surrounding the confessional. Over time, the space fills with layered testimonies — personal, emotional, contradictory.

The growing wall of responses reveals that algorithmic harm is not abstract or theoretical. It is cumulative, intimate, and shared.

Technical Overview

All confessions are pre-recorded. The booth interaction is local, allowing visitors to listen and respond without sending personal data to external servers.
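The local interaction described above can be sketched in a few lines. This is a hypothetical illustration, not the installation's actual code: the class name, file paths, and the meaning of the red/green buttons are all assumptions made for the sketch. The key property it demonstrates is that responses are only ever appended to a file on the booth's own machine; no network call appears anywhere.

```python
import json
import time
from pathlib import Path

class ConfessionBooth:
    """Minimal sketch of a local-only booth loop: play a pre-recorded
    confession, then store the visitor's red/green button press in an
    append-only log file that never leaves the machine."""

    def __init__(self, log_path="responses.jsonl"):
        self.log_path = Path(log_path)

    def play_confession(self, audio_file):
        # In a real booth this would hand the file to a local audio
        # player; here it is a stub so the sketch stays self-contained.
        print(f"Playing {audio_file} ...")

    def record_response(self, confession_id, button):
        # "red" / "green" roughly map to withholding or granting
        # forgiveness -- an assumption based on the description above.
        if button not in ("red", "green"):
            raise ValueError("button must be 'red' or 'green'")
        entry = {
            "confession": confession_id,
            "button": button,
            "timestamp": time.time(),
        }
        # Append-only local log: one JSON record per line, stored
        # on the booth itself rather than sent to a server.
        with self.log_path.open("a") as f:
            f.write(json.dumps(entry) + "\n")
        return entry

booth = ConfessionBooth(log_path="responses.jsonl")
booth.play_confession("siri_alexa_confession.mp3")
entry = booth.record_response("siri_alexa", "green")
```

Keeping the log as plain local JSON lines mirrors the installation's privacy claim: a visitor's response exists only as a row in a file inside the booth, much like the sticky notes exist only on the gallery wall.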