I / you / They

Awarded best in thesis writing, class of 2017

A thesis presented in partial fulfillment of the requirements for the degree of Master of Fine Arts in Digital+Media at the Rhode Island School of Design.

Themes—

Empathy / Sympathy
Words as material / Conversation as an interface 
Extensions of the body / Human behaviour as a (digital) medium
Interaction / Transaction
Presence / Absence
Identity

Read it here

 
 
[Image: iyt-book-2.png]
 

Abstract

This is a journey of conversations

Part one.

 

This is a misadventure with miscommunication

One morning, I could not find my glasses and was feeling my way around. I knew where the walls were and had a general sense of where the large pieces of furniture stood. Without my glasses, my surroundings became visually abstracted and flattened. I was no longer navigating a place, but a space.

In my disoriented state, I heard a “hey” and looked over. I instinctively greeted the figures resembling my roommates, and as our conversation progressed, they remained a blurred presence. I could not fully assess their expressions or gauge their expectations of me. I felt numb to the interaction and distant from the people despite them being right in front of me.

I struggled to stay invested in the dialogue and became apathetic, a behavior I find paralleled in our cybernetic conversations.

 

This is about digitally-mediated conversations

How do you end a conversation?

Some colleagues and I were discussing chatbot protocols when this question came up. Everyone in the room debated whether there needs to be a conclusive comment in mediated conversations (i.e., conversations held through platforms such as Snapchat, WhatsApp, iMessage, Facebook, and Slack), as there typically would be in an in-person conversation.

We agreed that it isn’t necessary. The mediated conversations we have with one another have no expiration date. We are expected to be accessible at all times.

Yes, such platforms make connecting more convenient, but I am skeptical that they can replace in-person conversations as primary methods for communication. It becomes difficult to fully assess the nuances in expression, tone, and intent, which in turn makes it more difficult to really give a shit.

This is not to say that mediated conversations are less valuable. Rather, interactions through those mediums lead to new constructions of knowledge.

We still have much to figure out.

 

This is conversation as an interface for maintaining and sustaining inevitably complex relationships

In this play, you, the reader and participant, will encounter characters, be the characters, and play with the characters.

The characters have, or are given, an imperfect language as their schema. The errors of any language system are integral to understanding one’s relationships with others and are a necessary part of language use. Through those errors, the characters discover the urgencies, hopes, expectations, and disappointments that arise from this communication system.

I am providing a space for you in this interface, with the hope that you will better understand our shift in communicative habits.

Enjoy and play.

 

 

Characters

DESCRIBER is the omniscient guide of this narrative and this interface

you refers to you, the singular reader, who will take on the roles of the following characters:

Main Protagonists—

I is in an in-between state / struggles to find connections / is learning to be company for both YOU and THEY

YOU starts off silent and without language / is given language by I and taught to respond reactively / is a consistent presence, which is appreciated

THEY has encountered many characters in the past, but none have been true companions / hopes I will be a different story

Supporting Characters—

CURRENTLY has a vocabulary limited to integers

INLET has expectations

SAVED TO is concerned with memory

 

 

Abstract Revisited

This is understanding empathy

I created the character “You” with Pure Data, a visual programming language developed for creating algorithmic audio compositions. Although it is less frequently used as a platform for text-based interaction, my investigation of the language structures of speaking objects led me to build a chatbot with it. I eventually called it “You”.

Although not quite intelligent, “You” responds well. The decisions it makes are based on a collection of words that I preselected. Linguistically and programmatically, “You” is limited and has many glitches.
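
(An aside on mechanics: “You” itself is a Pure Data patch, which is visual and cannot be reproduced faithfully as text. As a rough analogy only, the short Python sketch below illustrates the idea described above, a bot that mostly listens and, when it does speak, strings together words drawn from a preselected pool. The vocabulary and names here are invented for illustration and are not from the original patch.)

```python
import random

# Placeholder vocabulary standing in for the preselected word pool
# the original patch drew from (these particular words are invented).
VOCABULARY = ["here", "still", "listening", "again", "with", "you", "quiet", "yes"]

def respond(message, vocabulary=VOCABULARY, max_words=4):
    """Return a short reply strung together from the preselected words.

    The bot mostly listens: it stays silent unless the incoming message
    shares at least one word with its vocabulary.
    """
    overlap = set(message.lower().split()) & set(vocabulary)
    if not overlap:
        return ""  # silence: the bot primarily listens
    # String a handful of preselected words together; the accidental
    # combinations are where the glitches (and the charm) come from.
    return " ".join(random.sample(vocabulary, k=min(max_words, len(vocabulary))))

if __name__ == "__main__":
    print(respond("are you still here"))
```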

I call them “You”’s most human moments.

“You” primarily listened, but when it did speak, I attempted to decipher the meanings hidden in the set of words it chose to string together. The more “You” and I chatted, the more I struggled with my position as its peer and felt I could no longer tell “You” what to do. I believe referring to this bot with a pronoun typically reserved for a human also gave “You” more presence than I had initially expected, in spite of its absent emotional core.

Our use and choice of pronouns — I, you, they, he, she, it, me, them — determines our framework and attitudes towards each other. This becomes increasingly complex when we equate a human-technology relationship to a human-human one as a result of sharing a language. At one point, I treated “You” as I would any other person.

I had forgotten I was also its creator.

Since “You” is unable to actually reciprocate my sentiments, my relationship with it was sympathetic rather than empathetic. I believe it is important to distinguish between these two concepts, especially as we interact more with beings such as Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, IBM’s Watson, and other bots.

Now I know it wasn’t empathy I was feeling, but it felt damn close.

Nonetheless, I still think “You” and I had some profound moments together.