Nunchi Lamp

GSD VIS-2223: Digital Media: Telepresence, Empathy, and Spatial Immersion (Fall 2020-21)

Students: Idael Cardenas, Matthew Pugh
Instructors: Allen Sayegh, Humbi Song

 

Working through COVID quarantines for over a year, many of us have become familiar with the challenges of communicating effectively through online tools like Zoom or WhatsApp. Users struggle to read soft conversational cues and to gauge environmental context when deciding when and how to approach someone.

A recent article in Wired titled “Read the (Virtual) Room! How to Improve Your Digital Nunchi” speaks to the common anxiety people experience trying to be more empathetic communicators through online tools, drawing a parallel between these struggles and the Korean concept of “nunchi.” Nunchi may be interpreted as “eye measure,” as “reading the room” to understand how to react or behave without explicit signals, and as “the act of figuring out what our counterpart thinks and feels in a certain situation and acting accordingly.”

This prototype for a Nunchi Lamp tries to close that gap by listening to and emitting acoustic and visual cues that convey the ambient conditions around each user. The paired device splits apart, with one half given to a coworker, relative, or loved one, helping the two acknowledge each other’s presence and get a feel for their respective environments. The lamp senses and relays acoustic environments, pulses with a soft glow in tune with the other user’s soundscape, and vibrates softly in step with the other user’s activity.

Privacy-driven interaction was a core design concern: as opposed to the visual cues one might typically associate with nunchi, the devices relay softer information like sound and physical vibration. Users control the flow of this information by simply flipping their lamp up or down. When flipped up, the device collects data and transmits it to its pair, and its horn-like form references megaphones and early telephones. When flipped upside down, the object’s plastic casing physically blocks its sensors from collecting sound and vibration data, and the form becomes lamp-like, glowing to indicate when the other half of the pair is active.

The project involved 1:1 prototyping of the paired IoT devices, sensorial design of the electronics inside the lamp, and designing wireless, remote communication between devices through an online Firebase database. The functioning prototype was tested for a week between two remote coworkers in Cambridge, MA.
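The sync between the paired lamps could be sketched as each device writing its own state to a shared path in the Firebase database and polling its partner’s. The helpers below are a hedged illustration of that pattern using the Firebase Realtime Database REST convention (a JSON document addressed by `<path>.json` URLs); the database name, paths, and brightness mapping are assumptions, not the project’s actual schema:

```python
import json

# Hypothetical database URL — the real project's Firebase instance is not named.
FIREBASE_URL = "https://nunchi-lamp.firebaseio.com"


def rtdb_endpoint(base: str, lamp_id: str) -> str:
    """REST endpoint for one lamp's state node in the Realtime Database."""
    return f"{base}/lamps/{lamp_id}.json"


def encode_state(sound_level: float, active: bool) -> str:
    """JSON payload a lamp would PUT on each sensor tick.

    sound_level: normalized ambient loudness in 0..1.
    active: whether the lamp is upright and transmitting.
    """
    return json.dumps({"sound": round(sound_level, 3), "active": active})


def glow_brightness(peer_sound: float, floor: float = 0.1) -> float:
    """Map the partner's sound level (0..1) to an LED duty cycle,
    keeping a faint floor glow so presence reads even in silence."""
    return max(floor, min(1.0, peer_sound))
```

In a device loop, each lamp would HTTP PUT `encode_state(...)` to its own endpoint and GET its partner’s, feeding the partner’s `sound` value through `glow_brightness` to drive the LED pulse.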