I have noticed a trend of opposition between artists and robots. Throughout history, the development of technology has constantly pursued the “creative machine,” while artists have continuously pushed back against the possibility of technology replacing them. I, in contrast, have found the pervasiveness of computers and productive machines exciting in my creative practice. I see a way forward in the partnership of these two disciplines, where the human hand and the machine create together. What would it look like if artists were invited to collaborate with robots instead?
I am designing robots that interfere with the creative process. I hypothesize that introducing playful adversity into the behavior of robots designed for creatives could allow artists to find more value in these machines when they are developed as collaborators rather than industry opponents. I see this as a way for robots to help artists exercise creativity, act as catalysts for experimentation, and provide novel sets of creative constraints.
“UnsTable” is a robot drawing desk that moves while an artist draws on its platform. Its motion has been shown to facilitate mistake-making and experimentation and to inspire new lines of thinking through artist-robot collaboration.
In designing UnsTable, I decided to move the drawing surface rather than build an arm or other mechanism that reaches over the artist or manipulates a drawing tool. This keeps the artist’s hand an integral part of the process and directly confronts the notion of replacement. It also challenges the expectation that robots operate in pursuit of perfect output, often exaggerating imperfection instead.
UnsTable is controlled by infrared signals that are randomly selected, in sets of two, on a 1–5 second interval. These signals were derived from the pulses emitted when a button is pressed on the remote that controls UnsTable’s robot-vacuum base. After capturing them with an infrared receiver, I copied and re-sequenced the signals in an Arduino sketch so that movements are triggered by an algorithm rather than by a human-operated remote control. [See code on GitHub here]
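For reference, here is a minimal sketch of that replay logic, assuming the Arduino IRremote library (version 3 or later) and an IR LED on pin 3. The pulse timings below are placeholders standing in for the captured codes; the actual values, pins, and lengths live in the repository linked above.

```cpp
// Minimal sketch of UnsTable's replay logic, assuming the Arduino IRremote
// library (v3+) and an IR LED on pin 3. The pulse timings below are
// placeholders; the real values are the codes captured from the vacuum
// remote with an IR receiver.
#include <Arduino.h>
#include <IRremote.hpp>

const uint8_t IR_SEND_PIN = 3;

// Hypothetical captured commands (mark/space durations in microseconds).
const uint16_t CMD_FORWARD[] = {9000, 4500, 560, 560, 560, 1690};
const uint16_t CMD_SPIN[]    = {9000, 4500, 560, 1690, 560, 560};
const uint16_t CMD_LEFT[]    = {9000, 4500, 560, 560, 560, 560};
const uint16_t CMD_RIGHT[]   = {9000, 4500, 560, 1690, 560, 1690};

const uint16_t* COMMANDS[] = {CMD_FORWARD, CMD_SPIN, CMD_LEFT, CMD_RIGHT};
const uint16_t COMMAND_LENGTHS[] = {
  sizeof(CMD_FORWARD) / sizeof(CMD_FORWARD[0]),
  sizeof(CMD_SPIN)    / sizeof(CMD_SPIN[0]),
  sizeof(CMD_LEFT)    / sizeof(CMD_LEFT[0]),
  sizeof(CMD_RIGHT)   / sizeof(CMD_RIGHT[0]),
};
const uint8_t NUM_COMMANDS = sizeof(COMMANDS) / sizeof(COMMANDS[0]);

void setup() {
  IrSender.begin(IR_SEND_PIN);    // start the IR transmitter
  randomSeed(analogRead(A0));     // seed from a floating analog pin
}

void loop() {
  // Send a randomly selected pair of captured signals at 38 kHz.
  for (uint8_t i = 0; i < 2; i++) {
    uint8_t pick = random(NUM_COMMANDS);
    IrSender.sendRaw(COMMANDS[pick], COMMAND_LENGTHS[pick], 38);
    delay(100);                   // short gap between the two signals
  }
  delay(random(1000, 5001));      // wait 1-5 seconds before the next pair
}
```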
While the motion sequence is currently randomized, this prototype represents the beginning of a line of collaborations between an artist and a desk robot. There are many ways data could be collected about an artist’s movement, posture, mark-making, and so on (for example, by sensing the pressure applied to the drawing surface or using computer vision). Any of these inputs could then instigate motion in more direct response, and opposition, to what the artist is doing; a hypothetical variant is sketched below. Randomized motion still counts as adversarial, however, since the goal of the robot (to move around) runs contrary to the goal of the artist (to make a drawing or painting).
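As one illustration of what a more responsive version could look like, here is a hypothetical variant of the loop above in which a force-sensitive resistor under the drawing surface, rather than a timer, triggers the motion. The sensor pin and pressure threshold are assumptions, and it reuses the command table and IR setup from the previous sketch.

```cpp
// Hypothetical sensing-driven variant: a force-sensitive resistor (FSR)
// under the drawing surface triggers motion when the artist bears down on
// a mark. Reuses COMMANDS, COMMAND_LENGTHS, NUM_COMMANDS, and the IrSender
// setup from the sketch above; the pin and threshold are assumptions.
const uint8_t FSR_PIN = A1;
const int PRESSURE_THRESHOLD = 600;   // analog reading, 0-1023

void loop() {
  if (analogRead(FSR_PIN) > PRESSURE_THRESHOLD) {
    // Respond to firm pressure with a randomly chosen movement.
    uint8_t pick = random(NUM_COMMANDS);
    IrSender.sendRaw(COMMANDS[pick], COMMAND_LENGTHS[pick], 38);
    delay(500);                       // let the motion play out before re-checking
  }
}
```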
Even before the End of Year Shows, artists from many disciplines had produced 60 drawings in collaboration with UnsTable. Their combined works show added details, guided experiments and abstraction, and other diversions from initial ideas. What I am most excited about is that people have been inspired to play and experiment with a robot, that many of them laughed and smiled while drawing, and that several people changed what, where, and how they were drawing.
I am thrilled with these responses so far, yet I never intended for UnsTable to exist as the final form for this concept. Instead, I see UnsTable as an invitation to start a conversation. How would we design robots differently if we considered radical use cases like adding turbulence to the creative process rather than automating enjoyable tasks?
In addition to creating UnsTable, I have conducted pilot studies, written an academic paper, and produced a video figure, which were published in the ACM Digital Library as part of the 2024 International Conference on Human-Robot Interaction (HRI). I attended the conference in Boulder, Colorado, in March 2024 and presented posters for this research during the Late-Breaking Reports session as well as the Rebellion and Disobedience in HRI workshop. This work is now part of the broader body of human-robot interaction research, where it can be considered and iterated on beyond my personal exploration.
I want to thank Justin Bakse who encouraged me to share my research with the world. His time, feedback, connections, and support made all the difference. The NSF Research Experiences for Undergraduates program supported my introduction to the Future Automation Research Lab at Cornell Tech, and to my co-author Wendy Ju. Thank you to Maria Teresa Parreira, David Goedicke, Natalie Friedman, Dave Dey, and others from the lab, and to my Parsons Design and Technology thesis faculty Melanie Crean, Ernesto Klar, Alexander King, Samuel Leigh, Gloria Duan, and Ayodamola Tanimowo Okunseinde. Finally, I want to also thank Harpreet Sareen, Kyle Li, Anthony Dunne, Jim McCann, Carla Diana, Sven Travis, Margaret Rhee, Mattia Casalegno, Arielle Mella, Jacob Hennessy-Rubin and those in the NY Robotics Network, whose perspectives on art, creativity, and robotics helped shape this research.
The thesis show truly would not have been possible without everyone who was there to help me. A huge shout-out goes to my family and my partner, who helped me set up my project, brought me food, collected people’s contact information, and were there to support me even though they got to try the robot last.
Thank you, also, to Ishaan Das, who stuck around to document my project the whole evening, and to Jesse Harding for helping me find the right tools in the pre-show chaos to pull the final pieces together. Lastly, I sincerely appreciate the more than 20 people who made a drawing with UnsTable at the show, as well as everyone else who came over to ask questions and discuss my project. I am honored to have shared so much excitement with you all!