EyeMole was an interactive arts cooperative that engaged in corporate roleplay. For one show, we bought a set of cheap paintings and created augmented reality overlays as hidden digital artwork. The artworks were hooked into an e-commerce platform that updated each artwork’s exchange value based on a different algorithm for each piece (duration of viewing… Continue reading EyeMole’s Art Sale
Author: Paul Bucci
Performance: manipulated bodies
In collaboration with artist Kathy Yan Li, we created a series of performances that dealt with digital and physical body manipulations. In two performances, I painted Kathy with white paint and invited audience members to Photoshop her live. In one performance, an audience member felt it was unfair to have only Kathy painted, and wanted… Continue reading Performance: manipulated bodies
How should we measure emotion?
In a world of high-tech sensors and machine learning, it seems like we can build a machine to detect anything. But if people have a hard time figuring out how they feel, how can we build machines that detect emotion? Related academic publications > P. H. Bucci, X. L. Cang, H. Mah, L. Rodgers, and… Continue reading How should we measure emotion?
Complex stories about simple robots
Given what we found when we asked voice actors to puppet our robots, we wanted to test how the stories people told about our robots impacted their perceptions of the robots’ emotions. Turns out, they mattered a lot. Related academic publications > P. Bucci, L. Zhang, X. L. Cang, and K. E. MacLean, “Is it… Continue reading Complex stories about simple robots
Voodle: translating sound to movement
Designing naturalistic robot motion is tough. We found that using our voices to make a robot move gave us some delightful expressions. But when we tested the system with voice actors, we were surprised to see how engaged and emotional they were. Related academic publications > P. Bucci, X. L. Cang, A. Valair, D. Marino,… Continue reading Voodle: translating sound to movement
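To make the core idea concrete, here is a minimal sketch of one plausible voice-to-motion mapping: track the loudness envelope of a voice signal and rescale it into a one-degree-of-freedom motor position. The function names, window size, and position range are illustrative assumptions, not details of the actual Voodle system.

```python
# Hypothetical sketch: drive a 1-DOF robot "breathing" position from a
# voice signal's loudness envelope. All names and parameters are
# illustrative, not taken from the published Voodle implementation.

def amplitude_envelope(samples, window=4):
    """Mean absolute amplitude over non-overlapping windows of the signal."""
    return [
        sum(abs(s) for s in samples[i:i + window]) / window
        for i in range(0, len(samples) - window + 1, window)
    ]

def envelope_to_position(envelope, min_pos=0.0, max_pos=1.0):
    """Linearly rescale the envelope into the motor's position range."""
    lo, hi = min(envelope), max(envelope)
    span = (hi - lo) or 1.0  # avoid division by zero on a silent signal
    return [min_pos + (e - lo) / span * (max_pos - min_pos) for e in envelope]

# A quiet-then-loud vocalization: the robot rises as the voice swells.
voice = [0.0, 0.1, -0.1, 0.0, 0.5, -0.5, 0.6, -0.4, 0.9, -0.9, 1.0, -0.8]
positions = envelope_to_position(amplitude_envelope(voice))
```

A direct linear mapping like this already yields lively motion; the appeal of voice as an input is that prosody carries energy and rhythm for free.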
Detecting gestures on a fabric touch sensor
Developing furry robots that can detect and respond to emotional touch requires soft touch sensors and gesture detection algorithms. We developed a fabric touch sensor and a system that could detect different touches—like pats, scratches, and tickles. Related academic publications > Cang, X. L., Bucci, P., Strang, A., Allen, J., MacLean, K. E., and Liu, H.… Continue reading Detecting gestures on a fabric touch sensor
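As a rough illustration of what gesture detection on such a sensor can look like, here is a toy rule-based classifier over a stream of per-frame pressure readings. The features (peak pressure, rate of pressure reversals) and every threshold are invented for this sketch; the published system used a learned classifier over richer features.

```python
# Toy sketch (not the published system): label a stream of overall
# pressure readings from a fabric sensor as a coarse touch gesture.
# Thresholds are made up for illustration.

def classify_touch(frames, dt=0.05):
    """frames: one overall pressure reading per sample (0..1 scale)."""
    peak = max(frames)
    # Count pressure direction changes as a proxy for repetitive motion.
    reversals = sum(
        1 for a, b, c in zip(frames, frames[1:], frames[2:])
        if (b - a) * (c - b) < 0
    )
    rate = reversals / (len(frames) * dt)  # reversals per second
    if peak < 0.1:
        return "no touch"
    if rate > 4.0:
        # Rapid fluttering: light contact reads as tickle, firm as scratch.
        return "tickle" if peak < 0.4 else "scratch"
    return "pat"  # slower, discrete presses
```

Even this crude version shows why soft sensing is hard: the same gesture varies wildly in pressure and timing across people, which is what pushes real systems toward machine learning.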
Small furry robots
The CuddleBits are handheld furry robots that express emotions through breathing movements. We wanted to make a lot of different sizes and shapes, so I created two design systems that could be easily modified (full instructions here). They were cheap, simple tools to study emotional touch. We wrapped them in our fabric touch sensor and… Continue reading Small furry robots
Reddit’s Am I the Asshole?
Reddit’s Am I the Asshole advice forum is a rich source of information on what people consider to be moral and good in our culture. I am using natural language processing (NLP) methods to support a deep read of social norms around device privacy and interpersonal trust.
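One small step in that kind of NLP-supported reading might look like the sketch below: tallying which words appear under one community verdict but not another, as a pointer toward passages worth reading closely. The posts, labels, and helper names here are invented examples, not data from the study.

```python
# Toy sketch of one NLP step that could support a "deep read" of
# r/AmItheAsshole: find words associated with one verdict but not the
# other. The posts and verdicts below are invented examples.
from collections import Counter
import re

posts = [
    ("AITA for reading my partner's texts without asking?", "YTA"),
    ("AITA for locking my phone so my roommate can't snoop?", "NTA"),
    ("AITA for checking my teen's browser history?", "YTA"),
]

def word_counts(label):
    """Word frequencies over all posts with the given verdict."""
    words = Counter()
    for text, verdict in posts:
        if verdict == label:
            words.update(re.findall(r"[a-z']+", text.lower()))
    return words

yta, nta = word_counts("YTA"), word_counts("NTA")
# Words appearing only under "you're the asshole" verdicts hint at
# behaviors the community reads as norm violations.
yta_only = {w for w in yta if w not in nta}
```

Counts like these are only a lens: the point is to surface candidate posts and vocabulary, then return to the threads themselves for the close reading.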