So much is communicated through a touch. Between 2015 and 2019, I studied emotional touch by building fuzzy robots and watching how people interacted with them.
In a world of high-tech sensors and machine learning, it seems like we can build a machine to detect anything. But if people have a hard time figuring out how they feel, how can we build machines that detect emotion?
Given what we found when we asked voice actors to puppet our robots, we wanted to test how the stories people told about our robots shaped their perceptions of the robots’ emotions. It turned out the stories mattered a lot.
Designing naturalistic robot motion is tough. We found that using our voices to drive a robot’s movement produced some delightful expressions. When we tested the system with voice actors, we were surprised by how engaged and emotional they became.
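One simple way to turn a voice into motion is to track the loudness of the audio and map it onto a motor position. This is a minimal sketch of that idea; the frame size, smoothing factor, and angle range are illustrative choices, not the project’s actual parameters.

```python
# Sketch: mapping a voice recording's loudness envelope onto a
# one-degree-of-freedom motor position, as one plausible way to
# "puppet" a robot with voice. All parameter values are invented.
import math

def rms_envelope(samples, frame_size=256):
    """Split audio samples into frames and return per-frame RMS loudness."""
    env = []
    for i in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[i:i + frame_size]
        env.append(math.sqrt(sum(s * s for s in frame) / frame_size))
    return env

def envelope_to_angles(env, max_angle=90.0, smoothing=0.3):
    """Map a loudness envelope onto motor angles, low-passed to avoid jitter."""
    peak = max(env) or 1.0  # avoid dividing by zero on silent input
    angles, current = [], 0.0
    for e in env:
        target = (e / peak) * max_angle
        current += smoothing * (target - current)  # ease toward the target
        angles.append(current)
    return angles
```

Smoothing matters here: raw loudness jumps frame to frame, and a motor chasing it directly looks twitchy rather than expressive.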
Developing furry robots that can detect and respond to emotional touch requires soft touch sensors and gesture detection algorithms. We developed a fabric touch sensor and a system that could detect different touches—like pats, scratches, and tickles.
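To give a flavour of what distinguishing pats, scratches, and tickles involves, here is a toy heuristic over a one-dimensional pressure signal: strong slow contacts read as pats, faster moderate ones as scratches, light rapid ones as tickles. Real gesture recognizers in this line of work use learned classifiers over richer sensor data; the thresholds below are invented for illustration.

```python
# Toy sketch of touch-gesture classification from a 1-D pressure window.
# Thresholds and the sample rate are illustrative assumptions.

def count_peaks(signal, threshold):
    """Count rising crossings of `threshold` (a rough contact-event count)."""
    peaks, above = 0, False
    for v in signal:
        if v > threshold and not above:
            peaks += 1
            above = True
        elif v <= threshold:
            above = False
    return peaks

def classify_touch(signal, sample_rate=50):
    """Label a pressure window (values in 0..1) as 'pat', 'scratch', or 'tickle'."""
    duration = len(signal) / sample_rate
    peak = max(signal)
    rate = count_peaks(signal, 0.5 * peak) / duration  # contacts per second
    if peak > 0.7 and rate <= 3:
        return "pat"       # strong, slow contacts
    if peak > 0.4:
        return "scratch"   # moderate pressure, faster repetition
    return "tickle"        # light, rapid contact
```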
Related academic publications
> X. L. Cang, P. Bucci, J. Rantala and K. E. MacLean, “Discerning Affect From Touch and Gaze During Interaction With a Robot Pet,” in IEEE Transactions on Affective Computing, vol. 14, no. 2, pp. 1598-1612, 1 April-June 2023, doi: 10.1109/TAFFC.2021.3094894
The CuddleBits are handheld furry robots that express emotions through breathing movements. We wanted to make a lot of different sizes and shapes, so I created two design systems that could be easily modified (full instructions here).
One of the CuddleBits breathing calmly.
They are cheap, simple tools to study emotional touch.
We wrapped them in our fabric touch sensor and placed them in many different emotional situations. We asked people to tell them emotional stories (like a talk therapy session). We asked voice actors to puppet them.
The project started off as a way to build smaller, cheaper versions of the Haptic Creature and CuddleBot. It was a design challenge: I was asked to make a pocket-sized version that used only one motor and still expressed a wide range of emotion. I went through an extensive low-fidelity prototyping process, starting with cardboard, Meccano, and Duotang binders.
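With only one motor, emotional range has to come from how the breathing waveform is shaped over time, not from extra degrees of freedom. This is a minimal sketch of that idea, varying rate and depth of a breath-like position signal; the parameter values and the mapping to calm versus tense are illustrative, not the project’s actual design.

```python
# Sketch: driving a single motor with a breathing-like waveform.
# Emotional expression comes from varying rate and depth; the
# specific values here are invented for illustration.
import math

def breathing_position(t, rate_hz=0.25, depth=1.0):
    """Motor position in [0, depth] at time t: a smooth rise and fall per breath."""
    phase = 2 * math.pi * rate_hz * t
    return depth * 0.5 * (1 - math.cos(phase))

# A calm robot might breathe slowly and deeply; an agitated one fast and shallow.
calm = [breathing_position(t / 10, rate_hz=0.2, depth=1.0) for t in range(50)]
tense = [breathing_position(t / 10, rate_hz=1.0, depth=0.4) for t in range(50)]
```

The raised-cosine shape starts and ends each breath at zero velocity, which reads as organic rather than mechanical.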
Related academic publications
> Witkower, Z., Cang, L., Bucci, P., MacLean, K., & Tracy, J. L. (2025). Human psychophysiology is influenced by physical touch with a “breathing” robot. Emotion. Advance online publication. https://doi.org/10.1037/emo0001601