
Rebecca Fiebrink

Senior Lecturer - Goldsmiths, University of London

Designing expressive sensor-based interactions with machine learning

In this talk, I will show how machine learning can enable new forms of expressive, embodied, and creative interaction. By modelling data that captures how we move, play, and make art and music, machine learning algorithms can help us build interfaces that understand our most expressive human activities. Furthermore, by designing new interactions from example data, rather than by programming, algorithms can help people prototype and explore new ideas more quickly, and they can enable non-programmers to build more complex interactive systems. The talk will include a brief live demonstration of machine learning used to make new musical instruments.

About the speaker:

Dr. Rebecca Fiebrink is a Senior Lecturer at Goldsmiths, University of London. Her research focuses on designing new ways for humans to interact with computers in creative practice, including the use of machine learning as a creative tool. Fiebrink is the developer of the Wekinator, open-source software for real-time interactive machine learning whose current version has been downloaded over 10,000 times. She is the creator of the MOOC "Machine Learning for Artists and Musicians," which launched in 2016 on the Kadenze platform. She was previously an Assistant Professor at Princeton University, where she co-directed the Princeton Laptop Orchestra. She has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research, and Smule, where she helped to build the #1 iTunes app "I Am T-Pain." She holds a PhD in Computer Science from Princeton University.
