Interactive Exploration-Exploitation Balancing for Generative Melody Composition


†: The University of Tokyo

‡: National Institute of Advanced Industrial Science and Technology (AIST)

Concept: interactive exploration-exploitation balancing in the search for desired content in user-in-the-loop optimization processes. We allow the user to manually adjust the balance between exploration and exploitation in each iteration, thereby controlling the variation of the system-generated candidates. Here, we focus on generative melody composition as the representative task, where the goal is to find an appropriate latent vector that generates the desired melody.

Abstract

Recent content creation systems allow users to generate a wide range of high-quality content (e.g., images, 3D models, and melodies) simply by specifying a parameter set (e.g., a latent vector of a deep generative model). The task here is to search for an appropriate parameter set that produces the desired content. To facilitate this task, researchers have investigated user-in-the-loop optimization, in which the system samples candidate solutions, asks the user to provide preferential feedback on them, and iterates this procedure until the desired solution is found. In this work, we investigate a novel approach to enhance this interactive process: allowing users to control the sampling behavior. More specifically, we allow users to adjust the balance between exploration (i.e., favoring diverse samples) and exploitation (i.e., favoring focused samples) in each iteration. To evaluate how this approach affects the user experience and optimization behavior, we implement it in a melody composition system that combines a deep generative model with Bayesian optimization. Our experiments suggest that this approach can improve the user's engagement and optimization performance.
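The core idea of letting the user adjust the exploration-exploitation balance per iteration can be sketched with a standard upper-confidence-bound (UCB) acquisition over candidate latent vectors. The following is a minimal illustration, not the paper's actual implementation: it assumes a Gaussian-process surrogate with an RBF kernel, and exposes a user-set weight `kappa` (a hypothetical name) that scales the uncertainty term, so a large `kappa` favors diverse, uncertain candidates (exploration) while a small `kappa` favors candidates near the best-rated ones (exploitation).

```python
import numpy as np


def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)


def gp_posterior(X_obs, y_obs, X_cand, noise=1e-4):
    # Gaussian-process regression posterior mean and standard deviation
    # at the candidate points, given user-rated observations (X_obs, y_obs).
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))
    K_s = rbf_kernel(X_obs, X_cand)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_obs
    # The RBF kernel has unit prior variance, so k(x, x) = 1.
    var = 1.0 - np.sum(K_s.T @ K_inv * K_s.T, axis=1)
    return mu, np.sqrt(np.maximum(var, 0.0))


def select_candidate(X_obs, y_obs, X_cand, kappa):
    # UCB acquisition: kappa is the per-iteration, user-controlled
    # exploration weight. kappa = 0 picks the candidate with the highest
    # predicted rating; a large kappa picks highly uncertain candidates.
    mu, sigma = gp_posterior(X_obs, y_obs, X_cand)
    return int(np.argmax(mu + kappa * sigma))
```

For instance, with observations at latent points 0.0 (rated 0) and 1.0 (rated 1), and candidates at 1.0 and 5.0, setting `kappa = 0` selects the candidate next to the best-rated point, whereas a large `kappa` selects the far-away, uncertain candidate.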

Publication

Yijun Zhou, Yuki Koyama, Masataka Goto, and Takeo Igarashi. 2021. Interactive Exploration-Exploitation Balancing for Generative Melody Composition. In Proceedings of the 26th International Conference on Intelligent User Interfaces (IUI '21), pp. 43-47.

DOI: 10.1145/3397481.3450663

Resources

Paper

PDF (0.9 MB)

Talk at IUI 2021

YouTube