Understanding musical cognition and the emotional responses it produces in humans by developing machine-learning pattern-analysis tools
ORAL
Abstract
Music is arguably the art form to which humans react most naturally. Most people display an emotional response to music regardless of their understanding of music theory. This ability to experience complex music is a defining feature of human cognition, raising the question: can it be replicated with machine learning? We aim to develop an algorithm that processes and analyzes music and correlates musical patterns with the human emotions they evoke, mimicking the human cognitive process. First, using digital signal analysis with neural networks, we have created an algorithm that separates music into its individual components (tracks) and analyzes and categorizes patterns in music of different styles and tonalities. In parallel, we are constructing a deep neural network that correlates musical patterns and tonalities with the emotions they typically induce. Shedding light on how specific emotions are evoked could have applications in music composition, medical therapeutics, and neuroscience. This research will give us insight into how our brains process musical information and allow us to mimic that processing with neural networks that replicate emergent human cognition.
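A minimal sketch of the two-stage pipeline the abstract describes, assuming Python with the librosa and PyTorch libraries (neither is named in the abstract). Harmonic/percussive separation stands in for the track-separation stage, chroma and MFCC features stand in for the pattern and tonality analysis, and a small feed-forward network stands in for the emotion model; the names separate_components, extract_features, and EmotionNet are hypothetical illustrations, not the authors' implementation.

# Hypothetical sketch of the two-stage pipeline described above.
# Assumes: pip install librosa torch numpy
import librosa
import numpy as np
import torch
import torch.nn as nn

def separate_components(path: str, sr: int = 22050):
    """Stage 1 stand-in: split a recording into harmonic and percussive
    components (a simple proxy for full per-track source separation)."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    harmonic, percussive = librosa.effects.hpss(y)
    return harmonic, percussive, sr

def extract_features(y: np.ndarray, sr: int) -> np.ndarray:
    """Summarize tonality (chroma) and timbre (MFCC) patterns as a
    fixed-length vector by averaging each feature over time."""
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)    # (12, frames)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)  # (20, frames)
    return np.concatenate([chroma.mean(axis=1), mfcc.mean(axis=1)])  # (32,)

class EmotionNet(nn.Module):
    """Stage 2 stand-in: map pattern features to emotion categories
    (e.g., happy / sad / tense / calm)."""
    def __init__(self, in_dim: int = 32, n_emotions: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, n_emotions),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    harmonic, _, sr = separate_components("song.wav")
    feats = extract_features(harmonic, sr)
    model = EmotionNet()
    logits = model(torch.from_numpy(feats).float().unsqueeze(0))
    print(torch.softmax(logits, dim=-1))  # per-emotion scores

In practice the emotion head would be trained on listener-annotated ratings; the untrained network above only illustrates the data flow from audio to separated components, pattern features, and emotion scores.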
Presenters
Quinn Picard
University of California, San Diego
Authors
Quinn Picard
University of California, San Diego