Student turning computers into composers

Computer Science doctoral student Kristine Monteith pulls out her laptop and asks, “What are we feeling like?” With 30 seconds and a click of the mouse, her ThinkPad becomes a regular Beethoven, composing original songs based on any emotion she chooses.

Monteith is a left-brain and right-brain kind of person. She came to BYU with a bachelor’s degree in music therapy and a passion for voice, piano and guitar; now she is preparing to defend a doctoral dissertation on a computer program that can generate original music.

Since the beginning of her graduate work, Monteith has been trying to answer a golden question: can machines be creative like humans?

A classic issue in machine learning is developing ways for computers to act like humans. Can computers be so humanlike as to fool us? Monteith’s version of the question was, “Can a computer act like a human in composing music?”

In research papers presented at the International Conference on Computational Creativity in 2010 and 2011, Monteith shows that her computer program can compose original music much as a human does.

To test this, students listened blindly to songs written by her computer program alongside songs written by a human composer. It turned out that 54 percent of listeners correctly identified the intended emotions in the computer-generated music, while only 43 percent did the same for the human-composed music. Respondents also gave the computer-generated music a musicality score of 7 out of 10; the human-composed music received an 8 out of 10.

“The fact that a number of the computer-generated songs were rated as more musical than the human-produced songs is somewhat impressive,” Monteith wrote in her research.

To start, her computer, like most human composers, enrolled in a type of Composition 101. Monteith gathered a selection of popular film soundtracks, including those from Toy Story and Jurassic Park, as training data for an algorithm that learns what makes a song sound emotional. Her family and friends labeled each recording with one of six emotions: love, joy, surprise, anger, sadness and fear.

“Soundtracks are good at expressing emotions because they are put to a scene. Once I have my collection of songs, then I go through and pull out the melodies of each line,” Monteith said. The computer uses these melodies as samples to learn from.
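Her published papers describe statistical models learned from these labeled melodies. As a rough illustration of the general idea, and not her actual system, here is a minimal Python sketch: a first-order Markov chain over pitches is trained separately for each emotion label and then sampled to produce a new melody. The function names and the toy corpus are hypothetical.

```python
import random
from collections import defaultdict

def train_models(labeled_melodies):
    """Learn pitch-transition statistics per emotion label.

    labeled_melodies: iterable of (emotion, [midi_pitch, ...]) pairs.
    """
    models = defaultdict(lambda: defaultdict(list))
    for emotion, melody in labeled_melodies:
        for prev, nxt in zip(melody, melody[1:]):
            models[emotion][prev].append(nxt)
    return models

def compose(models, emotion, length=16):
    """Sample a new melody from the model for the requested emotion."""
    transitions = models[emotion]
    pitch = random.choice(list(transitions))
    melody = [pitch]
    for _ in range(length - 1):
        # Follow an observed transition; restart from any seen pitch
        # if the current one was only ever a melody's final note.
        pitch = random.choice(transitions[pitch] or list(transitions))
        melody.append(pitch)
    return melody

# Tiny made-up corpus: two labeled melodies as MIDI pitch numbers.
models = train_models([
    ("joy",     [60, 62, 64, 65, 67, 65, 64, 62, 60]),
    ("sadness", [60, 58, 57, 55, 53, 55, 57, 55, 53]),
])
print(compose(models, "joy"))
```

Training one model per emotion is what lets the program answer Monteith’s “What are we feeling like?” prompt: the requested emotion simply selects which learned model gets sampled.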

Speaking of soundtracks, when Monteith’s program is paired with another program that labels the emotions tied to the words in a story, it has the potential to create background music for audiobooks. She tried her hand at this with a couple of Aesop’s fables.

“We had a bunch of Aesop’s fables and I used an algorithm to label the text – this sentence is sad, this sentence is happy – and then used my program to generate the music to go along with that text,” Monteith said.
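The article doesn’t name the text-labeling algorithm she used, so the Python sketch below illustrates only the pairing step: a deliberately crude keyword labeler tags each sentence, and the resulting (sentence, emotion) pairs are what a composer like the one sketched above would consume. The keywords, sentences, and function names are all made up for illustration.

```python
# Toy sketch of the audiobook pipeline: tag each sentence with an
# emotion, then hand the (sentence, emotion) pairs to the composer.
# The keyword-based labeler is a hypothetical stand-in; the article
# does not describe which labeling algorithm Monteith actually used.

EMOTION_KEYWORDS = {
    "fear":    {"wolf", "fled", "trembled", "danger"},
    "sadness": {"wept", "alas", "alone", "believed"},
    "joy":     {"laughed", "rejoiced", "feast", "danced"},
}

def label_emotion(sentence):
    """Return the first emotion whose keywords appear in the sentence."""
    words = set(sentence.lower().replace(",", "").replace(".", "").split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return "neutral"

def score_story(sentences):
    """Pair each sentence with an emotion label to drive music generation."""
    return [(sentence, label_emotion(sentence)) for sentence in sentences]

fable = [
    "The boy cried wolf, and the villagers fled in fear.",
    "Later the boy wept alone, for no one believed him.",
]
for sentence, emotion in score_story(fable):
    print(f"[{emotion}] {sentence}")
    # In the full pipeline, compose(models, emotion) from the earlier
    # sketch would generate a matching musical cue at this point.
```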

According to the surveys, listeners felt the emotionally evocative music made the stories significantly more enjoyable and easier to connect with.

Monteith hopes to keep combining music therapy and computer science; her next feat is to compose music that can raise or lower heart rates for therapeutic purposes.

“When I was doing music therapy, I had a client who would hyperventilate a lot,” Monteith said. “This was a pretty serious problem that was going to shorten her life. So, in this instance, my work would help.”

As a Ph.D. candidate, Monteith works in the Neural Network and Machine Learning Laboratory directed by computer science professor Tony Martinez. 

Writer: Staley White
