2017 has just started, and it already promises to be a game-changing year for the music industry. Why? Sony Music will release its first album fully composed by Artificial Intelligence (AI).
It will mark the start of the commercialization of music fully produced by software and, given Sony's commitment to the project, it is expected to be of high quality and to please a large number of listeners.
An example is the catchy single “Daddy’s Car” released in September 2016.
But the main question to be addressed is: How will listeners react to music produced by AI?
As discussed previously here at LiveInnovation.org, the inner motivation of songwriters behind their compositions is a key driving force for listeners to relate and engage with artists and their songs.
So what happens when a song carries no reference to compositional motivation at all, being purely the result of an AI algorithm, is a question the industry must address.
To address this question, we at LiveInnovation.org set up an experiment with 50 German millennial participants to investigate current perceptions of music produced by AI.
During the experiment, participants listened to 45 seconds of a song fully created by Artificial Intelligence. The song is a result of the Melomics project, led by the highly creative researcher Francisco Vico from the Universidad de Málaga. The track is called “0music12” and can be played here:
The 50 participants were divided into 2 groups of 25 participants each:
- In one group, before listening to the song, participants were given a fictitious emotional description of the motivation behind its composition. The description detailed the composer’s difficult, forced separation from his family, and explained that the song was written to cope with the pain of missing loved ones.
- The second group was informed that the song had been created by an Artificial Intelligence algorithm.
During the experiment, extraneous variables such as participant interaction, distractions, and the physical environment were carefully controlled for.
After listening to the song, participants answered 5-point Likert scales measuring various perceptions and behavioral intentions. Finally, unstructured interviews were conducted with participants to generate exploratory findings.
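The article does not state which statistical test was used to compare the two groups. As a purely illustrative sketch, a between-group comparison of 5-point Likert ratings could be run with Welch’s t-test; all ratings below are invented, and the helper name `welch_t` is hypothetical:

```python
# Hypothetical sketch of a between-group Likert comparison.
# The ratings are simulated; they are NOT the experiment's data.
import math
import random
import statistics

random.seed(42)

# Simulated 1-5 Likert ratings: 25 participants per group.
group_story = [random.choice([2, 3, 3, 4, 4, 5]) for _ in range(25)]
group_ai = [random.choice([2, 3, 3, 4, 4, 5]) for _ in range(25)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    standard_error = math.sqrt(var_a / len(a) + var_b / len(b))
    return (mean_a - mean_b) / standard_error

t = welch_t(group_story, group_ai)
print(f"Welch t = {t:.2f}")
```

A small absolute t value (roughly |t| < 2 at this sample size) would correspond to the “no significant difference” outcome reported below; in practice a library routine such as `scipy.stats.ttest_ind` with `equal_var=False` would also return the p-value.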
And What Happened?
Although participants mentioned in post-experiment unstructured interviews that the story behind songs “matters to them” and that music produced purely by software is “cold” and “uninteresting”, no statistically significant differences were found between the two groups. Both groups displayed similar, moderate levels of appreciation for the song.
In other words, stating that a song was produced by AI altered neither respondents’ “Purchase Intention” towards the song nor their “Satisfaction Towards the performance” of the song.
Finally, there was no statistically significant difference between the groups in their involvement with music or their involvement with technology.
What Does This Mean for the Music Industry?
With the commercialization of AI music in 2017, listeners will, for the first time in history, face a possible dilemma or internal conflict between human and non-human music compositions.
What this LiveInnovation.org experiment reveals is that knowing a song was produced by AI may not influence its acceptance. More importantly, listeners’ acceptance of AI music may come down simply to the quality of the song, or to how “touching” it is given their taste and context.
Millennials are, in general, extremely tech-savvy and permanently interested in innovation and new forms of interaction with technology. This may be the start of a new “pop culture” in which individuals seek the intelligent randomness of algorithms to create art, instead of the possible rationality or fictional emotion of humans.
As long as AI music touches millennials, they probably will not mind that it was created by software.