AI Has Been Creating Music and the Results Are … Weird


In late May, a small crowd gathered at St. Dunstan's church in East London's Stepney district to listen to Irish music. But this event was different: the tunes it featured were composed, in part, by an artificial intelligence (AI) algorithm dubbed folk-rnn, a stark reminder of how cutting-edge AI is gradually permeating every aspect of human life and culture, even creativity.

Developed by researchers at Kingston University and Queen Mary University of London, folk-rnn is one of several projects exploring the intersection of artificial intelligence and the creative arts. Folk-rnn's performance was met with a mixture of fascination, awe, and consternation at the thought of soulless machines making music. But these experiments are discovering new ways that humans and machines can cooperate.

How Does AI Create Art?

Like many other AI products, folk-rnn uses machine learning, a subset of artificial intelligence. Instead of relying on predefined rules, machine learning algorithms ingest large data sets and create mathematical representations of the patterns and correlations they find, which they then use to accomplish tasks.

Folk-rnn was trained on a crowd-sourced repertoire of 23,000 transcriptions of Irish music before starting to crank out its own tunes. Since its inception in 2015, folk-rnn has undergone three iterations and has produced more than 100,000 songs, many of which have been compiled in a 14-volume online compendium.
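As its name suggests, folk-rnn is a recurrent neural network, and the tunes it trains on are text transcriptions. The real model is far more sophisticated, but the underlying idea of learning which symbols tend to follow which in a corpus can be sketched with a toy character-level Markov chain (entirely my own illustration, not the project's code or architecture):

```python
import random
from collections import defaultdict

def train_markov(corpus, order=3):
    """Count which character follows each `order`-length context."""
    model = defaultdict(list)
    for tune in corpus:
        for i in range(len(tune) - order):
            context = tune[i:i + order]
            model[context].append(tune[i + order])
    return model

def generate(model, seed, length=40, rng=None):
    """Sample characters one at a time from the learned contexts."""
    rng = rng or random.Random(0)
    out = seed
    order = len(seed)
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:          # unseen context: stop early
            break
        out += rng.choice(choices)
    return out

# Tiny stand-in corpus of ABC-style note strings (folk-rnn's real
# corpus contained ~23,000 full tunes).
corpus = ["GABc dedB|dedB dedB|", "GABc dedB|c2ec B2dB|"]
model = train_markov(corpus)
tune = generate(model, seed="GAB")
```

A model this simple only captures surface statistics of the notation, which foreshadows Sturm's point later in the piece that such systems learn "very abstract representations" of the music rather than the tradition itself.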

Flow Machines, a five-year project funded by the European Research Council and coordinated by Sony's Computer Science Labs, also applied AI algorithms to music. Its most notable, and most bizarre, achievement is “Daddy's Car,” a song composed in the style of the Beatles.

Welcome Mistakes

Algorithms can mimic the style and feel of a musical genre, but they often make basic mistakes a human composer would not. In fact, most of the pieces played at folk-rnn's debut were tweaked by human musicians.

“It is not a well-defined problem, because you don't know what you want,” says Francois Pachet, who served as the lead researcher at Flow Machines and is now the director of Spotify's Creator Technology Research Lab. But, he adds cheerfully, “It's not good, it's not good art.”

The AI generated the lead sheet for “Daddy's Car”; the rest was added by hand. “There was pretty much a lot of AI in there, but not everything,” Pachet says, “including voice, lyrics and structure, and of course the whole mix and production.”

“The real benefit is coming up with plans that are not expected, and that lead to musically interesting ideas,” says Bob Sturm, a lecturer in digital media at Queen Mary University of London who worked on folk-rnn. “We want the system to create mistakes, but the right kind of mistakes.”

Daren Banarsë, an Irish musician who has played pieces composed by folk-rnn, attested to the benefits of interesting mistakes. “There was one reel which intrigued me,” he says. “The melody kept oscillating between major and minor, in a somewhat random fashion.”

Spotify's Pachet explains that these unexpected twists can actually help improve the quality of pop music. “Take the 30 or 50 most popular songs on YouTube. If you look at the melody, the harmony, the rhythm and the structure, they are extremely conventional, which is quite depressing. You have only three or four chords, and they're always the same.”

No Right Answers

“It's entirely subjective,” says Drew Silverstein, CEO and co-founder of Amper Music, an AI startup based in New York. “It's just different.”

“The challenge in the modern world is to build an AI that is capable of reflecting that subjectivity,” he adds. “Interestingly, sometimes, neural networks and purely data-driven approaches are not the right answer.”

Oded Ben-Tal, senior lecturer in music technology at Kingston University and a researcher on folk-rnn, points out another challenge.

“In some ways, you can say music is information. We listen to a lot of music, and as a composer, what I hear shapes the new music I write,” Ben-Tal says. “But the translation into data is a big stumbling block.”

To put it simply, an AI algorithm's interpretation and understanding of music and the arts is very different from that of humans.

“In the case of our system, it's far too easy to fall into the trap of saying it's learning the style or it's learning aspects of Irish music, when in fact it's not doing that,” says Sturm. “It's learning very abstract representations of this kind of music. And these abstract representations have very little to do with how you experience the music, how a composer puts them together in the context of this music within the tradition.”

“Humans are necessary in the pursuit because, at the end of the day, we have to make decisions on how to incorporate certain things that are produced by the computer, how we curate from this output and create new music,” Sturm says.

In the visual arts, the divide between the perception of humans and machines is even more accentuated. For instance, take DeepDream, an inside-out version of Google's super-efficient image-classification algorithm. When you give it a photo, it looks for familiar patterns and modifies the image to look more like the things it has identified. This can be useful for turning rough sketches into more advanced drawings, but it also yields unexpected results. If you provide DeepDream with an image of your face and it finds a pattern that looks like a dog, it'll turn part of your face into a dog.
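The mechanic behind DeepDream, gradient ascent on an image to amplify whatever a network already sees in it, can be sketched in miniature. The toy below substitutes a single hand-coded "feature detector" for a trained deep network, purely as an illustrative assumption:

```python
import numpy as np

def activation(image, feature):
    """Response of one 'neuron': correlation between image and a feature pattern."""
    return float(np.sum(image * feature))

def dream_step(image, feature, step=0.1):
    """One gradient-ascent step on the activation. For a dot product,
    d(activation)/d(image) = feature, so nudging the image along the
    feature makes the 'neuron' fire harder -- the same move DeepDream
    applies through many layers of a real network."""
    grad = feature
    image = image + step * grad / (np.abs(grad).max() + 1e-8)
    return np.clip(image, 0.0, 1.0)   # keep pixel values in range

rng = np.random.default_rng(0)
image = rng.random((8, 8))                            # stand-in "photo"
feature = np.zeros((8, 8)); feature[2:6, 2:6] = 1.0   # pattern the detector seeks

before = activation(image, feature)
for _ in range(5):
    image = dream_step(image, feature)
after = activation(image, feature)                    # strictly larger than before
```

After a few iterations the image contains more of the pattern the detector responds to, which is why DeepDream "sees dogs everywhere" in photos that contain no dogs at all.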

“It's almost like the neural net is hallucinating,” an artist who interned at Google's DeepMind AI lab said about the software in an interview with Wired last year. “It sees dogs everywhere!”

Still, AI-generated art can look stunning and can rake in thousands of dollars at auction. At a San Francisco art show held last year, paintings created with the help of Google's DeepDream sold for up to $8,000.

The Business of Creative AI

While researchers and scientists continue to explore creative AI, a handful of startups have already moved into the space. One is Silverstein's Amper Music, which he describes as a “composer, producer, and performer that creates unique, professional music.”

To create music with Amper, you specify the desired mood, length, and genre. The AI produces a basic composition in a few seconds that you can tweak and adjust. Amper also offers an application programming interface (API) for developers.
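Amper's actual API is not described here, so any endpoint or parameter names would be guesses. Purely to make the mood/length/genre workflow concrete, here is a hypothetical request shape, with every name invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class TrackRequest:
    """Hypothetical parameters a user supplies to a generative-music service."""
    mood: str            # e.g. "uplifting", "tense"
    genre: str           # e.g. "cinematic", "folk"
    length_seconds: int  # desired track duration

def validate(req: TrackRequest) -> bool:
    """Minimal sanity checks before such a request would be sent."""
    return bool(req.mood) and bool(req.genre) and 0 < req.length_seconds <= 600

req = TrackRequest(mood="uplifting", genre="cinematic", length_seconds=30)
ok = validate(req)
```

The point is the interaction model, not the names: the user states intent in a handful of coarse parameters, and the service returns a finished track to tweak.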

Jukedeck, a London-based startup created by two former Cambridge University students, provides a similar service. Like Amper, users provide Jukedeck with basic parameters, and it provides them with an original musical track.

The main customers of both companies are businesses that require “functional music,” the type used in ads, video games, presentations, and YouTube videos. Jukedeck has created more than 500,000 tracks for customers including Coca-Cola, Google, and London's Natural History Museum.

A third startup, Australia-based Popgun, is building an AI musician that can play music with humans. Named Alice, the AI listens to what you play and then responds instantly with a unique creation that fits with what you played.

In the visual arts industry, business use cases are gradually emerging. Last year, Adobe introduced Sensei, an AI platform aimed at improving human creativity. Sensei assists in a number of ways, such as automatically removing the background of photos or finding stock images based on the context of a poster or sketch.

Collaboration Between AI and Human Artists

Perhaps not surprisingly, these startups were founded and are managed by people with musical backgrounds. Amper's Silverstein studied music composition and theory at Vanderbilt University and has composed music for TV, films, and video games. Ed Newton-Rex, founder and CEO of Jukedeck, is also a practiced composer.

But not everyone is convinced of the positive role of artificial intelligence in the arts. Some of the attendees at folk-rnn's event described the AI-generated pieces as lacking in “spirit, emotion and passion.” Others expressed concern about the “cultural impact and the loss of the human beauty and understanding of music.”

“I have not come across a person who has not reacted with something close to the negative side of things,” said Úna Monaghan, a composer and researcher involved in folk-rnn who spoke to Inverse. “Their reactions have ranged from slightly negative to an outright 'why are you doing this?'”

The developers of creative AI algorithms see it differently. “I do not think humans will become redundant in music-making,” says Newton-Rex. “For a start, we are listening to careers.”


“We think of functional music as music that is valued for its use case and not for the creativity or collaboration that went into making it,” Silverstein says. Artistic music, Silverstein explains, is different: “Steven Spielberg and John Williams writing the score of Star Wars, that's about a human collaboration.”

“The key use cases we see lie in collaboration with musicians,” says Jack Nolan, co-founder of Popgun. “Artists can use Alice as a source of creative inspiration or to help them come up with melodies and chord progressions in their music. We think AI will help them do this, rather than replace them.”

Daren Banarsë agrees on the benefits of collaboration. “I always find it daunting when I have to start a large-scale composition. Maybe I could give the computer a few parameters: the number of players, the mood, even the names of some of my favorite composers, and it could generate a basic structure for me,” he says. “I would not expect it to work out of the box, but it would be a starting point. There might be a computer glitch or random quirk, which could take me in a completely unexpected direction.”

Ben-Tal admits that some jobs might be affected. “Working musicians will have to adapt,” he says. “I show this to my students and say, 'You need to up your game.'”

'Democratizing Creativity'

AI creativity can also help people without inherent talent or hard-earned skills. Take Vincent, an AI drawing platform that can transform rough sketches into professional-looking paintings, or the AI music platforms that create decent music with minimal input.

Jukedeck's Newton-Rex describes this as “democratizing” creativity. “People with less formal musical education can get to grips with the basics of music and use AI to help them make music,” he says.

Pachet concurs. He draws an analogy between recent AI developments and the arrival of the first digital synthesizers in the '80s, followed by digital samplers. At the time, there was a similar fear that musicians would lose their jobs to computers. “But what happened was the exact opposite, in a sense that everyone took these new machines and hardware and learned how to use them productively,” he says. “The music industry exploded, in some sense.”

“There will be more people doing music, and hopefully more interesting music,” he adds, reflecting on AI creativity. “I cannot predict the future, but I'm not worried about AI replacing artists. I'm worried about all the other things, the well-defined problems, like automated healthcare and autonomous vehicles. But for the creative domains, I do not think it's going to happen.”
