Thursday, August 07, 2014

When Robots Write Songs: Computers That Compose

The Atlantic
When Robots Write Songs
Bach, Coltrane, McCartney: New algorithms can produce original compositions in the style of the greats. But are those works actually art?

William Hochberg, Aug 7, 2014, 8:01 AM ET


When Pharrell Williams accepted five Grammy Awards this year on behalf of the French group Daft Punk, the duo were dressed as robots. This may have foreshadowed a coming invasion by real music robots from France.

Computer scientists in Paris and the U.S. are working on algorithms enabling computers to make up original fugues in the style of Bach, improvise jazz solos à la John Coltrane, or mash up the two into a hybrid never heard before.

“We are quite close now to [programming computers to] generate nice melodies in the style of pop composers such as Legrand or McCartney,” says François Pachet, who heads Sony’s Computer Science Lab in Paris.

The commercial applications of such efforts may include endless streams of original music in shopping malls that can respond to crying babies with soothing harmonies, as well as time-saving tools for busy composers. But the questions raised by computerized composition are more abstract—touching on the nature of music, art, emotion, and, well, humanity.

The music-bots analyze works by flesh-and-blood composers and then synthesize original output with many of the same distinguishing characteristics. “Every work of music contains a set of instructions for creating different but highly related replications of itself,” says David Cope, a computer scientist, composer, and author who began his “Experiments in Musical Intelligence” in 1981 as the result of a composer's block.
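What “analyze and then synthesize” can mean is easier to see with a toy example than with Cope’s actual software. The sketch below makes no claim about how his or Pachet’s systems work; it uses a made-up twelve-note melody, counts which notes tend to follow which, and then walks those counts to produce a new, related line, a simple Markov chain in a few lines of Python.

```python
# A toy illustration of analyze-then-synthesize: learn which notes tend to
# follow which in a source melody, then generate a new melody with the same
# local tendencies. This is a simple Markov chain, not Cope's or Pachet's
# actual system; the input melody is invented for the example.
import random
from collections import defaultdict

source_melody = ["C", "E", "G", "E", "F", "D", "G", "E", "C", "D", "E", "C"]

# Analyze: record which notes follow each note in the source.
transitions = defaultdict(list)
for current, following in zip(source_melody, source_melody[1:]):
    transitions[current].append(following)

# Synthesize: walk the learned transitions to produce a new, related melody.
def generate(start, length, seed=None):
    rng = random.Random(seed)
    note, melody = start, [start]
    for _ in range(length - 1):
        note = rng.choice(transitions[note]) if transitions[note] else start
        melody.append(note)
    return melody

print(generate("C", 12, seed=1))
```

Real systems model far more than note-to-note transitions, including rhythm, harmony, and phrase structure, but the analyze-then-synthesize loop has the same basic shape.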

“It’s truly impressive,” says jazz guitarist Pat Metheny, commenting on a track by a jazz-bot programmed by Pachet’s team to sound like sax legend Charlie “Bird” Parker blended with French composer Pierre Boulez. “I sent it to Chris Potter, the saxophone player in the band I am touring with right now, and asked him who the player was. He immediately started guessing people.”

The French robot that mashes up Parker and Boulez is a lot more advanced than most efforts at computer-penned music. Another jazz-bot, for instance, emulates Bill Evans with mixed results. Known for his heavenly flights of pianistic virtuosity, often while doped up on heroin, the classically trained Evans defined cool jazz on Miles Davis’s “Kind of Blue,” the best-selling jazz album ever. Sony’s Evans-bot sounds more like it’s doped up on a cocktail of Thorazine and Windows 8. The lush chord voicings and rushes of arpeggios are trademark Evans, but the ham-fisted dynamics and aimless melodies reveal that no one is home.

In 1950, World War II code-breaker and forefather of artificial intelligence Alan Turing introduced a blindfold test to see whether a computer could fool a human judge into believing it was another human. The test would determine, essentially, whether computers can “think.”

But can they swing? “I would submit that you can certainly make a computer swing,” says Brooklyn-based musician and technologist Eric Singer. “You can kind of jitter that swing a bit to make it sound more human.”
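For readers wondering what “jittering the swing” might look like under the hood, here is a small sketch, not drawn from Singer’s or anyone else’s actual code: straight eighth notes are re-timed so each off-beat lands late (a swing ratio), then every onset is nudged by a few milliseconds of random “human” jitter. The ratio and jitter amounts are purely illustrative.

```python
# Rough sketch of swing plus humanizing jitter. Onset times are in
# milliseconds; the swing ratio and jitter range are illustrative values.
import random

def swing_and_jitter(onsets_ms, beat_ms=500, swing=0.62, jitter_ms=8, seed=None):
    """Re-time note onsets with swing and a little random jitter.

    swing: fraction of the beat taken by the first eighth note
           (0.5 = straight, roughly 0.62-0.67 = typical jazz swing).
    """
    rng = random.Random(seed)
    swung = []
    for t in onsets_ms:
        beat_start = (t // beat_ms) * beat_ms
        offset = t - beat_start
        # Push off-beat eighth notes (halfway through the beat) later.
        if abs(offset - beat_ms / 2) < 1e-9:
            offset = beat_ms * swing
        # Nudge every onset by a few milliseconds either way.
        swung.append(beat_start + offset + rng.uniform(-jitter_ms, jitter_ms))
    return swung

# Eight straight eighth notes at 120 bpm (one beat every 500 ms).
straight = [i * 250 for i in range(8)]
print([round(t) for t in swing_and_jitter(straight, seed=3)])
```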

Singer helped devise a computerized band called the “Orchestrion” that Metheny recorded and toured with in lieu of live musicians in 2010. The orchestrion (also called a panharmonicon) was reportedly invented in 1805 by the musician, inventor, and, some said, swindler Johann Nepomuk Maelzel. Beethoven, a fan of early music tech, featured Maelzel’s bellows-powered musical automatons between symphonies at concerts in 1813.

David Cope has designed EMMY, an emulator whose name comes from the acronym of Cope’s “Experiments in Musical Intelligence” project at UC Santa Cruz and elsewhere. EMMY spools out miles of convincing music: Bach chorales, Mozart sonatas, Chopin mazurkas, Joplin rags, and even works in the style of her creator, Cope.

.....
Read the full article with multimedia examples HERE in the Atlantic.