Stupid federal grants finally go to something awesome: jazz-playing robots


It’s a depressing time to be a jazzhead like me.

Young people are largely ignoring jazz (The Bad Plus doesn’t count), and the few remaining greats are going to age out soon.

Vibraphone legend Bobby Hutcherson, whom I saw play in my early 20s, is now in his 70s and in poor health. Drummer Roy Haynes played Washington, D.C. on his 90th birthday last month, and while he’s still got chops (and an infectious grin), he could barely get around the stage or even put the microphone back on the stand.

Chances are you’ve been exposed to jazz mostly through NPR – whippersnapper pianist Brad Mehldau is a favorite of whoever’s programming the network’s interstitials.

But I have confidence that the art and soul of jazz will live on … in humanoids.

The Guardian reports that the Defense Advanced Research Projects Agency is handing a cool (sorry) $2 million to a University of Arizona professor, Kelland Thomas, whose field is machine learning – but whose side project is jazz performance. Put the two together and you get jazz-playing robots.

Thomas, associate director of UA’s School of Information, is leading a team that includes researchers at the University of Illinois at Urbana-Champaign and Oberlin College.

Their goal is audacious – nothing less than creating machines that can learn from Charlie Parker and Miles Davis and create their own original styles:

Thomas said that to jam effectively, machines are going to have to study data at scale and then synthesize it based on input from people – and learning to play jazz exercises all of those skills.

“We’re getting lots of video of musicians playing in front of a green screen together,” Thomas explained. “We’re going to build a database of musical transcription: every Miles Davis solo and every Louis Armstrong solo we’re going to hand-curate. We’re going to develop machine learning techniques to analyze these solos and find deeper relationships between the notes and the harmonies, and that will inform the system – that’ll be the knowledge base.”
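
Thomas doesn’t say which machine learning techniques the team will actually use, so purely as a back-of-the-napkin illustration: the “knowledge base” idea – hand-curated solo transcriptions mined for relationships between notes and harmonies – could start life as something as humble as a count of which note-and-chord event tends to follow which. The “transcriptions,” names and code below are invented for the example, not details from the project.

```python
from collections import defaultdict, Counter

# Toy, invented "transcriptions": each solo is a list of (note, chord) events.
# A real database would hold full hand-curated solos, not four-note snippets.
solos = {
    "miles_davis_solo": [("D4", "Dm7"), ("E4", "Dm7"), ("F4", "Dm7"), ("D4", "Dm7")],
    "louis_armstrong_solo": [("Bb4", "Eb7"), ("G4", "Eb7"), ("Eb4", "Ab"), ("C5", "Ab")],
}

def build_transition_model(solos):
    """Count how often one (note, chord) event follows another across all solos."""
    model = defaultdict(Counter)
    for events in solos.values():
        for prev, nxt in zip(events, events[1:]):
            model[prev][nxt] += 1
    return model

knowledge_base = build_transition_model(solos)
print(knowledge_base[("D4", "Dm7")])  # e.g. Counter({('E4', 'Dm7'): 1})
```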

That will be the back end, said Thomas. The front end will be a microphone that listens to the human musician playing an instrument and uses its vast repository of jazz solos to make decisions about what to play next. “We want to get to a point where it’s playing things back to the human performer that the human performer will recognize as collaborative.”
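
Same caveat applies to the front end: the real system listens through a microphone and would need pitch tracking and real-time scheduling on top, none of which is shown here. This toy respond() function just takes the human’s phrase as a list of (note, chord) events and picks a plausible continuation from a stand-in for the transition model sketched above – every name and number in it is hypothetical.

```python
import random
from collections import Counter

# Stand-in for the knowledge base built in the previous sketch.
knowledge_base = {
    ("E4", "Dm7"): Counter({("F4", "Dm7"): 2, ("D4", "Dm7"): 1}),
    ("F4", "Dm7"): Counter({("D4", "Dm7"): 1}),
}

def respond(model, heard, length=4):
    """Propose a short answering phrase seeded by the last event the human played."""
    if not heard:
        return []
    current = heard[-1]
    phrase = []
    for _ in range(length):
        options = model.get(current)
        if not options:
            break  # nothing learned after this event; end the phrase early
        events, weights = zip(*options.items())
        current = random.choices(events, weights=weights)[0]
        phrase.append(current)
    return phrase

# The "human" just played a two-note phrase over a D minor chord.
print(respond(knowledge_base, [("D4", "Dm7"), ("E4", "Dm7")]))
```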

We’ve covered plenty of silly federal grants to professors and researchers here at The College Fix, but to me this sounds like one with a genuinely useful real-world application – and not simply sustaining live jazz through the 21st century.


IMAGE: Shutterstock



