Week 12: Polyphony RNN

Dec 6, 2018

For this week's exploration of generating music with Magenta, I experimented with the Polyphony RNN model. Setting up the Magenta environment wasn't too difficult, and training the polyphony RNN and generating music from it worked much like the Melody RNN workflow we did in class. I considered using Paperspace to train my model, but the training time there looked about the same as running it locally, so I trained locally overnight over two days, which allowed me to train for over 20,000 steps in total. Being able to resume from the last checkpoint helped a lot: I could train little by little, gradually increasing the accuracy and decreasing the loss. The final trained model reached 20,465 global steps, with an accuracy of 0.8278694 and a loss of 0.59624904.
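For reference, training runs through Magenta's polyphony_rnn_train command, which resumes from the latest checkpoint in the run directory, and that is what let me accumulate steps across sessions. A rough sketch, with placeholder paths and the hyperparameters from the Magenta README rather than my exact settings:

```
# Train, resuming automatically from the latest checkpoint in --run_dir.
# Paths and hparams here are illustrative placeholders.
polyphony_rnn_train \
  --run_dir=/tmp/polyphony_rnn/logdir/run1 \
  --sequence_example_file=/tmp/polyphony_rnn/sequence_examples/training_poly_tracks.tfrecord \
  --hparams="batch_size=64,rnn_layer_sizes=[64,64]" \
  --num_training_steps=20000
```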

As for the MIDI files used to train the polyphony RNN model, I decided to take another shot at generating K-pop songs, but this time with far fewer songs, all in a similar hip-hop/R&B genre. I had a total of 26 MIDI files, transcribed with Piano Scribe from piano covers of songs that I found on the web. Because Piano Scribe transcribes audio files to MIDI automatically, the MIDI files were not as clean as I would have liked them to be.
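For completeness, this is roughly the prep pipeline that turns those MIDI files into training data; the directory names below are placeholders, not my actual paths:

```
# Convert the folder of transcribed MIDI files into a NoteSequences TFRecord.
convert_dir_to_note_sequences \
  --input_dir=kpop_midis/ \
  --output_file=/tmp/notesequences.tfrecord \
  --recursive

# Turn the NoteSequences into SequenceExamples for the polyphony model,
# holding out 10% of the data for evaluation.
polyphony_rnn_create_dataset \
  --input=/tmp/notesequences.tfrecord \
  --output_dir=/tmp/polyphony_rnn/sequence_examples \
  --eval_ratio=0.10
```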

After the model was trained, I tried different arguments to the generation command to explore how they changed the generated MIDI files.
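The base command looked roughly like this; the run directory, output path, and --num_outputs value are placeholders rather than my exact settings, and each playlist below varies only the primer and flag values:

```
# Generate from the trained checkpoint in --run_dir.
# The flags below match the first playlist's settings.
polyphony_rnn_generate \
  --run_dir=/tmp/polyphony_rnn/logdir/run1 \
  --output_dir=/tmp/polyphony_rnn/generated \
  --num_outputs=10 \
  --num_steps=128 \
  --primer_pitches="[60,64,67]" \
  --condition_on_primer=false \
  --inject_primer_during_generation=false
```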

Below are playlists of songs that were generated with different inputs.

Primer pitches [60, 64, 67], condition_on_primer=false, inject_primer_during_generation=false, 128 steps:

Primer pitches [60, 64, 67], condition_on_primer=true, inject_primer_during_generation=false, 128 steps:

Primer pitches [60, 64, 67], condition_on_primer=true, inject_primer_during_generation=true, 128 steps:

I also tried generating longer pieces of 2,048 steps, and this is one that I particularly liked:
