
Learning AI Poorly: Can Meta's AudioCraft Inspire You to Write a Song?

4 mins

(originally published on LinkedIn)

Last week, Meta released AudioCraft, which it describes as "Generative AI for audio made simple and available to all." It is an interesting project.

Last Saturday, while our team was waiting to hear who won,* I wound up talking to a friend whose company is trying to give their AI projects a human feel. She spoke of empathy, inspiration, and really any sort of emotional connection they can provide between their users and AI-generated content. It was an interesting conversation.

Is there any way to connect that interesting conversation with that interesting project? In my mind, music is a culturally universal art that has been evoking emotion in humans for millennia. Could I possibly use some product provided by Meta to drive me to record a new song? Sure! Why the hell not.

MusicGen is one of the three models that make up AudioCraft. It generates music from text prompts and was trained on Meta-owned and/or licensed audio. You can read the paper or, better yet, try it yourself: go to MusicGen, type a prompt, click "generate", wait a minute, and listen to what it made for you.

Since I had emotion on my mind, I typed, "a sad emo song with guitar and drums 100 bpm" (bpm is beats per minute; I was hoping it would honor the tempo, which would make recording a new version a bit easier. I think it wound up being about 110 bpm, but whatever.)
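If you'd rather skip the web demo, the AudioCraft library lets you run MusicGen from Python. Here's a minimal sketch using that same prompt; it assumes you've installed audiocraft, and the checkpoint name, clip duration, and output filename are just illustrative choices on my part, not necessarily what the demo uses behind the scenes:

```python
# Minimal sketch: generate a clip with the AudioCraft library instead of the web demo.
# Assumes `pip install audiocraft`; model choice and duration are illustrative.
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

# Load a pretrained MusicGen checkpoint (the "small" variant keeps memory needs modest).
model = MusicGen.get_pretrained('facebook/musicgen-small')

# Ask for roughly 30 seconds of audio.
model.set_generation_params(duration=30)

# The same kind of text prompt you'd type into the demo.
descriptions = ['a sad emo song with guitar and drums 100 bpm']

# generate() returns one waveform per description.
wav = model.generate(descriptions)

# Write each clip to disk as an audio file with loudness normalization.
for idx, one_wav in enumerate(wav):
    audio_write(f'musicgen_{idx}', one_wav.cpu(), model.sample_rate, strategy='loudness')
```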

Welp… That's what it made.** Since I only had an hour to write this, I didn't fiddle with prompts very much. Was it emo? No. Did it inspire me? Not at first. Nonetheless, it was time to record a new song. I fired up my DAW (digital audio workstation… most people would use GarageBand; I use the fancier version, Logic) and got to work.

The first thing I noticed was the bass line. It's kind of all over the place and plays some wacky notes. TBH, it was unlike anything I would normally play. It actually took me a second to figure out what it was doing. That's pretty cool.

Then there’s that guitar riff over the top… Not super interesting, but again, not something I would ever do. Inspiring? Maybe.

![meta-audiocraft-1.png]

I am not at all a drummer, and to stay with the AI/robot theme, I used a tool in Logic called a "virtual drummer" to put down a beat similar to what MusicGen created. It was OK. Instead of the stock drums, I used something called Superior Drummer 3, which makes crazy-real-sounding drums even though they are completely fake.

Along those lines, I did use a real guitar, but I ran it through an amp modeler that uses algorithms to emulate vintage guitar amps, and it does it really, really well. While recording, I also used plugins by Universal Audio that emulate ultra-high-end and rare analog recording gear. It truly is astounding how real fake things can sound in 2023.

Sadly, I couldn't spend a ton of time, so I just kind of recreated the short loop that MusicGen gave me. Also, I am not a talented audio engineer. Even with all that vintage gear, this is what I ended up with:

That's OK, though. The point was to see if some AI-generated crap could inspire or evoke emotion. Did it? Maybe, kind of, sort of. It did push me to record something I never would have written myself. Maybe someday I'll take some time to build up a complete song around it. Maybe I'll have ChatGPT write the lyrics. Maybe I'll wait long enough and the models will get good enough that I never have to write another song again. Who knows?!?!??

I’d say this AI thing, whatever it is, has some potential… Until next week!

\* We had a ton of fun, money was raised, and not only did we win but we upset someone's 10-year winning streak. It was a good weekend.

** I am downplaying the absolutely mind-boggling reality that a neural net can generate sound that is recognizable as decent music… That is straight-up insane.