Link to the source files: mr_dudeley.zip (43 MB)
Download audio: mr.-dudeley_r1.mp3 (2.2 MB)
This is a proof-of-concept demo for some music animation software I made. I wanted to do the same thing that Animusic did — animation driven by music — and when I learned that Blender had a Python scripting interface, I had all the tools I needed.
Why did I call this animation “Mr. Dudeley”? It’s the name of the 1-string fretless bassman guy, whom I obviously borrowed from “Stick Figures” from Animusic 1. The name popped into my head one day as I was working on this project. I guess it’s a combination of “dude”, “doodley”, and “dudel”, which is German for “music”.
The song came from an improvisation I did while messing around with a USB microphone and Audacity. I added the synth part in the middle because I wanted to showcase a bunch of different movement patterns. I edited the MIDI file in Rosegarden, and rendered the audio with Ardour, using the FluidR3-GM soundfont.
I wrote all of the Python code used to make this animation from the ground up (with a lot of help from the Blender Stack Exchange — this answer was especially helpful, as it formed the backbone of my script). I was surprised to find that most of the movement algorithms weren’t actually that complex; it was mostly a lot of linear scaling and time travel. The most complicated instrument is Mr. Dudeley, because of all the different things going on simultaneously with his animation, but the most complex formula was a bell curve used in the bongo player’s swaying motion.
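For anyone curious what a "bell curve" sway might look like in practice: the post doesn't give the actual formula, but a Gaussian is the usual choice, since it peaks at one moment and eases off smoothly on both sides. Here's a minimal sketch; the function name and parameters are mine, not anything from AniMIDI:

```python
import math

def sway_angle(t, peak_time, width, max_angle):
    """Bell-curve (Gaussian) easing for a swaying motion.

    The angle hits max_angle exactly at peak_time and falls off
    symmetrically on either side; `width` controls how fast.
    (Illustrative sketch -- not the actual AniMIDI formula.)
    """
    return max_angle * math.exp(-((t - peak_time) ** 2) / (2 * width ** 2))

# Sampling this once per frame gives a smooth lean-and-return,
# which is presumably why it suits a bongo player's sway.
```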
I call my code AniMIDI. Put simply, it’s a Python module with a lot of useful functions for animating automusically, but you have to write the code that actually produces the animation yourself. (I’ve provided some demo code, though, in the form of the .blend file for this animation.) The way I did it was by looping through the list of MIDI messages, and placing keyframes based on their pitch and timing. The trick is, the function I used to insert keyframes can take any value as its time parameter, so I can give it things like (message.time - 5), for instance. (That’s how my script accomplishes time travel.) And I don’t have to do any of the interpolation, because Blender does all that for me! I love Blender!
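The loop described above can be sketched in a few lines. This is not AniMIDI's actual API — the names (`Message`, `note_to_height`, `keyframes_for`) and the constants are mine, invented for illustration — but it shows the two ideas at work: linear scaling of pitch to position, and shifting the keyframe earlier in time so the motion anticipates the note:

```python
# Hypothetical sketch of the keyframe-placement loop described above.
# All names here are illustrative, not part of AniMIDI.
from dataclasses import dataclass

FPS = 24  # assumed frame rate

@dataclass
class Message:
    note: int    # MIDI pitch, 0-127
    time: float  # seconds from the start of the song

def note_to_height(note, lo=36, hi=84, scale=2.0):
    # Linear scaling: map a MIDI pitch range onto a 0..scale vertical range.
    return (note - lo) / (hi - lo) * scale

def keyframes_for(messages, anticipation=0.5):
    """One keyframe per note, shifted `anticipation` seconds earlier.

    The shift is the "time travel" trick: the hand starts moving
    before the note actually sounds, so it arrives on time.
    """
    frames = []
    for msg in messages:
        frame = round((msg.time - anticipation) * FPS)
        frames.append((frame, note_to_height(msg.note)))
    return frames

# Inside Blender, each (frame, value) pair would become something like:
#   obj.location.z = value
#   obj.keyframe_insert(data_path="location", index=2, frame=frame)
# and Blender interpolates between keyframes automatically.
```

Because Blender handles interpolation, the script only has to decide *where* and *when* the keyframes go; everything in between comes for free.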
The source files contain the .blend file for this animation, a copy of AniMIDI.py, and the MIDI file for the song, if anyone out there wants to take a look at it. If you do, please remember that all Python code contained in animidi.py, note_spread.py, and the .blend file is free to use, modify, or distribute for any purpose. All other content in the .blend file, and Mr_Dudeley_05.mid, are licensed under CC-BY-SA 4.0, except for the image textures “seamless-wood-background-1.jpg”, “seamless-wood-planks-2.jpg”, and “seamless-wood-planks-5.jpg”, which came from myfreetextures.com, and use that website’s license.
I’m so happy Blender enabled me to do this, and I have future plans for this code. I’d like to do an animation of “Axel F”, for example. I hope someone out there looks at my (probably terrible) code, and improves it, or uses it to make their own music video. If you do, please let me know! I’d love to see this code used to make the world a better place.
Software used: Blender, Rosegarden, Qsynth, Ardour.