Ever wondered what your tracks would sound like recorded in a trash can, on a bus or in Wembley Stadium? Welcome to the wonderful world of convolution reverb!
What is convolution reverb?
Convolution reverb is a process for digitally recreating the reverberation of a physical or virtual space. Unlike algorithmic reverb, which synthesises its effect from scratch, it lets you make your vocals or instruments sound as if they were recorded in a real space, and the sky's the limit in what's possible. You can now get realistic simulations of everywhere from the Sydney Opera House to tombs in the Great Pyramids of Egypt, and you can even make your own. It really is mind-blowing!
To make your own simulated reverb, you need to capture an impulse response. This involves playing a short, loud sound that excites the space (anything from a sweep tone to a starter pistol or a snare drum crack) and recording the result to capture how it reverberates. The software then processes this recording to recreate the unique behaviour of the space. A convolution reverb is therefore only ever as good as the impulse response you take.
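Under the hood, the plugin simply convolves your dry audio with the recorded impulse response. Here's a minimal sketch of that idea in Python with NumPy, using synthetic signals as stand-ins for real recordings (a real IR would be captured in the space itself):

```python
import numpy as np

sample_rate = 44100

# Dry signal: a single click, a stand-in for a snare drum crack.
dry = np.zeros(sample_rate // 10)
dry[0] = 1.0

# Impulse response: exponentially decaying noise, a crude model of a
# room's reverb tail (hypothetical; a real IR is recorded on location).
t = np.arange(sample_rate // 2)
rng = np.random.default_rng(0)
ir = rng.standard_normal(len(t)) * np.exp(-t / 8000.0)

# Convolving the dry signal with the IR "places" it in the room.
wet = np.convolve(dry, ir)

# The wet signal is longer than the dry one: the reverb tail rings on
# for len(ir) - 1 samples after the dry signal ends.
print(len(wet) == len(dry) + len(ir) - 1)
```

Because the dry signal here is a perfect unit impulse, the wet output is just the impulse response itself followed by silence, which is exactly why a clean, loud, broadband excitation makes for a good IR recording. Real plugins use FFT-based convolution (e.g. `scipy.signal.fftconvolve`) for speed, but the maths is the same.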
One of the most popular and well-respected convolution reverb plugins is Altiverb. It's not cheap, but if you've got the budget it's a really powerful tool. It comes with lots of amazing spaces from all over the world, and it also has a 'chaos' function that's really fun to play with! Ableton recently introduced an awesome Convolution Reverb device in Ableton Live 9, and Logic Pro X has Space Designer. There are even powerful free convolution plugins, such as SIR1 and Reverberate LE, that perform really well.
So why not download a convolution reverb plugin and get experimenting? It’s a lot of fun hearing how your track would sound in some of the world’s best venues and a great way to learn about room acoustics.
By Joe Scarffe