Part 4. Routing and mixing
So far, we’ve been working on a set of rather isolated code components that produce audio using structurally different approaches: physical modeling (through a “containerized” SuperColliderAU instance), PCM sample playback (via RomplerGun), and subtractive synthesis (via SynthKick).
However, if you remember the initial set of requirements we listed for MembrainSC, I mentioned flexible mixing and matching options, which could make this VST an interesting hybrid drum machine, rather than one more 909 clone.
So, in practical terms, how do I mix the sampled and synthesized components of a kick drum and send the result to a specific output channel of the plugin? The same goes for a blend of modeled and sampled snares, which should get its own channel on the plugin’s output bus.
To get a grip on that, let’s explore JUCE’s mixing and routing possibilities.
juce::AudioSource
AudioSource is the base class for objects that can produce a continuous stream of audio. The main point of interest for us here is a method called getNextAudioBlock(), which fills the audio buffer. For our purposes, we will use this class as a wrapper around our previously built audio components.
Consider this: we pass our sound-producing object to the AudioSource wrapper’s constructor, and the wrapper then takes care of filling the buffer. After the buffer is filled, we apply a gain level to it. The default is 0.5 (which is convenient for the common case of two audio sources mixed at equal levels). We will also have a separate method to set the gain.
#include "RomplerGunAudioSource.h"
RomplerGunAudioSource::RomplerGunAudioSource(RomplerGun& rg) : romplerGun(rg) {
}
void RomplerGunAudioSource::getNextAudioBlock (const AudioSourceChannelInfo& channelInfo)
{
AudioSampleBuffer* outBuffer = channelInfo.buffer;
renderingBuffer.clear ();
romplerGun.renderNextBlock(*outBuffer, *midiMessages, 0, channelInfo.numSamples);
outBuffer->applyGain(volume);
}
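For reference, here is a minimal sketch of what the corresponding header could look like. This is an assumption rather than the exact class from the project: the member names (romplerGun, midiMessages, volume) are inferred from the implementation above, and setVolume()/passMIDIMessages() are the setters used later in the rendering loop.

// RomplerGunAudioSource.h — a sketch; member names are assumptions
#pragma once

#include <JuceHeader.h>
#include "RomplerGun.h"

class RomplerGunAudioSource : public AudioSource
{
public:
    explicit RomplerGunAudioSource (RomplerGun& rg);

    void prepareToPlay (int samplesPerBlockExpected, double sampleRate) override {}
    void releaseResources() override {}
    void getNextAudioBlock (const AudioSourceChannelInfo& channelInfo) override;

    // Gain applied after rendering; 0.5 by default so two sources mix at equal levels
    void setVolume (float newVolume) { volume = newVolume; }

    // Stores a pointer to the host's MIDI buffer for the current block
    void passMIDIMessages (MidiBuffer& messages);

private:
    RomplerGun& romplerGun;
    MidiBuffer* midiMessages = nullptr;
    float volume = 0.5f;
};

The other wrappers referenced later (blipBlopKickAudioSource, scHiHatAudioSource, and so on) follow the same pattern.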
This way, we gain the ability to mix them down into one stereo output, using another JUCE class described below.
NOTE: for unknown reasons (possibly, poor C++ knowledge), it does not work when I call the plugin instance’s prepareToPlay() method inside the AudioSource wrapper. Maybe it misbehaves because I actually invoke prepareToPlay() multiple times, so I assume I should move the preparation from the PluginProcessor initialization into those AudioSource descendants, rather than repeating the invocation there.
juce::MixerAudioSource
Once we’ve got our wrappers ready, JUCE has a convenient way to virtually “plug” them into an audio mixer. Just create an instance of MixerAudioSource and add your audio sources as its inputs.
kickMix.addInputSource(&romplerGunKickAudioSource, false);
kickMix.addInputSource(&blipBlopKickAudioSource, false);
NOTE: for some other unknown reason (possibly, substance abuse), I was only able to get it working by setting the deleteWhenRemoved flag to false; otherwise the software crashed on exit. I guess this now causes a memory leak, which would need to be fixed.
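For context, here is a rough sketch of one way the preparation mentioned in the earlier note could be handled, assuming kickMix (and hhMix for the hi-hats) are MixerAudioSource members of the processor with the wrapper sources already added as inputs:

void MembrainScAudioProcessor::prepareToPlay (double sampleRate, int samplesPerBlock)
{
    // MixerAudioSource forwards prepareToPlay() to every input added via
    // addInputSource(), so preparing the mixer also prepares the wrapped sources.
    kickMix.prepareToPlay (samplesPerBlock, sampleRate);
    hhMix.prepareToPlay (samplesPerBlock, sampleRate);
}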
JUCE Output Bus Setup
Thanks to MixerAudioSource, we are now able to make hybrid drum sounds by mixing different sources together, like a simple synthesized blip kick and a kick drum PCM sample. We will use a similar approach for the snare drum, where we mix the audio output of a vibrating physical membrane with a PCM sample as well. So, the last bit to be handled is routing them to separate VST output channels. Luckily, JUCE’s API for bus routing is not that complicated either.
You declare the plugin’s output buses in the processor constructor’s BusesProperties initializer (guarded by the JucePlugin_PreferredChannelConfigurations macro):
MembrainScAudioProcessor::MembrainScAudioProcessor()
#ifndef JucePlugin_PreferredChannelConfigurations
    : AudioProcessor (BusesProperties()
                  #if ! JucePlugin_IsMidiEffect
                   #if ! JucePlugin_IsSynth
                      .withInput  ("Input", AudioChannelSet::stereo(), true)
                   #endif
                      .withOutput ("SD", AudioChannelSet::stereo(), true)
                      .withOutput ("BD", AudioChannelSet::stereo(), true)
                      .withOutput ("HH", AudioChannelSet::stereo(), true)
                  #endif
                      )
#endif
{
}
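Depending on the host and plugin format, you may also want to tell JUCE which bus layouts are acceptable. Below is a minimal sketch, assuming we simply insist that every output bus stays stereo; the Projucer-generated isBusesLayoutSupported() override is the usual place for this kind of check.

#ifndef JucePlugin_PreferredChannelConfigurations
bool MembrainScAudioProcessor::isBusesLayoutSupported (const BusesLayout& layouts) const
{
    // Accept the layout only if every output bus is plain stereo
    for (const auto& outputSet : layouts.outputBuses)
        if (outputSet != AudioChannelSet::stereo())
            return false;

    return true;
}
#endif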
Then you just iterate over the output buses of your VST instance inside the rendering loop, and use getBusBuffer() to obtain a dedicated audio buffer for each stereo bus.
NOTE: some audio sources, like JUCE’s built-in sample player, explicitly require a MIDI event. Contrary to the previous tasks we faced in this chapter, creating an arbitrary MIDI event in JUCE is a bit awkward, because it has no special high-level builder for that, and I really want to save my (poor) bit arithmetic skills for the situations where they are really appropriate. So, I’m just using some boilerplate code to pass the MIDI event around from the main loop.
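For what it’s worth, that boilerplate can be as small as storing a pointer to the MidiBuffer the host hands to processBlock(). A sketch, using the hypothetical passMIDIMessages() setter from the wrapper sketch above:

// The wrapper just remembers the host's MIDI buffer for the current block,
// so renderNextBlock() can see the incoming note-ons. Must be called before
// the mixer pulls the next block.
void RomplerGunAudioSource::passMIDIMessages (MidiBuffer& messages)
{
    midiMessages = &messages;
}

The hi-hat branch of the rendering loop below does exactly this right before pulling the block from hhMix.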
auto busCount = getBusCount (false);

for (auto busNr = 0; busNr < busCount; ++busNr)
{
    auto audioBusBuffer = getBusBuffer (buffer, false, busNr);

    if (busNr == 0)
    {
        // Kick bus: blend of the sampled and the synthesized kick via kickMix
        handleKickParams (midiMessages);
        AudioSourceChannelInfo info (audioBusBuffer);
        kickMix.getNextAudioBlock (info);
    }
    else if (busNr == 1)
    {
        // Snare bus: handled directly by the scSnare processor
        scSnare->processBlock (audioBusBuffer, midiMessages);
    }
    else
    {
        // Hi-hat bus: crossfade between the sampled and the modeled hi-hat
        AudioSourceChannelInfo info (audioBusBuffer);
        float mix = *paramsHub.getRawParameterValue (IDs::hihatMix);

        romplerGunHHAudioSource.setVolume (mix);
        romplerGunHHAudioSource.passMIDIMessages (midiMessages);

        scHiHatAudioSource->setVolume (1 - mix);
        scHiHatAudioSource->passMIDIMessages (midiMessages);

        hhMix.getNextAudioBlock (info);
    }
}