In April 2011, Benjamin Hudarew - one of our developers - and I met up with Rolf Wöhrmann at Musikmesse Frankfurt.
My goal was to meet the guy who makes Nlog - my favorite synth app on iOS - and to ask him what he thought of a possible collaboration between him and us (Audanika). We had a snack and talked about iOS development, GPU performance, and integrating a smaller version of Nlog into our application (SoundPrism).
We came to the conclusion that this probably wasn't so easy to do and that it would be better if we somehow figured out a way to connect Nlog with SoundPrism: Nlog would be running in the background and SoundPrism in the foreground - multitasking had been introduced with iOS 4.0 in 2010.
It’s August 2011 and… it works.
We’re currently working on performance and usability tweaks but it works. In fact, I’m playing with a beta version of SoundPrism Pro on my iPad 2 right now and I’m triggering a beta version of Nlog Synth Pro while it’s sitting in the background. It’s so much fun.
We’re going to submit the binaries to Apple soon and then something new is going to happen.
Two completely different applications - a MIDI controller and a sound-generating app - are going to work together at the same time on an iOS device. In fact, I've already run two sound-generating apps in the background and controlled them both with different parts of SoundPrism.
This is true multitasking.
And we’re doing it in a way that can easily be duplicated by other apps. It’s not breaking any rules. We’re not using any undocumented features. It’s just CoreMIDI done right. And we’re not doing it in a way that limits it to SoundPrism and Nlog. Anyone can join in. We could have added some features that made the communication and integration more proprietary but we didn’t.
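For developers wondering what "CoreMIDI done right" means in practice: the trick is to publish a virtual MIDI source that every CoreMIDI-aware app on the device can discover and connect to, instead of building a private channel between two specific apps. Here is a minimal sketch of the idea in Swift - the client and endpoint names are made up for illustration, and the original 2011 apps were of course written against the C API:

```swift
import CoreMIDI

// Create a MIDI client and publish a virtual source. Any CoreMIDI-aware
// app running on the same device can now discover and listen to it.
var client = MIDIClientRef()
MIDIClientCreate("MyControllerApp" as CFString, nil, nil, &client)

var source = MIDIEndpointRef()
MIDISourceCreate(client, "MyControllerApp Out" as CFString, &source)

// Send a note-on for middle C on channel 1 through the virtual source.
var packet = MIDIPacket()
packet.timeStamp = 0
packet.length = 3
packet.data.0 = 0x90 // status byte: note-on, channel 1
packet.data.1 = 60   // note number: middle C
packet.data.2 = 100  // velocity
var packetList = MIDIPacketList(numPackets: 1, packet: packet)
MIDIReceived(source, &packetList)
```

A synth app does the mirror image: it enumerates the available sources, connects an input port to them, and renders audio while running in the background.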
This is our statement. Copy what we’re doing. Follow us. We’re not going to make you sign anything. No NDA. No licensing agreement.
For our users this means that with every app that follows our example, the possibilities multiply.
Got four synth apps and two controllers? You've got eight ways of making music (4 × 2). Add a controller and you've got twelve combinations - and that's if you only ever combine one synth with one controller at a time. With future generations of iOS devices (the iPad 3 comes to mind), running 3 or 4 or maybe more apps at the same time without encountering performance issues is going to be possible.
And you can bet on us making sure that it’s going to be as easy as starting your iPad and checking your mail.
More playing around with SoundPrism Pro and Nlog Pro (both beta versions). SoundPrism Pro is used to control Nlog while both apps are running on the same device. Coming to an iOS Device near you soon (TM).
Originally, SoundPrism was designed not as a melodic but as a harmonic instrument. Strangely enough, we found out after building it that (melodic) scales can easily be played as well if you use the correct fingering.
The following video shows my favorite fingering to play such scales. If anyone out there knows an even better one, please leave a video response.
If you like this video, you might also want to check out:
The harmonic minor scale is the most commonly used derivative of the diatonic scale. The third of the dominant is raised by one semitone, so that a so-called leading tone arises. In A minor, for example, the G becomes a G♯ that leads up to the tonic A. This non-diatonic leading tone creates additional musical tension, which gives the scale an oriental sound.
SoundPrism Fingerings: Playing more precise arpeggios
by Gabriel Gatzsche
When we started to develop SoundPrism, our first intention was to create a musical instrument that helps people create great music without practicing. In the meantime we know more: SoundPrism is a full-fledged musical instrument that should be practiced like any other instrument. The more you practice, the better the musical outcome. The following video shows a fingering that gives you more control over arpeggios played with SoundPrism.
This blog post shows how to use Cubase, Omnisphere and SoundPrism Pro together. I am going to start from scratch, so people who are unfamiliar with Omnisphere and Cubase should be able to follow. To demonstrate this, I am using a modified pad and a lead guitar patch from Spectrasonics' Omnisphere.
My goal was to create a track with a melodic lead sound, an accompanying pad sound in the background and an arpeggiated bass as the foundation. The volume of the pad sound will be controlled via the x-axis tilt of my iOS device. The filter characteristic of the lead guitar will be controlled via the x-axis and the vibrato via the y-axis of my iPad.
My DAW of choice is Cubase. I chose Omnisphere's "Broken Amp Lead" as the lead patch and "Adagio String Example" as the pad. The bass is the Trilian preset "Atlas Sphere".
One way to instantiate these VSTs in Cubase is to create so-called "instrument tracks". But these cannot receive data on different MIDI channels, so I suggest using Cubase's instrument rack instead. The following video shows how to do that.
Video 2: Setting up the Cubase Production / Loading the VSTs
Setting up MIDI channel filters in Cubase
To have the ChordSection and the BassSection play different instruments, different MIDI channels have to be assigned to the chosen patches. I decided to assign MIDI channel 1 to the pad, channel 2 to the lead and channel 3 to the bass sound. To route the correct channels to the correct tracks in Cubase, a MIDI filter was set up as well.
Video 3: Setting up MIDI channel filters in Cubase
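A bit of background for readers new to MIDI, since channel filtering is what makes this whole routing work: the channel is not a separate field but lives in the low nibble of every status byte. A hypothetical helper (not code from SoundPrism) to illustrate:

```swift
// The MIDI channel (1-16) is encoded in the low nibble of the status byte:
// note-on is 0x90 on channel 1, 0x91 on channel 2, 0x92 on channel 3, ...
func noteOnBytes(channel: UInt8, note: UInt8, velocity: UInt8) -> [UInt8] {
    precondition((1...16).contains(channel), "MIDI channels are numbered 1-16")
    return [0x90 | (channel - 1), note, velocity]
}

// The bass on channel 3 playing a low A:
let bassNote = noteOnBytes(channel: 3, note: 45, velocity: 100) // [0x92, 45, 100]
```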
The next step was to set up SoundPrism Pro in such a way that the BassSection sends on MIDI channel 3 and the ChordSection sends on MIDI channel 1.
Video 4: Setting up MIDI channels in SoundPrism Pro
Configuring the SoundPrism Expression module
Once the MIDI channels are configured, I begin to set up SoundPrism Pro's expression module. This module transmits the tilt of the iOS device via MIDI controller changes. Synthesizers can use that information to change parameters of the patch.
The expression module was configured to send MIDI CC 1 (mod wheel) when the x-axis tilt of the iOS device changes. Additionally, the minimum and the maximum tilt have to be configured: the minimum tilt is the tilt at which a controller value of 0 is transmitted, and the maximum tilt is the tilt at which a controller value of 127 is transmitted.
Video 5: Setting up the expression module in SoundPrism Pro
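The mapping described above is essentially a linear interpolation between the two configured tilt angles, clamped to the 7-bit controller range. A small sketch of the idea - the function name and the use of degrees are my assumptions, not SoundPrism's actual code:

```swift
import Foundation

// Maps a tilt angle to a MIDI controller value: minTilt -> 0, maxTilt -> 127.
// Angles outside the configured range are clamped to the endpoints.
func controllerValue(tilt: Double, minTilt: Double, maxTilt: Double) -> UInt8 {
    let clamped = min(max(tilt, minTilt), maxTilt)
    let normalized = (clamped - minTilt) / (maxTilt - minTilt)
    return UInt8((normalized * 127.0).rounded())
}

// Example: with a configured range of 0 to 60 degrees,
// a 30-degree tilt sends a controller value of 64.
let value = controllerValue(tilt: 30, minTilt: 0, maxTilt: 60)
```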
Making the Pad Patch’s Volume listen to tilt changes
To map the wheel controller's MIDI CC changes to the volume setting of the patch, I have to open Omnisphere's modulation page. I add "Wheel" as an additional source and then assign "Amp/Amplitude" as the target. I also configure "Depth" and "Target Parameter" as shown in the next video.
Video 6: Mapping MIDI wheel controller changes to the volume of the pad sound
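Conceptually, such a modulation routing attenuates the target parameter by the normalized source value, scaled by the configured depth. The following is only a rough model of what a routing like this does, not Omnisphere's actual engine:

```swift
// Generic modulation routing: the mod wheel (CC 1, 0...127) attenuates the
// patch's amplitude. With depth = 1 the wheel sweeps the volume from silence
// to the full patch level; with depth = 0 the wheel has no effect.
func modulatedAmplitude(patchLevel: Double, wheel: UInt8, depth: Double) -> Double {
    let source = Double(wheel) / 127.0          // normalized mod-wheel position
    let attenuation = 1.0 - depth * (1.0 - source)
    return patchLevel * attenuation
}
```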
Configuring the vibrato and filter resonance of the lead sound
Before I start to configure the lead sound, I change the MIDI channel for the ChordSection from 1 to 2. Additionally, the expression module is configured to send the y-axis tilt via MIDI CC 4 (foot controller).
Video 7: Reconfiguring SoundPrism Pro's MIDI channel assignment and expression module
To create the vibrato for the lead sound, I'm using an LFO, which changes the pitch of the sound periodically. The foot controller events sent by SoundPrism are mapped to both the rate and the depth of the LFO, so tilting the iOS device increases the vibrato.
Video 8: Setting up the vibrato
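To make the routing concrete: a vibrato LFO produces a periodic pitch offset, and here the incoming foot-controller value drives both its rate and its depth. This is an illustrative model only - the maximum rate and depth are assumed values, not Omnisphere's internals:

```swift
import Foundation

// A toy vibrato LFO: the foot-controller value (CC 4, 0...127) scales both
// how fast and how deep the pitch wobbles, so more tilt means more vibrato.
func vibratoOffset(atTime t: Double, cc: UInt8,
                   maxRateHz: Double = 6.0,
                   maxDepthSemitones: Double = 0.5) -> Double {
    let amount = Double(cc) / 127.0            // normalize the controller value
    let rate = amount * maxRateHz              // LFO speed in Hz
    let depth = amount * maxDepthSemitones     // LFO depth in semitones
    return depth * sin(2.0 * .pi * rate * t)   // pitch offset at time t
}
```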
Using the expression module is a completely new experience. I was able to modulate the vibrato and the resonance of the sounds while playing the bass and the solo lead. I have tried to do this with my keyboard, using an expression pedal and aftertouch, but it is so much harder. With the iPad I felt much more in control, mainly because I could define gestures that perfectly fit my own body. To summarize: being able to define your own gestures and to control the timbre of the sound while playing the actual notes adds a completely new musical dimension to SoundPrism Pro.