The annual question... Who is using Cubase Expression Maps?

Any chance you can elaborate on what you figured out?

I never understood what the second, third, and fourth articulation columns meant. Previously I was using expression maps in a simple way: just create an entry for each articulation I wanted to cover and enter the respective name in the Art. 1 column (spicc, stacc, long, etc.).

I wasn't aware that you could use Art. 1-4 simultaneously and that they could represent different categories or attributes. Admittedly, that's not all that interesting for sample libraries with a simple layout where you have your spicc, stacc, long, legato, marcato, pizz, Bartók pizz, col legno, tremolo, and that's it. You don't really need any more precise specification.

But for very complex libraries like Dimension Strings, you just need a more intelligent approach to organize this stuff. Or things like Synchron Strings, where you have all these variations - the longs can be soft attack, normal attack, or marcato, but also no vibrato, light, strong, or xfade, etc. That's a lot of parameters that, if you wanted to cover all variations of long notes, would absolutely clutter up your expression map.

I realized that using the other articulation columns, I could further specify: long - which type of attack? - which type of vibrato? etc. In the MIDI editor itself, it just says "long", but under the list of articulations there are now additional lanes where I can specify which type of attack and vibrato I want for that note. Since these attributes are also available for legatos, it works the same: I pick "legato" and specify the rest below. Since tremolo can have a marcato as well, the same applies. This way, I can reduce the number of lanes in the articulation list while actually covering more specifics.
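
As a back-of-the-envelope illustration of why this pays off, here is a small Python sketch using the Synchron Strings long-note variants mentioned above (the assignment of attacks to Art. 2 and vibrato to Art. 3 is my assumption, just for illustration): covering every combination as a flat slot multiplies the lane count, while attribute columns only add.

    from itertools import product

    attacks  = ["soft", "normal", "marcato"]          # e.g. Art. 2 values
    vibratos = ["none", "light", "strong", "xfade"]   # e.g. Art. 3 values

    # Flat approach: one articulation lane per combination of long-note variants.
    flat = [f"long {a} {v}" for a, v in product(attacks, vibratos)]
    print(len(flat))                          # 12 lanes just for the longs

    # Attribute approach: one lane for "long" plus one lane per attribute value.
    print(1 + len(attacks) + len(vibratos))   # 8 lanes, reusable for legato, tremolo, ...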

Another example could be Big Bang Orchestra Andromeda, where every single articulation can be senza piccolo or con piccolo. The best way to handle this is with a sort of global "switch", so Art. 2 is perfect for this. The same could be used for global things like normal or muted, etc.
 
Yeah, this kind of behaviour is totally unreliable and unacceptable. I would like to know if this ever happens with Tom's @Real JXL template, as he's a guy who will surely have a bulletproof template.
As for issues related to Attributes: usually it's the instruments that act up, not Cubase.
 
My workflow:

1. Build your patches from scratch.
2. Use Attributes.
3. The first slot must be empty.
4. Use the same keyswitches for all libraries (A0 is always sustain, for every instrument).
5. Transpose your (low) instruments by 12 or 24 semitones; the keyswitches won't be affected (see the sketch after this list).
6. You don't need the articulation lane; different colors for the events should do the job.
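
For point 5, here is a minimal Python sketch of the idea. The keyswitch range is an assumption derived from point 4, where A0 (MIDI note 21) is always sustain: performance notes get transposed, keyswitch notes pass through untouched.

    # Assumption: keyswitches live at the bottom of the keyboard, at or below B0.
    KEYSWITCH_MAX = 23  # B0 = MIDI note 23; A0 (sustain) = 21

    def transpose(note: int, semitones: int = 12) -> int:
        """Transpose a performance note; leave keyswitches alone."""
        if note <= KEYSWITCH_MAX:
            return note          # keyswitch: unaffected by the transposition
        return note + semitones  # performance note: shifted by 12 or 24

    # A low contrabass line played an octave up, while A0 still means sustain:
    print([transpose(n, 12) for n in (21, 36, 43)])  # -> [21, 48, 55]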
 
My workflow:

1. Build your patches from scratch.
2. Use Attributes.
3. The first slot must be empty.
4. Use the same keyswitches for all libraries (A0 is always sustain, for every instrument).
5. Transpose your (low) instruments by 12 or 24 semitones; the keyswitches won't be affected.
6. You don't need the articulation lane; different colors for the events should do the job.

Why must the first slot be empty? I remember a few years ago I made expression maps for 8Dio Adagio and Agitato, and I remember I had to do that. It’s some kind of bug, right? And if so, why hasn’t this been fixed yet? I haven’t noticed it because I don’t use Expression Maps, and I thought it would have been fixed by now.
 
Also, regarding the workflow of expression maps, is it fair to say that the workflow is to use something like a sustain patch, play the part, and then replace the articulations with the ones you want? Then, once they have been selected and all the Attribute data drawn in, you have to tweak any MIDI data accordingly to make everything flow and sit at the correct levels, etc.

Also, as an example, are people using Spitfire Time Machine articulations mixed with standard articulations? If so, I’m guessing they are bringing up the corresponding CC lane to tweak the Time Machine stretch functions of the patch. I usually have TM patches grouped together on separate tracks, but this is definitely a different workflow, so I’m curious what people are doing.
 
I have worked with them since they were introduced, which made my workflow for orchestral music dramatically more efficient than working with keyswitches.
I always try to program fairly universal maps that I can use the same way across as many different instruments as possible. That may require programming my own customized patch presets where possible, but it ultimately makes the maps easier to use and the results more interchangeable between different tracks.
Since I currently tend to combine different sample libraries, this has become more difficult and I am forced to use more maps, but I would still never switch to a DAW without a comparable function.
I mostly assign the articulations to a selected MIDI event via the Cubase Info Line, which makes programming detailed, "articulated" orchestral music so much easier than it has ever been before.
 
Also, regarding the workflow of expression maps, is it fair to say that the workflow is to use something like a sustain patch, play the part, and then replace the articulations with the ones you want? Then, once they have been selected and all the Attribute data drawn in, you have to tweak any MIDI data accordingly to make everything flow and sit at the correct levels, etc.

Your expression map will work like a keyswitch, so you don't have to play it in with the first articulation in the EM list. Either choose the most appropriate one, or change keyswitches while you play. The EM will know which articulation/keyswitch you are using and apply it to the notes you record.
 
Why must the first slot be empty?

I found this related discussion: https://www.steinberg.net/forums/viewtopic.php?t=86999

I was noticing this behavior last night and it was annoying me too. Steinberg must have some reason for it, but it is rather annoying. The bottom line is that your first slot should either be blank or have some kind of default "home" articulation in it. Whenever you hit stop, it sounds like the expression map pops back to the first slot. If it's empty, then no keyswitches will be sent, but channelizing will still follow that first slot. So that will be somewhat predictable.

The thing I found weird, and that threw me off, is that when I hit stop, the keyswitch for the first slot is sent. Then when I hit play, if the first note also uses that slot, Cubase doesn't send the keyswitch again (because it doesn't need to; it was sent when I hit stop).

The bummer here is that if you hit stop, no keyswitch will be sent; but if the last performed articulation required channelization, you won't be able to play your keyboard and hear that channelized articulation until you engage it with a keyswitch again.
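
To make the described stop/play behavior concrete, here is a minimal Python model of it. This is only an illustration of the behavior observed above, not Steinberg's actual implementation.

    class ExpressionMapModel:
        """Models the reported slot reset on stop and keyswitch suppression on play."""

        def __init__(self, slots):
            self.slots = slots    # slots[0] may be None (an empty first slot)
            self.current = None   # the keyswitch that was last sent

        def _send(self, ks):
            if ks is not None and ks != self.current:
                print(f"keyswitch sent: {ks}")
            self.current = ks

        def stop(self):
            # On stop, the map pops back to the first slot and sends its
            # keyswitch -- or sends nothing at all if that slot is empty.
            self._send(self.slots[0])

        def play(self, slot_index):
            # On play, the keyswitch is only sent if it differs from the one
            # already active (e.g. the one that went out on stop).
            self._send(self.slots[slot_index])

    em = ExpressionMapModel([None, "A0 (sustain)", "B0 (staccato)"])
    em.play(2)  # keyswitch sent: B0 (staccato)
    em.stop()   # empty first slot: nothing sent, but the state resets
    em.play(2)  # keyswitch sent: B0 (staccato) -- it has to be re-engaged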
 
Your expression map will work like a keyswitch, so you don't have to play it in with the first articulation in the EM list. Either choose the most appropriate one, or change keyswitches while you play. The EM will know which articulation/keyswitch you are using and apply it to the notes you record.

And what about using track offset? Different articulations have different start times. This is easy to adjust when they're on separate tracks, etc.
 
I use one expression map for all my VSTs. Simple and quick. Using Attributes is the key. I just select which MIDI note I want to be what; Attributes don't use the lanes. Just grab notes and set them to what you want. In my opinion it's much easier to just move my keyswitches into my expression map.
 
And what about using track offset? Different articulations have different start times. This is easy to adjust when they're on separate tracks, etc.

Not currently supported.

Here are a couple of things not currently handled by Expression Maps:

  1. Timing offsets per articulation, in order to compensate for slow attack times.

  2. CCs, pitch bend, and aftertouch are not channelized along with notes when using a channelizing expression map.

  3. Easy layering of an articulation to more than one channel (with potentially different keyswitches per layer, etc.).
I think layering can probably be achieved with VE Pro and Expression Map channelizing, but the other two need more help in Cubase.
 
I guess the solution I would personally use to get around not being able to use track offset (which is hugely important) would be to create macro commands set to nudge MIDI data by set values and put them on a touch screen. For example, -50 ms is a good start for Spitfire SSO string shorts. Highlight the notes, press the macro button, and they nudge back. Manually. Hmmm.
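
A rough Python sketch of what such per-articulation nudging amounts to (only the -50 ms spiccato value comes from this thread; the legato offset is a made-up placeholder, and nothing like this exists inside Cubase today):

    # Hypothetical per-articulation attack compensation, applied offline
    # to a list of notes.
    ATTACK_OFFSET_MS = {
        "spiccato": -50.0,   # e.g. Spitfire SSO string shorts (from above)
        "legato":  -120.0,   # placeholder for a slow legato transition
    }

    def nudge(notes, bpm=120.0):
        """notes: (start_in_beats, articulation) pairs -> shifted copies."""
        ms_per_beat = 60_000.0 / bpm
        return [
            (start + ATTACK_OFFSET_MS.get(art, 0.0) / ms_per_beat, art)
            for start, art in notes
        ]

    print(nudge([(0.0, "spiccato"), (1.0, "legato")]))
    # -> [(-0.1, 'spiccato'), (0.76, 'legato')] at 120 bpm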
 
I’m confused about track offset. I use it with expression maps, and I only need to nudge the expression slightly ahead of the note. It works quite well; what am I missing?
 
For now, yeah. Or you can get your hands dirty with scripting using LuaProtoPlug or something similar. I think some crafty plugins could be used to make up for the above-mentioned deficiencies until/unless Steinberg addresses them in the Expression Map directly, which would of course be much more ideal.

Logic Pro suffers from the same deficiencies in its own articulation management.
 
I’m confused about track offset. I use it with expression maps, and I only need to nudge the expression slightly ahead of the note. It works quite well; what am I missing?

In my opinion you should not have to nudge the expression map events earlier. People have said that, but I haven't experienced it. If anything, that would only be needed when using Direction articulations. Attribute articulations are in fact attached to the notes and can't be nudged earlier, nor do they need to be. Yet another reason to use Attribute-style expression maps. Direction articulations are maybe more useful for longer-running things set up on Group 2, like dynamic levels or something, and sure, nudge those ahead of the first note affected.

What we've been talking about regarding timing is more related to the inherent sample attack latency that exists in some libraries, especially strings. And if you mix and match articulations on one source track, different articulations may have different amounts of attack latency. So you either have to nudge all the notes earlier to make up for the attack latency, or have the articulation management system do it for you, which right now it does not.
 