
The annual question... Who is using Cubase Expression Maps

I’m confused about track offset. I use it with expression maps and I only need to nudge the expression slightly ahead of the note. Works quite well, what am I missing?

If one art has a specific delay time and another art has a different one, then a track offset can only shift everything by a single fixed value. So one of the arts will still be out of time.
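To illustrate the point (a toy sketch with made-up delay values and helper names, not any real DAW API): a flat track offset shifts every note by the same amount, while correct compensation needs an offset per articulation.

```javascript
// Hypothetical per-articulation sample-start delays, in milliseconds.
const articulationDelayMs = { legato: 150, staccato: 50 };

// Shift each note earlier by its own articulation's delay.
function compensatePerArticulation(notes) {
  return notes.map(n => ({ ...n, timeMs: n.timeMs - articulationDelayMs[n.articulation] }));
}

// A flat track offset shifts everything by one amount, so it can only be
// correct for one of the two delay values above.
function compensateFlat(notes, offsetMs) {
  return notes.map(n => ({ ...n, timeMs: n.timeMs + offsetMs }));
}
```

With a legato note at 1000 ms and a staccato note at 2000 ms, per-articulation compensation yields 850 ms and 1950 ms, while a flat -150 ms offset pulls the staccato note 100 ms too early.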

Hopefully that doesn’t sound confusing.
 
Wow, it's all coming back to me scanning these posts...

*shudder*

Crazy system.

I don't really see it that way. I have spent a lot of time with LogicPro's articulation management and understand its limitations; it is also far from perfect, but it's very useful. If you have a specific need that it doesn't address, it can be frustrating.

As of now I see Cubase Expression Maps as being slightly more capable than LogicPro's system, primarily because of the Group concept, which LogicPro does not have. LogicPro perhaps makes up for it by having a scripting system, which some third parties have exploited to fill in SOME of the gaps, but some of the gaps mentioned above about timing, layering and channelizing remain there too.

Still, in many cases it works terrifically. I'm not sure why you would call it a "crazy" system.
 
Make sure you send the expression map change before the MIDI note. I think you can attach the expression map data to the note and maybe that's what you're doing. That's not a good idea because, remember, it's just MIDI data and you can't be sure how the instrument will react to simultaneous MIDI data. That problem has nothing to do with expression maps, that's a MIDI timing problem.

It's like quantizing a keyswitch to the note it's affecting: sometimes it works, sometimes it doesn't. You have to send the keyswitch in advance of the note. Same thing with Expression Maps.
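As a sketch of that advice (hypothetical helper and an illustrative lead time, not a documented value): place the keyswitch a few milliseconds ahead of the note instead of quantizing both to the same position.

```javascript
// Illustrative safety margin: how far ahead of the note the keyswitch goes.
const LEAD_MS = 10;

// Return a keyswitch event scheduled LEAD_MS before the note it affects.
function placeKeyswitch(noteTimeMs, keyswitchPitch) {
  return { type: 'keyswitch', pitch: keyswitchPitch, timeMs: noteTimeMs - LEAD_MS };
}
```

For a note at 1000 ms this schedules the keyswitch at 990 ms, so the instrument has already switched by the time the note arrives.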

Again, Expression Maps are just a MIDI mapper. They work as well or as badly as MIDI does. So yeah, they're not perfect, but they're no worse than MIDI.

This is just misinformation that is going to confuse people.

First - MIDI is a serial protocol and there is no such thing as 'simultaneous MIDI' on a single channel. If there is a CC message and a note on message, one comes before the other.
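As a concrete illustration (a toy byte-stream parser, not any real MIDI library): on the wire, a CC and a Note On on the same channel are just consecutive byte triplets, and one of them is necessarily serialized first.

```javascript
// Parse a raw single-channel MIDI byte stream into ordered events.
// 0xB0 = Control Change (channel 1), 0x90 = Note On (channel 1).
function parseStream(bytes) {
  const events = [];
  for (let i = 0; i + 2 < bytes.length; i += 3) {
    const status = bytes[i] & 0xF0;
    events.push({
      type: status === 0xB0 ? 'cc' : status === 0x90 ? 'noteOn' : 'other',
      data1: bytes[i + 1],
      data2: bytes[i + 2],
    });
  }
  return events;
}

// "CC then note": the CC is unambiguously first in the stream.
const stream = [
  0xB0, 32, 5,   // CC#32 value 5 (e.g. an articulation-switch CC)
  0x90, 60, 100, // Note On, middle C, velocity 100
];
```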

Second - Cubase expression 'attributes' are on the notes, so there's nothing the user can do re: sending them first. Cubase sends them first. Even a little ahead, to deal with one of the two things that break this system (and cause people to blame concurrency, phases of the moon etc):

1) Kontakt has a bug that causes it to process notes before CCs, so if you are trying to use e.g. Spitfire UACC you can get articulation change CCs after the notes. https://www.native-instruments.com/forum/threads/kontakt-wrongly-reorders-cc-events.345041/

2) VST3 (and only 3) separates CCs from notes, breaking the serial MIDI stream into two parts in a way that can't be reconstructed in order. https://www.steinberg.net/forums/viewtopic.php?f=246&t=174160&p=931253#p931251

These two problems cause us all vast amounts of wasted time and are the result of big industry players thinking they know better and ignoring the MIDI standard. Quite frankly, it's like building on sand.
 
This is just misinformation that is going to confuse people.

First - MIDI is a serial protocol and there is no such thing as 'simultaneous MIDI' on a single channel. If there is a CC message and a note on message, one comes before the other.

Second - Cubase expression 'attributes' are on the notes, so there's nothing the user can do re: sending them first. Cubase sends them first. Even a little ahead, to deal with one of the two things that break this system (and cause people to blame concurrency, phases of the moon etc):

+1

1) Kontakt has a bug that causes it to process notes before CCs, so if you are trying to use e.g. Spitfire UACC you can get articulation change CCs after the notes. https://www.native-instruments.com/forum/threads/kontakt-wrongly-reorders-cc-events.345041/

It's probably CCs before notes, but we don't know the internals for sure. My own observation was that, specifically when CC automation is used in Kontakt, at any given sample time all the related CC automation is processed before any notes are processed. Which may or may not be considered a bug; there are sound reasons why that would be expected to happen. But on the other hand, this architecture makes it impossible to rely on the serial nature of MIDI to use different CC values across multiple notes of a chord all on the same clock tick.

But the unfortunate thing is that if you were using CCs in between the notes of a chord, as switches for different articulations on different notes of that chord, all of them at the same sample time, then the last CC message would win and all notes of the chord would use that articulation. That is if, and only if, CC automation in Kontakt is being used. If the instrument itself is scripted or coded to receive the incoming note and CC messages directly, then I don't necessarily think the ordering would be munged, unless KSP scripting has specific bugs in it.
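A toy simulation of that described behavior (my own model of it, not Kontakt's actual code): if all CCs at one sample time are drained before any notes, the last CC value wins for every note of the chord.

```javascript
// Model: events share one sample time; all CCs are processed first, then notes.
function processSameTick(events) {
  const ccs = events.filter(e => e.type === 'cc');
  const notes = events.filter(e => e.type === 'note');
  let articulation = null;
  for (const cc of ccs) articulation = cc.value; // the last CC overwrites earlier ones
  return notes.map(n => ({ pitch: n.pitch, articulation }));
}

// Intended: pitch 60 gets articulation 1, pitch 64 gets articulation 2.
const interleaved = [
  { type: 'cc', value: 1 },
  { type: 'note', pitch: 60 },
  { type: 'cc', value: 2 },
  { type: 'note', pitch: 64 },
];
```

Running `processSameTick(interleaved)` gives both notes articulation 2, matching the "last CC wins" observation above.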


2) VST3 (and only 3) separates CCs from notes, breaking the serial MIDI stream into two parts in a way that can't be reconstructed in order. https://www.steinberg.net/forums/viewtopic.php?f=246&t=174160&p=931253#p931251

I was not aware of this, thanks for pointing it out. If that is true, then CCs are not acceptable as keyswitches when using VST3 instruments. A situation similar to Kontakt's CC automation would quite possibly occur, where all of the CC parameters for a given sample time are processed before any notes. Perhaps rightly so, but it then makes it impossible to interleave CCs between notes at a given sample time and expect them to remain in the intended order.

Hopefully using a slightly different start time for each note should resolve that problem in both of the above cases, though.
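A sketch of that workaround (illustrative tick offsets, made-up event shape): give each CC/note pair its own start time, so each pair lands in its own processing slice and the CC-before-note order survives.

```javascript
// Spread each articulation CC + note pair across successive ticks,
// instead of stacking them all on one tick.
function nudgeChord(pairs, startTick) {
  const out = [];
  pairs.forEach((p, i) => {
    out.push({ tick: startTick + i, type: 'cc', value: p.articulation });
    out.push({ tick: startTick + i, type: 'note', pitch: p.pitch });
  });
  return out;
}
```

Each note now shares its tick only with its own CC, so "all CCs first" processing within a tick can no longer mix up which CC belongs to which note.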
 
I think the above general observation about separate serial streams is probably also why the Expression Map system cannot handle poly-articulation chords (which LogicPro can handle by the way).

The Expression Map must be processed before any notes: all related keyswitches for that MIDI clock tick are sent first, according to the Expression Map slot, then finally any notes.

I believe Steinberg could still fix that, though, since with the attribute articulation type each note has the specific info it needs to send the right keyswitches serially in front of each note. It doesn't need to handle all of the Expression Map entries before doing all of the notes. This should be a feature request; being able to do this is useful for divisi handling. However, it's also possible to nudge the notes of a chord slightly so that everything interleaves serially as desired.
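The per-note behavior suggested above could look like this (a sketch of the idea only; Cubase does not actually expose anything like it):

```javascript
// Emit each note's own keyswitch immediately before that note, rather
// than sending all keyswitches for the tick first and all notes after.
function serializePerNote(chord) {
  const out = [];
  for (const n of chord) {
    out.push({ type: 'keyswitch', pitch: n.keyswitch });
    out.push({ type: 'noteOn', pitch: n.pitch });
  }
  return out;
}
```

The output alternates keyswitch/note per chord member, which is what a poly-articulation chord would need to survive a serial MIDI stream.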

But still, the above issue about CCs being handled in separate serial queues means the instruments themselves can't receive one serial list... and so, if that is true about VST3, then non-note keyswitches should be avoided, which is troubling. It seems Steinberg has overlooked the possibility of needing various kinds of expression applied differently to different notes of a chord on the same MIDI tick.
 
It's probably CCs before notes, but we don't know the internals for sure

It's notes before CCs. You can reproduce it in Logic with this script. Just put the script in as a MIDI FX, load the script, drop Kontakt on the channel and load its KSP MIDI monitor in verbose mode. I don't see the note come second consistently until about 6 ms of delay:

JavaScript:
/*
Test Kontakt event order: on every NoteOn, send a CC first, then the
note. Increase the note delay until the KSP monitor shows a stable
CC-before-note order (around 5-6 ms here).
*/

var NeedsTimingInfo = true;

function HandleMIDI(event) {
    if (event instanceof NoteOn && event.velocity > 0) {
        var cc = new ControlChange(event); // copies channel info from the note
        cc.number = 2;
        cc.value = 42;
        cc.send();                         // the CC goes out first...
        event.sendAfterMilliseconds(0);    // ...then the note; increase this until the order is stable (5-6 ms)
    } else {
        event.send();
    }
}
 
Hmm, that's interesting. I do get the same results now.

I don't think it used to do that; did it change with Kontakt 6 or something?

The thing is, Spitfire UACC would totally break if this was actually happening, and CSS too; so it makes me wonder whether the KSP engine is just reporting it wrong. It makes absolutely no sense that NI would process the note-ons before CCs on the same beat... and actually, for this mode, it makes no sense that they would be on separate queues either.

Well, anyway, if that's true, then CC keyswitches should really be avoided at all costs, both for VST3 instruments and for any Kontakt instruments, even if not VST3.
 
And frankly, you should not have to delay the note by 6 ms in order to get them in sequence! That is even worse, and something seriously broken if it's true, but I'm still doubtful about that... I think this could be a KSP multi-script issue. That could still be an issue for many people, though.
 
The more I read about it, the more I am very, very glad I just use one articulation per track.

Once in my career, I dealt with huge templates and expression maps and Lemur on an iPad and all that jazz... until I realized I was putting much more time into maintaining a template and all the technical hufflepuff that comes with it instead of making music.

It was at that moment that I threw all of that overboard. It's really freeing not having to deal with what is essentially lipstick on a pig. It's fluff. Very technical fluff. And it has zero to do with actually making music. Which is what we all want to do, right?
 
The more I read about it, the more I am very, very glad I just use one articulation per track.
Haha, I know what you mean.

Having previously detested old school keyswitching methods, I've found Logic's own home-brew effort to be just the right balance of tech and "not having to fuss with it." Although I have been tempted to add TouchOSC on the iPad into the mix. Thanks for reminding me not to!
 
Spitfire users should make use of UACC KS (not UACC). That's the most reliable way. And, from what I can see, KS messages which consist of a Note and a single CC are also fine, you just have to know how you build the Expression Map in this case. What's not so fine is multiple CCs in certain configurations, because Kontakt will not deal gracefully with those situations.
 
Hmm, that's interesting. I do get the same results now.

I'm still using Kontakt 5 and it is broken there too. Not fixed (yet) in 6.

And yes, if you do a search you'll see people have plenty of problems/gremlins with UACC - e.g. Babylon Waves suggesting people use UACC KS (which don't get reordered) instead, when using Logic.

It is my understanding (but I have not tested) that Cubase sends expression map attribute CC messages ahead of time specifically to accommodate this and similar behavior by plugins. That's why other people will say it works fine for them (mostly).

Note this is not a problem with UACC, it's a problem with Kontakt. For me, it made working with OT Berlin series and notation programs/DAW emaps quite difficult. That's why, for all their growing pains, I welcome the dedicated players from these companies. At least they can support them themselves.
 
Having previously detested old school keyswitching methods, I've found Logic's own home-brew effort to be just the right balance of tech and "not having to fuss with it."

The thing is, the more tech you incorporate into your setup, the more room there is for technical problems. Each and every one of these things has to work perfectly or your flow is disrupted.

I tend to think of flow as a delicate thing. Once you have it, you don't want to stop. There's nothing more frustrating than having to stop in your tracks to fix your iPad, or your internet connection drops out and you can't connect, or Cubase suddenly throws out your controller mapping (has happened to me before), or your VEPro project is corrupted and you have to grab a backup, or... I guess you know what I mean.

Because flow is so delicate and breakable, I tend to minimize the things that can break. I have Studio One, it's rock solid, and what I use are track presets. I type CTRL+F, search for HS 1st VLN Leg, and boom, there's your preset; you press Enter and can immediately start recording.

There are numerous ways to improve save and load times, instrument loading times, or workflow in general. The danger, however, is falling into the trap of spending so much time shaving off a couple of seconds here and working on shortcuts there that you end up doing technical busywork instead of what you are actually supposed to do: focus on notes, expression and orchestration, because those are the cornerstones of what we actually do.

I know my workflow could be faster. But the thing is, I don't care. I know my workflow works now, and tomorrow, and the day after that. I know the chance of technical problems is minimized because I do not work with three different pieces of software and hardware at the same time. But the most important thing is, I do not have to think about expression maps, about articulations triggering or not, about the loading order of my software, about whether I should EQ inside VEPro or inside my DAW, about my internet connection, about the fact that VEPro only takes VST2 instruments, not VST3, while my DAW accepts both, about latency, about which instrument has its keyswitches where, about disabling or enabling instruments in order of importance, about incoming and outgoing MIDI ports and channels, and all that.

CTRL+F, type the name, Enter. Play in notes. If wrong, undo and repeat. Rinse and repeat, again and again. Dive into the MIDI editor. Transpose notes, try different combinations. Massage the CCs a bit, not with difficult and unwieldy tools, just by drawing freeform.

Having fewer shortcuts to think about forces you to become better at playing stuff in, and to rely on your knowledge of musical instruments and how they are going to sound. A free mind is open to experimentation. And an experimenting mind is a focused one.

Not all that is new is automatically an improvement. While my save and load times may be a little longer than most, I think my time spent on actual composition is actually greater.

Unless of course I am procrastinating by posting lengthy replies on an internet forum!:shocked:
 
The thing is, the more tech you incorporate into your setup, the more room for technical problems.

Absolutely, 100%.
Absolutely, 100%.
My music tech setup is incredibly basic compared to most user rigs - purposefully, as nothing kills my flow more than technical issues or the added cognitive load that comes with managing it all.

Actually, I think music tech has gotten more complicated in the last few years, not less. But that's a post for another day.
 
Note this is not a problem with UACC, it's a problem with Kontakt.

Interesting. What, if anything, has EvilDragon had to say about this issue?

So I had a chance to run a test with CSS. CSS can use CC58 to switch articulations, or regular keyswitches. In any case, I also enabled the multi-script MIDI monitor. Then I ran Richhickey's LPX script to feed it notes with CC58 right in front of each note on the same timestamp.

What happened is that CSS changed the articulation and played the note with the correct articulation (good), while the multi-script MIDI monitor displayed the note as coming before the CC (bad).

I was not able to get CSS to fail.

When I tried adding some milliseconds of delay to the note (in the script), eventually the multi-script MIDI monitor would show the CC before the note... and right around 5 ms it would be inconsistent, sometimes correct, sometimes in reverse order. But CSS always played the right thing, which means CSS is somehow able to process the CC before the note while the KSP engine reports otherwise.

My feeling is that Kontakt is specifically broken somehow in the KSP engine. It's possible that Spitfire is stumbling over this more than CSS for some reason. But anyway, it doesn't really matter; end users don't care what the reason is. If Kontakt is unreliable in this regard, then it's unreliable.
 