If you want to make good use of articulation management and MIDI controllers, may I suggest Reaper. Cubase's expression maps feel dated compared with Reaper's Reaticulate tool. The window for creating an expression map in Cubase Pro is a nightmare if you have a big library, and the expression list takes up a huge area in the MIDI editor, leaving no room for notes and MIDI CC lanes. Expression maps are fine for instruments with a short list of articulations, but with big libraries they become unusable. Reaper also lets you name every MIDI CC, and it is of course easier to work with CCs when you read "portamento" or "vibrato" on the lanes. In Cubase you see CC21 or CC45 and have to keep every CC assignment for every instrument in your head; it is designed only for elephant memory! I used Cubase Pro for three years and switched to Reaper six months ago. Now I use all my virtual instruments with all their features (MIDI CCs, articulations, ...) with great pleasure, one track per instrument, like a real orchestra. It is a very nice way to work.
Oh snap! I bought OTR a while back but never got around to installing it. Might have to sit down and go through the process now.

With OTR 2's release, I created and included an easy-to-use GUI editor for Reaticulate. You can still use your existing Reaticulate reabanks alongside it, but the editor itself only works with banks you create with it. There is a thread on the Reaper forum with more info. It is super fast to create maps with it. No more text editing!
@tack was awesome about working through any questions I had while designing it. Big shout-out to his contributions to the community.
You just moved the goalposts, though.

> make sure you account for poly-articulation chords and overlap from one articulation to another as described above...
But this ...

> What I suggested in my earlier post is that there is a need to have both MIDI notes and expression events on a single track, using a single MIDI channel assigned to the notes (not manually assigning MIDI channels to each event), such that the articulation management system in the DAW can send the notes to different MIDI channels where articulation instruments are waiting, one channel per articulation, and the corresponding expression events will also be sent to those corresponding channels.
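The single-track scheme described in the quote above can be sketched in a few lines. Everything here (function names, event shapes, the articulation-to-channel map) is illustrative Python pseudocode under my own assumptions, not any DAW's or Reaticulate's actual API:

```python
# Hypothetical sketch of the routing described above: the track's events all
# arrive on one channel, and an articulation map re-routes notes and the
# expression events that accompany them to per-articulation channels.
# All names here are illustrative, not Reaticulate's actual API.

# Map from articulation-trigger number to output channel (assumed layout:
# e.g. legato -> ch 0, staccato -> ch 1, spiccato -> ch 2).
ARTICULATION_CHANNELS = {1: 0, 2: 1, 3: 2}

def route_events(events, artic_channels):
    """events: list of (kind, value) tuples in timeline order, where kind is
    'artic' (articulation switch), 'note_on', 'note_off', or 'cc'.
    Returns (kind, value, channel) tuples with output channels assigned."""
    current = 0          # default output channel before any articulation switch
    routed = []
    for kind, value in events:
        if kind == "artic":
            current = artic_channels[value]   # divert the track to this channel
        else:
            # Notes and CC/pitch-bend follow the active articulation's channel.
            routed.append((kind, value, current))
    return routed

events = [
    ("artic", 1), ("note_on", 60), ("cc", (11, 90)), ("note_off", 60),
    ("artic", 2), ("note_on", 62), ("cc", (11, 40)), ("note_off", 62),
]
print(route_events(events, ARTICULATION_CHANNELS))
```

The point is simply that the notes stay on one channel in the editor, and only the router assigns real output channels.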
... well, I'm not really sure what the use case is here. If you have a prior note under articulation 1, hold it, activate articulation 2, and trigger a note in parallel under the new articulation, CCs will subsequently divert to the channel for articulation 2, but it doesn't send them to both channels at once.

> there is often a need for overlapping expression to go to two channels at once, even when not an actual chord, to handle the release of the previous note
You just moved the goalposts, though.
But this ...
... well, I'm not really sure what the use case is here. If you have a prior note under articulation 1, hold it, activate articulation 2, and trigger a note in parallel under the new articulation, CCs will subsequently divert to the channel for articulation 2, but it doesn't send them to both channels at once.
Note-offs are routed properly, but if you need to send a CC back to the first channel to finesse a release of that articulation, say, I can't imagine why you would want that to also go to the other articulation on the second channel, which you may not even want to release at the same time (or, even if you did, it may require a different curve to get the right sound).
You may see it that way, but you still described a new behavior, subsequent to your original post, where I responded that Reaper can do that. Reaper can do what you said first, but not what you said afterward.

> Not at all, this is a fundamental requirement
One use case is uncommon, but some people do need it: having poly-articulation chords on one track. I have been told some people use that to layer two different sounds for one attack, for example, but there could be other reasons for having overlapping notes with different articulations on the same track.
Ok, I understand this scenario. But it's not clear to me that this should always happen. For example, if the purpose of the second articulation was something like fortepiano, the CCs would abruptly change and indeed you wouldn't want those to transfer back to the previous note.

> But when you release the first note, there will be some amount of time where the actual sound is still releasing and it needs to continue receiving CC, pitch bend, and aftertouch expression data. If the track has been diverted to a different channel at that point, then the tail of the first note will miss that necessary CC/PB data.
That's unfortunate.

> Ivan also posted an interesting video last week showing exactly this problem with S1 in particular, because of these discrepancies. He took it down, though, because people were fighting instead of listening.
I'm interested in the use cases, though, because I can probably implement it easily enough (provided it doesn't break existing valid use cases).
Ok, I understand this scenario. But it's not clear to me that this should always happen. For example, if the purpose of the second articulation was something like fortepiano, the CCs would abruptly change and indeed you wouldn't want those to transfer back to the previous note.
I'm more explaining why I'm not going to bother with the video, because while I can demonstrate what I originally said I could, I can't demonstrate the subsequent behavior you described (and seem more interested in). At least not in the way you described it.

> You're just bickering now, Tack.
That's the crux of it, yeah. I'll poll Reaticulate's users and see what they would prefer.

> So if it's optional, which way would be the default?
Or a separate channel on the same track, like in the divisi example I linked to. Whichever is the default, this approach can be used to implement the other behavior.

> if you really needed to have a soft note with its own CC curve that doesn't blast up before its release is over... then you can always put those on separate tracks.
I'm more explaining why I'm not going to bother with the video, because while I can demonstrate what I originally said I could, I can't demonstrate the subsequent behavior you described (and seem more interested in). At least not in the way you described it.
To your earlier point, there are two variations here -- one where the notes overlap, and one where they are merely adjacent -- but I don't think these should be treated differently, as that would violate the Principle of Least Astonishment. Unfortunately, while I think the non-overlapping scenario you described is probably pretty safe, I'm a bit more apprehensive about whether this is the optimal default behavior in the overlapping-note case.
There are some other nuances to consider that can complicate implementation. For example, there could be more than two channels to replicate performance data across -- although with quick successive notes (each under a different articulation) it may not be necessary in practice for all of them to get the same events. What do you think?
There should also be a timeout defining how long performance events are replicated to channels that have no active notes. You want enough time to massage tails, but some patches burn CPU when they receive CCs even when no notes are ringing, so you wouldn't want to route the events for longer than necessary.
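The tail-replication and timeout ideas from the last few posts could be sketched like this. This is purely hypothetical Python under my own assumptions (class name, window length, and channel bookkeeping are all invented), not Reaticulate's code:

```python
# Hypothetical sketch: after the track diverts to a new articulation channel,
# expression events are also replicated to recently released channels for a
# limited window, so note tails still receive CC/PB data -- but not forever,
# since some patches burn CPU on idle CC traffic.

RELEASE_WINDOW = 2.0  # seconds of tail during which old channels still get CCs

class ExpressionReplicator:
    def __init__(self, release_window=RELEASE_WINDOW):
        self.release_window = release_window
        self.active = 0        # channel of the current articulation
        self.releasing = {}    # channel -> time we diverted away from it

    def switch(self, channel, now):
        # The previous channel enters its release window when we divert away.
        if channel != self.active:
            self.releasing[self.active] = now
            self.active = channel

    def targets(self, now):
        """Channels that should receive an expression event at time `now`."""
        # Drop channels whose release window has expired (the timeout above).
        self.releasing = {ch: t for ch, t in self.releasing.items()
                          if now - t < self.release_window}
        return [self.active] + sorted(self.releasing)

r = ExpressionReplicator()
r.switch(1, now=0.0)
print(r.targets(0.5))   # shortly after the switch: old channel still included
print(r.targets(3.0))   # after the window expires: only the active channel
```

Whether the window should be fixed, per-patch, or tied to actual note-off times is exactly the kind of nuance being debated here.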
Yes, chasing (to the new destination channel) is handled by Reaticulate (as demonstrated in this ancient video).

> every time you switch to a new channel, the CCs need to be chased! I very much doubt S1 is doing that... I can't speak for Reaticulate, but you can tell us.
In fairness, the past few posts have been getting well into implementation weeds that users should never have to worry about. While we can debate how the ideal articulation management system should work, I bet we can all agree that, from a UX perspective, it should all Just Work.

> Some of the solutions you guys are talking about seem a heck of a lot more annoying than just putting up a separate MIDI track.
Yes, chasing (to the new destination channel) is handled by Reaticulate (as demonstrated in this ancient video).
And reverse chasing (for want of a better term) is too -- but only for specific CCs (like CC2 and CC64) to prevent notes from hanging. Ironically, the code to selectively chase those CCs for the anti-hanging feature is more complex than if I just blindly fired back all CC/PB/AT events like you're arguing for.
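For what chasing means concretely: when the track diverts to a new channel, the router re-sends the last-known controller values on that channel, so the new articulation picks up from the current performance state. A toy illustration of the idea only (the class and event tuples are my own invention, not Reaticulate's implementation):

```python
# Illustrative sketch of CC chasing: remember the last value seen for each
# controller, and when the output diverts to a new channel, re-send those
# values there so the new articulation starts from the current performance
# state instead of stale or default controller values.

class CCChaser:
    def __init__(self):
        self.last = {}    # cc_number -> last value seen on the track

    def on_cc(self, cc, value):
        self.last[cc] = value

    def chase(self, new_channel):
        """Events to emit when the track diverts to `new_channel`."""
        return [("cc", cc, value, new_channel)
                for cc, value in sorted(self.last.items())]

chaser = CCChaser()
chaser.on_cc(1, 80)    # mod wheel
chaser.on_cc(11, 64)   # expression
print(chaser.chase(2)) # -> [('cc', 1, 80, 2), ('cc', 11, 64, 2)]
```

A real implementation would also chase pitch bend and aftertouch, and (as described above) might deliberately restrict which CCs get chased.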
Correct.

> So it's template time: I like the idea of playing in the flute or violin or whatever with a single articulation so I can get the melody down. Then I'd like to go back and change articulations where I need, and I assume that Babylon Waves will let me go back and change articulations where needed. I understand it's really Expression Maps and that BW just makes the setup WAY easier. Am I on track so far?
Actually, that's great info; it just confirms what I had hoped. Now I just need to fire it up and work it out. Thanks!

Hi,
Correct.
Only if Babylon Waves contains expression maps for your library -- I guess it does for the Berlin stuff.
Otherwise, you have to set it up manually in Expression Maps.
I use Reaper + Reaticulate. There I can program the expressions for each library separately, or use the same set for everything (that's what I do).
But in principle, Expression Maps works quite similarly and should meet your requirements.
But maybe a Cubase user who knows more about this topic will get in touch with you.