
Your way of doing orchestra articulations in 2021?

make sure you account for poly-articulation chords and overlap from one articulation to another as described above..
 
If you want to make good use of articulation management and MIDI controllers too, may I suggest you use Reaper. Cubase's Expression Maps feel dated compared to Reaper's Reaticulate. The window for creating an expression map in Cubase Pro is a nightmare if you have a big library, and the expression list takes up a huge area in the MIDI editor, leaving no room for notes and MIDI CC lanes. Expression Maps are fine for instruments with a short list of articulations, but with big libraries they become unusable.

Reaper also lets you name every MIDI CC. It is of course easier to work on MIDI CC when you can read "portamento", "vibrato" and so on in the MIDI lanes. With Cubase you just see CC21 or CC45 and have to keep every MIDI CC assignment for every instrument in your head; it is designed only for elephant memory! I used Cubase Pro for three years and switched to Reaper six months ago. Now I use all my virtual instruments with all their features (MIDI CC, articulations...) with great pleasure, and I use one track per instrument, like a real orchestra. It is a very nice way to work.

With OTR 2’s release, I created and included an easy-to-use GUI editor for Reaticulate. You can still use your existing Reaticulate reabanks alongside it, but the editor itself only works with banks you create in it. There is a thread on the Reaper forum HERE with more info. It is super fast to create maps with it. No more text editing :)

@tack was awesome about working through any questions I had while designing it. Big shout out to him for his contributions to the community.
 
With OTR 2’s release, I created and included an easy-to-use GUI editor for Reaticulate. You can still use your existing Reaticulate reabanks alongside it, but the editor itself only works with banks you create in it. There is a thread on the Reaper forum with more info. It is super fast to create maps with it. No more text editing :)

@tack was awesome about working through any questions I had while designing it. Big shout out to him for his contributions to the community.
Oh snap! I bought OTR a while back but never got around to installing it. Might have to sit down and go through the process now.
 
make sure you account for poly-articulation chords and overlap from one articulation to another as described above..
You just moved the goalpost though.

With Reaticulate, Reaper can easily do this:

What I suggested in my earlier post is that there is a need to have both midi notes and expression events on a single track, using a single midi channel assigned to the notes (not manually assigning midi channels to each event), such that the articulation management system in the DAW can send the notes to different midi channels where articulation instruments are waiting, one channel per articulation, and the corresponding expression events will also be sent to those corresponding channels.
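In pseudocode, the channelized routing described in that quote boils down to something like this. This is a minimal illustrative sketch with made-up names, not Reaticulate's or any DAW's actual code:

```python
# Illustrative sketch only: all events arrive on one source channel; whichever
# articulation is currently selected decides which output channel the notes
# *and* the expression events get re-stamped onto.

# Hypothetical articulation-to-channel mapping for one instrument
ARTICULATION_CHANNEL = {"legato": 1, "spiccato": 2, "pizzicato": 3}

class Channelizer:
    def __init__(self, default="legato"):
        self.current_channel = ARTICULATION_CHANNEL[default]

    def select_articulation(self, name):
        # Triggered by a keyswitch / program change / articulation ID
        self.current_channel = ARTICULATION_CHANNEL[name]

    def route(self, event):
        # Notes, CCs, pitch bend and aftertouch all follow the channel of the
        # currently selected articulation.
        return dict(event, channel=self.current_channel)

ch = Channelizer()
ch.select_articulation("spiccato")
print(ch.route({"type": "note_on", "pitch": 60, "velocity": 100}))  # -> channel 2
print(ch.route({"type": "cc", "cc": 11, "value": 64}))              # -> channel 2 as well
```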
But this ...
there is often a need for overlapping expression to go to two channels at once, even when not an actual chord, to handle the release of the previous note
... well, I'm not really sure what the use case is here. If you have a prior note under articulation 1, hold it, activate articulation 2 and trigger a note in parallel under the new articulation, CCs will subsequently divert to the channel for articulation 2, but they aren't sent to both channels at once.

Note-offs are routed properly, but if you need to send a CC back to the first channel to finesse a release of that articulation, say, I can't imagine why you would want that to also go to the other articulation on the second channel, which you may not even want to release at the same time (or, even if you did, it may require a different curve to get the right sound).
 
You just moved the goalpost though.

Not at all, this is a fundamental requirement. Do I need to explain why? I just want to make sure these other half-measures are not checked off as completed.

But this ...

... well, I'm not really sure what the use case is here. If you have a prior note under articulation 1, hold it, activate articulation 2 and trigger a note in parallel under the new articulation, CCs will subsequently divert to the channel for articulation 2, but they aren't sent to both channels at once.

Note-offs are routed properly, but if you need to send a CC back to the first channel to finesse a release of that articulation, say, I can't imagine why you would want that to also go to the other articulation on the second channel, which you may not even want to release at the same time (or, even if you did, it may require a different curve to get the right sound).

Several use cases to think about. One is uncommon, but some people do need it, which is to have poly-articulation chords on one track. I have been told some people use that to layer two different sounds for one attack, for example, but there could be other reasons for having overlapping notes with different articulations in the same track.

Another use case is that even if you don't have intentionally overlapping notes, let's say you just have a quantized melodic line, and the end of each note butts right up against the start of the next note. All fine so far. But when you release the first note, there will be some amount of time where the actual sound is still releasing, and it needs to continue receiving CC and PitchBend (and aftertouch) expression data. If the track has been diverted to a different channel at that point, then the tail of the first note will miss that necessary CC/PB data.

Ivan also posted an interesting video last week showing exactly this problem with S1 in particular, because of these discrepancies. He took it down, though, because people were fighting instead of listening.
 
Not at all, this is a fundamental requirement
You may see it that way, but you still described a new behavior subsequent to your original post where I responded that Reaper can do that. Reaper can do what you said first, but not what you said afterward.

I'm interested in the use cases though. Because I can probably implement it easily enough (provided it doesn't break existing valid use cases).

One is uncommon, but some people do need it, which is to have poly-articulation chords on one track. I have been told some people use that to layer two different sounds for one attack, for example, but there could be other reasons for having overlapping notes with different articulations in the same track.

With Reaticulate there are two obvious options:
  1. Create a custom layered articulation (say long+spiccato) which explicitly routes to both patches in parallel
  2. Use separate source MIDI channels on the same track and notes on different channels. The MIDI channels can still work independently with respect to articulation selection. This would be similar to what I demonstrate here.
But when you release the first note, there will be some amount of time where the actual sound is still releasing, and it needs to continue receiving CC and PitchBend (and aftertouch) expression data. If the track has been diverted to a different channel at that point, then the tail of the first note will miss that necessary CC/PB data.
Ok, I understand this scenario. But it's not clear to me that this should always happen. For example, if the purpose of the second articulation was something like fortepiano, the CCs would abruptly change and indeed you wouldn't want those to transfer back to the previous note.


Ivan also posted an interesting video last week showing exactly this problem with S1 in particular, because of these discrepancies. He took it down, though, because people were fighting instead of listening.
That's unfortunate.
 
I'm interested in the use cases though. Because I can probably implement it easily enough (provided it doesn't break existing valid use cases).

Good!

Ok, I understand this scenario. But it's not clear to me that this should always happen. For example, if the purpose of the second articulation was something like fortepiano, the CCs would abruptly change and indeed you wouldn't want those to transfer back to the previous note.

I think you want the curve to apply to all the notes as if they were being performed on the same midi channel, which is predictable and understandable. If you had a very soft note followed by an FFF note, and a curve there, that is exactly what would be heard on a single, non-channelized track. Whether or not that is a problem I leave to you, but as it is, if it's cutting it off, it can often be a problem. So if it's optional, which way would be the default? I would personally like a multi-midi-channel approach to perform as if it were a single track and single channel, predictably so. If you really need to have a soft note with its own CC curve that doesn't blast up before its release is over, then you can always put those on separate tracks.

As I said, Ivan posted a video the other day which showed other usability issues with this approach. I can't remember them all right now, but they're important too; maybe he will post it again. It had to do with jumping back and forth while editing and working with the track. Anyway, I repeat: there is no complete DAW-based solution for this. Period. Ivan sells a third-party solution for LogicPro that does in fact handle all of this, and I have a similar free solution on GitLab for LogicPro. Cubase doesn't handle it. DP doesn't handle it. Reaper doesn't handle it. It sounds like Reaticulate is 75% of the way there, as is S1, and that's good to hear. Hope you will consider improving it further.




 
You're just bickering now Tack.
I'm more explaining why I'm not going to bother with the video, because while I can demonstrate what I originally said I could, I can't demonstrate the subsequent behavior you described (and seem more interested in). At least not in the way you described it.

So if it's optional, which way would be the default?
That's the crux of it, yeah. I'll poll Reaticulate's users and see what they would prefer.

To your earlier point, there are two variations here -- one where the notes overlap, and one where they are merely adjacent -- but I don't think these should be treated differently, as I think that violates the Principle of Least Astonishment. Unfortunately, while I think the non-overlapping scenario you described is probably pretty safe, I'm a bit more apprehensive about whether this is the optimal default behavior for the overlapping-note case.

If you really need to have a soft note with its own CC curve that doesn't blast up before its release is over, then you can always put those on separate tracks.
Or a separate channel on the same track, like in the divisi example I linked to. Whichever is default, this approach can be used to implement the other behavior.

There are some other nuances to consider that can complicate implementation. For example, there could be more than 2 channels to replicate performance data across -- although perhaps if you have quick successive notes (each with different articulations) it may not be necessary in practice for all of them to get the same events. What do you think?

There should also be a moratorium defining how long performance events are replicated to channels that have no active notes. You want enough time to massage tails, but some patches burn CPU when they receive CCs even while no notes ring, so you wouldn't want to replicate the events for longer than is necessary.
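To make that concrete, here's a rough sketch of the kind of bookkeeping I have in mind -- my assumptions only, not existing Reaticulate code: track which channels still have held notes or a recent note-off, and only replicate expression to those.

```python
# Sketch: replicate expression events to every channel that still has held
# notes, and keep doing so for a short hold window after its last note-off so
# release tails get CC/PB data, without spamming idle channels forever.

import time

HOLD_SECONDS = 1.5  # how long after the last note-off a channel keeps receiving expression

class Replicator:
    def __init__(self):
        self.held_notes = {}     # channel -> number of currently held notes
        self.last_note_off = {}  # channel -> timestamp of the last note-off

    def note_on(self, channel):
        self.held_notes[channel] = self.held_notes.get(channel, 0) + 1

    def note_off(self, channel):
        self.held_notes[channel] = max(0, self.held_notes.get(channel, 0) - 1)
        if self.held_notes[channel] == 0:
            self.last_note_off[channel] = time.time()

    def expression_targets(self, current_channel):
        """Channels an incoming CC/PB/AT event should be copied to."""
        now = time.time()
        targets = {current_channel}
        for channel, count in self.held_notes.items():
            if count > 0:
                targets.add(channel)  # notes still sounding
            elif now - self.last_note_off.get(channel, float("-inf")) < HOLD_SECONDS:
                targets.add(channel)  # release tail still ringing
        return targets
```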
 
I'm more explaining why I'm not going to bother with the video, because while I can demonstrate what I originally said I could, I can't demonstrate the subsequent behavior you described (and seem more interested in). At least not in the way you described it.

I think it would still be interesting to see what Reaticulate can do. I understand though, making videos takes time.


To your earlier point, there are two variations here -- one where the notes overlap, and one where they are merely adjacent -- but I don't think these should be treated differently, as I think that violates the Principle of Least Astonishment. Unfortunately, while I think the non-overlapping scenario you described is probably pretty safe, I'm a bit more apprehensive about whether this is the optimal default behavior for the overlapping-note case.

I like that principle. Well, that's partly why I'm in favor of a "channelized" solution from a single source track behaving the same way as if it were just a normal track going to a single midi channel on a keyswitched instrument. All notes on the track get all the CC, PitchBend and AfterTouch data (excluding NoteExpression, of course).


There are some other nuances to consider that can complicate implementation. For example, there could be more than 2 channels to replicate performance data across -- although perhaps if you have quick successive notes (each with different articulations) it may not be necessary in practice for all of them to get the same events. What do you think?

There definitely could be more than 2! Especially if you want to support intentional poly-articulation chords and such. That can be done in LogicPro, by the way, though I personally have never needed it; I have at times heard people give reasons why they need it.


There should also be a moratorium defining how long performance events are replicated to channels that have no active notes. You want enough time to massage tails, but some patches burn CPU when they receive CCs even while no notes ring, so you wouldn't want to replicate the events for longer than is necessary.

So what I did in my LogicPro solution is provide a GUI with a slider where you configure how much "continuation" time you want after the NoteOffs, anywhere from zero to pretty far out.

You can read about it and see the code, copy and paste if it helps, whatever, I don't care...

https://gitlab.com/dewdman42/Channelizer/-/wikis/home

 
Also, to add further (not to be perceived as moving the goalposts, but unfortunately my brain does not always think of everything at once): another factor that needs to be taken into account in a channelized solution is "chasing". It's similar to how DAWs chase CCs when you press PLAY from some place in the timeline: the DAW chases back, finds the last set value of PitchBend and of every CC#, and makes sure they get "chased" to those values at the point you are playing back from.

Well, when you are doing a multi-channel approach, every time you switch to a new channel the CCs need to be chased! I very much doubt S1 is doing that. I can't speak for Reaticulate, but you can tell us. I can tell you the DAWs don't do it with their current articulation managers.
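For what it's worth, the chasing I'm describing could look roughly like this. This is a conceptual sketch with invented names, not any DAW's actual implementation: cache the last value of PitchBend and of every CC seen on the source track, then re-send those values to the destination channel at the moment of the switch.

```python
# Sketch: remember the last value of every CC and of PitchBend on the source
# track, and re-send them to the newly selected destination channel so it
# doesn't start from stale controller state.

class Chaser:
    def __init__(self):
        self.last_cc = {}            # CC number -> last value seen
        self.last_pitch_bend = 8192  # 14-bit center

    def observe(self, event):
        # Call this for every expression event on the source track.
        if event["type"] == "cc":
            self.last_cc[event["cc"]] = event["value"]
        elif event["type"] == "pitch_bend":
            self.last_pitch_bend = event["value"]

    def chase_to(self, channel):
        # Events to fire the moment the track diverts to a new channel.
        events = [{"type": "cc", "cc": cc, "value": value, "channel": channel}
                  for cc, value in self.last_cc.items()]
        events.append({"type": "pitch_bend", "value": self.last_pitch_bend,
                       "channel": channel})
        return events
```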
 
Some of the solutions you guys are talking about seem a heck of a lot more annoying than just putting up a separate midi track.

I mean, hats off to you if you want to work this way but some of these things seem like a lot of bother, and a lot of details to remember.

I still like having separate tracks for different articulations a lot of the time. It also makes it clearer for the orchestrator -- pizz is separate from spiccato from sustained.

I don't always do that but it is a lot easier for the recipient to understand your intentions.
 
every time you switch to a new channel the CCs need to be chased! I very much doubt S1 is doing that. I can't speak for Reaticulate, but you can tell us.
Yes, chasing (to the new destination channel) is handled with Reaticulate (as demonstrated here in this ancient video).

And reverse chasing (for want of a better term) is too -- but only for specific CCs (like CC2 and CC64) to prevent note hanging. Ironically, the code to selectively chase those CCs for the anti-hanging feature is more complex than if I just blindly fired back all CC/PB/AT events like you're arguing. :)
 
Some of the solutions you guys are talking about seem a heck of a lot more annoying than just putting up a separate midi track.
In fairness, the past few posts have been getting well into the implementation weeds that users should never have to worry about. While we can debate about how the ideal articulation management system should work, I bet we can all agree that from a UX perspective it should all Just Work.
 
Yes, chasing (to the new destination channel) is handled with Reaticulate (as demonstrated here in this ancient video).

And reverse chasing (for want of a better term) is too -- but only for specific CCs (like CC2 and CC64) to prevent note hanging. Ironically, the code to selectively chase those CCs for the anti-hanging feature is more complex than if I just blindly fired back all CC/PB/AT events like you're arguing. :)

Excuse me, what am I arguing? I said nothing about doing anything blindly.

Glad to hear that Reaticulate is chasing CC, PitchBend and AfterTouch while channelizing articulations! Now if we can just get the DAW makers to do it.
 
Sound Variations in Studio One 5. Here's hoping Spitfire etc. will join this fantastic system, just as VSL has done.
 
Call me set in my ways, but I still prefer and use one articulation per track. It gives me the most flexibility when moving between them and when mixing/balancing articulations.

Wayne
 
I want to work in a score system and don't want to jump around a dozen systems just for the 1st violins.
For me there could be two solutions, but unfortunately neither of them seems to exist in a DAW yet (or maybe I've missed it so far):

1) Articulation Maps/IDs with individual (negative) delay options

2) Some kind of score stacks (like track stacks in Logic) where you could handle individual tracks, but with the possibility to show and edit all tracks of a stack in one score system.
 
Sorry to stir up the dust with a newbie question, but I want to make sure I understand the simpler picture: I'm going to use BOB as my library for my Berklee class in January (new acronym, Berlin Orch w/Berklee). I use Cubase and bought Babylon Waves.

So it's template time: I like the idea of playing in the flute or violin or whatever with a single articulation so I can get the melody down. Then I'd like to go back and change articulations where I need, and I assume that Babylon Waves will let me go back to change articulations where needed. I understand it's really Expression Maps and that BW just makes the setup WAY easier. Am I on track so far?

I also will replay the bits that are legato on a second track if they don't sound great. So I'm building my template with one instrument per track and I'm hoping Expression Maps created by BW will allow me to do the above... play in a line, then head back and quickly tweak/change articulations.

Am I off the rails yet, or getting it so far? Might be a totally DUH question but I don't want to get too far into the template and find out I'm nuts. Thanks!
 
Hi
So it's template time: I like the idea of playing in the flute or violin or whatever with a single articulation so I can get the melody down. Then I'd like to go back and change articulations where I need, and I assume that Babylon Waves will let me go back to change articulations where needed. I understand it's really Expression Maps and that BW just makes the setup WAY easier. Am I on track so far?
Correct.

Only if Babylon Waves contains expression maps for your library - I guess it does for the Berlin stuff.

Otherwise, you have to set it up manually in Expression Maps.

I use Reaper + Reaticulate. Here I can programme the expressions for each library separately or use the same for everything (that's what I do).

But in principle Expression Maps works quite similarly, and it should meet your requirements.

But maybe a Cubase user who knows more about this topic will get in touch with you.
 
Hi

Correct.

Only if Babylon Waves contains expression maps for your library - I guess it does for the Berlin stuff.

Otherwise, you have to set it up manually in Expression Maps.

I use Reaper + Reaticulate. Here I can programme the expressions for each library separately or use the same for everything (that's what I do).

But in principle Expression Maps works quite similarly, and it should meet your requirements.

But maybe a Cubase user who knows more about this topic will get in touch with you.
Actually that’s great info, just confirms what I had hoped. Now I just need to fire it up and work it out. Thanks! 👍🏼
 