# Logic 10.4 Articulation Discussion



## khollister

Picking up from the main thread for discussions on the new articulation set feature


----------



## tav.one

Can someone post a small clip showing how I can make my own sets?
I use Spitfire libraries so if anyone can make the clip with Albion One or SCS, that will be great.


----------



## jonathanwright

Good idea.

To summarise my issues already discussed in the other thread.

*MIDI Channel Articulation Switching*
Works fine, but doesn't 'chase' MIDI CC data. So I'm using a script provided with ARTZID which clones the data to each channel.
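The channel-cloning idea can be sketched in plain JavaScript (the language of Logic's Scripter MIDI FX). This is a hypothetical stand-in for the ARTZID-provided script, which I haven't seen, not its actual code:

```javascript
// Sketch of the channel-cloning idea: duplicate each incoming CC event
// across all MIDI channels, so CC data recorded on one channel reaches
// every articulation channel. cloneToChannels is a hypothetical name.
function cloneToChannels(event, channelCount) {
  const clones = [];
  for (let ch = 1; ch <= channelCount; ch++) {
    // Copy the event and retarget its channel
    clones.push({ ...event, channel: ch });
  }
  return clones;
}

// In Logic's Scripter this logic would live inside HandleMIDI(event),
// calling event.send() once per channel instead of returning an array.
```

In real Scripter code you'd check the event type (e.g. only clone ControlChange events) before fanning out; the sketch just shows the duplication logic.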

*Keyswitch Articulation Switching* (Using note data, _not_ UACC)
Using my iPad with Metagrid to change articulations, it works fine 'live' and works fine when assigning articulations to notes in the piano roll.

However, it occasionally 'sticks' on one articulation during playback and doesn't change to others when it should. When I save and reopen the project it plays through correctly, but once I start editing the project and moving around the timeline, it begins to trip up.


----------



## Heinigoldstein

So here is my report of my first full day working with 10.4 and the articulation feature. I used the Babylonwaves maps with SCS and my own for BWW, and it was honestly a complete nightmare.
Articulations were switching all around, and it was pure luck if the right one played. When I was recording it sounded fine; when I played back the recording, the articulation had changed to whatever. Weird CC data, and when editing articulations they jump back to ones I never played. UACC for SCS was worse than normal KS for BWW here, but neither was anywhere near good.
I hope it's not 10.4 in general doing weird things... I'm out of this new feature till the next update, I'm afraid, and moving back to good old SkiSwitcher!


----------



## khollister

itstav said:


> Can someone post a small clip showing how I can make my own sets?
> I use Spitfire libraries so if anyone can make the clip with Albion One or SCS, that will be great.



Go to the Track Inspector to create a new articulation set:





The Logic manual will take you from there on what the fields in the editor do:

https://help.apple.com/logicpro/mac/10.4/#/lgcp33a49091


----------



## khollister

Heinigoldstein said:


> So here is my report of my first full day working with 10.4 and the articulation feature. I used the Babylonwaves maps with SCS and my own for BWW, and it was honestly a complete nightmare.
> Articulations were switching all around, and it was pure luck if the right one played. When I was recording it sounded fine; when I played back the recording, the articulation had changed to whatever. Weird CC data, and when editing articulations they jump back to ones I never played. UACC for SCS was worse than normal KS for BWW here, but neither was anywhere near good.
> I hope it's not 10.4 in general doing weird things... I'm out of this new feature till the next update, I'm afraid, and moving back to good old SkiSwitcher!



The only time I got complete anarchy in a Spitfire lib was when I forgot to select "Lock to UACC".


----------



## babylonwaves

Heinigoldstein said:


> I hope it's not 10.4 in general doing weird things


I don't think that what you describe is all related to possible bugs. The one thing I had to get my head around is that I need to select the articulation I want to play in the dropdown within the instrument UI before I start recording. Unless I do that, the notes I record don't carry any information about which articulation they are supposed to play. Also, I keep everything that belongs to an articulation set on ONE channel. That's not a requirement, but it helps me keep things simple.

sorry for the huge picture ...


----------



## tav.one

Thanks @khollister 
I had no idea that more than one articulation can be played simultaneously; this opens up so many possibilities. Thanks a lot!


----------



## Heinigoldstein

babylonwaves said:


> I don't think that what you describe is all related to possible bugs. The one thing I had to get my head around is that I need to select the articulation I want to play in the dropdown within the instrument UI before I start recording. Unless I do that, the notes I record don't carry any information about which articulation they are supposed to play. Also, I keep everything that belongs to an articulation set on ONE channel. That's not a requirement, but it helps me keep things simple.
> 
> sorry for the huge picture ...



Thanks Marc, I'm pretty sure it's not only bugs. There must be something wrong with either my template and/or the way I use it. It's also the first time I've used UACC, so there were plenty of opportunities for mistakes. But it was such a mess that I don't want to waste much more energy on this at the moment. This is meant to make life easier, and right now it does the opposite for me.

And since I've been fighting with Logic 10's behaviour for a while anyway, it's becoming too frustrating to deal with this as well.


----------



## playz123

Groove 3 has a new set of tutorials on what's new in Logic Pro X 10.4, and articulations are covered there, if anyone is interested. Not free, though.
https://www.groove3.com/tutorials/Logic-Pro-X-10.4-Update-Explained


----------



## procreative

I discovered something about the Smart Controls. The Studio Strings/Brass have a nice tabbed layout with a keyboard showing the articulations, but this design isn't an option anywhere else.

There is a way to get it, though: if you load up an Articulation Set for Studio Strings/Brass and then load a Kontakt instance instead, you can edit the Studio Strings/Brass Articulation Set and still retain the Smart Control interface.


----------



## Alex Fraser

procreative said:


> I discovered something about the Smart Controls. The Studio Strings/Brass have a nice tabbed layout with a keyboard showing the articulations, but this design isn't an option anywhere else.
> 
> There is a way to get it, though: if you load up an Articulation Set for Studio Strings/Brass and then load a Kontakt instance instead, you can edit the Studio Strings/Brass Articulation Set and still retain the Smart Control interface.


I think if you just add an articulation map to any track, you get this on smart controls. Good times.

Edit: I think you need to add switches to the map though.


----------



## tav.one

Do the keyswitch Smart Controls show up on the iPad's Logic Remote? If yes, that would be awesome.


----------



## babylonwaves

Alex Fraser said:


> Edit: I think you need to add switches to the map though.


Exactly: without switches, no key switches in Smart Controls.
An interesting detail about the key switches: if you use a second keyboard (or whatever input device), you can give the switches a different MIDI channel from your recording keyboard, and they won't interfere with the notes you record or limit the range of playable notes. That's what the MIDI channel in the Switches pane is for.


----------



## Alex Fraser

itstav said:


> Do the keyswitch smart control show up on iPad's Logic Remote? If yes, that would be awesome.


Not as far as I can see, but the remote app hasn’t been updated to match the 10.4 release yet. It would be amazing if they appeared.


----------



## resound

Thank you to all who did some investigating after I posted my video. It does indeed look like articulation switches are being sent along with CC data, which is causing the problem. If I change the articulation of the CC data in the event editor to match the articulations being played, then everything plays back correctly. Do we think this is a bug or something they did intentionally? It makes it much harder to work with CC data this way.


----------



## resound

jonathanwright said:


> *MIDI Channel Articulation Switching*
> Works fine, but doesn't 'chase' MIDI CC data. So I'm using a script provided with ARTZID which clones the data to each channel.



You can get Logic to direct CC data to specific channels by changing the articulation of the CC data in the event editor, but as far as I know you can't have it sent to multiple channels at the same time.


----------



## khollister

resound said:


> Thank you to all who did some investigating after I posted my video. It does indeed look like articulation switches are being sent along with CC data, which is causing the problem. If I change the articulation of the CC data in the event editor to match the articulations being played, then everything plays back correctly. Do we think this is a bug or something they did intentionally? It makes it much harder to work with CC data this way.



I definitely vote for bug, but who knows. Spitfire is testing as we speak and promised to post back with what they found. They might have better luck getting the Logic team's attention about this.


----------



## resound

khollister said:


> I definitely vote for bug, but who knows. Spitfire is testing as we speak and promised to post back with what they found. They might have better luck getting the Logic team's attention about this.


That's good to hear. Spitfire to the rescue!


----------



## garyhiebner

procreative said:


> I discovered something about the Smart Controls. The Studio Strings/Brass have a nice tabbed layout with a keyboard showing the articulations, but this design isn't an option anywhere else.
> 
> There is a way to get it, though: if you load up an Articulation Set for Studio Strings/Brass and then load a Kontakt instance instead, you can edit the Studio Strings/Brass Articulation Set and still retain the Smart Control interface.


Awesome. That's a great tip. Thanks!


----------



## procreative

garyhiebner said:


> Awesome. Thats a great tip. Thanks!





It seems I was indeed incorrect: the GUI only shows like that if the trigger method is Notes, and you don't need my tip to get it to show. The minute you add an Articulation Set with Notes as the trigger method, this Smart Control displays.


----------



## karusz

Excuse me: when we play those keyswitching notes assigned in Logic, does Logic show them on the track as C0, C1, etc., or just skip them?


----------



## gpax

babylonwaves said:


> Exactly: without switches, no key switches in Smart Controls.
> An interesting detail about the key switches: if you use a second keyboard (or whatever input device), you can give the switches a different MIDI channel from your recording keyboard, and they won't interfere with the notes you record or limit the range of playable notes. That's what the MIDI channel in the Switches pane is for.


I have to ask, Marc, and through no fault of your own: how much testing were you able to do before releasing your 2.0 update so close to the Logic 10.4 update?

My first-ever experience routing through UACC was via your time-saving Spitfire maps, hence my sending a support ticket your way, as I was uncertain at the time what was going on with the CC1 data affecting the articulations themselves.

The issues are perhaps out of your control, but might you also consider offering alternatives to the Spitfire UACC maps until this is sorted out?


----------



## Alex Fraser

karusz said:


> Excuse me: when we play those keyswitching notes assigned in Logic, does Logic show them on the track as C0, C1, etc., or just skip them?


They don't appear anywhere.


----------



## karusz

1. Is there no way to use this system with anything that needs several instructions (KS+CC, or KS1+KS2, etc.), like Synchron Strings or Vienna Instruments? Am I right?
2. Also, in Spitfire strings, is it not possible to press two keys at the same time (to get double articulations at once, e.g. staccato and pizzicato)?
3. Is there a way to have the articulation names written anywhere, so I can work with this from an iPad but still see the names? Any ideas?


----------



## Alex Fraser

karusz said:


> 1. Is there no way to use this system with anything that needs several instructions (KS+CC, or KS1+KS2, etc.), like Synchron Strings or Vienna Instruments? Am I right?
> 2. Also, in Spitfire strings, is it not possible to press two keys at the same time (to get double articulations at once, e.g. staccato and pizzicato)?
> 3. Is there a way to have the articulation names written anywhere, so I can work with this from an iPad but still see the names? Any ideas?


I'm waiting for some audio to bounce, so....

1. Not sure, but I remember a VSL user working out a system using both key and velocity for a double trigger setup.
2. You can edit and change articulations after recording on an individual note basis. This way, you can stack the articulations as much as you wish. I'm not sure about layering the articulations via key switch - I haven't played enough with the system. Perhaps someone else can chime in.
3. You could wait for Logic Remote to be updated. We're hoping the smart controls articulations will display there. You could always set up something using TouchOSC or similar, depending on your commitment to the cause.


----------



## babylonwaves

gpax said:


> The issues are perhaps out of your control, but might you also consider offering alternatives to the Spitfire UACC maps, until this is sorted out?



hi @gpax - if I need to, I'll consider it


----------



## nbd

Hello. I'm planning to add my custom articulation sets here:

https://github.com/nobodo/logic-pro-x-articulation-sets

Feel free to add your own if you wish to share them. If you have a GitHub account, just ask for write permission, or you can PM me the articulation files.


----------



## karusz

nbd said:


> Hello. I'm planning to add my custom articulation sets here:
> 
> https://github.com/nobodo/logic-pro-x-articulation-sets
> 
> Feel free to add your own if you wish to share them. If you have github account, just ask for write permission, or you can pm me with the articulation files.


Hello - good idea. Do you have anything already?


----------



## nbd

I'm currently checking how my existing MIDI script for CSS converts to this new system. It used CC and never had any timing issues, because I made it send the CC just before the actual note, which was easy to accomplish with the MIDI Scripter. I don't know how the new system does it; if it doesn't send the CC before the note-on, there will be problems. Maybe then I'll just throw in the MIDI Scripter as well so the two can be used together. At least the new system brings articulation names to the table, so it's much better even with just that.
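The "send the CC just before the note" trick amounts to scheduling the switch event a few ticks earlier than the note it belongs to. A minimal sketch in plain JavaScript (`scheduleSwitch` and the data shape are hypothetical illustrations, not nbd's actual script):

```javascript
// Sketch: given a note's beat position, build the articulation CC
// switch slightly earlier, so the sampler changes articulation before
// the note-on arrives. Names and event shape are hypothetical.
const SWITCH_LEAD_BEATS = 0.01; // a few ticks ahead of the note

function scheduleSwitch(noteBeatPos, ccNumber, ccValue) {
  return {
    type: 'controlchange',
    number: ccNumber,
    value: ccValue,
    beatPos: noteBeatPos - SWITCH_LEAD_BEATS, // fires before the note
  };
}

// In Logic's Scripter, the equivalent would send the CC with
// event.sendAtBeat(beatPos) from inside HandleMIDI.
```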

Also planning to add some Shreddage library support, then maybe NI String & Brass ensemble.

Edit: CSS, take one ready


----------



## TGV

I've submitted a pull request for the list with Spitfire UACC articulations.


----------



## babylonwaves

For those who find articulations sometimes not switching with Spitfire instruments: I've talked with their support team. It seems the way to go with Logic Pro 10.4 is to use UACC KS instead of the UACC controller. For those who use my templates, I've just issued an update and sent you an e-mail with instructions.


----------



## jonathanwright

I use regular keyswitches with Spitfire and the articulations don’t always take.

The same issue occurs with Cinematic Strings.


----------



## Alex Fraser

babylonwaves said:


> For those who find articulations sometimes not switching with Spitfire instruments: I've talked with their support team. It seems the way to go with Logic Pro 10.4 is to use UACC KS instead of the UACC controller. For those who use my templates, I've just issued an update and sent you an e-mail with instructions.


Interesting. Did they give a specific reason why?


----------



## babylonwaves

Alex Fraser said:


> Interesting. Did they give a specific reason why?


The controller isn't designed for per-note switching; it's about timing. If you hard-quantise, it gets problematic because the events land on the same timestamp. UACC KS (key switches) can deal with this. That also explains why it worked pretty well for me: I hardly quantise my strings. A workaround is to push the notes you want to equip with an articulation change back a little in time.
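The workaround Marc describes (pulling articulation-change notes slightly off the grid so the switch and the note don't share a timestamp) can be sketched in plain JavaScript. The data shape and the one-tick nudge are illustrative assumptions, not Logic's internal representation:

```javascript
// Sketch: after hard-quantising, nudge any note that introduces a new
// articulation one tick earlier, so its switch event is processed before
// other events sharing the same timestamp. Hypothetical note objects.
const NUDGE_TICKS = 1; // 1 tick at 960 PPQ is inaudible

function nudgeArticulationChanges(notes) {
  let lastArt = null;
  return notes.map(n => {
    const changed = n.articulationID !== lastArt;
    lastArt = n.articulationID;
    // Only notes that change articulation get pulled back in time
    return changed ? { ...n, tick: n.tick - NUDGE_TICKS } : n;
  });
}
```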


----------



## Alex Fraser

babylonwaves said:


> The controller isn't designed for per-note switching; it's about timing. If you hard-quantise, it gets problematic because the events land on the same timestamp. UACC KS (key switches) can deal with this. That also explains why it worked pretty well for me: I hardly quantise my strings. A workaround is to push the notes you want to equip with an articulation change back a little in time.


Thanks for that, good to know. I did a quick experiment with a new Albion ONE map. The UACC KS method seems to work just fine. The GUI still reacts to CC1 articulation events; my solution is simply not to look at it.


----------



## gpax

babylonwaves said:


> hi @gpax - if I need to, I'll consider it


Of course it's a moot point now, since the 2.2 update and Spitfire's guidance remedy this. Thanks!


----------



## khollister

Alex Fraser said:


> Thanks for that, good to know. I did a quick experiment with a new Albion ONE map. The UACC KS method seems to work just fine. The GUI still reacts to CC1 articulation events; my solution is simply not to look at it.



LOL. I had started playing with UACC KS myself right before I got the update email from Marc. So far so good, although I was never able to create the problem on demand, so time will tell.

Interestingly, I never recall any issues using UACC with Cubase expression maps, so I still think Apple is drawing the foul on this one.


----------



## procreative

Still can't decide if the CC1 data containing articulation info is a problem or not. It plays back okay.

In older Logic, without an Articulation Set loaded, the Event List shows IDs rather than articulation names, but the data is essentially the same; it just displays as the ID number.






It seems like ID 0 in 10.4 is whatever the first articulation is (maybe there is no ID 0, so it falls back to ID 1).






Maybe it's a non-issue; however, with the Kontakt plugin open you will see the articulations switching to the first slot between notes as the mod wheel data triggers them.

Maybe there should be an equivalent of ID 0, using the first slot as an empty articulation?


----------



## khollister

Tried Marc's new Iceni map and wasn't getting any articulation switching. Then I realized he had mapped the KS to C-2 and the Spitfire default was several octaves up. Pulled the KS key down to C-2 and all is well.


----------



## Alex Fraser

procreative said:


> Still can't decide if the CC1 data containing articulation info is a problem or not. It plays back okay.
> 
> In older Logic, without an Articulation Set loaded, the Event List shows IDs rather than articulation names, but the data is essentially the same; it just displays as the ID number.
> 
> It seems like ID 0 in 10.4 is whatever the first articulation is (maybe there is no ID 0, so it falls back to ID 1).
> 
> Maybe it's a non-issue; however, with the Kontakt plugin open you will see the articulations switching to the first slot between notes as the mod wheel data triggers them.
> 
> Maybe there should be an equivalent of ID 0, using the first slot as an empty articulation?


I mentioned a few threads back that I found particular setups where having the articulation data attached to CC1 etc. was useful.

I haven't done a serious test yet, but I guess, like you say, if it plays back okay...


----------



## Alex Fraser

khollister said:


> Tried Marc's new Iceni map and wasn't getting any articulation switching. Then I realized he mapped the KS to C-2 and the Spitfire default was several octaves up. Pulled the KS key down to C-2 and all is well


I..er...may have done something similar.. <cough>


----------



## PJMorgan

jonathanwright said:


> I use regular keyswitches with Spitfire and the articulations don’t always take.
> 
> The same issue occurs with Cinematic Strings.



I was having this problem with LASS, and I also noticed that NV to V with CC2 wasn't working properly either. So I set all CCs to Legato; now CC2 is working as it should, and the articulations seem to be holding. I think setting the CC to something other than "-" helps.

I also noticed that if you draw in CC automation it defaults to "-", but if you double-click a node and change it to an articulation and then add more nodes, they remain the same as the first selected articulation until you change to another in the event list. It can be a bit tedious working this way, but it's not too bad if you create automation curves rather than drawing them in.


----------



## procreative

So I did a test. I set the first slot (which equals ID 1) in a set to "-" and set the output to something I know is out of range for triggering an articulation. In my test it was CSS, which uses CC58, so I set the value to 127.

I then recorded CC data, trying both CC1 and CC2. Both seem to send an ID of 1 with the CC data. I did try using ID 0, but it still records the CC on ID 1.

So because ID 1 is now "-" and outputs data that doesn't target an articulation, no more GUI madness. It does mean that the first slot does nothing, but maybe that's not a bad thing, as it forces you to choose?

Below are the settings I used as an example. The fields to note are the articulation name, the ID number, and the output of the first slot.

To conclude: whatever CC data you record seems to carry ID 1 as well. Not sure if this is a bug, but it seems like one, and I cannot think of another way to filter it out.

I also suspect the flickering between articulations might create extra CPU drain.


----------



## khollister

Just heard back from my Spitfire ticket on this - they did considerable testing and are preparing a new Knowledge Base article to cover using the 10.4 articulation sets. We have part of the answer thanks to Marc from Babylonwaves, but it sounds like there is more on the way.


----------



## khollister

Here's the Spitfire knowledge base article: 
https://spitfireaudio.zendesk.com/h...0-4-Articulation-Sets-with-Spitfire-libraries

I believe you may need to login to your Spitfire account to access it.


----------



## VinRice

Looks like it's Template Weekend again...


----------



## VinRice

Good work everybody sorting this out... do we have definitive CSS and Berlin workflows yet?


----------



## Wunderhorn

khollister said:


> Here's the Spitfire knowledge base article:
> https://spitfireaudio.zendesk.com/h...0-4-Articulation-Sets-with-Spitfire-libraries



Now, this article doesn't talk about any possible issues coming up; it just suggests which UACC to use. I haven't had a chance to try it out myself, but is this now a safe method to work with?

BTW, I feel very grateful to everyone here figuring things out. I really hope the Logic team is reading this thread; they couldn't have better help finding the bugs than this.


----------



## robh

procreative said:


> So I did a test. I set the first slot (which equals ID 1) in a set to "-" and set the output to something I know is out of range for triggering an articulation. In my test it was CSS, which uses CC58, so I set the value to 127.
> 
> I then recorded CC data, trying both CC1 and CC2. Both seem to send an ID of 1 with the CC data. I did try using ID 0, but it still records the CC on ID 1.
> 
> So because ID 1 is now "-" and outputs data that doesn't target an articulation, no more GUI madness. It does mean that the first slot does nothing, but maybe that's not a bad thing, as it forces you to choose?
> 
> Below are the settings I used as an example. The fields to note are the articulation name, the ID number, and the output of the first slot.
> 
> To conclude: whatever CC data you record seems to carry ID 1 as well. Not sure if this is a bug, but it seems like one, and I cannot think of another way to filter it out.
> 
> I also suspect the flickering between articulations might create extra CPU drain.


I can verify this test; however, I found that you can leave the ID 1 type, selector, and value blank as well.

Rob


----------



## Alex Fraser

procreative said:


> So I did a test. I set the first slot (which equals ID 1) in a set to "-" and set the output to something I know is out of range for triggering an articulation. In my test it was CSS, which uses CC58, so I set the value to 127.
> 
> I then recorded CC data, trying both CC1 and CC2. Both seem to send an ID of 1 with the CC data. I did try using ID 0, but it still records the CC on ID 1.
> 
> So because ID 1 is now "-" and outputs data that doesn't target an articulation, no more GUI madness. It does mean that the first slot does nothing, but maybe that's not a bad thing, as it forces you to choose?
> 
> Below are the settings I used as an example. The fields to note are the articulation name, the ID number, and the output of the first slot.
> 
> To conclude: whatever CC data you record seems to carry ID 1 as well. Not sure if this is a bug, but it seems like one, and I cannot think of another way to filter it out.
> 
> I also suspect the flickering between articulations might create extra CPU drain.


Thanks for that!
So the idea is that you record your CC curves while the "blank" articulation is activated?

I guess post-performance you can shift your CC curve data to the blank articulation?

Good work, guys.


----------



## resound

For those of you who use SkiSwitcher and/or ARTzID, I HIGHLY recommend getting the new retrofit from Peter. You get the best of both worlds: continue to use your old SS/ARTzID scripts, but no longer need the macro, and gain all the new features like changing the articulation in the piano roll and assigning articulation symbols in the Score Editor. And no issues with CC data sending articulation-switching messages!


----------



## khollister

resound said:


> For those of you who use SkiSwitcher and/or ARTzID, I HIGHLY recommend getting the new retrofit from Peter. You get the best of both worlds: continue to use your old SS/ARTzID scripts, but no longer need the macro, and gain all the new features like changing the articulation in the piano roll and assigning articulation symbols in the Score Editor. And no issues with CC data sending articulation-switching messages!



What retrofit? I purchased ARTzID a couple of years ago before going the PC/Cubase route, but I have apparently fallen off the email list. I guess I need to send Peter a note and get the latest version.


----------



## resound

khollister said:


> What retrofit? I had purchased ArtzID a couple years ago before going the PC/Cubase route but I have apparently fallen off the email list. I guess I need to send Peter a note and get the latest version.


Yeah, just email Peter and ask for the retrofit and he will send it to you. Basically it's an Articulation Set that you can load that lets you use the script plugins, bypass the macro, and gain all the new features. Best of both worlds, and definitely worth getting if you're already using any ARTzID scripts.


----------



## procreative

Alex Fraser said:


> Thanks for that!
> So, the idea is that you record your cc curves whilst the “blank” articulation is activated?
> 
> I guess post performance, you can shift your cc curve data to the blank articulation?
> 
> Good work, guys.



From what I can tell, it does not matter what you select. It's more that Logic is creating articulation data from any CC data; any CC input from hardware seems to generate ID 1.

I have to decide whether making ID 1 blank sets up a complicated system, and it's a guess whether this will ever be fixed, or fixed but not for a long while.


----------



## resound

procreative said:


> From what I can tell, it does not matter what you select. It's more that Logic is creating articulation data from any CC data; any CC input from hardware seems to generate ID 1.
> 
> I have to decide whether making ID 1 blank sets up a complicated system, and it's a guess whether this will ever be fixed, or fixed but not for a long while.


What's happening is Logic is sending articulation-switching messages whenever it encounters a new articulation ID, even if the ID is attached to CC data. So as long as the IDs of the CC data and the notes match, Logic won't send unneeded articulation-switching messages. I imagine they'll issue a fix eventually, but who knows when. Having a "blank" articulation in the first slot seems like a good workaround for now. Or you can pick up ARTzID and expand on Logic's articulation system, making it even more powerful.


----------



## procreative

resound said:


> What's happening is Logic is sending articulation-switching messages whenever it encounters a new articulation ID, even if the ID is attached to CC data. So as long as the IDs of the CC data and the notes match, Logic won't send unneeded articulation-switching messages. I imagine they'll issue a fix eventually, but who knows when. Having a "blank" articulation in the first slot seems like a good workaround for now. Or you can pick up ARTzID and expand on Logic's articulation system, making it even more powerful.



True, except it's always ID 1 attached to CC events. I actually have ARTzID; I don't think it eliminates the creation of these events, though I think it stops them triggering.

I encourage people to file a bug report; hopefully if enough of us do, they may take note.

I wondered if an environment fix might do it, or a MIDI FX plugin on the track to stop the events triggering. But I cannot think of a way to target ID 1, as articulation IDs don't seem to be an option in a Transformer object. I'm sure there's probably a way with a Scripter MIDI FX script, though.
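A Scripter-style sketch of that filtering idea in plain JavaScript. This is a hypothetical sketch, untested in Logic: it models stripping the articulation ID from CC events so only notes carry switching information (treating ID 0 as "none" is an assumption here).

```javascript
// Sketch: clear the articulation ID on CC events so only notes carry
// switching information. Plain-JS model of a Scripter MIDI FX filter;
// the event objects and the "0 means no articulation" convention are
// assumptions for illustration.
function stripCCArticulation(events) {
  return events.map(e =>
    e.type === 'controlchange'
      ? { ...e, articulationID: 0 } // drop the CC-carried ID
      : e                           // notes keep their IDs
  );
}

// A Scripter equivalent might look like (inside HandleMIDI):
//   if (event instanceof ControlChange) event.articulationID = 0;
//   event.send();
```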


----------



## resound

procreative said:


> True except its always ID1 attached to CC events. I actually have ArtzID, however I don't think this eliminates the creation of events. However I think it stops the events triggering.
> 
> I encourage people to file a bug report as hopefully if enough do it they may take note.
> 
> I wondered if an environment fix might do it, or a MidiFX on the track to stop the events triggering. But I cannot think of a way to target ID1 as they dont seem to be an option in a Transformer object. But I am sure there is probably a way with a MidiFX script.


No, it's not always ID 1 for CC data. Like I said, when Logic sees a new articulation ID it sends an articulation switch message. So if you play a bunch of notes on, say, ID 4, then record some CC data on ID 5, you'll see the plugin jump to articulation 5.

If you have ARTzID you should definitely get the retrofit he just put out. It's really easy to install and gives you the benefits of the new system without the bugs.


----------



## procreative

resound said:


> No, it's not always ID 1 for CC data. Like I said, when Logic sees a new articulation ID it sends an articulation switch message. So if you play a bunch of notes on, say, ID 4, then record some CC data on ID 5, you'll see the plugin jump to articulation 5.
> 
> If you have ARTzID you should definitely get the retrofit he just put out. It's really easy to install and gives you the benefits of the new system without the bugs.



So far that's not what I have experienced, but maybe my tests are done differently.

For me, whatever IDs are contained in the notes has no bearing on the ID attached to CC data. In my tests CC data always carries ID 1, so if Sustain is on ID 1 in the Articulation Set, every CC entry shows Sustain and therefore tries to trigger that articulation.

In itself this doesn't seem to do much, as the ID isn't attached to a note, so nothing sounds.

Maybe it's the way I have been testing, but I have only ever seen ID 1 attached to CCs.

Are you getting a different result, then?

PS: I have been speaking to Peter, and I don't think using ARTzID with the Logic system will alleviate this. At this rate I think I might hold off and see if there's a fix. In the meantime I can carry on using ARTzID...


----------



## Alex Fraser

Unless I'm massively off-base...
It's not the articulation ID that needs targeting, but the midi output triggered and sent to Kontakt when Logic reads an articulation event, as defined in the "output" tab of the articulation map.
But that data can't be blocked or stripped, because it's the same data that Kontakt relies on to switch articulations, just via a different "carrier" event.

I think the whole issue isn't a bug per se, rather a "design decision with unforeseen ramifications." And I'd have to say that having CC modulation pass articulation changes to Kontakt hasn't caused me any real issues.

So the current options would seem to be:

1. Close Kontakt and ignore the GUI, assuming the sound is what you expect.
2. Remove CC dynamics etc. when not required (e.g. during "shorts").
3. Remap any CC curve IDs to the currently playing articulation.
4. Use the "blank" ID tip.

Bear in mind I'm little more than a Code Monkey and there are others on this forum who'll know better. This is just my take. Like someone said in the previous thread, we're in tin-foil-hat territory and should probably let it go until the first round of updates.


----------



## resound

procreative said:


> So far, thats not what I have experienced, but maybe my tests are done differently.
> 
> For me whatever IDs are contained in the notes has no bearing on the ID attached to CC data. In my tests CC data always contains ID1 so if Sustain is on ID1 in the Articulation Set, every CC entry shows Sustain and therefore tries to trigger that articulation.
> 
> In itself it does not seem to do too much as this ID is not attached to a note so nothing sounds.
> 
> Maybe its the way I have been testing, but I have only seen ID1 being attached to CCs.
> 
> Are you getting a different result then?
> 
> PS, I have been speaking to Peter and I dont think using ArtzID with the Logic system will alleviate this. At this rate I think I might hold off and see if there is a fix. In the meantime I can carry on using ArtzID...


Try this:
- record some notes using one ID, let's say ID4
- skip ahead a few bars and record some CC data, i.e. just hit record and move the mod wheel
- open the event list and change the articulation ID of the CC data to a different articulation, let's say ID5
- open Kontakt and play from the beginning of the project
- watch articulation 4 trigger when the notes begin to play, then watch articulation 5 trigger when it reaches the CC data

When Logic sees a new ID (ID5 in this case), even when that ID is attached to CC data, it sends an articulation switching message (CC32, keyswitch, program change, etc., whatever you are using to change articulations).

I have also been speaking to Peter, and he is the one who helped me figure out exactly what was going on. I'm not sure how you reached the conclusion that ARTzID with the retrofit doesn't alleviate this problem? I got the retrofit installed and everything is working perfectly.
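The repro above can be modelled in a few lines. This is a plain-JavaScript sketch of the *observed* behaviour, not Logic's actual code or the Scripter API; the event shapes and the CC32 output are illustrative assumptions:

```javascript
// Model of the behaviour described: Logic emits an articulation switch
// whenever the articulation ID attached to *any* event changes, even
// when that event is a CC rather than a note.
function playback(events) {
  const sent = [];   // articulation switch messages Logic would send
  let lastID = null;
  for (const ev of events) {
    if (ev.articulationID != null && ev.articulationID !== lastID) {
      // Logic sends whatever the Articulation Set's Output tab defines;
      // CC32 is used here as an example (UACC-style switching).
      sent.push({ type: "cc", number: 32, value: ev.articulationID });
      lastID = ev.articulationID;
    }
  }
  return sent;
}

// Repro from the post: notes tagged ID4, then CC data re-tagged to ID5.
const timeline = [
  { type: "noteOn", pitch: 60, articulationID: 4 },
  { type: "noteOn", pitch: 62, articulationID: 4 },
  { type: "cc", number: 1, value: 90, articulationID: 5 }, // mod wheel data
];
console.log(playback(timeline)); // two switches: to 4, then to 5
```

The second note on ID4 sends nothing; the switch to 5 fires purely because of the CC event, which is the surprise being discussed.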


----------



## resound

Alex Fraser said:


> Unless I'm massively off-base...
> It's not the articulation ID that needs targeting, but the midi output triggered and sent to Kontakt when Logic reads an articulation event, as defined in the "output" tab of the articulation map.
> But that data can't be blocked or stripped, because it's the same data that Kontakt relies on to switch articulations, just via a different "carrier" event.
> 
> I think the whole issue isn't a bug per se, rather a "design decision with unforeseen ramifications." And I'd have to say that having CC modulation pass articulation changes to Kontakt hasn't caused me any real issues.
> 
> So the current options would seem to be:
> 
> Close Kontakt and ignore the GUI assuming the sound is what you expect.
> Remove CC dynamics etc when not required (eg during "shorts.)
> Remap any CC curves IDs to the currently playing articulation.
> Use the "blank" ID tip.
> Bear in mind I'm little more than a Code Monkey and there are others on this forum who'll know better. This is just my take. Like someone said in the previous thread, we're in tin-foil hat territory and should probably let it go until the first round of updates.


Yes, exactly this. I think the fix Logic should implement would be to have CC data default to ID0 ("----") and NOT have ID0 trigger articulation switches.


----------



## khollister

Resound's approach seems to work great; I just tried it out on a couple of tracks. The only slightly confusing thing is that the articulation IDs will no longer match the UACC-KS values (no big deal, but it did keep things simple). And of course I have to edit all of Marc's maps 

Oops, it's procreative's tip, sorry


----------



## Alex Fraser

resound said:


> Yes, exactly this. I think the fix Logic should implement would be to have CC data default to ID0 ("----") and NOT have ID0 trigger articulation switches.


A fourth tab in the articulation map: "Filter"
A list where you can define any midi event (note on, CC etc) as a "non articulation event" on which articulation ID's won't be attached. So, you can set up your dynamics, vibrato etc as "clean" data in one place.


----------



## resound

Alex Fraser said:


> A fourth tab in the articulation map: "Filter"
> A list where you can define any midi event (note on, CC etc) as a "non articulation event" on which articulation ID's won't be attached. So, you can set up your dynamics, vibrato etc as "clean" data in one place.


That would be great, as long as there was a global option to filter all CCs. It would be tedious to filter specific CCs one by one.


----------



## procreative

resound said:


> Yes, exactly this. I think the fix Logic should implement would be to have CC data default to ID0 ("----") and NOT have ID0 trigger articulation switches.



Here's the thing though: if you draw in the CC data, it does default to "–" or 0. It only seems to attach ID1 when the CC data is played in using an external controller.

That's why I wonder if there might be a way to hijack the CC data being recognised as ID1?



resound said:


> When Logic sees a new ID (ID5 in this case), even when that ID is attached to CC data, then it sends an articulation switching message (CC32, keyswitch, program change, etc. whatever you are using to change articulations)



I have not seen CC data recorded with anything other than ID1, and I don't get the scenario you mention. I cannot see an issue with CC data triggering incorrect articulations, as the next note that plays will have the right ID, and only notes with IDs trigger actual output.

I can see the IDs attached to CC messages are sending keyswitches, but these get bypassed by the ones attached to notes.


----------



## khollister

I’m pretty sure that in my case, a “-“ is always ID1, not 0


----------



## Peter Schwartz

khollister said:


> What retrofit?




No one's fallen off the mailing list intentionally, I can assure you! Yes, there's a Retrofit for both ARTz•ID and SkiSwitcher that links the best of the 10.4 features to those systems while being immune from the problems being discussed here. I'll be posting more about this in the coming days too.


----------



## rlw

Peter Schwartz said:


> No one's fallen off the mailing list intentionally, I can assure you! Yes, there's a Retrofit for both ARTz•ID and SkiSwitcher that links the best of the 10.4 features to those systems while being immune from the problems being discussed here. I'll be posting more about this in the coming days too.




Ready for the retrofit. I am having an issue where, when I press play or record, the articulation from my second keypad is changed to another articulation ID that I did not choose. If I have a cycle record going, Logic switches to some default ID and I have to trigger my desired articulation again. This is something new in 10.4. I tried to switch back to Logic 10.3.3, but the project I was working on was corrupted with this bug, and 10.3.3 did not fix a project that had been saved in 10.4. I'm hoping your retrofit will help with this articulation triggering that takes place whenever I press play or record. Once I get the retrofit I will test whether this problem still exists; otherwise I will try to make a video to capture the issue.


----------



## Peter Schwartz

On the subject of CCs with articulation names, and ARTz•ID (and SkiSwitcher) being immune from their wily ways as discussed here on vi-control... I can confirm that my systems are unaffected by these problems. The reason they're immune from this CC problem is that the Scripts only respond to the IDs of notes. All non-note events (CCs, aftertouch, pitchbend, etc.) are simply passed through.

FWIW... the IDs attached by Logic to any kind of MIDI event never make it out of Logic. They're 100% internal, Logic-specific "tags" attached to MIDI events and never see the light of day outside of Logic. So when a CC with an articulation name like "Staccato Con Sordino e Molto Fantastico" appears in your event list, well, that's just a name associated with an articulation ID number. All your plugins ever see are everyday MIDI events, just as we've known and loved them for over 30 years.
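The "respond only to note IDs, pass everything else through" design can be sketched in a few lines. This is plain JavaScript modelling the approach, not actual ARTz•ID or Scripter code; the ID-to-keyswitch table is a made-up example:

```javascript
// Example mapping from articulation ID to a keyswitch pitch (hypothetical).
const idToKeyswitch = { 1: 24, 2: 25, 3: 26 };

// Act only on notes carrying an articulation ID; pass every non-note
// event (CCs, pitchbend, aftertouch) through completely untouched.
function handleEvent(ev, out) {
  if (ev.type === "noteOn" && ev.articulationID in idToKeyswitch) {
    // Emit the keyswitch just before the note itself.
    out.push({ type: "noteOn", pitch: idToKeyswitch[ev.articulationID], velocity: 1 });
  }
  out.push(ev); // CCs with stray IDs fall through here and cause no switch
}

const out = [];
[
  { type: "cc", number: 1, value: 64, articulationID: 1 },  // rogue ID on a CC: ignored
  { type: "noteOn", pitch: 60, velocity: 100, articulationID: 2 },
].forEach(ev => handleEvent(ev, out));
console.log(out); // the CC untouched, then keyswitch 25, then the note
```

Because the branch checks the event type before the ID, a CC tagged ID1 never generates a switch, which is exactly the immunity being described.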


----------



## procreative

So I have a sort of solution.

In an attempt to turn my Mackie MCU into a motorised CC controller with MIDI feedback, I created a MIDI FX script which (credit where it's due) I modified from a free script by Logicscripts (yes, I checked he was okay with it).

Basically it creates automation that drives MIDI-CC-based controls in instruments, but no MIDI CC is recorded into Logic. Instead, via Smart Controls set up for each hardware controller, it records automation.

The by-product is that, with no controller data being recorded, there are no rogue articulation IDs in the Event Editor.

Two minor downsides:

1. Put a load of tracks in a Track Stack and, while the automation still works, the hardware associated with it no longer does. This is either intentional design or a bug, but it's been there through many updates, so maybe they don't consider it a bug.

2. MIDI CC is now automation. Not sure if this is a big deal or not. Komplete Kontrol works the same way though; all their lovely "NKS" controls are simply premapped automation. Kind of like souped-up Smart Controls.

I have attached a Patch for Logic that contains the script on a Kontakt channel for anyone who wants to give it a go. You will have to edit the Smart Controls to link your own hardware controller though.

EDIT: You can change any of the CCs and names in the MIDI FX script if you are semi-competent at editing scripts; it's pretty self-explanatory.
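The core idea, automation in, CC out, can be sketched like this in plain JavaScript. In real Scripter this would live in `ParameterChanged` with `PluginParameters` definitions; the parameter names and CC numbers here are assumptions for illustration:

```javascript
// Hypothetical parameter list: each host-automation lane maps to one CC.
const params = [
  { name: "Dynamics", cc: 1 },    // mod wheel
  { name: "Expression", cc: 11 },
];

// Called when a host-automation value (written by a Smart Control) changes.
// The value is converted to a MIDI CC and sent onward to the instrument,
// so no CC data ever lands in the recorded region.
function parameterChanged(index, value, send) {
  // value arrives as 0..127 from the Smart Control mapping (assumed range)
  send({ type: "cc", number: params[index].cc, value: Math.round(value) });
}

const sent = [];
parameterChanged(0, 96, msg => sent.push(msg));
console.log(sent); // [{ type: "cc", number: 1, value: 96 }]
```

Since the CC exists only downstream of the script, Logic records automation instead, which is why no rogue articulation IDs appear in the Event Editor.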


----------



## Alex Fraser

procreative said:


> So I have a sort of solution.
> 
> In an attempt to turn my Mackie MCU into a motorised CC controller with midi feedback I created a Midi FX script which (credit where its due) I modified from a free script by Logicscripts (yes I checked he was okay with it).
> 
> Basically it creates automation that drives midi CC based controls in instruments, but no midi CC is recorded into Logic. Instead via Smart Controls setup for each hardware controller, it records automation.
> 
> The by product of this is that no controller data being recorded means no rogue Articulation IDs in the Event Editor.
> 
> Two minor downsides:
> 
> 1. Put a load of tracks in a Track Stack and while the automation still works, the hardware associated with it no longer works. This is either by intentional design, or a bug. But its been there through many updates so maybe they dont consider it a bug.
> 
> 2. Midi CC is now automation. Not sure if this is a big deal or not. Komplete Kontrol works the same way though, all their lovely "NKS" controls are simply premapped automation. Kind of like a souped up Smart Controls.
> 
> I have attached a Patch for Logic that contains the script on a Kontakt channel for anyone that wants to give it a go. You will have to edit the Smart Controls to link your own hardware controller though.
> 
> EDIT: You can change any of the CCs and Names in the Midi FX Script if you are semi-competent at editing scripts, its pretty self explanatory.


That's a great idea. Downloaded and had a quick play, thanks so much for sharing.
Works a treat. I guess a nice side effect is that now with a bit of work, dynamics, vibrato etc can all be labeled in the piano roll. And now with 10.4 showing host automation in the piano roll too, it's quite workable.

So I assume all you have to do is set your midi hardware to drive Logic's smart controls and you're all set?
I doff my hat to you.
A


----------



## procreative

Alex Fraser said:


> That's a great idea. Downloaded and had a quick play, thanks so much for sharing.
> Works a treat. I guess a nice side effect is that now with a bit of work, all your dynamics be labeled in the piano roll. And now with 10.4 showing host automation in the piano roll too, it's quite workable.
> 
> So I assume all you have to do is set your midi hardware to drive Logic's smart controls and you're all set?
> I doff my hat to you.
> A



Correct. If you click the i symbol in the Smart Control window, there is a learn control button at the bottom of the left pane, just like any other control assignment.

What started this off for me is that a Mackie MCU has a Smart Controls layer. It automatically populates the first 8 controls and labels them. Standard layout is to put them on the rotary pots, but using Fader Flip suddenly they are controllable via motorised faders!

I experimented and decided that for edits I would flip the faders, but regular playback I put them back to the rotary pots as its less distracting having faders jigging about.

I think this is the only way I know of to have automatable motorised CC faders. Icon make a really nice compact fader pack that would be perfect for this, called the Platform X.

Cannot decide if there is any detriment to controlling CCs via automation.


----------



## Alex Fraser

procreative said:


> Cannot decide if there is any detriment to controlling CCs via automation.


Well - I guess you can no longer see CC data in the arrange window regions. Sometimes useful as a rough guide.


----------



## procreative

You can if you choose it from the little menu to the left of the track.


----------



## procreative

Like this:


----------



## Alex Fraser

procreative said:


> Like this:


I'm resisting the urge to abandon my current work and instead fiddle with this.


----------



## Ashermusic

While what has been added in 10.4 is terrific, it is only phase 1. Allow me to recommend that you use Peter's SkiSwitcher Retrofits with it; they add so much!


----------



## procreative

One thing that's a bit unfulfilled is the score symbol support. A lot of standard symbols like Tremolo, Legato and Trills are missing, and as the set is buried inside the resources somewhere, there's no way to add custom icons.

Kind of pointless having them if they are incomplete...


----------



## Peter Schwartz

@procreative, I agree, but I think it's a step in the right direction. And I have faith that the devs know about these limitations and will add more symbols.


----------



## procreative

It's also a pity there is no way to change articulations embedded in recorded notes via the same external controller used to select them prior to recording, or by assigning a Smart Control to articulation.

While changing them via the menus in the Piano Roll or Event List is fine, it would be nice to have a way to do this via an up/down button or using Lemur/TouchOSC or such.


----------



## babylonwaves




----------



## cyrilblanc

I have experimented with an Environment solution to send the Y value of the VSL matrices.
It is quite simple.
I wanted to attach screenshots but it asks me for a URL!

I use the velocity of the articulation note:
A1 velocity 1 gives Y = 1
A1 velocity 2 gives Y = 2

0) Define instruments called Violin, Viola, Cello (a, b, c, ...) with MIDI channels set from 1 upwards
1) Set a "condition true" transformer AA to extract the articulation on the top cable (<= B1)
2) Define transformer A as "apply operation and let non-matching pass thru": test if note velocity = 1 (the Y value) and set the output to what you need, for example CC1 = 0
3) Define another transformer B: test if note velocity = 2 (the Y value) and set the output to what you need, for example CC1 = 50

... and do the same for the other values

4) Add the VSL instrument BB
5) Link AA to A, A to B, B to C, ... and on to BB
6) Also link the second branch of AA to BB
7) Link a, b, c to AA

To check, use the Monitor object. Playing D3 you should get:
A1 (X)
CC 1 1 (Y)
D3 (the note played)
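The transformer chain above amounts to a simple velocity-to-CC lookup. A plain-JavaScript sketch of the same mapping (not Environment objects; the 0/50/100 CC values follow the examples in the steps, with 100 as an assumed third row):

```javascript
// Velocity of the articulation note (the Y value) -> CC1 value.
// 1 -> 0 and 2 -> 50 are the examples from the post; 3 -> 100 is assumed.
const yToCC1 = { 1: 0, 2: 50, 3: 100 };

// One function standing in for the chain of transformers A, B, C...
function articulationNoteToY(noteOn) {
  const v = yToCC1[noteOn.velocity];
  if (v === undefined) return null; // non-matching velocities pass through
  return { type: "cc", number: 1, value: v };
}

console.log(articulationNoteToY({ pitch: 33, velocity: 2 })); // CC1 = 50
```

Each transformer in the Environment handles one velocity; collapsing them into a lookup table just makes the intent easier to see.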


----------



## VinRice

The 'Articulation attached to a Controller move' thing can be really annoying, and I imagine pretty frustrating if you don't know what to look for. OT's Capsule Kontakt engine, for instance, can't keep up with constant switching (the edited curve keeps setting the artic to the default no. 1) and you can get stutters. As others have said, it doesn't arise if you _record_ the controller move, only if you draw in or edit the curve. I always simplify and edit controller moves, so I am starting to get this a lot. The fix is to select the events in the event list and set the articulations to the correct one (after having made the articulation IDs visible first: control-click in the list and enable them). Once edited, the Art IDs will stick.

I've no doubt Apple intend to take Art ID's further (in the fullness of time...) and this might be a manifestation of that intent, but at the moment it is undoubtedly a big, bad bug.

Having said that, crude as it is at the moment, the articulation system is changing my workflow hugely and for the better. I look forward to seeing what the third-party scripters will do with this.


----------



## Ashermusic

VinRice said:


> The 'Articulation attached to a Controller move' thing can be really annoying, and I imagine pretty frustrating if you don't know what to look for. OT's Capsule Kontakt engine, for instance, can't keep up with constant switching (the edited curve keeps setting the artic to the default no.1) and you can get stutters. As others have said, it doesn't arise if you _record_ the controller move, only if you draw in or edit the curve. I always simplify and edit controller moves so I am starting to get this a lot. The fix is to select the events in the event list and set the articulations to the correct one (after having made the articulation ID'd visible first - control click in the list and enable them). Once edited, the Art ID's will stick.
> 
> I've no doubt Apple intend to take Art ID's further (in the fullness of time...) and this might be a manifestation of that intent, but at the moment it is undoubtedly a big, bad bug.
> 
> Having said that, crude as it is at the moment, the articulation system is changing my workflow hugely and for the better. I look forward to seeing what the third-party scripters will do with this.



What Peter has added to the SkiSwitcher is brilliant and his CC Cloner script totally eliminates the issue of MIDI ccs over various articulations.


----------



## procreative

Ashermusic said:


> What Peter has added to the SkiSwitcher is brilliant and his CC Cloner script totally eliminates the issue of MIDI ccs over various articulations.



Very true! Also, if you automate the CCs rather than record them directly, this issue no longer applies.

The app Peter released to convert the Art names from the scripts into the Logic format is also superb.


----------



## VinRice

procreative said:


> Also if you automate the CCs rather than record them directly this issue no longer applies.



Ah, that's interesting...


----------



## procreative

VinRice said:


> Ah, that's interesting...



See my script here: https://vi-control.net/community/th...culation-discussion.68635/page-4#post-4187025

Downside is that Smart Controls don't work with external hardware when inside a Track Stack, as Apple made them only work on the folder and not its contents, grrr...

Have to say though, ArtzID is probably the next best thing if you are prepared to set up the scripts for each library.


----------



## VinRice

I have ArtzID (I have all of them actually), but I'll be honest, the preplanning and futzing with Logic required doesn't fit the way I like to work and isn't a priority when there's a new project to be dived into. Also, while Peter is a great programmer, graphic designer, not so much... I'm hanging fire on committing to a system while everybody, particularly Peter, works out how to best exploit these new riches Apple have graciously bestowed upon us.


----------



## procreative

Well, in the new setup for ArtzID you no longer need the environment fix. The app he made extracts all the articulation names and IDs and creates an Apple preset premapped from this.

The redirection from the script is purely to generate the IDs; the names are generated by the Logic set. His system bypasses the modwheel and CC issues.

Once it's set up, you should never have to stare at that script window again...

Still a bit disappointed that the score symbol feature is unfinished and there is no way to extend it.

That means no way to add score info for a lot of things like Legato, Tremolo, Trills, Portato etc. And some of the symbols used seem to be different to the markings defined by other sources.


----------



## Ashermusic

VinRice said:


> I have ArtzID, (I have all of them actually) but I'll be honest, the preplanning and futzing with Logic required doesn't fit the way I like to work and isn't a priority when there's a new project to be dived into. Also, while Peter is a great programmer - graphic designer, not so much... I'm hanging fire on committing to a system while everybody, particularly Peter, works out out how to best exploit these new riches Apple have graciously bestowed upon us.




That is a fair statement for those who don't like templates and "pre-planning and futzing". But "pre-planning and futzing" is the lifeblood of working composers.


----------



## synthpunk

I'm finding this less and less myself. More people are discovering that spontaneity equals more creativity and originality.



Ashermusic said:


> That is a fair statement for those who don't like templates and "pre-planning and futzing". But "pre-planning and futzing" is the lifeblood of working composers.


----------



## Ashermusic

synthpunk said:


> I'm finding this less and less myself. More people are discovering spontaneity equals more creativity and originality.



Are you working under tough deadlines?


----------



## synthpunk

Not in about 20 years, Jay, but others I know are. Anyway, YMMV; choose the way that works best for you, just as long as you know there are other ways



Ashermusic said:


> Are you working under tough deadlines?


----------



## Ashermusic

synthpunk said:


> Not in about 20 years Jay , but others I know are. Anyway YMMV, choose the way that works best for you, just as long as you know there are other ways




Well, there is the difference. Working the way you work is probably not a realistic option for composers under tight deadlines, like those who hire me to help set up Logic Pro/VePro templates. They don't have time for futzing around.

I do _have_ the time, unfortunately, but I like composing with structures, as that is how I was trained.


----------



## procreative

Not sure where to post this, but as it is connected to the Logic Articulation Sets I thought I would post here.

I notice the score symbols are currently a bit limited in the Articulation Sets, and some are different to the info I have found on the correct symbols.

For instance, the symbol used for Flageolet is also the symbol for Harmonics, and the symbol used for Marcato is different to the references I have found. Also, the symbol shown as Double Accent I have seen used for Marcato Legato!

So I wondered if there is a definitive resource for the "correct" symbols?

This is the best reference I could find, but it's not comprehensive and I have found other sites that contradict it:

https://en.wikipedia.org/wiki/List_of_musical_symbols

Also, can someone explain the difference between Detaché and Tenuto? And what is Martellato (I haven't got a library with that one)?


----------



## Peter Schwartz

Some of the names given to some of the articulation markings are _definitely _not to be taken literally in all cases. It would be a mistake to look at that list of markings and think of it as a reference guide as to what they're called. And it's unlikely that any authoritative guide on the subject would entirely agree with another authoritative guide! But there is some black/white right/wrong with some of these terms:

*Staccato*, *Staccatissimo*, and *Tenuto:* Fine.

*Marcato:* Not a chance. That's an accent.

*Marcatissimo:* Not a chance. _That's_ marcato.

*Pizzicato (Left Hand):* Sure, but it's also used to indicate "stopped" notes for horns.

*Martellato:* Eh... Not sure about that.

*Double Accent:* Not a chance. In piano music it's used to indicate a stronger-than-usual accent. In string writing it indicates an accent but to also play the note for its full written length. For winds it would indicate that the note is to be accented and tongued. Double accent? Not a chance!

*Stroke Up* & *Stroke Down:* "down bow" and "up bow".

*Flageolet:* Harmonic.


----------



## procreative

Peter Schwartz said:


> Some of the names given to some of the articulation markings are _definitely _not to be taken literally in all cases. It would be a mistake to look at that list of markings and think of it as a reference guide as to what they're called. And it's unlikely that any authoritative guide on the subject would entirely agree with another authoritative guide! But there is some black/white right/wrong with some of these terms:
> 
> *Staccato*, *Staccatissimo*, and *Tenuto:* Fine.
> *
> Marcato:* Not a chance. That's an accent.
> *
> Marcatissimo:* Not a chance. _That's_ marcato.
> *
> Pizzicato (Left Hand)*: Sure, but it's also used to indicate "stopped" notes for horns.
> *
> Martellato*: Eh... Not sure about that.
> *
> Double Accent*: Not a chance. In piano music it's used to indicate a stronger-than-usual accent. In string writing it indicates an accent but to also play the note for its full written length. For winds it would indicate that the note is to be accented and tongued. Double accent? Not a chance!
> 
> *Stroke Up *& *Stroke Down*: "down bow" and "up bow".
> 
> *Flageolet*: Harmonic.



Great, thanks, very enlightening. Can you describe Tenuto? I don't recall it in any string sample library but have seen it in brass, and it almost sounds like Portato or a short Marcato to me.

Can you recommend an authoritative guide used by most players?

It also seems a bit "beta" to include score marks but leave many out or use completely incorrect ones. If at least they had put in a way to edit these... it seems like a pointless feature as it stands, and I will be surprised if they ever improve it, as I doubt it's a mass-market feature.


----------



## stonzthro

I too was scratching my head at some of the names, but I'm so glad to see they are at least heading in the right direction. I can't think of the last time I needed a left hand pizzicato, but nothing for trills or tremolo? Really LEFT hand pizzicato?!?


----------



## Peter Schwartz

Tenuto is not an articulation but rather a performance indication that has different meanings for different instruments.

The text indication "tenuto" in vocal music means to hold out a note in an expressive way.

In the following passage for clarinet, the slur indicates that the entire passage should be played on one breath.




The tenutos indicate where the notes should be tongued. Where there is no marking (such as the first two notes) it's understood that the notes are to be played legato. The very last tenuto'd note and the staccato note following it are an interesting situation, because the staccato note would not necessarily have to be tongued, but it probably would be. It's not necessary to indicate tenuto for that note.

However... if there _were_ a tenuto on that staccato, it would indicate that the note was to be held a little longer than a normal staccato note.

If this were written for violin, the slur wouldn't make much sense. But the tenutos would indicate that the notes were to be played for their full duration, as opposed to being played detaché (detached, with small spaces between them).

If this were written for piano, the tenutos would indicate a slight accent or stress on those notes. Not as much as a full accent, but somewhere in-between how you might normally play the note and an accent.

But the tenuto on the F at the end of bar 1 would be awkward to play as tenuto unless the tempo was fairly slow. And the tenuto on the D natural in bar 2 would only really make sense if there was an accent on the staccato Eb.

To summarize, tenuto is not an articulation. It's all of the above.


----------



## procreative

Peter Schwartz said:


> Tenuto is not an articulation but rather a performance indication that has different meanings for different instruments.
> 
> The text indication "tenuto" in vocal music means to hold out a note in an expressive way.
> 
> In the following passage for clarinet, the slur indicates that the entire passage should be played on one breath.
> 
> 
> 
> 
> The tenutos indicate where the notes should be tongued. Where there is no marking (such as the first two notes) it's understood that the notes are to be played legato. The very last tenuto'd note and the staccato note following it are an interesting situation, because the staccato note would not necessarily have to be tongued, but it probably would be. It's not necessary to indicate tenuto for that note.
> 
> However... if there _were_ a tenuto on that staccato, it would indicate that the note was to be held a little longer than a normal staccato note.
> 
> If this were written for violin, the slur wouldn't make much sense. But the tenutos would indicate that the notes were to be played for their full duration, as opposed to being played detaché (detached, with small spaces between them).
> 
> If this were written for piano, the tenutos would indicate a slight accent or stress on those notes. Not as much as an full accent, but somewhere in-between how you might normally play the note and an accent.
> 
> But the tenuto on the F at the end of bar 1 bar would be awkward to play as tenuto unless the tempo was fairly slow. And the tenuto on the D natural in bar 2 would only really make sense if there was an accent on the staccato Eb.
> 
> To summarize, tenuto is not an articulation. It's all of the above.



Well explained. I guess for us non-theory-trained (I mean me, apart from Grade 5 Theory a LONG time ago), it's confusing in the sample world, as what are called "articulations" are a mix of articulations and performance directions.

But for a score element system in Logic to work it would need to be able to intelligently insert performance instructions only at the start of each instance of notes using those?

Probably not ever going to happen, and maybe a step too far.


----------



## stonzthro

Excellent points Peter! 
As a string player I usually put a little more weight on a tenuto as well.


----------



## PeterJCroissant

Guys, I really need your MIDI expertise!

OK, below is a short video I made of this weird behaviour in Logic Pro X 10.4, trying to get the articulation feature working.

Brief explanation: if I record a passage without dynamics, or in fact any kind of CC information, all works as expected.

However, as soon as I add, in this case, dynamics control, it makes the articulation switch back to the 1st articulation in the list!

It's not my controller; if I draw in the CC it has the same effect. Now I'm stuck!

Help much appreciated!


----------



## Alex Fraser

PeterJCroissant said:


> guys really need your midi expertise!
> 
> OK below is a short video I made of this weird behaviour in Logic Pro x 10.4 trying to get the articulation feature working.
> 
> Brief explanation, if I record a passage with out dynamics, or in fact any kind of CC information, all works as expected.
> 
> how ever as soon as I add, in this case dynamics control it makes the articulation switch back to the 1st articulation in the list!
> 
> its not my controller, if I draw in the CC it has the same effect. now I'm stuck!
> 
> help much appreciated!



Try this?
Logic also attaches articulation data to CC events like dynamics, expression etc.
Open the event list editor and right click on the column headings > show articulation.
Does your CC data have articulation data attached? Try removing the articulation data from your CC events. Does this make a difference?
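If it helps to see the idea, here's a toy model of that cleanup in plain JavaScript. The event objects are simulated (Logic 10.4's Scripter does expose an `articulationID` property on events, but nothing below is a real Scripter call) — the point is just that IDs stay on notes and get stripped from CC events:

```javascript
// Simulated event stream: articulation IDs should ride on notes only.
// Any ID found on a CC event is removed, which is the manual cleanup
// suggested above for the Event List.
function stripArticulationFromCC(events) {
  return events.map(function (e) {
    if (e.type === 'cc' && e.articulationID !== undefined) {
      // rebuild the CC event without its articulation payload
      return { type: e.type, number: e.number, value: e.value };
    }
    return e;
  });
}

// A note on articulation 3, then a CC1 event accidentally tagged with ID 1
// (the kind of event that causes the "switch back to the 1st articulation").
var stream = [
  { type: 'note', pitch: 60, velocity: 100, articulationID: 3 },
  { type: 'cc', number: 1, value: 64, articulationID: 1 }
];
var cleaned = stripArticulationFromCC(stream);
// the note keeps ID 3; the CC no longer carries any ID
```

In a real project you'd do this by hand in the Event List (show the Articulation column and clear it on CC events), or let a Scripter plug-in do the equivalent on the fly.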


----------



## procreative

This is, I'm afraid, a bug in the design of this system.

Two current options:

1. Use ArtzID by Skiswitcher. As the ID mapping to keyswitches is done by the script, this problem is eliminated: the script only looks for notes with ID information.

* Pros - eliminates the Logic bug
* Cons - needs you to create scripts for each library; not hard, but time-consuming

2. Use automation to drive your CC data (I made a script elsewhere that controls CC in VIs via automation events using Smart Controls, which can be mapped to hardware).

* Pros - enables use of motorised faders and can recall events in real time
* Cons - need to enable automation Write/Touch/Latch to record these, and Smart Controls won't recognise hardware when inside a Track Stack.

Currently the Articulation Sets feature feels kind of beta: it's superb, but has a few flaws.
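For option 2, a minimal Scripter-style sketch of automation driving a CC could look like the following. `PluginParameters`, `ParameterChanged` and `ControlChange` are genuine Scripter names, but they're stubbed here so the block runs outside Logic — treat it as an illustration of the approach, not the actual script mentioned above:

```javascript
// Stub of Scripter's ControlChange so the mapping can run (and be tested)
// outside Logic; inside Logic, delete this stub and keep the rest.
var sent = [];
function ControlChange() { this.number = 0; this.value = 0; }
ControlChange.prototype.send = function () {
  sent.push({ number: this.number, value: this.value });
};

// One automatable parameter, 0..127, exposed to Smart Controls/automation.
var PluginParameters = [
  { name: 'Dynamics (CC1)', type: 'lin', minValue: 0, maxValue: 127,
    numberOfSteps: 127, defaultValue: 64 }
];

// Called by Logic whenever the parameter moves (fader, Smart Control,
// or automation playback in Read/Write/Touch/Latch).
function ParameterChanged(paramIndex, value) {
  if (paramIndex === 0) {
    var cc = new ControlChange();
    cc.number = 1;                 // CC1 = mod wheel / dynamics
    cc.value = Math.round(value);
    cc.send();
  }
}

// Simulate a short automation ramp.
ParameterChanged(0, 0);
ParameterChanged(0, 96.4);
```

Inside Logic you'd automate the 'Dynamics (CC1)' parameter in Write/Touch/Latch, exactly the workflow described for option 2 above.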


----------



## PeterJCroissant

Thanks guys - what a pain... Skiswitcher is not on sale right now, but it looks like it might be the way to go.

hmmmm...why is it so difficult!


----------



## PeterJCroissant

what do you think of this?

http://www.audiogrocery.com/a.g_toolkit_pro.htm


----------



## Sami

In comparison to @Peter Schwartz systems and Art Conductor, it's from the dark ages


----------



## PeterJCroissant

Ok I will wait for Skiswitcher.

But I have one more mystery to solve!

If we build a multi in Kontakt, with different articulations in each instance of LASS for example: each articulation, from what I've seen, is on a different MIDI channel, which makes sense. But how can you record dynamics for the different sustained articulations? As now it needs to be recorded on multiple MIDI channels?


----------



## Alex Fraser

PeterJCroissant said:


> Ok I will wait for Skiswitcher.
> 
> But I have one more mystery to solve!
> 
> If we build a multi in Kontakt, with different articulations in each instance of LASS for example. Each articulation from what I’ve seen is on a different middle channel, make sense. But, how can you record dynamics for the different sustained articulations? As now it needs to be recorded on multiple midi channels?


If I understand correctly - the problem/bug you encountered with your first example now becomes an asset! 
As the articulation data is attached to your CC data, your CC data will automatically get sent to the correct MIDI channel/part/instrument.

https://vi-control.net/community/threads/logic-pro-x-10-4-released.68505/page-17#post-4184922
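A toy model of what that means in the multi case (the `channelFor` table is an assumption for illustration — Logic does this routing internally): a CC event tagged with an articulation ID gets delivered on that articulation's MIDI channel.

```javascript
// Hypothetical routing table: one articulation per MIDI channel,
// as in a Kontakt multi with one LASS instance per channel.
var channelFor = { 1: 1, 2: 2, 3: 3 };

// CC events that carry an articulation ID are re-addressed to the
// channel that articulation lives on; everything else passes through.
function routeCC(events, table) {
  return events.map(function (e) {
    if (e.type === 'cc' && e.articulationID !== undefined) {
      return { type: 'cc', number: e.number, value: e.value,
               channel: table[e.articulationID] };
    }
    return e;
  });
}

// Dynamics (CC1) tagged with articulation 2 land on channel 2,
// i.e. on the instance holding that sustained articulation.
var routed = routeCC(
  [{ type: 'cc', number: 1, value: 80, articulationID: 2 }],
  channelFor
);
```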


----------



## PeterJCroissant

Alex Fraser said:


> If I understand correctly - the problem/bug you encountered with your first example now becomes an asset!
> As the articulation data is attached to your CC data, your CC data will automatically get sent to to the correct midi channel/part/instrument.
> 
> https://vi-control.net/community/threads/logic-pro-x-10-4-released.68505/page-17#post-4184922




Well thank you! I’ll have a read...


----------



## samtrino

See if this helps:

From Logic's menu, select: Mix > Autoselect Automation Parameter in Read mode






PeterJCroissant said:


> guys really need your midi expertise!
> 
> OK below is a short video I made of this weird behaviour in Logic Pro x 10.4 trying to get the articulation feature working.
> 
> Brief explanation, if I record a passage with out dynamics, or in fact any kind of CC information, all works as expected.
> 
> how ever as soon as I add, in this case dynamics control it makes the articulation switch back to the 1st articulation in the list!
> 
> its not my controller, if I draw in the CC it has the same effect. now I'm stuck!
> 
> help much appreciated!


----------



## PeterJCroissant

samtrino said:


> See if this helps:
> 
> From Logic's menu, select: Mix > Autoselect Automation Parameter in Read mode




Well, thanks for trying, but unfortunately not... it was already selected, so I tried unselecting it...

but thank you for trying...


----------



## procreative

FYI, Skiswitcher's ArtzID has another script included that passes CC data down the chain for VIs with multis addressing separate MIDI channels instead of keyswitches.

I have a setup for Hollywood Strings due to its lack of keyswitch patches, plus I use this for several Kontakt ones that don't have full setups.

For example, I use it for Symphony Series Brass as it does not natively have the option of Sustain and Legato as separate keyswitches (from memory I think the main keyswitch patch uses scripted legato - you have to use the separate Legato patch for true legato).


----------



## PeterJCroissant

Ahh I see - yes, I have NISSB also, and noticed legato was in a completely different patch. So that's a good idea!

So is all this easy to use? The Skiswitcher, I mean? I spoke to Peter and he thinks a new version will be ready in a week or so... I'm really hoping it will cover all bases - can't work without dynamics!


----------



## procreative

I think it will be "easier" than it was. It's not hard to do, just a little time-consuming. With the new Logic system it's a hybrid of both, so most of the articulation info is in the Logic one. His documentation is good though...

And Peter is a very patient man, always happy to help.


----------



## A.G

Sami said:


> In comparison to @Peter Schwartz systems and Art Conductor, it's from the dark ages



*Sami*, it seems that you come from the dark ages...

*AG 6.0 - coming SOON!*

*New Generation AG Articulation EDITOR & system*.
Cubase <=> Logic (coming soon).

*Logic 10.4 Full Compatibility (Save as Logic Articulation Sets)*

Now you can save your AG Presets as *100% working Logic Articulation sets* including quick/easy assignment Art MIDI Remote Control etc.
• You can create Logic Art Sets *at least 20 times quicker and easier* (than in Logic or Cubase Editors) via our Advanced Art EDITOR software, thanks to the Editor multi selection feature, smart batch editing functions, selective *Art Sets* management (New) and the quick key commands.
• You can order the Articulations and remap the Remote Assignments as you wish. The Articulation custom ordering, Remote mapping, Colors, Instrument Names, and all prefixes are exported in the Articulation Sets (10.4 and above) presets.
• The new AG Art Sets export feature is 100% compatible with the AG Remote iPad Workstation. You can teleport the AG Presets to the iPad (Art Names, Groups, Colors etc) and the iPad will create a layout automatically which will work natively with the new Logic Art Sets.

*New Advanced MIDI FX Art Switching system ::: Automation Names + Art IDs + Program + COMBO triggering*

Now the new AG Advanced Articulation system (AG MIDI FX Scripter) offers multiple types of Articulation Map triggering and works with all sorts of Instruments on the market. The advanced Art system offers Map triggering via:
• Automation Text Control Points (Region Based - editing in the Piano Roll and Main Window).
• Art IDs. The AG Remote can record the external KS, Program, or CC as ArtIDs on the fly.
• Program Change.
• Combo Automation Text points + Art Ids (on the fly - real time switching). It is an *awesome* brand new system! The Cubase types "Attributes & Directional" are fixed and cannot be changed during real time composing. You are 100% free to change that in real time in Logic now!
• Combo Art Ids + Program Change (on the fly - real time switching).
*Note*: The AG Advanced MIDI FX plugin is 100% compatible with the Logic Art Sets Names and MIDI remote. We offer a custom export setting for the Arts Sets which includes Art Names & Remotes. In this case the AG MIDI FX system Art IDs are shown as Names in the Piano Roll, Event List and work with the Score.

*iPad (Lemur) Art Teleport Workstation*

Now the AG iPad Workstation layout/pages are switched automatically by a simple track selection for the AG Logic "Large Orchestral" templates. No manual "Recall" buttons pressing is required. The new system is powered by a new AG Control Surface which allows you to use multiple iPads and MIDI devices.
Let's remind you that the AG EDITOR Art Sets *can be teleported* to the AG iPad Workstation. All Art Names, Colors, Groups are auto arranged in the iPad layout automatically.
• Now the AG iPad Workstation offers +/- Vertical Zoom for the Mixer Faders - up to three levels.
• We implemented Copy/Paste Mixer buttons where you can copy and selectively paste the Mixer Labels, the Controller Assignments and the Fader levels from one interface/page to another.

*AG Large Orchestral Templates *

We have developed a few Large Professional Orchestral templates (ideal for VE PRO) - up to 1200 Instruments.
• The Pro Orchestral templates come with well organized orchestral stacks, Screensets & Name Labels for each Orchestral section, Track Colors & Icons etc.
• The new MIDI Channel Strip Expansion Headers allow you to route the Channel Strips to a "Local" or "Sum" Studio MIDI Device Port, and change the hardware device presets via a track selection or MIDI recall message (the iPad is automatic).
The Channel Strip MIDI Expansion Headers offer several MIDI output types such as: Program Change, Key Pressure, Control Change etc.

*Basic Art EDITOR features*

• Copy Map Next KS1 or KS2 (the selected Map is copied with the next Key Switch number according to the "Copy Semitones" or "White Keys" user setting).
The Controller or Program assignments are copied as well - an essential feature for the VSL Instruments, Sample Modeling etc.
• Copy Maps Next Channel. An essential feature for Instruments which are switched via MIDI channel, or for creating Multi Timbral Presets.
• Duplicate Map.
• Swap KS1 & KS2 assignments.
• Articulation Color Assignments (the Color system works with the Logic 10.4 Arts Sets and AG Pro System). The Color system offers: Select Maps by Color, Group Maps by Color, Clear Group Color.
• Multiple sorts of Instrument name prefixes which can be added to the Logic Art Names.
• You can organize the Articulations (Maps) in any custom order (all Output assignments are *embedded* into the Art Sets).
• etc, etc, etc, ...

*New Editor Features (AG 6.0).*

• Ultimate Articulation Sets *Selective Management*. Now you can open two Art EDITOR applications, custom select Articulations (via the Shift or Command keys) in the 2nd Editor, and drag and drop the selection into any Articulation position in the 1st Editor. This way you can create universal palettes from existing ones.
• Multi Timbral Management (*from Finder*). Now it is possible to drag & drop or merge AG "Single Instrument" Presets directly from the Mac Finder and quickly create a Multi Timbral Instrument Art Set in seconds.
• *Selective Multi* Instrument Arts Management. You can open two Art Editor applications (Multi Presets) and drag & drop Multi Parts from Editor to Editor and order them as you wish. *The Part MIDI Channel is changed automatically* according to the MIDI Part/Channel order.
• Key Switch Velocity assignment powered by Multi Selection editing.
• The Multi Timbral Parts can be saved as *Logic 10.4 Art Sets*.
• Now when you create a new Articulation Map or press the "Copy Next KS1/KS2" buttons, then the newly created Map Name is highlighted automatically.
• A New Auto-Save system and various Smart Dialogs.
• New onboard UI quick buttons: Art Names, Groups, Remote.
• The Advanced Key Switch "Latching" system offers:
- *KS-OFF* Key Switch setting, which emulates releasing the KS manually.
- The new "*Auto-Latch*" Articulation assignment lets Logic know to release the latched KS during the Logic "Stop" and trigger it automatically during the Logic "Start".
• Clear Map(s) Assignments. For example if you wish to remove the UACC CC32 assignments and create new KS assignments and keep the Articulation Names, Colors, Groups, you can use that new function which is powered by the keycommand "C".
• Optimize the Articulations. This new function is very useful in several situations. For example, if you create an Art Set for the SPITFIRE Instruments and there are "blank/gray" Articulation buttons in the Instrument, you can keep creating Maps using "Copy Next KS".
During the Maps creation you type Art Names for all valid Articulations and keep the blank Art Names as "---". After the quick Copy KS procedure you just press the key command "O" or its related Menu item. All non-assigned Maps will be deleted automatically.
• Save As Template. This is a new sort of "Save As". For example you can create 30 empty Maps with default KS Assignments and MIDI Remote Assignments (10.4.0) etc.
You can load that starting template and multi-select and set the base KS & Remote and assign Names only for 10 Maps according to the Instrument and your External keyboard specifics. After that you can press "O" (Optimize) and all non-assigned Maps will be deleted.

*AG Factory Articulation Sets (Maps) Library*

AG Factory Library comes with thousands of Articulation Sets for many third-party libraries. All AG Editor Presets support:
 • Logic Art Sets 10.4 "Save As" format.
 • AG Advanced Art System switching via the AG MIDI FX Plugin (including Kontakt A,B,C,D Ports).
 • Teleporting to the AG iPad Workstation for use with the track-selection auto-switching control surface system (supporting 10.4 Sets and AG Pro).


These are some of the very basic features which come with AG 6.0.

I will start a new topic soon with detail images and demo Videos.

Best

Ivan & AG Team


----------



## procreative

Such a huge commercial post in a non-commercial thread... hmm, just a little distasteful? All you needed to do was refer to your site, not post a huge advert.


----------



## PeterJCroissant

I have looked at the AG; with respect, it looks too complicated for me - and I'm an engineer! My main concern, from what I can tell, is that it uses CC data to switch articulations? In which case I'm in the same boat I was to begin with, as any CC data just causes the strange behaviour I posted earlier...

Please correct me if I’m wrong though.


----------



## babylonwaves

A.G said:


> *Sami*, it seems that you come from the dark ages...


Sami is not from the dark ages and I think he's right. Your software is just as cluttered and complex as your huge advertisement you've just copied in this thread.


----------



## Heinigoldstein

PeterJCroissant said:


> I have looked at the AG, with respect it looks too complicated for me? And I’m an engineer! My main concern from what I can tell is it uses CC data to switch articulations? In which case I’m in the same boat I was to begin with, as any CC data just cause the strange behaviour I posted earlier..
> 
> Please correct me if I’m wrong though.








This is exactly why I still use SkiSwitcher, even if it is a little limited. I don't need another piece of software driving me crazy because it's too complicated. And since Peter promises that the new ARTzID will be much easier to use too, I'm looking forward to it.
And I prefer to support people acting fairly and nicely


----------



## PeterJCroissant

@procreative I have found out what is causing the conflict: when you add dynamics into your recorded MIDI region, whatever is selected here (where the arrow is) is what it immediately tries to switch back to (the behaviour in my earlier video).

I have just sent a Logic song and demonstration to @babylonwaves, who is kindly going to take a look...

thought you would like to know,

best
Pete


----------



## khollister

As much as we have discussed this, there appears to be something we still haven't quite got our heads around.

I just played with this again without Peter's scripts (ArtzID) using SSS and Babylonwave's articulation set (UACC-KS).

I played in a simple region with no CC's, went back in and changed the articulations around in piano roll, verified it played correctly, drew in a CC1 curve and it still played correctly. The problem I thought we had from previous testing was the articulation appearing to jump back to ID 1 in between notes (causing visual indications in the Kontakt GUI) but the notes still playing with the correct articulations.

The problem you seem to be having looks like what we saw when using UACC, before Spitfire did their testing and advised UACC-KS.

I do not have the NI Symphonic stuff to try and duplicate your test case. While I have other orchestral libs, most of my work is with Spitfire, at least since LPX 10.4 came out.


----------



## resound

The problem has nothing to do with UACC. The problem is built into Logic's articulation system. It sends articulation switching messages when it encounters CC data with a different articulation than the previously played note/CC. It doesn't "jump back to ID1": Logic sees CC data with a new ID and sends an articulation switching message to switch to that articulation.
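A toy model of that behaviour (not Logic's actual implementation, just the pattern described above): a switch goes out whenever any event - note or CC - carries an ID different from the last one sent.

```javascript
// Emit a keyswitch whenever the incoming event's articulation ID
// differs from the one most recently sent to the instrument.
function playback(events) {
  var switches = [];
  var current = null;
  events.forEach(function (e) {
    if (e.articulationID !== undefined && e.articulationID !== current) {
      switches.push(e.articulationID);  // keyswitch sent to the instrument
      current = e.articulationID;
    }
  });
  return switches;
}

// Two notes on articulation 3 with a drawn-in CC (tagged ID 1) between them.
var region = [
  { type: 'note', articulationID: 3 },
  { type: 'cc', articulationID: 1 },
  { type: 'note', articulationID: 3 }
];
var switches = playback(region);
// three switches (3, 1, 3) instead of one -- hence the glitchy GUI
```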


----------



## khollister

When I look at my test case in Event List - all the CC events have "-" for the articulation, which we collectively determined to be ID 1 when we slogged through this earlier. I realize some folks appear to be getting explicit articulation ID's in the CC events, but I usually don't see that whether I play it in or draw it in afterwards.

I have no interest in arguing about this again but I'm just noting there is still a pattern to how individuals use this that I certainly haven't discovered that gives very different results. I understand that articulation ID's being on the CC events is the anomaly, but the actual usage pattern that creates the playback problem isn't 100% clear to me yet. I have had one or two odd things happen since this was introduced, but most of the time it works as expected. Every time I go back to try and duplicate the problem I fail, regardless of how I input the CC data. That's all I was trying to communicate.


----------



## Alex Fraser

Some musings.

I don't think the articulations-on-CCs thing is a bug, but a design feature (with unforeseen consequences). In some use cases it's helpful.
The Logic artic system working as advertised seems to depend on how the receiving Kontakt instrument interprets the data. I've had no problem with Spitfire stuff, but gave up on the Symphony Series pretty quickly.


----------



## cyrilblanc

Be careful: CC1 is used in VSL Strings to specify the Y value


----------



## khollister

Alex Fraser said:


> Some musings.
> 
> I don't think the articulations on CC's thing is a bug, but a design feature (with unforeseen consequences.) In some use cases, it's helpful.
> The Logic artic system working as advertised (seems) to depend on how the receiving Kontakt instrument interprets the data. I've had no problem with Spitfire stuff, but gave up on the Symphony Series pretty quickly.



Interesting - I had missed the different results on NI vs Spitfire in the original discussions, Alex. That would begin to explain our differing results. I guess I need to spend some time on this with a non-Spitfire KS library and see what happens.

Or just stick to ArtzID and get on with it


----------



## Alex Fraser

khollister said:


> Or just stick to ArtzID and get on with it


Haha, yep it's a dangerous rabbit hole..


----------



## PeterJCroissant

I can’t really understand why it’s so difficult to do this.

I mean, every VI can work with keyswitches, so each MIDI note has some metadata attached that triggers the keyswitch... simples! Which is kind of what we have, I assume, but it doesn’t work...

I must be missing something - well, let’s hope someone can fix this stuff...


----------



## resound

Hopefully this will clear up any confusion. As Alex Fraser said, it's not necessarily a bug, but rather the way it has been designed to work in Logic. In the video below you will see that when I change the ID of the CC data, the GUI jumps to that articulation, because Logic sends the keyswitch for that articulation when it encounters the CC data with the new ID. In most cases playback is not affected; you just get a glitchy GUI due to Logic sending unnecessary keyswitches. But if you are using Spitfire UACC it seems to interfere with playback, which is why Spitfire recommended using UACC KS instead.


----------



## khollister

PeterJCroissant said:


> I can’t really understand why it’s so difficult to do this?
> 
> I mean every VI can work with key switches, so each midi note has some meta data attached that triggers the key switch... simples! Which is kind of what we have I assume but it doesn’t work..
> 
> I must be missing something, well let’s hope some can fix this stuff..



The note-on key switches work fine - that's not the issue. The problem is the interaction of the CC events also carrying an articulation payload and how the heavily scripted Kontakt instrument interprets those CC articulation messages. I also think there is some sensitivity to the timing of CC events to note-on events and which one sends the first articulation ID. Notice in your original video example, the articulations played fine in the section of the region where the CC data was - I recall the problem was as the playhead cycled back around to the first phrase with no explicit CC events. I think I may have stumbled on something similar a few weeks ago when several of us were trying to run this to ground but couldn't duplicate it. That is the "missing" part of the use case pattern I was referring to.

Just give the other Peter a week or two to get ArtzID 2.0 ready and use his scripts. While they shouldn't be necessary in a perfect world, it's cheap, pretty easy to use and not only fixes this stuff but adds some value in dealing with VSL in particular.


----------



## PeterJCroissant

resound said:


> Hopefully this will clear up any confusion. As Alex Fraser said, it's not necessarily a bug but rather the way it has been designed to work in Logic. In the video below you will see when I change the ID of the CC data, the GUI jumps to that articulation, because Logic is sending the key switch for that articulation when it encounters the CC data with the new ID. In most cases playback is not affected, you just get a glitchy GUI due to the fact that Logic is sending unnecessary key switches. But if you are using Spitfire UACC it seems to interfere with playback which is why Spitfire recommended using UACC KS instead.



Ahh ok at least I’m not going mad! Ok... enough! Let’s wait for peters Skiswitcher... thanks all


----------



## cyrilblanc

And if you remove the overlap on the notes?


----------



## resound

PeterJCroissant said:


> Ahh ok at least I’m not going mad! Ok... enough! Let’s wait for peters Skiswitcher... thanks all


It'll be worth the wait!


----------



## procreative

From my tests, adding CC data via hardware controllers adds ID1 data which correlates to the first articulation. 

I did not test every library, but I did not hear rogue articulations triggering, as this data is attached to the CC data and the next note played with whatever ID was attached to it.

The main concern (for me) was the possibility of GUI madness and other unknowns.

What I did find is that drawing in curves with the mouse did not attach ID data to these CCs.

I also found that using automation to drive CCs prevented this.

And finally, using this in combination with ArtzID also prevented it: while the articulation names and the means to trigger them are in Logic, the triggering rules for keyswitch output to the instrument are in ArtzID, and that only recognises IDs attached to notes.


----------



## Alex Fraser

khollister said:


> The problem is the interaction of the CC events also carrying an articulation payload..


"Articulation Payload." I love that phrase. Can we use it from now on?


----------



## A.G

PeterJCroissant said:


> I have looked at the AG,
> with respect it looks too complicated for me? And I’m an engineer!


It is not complicated at all in comparison with the Cubase and the new Logic 10.4 Editors. As I mentioned, the "Copy Key Switch" with next semitone (or white keys), "Copy Map Next Ch", multiple Articulation selection, changing the Value assignments in multiple selection, batch editing and the Key Commands save lots of time. Obviously, you have to watch our new Videos about that after we release v6.


> My main concern from what I can tell is it uses CC data to switch
> articulations?


You are wrong. The current AG version 5 switches the Articulations via Logic region-based automation text control points - not CC. You only press "Record" and record Articulation switching and music - the external MIDI Remote events such as KS, Program or CC are transformed on the fly into text automation control points visible in the Main window and Piano Roll. The text points are embedded into the regions, so you can copy, cut and move the regions without a problem.
The upcoming AG 6 offers five types of Arts triggering:
- Via Text Region automation control points (similar to the Cubase "Directional" type).
- Via Art IDs (similar to the Cubase "Attribute" type or Logic 10.4 triggering).
- Via Program Change.
- Combo Automation+ArtID (realtime recording and offline editing).
- Combo Program+ArtID (realtime recording and offline editing).

We will post much more info and a Demo video in the Commercial Announcements soon.


----------



## khollister

Alex Fraser said:


> "Articulation Payload." I love that phrase. Can we use it from now on?



Sorry, retired software engineering manager out of the aerospace industry


----------



## VinRice

Some libraries will glitch with the constant return to ID1, some won't. It depends on the Kontakt scripting overhead, I assume. The larger Spitfire and Berlin libs will glitch. It'll get sorted soon enough, I'm sure. The 'ASS Wars©' (Articulation Switching Scripters) can only benefit us users...


----------



## babylonwaves

PeterJCroissant said:


> @procreative I have found out what is causing the conflict, when you add in dynamics into your recorded midi region, what ever is selected here(where the arrow is) is what it tries immediately to switch back to. (the behaviour in my earlier video)
> I have just sent a logic song and demonstration to @babylonwaves who is Kindly going to take a look..


@PeterJCroissant: the instrument doesn't switch back. every new event (note or controller) is treated independently by the engine. it's not a problem to send events with different IDs while a note is playing; Kontakt can handle that. what you see is a simple side effect of the user interface, which attempts to always indicate the latest articulation. i'm sure this switching needs calculation power, on the CPU or the graphics card, i don't know, and that's why some of us see glitches. maybe it's worth experimenting with the DFD buffers to see if the glitches disappear. maybe manufacturers will tweak and optimise their instruments if required. but unless i'm missing something, there is nothing wrong with the way logic works.


----------



## babylonwaves

resound said:


> But if you are using Spitfire UACC it seems to interfere with playback which is why Spitfire recommended using UACC KS instead.


spitfire recommends UACC KS to ensure that the articulations are switched in a reliable fashion. with UACC [no KS] the switches sometimes don't work when you hard quantize notes. it has nothing to do with controllers.


----------



## cyrilblanc

What is bothering me about the example above is that the notes are overlapping!
You should remove the overlap on the notes to see if your problem is solved.
You should analyse the MIDI flow.


----------



## PeterJCroissant

babylonwaves said:


> @PeterJCroissant: the instrument doesn't switch back. every new event (note or controller) is treated independently by the engine. it's not a problem to send events with different IDs while a note is playing. Kontakt can handle that. what you see is a simple side effect of the user interface which attempts to always indicate the latest articulation. i'm sure this switching needs calculation power, one the CPU or the graphic card, i don't know. and that's why some of us see glitches. maybe it's worth experimenting with the DFD buffers to see if the glitches disappear. maybe manufacturers will tweak and optimise their instruments if required. but unless i'm missing something, there is nothing wrong with the way logic works.



Ok understood.

I totally agree in the NI example there is no resultant audio effect.

However, I did hear it on Spitfire, BUT I must double-check whether I was using UACC KS or not.



cyrilblanc said:


> What is bothering me about the example above is that the notes are overlapping !
> You should remove the overlap on the notes to see if your problem is solved
> You should analyse the midi flow



I will also check this... thanks all for your input... I hope I haven’t wasted too much of your time. 

Best
Pete


----------



## procreative

VinRice said:


> ASS Wars©



I like this! Could become catchy, especially as some are much _ASS_ier than others...


----------



## karusz

Hello, I am a happy owner of the previous AG Toolkit with ArtEditor Pro.
Please make things as simple as possible to use, and please let us know as soon as possible, as I would like to switch to the new system. I also would like to use Logic Articulation Maps. When will you release it?



A.G said:


> It is not complicated at all in comparison with the Cubase and the new Logic 10.4
> Editors. As I mentioned the "Copy Key Switch" with Next semitone (or white keys),
> "Copy Map Next Ch", multiple Articulation selection and changing the Values
> assignments in multiple selection, batch editing and Key Commands saves lots of
> time. Obviously, you have to watch our new Videos about that after we release v6.
> 
> You are wrong. The current AG version 5 switches the Articulations via Logic region
> based automation text control points - not CC. You only press "Record" and record
> Articulation switching and music - the external MIDI Remote events such as KS,
> Program or CC are transformed on the fly into text automation control points visible in
> the Main window and Piano Roll. The Text points are embedded into the regions so
> you can copy, cut, move the regions without a problem.
> The upcoming AG 6 offers three types of Arts triggering:
> - Via Text Region automation control points (similar to Cubase "Directional" type).
> - Via Art IDs (Similar to Cubase "Attribute" type or Logic 10.4 triggering).
> - Via Program Change.
> - Combo Automation+ArtID (realtime recording and offline editing).
> - Combo Program+ArtID (realtime recording and offline editing).
> 
> We will post much more info and a Demo video in the Commercial Announcements soon.


----------



## anderslink

Has anyone figured out how to make the new Logic 10.4 Art IDs useful with Symphony Series Brass (NI/Soundiron)? Spitfire with UACC KS is fantastic, but man, the Symphony Series Brass library is a nightmare to set up. It's funny, because the interface looks so modern and polished, but of course they couldn't have predicted Apple would release this feature. Even so, the way they designed SSB doesn't really fit the idea of a template, or of consistency from project to project. I mean, how are you going to remember which keyswitch goes to which articulation?


----------



## ptram

No solution to the "always reset to the first ID" bug in Logic 10.4.1, unfortunately.

Paolo


----------



## babylonwaves

anderslink said:


> Has anyone figured out how to make the new logic 10.4 Art IDs useful with Symphony Series Brass (NI / Soundiron)? Spitfire with UACC KS is fantastic but man the Symphony Series Brass library is a nightmare when trying to set it up.


what's the actual issue?


----------



## Begfred

ptram said:


> No solution to the "always reset to the first ID" bug in Logic 10.4.1, unfortunately.
> 
> Paolo


Indeed, if you mean "always resets to the first articulation when using CC". My workaround is to start every articulation set with a dummy articulation. It must have an articulation ID, but with nothing set in the Output section. When CC data has no ID, it triggers this dummy articulation. That has worked for me so far.


----------



## A.G

karusz said:


> When will you release it?


Hope soon. We will send a mass Email to all AG users and post a commercial announcement here. Stay tuned.


----------



## ptram

Begfred said:


> Indeed. IF you mean ''always reset to the first articulation when using CC''.


Yes, that's it. Further tests turned up more bugs as well: the articulation resets even after deleting all CC data, and articulation switching stops working after assigning ID 1 to a note and then a different ID again.

The empty-articulation trick doesn't work for me, because I need articulations to stay fixed. An example is the Soundiron Olympus Choir, where you have a different set of parameters depending on the selected vowel. If the vowel is always reset to Ah, all the parameters controlling the tails of the sound get messed up.

It also doesn't work with VSL, since it is difficult to debug a tricky passage if you can't see which articulation is currently selected. And so on. At the moment, the system is unusable for me.

Paolo


----------



## anderslink

babylonwaves said:


> what's the actual issue?


I think the issue is with the actual Symphony Series Brass library and my own lack of experience with this stuff. I'm trying to use Logic 10.4's own functionality BTW not any 3rd party software in case that wasn't clear.

In an ideal world I would have 4 articulation settings for each instrument and have all of the articulations available for that instrument from the piano roll. I think this is impossible with SSB because they only let you select a handful of articulations at a time per instance.


----------



## procreative

anderslink said:


> I think the issue is with the actual Symphony Series Brass library and my own lack of experience with this stuff. I'm trying to use Logic 10.4's own functionality BTW not any 3rd party software in case that wasn't clear.
> 
> In an ideal world I would have 4 articulation settings for each instrument and have all of the articulations available for that instrument from the piano roll. I think this is impossible with SSB because they only let you select a handful of articulations at a time per instance.



I think it will work if you make a Multi of them on different channels and address the Keyswitches by channel?


----------



## khollister

procreative said:


> I think it will work if you make a Multi of them on different channels and address the Keyswitches by channel?



Can you reassign the keyswitches? If so, you just need to load the different instruments into a single stereo Kontakt instance on the same MIDI channel and make sure every articulation across the multi has a different KS value.


----------



## Alex Fraser

I was having issues with the symphony series and key switching. (I've sold it on since, so can't try any stuff out..)
As mentioned above, creating a multi consisting of the articulations you need might be the way forward with this particular library. So, one articulation per midi channel. You can use the articulation map output tab to make sure that each articulation is sent to the correct midi channel/part.
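If you'd rather do this per-channel routing in a Scripter MIDI FX than in the Articulation Set's Output tab, it boils down to rewriting each event's channel from its articulation ID. A minimal sketch; the ID-to-channel map is a made-up example, so fill in the IDs from your own Articulation Set:

```javascript
// Sketch of a Scripter MIDI FX that re-routes each event to the MIDI
// channel holding its articulation. The ID-to-channel map is a placeholder
// example; fill in the IDs from your own Articulation Set.
var channelForID = {
    1: 1,   // e.g. articulation ID 1 lives on channel 1
    2: 2,   // e.g. articulation ID 2 lives on channel 2
    3: 3    // e.g. articulation ID 3 lives on channel 3
};

// Pure helper: pick the output channel for an articulation ID,
// falling back to the event's own channel when the ID is unmapped.
function routeChannel(articulationID, fallback) {
    var ch = channelForID[articulationID];
    return (ch === undefined) ? fallback : ch;
}

function HandleMIDI(event) {
    event.channel = routeChannel(event.articulationID, event.channel);
    event.send();
}
```

Compared with the Output tab, the script approach is handy when you also want to transform other events per articulation in the same pass.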


----------



## procreative

Yes, you can reassign the keyswitches, but I have a Multi with about 4 instances and use the MIDI channels to re-route to the correct slot. However, I'm currently using ArtzID, which has a script just for that scenario.

I haven't tried doing it with the Logic system, but it should work, as you can channelise the keyswitches.


----------



## anderslink

Hmm gotta give this a go thanks so much for the replies!!


----------



## procreative

Anyone figured out a way to trigger two things from one articulation ID?

For instance, CSS has CC58 to trigger articulations: value 10 = Sustain, value 85 = Legato On, and value 80 = Legato Off. So to play Legato Sustain you need values of 10 + 80.

It's not a deal-breaker, as I have ArtzID, but the cretin in me cannot resist trying...


----------



## Dewdman42

I ended up writing my own script for just this kind of scenario to handle Kirk Hunter stuff. LPX articulation set feature doesn't handle it very well, nor does ArtzID. 

Here is a quick and dirty script to handle just that one case you mentioned, presuming its articulationid #59 (edit the script for whichever id you want to use):



Code:


var cc = new ControlChange;   // reusable CC event object
cc.number = 58;               // CSS articulation-switch controller

function HandleMIDI(event) {

    if (event.articulationID == 59) {

        // Send both CC58 values ahead of the note: 10, then 80
        cc.channel = event.channel;
        cc.value = 10;
        cc.send();
        cc.trace();
        cc.value = 80;
        cc.send();
        cc.trace();
    }

    event.send();
    event.trace();
}


Also, bear in mind that the Articulation Set feature in LPX removes the articulationID from the note event once it sends the keyswitch, if it has a keyswitch defined for it... so, in order to write your own script, you need to turn off sending a keyswitch from the Articulation Set. You can still use the Articulation Set to show a nice name, but don't send out any keyswitch for that articulation; let the event pass on out to your Scripter script.


----------



## procreative

Dewdman42 said:


> I ended up writing my own script for just this kind of scenario to handle Kirk Hunter stuff. LPX articulation set feature doesn't handle it very well, nor does ArtzID.
> 
> Here is a quick and dirty script to handle just that one case you mentioned, presuming its articulationid #59 (edit the script for whichever id you want to use):
> 
> 
> 
> Code:
> 
> 
> var cc = new ControlChange;
> cc.number = 58;
> 
> function HandleMIDI(event) {
> 
> if (event.articulationID == 59 ) {
> 
> cc.channel = event.channel;
> cc.value = 10;
> cc.send();
> cc.trace();
> cc.value = 80;
> cc.send();
> cc.trace();
> }
> 
> event.send();
> event.trace();
> }
> 
> 
> Also, bear in mind that the ArticulationSet feature in LPX removes the articulationID from the note event once it sends the keyswitch if it has a keyswitch defined for it...so...in order to write your own script, you need to turn off sending a keyswitch from the articulationSet. You can use the articuliationSet to have a nice name, but don't send out any keyswitch for that one, let it pass on out to your Scripter script.



Very nice! In my case I'll probably stick with ArtzID, as the scripts I already have handle this well. Swapping one MIDI FX script for another probably doesn't make much sense. But for others without ArtzID, it's a neat solution.


----------



## Dewdman42

You can actually use more than one script at a time.

If you want to figure it out, tell me how you are using ArtzID and I can tweak this script to work in conjunction with it.


----------



## Dewdman42

For example,

Let's say you have ArtzID set up to do a simple keyswitch, as in MP Script; put that in the first plugin slot. In the second plugin slot, put a custom script. In that second script we'll assume MP Script has already inserted the CC58=10 for you, so by the time the event reaches the second script we only need to send CC58=80 just before the actual note.



Code:


var cc = new ControlChange;   // reusable CC event object
cc.number = 58;

function HandleMIDI(event) {

    if (event.articulationID == 59) {

        // MP Script already sent CC58=10; add CC58=80 just before the note
        cc.channel = event.channel;
        cc.value = 80;
        cc.send();
    }

    event.send();
}


----------



## procreative

Understood. But there is a unique feature in the ArtzID CSS script that bypasses the trills as they are and puts the HT and WT on separate articulations, like most libraries do, which I prefer as it's consistent.

Also, the CSS system makes chordal trills very tricky to play in.

But thank you for your suggestion; I did test it and it does indeed work nicely. On balance, though, it's better for me to stick with ArtzID for the triggering and use the Logic system to apply the names and score symbols.


----------



## Dewdman42

I don't disagree about ArtzID being a good solution. You asked about sending two keyswitches for a single articulation, which ArtzID doesn't do, and I was just trying to help you do that without giving up ArtzID. So you should be able to use the CSS script *AND* a custom add-on script to get the extra keyswitch out... just put the custom script in the second plugin slot, in addition to the CSS script. I actually think the last one I gave you would work for that particular articulation you asked about, but in any case I'm sure the problem is solvable in conjunction with ArtzID if you want to try to figure it out.


----------



## procreative

I am not moaning; you have been very generous, and your script works fine. As it happens, ArtzID is able to switch between Sustain and Legato, as it has code to trigger both needed CCs.

I was experimenting to see if I could bypass the need for any MIDI FX scripts, completely forgetting about the trills fix included in it. So there is really little point in using both systems to trigger the articulations, at least for CSS.

But thanks anyway, and I am sure it will be very helpful for many other scenarios, maybe helpful to those with VSL matrix type situations.


----------



## Dewdman42

Got you. Unfortunately, yes, the built-in Articulation Set doesn't do multiple keyswitches for one ID... so I can't think of any way around having SOME kind of script or MIDI FX plugin to send the extra events it can't produce on its own. That's why I wrote my own script for Kirk Hunter Concert Strings 2 (with a few more in progress); it needed as many as 4 keyswitches per articulation ID to operate the way I wanted.

I think if you're using products that Peter has provided very specific solutions for, that is your best bet!
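The "several keyswitches per ID" idea lends itself to a table-driven Scripter script, so adding a new articulation only means adding a table entry. A rough sketch; all IDs, CC values, and the keyswitch pitch below are placeholders, not any library's real assignments:

```javascript
// Table-driven sketch: each articulation ID maps to a list of keyswitch/CC
// actions fired just before the note. All IDs, CC values and the keyswitch
// pitch below are made-up placeholders, not any library's real assignments.
var actionsForID = {
    59: [ { type: "cc", number: 58, value: 10 },
          { type: "cc", number: 58, value: 80 } ],
    60: [ { type: "note", pitch: 24 },               // keyswitch C0
          { type: "cc", number: 58, value: 85 } ]
};

// Pure helper: look up the action list (empty list = pass through untouched).
function actionsFor(id) {
    return actionsForID[id] || [];
}

function HandleMIDI(event) {
    var actions = actionsFor(event.articulationID);
    for (var i = 0; i < actions.length; i++) {
        var a = actions[i];
        if (a.type === "cc") {
            var cc = new ControlChange;
            cc.channel = event.channel;
            cc.number = a.number;
            cc.value = a.value;
            cc.send();
        } else if (a.type === "note") {
            var on = new NoteOn;              // brief keyswitch note
            on.channel = event.channel;
            on.pitch = a.pitch;
            on.velocity = 100;
            on.send();
            var off = new NoteOff(on);        // copies pitch/channel from the NoteOn
            off.send();
        }
    }
    event.send();
}
```

As with the earlier scripts, this assumes the Articulation Set only names the articulations and defines no output, so the articulationID survives into Scripter.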


----------



## Peter Schwartz

FWIW, ARTz•ID V2 (just released) handles patches that need to see multiple keyswitches. In response to an ID it can send up to 4 keyswitch notes. Or, alternatively, it can send incoming notes out via MIDI on up to 4 different channels, which is great for layering sounds. And you can mix and match the two. The Script was originally intended for use with Capsule patches, so it currently handles 16 articulations. That can be expanded pretty easily, but I'll await customer feedback on that.


----------



## Symfoniq

Peter Schwartz said:


> FWIW, ARTz•ID V2 (just released) handles patches that need to see multiple keyswitches. In response to an ID it can send up to 4 keyswitch notes. Or, alternatively, it can send incoming notes MIDI out on up to 4 different channels, which is great for layering sounds. And you can mix and match the two. The Script was originally intended for use with Capsule patches, so it currently handles 16 articulations. That can be expanded pretty easily, but I'll await customer feedback on that.



I'm curious if it can handle patches that need multiple settings for the same CC? Bear with me here...

The best example is Cinematic Studio Strings. I like controlling the library using the built-in CC58 articulation switching. Because of the way CSS is designed, I sometimes need to set CC58 twice, first to (for example) switch to the Sustain patch (CC58 value 6), then to turn off Legato (CC58 value 76).

Cubase's Expression Maps _cannot_ do this. If you set CC58 more than once with different values for the same mapping, rather than buffering them and "firing" them in order, it just sets CC58 to one of the values (the last one listed, I think).

Does ARTz•ID behave differently?


----------



## Peter Schwartz

Hi @Symfoniq,

Thanks for your question about this. The short answer is "yes". Here's the slightly longer answer... 

My Cinematic Studio Strings script lets users switch to all articulations, in all possible combinations (including a few that don't exist in the native CSS programming), sometimes by firing off multiple CC's simultaneously (no buffering required), sometimes by sending keyswitch notes, sometimes both (multiple CC's + keyswitches).

When developing this Script, I remember being stymied at first because some combinations of MIDI events that _should_ have worked to switch articulations (in theory) just didn't work. It took a while to figure out exactly which events to send -- and in what order (this was crucial) -- so that someone could switch between any two random articulations and never have a misfire.


----------



## Dewdman42

Peter Schwartz said:


> FWIW, ARTz•ID V2 (just released) handles patches that need to see multiple keyswitches. In response to an ID it can send up to 4 keyswitch notes. Or, alternatively, it can send incoming notes MIDI out on up to 4 different channels, which is great for layering sounds. And you can mix and match the two. The Script was originally intended for use with Capsule patches, so it currently handles 16 articulations. That can be expanded pretty easily, but I'll await customer feedback on that.



I looked over the info in your site for v2. You mentioned that something changed in LPX 10.4 midi infrastructure that allow you to support multi timbral operation through a single multi instrument such as vepro. What changed in 10.4 in this regard?


----------



## Peter Schwartz

Hi Dewd,

My systems have always supported multi-timbral operation through a single multi instrument like VEPro. Or Kontakt. Or Play. Even the most basic of all the Scripts supported the ability to send MIDI out on multiple channels if desired. What I mentioned on the website has to do with the capabilities of Articulation Sets, which indeed represent a huge change in the way Logic handles MIDI.


----------



## Symfoniq

Peter Schwartz said:


> Hi @Symfoniq,
> 
> Thanks for your question about this. The short answer is "yes". Here's the slightly longer answer...
> 
> My Cinematic Studio Strings script lets users switch to all articulations, in all possible combinations (including a few that don't exist in the native CSS programming), sometimes by firing off multiple CC's simultaneously (no buffering required), sometimes by sending keyswitch notes, sometimes both (multiple CC's + keyswitches).
> 
> When developing this Script, I remember being stymied at first because some combinations of MIDI events that _should_ have worked to switch articulations (in theory) just didn't work. It took a while to figure out exactly which events to send -- and in what order (this was crucial) -- so that someone could switch between any two random articulations and never have a misfire.



Thanks Peter!


----------



## robh

Peter,
How are you doing the multiple ports? I don't see it anywhere in the API. (No "event.port" thing if you know what I mean.)

Rob


----------



## Peter Schwartz

Hi Rob,

I'm doing it by...

 Whoops! I almost forgot, I never discuss the code or my methods. But I'll be happy to tell you about other stuff, like my approach to bluffing in poker (as long as we're not playing at the same table), or how I got my dogs to spin in circles on command, which was much easier than doing the ports thing.

-=sKi=-


----------



## Dewdman42

I’m assuming it’s using the same trick as the vepro multiport macros, except in a script instead of in the environment; a much better place to do it!


----------



## PeterJCroissant

Hi Peter, @Peter Schwartz 

About to pull the trigger on this.

One small question if I may,

Just using Logic's own articulation controls:

If I am using a multi in Kontakt, and each instance/articulation of this particular VI is on a different MIDI channel (well, that's another question: does it need to be any more?), I then have a problem recording dynamic CC info, as all the instances are on different MIDI channels.

So I guess, with ArtID now they would be all on the same midi channel? 

Hope that made some kind of sense!

Best
Pete


----------



## procreative

Peter Schwartz said:


> Hi Rob,
> 
> I'm doing it by...
> 
> Whoops! I almost forgot, I never discuss the code or my methods. But I'll be happy to tell you about other stuff, like my approach to bluffing in poker (as long as we're not playing at the same table), or how I got my dogs to spin in circles on command, which was much easier than doing the ports thing.
> 
> -=sKi=-



Hello Peter, does your multi limit processing of the instruments? In Logic, multi-timbral tracks have to share some resources, don't they? For instance, you cannot use separate MIDI FX on each track, and if I remember right, the FX have to be the same?

Is your implementation different to a standard multitimbral instance which uses Auxes of sorts?


----------



## PeterJCroissant

PeterJCroissant said:


> Hi Peter, @Peter Schwartz
> 
> About to pull the trigger on this.
> 
> One small question if I may,
> 
> Just using logics own articulation controls,
> 
> If I am using a multi in kontakt, and each instance/articulation of this particular VI is on a different midi channel. (Well that’s another question, does it need to be any more?) I have a problem then recording Dynamic CC info as all the instances are in different midi channels.
> 
> So I guess, with ArtID now they would be all on the same midi channel?
> 
> Hope that made some kind of sense!
> 
> Best
> Pete




OK, I bought it and read the manual, and it says this, so I think this is the answer... good news!


• CC Cloning ensures that all patches in a multi-timbral plugin receive common CC information, such as CC#11 or sustain pedal, by using CC messages recorded exclusively on channel 1 as a “master” for all of the patches in the plugin. This greatly simplifies the control of dynamics and common CC-manipulated attributes of all patches in a multi-timbral patch configuration.
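The behaviour that excerpt describes can be approximated in a plain Scripter script of your own. This is only an illustration of the idea, not the actual ARTz•ID CC Cloner code:

```javascript
// Sketch of CC cloning: any CC received on channel 1 is copied to
// channels 2..16, so every part of a multi-timbral plugin sees the same
// dynamics data. An illustration of the idea only, not the ARTz-ID code.
var NUM_CHANNELS = 16;

// Pure helper: list the channels a CC on sourceChannel should be cloned to.
function cloneTargets(sourceChannel) {
    if (sourceChannel !== 1) return [];     // only channel 1 acts as the master
    var targets = [];
    for (var ch = 2; ch <= NUM_CHANNELS; ch++) {
        targets.push(ch);
    }
    return targets;
}

function HandleMIDI(event) {
    event.send();                           // always pass the original through
    if (event instanceof ControlChange) {
        var targets = cloneTargets(event.channel);
        for (var i = 0; i < targets.length; i++) {
            var copy = new ControlChange(event);  // same CC number and value
            copy.channel = targets[i];
            copy.send();
        }
    }
}
```

Notes and keyswitches pass through untouched; only CC data recorded on channel 1 fans out to the other parts.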


----------



## robh

Peter Schwartz said:


> Hi Rob,
> 
> I'm doing it by...
> 
> Whoops! I almost forgot, I never discuss the code or my methods. But I'll be happy to tell you about other stuff, like my approach to bluffing in poker (as long as we're not playing at the same table), or how I got my dogs to spin in circles on command, which was much easier than doing the ports thing.
> 
> -=sKi=-


You stinker! 

Yeah, I was afraid of that. I was wondering if you had a more exhaustive list of the API than the manual revealed.

Rob


----------



## robh

Dewdman42 said:


> I’m assuming it’s using the same trick as the vepro multiport macros, except in a script instead of in the environment; a much better place to do it!


GASP!! Thanks for the clue! Just tested it in a script and it works!
Peter, your secret is out!! But I'm not telling either. 

Rob


----------



## PeterJCroissant

@Peter Schwartz

OK, I like it! It's going to take me a while to go through my template and name and configure everything... but it all works! So I'm very happy, thank you!


The only mistake I made was buying NameX, as it was bundled with ArtzID anyway! Doh!


----------



## Peter Schwartz

@robh
Sorry 'bout dat. It's been my long-standing policy to never discuss the code. All I can say is that I tested the bejesus out of it before releasing it into the wild, so I know it's solid.

@PeterJCroissant
Indeed, CC Cloning is the answer to your question. 

@procreative
Indeed, when you use a multi-part/multi-track configuration in Logic you are dealing with a single software instrument channel and up to 15 additional "aliases" of that SI channel. They all have MIDI channel settings. The original "real channel" gets set to ch1, the aliases to consecutively higher numbered channels. These act as channel filters. So if you select track 3 (alias instrument 3), only MIDI on channel 3 will be sent to that alias.

Since you're only dealing with one real instrument channel, any MIDI FX apply to the entire instrument. For a MIDI effect to work independently for each channel there would have to be a "poly channel" version of the FX, with up to 16 sets of GUI controls so that each one could be set independently for each channel. The Singularity Script acts "poly channelly" but regular MIDI FX do not.

On a similar-ish note, the faders you see on the real channel and on all the alias channels are one and the same. So if you move that fader on any one channel, it moves exactly the same amount on all the others, because there's really only one fader to begin with.

Audio FX will also apply to the entire instrument _if_ you're returning all audio from the plugin to its stereo output. But if you've got the plugin configured for multiple outputs, and signal from outputs 3-4 and higher return to Auxes, then you can achieve independent control over level and FX. Here, the fader on all of the channels (including the aliases) will only affect the level of audio returning to channel 1. And audio FX strapped across the original instrument channel will only affect the stereo output's signal. All other level and FX control is achieved on the Auxes.

(Need more coffee. BRB.)


----------



## Peter Schwartz

@PeterJCroissant 
I saw that you purchased nameX separately so I issued you a refund for it.

BTW, I really appreciate all this discussion about ARTz•ID, but I'd like to suggest discussing further details about the system in the ARTz•ID V2 thread in the Commercial Tier 2 section. That way this thread can continue as a general discussion on Logic 10.4 articulation stuff.


----------



## Leon Portelance

I also purchased namex separately.


----------



## Peter Schwartz

Leon Portelance said:


> I also purchased namex separately.


No problem, you'll get a refund/rebate.


----------



## karusz

1. Is there a way to control articulations using an iPad?
2. When I change tracks, can I see the different articulations available for each track?


----------



## Peter Schwartz

1. Yes, if it outputs any kind of CC message, program change messages, or MIDI notes.
2. Yes, of course. 

For any further questions about my system, please post in my ARTz•ID V2 thread because I think this thread is kinda getting hijacked by so much discussion about my system.


----------



## karusz

I sent you also an email


----------



## Nick Batzdorf

Just a casual reminder to anyone this applies to: it's against The Rules to create additional VI Control accounts under different names, let alone doing that to promote/dis competing products.


----------



## WindcryMusic

Has anyone else noticed a dramatic increase in the frequency of Logic Pro X crashing to the desktop after starting to use articulations, always upon the sending of an articulation? The fault seems to lie with Kontakt 5.7.3, according to the crash reports. It often happens to me within a few minutes of starting to use articulations in a Logic project... in fact, I'm not sure how much I'll be able to use them at the current rate of crashes. I've submitted a bug report to NI. (So far I'm using Spitfire palettes with ARTzID 2.0... I don't know whether the crashes are specific to that set of libraries and components or not.)


----------



## Peter Schwartz

I'm tempted to think it's a Kontakt problem, perhaps (?) specific to 5.7.3. I've been using Kontakt 5.6.6 and haven't experienced anything like this during development of V2. Ultimately, the messages V2 sends to plugins are no different from (and not in excess of) what they'd normally need to receive to switch articulations, except that the timing of those messages is extremely tight with respect to the onset of the notes. However, if you were to use Logic's built-in articulation switching, you'd find it performs in exactly the same way.

It would definitely help to try and narrow down the players... Is it always the same libraries and palettes? Does it happen with Play-based instruments as well? Etc.


----------



## Dewdman42

I haven’t been getting any crashes and I’ve been using articulation sets a lot lately with heavy scripting. Did you upgrade to 10.4.1 yet?


----------



## WindcryMusic

Dewdman42 said:


> I haven’t been getting any crashes and I’ve been using articulation sets a lot lately with heavy scripting. Did you upgrade to 10.4.1 yet?



Yes. This seemed to start after that (although I wasn't making heavy use of articulations before the update, so I don't think it was necessarily introduced by it). Anyway, the likelihood is that Kontakt is at fault, given that the console log always points to it.


----------



## WindcryMusic

Peter Schwartz said:


> I'm tempted to think it's a Kontakt problem, perhaps (?) specific to 5.7.3. I've been using Kontakt 5.6.6 and haven't experienced anything like this during development of V2. Ultimately, the messages V2 sends to plugins are no different (and not in excess) of what they'd normally need to receive to switch articulations, except that the timing of those messages is extremely tight with respect to the onset of the notes. However, if you were to use Logic's built-in articulation switching, you'd find it performs in exactly the same way.
> 
> It would definitely help to try and narrow down the players... Is it always the same libraries and palettes? Does it happen with Play-based instruments as well? Etc.



I agree about Kontakt being the most likely culprit. Hopefully NI will use the crash log I sent them to figure it out.

It's been happening with a variety of palettes from several Spitfire libraries (mostly SSO, but not exclusively). I don't have any Play libraries, so I can't check that.

I do have the CC Cloner multiscript currently running in Kontakt, and I'm not sure it is strictly required for my current use cases (the ones I've told you about via email, where everything is on channel 1), so I am going to try disabling that, in case it is happening in that part of Kontakt.


----------



## WindcryMusic

I think I might be on to something regarding these crashes. I took a closer look at the stack dump from the crashed thread, and here are the last several lines:

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 Kontakt 5.MusicDevice.component 0x000000012c06de34 NI::UIA::Picture::getAnimationWidth() const + 4
1 Kontakt 5.MusicDevice.component 0x000000012b132f13 ScriptUIModuleBase::setLabel(int, int, int) + 1859
2 Kontakt 5.MusicDevice.component 0x000000012b130611 ScriptUIModuleBase::setControlPositions() + 4145
3 Kontakt 5.MusicDevice.component 0x000000012b11a3ce ScriptUIModuleBase::SyncToEngine(bool) + 206
4 Kontakt 5.MusicDevice.component 0x000000012b11989d PerfViewModule::onEvent(unsigned int, NI::UIA::EventData*) + 301

This looks to me like something in either the Spitfire UI or the Kontakt UI (I'd suspect the former, but I don't have a symbol file for Kontakt, so I can't know) is attempting to respond to an event such as an articulation change, and then crashes while measuring the size of some screen element (as if perhaps the element hasn't been loaded yet):

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000038
Exception Note: EXC_CORPSE_NOTIFY

So tonight I've been trying to work with the Kontakt UI closed during articulation changes (which is awkward, since I'm in the process of creating articulation mappings), and I've been able to continue for a while now without any more crashes. If this holds up, then the crashes would probably be a less common issue in actual production use, since once I have my articulation mappings set up, I won't need to have the UI open nearly as much. Still, it seems like something that should be fixed by one of the parties involved.

It would be interesting to hear from others who have not had these crashes with 5.7.3 (or earlier versions), and especially others with Spitfire libraries: have you had the Kontakt UIs for these libraries visible during articulation changes? If not, you might try leaving the UI open for a while and see whether these same crashes start.


----------



## samphony

I too had crashes in the past if I left the Kontakt spitfire GUI open.


----------



## babylonwaves

WindcryMusic said:


> So tonight I've been trying to work with the Kontakt UI closed during articulation changes (which is awkward, since I'm in the process of creating articulation mappings), and I've been able to continue for a while now without any more crashes. If this holds up, then the crashes would probably be a less common issue in actual production use, since once I have my articulation mappings set up, I won't need to have the UI open nearly as much. Still, it seems like something that should be fixed by one of the parties involved.


i have had this type of crash as well when testing out my maps. as it turns out, i never really got to the bottom of it. for a while i had the suspicion that it only happens when you switch to one of the legato instruments (assuming you have multiple SF instruments stacked in one kontakt instance). which library are you having this problem with? with me, it was symphonic strings.

this is how it looks here. i'd say that's the same crash:

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000038
Exception Note: EXC_CORPSE_NOTIFY

Termination Signal: Segmentation fault: 11
Termination Reason: Namespace SIGNAL, Code 0xb
Terminating Process: exc handler [0]

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 Kontakt 5.MusicDevice.component 0x0000000136386e34 NI::UIA::Picture::getAnimationWidth() const + 4
1 Kontakt 5.MusicDevice.component 0x000000013544bf13 ScriptUIModuleBase::setLabel(int, int, int) + 1859
2 Kontakt 5.MusicDevice.component 0x0000000135449611 ScriptUIModuleBase::setControlPositions() + 4145
3 Kontakt 5.MusicDevice.component 0x00000001354333ce ScriptUIModuleBase::SyncToEngine(bool) + 206
4 Kontakt 5.MusicDevice.component 0x000000013543289d PerfViewModule::onEvent(unsigned int, NI::UIA::EventData*) + 301
5 Kontakt 5.MusicDevice.component 0x0000000135a2406e NI::NGL::SubFormControl::onEvent(unsigned int,


----------



## procreative

WindcryMusic said:


> I think I might be on to something regarding these crashes. I took a closer look at the stack dump from the crashed thread, and here are the last several lines:
> 
> Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
> 0 Kontakt 5.MusicDevice.component 0x000000012c06de34 NI::UIA::Picture::getAnimationWidth() const + 4
> 1 Kontakt 5.MusicDevice.component 0x000000012b132f13 ScriptUIModuleBase::setLabel(int, int, int) + 1859
> 2 Kontakt 5.MusicDevice.component 0x000000012b130611 ScriptUIModuleBase::setControlPositions() + 4145
> 3 Kontakt 5.MusicDevice.component 0x000000012b11a3ce ScriptUIModuleBase::SyncToEngine(bool) + 206
> 4 Kontakt 5.MusicDevice.component 0x000000012b11989d PerfViewModule::onEvent(unsigned int, NI::UIA::EventData*) + 301
> 
> This looks to me like something in either the Spitfire UI or the Kontakt UI (I'd suspect the former, but I don't have a symbol file for Kontakt of course, so I can't know) is attempting to respond to an event like the change of articulation, and then resulting in the following crash while it is attempting to measure the size of some screen element (as if perhaps the element hasn't been loaded yet):
> 
> Exception Type: EXC_BAD_ACCESS (SIGSEGV)
> Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000038
> Exception Note: EXC_CORPSE_NOTIFY
> 
> So tonight I've been trying to work with the Kontakt UI closed during articulation changes (which is awkward, since I'm in the process of creating articulation mappings), and I've been able to continue for a while now without any more crashes. If this holds up, then the crashes would probably be a less common issue in actual production use, since once I have my articulation mappings set up, I won't need to have the UI open nearly as much. Still, it seems like something that should be fixed by one of the parties involved.
> 
> It would be interesting to hear if others who have not had these crashes with 5.7.3 (or earlier versions), and especially others who have Spitfire libraries, could respond as to whether they've had the Kontakt UIs for these libraries visible during the articulation changes, and if not, if they might try leaving the UI open for a while and see if they start having these same crashes.



Just a thought, maybe you could open your Kontakt instrument in standalone just to view it while setting up the maps? I have not got around to Spitfire libraries yet, but haven't had crashes anywhere else... yet.


----------



## WindcryMusic

procreative said:


> Just a thought, maybe you could open your Kontakt instrument in standalone just to view it while setting up the maps? I have not got around to Spitfire libraries yet, but haven't had crashes anywhere else... yet.



I am using ARTzID to combine multiple palettes into a single Kontakt instance ... I'm not sure if I can do the same thing without ARTzID in the mix. Also, if the crash is in Kontakt scripting code, I'm betting it would occur in standalone as well. Still, it might be worth a try to verify things. Thanks!


----------



## WindcryMusic

babylonwaves said:


> i have had this type of crash as well when testing out my maps. as it turns out, i never really got to the bottom of it. for a while i had the suspicion that it only happens when you switch to one of the legato instruments (assuming that you have multiple SF instruments stacked in one kontakt instance). which library are you having this problem with? with me, it was symphonic strings.
> 
> this is how it looks here. i'd say that's the same crash:
> 
> Exception Type: EXC_BAD_ACCESS (SIGSEGV)
> Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000038
> Exception Note: EXC_CORPSE_NOTIFY
> 
> Termination Signal: Segmentation fault: 11
> Termination Reason: Namespace SIGNAL, Code 0xb
> Terminating Process: exc handler [0]
> 
> Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
> 0 Kontakt 5.MusicDevice.component 0x0000000136386e34 NI::UIA::Picture::getAnimationWidth() const + 4
> 1 Kontakt 5.MusicDevice.component 0x000000013544bf13 ScriptUIModuleBase::setLabel(int, int, int) + 1859
> 2 Kontakt 5.MusicDevice.component 0x0000000135449611 ScriptUIModuleBase::setControlPositions() + 4145
> 3 Kontakt 5.MusicDevice.component 0x00000001354333ce ScriptUIModuleBase::SyncToEngine(bool) + 206
> 4 Kontakt 5.MusicDevice.component 0x000000013543289d PerfViewModule::onEvent(unsigned int, NI::UIA::EventData*) + 301
> 5 Kontakt 5.MusicDevice.component 0x0000000135a2406e NI::NGL::SubFormControl::onEvent(unsigned int,



I agree ... looks like the same crash. I have had it on both Symphonic Strings and Symphonic Brass so far, and (not sure about this) possibly on Albion One as well. Not with legatos ... just with standard palettes. I suspect it is equally likely with any palette-style patch, including legatos with multiple articulations (e.g., the performance legatos).


----------



## procreative

WindcryMusic said:


> I am using ARTzID to combine multiple palettes into a single Kontakt instance ... I'm not sure if I can do the same thing without ARTzID in the mix. Also, if the crash is in Kontakt scripting code, I'm betting it would occur in standalone as well. Still, it might be worth a try to verify things. Thanks!



You dont "necessarily" have to use ArtzID to combine them. All you would need to do is change the midi channel in the middle tab of the Articulation Set window. For example first Spitfire palette, all set to channel 1, second channel 2 etc.

When this crash happens, are there any CCs recorded? Wondering if that bug of CCs having IDs attached might be the culprit. So far in my tests it drives the GUI mad, but has no effect on output. But maybe the Spitfire GUI script is not so forgiving? *

* EDIT: As you are using ArtzID, its probably not relevant.


----------



## WindcryMusic

procreative said:


> You don't "necessarily" have to use ARTzID to combine them. All you would need to do is change the MIDI channel in the middle tab of the Articulation Set window. For example, first Spitfire palette all set to channel 1, the second to channel 2, etc.



Alas, that isn't compatible with the way I am setting these up. I don't want to bind them to specific MIDI channels because I am building UACC control templates in Lemur that are as universal as possible to Spitfire instruments (SSO, SCS, Albion ONE, etc). Since (for example) one library will have the UACC #49 articulation in its Core palette, while another will have it in its Decorative palette, if I switched to programming against individual MIDI channels I'd need a separate Lemur template for each and every library (ugh).

EDIT: here's what I'm building ...


----------



## procreative

Not sure why? If the Lemur template is like mine, it uses CC32 and a CC value; the trigger can always be the same. It's the output that changes. I have several set up that use the same button/value but the routing changes.

Separate the button that triggers the Articulation Set slot from what it outputs. All my buttons use CC32 values; what they output changes depending on whether the patch uses note keyswitches, UACC, CC, or separate patches by MIDI channel.

For instance, Staccato is the same button for Hollywood Strings as for Sable on mine, even though the trigger method at the instrument is different.

However, the point of UACC should mean you don't need to do this, as every articulation has its own CC value, so stacking slots should make no difference.


----------



## procreative

By the way, I am using a great Lemur template called Composer Tools Pro in conjunction with Osculator. Every time I select a track in Logic, it auto-loads the correct Lemur layout for it with all articulations labelled.

I've even got a page for all the useful Logic key commands, using Osculator to translate buttons that output CC into key commands.

It's like combining TouchOSC with Metagrid into one interface.

I'm slowly turning into a nerd, spending more time optimising than composing, grrr... it's so addictive!


----------



## WindcryMusic

procreative said:


> Not sure why? If the Lemur template is like mine, it uses CC32 and a CC value; the trigger can always be the same. It's the output that changes. I have several set up that use the same button/value but the routing changes.
> 
> Separate the button that triggers the Articulation Set slot from what it outputs. All my buttons use CC32 values; what they output changes depending on whether the patch uses note keyswitches, UACC, CC, or separate patches by MIDI channel.
> 
> For instance, Staccato is the same button for Hollywood Strings as for Sable on mine, even though the trigger method at the instrument is different.
> 
> However, the point of UACC should mean you don't need to do this, as every articulation has its own CC value, so stacking slots should make no difference.



Well, I want to use ARTzID so as to avoid the CC's with articulations issue. ARTzID overrides Logic's Articulation Sets output page. I'm using the UACC Mapper, which doesn't provide the ability to set both MIDI channels and UACC mappings for the same input CC, so I have to have the palettes all on MIDI channel 1 for it to work.


----------



## procreative

WindcryMusic said:


> Well, I want to use ARTzID so as to avoid the CC's with articulations issue. ARTzID overrides Logic's Articulation Sets output page. I'm using the UACC Mapper, which doesn't provide the ability to set both MIDI channels and UACC mappings for the same input CC, so I have to have the palettes all on MIDI channel 1 for it to work.



Why do you need to channelise them anyway? The point of UACC is that a multi should be all on the same MIDI channel. If you have 2 SCS palettes for Violin 1, every articulation will have its own CC value.

By the way, Spitfire recommends you use UACC KS rather than UACC as timing is better. This involves using the same KS note combined with a value.


----------



## WindcryMusic

procreative said:


> Why do you need to channelise them anyway? The point of UACC is that a multi should be all on the same MIDI channel. If you have 2 SCS palettes for Violin 1, every articulation will have its own CC value.
> 
> By the way, Spitfire recommends you use UACC KS rather than UACC as timing is better. This involves using the same KS note combined with a value.



That's interesting. I didn't know how UACC KS worked at all. You're saying it always sends the same MIDI note, but with different velocity values representing the UACC value? I'll want to do some research to see what the differences/advantages are.

But outside of that, I think we are going around in circles, since this started with the suggestion that I should be channelizing the palettes in order to use Kontakt standalone. In any event, I'm not inclined to go through a complete rewrite of my existing template (which is well along now) in order to get around a temporary bug that mainly just impacts the creation thereof anyway. I would like to get back to writing music before the end of the millennium.


----------



## babylonwaves

WindcryMusic said:


> That's interesting. I didn't know how UACC KS worked at all. You're saying it always sends the same MIDI note, but with different velocity values representing the UACC value? I'll want to do some research to see what the differences/advantages are.


you need to use UACC-KS; UACC (without KS) will not work well once you assign articulations to individual notes in a chord.


----------



## procreative

WindcryMusic said:


> That's interesting. I didn't know how UACC KS worked at all. You're saying it always sends the same MIDI note, but with different velocity values representing the UACC value? I'll want to do some research to see what the differences/advantages are.
> 
> But outside of that, I think we are going around in circles, since this started with the suggestion that I should be channelizing the palettes in order to use Kontakt standalone. In any event, I'm not inclined to go through a complete rewrite of my existing template (which is well along now) in order to get around a temporary bug that mainly just impacts the creation thereof anyway. I would like to get back to writing music before the end of the millennium.



I was only suggesting using Kontakt standalone to see what to program, not to actually run it.

But I was only referring to your point about multiple patches addressing UACC, they should work fine all set to the same midi channel.


----------



## WindcryMusic

procreative said:


> I was only suggesting using Kontakt standalone to see what to program, not to actually run it.



The difficulty I would have is that a significant part of the reason I want the UI open is to confirm that my template adjustments are actually triggering the intended articulations. It isn't always that easy for me to tell aurally ... some SF brass instruments, for example, have Marcatos and Tenutos that are nearly indistinguishable.

Anyway, thanks for all of the input ... lots for me to think about today before I can get back to working on this later.


----------



## Peter Schwartz

> By the way, Spitfire recommends you use UACC KS rather than UACC as timing is better. This involves using the same KS note combined with a value.



According to the Spitfire info, UACC-KS is an articulation-switching scheme in which a MIDI note of one pitch only is used to signify a change of articulation, the velocity value of that note being the indicator of which articulation is to be selected. The velocity value of the note corresponds directly to the UACC values for CC 32 messages. So it's not a combination of messages. It's one message with a variable velocity value.

To clarify, there is no reason to use UACC-KS with ARTzID. Regular ol' locking-to-UACC is what you want. There will be no improvement or difference in timing using UACC-KS over UACC. If anything, and this is entirely "in theory", using UACC will result in less MIDI traffic: a keyswitch note eventually needs a matching Note Off, while a CC#32 switch is a single three-byte message with nothing to cancel. So with CCs you "get there" with half the data per switch.

None of my Scripts output variable velocity keyswitch notes except for the Cinematic Strings stuff. As for Spitfire, I could easily modify the UACC mapper to output UACC-KS keyswitches, but there is no technical advantage to it. The only advantage would be to cater to customer preference, for those people who want to use that switching scheme for whatever reason. I'm willing to do that, but again, there's no technical advantage at all.
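To make the relationship concrete, here's a minimal sketch in Python (hypothetical event dictionaries, not Logic's or Kontakt's actual internals; the fixed C-2 keyswitch pitch, MIDI note number 0, follows the Spitfire convention described above):

```python
# Sketch: UACC and UACC-KS carry the same information in different message types.
# Assumption: events are plain dicts; C-2 = MIDI note number 0 is the UACC-KS pitch.

UACC_KS_PITCH = 0  # C-2

def uacc_to_uacc_ks(cc32_value, channel=1):
    """A CC#32 articulation switch becomes a NoteOn on C-2 whose
    velocity *is* the UACC value."""
    return {"type": "NoteOn", "pitch": UACC_KS_PITCH,
            "velocity": cc32_value, "channel": channel}

def uacc_ks_to_uacc(note_event):
    """The reverse mapping: read a C-2 NoteOn's velocity back as CC#32."""
    assert note_event["pitch"] == UACC_KS_PITCH
    return {"type": "CC", "number": 32,
            "value": note_event["velocity"], "channel": note_event["channel"]}

ks = uacc_to_uacc_ks(49)   # UACC value 49 as a keyswitch note
cc = uacc_ks_to_uacc(ks)   # and back again: CC#32, value 49
```

Either direction is a one-to-one mapping, which is the point above: it's one message with a variable data byte, not a combination of messages.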

[EDIT] There are two schemes you can use with ARTzID to handle multiple palettes. The first is to use the UACC Mapper Script, with all palettes set to channel 1. This is the ideal situation. But -- and this is something Windcry and I have discussed offline -- some older Spitfire libraries' palettes are problematic in that you can't set them all to the same MIDI channel and expect an articulation change to (say) a sound from the decorative palette to stop the previous core palette selection from sounding. And vice versa. It's confounding!

It's in this situation that each palette needs to be on its own MIDI channel. And for that specific situation I think I could create a UACC version of the Combinatrix with a mapping that caters to the way @WindcryMusic is doing things.


----------



## Dewdman42

Not that this necessarily makes sense for Artzid, but just for reference, Spitfire did put out the following support document where they recommend using UACC-KS with articulation sets:

https://spitfireaudio.zendesk.com/h...0-4-Articulation-Sets-with-Spitfire-libraries

I'm not sure why they recommend using the KS version with articulation sets.


----------



## procreative

Dewdman42 said:


> Not that this necessarily makes sense for Artzid, but just for reference, Spitfire did put out the following support document where they recommend using UACC-KS with articulation sets:
> 
> https://spitfireaudio.zendesk.com/h...0-4-Articulation-Sets-with-Spitfire-libraries
> 
> I'm not sure why they recommend using the KS version with articulation sets.



Yes, I am wondering that too.

I noticed this in the comments:

_"The reason UACC KS is preferable to UACC is that it allows you to select multiple articulations, with UACC the articulation is selected by setting CC#32 to a specific value and MIDI controls can only hold one value at a time so only one articulation can be selected at any point in time. 

Using UACC KS means that a MIDI note on message (on C-2 in this case) can select an articulation with a specific velocity. Whilst a note can only have one velocity at a time, Logic (and many other DAW/Sequencers) can sequence several note on messages at once with different velocity values on the same note. 

What this means practically is that you could, for example, easily sequence a chord where some notes are regular Longs and some are Long CS (or tremolo, etc) using articulation sets."_
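Sandy's argument can be rendered as data (a hedged sketch with made-up event tuples; the C-2 keyswitch pitch is from the article): several NoteOns on the same keyswitch pitch can coexist at one position, each carrying its own velocity, whereas a single CC#32 can only hold its last value.

```python
KS_PITCH = 0  # C-2, the fixed UACC-KS keyswitch pitch from the article

def uacc_ks_chord(pairs):
    """pairs: (uacc_value, pitch) for each chord note. Each note gets its
    own keyswitch NoteOn at the same position; this works because every
    NoteOn carries its own velocity, unlike one CC#32 holding one value."""
    stream = []
    for uacc, pitch in pairs:
        stream.append(("NoteOn", KS_PITCH, uacc))  # velocity = UACC value
        stream.append(("NoteOn", pitch, 100))      # the musical note itself
    return stream

# e.g. one chord note as regular Longs, one as tremolo (values illustrative):
chord = uacc_ks_chord([(1, 60), (42, 64)])
```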


----------



## babylonwaves

i don't know why timing frequently pops up on this forum with UACC KS. it's an urban myth. if you use UACC controllers and you aim for multiple notes to play different articulations at the exact same position, this will not work. so for instance, a chord on the same instrument consisting of three notes playing three different articulations will not work as expected.
though, if you spread the notes (delay each by some ticks) so every note has its own position in time, it will work. with UACC KS you don't have this problem, because the spitfire kontakt script understands which articulation switch belongs to which note.
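The spread-by-ticks workaround described here can be sketched as follows (hypothetical event tuples; the tick offset is arbitrary):

```python
def spread_chord(pairs, ticks=2):
    """pairs: (cc32_value, pitch) tuples that would otherwise all start at
    the same position. Each pair is nudged later by a few ticks so every
    CC#32 switch unambiguously belongs to exactly one note."""
    out = []
    for i, (cc_value, pitch) in enumerate(pairs):
        pos = i * ticks                      # each pair gets its own position
        out.append((pos, "CC32", cc_value))  # the switch...
        out.append((pos, "NoteOn", pitch))   # ...and its note
    return out

# three notes, three articulations, now separated in time:
stream = spread_chord([(60, 60), (30, 64), (70, 67)])
```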


----------



## Peter Schwartz

Yup, that's the article I was referencing in my previous post. But it doesn't apply to ARTzID. Of course, if you're going to roll your own articulation sets, that's a good guide to setting one up, but I still don't see any advantage to using UACC-KS vs. UACC. 

In the article, Sandy from Spitfire wrote:
_
"The reason UACC KS is preferable to UACC is that it allows you to select multiple articulations, with UACC the articulation is selected by setting CC#32 to a specific value and MIDI controls can only hold one value at a time so only one articulation can be selected at any point in time."_

That doesn't really make sense, because any MIDI event "holds" its value just once. Rather, it's all about timing...

Let's say you had two notes, different pitches, exactly hard quantized:






Here's what's going on under the hood with an articulation set:

The C3 is going to be preceded by its articulation-switching MIDI event (doesn't matter if it's a CC#32 or a UACC-KS C-2) before the C3 fires off. The D3 is going to be preceded by its articulation-switching MIDI event before the D3 is fired off. You will hear a chord with one note sounding sustain, the other staccato.

There doesn't need to be a gap in-between. You'll hear a two-note chord playing different articulations on each note.

So again, I don't see any advantage to using UACC, or UACC-KS, or just keyswitch notes. Whatever the articulation set is programmed to output is what it's going to output _before_ each actual musical note is fired off.
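The under-the-hood behaviour described above can be modelled in a few lines (a sketch with generic tuples, not Logic's actual implementation): the articulation set's output is simply a switch event spliced in front of each note, and serial ordering does the rest.

```python
def render_with_articulations(notes):
    """notes: (position, pitch, cc32_value) triples. Each note is emitted
    immediately preceded by its own articulation switch at the same
    position -- no gap needed, only serial order."""
    stream = []
    for pos, pitch, art in notes:
        stream.append((pos, "CC32", art))      # the switch fires first...
        stream.append((pos, "NoteOn", pitch))  # ...then its note
    return stream

# two hard-quantized notes, one sustain, one staccato (values illustrative):
stream = render_with_articulations([(1.0, 60, 1), (1.0, 62, 42)])
```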


----------



## Dewdman42

Yeah, I agree, not sure why they are recommending it...

MIDI is serial. Even if two events have exactly the same beatpos, one is sent before the other internally. In Scripter you can control that order. I don't know what the articulation set feature does for outgoing keyswitches, but I seem to notice that it tends to send the keyswitch NoteOn, followed by the note, and only after the NoteOff for the actual note is the NoteOff for the keyswitch actually sent out.

I haven't experimented with CC to see what it does, but it's possible that it somehow sends the CC more than once over the course of a held note or something, as if it's a horizontal line in the CC automation window... just guessing wildly now... could be another bug to report to Apple.


----------



## Peter Schwartz

[EDIT - we've cross-posted the same thought] 

I think it all comes down to this... In the above example, the notes appear to be "hard quantized". But in reality, there's no such thing as hard-quantized in MIDI. Even within a DAW, all MIDI information is still sent serially to a plugin. You can have (say) a 10 note hard-quantized chord firing off to a plugin, but it will play back as a very very fast 10 note arpeggio, from lowest to highest note.

*Note*, *Note*, *Note*, *Note*, *Note*... and not as a block of actually simultaneous events.

When you stick articulation-switching messages in the mix -- as generated by an articulation set's output -- that 10 note chord now becomes a 20-MIDI-event arpeggio. If it's UACC messages you're using, here's the arpeggio (values omitted):

CC32, *Note*, CC32, *Note*, CC32, *Note*... etc.

If it's UACC-KS, the arpeggio is:

UACC-KSNote, *Note*, UACC-KSNote, *Note*, UACC-KSNote, *Note*...


----------



## Dewdman42

Hmm, no, I don't think the articulation set is doing anything weird like that, and I also can't see any reason why UACC KS would be necessary. It sort of seems on the surface like a lack of understanding on the part of the support person at Spitfire... or there is something we are missing.

Check out the following example... It has every other note using art1, art2, art1, art2.






Here are the MIDI messages sent out by the articulation set keyswitches...



Code:


[  1.0  ] Control Change   [#32]  val:60    ch:01 art:0    (Bank LSB)
[  1.0  ] NoteOn           [C4]   vel:80    ch:01 art:0
[  1.480] Control Change   [#32]  val:30    ch:01 art:0    (Bank LSB)
[  1.480] NoteOn           [D4]   vel:80    ch:01 art:0
[  1.720] NoteOff          [C4]   vel:64    ch:01 art:0
[  2.0  ] Control Change   [#32]  val:60    ch:01 art:0    (Bank LSB)
[  2.0  ] NoteOn           [E4]   vel:80    ch:01 art:0
[  2.240] NoteOff          [D4]   vel:64    ch:01 art:0
[  2.720] Control Change   [#32]  val:30    ch:01 art:0    (Bank LSB)
[  2.720] NoteOn           [D4]   vel:80    ch:01 art:0
[  3.0  ] NoteOff          [E4]   vel:64    ch:01 art:0
[  3.720] NoteOff          [D4]   vel:64    ch:01 art:0



For reference, here is the same sequence using NoteOn keyswitches




Code:


[  1.0  ] NoteOn           [C0]   vel:60    ch:01 art:0 
[  1.0  ] NoteOn           [C4]   vel:80    ch:01 art:0 
[  1.480] NoteOff          [C0]   vel:64    ch:01 art:0 
[  1.480] NoteOn           [D0]   vel:30    ch:01 art:0 
[  1.480] NoteOn           [D4]   vel:80    ch:01 art:0 
[  1.720] NoteOff          [C4]   vel:64    ch:01 art:0 
[  2.0  ] NoteOff          [D0]   vel:64    ch:01 art:0 
[  2.0  ] NoteOn           [C0]   vel:60    ch:01 art:0 
[  2.0  ] NoteOn           [E4]   vel:80    ch:01 art:0 
[  2.240] NoteOff          [D4]   vel:64    ch:01 art:0 
[  2.720] NoteOff          [C0]   vel:64    ch:01 art:0 
[  2.720] NoteOn           [D0]   vel:30    ch:01 art:0 
[  2.720] NoteOn           [D4]   vel:80    ch:01 art:0 
[  3.0  ] NoteOff          [E4]   vel:64    ch:01 art:0 
[  3.720] NoteOff          [D4]   vel:64    ch:01 art:0


----------



## Peter Schwartz

Another problem with using keyswitch messages as the output of articulation sets -- and I believe this is something @Dewdman42 pointed out somewhere in this thread -- is that the Note Off for the KS message is only sent when a new ID is detected. This is problematic for several reasons...

1) In some plugins, KS notes are detected as an active voice, unnecessarily reducing the polyphony.

2) It causes the keyboard graphics of keyswitching instrument GUIs to display a "stuck keyswitch note". And let's make that plural if you have chords assigned to multiple articulations.

In some ways it makes sense to have the Note Off sent as late as possible, because that's one less MIDI message occupying the data stream while musical note events, or CCs used to manipulate dynamics, are playing down. But it would be just as easy to send the Note Off a fixed amount later, anywhere from 5 - 50 milliseconds, without any downside.

CC#32 messages (or any CC messages, for that matter) don't have an "off" state as Note messages do. Hence, with CC messages there's nothing to "cancel". CC#32 messages are one-shot events and are by far the most efficient means of transmitting articulation switching information. To summarize: they happen once, they cost half the bytes of a Note On/Note Off pair, they don't activate voices, and they don't need an off state.
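Counting raw bytes makes the comparison concrete. A NoteOn, a NoteOff, and a CC message are each three bytes on the wire (running status aside), but a keyswitch note eventually needs its matching NoteOff, while a CC is a one-shot. A back-of-the-envelope sketch:

```python
# Wire cost of the two switching schemes, ignoring running status.
NOTE_ON_BYTES = 3   # status, pitch, velocity
NOTE_OFF_BYTES = 3  # status, pitch, release velocity
CC_BYTES = 3        # status, controller number, value

def keyswitch_cost(n_switches):
    """Every keyswitch NoteOn eventually needs a matching NoteOff."""
    return n_switches * (NOTE_ON_BYTES + NOTE_OFF_BYTES)

def cc_cost(n_switches):
    """A CC#32 switch is a one-shot event: nothing to cancel."""
    return n_switches * CC_BYTES

print(keyswitch_cost(10), cc_cost(10))  # 60 vs. 30 bytes for 10 switches
```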


----------



## Dewdman42

I updated my post above to also include the same example with NoteOn keyswitches. It's not quite how I described it in the past, but notice that the keyswitch NoteOff doesn't happen until the next keyswitch NoteOn. But now we need a test for a chord with different articulations within the chord.








Code:


[  1.240] NoteOff          [C0]   vel:64    ch:01 art:0 
[  1.240] NoteOn           [D0]   vel:30    ch:01 art:0 
[  1.240] NoteOn           [A3]   vel:80    ch:01 art:0 
[  1.240] NoteOff          [D0]   vel:64    ch:01 art:0 
[  1.240] NoteOn           [C0]   vel:60    ch:01 art:0 
[  1.240] NoteOn           [C4]   vel:80    ch:01 art:0 
[  1.240] NoteOff          [C0]   vel:64    ch:01 art:0 
[  1.240] NoteOn           [D0]   vel:30    ch:01 art:0 
[  1.240] NoteOn           [D4]   vel:80    ch:01 art:0 
[  1.240] NoteOff          [D0]   vel:64    ch:01 art:0 
[  1.240] NoteOn           [C0]   vel:60    ch:01 art:0 
[  1.240] NoteOn           [E4]   vel:80    ch:01 art:0 
[  2.240] NoteOff          [E4]   vel:64    ch:01 art:0 
[  2.240] NoteOff          [D4]   vel:64    ch:01 art:0 
[  2.240] NoteOff          [C4]   vel:64    ch:01 art:0 
[  2.240] NoteOff          [A3]   vel:64    ch:01 art:0



Here's the same chord using CCs for keyswitches... and notice: not only is a CC switch cheaper, as Peter pointed out, it also doesn't need a NoteOff message! So less traffic in general...



Code:


[  1.240] NoteOff          [C0]   vel:64    ch:01 art:0  
[  1.240] Control Change   [#32]  val:30    ch:01 art:0    (Bank LSB)
[  1.240] NoteOn           [A3]   vel:80    ch:01 art:0  
[  1.240] Control Change   [#32]  val:60    ch:01 art:0    (Bank LSB)
[  1.240] NoteOn           [C4]   vel:80    ch:01 art:0  
[  1.240] Control Change   [#32]  val:30    ch:01 art:0    (Bank LSB)
[  1.240] NoteOn           [D4]   vel:80    ch:01 art:0  
[  1.240] Control Change   [#32]  val:60    ch:01 art:0    (Bank LSB)
[  1.240] NoteOn           [E4]   vel:80    ch:01 art:0  
[  2.240] NoteOff          [E4]   vel:64    ch:01 art:0  
[  2.240] NoteOff          [D4]   vel:64    ch:01 art:0  
[  2.240] NoteOff          [C4]   vel:64    ch:01 art:0  
[  2.240] NoteOff          [A3]   vel:64    ch:01 art:0


----------



## Dewdman42

My takeaway is that the articulation set is sending all the right keyswitches, but it doesn't like to have more than one keyswitch open (without the NoteOff) at a time. Which is probably a good thing. So in the chord example, look at how all the C0 and D0 keyswitches happen at 1.240 for the chord, always sending the NoteOff for each one before sending the next keyswitch. That is perfectly fine... though... verbose.

CCs don't require a NoteOff event, which actually makes them simpler in many ways, and I can't think of any reason why Spitfire wouldn't work with normal CCs for keyswitches, not to mention the reduced MIDI traffic.


----------



## babylonwaves

Peter Schwartz said:


> *Note*, *Note*, *Note*, *Note*, *Note*... and not as a block of actually simultaneous events.


it's called block processing. time stamped events end up in a ring buffer. i don't want to come across as a smart ass, but you cannot compare the way traditional MIDI works with the way a plug-in receives it and deals with timed information. even if logic sends a stream of events, a plug-in doesn't constantly receive events but gets them only once per _block_.
and, you can try for yourself, UACC (without KS) doesn't work reliably, whatever the reason is for that.


----------



## Peter Schwartz

Yup, exactly. Upon a change of articulation, the NoteOff is sent. 

It's also too bad that MIDI isn't ever handled internally using running status, i.e., that NoteOn msgs with a velocity of zero (sent as running status) aren't used instead of explicit NoteOff messages.


----------



## Dewdman42

Peter Schwartz said:


> It's also too bad that MIDI isn't ever handled internally using running status, i.e., that NoteOn msgs with a velocity of zero (sent as running status) aren't used instead of explicit NoteOff messages.



what do you mean by this?



babylonwaves said:


> it's called block processing. time stamped events end up in a ring buffer. i don't want to come across as a smart ass but you cannot compare the way traditional MIDI works with the way a plug-in receives it and deals with timed information. even if logic sends a stream of events, a plug-in doesn't constantly receives events but only once per _block_



sure but the events are still processed one at a time...they sit on a queue which is processed in order. The ordering matters.



> and, you can try for yourself, UACC (without KS) doesn't work reliably. whatever the reason is for that.



It's interesting that Spitfire is having problems with it. I would really like to understand why. Could be the way they are doing things... but then it doesn't make sense to me that Articulation Sets don't work while ARTzID does.


----------



## Peter Schwartz

Babylon, yes, processing blocks and all... But unless I'm misunderstanding you, it's not like information is sent in spurts every processing block. The timing is preserved. For all intents and purposes, what I wrote above is accurate enough for general discussion (without getting into all the gory details of processing blocks.) 

That aside, the way I described things above is exactly how the system appears to actually function, regardless of the under-the-hood technical details. Polyphonic articulation switching (my term for it) was possible even in the pre-articulationID switching days, i.e., SkiSwitcher could do this. So there's no reason to think that Articulation Sets and their built-in capabilities can't do it. It does it quite easily.

So it's not an urban myth at all. In fact, I just set up a regular ol' Articulation Set (no ARTz•ID involved) and polyphonic articulation switching works just fine. Two simultaneous notes, each sending a different articulation-switching MIDI event. Bingo. No problem.


----------



## Dewdman42

The processing block thing is a moot point. Nothing inside a DAW is ever processed in real time; under the hood, everything happens in blocks where the CPU churns away and does whatever it needs to do to process a block of stuff. All internal timestamps are obviously obeyed, or else it would all sound quite bad. That's why there is latency: it's the time needed to process a block. If you have a lot of plugins, or some plugin that requires more CPU to crunch, then you need a bigger block (aka buffer).

The only question is whether the internals of Logic respect the ordering of queued midi events in the order they are queued. I believe the answer is "yes". I haven't ever found it to be otherwise.


----------



## Peter Schwartz

@Dewdman42

Running status is a scheme provided for in the MIDI spec where, if two events in sequence have the same status (same message type and channel), the status byte doesn't need to be repeated. The next bytes that come down the pipeline are assumed to be data bytes pertaining to the same type of message.

Here are two examples, with status bytes indicated in bold.

A three note chord, C3 + D3 + G3, all on the same MIDI channel. The non-running status version of this would be:

• *NoteOn/ch1*, C3, velocity
• *NoteOn/ch1*, D3, velocity
• *NoteOn/ch1*, G3, velocity

The running status version would be:

• *NoteOn/ch1*, C3, velocity
• D3, velocity
• G3, velocity

For CC messages such as a stream of modwheel messages uninterrupted by any other type of message, the running status version would be:

• *CC/ch1*, number, value
• number, value
• number, value
• number, value....


----------



## Peter Schwartz

In fact, if I'm not mistaken, with CC messages, you could have (say) CC1 and CC23, both on channel 1, intermingled in the same data stream, and the running status version would look something like this:

• *CC/ch1*, number 1, value
• number 23, value
• number 1, value
• number 1, value
• number 23, value
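The byte savings are easy to see in a toy encoder. This is plain JavaScript (not Scripter code, and the message objects are made up for illustration), sketching how a sender omits repeated status bytes:

```javascript
// Toy MIDI encoder illustrating running status: the status byte
// (message type + channel) is emitted only when it changes.
function encodeWithRunningStatus(messages) {
  const bytes = [];
  let lastStatus = null;
  for (const m of messages) {
    // e.g. 0x90 = NoteOn/ch1, 0xB0 = CC/ch1
    if (m.status !== lastStatus) {
      bytes.push(m.status);
      lastStatus = m.status;
    }
    bytes.push(m.data1, m.data2); // note+velocity, or CC number+value
  }
  return bytes;
}

// A three-note chord on channel 1: only the first NoteOn carries 0x90.
const chord = [
  { status: 0x90, data1: 48, data2: 100 }, // C3
  { status: 0x90, data1: 50, data2: 100 }, // D3
  { status: 0x90, data1: 55, data2: 100 }, // G3
];
console.log(encodeWithRunningStatus(chord).length); // 7 bytes instead of 9
```

The same applies to the intermingled CC1/CC23 stream: since every message shares the 0xB0 status, one status byte covers the whole run.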


----------



## babylonwaves

Peter Schwartz said:


> Babylon, yes, processing blocks and all... But unless I'm misunderstanding you, it's not like information is sent in spurts every processing block. The timing is preserved. For all intents and purposes, what I wrote above is accurate enough for general discussion (without getting into all the gory details of processing blocks.)


i'm talking about the block processing of the receiving part, the plug-in.


----------



## Dewdman42

I understand all that, but I didn't get what you were referring to about using velocity=0 for NoteOff not working in some way. I use that trick in Scripter and it seems to work - what am I missing?


----------



## Dewdman42

babylonwaves said:


> i'm talking about the block processing of the receiving part, the plug-in.



Same difference: the events are in a queue and the plugin obtains them from there.

Plugins don't compute in real time. They get the MIDI events one at a time, process each one, taking whatever time it takes, and place the result into a buffer, which gets sent to the sound card later as a block. All internal processing happens in order, but not in real time.


----------



## Peter Schwartz

Here I agree with @Dewdman42, the block processing is a moot point. In practice, things work as we're describing them. And there's no reason to space events apart to achieve simultaneous multiple articulations.


----------



## Peter Schwartz

Dewdman42 said:


> I understand all that, but I didn't get what you are referring to about using velocity=0 for note off not working in some way? I use that trick in scripter and it seems to work, what am I missing?



Oh, just a misunderstanding I think. All that works fine. I was just talking about the idea of using NoteOn messages with a velocity of zero (vis a vis running status) instead of sending explicit NoteOff messages.


----------



## Peter Schwartz

@Dewdman42, you probably found this already, but when you're scripting you can save a little bit of work by not having to explicitly define NoteOff events. You can set a variable to represent a NoteOn and send them with a velocity of zero. The Scripter then takes the liberty of outputting NoteOff events for you.
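A minimal plain-JavaScript sketch of the convention (the `normalize` helper and event shape are hypothetical, not the Scripter API):

```javascript
// Hypothetical normalizer: a NoteOn with velocity 0 is treated as a
// NoteOff, mirroring the convention Scripter honors on output.
function normalize(event) {
  if (event.type === 'NoteOn' && event.velocity === 0) {
    return { type: 'NoteOff', pitch: event.pitch, velocity: 0 };
  }
  return event; // anything else passes through untouched
}

console.log(normalize({ type: 'NoteOn', pitch: 60, velocity: 0 }).type); // 'NoteOff'
console.log(normalize({ type: 'NoteOn', pitch: 60, velocity: 90 }).type); // 'NoteOn'
```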


----------



## Dewdman42

Anyway, back to the Spitfire thing... I'd really like to understand why Articulation Sets are having a problem with UACC CC32.


Peter Schwartz said:


> @Dewdman42, you probably found this already, but when you're scripting you can save a little bit of work by not having to explicitly define NoteOff events. You can set a variable to represent a NoteOn and send them with a velocity of zero. The Scripter then takes the liberty of outputting NoteOff events for you.



Yeah, I know, I've been using that. That's why your running status comment threw me off, but I see now what you meant. Inside the plugins it's probably a moot point whether you send an entire NoteOff or use running status with velocity 0 to reduce bytes... That would matter over a MIDI cable, and might matter slightly over Ethernet to VEPro, but I bet VEPro generally just sends entire MIDI messages in buffers that are bigger than a few bytes anyway, so I doubt there would be much bandwidth savings.

But there would probably be bandwidth savings from using CCs instead of note keyswitches, avoiding the NoteOff altogether.


----------



## Peter Schwartz

Dewdman42 said:


> But there would probably be bandwidth savings by using CC's instead of Note Keyswitches to avoid the Noteoff altogether.


----------



## Dewdman42

I'm working through a problem in a script related to sending a CC message through Kontakt CC automation in order to toggle a certain feature on/off. I'm starting to come to the conclusion that CC automation may be handled on a separate thread from NoteOn data, which means it can't guarantee the ordering we've been talking about, unless I put a bit of delay between the CC and the note.

It occurs to me that it's possible Spitfire might be doing something funny with CCs on a different thread than NoteOns also... not sure, just thinking out loud here...


----------



## studioj

Are you guys finding that if you import tracks with articulation sets configured, the articulation set doesn't hold and the imported track is set to NO articulation set? That has been my experience so far, but I haven't done much testing.


----------



## babylonwaves

Dewdman42 said:


> I'm starting to come to the conclusion that CC automation may be handled on a separate thread from NoteOn data, which means it can't guarantee the ordering we've been talking about, unless I put a bit of delay between the CC and the note.



surprise ...


----------



## yellow hat

Question:
I have midi remote activated in articulation sets in Logic Pro and use iPad and Metagrid to trigger articulations.
It works perfectly.
But...
Why can I still access keyswitches on my main keyboard?
I thought they were bypassed when activating midi remote.
My main keyboard is midi channel 1 and articulations are on midi ch 16 on my iPad.


----------



## Dewdman42

Dewdman42 said:


> I'm working through a problem in a script related to sending a CC message through Kontakt CC automation in order to toggle a certain feature on/off. I'm starting to come to the conclusion that CC automation may be handled on a separate thread from NoteOn data, which means it can't guarantee the ordering we've been talking about, unless I put a bit of delay between the CC and the note.
> 
> It occurs to me that it's possible Spitfire might be doing something funny with CCs on a different thread than NoteOns also... not sure, just thinking out loud here...



On second thought, I was able to get it working with Scripter without using any delays... so I don't think the issue is in Kontakt. It's something inside LPX. I eliminated the problem by avoiding the use of *NeedsTimingInfo* in my script. So somehow internally LPX was perhaps using different queues for CCs and notes, I don't know... but anyway, that must be why some people reported issues with Articulation Sets and Spitfire using CC32.


----------



## Alex Fraser

yellow hat said:


> Question:
> I have midi remote activated in articulation sets in Logic Pro and use iPad and Metagrid to trigger articulations.
> It works perfectly.
> But...
> Why can I still access keyswitches on my main keyboard?
> I thought they were bypassed when activating midi remote.
> My main keyboard is midi channel 1 and articulations are on midi ch 16 on my iPad.


My understanding is that the key switch layers are active when midi remote is on. Have you set the midi channel to 16 for your keyswitches, via the artic map?


----------



## yellow hat

Alex Fraser said:


> My understanding is that the key switch layers are active when midi remote is on. Have you set the midi channel to 16 for your keyswitches, via the artic map?



Ok
Yes midi ch is set to 16 in articulation sets setup.
And on my ipad.
I thought the articulation set took over the main keyboard's switches when I assigned C0-C1 inside the articulation sets.


----------



## ScarletJerry

I don't have the Key Switch tab or MIDI remote button in the Studio Horns Smart Controls pane - just a controls and EQ tab. Am I missing something?

-Scarlet Jerry


----------



## Dewdman42

You need an articulation set active on the track


----------



## ScarletJerry

Yes! That was it. I tried to the find the answer in the Logic docs and it was driving me crazy. Thanks for your help!

-Scarlet Jerry


----------



## jonathanwright

Has anyone found a way to get around Logic not 'picking up' the correct articulation when using MIDI CC switches?

It seems to work okay with regular key switches, but if using MIDI CC, the articulation isn't always updated, despite trying to play though the region a few times.

Have I missed a configuration to stop this happening? It's pretty much negating the point of them for me.


----------



## babylonwaves

jonathanwright said:


> Has anyone found a way to get around Logic not 'picking up' the correct articulation when using MIDI CC switches?


which library are you working with?


----------



## Dewdman42

No, I have been getting weird results also when the CC switch event is scheduled on the same MIDI tick as the note following it: even though they are scheduled in that order, it doesn't seem guaranteed that the CC will arrive first. I don't know who is to blame, LPX or Kontakt. When notes are scheduled in order but on the same tick, they play in that order, so keyswitches work fine. But it's hit or miss with the CC switches.

My theory at the moment is that either Kontakt or LPX processes notes and CCs in separate queues, so there is no guarantee about which will come first when the timestamps are the same. Just thinking out loud, I don't really know the answer. My advice for now is to avoid CC keyswitches whenever possible.
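One way to picture the suspected failure mode, as a plain-JavaScript sketch (purely a model - nobody outside Apple or NI knows how events are actually queued internally): a stable sort by tick preserves the CC-before-note order, while draining notes and CCs from separate per-kind queues destroys it:

```javascript
// Events queued in the order they should be delivered: the CC32 switch
// first, then the note it affects, both stamped on the same tick.
const queue = [
  { tick: 480, kind: 'cc', number: 32, value: 100 },
  { tick: 480, kind: 'noteOn', pitch: 60, velocity: 90 },
];

// Array.prototype.sort is specified as stable in modern JavaScript, so
// sorting by tick alone keeps the CC ahead of its note...
const stable = [...queue].sort((a, b) => a.tick - b.tick);

// ...whereas draining separate per-kind queues (notes first) reorders
// same-tick events -- one hypothesis for the misfires described above.
const separateQueues = [
  ...queue.filter(e => e.kind === 'noteOn'),
  ...queue.filter(e => e.kind === 'cc'),
];

console.log(stable[0].kind);         // 'cc'
console.log(separateQueues[0].kind); // 'noteOn'
```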


----------



## rlw

I don't know where to report bugs in Logic Pro, but an issue that bothers me is that when I use the Import function to bring tracks in from another project, the articulation set is not included. I use Import rather than templates so that I can create a new project using the same track setup. Very disappointing when you import 200 tracks from your last project but none of the articulation sets are there after the import.


----------



## jonathanwright

babylonwaves said:


> which library are you working with?



In this project I’m having the issue with Spitfire Chamber Strings and Cinebrass. Cinebrass has just started playing up with regular keyswitches too.

I guess it must be a Logic problem, as I haven’t experienced it in Cubase.

If I close and reopen the project and play it through, the articulations trigger as they should, but the moment I start moving the playhead to a different location, they begin to trip up.


----------



## rlw

I had to go back to using CC32 for HZ Strings because HZ Strings does not work like Kontakt. Since there is no equivalent of the Kontakt multi, I create an instance in VEP with multiple instances of HZ Strings on different channels. I use the All in One patches on MIDI Channel 1 and use the ARTzID UACC Scripter, which will send CC32 on MIDI Channels 2-16 for the other instances of HZS (but I reserve Channel 16 and avoid putting an HZS instance on it). When I have a vacant slot among my 32 push-button positions, I just set the UACC Scripter ID to Channel 16 so that all the HZ Strings instances will ignore those switch messages. Thus far, with the new version of ARTzID 2.04, I am not seeing issues using the UACC Scripter.


----------



## jonathanwright

Does the new version of ArtzID bypass all of these triggering issues? Do you still use Logic Articulation sets in the same way?


----------



## Peter Schwartz

@jonathanwright , ARTzID V2 doesn't have any of these kinds of issues. No late triggering, no CC problems, etc.

@rlw, I have the feeling you could also use the Combinatrix =16= to drive the HZ Strings.


----------



## jonathanwright

Peter Schwartz said:


> @jonathanwright , ARTzID V2 doesn't have any of these kinds of issues. No late triggering, no CC problems, etc.
> 
> @rlw, I have the feeling you could also use the Combinatrix =16= to drive the HZ Strings.



@Peter Schwartz Good to hear! Would I apply articulations in the same way as using regular Articulation Sets (with your clever voodoo working away in the background)?


----------



## stigc56

I had to give up on Logic - again - for the same reason and go back to Cubase, which is actually working quite nicely right now! Almost all my VEPro6 instances are now capable of handling both the Cubase and the Logic way, so it's not that time consuming.


----------



## jonathanwright

stigc56 said:


> I had to give up on Logic - again - for the same reason and go back to Cubase, which is actually working quite nicely right now! Almost all my VEPro6 instances are now capable of handling both the Cubase and the Logic way, so it's not that time consuming.



Have you gone the ‘VEP Instance per track’ route with Cubase, as recommended with Logic?


----------



## Peter Schwartz

jonathanwright said:


> Good to hear! Would I apply articulations in the same way as using regular Articulation Sets (with your clever voodoo working away in the background)?





Yes, you'll use Articulation Sets because they provide the MIDI Remote --> ID encoding and articulation names. But you won't have to load them separately. They pre-load along with the scripter and smart controls. Everything else is handled by the Scripts, so there are zero issues regarding late switching, or CC's causing articulation changes. The system operates very transparently and operation is effortless. You won't even know it's there.


----------



## babylonwaves

jonathanwright said:


> In this project I’m having the issue with Spitfire Chamber Strings


use UACC KS for spitfire instead.


----------



## Peter Schwartz

On the subject of using 1 VEPro Instance per instrument as compared to minimizing the number of Instances and packing them full of patches (to be used multi-timbrally)... 

There are practical as well as technical reasons for using both approaches. It is a mistake, I think, to believe that anyone has to exclusively use one approach or the other. There exists no technical _need_ to minimize the number of Instances. This is contrary to what the Vienna developers recommend, but here's a case where I don't think the developers realize just how capable their product is with regard to using multiple instances. 

And I'm not saying this from an armchair perspective... I've configured many orchestral systems using tons of Instances without a problem. Jay Asher, George Leger, and other studio techs I've discussed this with have had the same good experiences setting things up this way for their clients.

All in all, the upsides of taking a one-instrument/Instance per track approach outweigh the downsides. But there are also great reasons to use multi-timbral (packed) Instances... For example, let's say you have a bunch of celesta patches from different libraries. Here it would be silly to have them each live in their own Instance. They can all live in one Instance, and you can switch between them quite easily on one track using Logic-native methods or a 3rd party system like mine.


----------



## stigc56

I have tried to follow one of the many pieces of advice given here, to max out the channel numbers - 16 per instance - and that is working okay for Cubase too. The VSL Synchron Strings can run on only one port, and that goes for all the VSL libraries. The VSL woodwinds have to use two instances in Logic, but you can load a Cubase instance with all the woodwinds AND then create 4 instances, one per instrument section, at the same time, because you are only loading the samples once! I load a template on my 2 slaves - one rather small with all my percussion, and the other quite big with all my strings - and it works okay. A 384-sample buffer is the minimum; I can live with that.

But I still don't like the Steinberg GUI - sometimes it drives me crazy looking at it all day. I must admit the MIDI editor is better, though. Furthermore, I have switched to using Attribute mode, and it is faster than the Direction system. But Cubase is still so buggy, and quite a lot doesn't work properly. Editing the tempo is not that easy, MIDI ports still don't load at start-up, and sometimes I get the spinning ball and don't know the reason, so it's a lot of COMMAND+S here! I don't know ;-(


----------



## WindcryMusic

Peter Schwartz said:


> @jonathanwright , ARTzID V2 doesn't have any of these kinds of issues. No late triggering, no CC problems, etc.



Alas, I must report that I've had problems with certain notes in Spitfire Symphonic Orchestra sometimes not playing in ARTzID v2 (latest version of the UACC Mapper) when using CC32 for both input and output and two notes with different articulations are playing close enough to the same time. I've been able to get it to work reliably by creating more horizontal space between the note starts (like > 40 ticks). I've only seen this occur thus far when one of the two selected articulations is a performance legato, but I can't say for sure if it is limited to that patch's involvement, nor if it is limited to CC input or even that mapper (haven't had the time to spend on a deep investigation yet).


----------



## Peter Schwartz

@WindcryMusic 

If you're having problems getting articulations to switch successfully in quick succession in Spitfire patches when _not _using ARTzID, the likely culprits are:

• trying to get this to happen in a legato patch (legato transitions sometimes take up more time than you might expect)
• trying to get two notes of the same pitch to play with different articulations
• not having the CC events positioned correctly with respect to the notes

Come to think of it, the above would apply to most libraries, though some allow simultaneous unison notes to play with different articulations.


----------



## WindcryMusic

Peter Schwartz said:


> @WindcryMusic
> 
> If you're having problems getting articulations to switch successfully in quick succession in Spitfire patches when _not _using ARTzID, the likely culprits are:
> 
> • trying to get this to happen in a legato patch (legato transitions sometimes take up more time than you might expect)
> • trying to get two notes of the same pitch to play with different articulations
> • not having the CC events positioned correctly with respect to the notes
> 
> Come to think of it, the above would apply to most libraries, though some allow simultaneous unison notes to play with different articulations.



To clarify, I was seeing this behavior *with* ARTzID v2.


----------



## Dewdman42

-


WindcryMusic said:


> Alas, I must report that I've had problems with certain notes in Spitfire Symphonic Orchestra sometimes not playing in ARTzID v2 (latest version of the UACC Mapper) when using CC32 for both input and output and two notes with different articulations are playing close enough to the same time. I've been able to get it to work reliably by creating more horizontal space between the note starts (like > 40 ticks).



This has been my experience also - not with ARTzID, to be clear. In my case I was working with KH Spotlight Solo Strings. When I have CC switches scheduled via Scripter to play before the notes they are supposed to affect, they mostly work, but if I have a chord with a different articulation on each note... somehow, by the time it gets to Kontakt, the CC isn't always there in time. NoteOn keyswitches always seem to arrive in time.

In my case, the situation is clouded a bit by the fact that I'm using CC automation in Kontakt to receive the CC and change something in the instrument, rather than a CC switch programmed directly in the instrument itself. Not sure if that matters, but it might. When I use a separate plugin to log what the script is outputting, before it hits Kontakt, it says everything is going out in the correct order. Yet sometimes Kontakt doesn't get the CC in time in chordal situations like the one described above. So I'm actually leaning more towards Kontakt being the culprit. But you never know...

Some people have reported this problem with Articulation Set keyswitching (which I am not using), and I can report that I haven't been able to figure out a way around it in Scripter (other than programming some lag time between the CC and the actual note). WindcryMusic seems to be having this problem with ARTzID also, FWIW.


----------



## Peter Schwartz

OK then... at this point it's not gonna serve anyone to try and discuss using and not using my system in the same thread. Windcry, send me an email with the specifics of your problem and let's see where the culprit lies.


----------



## WindcryMusic

Peter Schwartz said:


> OK then... at this point it's not gonna serve anyone to try and discuss using and not using my system in the same thread. Windcry, send me an email with the specifics of your problem and let's see where the culprit lies.



I'll first (as soon as I have a little time) try to set up a similar test using UACC KS, to see if the problem is specific to CC output or not. And I am very much aware that the likelihood is that this is a Spitfire problem rather than an ARTzID problem.


----------



## Dewdman42

I'll add a bit more info. I notice in my case that the problem seems to come up even more specifically when there is a chord with multiple articulations and some of the notes are using a particular CC event switch and some of the other notes are using the same CC# but with a different value.

So let's say you have a chord with 4 notes, and from bottom to top each note has a different articulation. The bottom note sends a cc_switch with CC32, value=100. A note higher up the chord sends a cc_switch with CC32=0. Somehow, when the actual notes are performed by Kontakt, CC32=0 wins.
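That outcome is exactly what you'd expect from a model where all same-tick CCs are applied before any same-tick notes. A plain-JavaScript sketch of that hypothesis (a model only, not Kontakt's actual internals):

```javascript
// Model of "CCs flushed before notes" at a single tick. cc32 is a
// single register, so the last CC write wins for every note.
function renderTick(events) {
  let cc32 = null;
  const played = [];
  // Pass 1: apply every CC at this tick (hypothesized behavior).
  for (const e of events.filter(e => e.kind === 'cc32')) cc32 = e.value;
  // Pass 2: sound the notes, each reading the final CC32 value.
  for (const e of events.filter(e => e.kind === 'note')) {
    played.push({ pitch: e.pitch, articulation: cc32 });
  }
  return played;
}

// Bottom note wants CC32=100, a higher note wants CC32=0 -- interleaved
// correctly in the stream, but both notes end up hearing CC32=0.
const tick = [
  { kind: 'cc32', value: 100 },
  { kind: 'note', pitch: 48 },
  { kind: 'cc32', value: 0 },
  { kind: 'note', pitch: 60 },
];
console.log(renderTick(tick)); // both notes come out with articulation 0
```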


----------



## Peter Schwartz

By my read of what "UACC KS" means, a palette will be on the lookout for keyswitch notes with velocities that match the UACC values normally associated with those articulations. But in practice I've found that with certain Spitfire libraries, UACC KS really means "UACC or KS": a palette will switch in response to CC#32 values _or_ keyswitch notes with velocities that match those values. 

This leads me to think that testing a Spitfire palette's performance in response to articulation-switching messages needs to be approached by testing each specific configuration:

• when palettes are locked to nothing (so that they act like normal keyswitching patches)
• when they're locked to UACC only
• when they're locked to UACC+KS

There's also an option to switch articulations using program change messages, though I haven't been able to get that to work.

@WindcryMusic, you might be able to cut to the chase (and avoid testing anything) by turning off the Data Reduction feature on the Script. Doing so ensures that an articulation-switching message will be sent at the start of every musical note -- as opposed to the default setting, where switching messages are only sent on an as-needed basis. The idea behind "as needed" is to stop the Script from "spamming" a patch with normally unnecessary, repetitive switching messages. But in practice I think you'll find that turning it off won't cause any timing issues as a result of all those extra events being sent; at the same time, if a palette is 'misfiring' for some reason, you can be assured that the proper articulation is being selected at the start of every single note.
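The two modes can be sketched in plain JavaScript (a guess at the behavior described, not ARTzID's actual code):

```javascript
// Emit articulation-switch messages for a sequence of notes.
// dataReduction=true: a switch only when the articulation changes.
// dataReduction=false: a switch before every single note.
function emitSwitches(notes, dataReduction) {
  const out = [];
  let current = null;
  for (const n of notes) {
    if (!dataReduction || n.art !== current) {
      out.push({ kind: 'switch', art: n.art });
      current = n.art;
    }
    out.push({ kind: 'note', pitch: n.pitch });
  }
  return out;
}

const phrase = [
  { pitch: 60, art: 'legato' },
  { pitch: 62, art: 'legato' },
  { pitch: 64, art: 'staccato' },
];
// Reduced: 2 switches; unreduced: 3 switches (one per note).
console.log(emitSwitches(phrase, true).filter(e => e.kind === 'switch').length);  // 2
console.log(emitSwitches(phrase, false).filter(e => e.kind === 'switch').length); // 3
```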


----------



## WindcryMusic

Peter Schwartz said:


> @WindcryMusic, you might be able to cut to the chase (and avoid testing anything) by turning off the Data Reduction feature on the Script. Doing so ensures that an articulation-switching message will be set at the start of every musical note -- as opposed to the default setting, where switching messages are only sent on an as-needed basis. The idea behind "as needed" is to stop the Script from "spamming" a patch with normally unnecessary, repetitive switching messages. But in practice I think you'll find that turning it off won't cause any timing issues as a result of all those extra events being sent; at the same time, if a palette is 'misfiring' for some reason, you can be assured that the proper articulation is being selected at the start of every single note.



That sounds like something that should be easy to try ... I'll see if I can do that test later on tonight.


----------



## Peter Schwartz

We like "easy".


----------



## Dewdman42

Try to replicate the scenario I outlined above...


----------



## Peter Schwartz

@Dewdman42, in the scenario you described, I wasn't sure what you meant by "keyswitch CC32, value=100..."

Are you talking about sending both a keyswitch & and CC32 message (followed by the actual musical note)? Or just one or the other?


----------



## Dewdman42

CC32 refers to controller # 32. When you send a CC message for CC32, it also has a value from 0-127.



Code:


var cc = new ControlChange;  // Scripter ControlChange event object
cc.number = 32;              // controller number (CC#32, i.e. UACC)
cc.value = 100;              // controller value, 0-127
cc.send();                   // output the event


Just trying to get to the bottom of what is going on, in a general sense... I think there may be a race condition in kontakt or perhaps in the spitfire instrument.

I don't own spitfire so I'm not that familiar with it. In a more general sense, a keyswitch or cc switch can go in front of a note, and the switch event can be a CC event, which has both a number and a value.

Some background: I have been working with KH Spotlight Solo Strings. It turns out KHSSS has one very important keyswitch for staccato, which is a toggle. Meaning, you turn it on and leave it on, or you turn it off and leave it off, and it functions in concert with the other keyswitches. The trouble is that KH chose to use a non-CC keyswitch for this, which means you send the same keyswitch to turn it on and to turn it off. The problem with THAT is that Scripter doesn't know what state it's in... so if an articulation ID needs staccato turned on, you don't really know in the script whether it's already on or not. So I chose to set something up using CC automation, because by automating the actual instrument control I can send a CC with a value of 0 to turn it off and a value of 127 to turn it on. This requires mapping a CC automation assignment to that control ahead of time, but then the script generally works great...

except for the chord situation outlined above, which is very similar to what WindCry is reporting...so it might all be relevant to all of us..not sure yet..

As noted above, when I create a chord with several different articulations in it... if a note further down the chord uses the CC# with some value n for a cc_switch, and then further up the chord (that seems to be the order LPX processes the notes in the region) there is another note with a different articulation that sends a cc_switch using the same CC# but with a different value than the one below... then the value from the topmost note is the one that seems to win.

When I watch a logger, even outside of LPX altogether, I can see that the CC events are sent in exactly the right order with the notes: each one has a keyswitch or cc_switch right in front of the note it's supposed to affect. Yet over in Kontakt, in the above scenario, what seems to happen is that all of the CC switches for the chord get processed before Kontakt actually processes the notes... so the last value for that CC# is the one that wins and affects both notes. Or something along those lines.

My feeling at the moment is that Kontakt processes all the CC events first, then the notes (for any given MIDI tick). This may or may not be unique to my case, where I'm using CC automation as opposed to the instrument itself processing the CCs. And who knows what Spitfire does inside their own instrument and Kontakt scripting... so it may be similar with Spitfire, I'm not sure. But it would be interesting to hear what happens with the test I outlined above, both with and without ARTzID.


----------



## Peter Schwartz

Ah, OK. It's a nomenclature issue. "Keyswitch" refers exclusively to a MIDI note that's used as an articulation-switching instruction. That's why I was confused by what you meant by "keyswitch CC32".

CC#32 messages are their own thing and not "keyswitches", just as program change messages used as articulation-switching instructions aren't keyswitches. They're their own thing too.

For general information now...

Making matters more confusing (or not, depending on how you look at it LOL) is the "Switches" section of the Articulation Set configuration. Here, what Logic is calling a "switch" is really nothing of the kind. The events you program there don't switch nuthin'. Rather, "switches" are MIDI events of any definition you choose, used to set the Articulation IDs for all subsequently played notes, CCs, etc.

Setting an ID is, to be fair, a kind of switching action. You're using MIDI events to change ("switch") the ID value. Still, it's kind of a poor choice of term given all the other implementations of "switch" in the jargon we use to talk about all this stuff.

It's only in the Output section of an Articulation Set where, shall we say, "articulation switching messages" are generated, not the Switches section. But even there, no switching occurs. Switching only ever occurs in a patch.


----------



## Dewdman42

Sorry if I confused you. I do try to name things the way you have just explained, but I may have missed something, since it can become laborious to write everything out exactly right every time. I think some people use "keyswitching" in a more general sense that covers CC switches too... but fine, I will correct my post above to avoid future confusion.

Do you understand the test now?


----------



## bpford

Dewdman42 said:


> CC32 refers to controller # 32. When you send a CC message for CC32, it also has a value from 0-127.
> 
> 
> Some background: I have been working with KH Spotlight Solo Strings. It turns out KHSSS has one very important keyswitch for staccato, which is a toggle. Meaning, you turn it on and leave it on, or you turn it off and leave it off, and it functions in concert with the other keyswitches. The trouble is that KH chose to use a non-CC keyswitch for this, which means you send the same keyswitch to turn it on and to turn it off. The problem with THAT is that Scripter doesn't know what state it's in... so if an articulation ID needs staccato turned on, you don't really know in the script whether it's already on or not.



I'm not really sure what the Logic Scripter rules are, but I had (am having) a similar issue with toggling group visibility in Logic and mapping that to Lemur - in the sense that when you send the "toggle group hide" key command to Logic, it doesn't know the current visibility state of the group and just switches it. The workaround in Lemur was to create a global variable that memorizes the current state of each group, then use a conditional to check its status before sending a command, and then update the global to reflect the new state.

Couldn't you do this in Scripter? When you trigger the toggle, it checks the status of the global variable and then sets it to its new state?


----------



## Dewdman42

Yes, sure, that was the first thing I tried, but if the user changes the control from the GUI, then Scripter is out of sync again. The only way to know for sure is to set the control explicitly to 0 or 127.

Another approach I may take later is to explicitly turn the control off once when the user presses PLAY, and then use the keyswitch after that to toggle it on and off, using a variable to keep track as you suggested. As long as the user doesn't change the GUI during playback, it should work fine, and by using note keyswitches rather than CC automation I'll perhaps avoid the problem we're talking about.
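The toggle-versus-explicit-set distinction can be sketched in plain JavaScript (a hypothetical instrument model, not the KH patch):

```javascript
// Hypothetical instrument with a staccato control.
const instrument = { staccato: false };

// Toggle-style keyswitch: the sender can't know the resulting state,
// because the user may have flipped the control in the GUI meanwhile.
function toggleStaccato() {
  instrument.staccato = !instrument.staccato;
}

// Explicit CC-style set: the resulting state is known regardless of
// what the GUI did -- which is why CC automation sidesteps the problem.
function setStaccato(value) {
  instrument.staccato = value >= 64; // 0 = off, 127 = on
}

instrument.staccato = true; // user clicked the GUI behind our back
toggleStaccato();           // intended "on", but actually turned it OFF
console.log(instrument.staccato); // false

setStaccato(127);           // deterministic: always ends up on
console.log(instrument.staccato); // true
```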

Be that as it may, the issue I have come across may also be related to the Spitfire issue, and to why they have recommended that people use the KS version.


----------



## Peter Schwartz

Yup, got it now, thanks.


----------



## WindcryMusic

Peter Schwartz said:


> @WindcryMusic, you might be able to cut to the chase (and avoid testing anything) by turning off the Data Reduction feature on the Script. Doing so ensures that an articulation-switching message will be sent at the start of every musical note -- as opposed to the default setting, where switching messages are only sent on an as-needed basis. The idea behind "as needed" is to stop the Script from "spamming" a patch with normally unnecessary, repetitive switching messages. But in practice I think you'll find that turning it off won't cause any timing issues as a result of all those extra events being sent; at the same time, if a palette is 'misfiring' for some reason, you can be assured that the proper articulation is being selected at the start of every single note.





WindcryMusic said:


> That sounds like something that should be easy to try ... I'll see if I can do that test later on tonight.



Update: I just tried it. Recorded something where I could reproduce the misbehavior, then turned off Data Reduction. No effect ... the note drop-outs still happen in the same way. Again, the effect I'm seeing is not that certain notes play with the wrong articulation ... rather, certain notes don't sound AT ALL. The key still shows as having been pressed in Kontakt's keyboard pane, but there's nothing audible from that note. In this test, the offending notes were separated by 12 ticks, with the performance legato note played later than the other articulation (a simple long sustain). If I switch things around so that the performance legato note is sounded 12 ticks BEFORE the other articulation, then both articulations play back fine. I duplicated this same behavior in a couple of sections. In each case, if I slide the trailing legato note back to about 40 to 50 ticks after the other articulation, it seems to sound 100% of the time.

This feels to me like it could be an issue with the Spitfire performance legato not handling being switched off and then back on as a result of an intervening note using a different articulation, unless enough time is allowed for something in Spitfire's script to "reset". (I'm assuming it is Spitfire's scripting that is at fault at this point since, if it was an issue with ARTzID, one would expect it to happen regardless of which articulation was selected to play first.)


----------



## Peter Schwartz

@WindcryMusic, that's curious behavior indeed. But I agree with your conclusion, that it's behavior specific to the patch. Which library are you testing that performance legato with?

@Dewdman42, I'm of the same mind regarding the KH patch you've been working on the Script for -- I think what you're seeing is behavior specific to the programming of the patch.

This kind of behavior isn't entirely unusual, either. Not _common_, but still, not unusual. For example, I've run any number of polyphonic articulation-switching torture tests with Vienna Instruments, various EastWest and Kontakt-based libraries (including Cinematic Strings 2 & Berlin Strings) and they all trigger articulations flawlessly.

On the flip side, Cinematic Studio Strings has trouble when you try to get certain combinations of articulations to sound simultaneously. Then again, this is somewhat understandable behavior, such as when trying to get legato (monophonic) and staccato (polyphonic) articulations to sound simultaneously.

Furthermore, I've found that some EWQL keyswitching patches which feature "legato" articulations (Q-Leg, Exp-Leg, etc.) require 35 milliseconds of space between the keyswitch note and the musical note intended to sound with those articulations.

So regardless of whether someone's trying to roll their own articulation sets or they're using my system, I think it's fair to say that there will always be some exceptions to how well a particular patch/palette will respond. Fortunately, the exceptions are few.

One of the ways to work around these problem children is to address those sounds by not using articulation-switching at all: load the individual articulations and set them up to respond on a different MIDI channel from the palette/keyswitching patch.


----------



## WindcryMusic

Peter Schwartz said:


> @WindcryMusic, that's curious behavior indeed. But I agree with your conclusion, that it's behavior specific to the patch. Which library are you testing that performance legato with?



Spitfire Symphonic Strings, Violins 1 Performance Legato. But I'm guessing it is likely to be common to all of the performance legato patches from the Symphonic line.

For my part, I've decided for the moment to just go ahead and combine the performance legatos with the other articulations, regardless of this issue. If I don't do so, it adds around 30 Kontakt instances to my template and increases the template's RAM footprint (even completely purged of samples) by around 2 GB, and memory space is at a premium for me right now because I don't have $9000 for an iMac Pro (tear runs down cheek) and my current iMac is maxed out at 32 GB.

I'm choosing to look at it as follows: in almost all cases with any given orchestral section, I'll only be using a single legato articulation for a stretch by itself and then switching to other articulations, rather than doubling the two up like this. In any rare cases where I really do have to overlap legato and other articulations, at that point I'll just duplicate the track and move the legato regions to a track of their own, rather than waste all of that memory (and screen space!) to always separate the legatos for every section just to protect against a problem that might only come up for a single section in one cue out of ten.


----------



## ptram

Maybe I've found a solution to Logic's Articulation 'jump-back'. To summarize the problem again: after a note selects the assigned articulation, Logic immediately and automatically selects articulation ID1. It doesn't remain on the selected articulation up to the end of the playing note, as expected.

I own two types of libraries. The first one (like VSL) seems to be able to work even with this issue. A note selects a slot/articulation, and VSL's player continues to play that articulation even if ID1's slot has been automatically selected in the meantime. (I've yet, however, to check what happens when switching or crossfading slots inside the intended slot.)

The other type (like Tarilonte's Mystica or Soundiron's Olympus choir) can't work with this bug. Mystica (and other Tarilonte libraries) can change sound even while a note is playing: let the note select the vowel 'Eeh', and Logic will suddenly switch it to 'Aah'. Soundiron's libraries would let you select the ending phoneme of a syllable, but with Logic automatically choosing ID1 you will always end up with only the choices available for the 'Aah' syllable.

What I've tried with Mystica is this: create a full list of articulations in Logic. The last one will be a 'Blank' articulation, with no event assigned. Until Apple solves the bug, you assign ID100 to the 'Aah' vowel (the first articulation in the list), and ID1 to the 'Blank' dummy articulation. Replacing articulations is very easy in Logic (just select them with Edit > Select > Similar Articulations, and assign them a different articulation in the Info box), so swapping them is not a major issue.

When Logic automatically selects ID1, it selects the 'Blank' dummy articulation, and nothing will change in Mystica. The ladies will continue singing their phoneme as intended.

Paolo


----------



## Dewdman42

Yes, but what I would do is use id=254 on the first line of the articulation set. That is the largest id allowed. Just make it a habit to always put a dummy articulation on the first line with id=254. Then you can number the rest of them starting at id=1 on the second line. Make sure dummy articulation 254 doesn't have any output switches. Unless and until Apple fixes this problem then that is the working model on ALL of my articulation sets.


----------



## ptram

As far as I've seen, Logic is not going back to the first articulation in the list, but to articulation ID1. So you can have ID254 in the first line, and Logic will continue to look for ID1, wherever it is.

I prefer to have the most basic articulation as my first choice (Sustain, or Aah). When the bug is solved, I want it as the default articulation, the one that will be used the most.

Paolo


----------



## Dewdman42

No, sorry that's not correct. The first one on the list is the one that is defaulted, regardless of what ID it is.

This defaulting behavior happens both coming in and going out. For example, if you play in some parts and don't use any input keyswitches, then all the incoming events will get encoded to the region with the articulation ID of the first one on the list. There is no way to record incoming MIDI to a region without assigning one of the articulations from the set, so if you don't choose one, the first one will be used: first on the list, regardless of what ID# it is.

Also, there is a control at the top of the plugin window, and by default it will show the first articulation on the list (regardless of the actual ID#). Changing that field is supposed to be the same as if you had used an input keyswitch; subsequent incoming notes will then be recorded to the region with that artid number. If you use input keyswitches you can see they update that GUI control too. That field is essentially linked to the input switch mechanism.

On output, something similar happens for events in the region that don't have any artid assigned to them, they will also attempt to send the output keyswitch for the first articulation on the list just as if they had been encoded with it. Again, regardless of what the actual ID# is.


----------



## Dewdman42

IMHO, what LPX *should* do that would be better:

First, events in the region without an artid assigned to them should not ever send a keyswitch out by default. If the user wants keyswitches sent, they should assign articulation id's to events to cause that to happen. Unassigned events should send nothing extra.

In essence that means that events in the region without any artid assigned to them will just continue to play whatever sound is currently in the instrument, no switching would occur.

_(Note: the only way currently to avoid the above problem is to make sure the first articulation on the list is a dummy articulation with absolutely no output switches.)_

Secondly, while recording to a region, by default events should be encoded without any articulation id, unless the user selects one via input keyswitch or the plugin window control. The input switch feature has several modes that can determine whether a selected articulation will last permanently, or one-time momentary switch, or a toggle, etc. This is all fancy behavior that can be programmed so that you can use input switches to determine how incoming midi events will be encoded as they are recorded to the region. That's all fine, but the user should be able to also record events without any id and that should be the default.

It can become particularly problematic when notes and CCs are recorded at the same time, and later you go back to change the articulation IDs of the notes. During playback you then end up with keyswitches flipping the instrument rapidly back and forth between the articulations you want for the notes and the articulations that were automatically encoded onto the CC events when you recorded them. You really want the notes and the CCs in the same span of time to have the same matching artid; or, if you can record events without an ID assigned initially, then those unassigned CCs should not send any conflicting keyswitches. This is all "should" behavior; right now the behavior is totally wonky.

One workaround is to always record your MIDI to the region with the articulation set turned off. Then by default events will be recorded without a default artid (YAY!), and you can then set the artid on just the notes you want keyswitches for. It will all be fine, but only if you put a dummy articulation on the first line so that the default output behavior for all the unassigned events is to not send any keyswitches out.


----------



## ptram

Dewdman42, it works in a different way here, and I don't know why it is different.

In the first video, you can see the 'Aah' articulation, the first in the list, being ID1. It is the one automatically selected by Logic:

http://www.studio-magazine.com/video/sw/logic/Logic_Articulation_jump_back.mov

In this other video the 'Aah' articulation has been moved to ID100, while still being the first in the list. Logic seems to still automatically select ID1, which I assigned to the empty 'Blank' articulation. As a result, the articulation that was playing is not interrupted by the one associated with ID1:

http://www.studio-magazine.com/video/sw/logic/Logic_Blank_ArtID.mov

What's happening?

Paolo


----------



## Dewdman42

I would need to see your articulation set completely, and also your complete event list (not just notes), to see fully what is going on in your case. You must have some events with ID1 saved in the region somewhere; it's not used by default unless that is the first articulation on the list.


----------



## ptram

I removed everything but the segment of track where I experimented with articulation changes. Would you mind giving it a look?

Paolo


----------



## resound

ptram said:


> I removed everything, but the segment of track where I experimented with articulation change. Do you mind to give it a look?
> 
> Paolo


All of your modulation/expression data is set to articulation 1 (Blank) so every time a new modulation/expression value happens, it sends the key switch for articulation 1. That would explain why it keeps jumping back to the "Ahh" articulation when you had that set to articulation 1.


----------



## Dewdman42

So take a look at the event list:

Notice how all your recorded CC events have the articulation ID set to "Blank"?
_(You have to use the event list "View" menu to show the articulation column.)_

That is not playback "defaulting". It's because when you originally recorded the track, with both notes and CCs, you had the ID1 articulation enabled: either it was the first one on the list at the time you made the region, or you had it selected via keyswitch or the plugin window menu. That's what was enabled when you recorded the track, and it's burned into those CC events.

So now, during that section of music, you have for example an "Ahh" note with the "Ahh" articulation, but it's surrounded by CC events with the "Blank" articulation. In this case it's not so terrible, because you actually want the Blank articulation attached to the CCs to avoid the scenario I mentioned earlier. But it has nothing to do with the fact that it's ID1; it has to do with the fact that ID1 was selected at the time you recorded the track.

If you want to try a test to start over....

1. Delete the region.
2. Create a new empty region.
3. Make sure the plugin window shows "Ahh" for the articulation control at the top.
4. With your current articulation set (where the first item is ID100) enabled, record some notes and CC events.
5. Look at the event list and you will see they are all encoded with the "Ahh" articulation, ID100.
6. Now change some of the notes to, say, articulation 5 and watch what happens... wonky stuff...


----------



## Dewdman42

But in the above, if you ever have any events with "-" (undefined) artid, then with the articulation set you provided here, those events will play back the "Ahh" articulation, including the output keyswitch for it. That includes both notes and CCs that might end up in your region without any artid assigned.


----------



## ptram

I see. Thank you both for pointing me in the right direction. I had the Articulation column hidden in the List pane, and didn't even think that events other than notes could have an ArtID attached.

And I can confirm that the first ArtID in the list is selected. Even if the '--' empty articulation is assigned to the events.

Is there some reason to have ArtIDs attached to Control Changes, and is there any creative way to use them?

Paolo


----------



## procreative

I am not sure how you are programming your Sets, but in my tests (so far) whatever "rogue" CC data is recorded with Art IDs does not affect playback. Sure, agreed, the GUI jumps around, but the sounds are what I recorded.

I have tried three ways:

*1. Record notes, art id changes and modwheel moves "live" (not easy to do though)*
Result: Notes and CC share the same articulation and whenever notes change articulation the CC data at that point when recorded together assumes the same articulation ID. Playback unaffected.

*2. Record notes, art id changes and then modwheel moves on separate pass*
Result: CC assumes ID of whatever is preselected (for example by dropdown in Kontakt window) and stays at that throughout unless you change the articulation in this menu or via a remote, while recording cc. Playback unaffected.

*3. Record notes, art id changes and then draw in cc data*
Result: CC has no ID attached just shows –. Playback unaffected.

In my case it has no bearing, as I am using automation to control CC data via a MIDI FX script: I don't record MIDI CC, I record automation, which is converted into CC data before it hits the VI.

But so far I have not heard anything play back incorrectly. I tested the above with Emotional Cello which uses note based keyswitches and Sable/SCS which uses UACC. Both playback fine here.


----------



## Dewdman42

ptram said:


> Is there some reasons to have ArtID attached to Control Changes, and is there any creative way to use them?
> Paolo



Sure, there might be, but generally most people do NOT need them, and they cause more headaches than they're worth. It's probably something else Apple should change in the feature, or at least make configurable, so that only notes are encoded with an articulation ID while recording. But I think their thinking was that most people would record a given section of music, providing both notes and input switches in real time, so if you are capable of playing like that, you would end up with the desired artids encoded into both notes and CCs in a way that makes sense and requires no further editing. Their input switch features are much more sophisticated than the output switches, and it appears their thinking was oriented toward a lot of flexibility in the way you input keyswitches as you play, burning the result into MIDI events as a final performed result.

Personally I think most people play their parts in using input keyswitches more minimally...and then do a lot of editing after the fact to setup all the notes up exactly how they want them played back with various articulations.

So as I was saying before, if you disable the articulation set entirely while recording, then your CC's and notes will not be encoded with anything and only the notes you want to have a specific articulation, you can assign manually in the piano roll after enabling the articulation set. Make sure the first articulation is the blank one, so that all the unassigned notes and CC's will not send any keyswitches out, then you should be golden.

Or....

If you like to use some minimal input switching as you play, then just be aware that CCs are going to be encoded too, and you'll have to either clean them up by removing their articulation ID, make sure they match the notes around them, or use a custom script to do your output switches and ignore CCs with an artid.
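That last option might look something like the sketch below: send an output keyswitch only for notes, and pass CC events through untouched regardless of any artid they carry. The artid-to-keyswitch table and pitches are invented for illustration, and the output array is a stand-in; real Scripter code would build NoteOn objects and call their send methods from inside HandleMIDI().

```javascript
// Sketch of a custom output-switching pass that ignores artids on CCs.
// keyswitchFor maps artid -> hypothetical keyswitch pitch; 'out' stands
// in for the MIDI output stream in this sketch.

var keyswitchFor = { 1: 24, 2: 25, 3: 26 }; // hypothetical mapping
var out = [];

function handle(ev) {
  // ev: { type: 'NoteOn' | 'ControlChange', articulationID, pitch, ... }
  if (ev.type === 'NoteOn' && keyswitchFor[ev.articulationID] !== undefined) {
    // Send the keyswitch just before the note it applies to.
    out.push({ type: 'NoteOn', pitch: keyswitchFor[ev.articulationID] });
  }
  out.push(ev); // pass the original event through; CC artids trigger nothing
}
```

The point is simply that only NoteOn events can cause a switch, so "rogue" artids burned into recorded CC data can't flip the instrument back and forth.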


----------



## procreative

I am just not sure what the issue is here. Maybe you have a different set up to me, but I have had no playback issues so far. The CCs might have IDs attached, but only the notes with IDs actually trigger anything.

The CCs only record IDs if they are recorded from a hardware input; if they are drawn in, no ID is attached.

CCs have IDs attached from hardware moves, e.g. the modwheel. The ID attached is either the last one selected in the Kontakt window or via a remote, OR, if you are changing IDs live while playing notes/moving the modwheel, they follow the ID of the notes.

The only time CCs have ID1 is if you selected ID1 before moving the modwheel, OR if you had nothing selected, since nothing = ID1 [or whatever the first slot ID is numbered to].

It's possible there are libraries that go bananas with this, but I have tried note-based, UACC and MIDI channel types, and so far none have played back with incorrect articulations.


----------



## Dewdman42

procreative said:


> I am just not sure what the issue is here. Maybe you have a different set up to me, but I have had no playback issues so far. The CCs might have IDs attached, but only the notes with IDs actually trigger anything.



Well, that can be the case with custom scripting such as ARTzID or roll-your-own, but not so with the Articulation Set output switches; those send switches on CC events too.



> The CCs only record IDs if they are recorded with a hardware input, if they are drawn in no ID is attached.


correct. Which is another reason to have a "blank" articulation at the top of the list.



> CCs have IDs attached with hardware moves eg modwheel. But the ID that is attached is either the last one selected in the Kontakt window or remote OR if you are changing IDs live while playing notes/moving the modwheel they follow the ID of the notes.



correct, UNLESS you opt to turn off the articulation set while recording mod wheel moves.


----------



## procreative

Dewdman42 said:


> well that can be the case with custom Scripting such as artzid or roll your own, but not so with the articulation set output switches..that sends switches on cc events too.



Well, my comments above are not using ARTzID; they are using the stock Articulation Sets. While the GUI indicates switching visually, I am not finding notes changing articulation as a result.



Dewdman42 said:


> correct, UNLESS you opt to turn off the articulation set while recording mod wheel moves.



Well, IF this were an issue (I cannot comment, as I've not had one yet), maybe the solution, other than your suggestion (which is certainly an option), would be for Apple to add an optional key command to toggle Articulation Sets on/off, or an option to bypass CCs?

Another workaround is to record everything, then deselect Notes in the Event Editor, select all CC events with Cmd-A and change their ID to –.

But I have just tried again using Symphony Series Strings and also plays back fine.


----------



## Dewdman42

procreative said:


> Well my comments above are not using ArtzID, they are using the stock Articulation Sets. While the GUI indicates switching visually I am not finding notes changing articulation as a result.


If you observe what is happening with a MIDI monitor, you can see that CCs with artids do in fact send the corresponding keyswitch from the articulation set's output section. The reason your notes still sound OK is that when the next note comes through, another keyswitch is sent to switch the instrument back again. As you have pointed out, it seems to sound fine for you, but there is a lot of needless keyswitching happening, which doesn't seem to bother you but does bother some others, including myself. And in some cases, such as the other recent poster's, it could cause problems with the instrument as well. For example, if you are using a Program Change to actually change presets in the instrument, then the CC switches would change the sound currently playing. There could be other situations where it gets wonky too.



> Well IF this were an issue (cannot comment as not had one yet), maybe the solution other than your suggestion which is certainly an option, would be if Apple made an optional key command to toggle Articulation Sets on/off or made an option to bypass CCs?


Yep, I agree. Or just have it so that turning off the orange MIDI REMOTE button disables the defaulting entirely; right now, even with that orange button in the off position, you still get an artid forced onto every event. Or at least give us an option to set the articulation control in the plugin window to "-".



> Another workaround is to record everything, then deselect Notes in the Event Editor and select all CC events with cmd A and change their ID to -


definitely.


----------



## procreative

Dewdman42 said:


> If you observe what is happening with a midi monitor you can see that CC's with artid's do in fact send the corresponding keyswitch from the artset output section. The reason your notes still sound ok is because when the next note comes through another keyswitch is sent to switch the instrument back again.



Yes I do know the keyswitches are being sent by the CCs, but from what I can see it seems the IDs have to be attached to Notes to trigger sound changes.

It's not ideal to have all that extra activity. The best workaround is by far to just delete the events, as changing IDs to blank ones could create a can of worms.

As I said, if you use automation to drive CC controls in the VI instead of CC data going directly into Logic, automation does not get IDs attached. That's another workaround.

The extra neat thing with using automation is that it's a great way to change the output, as for example in some libraries CC2 is vibrato and in others it's something else completely.

Editing the script output on a track means you can drag and drop the MIDI between tracks of different VIs and have them all trigger the same control. Spitfire seem to use different CCs than, say, Cinematic Studio Strings.


----------



## Dewdman42

procreative said:


> Yes I do know the keyswitches are being sent by the CCs, but from what I can see it seems the IDs have to be attached to Notes to trigger sound changes.


It just depends. In keyswitched instruments that are smart enough (which is most of them) to have multiple sounds playing from one "patch", so to speak, an incoming keyswitch should not change a sound that is already playing. But that may not always be the case; I gave one example above where a CC with a keyswitch could disrupt the already-playing sound.



> Its not ideal to have all that extra activity. The best workaround is by far to just delete the events as changing IDs for blank ones could create a can of worms.


blank ones are fine, just make sure you have a blank articulation at the top of the set.



> As I said if you use automation to drive CC controls in the VI instead of CC data directly going into logic, automation does not get IDs attached. Thats another workaround.



Well, there are some interesting things there. I have been messing around a lot with KH Spotlight Solo Strings, for example, and I'm convinced at the moment that the timing of CC automation in Kontakt is not exactly calibrated with notes to the instrument, so you have a bit more slop factor and need to send CC automation sooner. But there are a lot of things I like about using automation too; it's more visual, for one thing.


----------



## procreative

Dewdman42 said:


> Well, there are some interesting things there. I have been messing around a lot with KH Spotlight Solo Strings, for example, and I'm convinced at the moment that the timing of CC automation in Kontakt is not exactly calibrated with notes to the instrument, so you have a bit more slop factor and need to send CC automation sooner. But there are a lot of things I like about using automation too; it's more visual, for one thing.



Well, from a personal point of view, the main reason I am using automation to drive CCs is that my Mackie MCU has built-in support for Smart Controls. By engaging fader flip I can use the motorised faders to input data, and it also has recall and track follow: being automation, the faders get feedback from Logic.

As usual there are pros and cons: you cannot use Track Stacks, as Logic loses hardware connections to Smart Controls when they are inside Track Stacks, but it's a trade-off.


----------



## ptram

I don't know if you have already tried or discovered it, but there is an easy way to add the 'dummy' artid at the beginning of the list, without having to re-edit everything in Logic's artid Edit window.

Just look for the (User)/Music/Audio Music Apps/Articulation Settings/ folder, and find the .plist corresponding to the artid map you want to edit.

Open it with a text editor, like BBEdit, and duplicate the first articulation. An artid looks something like this:

<dict>
	<key>ArticulationID</key>
	<integer>1</integer>
	<key>ID</key>
	<integer>1001</integer>
	<key>Name</key>
	<string>sustain vib</string>
	<key>Output</key>
	<dict>
		<key>MB1</key>
		<integer>1</integer>
		<key>Status</key>
		<string>Note On</string>
		<key>ValueLow</key>
		<integer>1</integer>
	</dict>
</dict>

Add the duplicated artid at the beginning of the list of <dict> bundles, and assign it the highest supported ID (254). Something like this (or, exactly this):

<dict>
	<key>ArticulationID</key>
	<integer>254</integer>
	<key>ID</key>
	<integer>1254</integer>
	<key>Name</key>
	<string>(none)</string>
</dict>

The '(none)' artid will be the first in the list. The artids you already assigned in your scores will be preserved, since the same ID will remain assigned to the events.

Select the articulation map again in the Info box, and your list will be updated.
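For anyone who would rather script that edit, here is a rough Node sketch that prepends the dummy <dict> right after the first <array> tag in the file's text. This is naive string surgery and assumes the articulation <dict> bundles live in the plist's first <array>; inspect your own file before trusting it, and work on a backup copy.

```javascript
// Naive sketch: insert a "(none)" dummy articulation (ID 254) at the
// start of the first <array> in an articulation-set .plist's XML text.
// Assumes the articulation <dict> bundles live in that first <array>.

const dummy = [
  '<dict>',
  '\t<key>ArticulationID</key>',
  '\t<integer>254</integer>',
  '\t<key>ID</key>',
  '\t<integer>1254</integer>',
  '\t<key>Name</key>',
  '\t<string>(none)</string>',
  '</dict>'
].join('\n');

function addDummyArticulation(plistText) {
  const tag = '<array>';
  const i = plistText.indexOf(tag);
  if (i === -1) throw new Error('no <array> found in plist');
  const cut = i + tag.length;
  return plistText.slice(0, cut) + '\n' + dummy + plistText.slice(cut);
}
```

You would read the file with fs.readFileSync, run it through addDummyArticulation, and write it back; since the function itself is pure, it's easy to test on a copy of the file first.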

Paolo

(EDIT: Replaced 234 with 254)


----------



## Dewdman42

Yep. But I am not sure if the KEY needs to be unique or what. In your case it’s 1001.

The highest allowed articulation id in LPX is 254


----------



## babylonwaves

WindcryMusic said:


> This feels to me like it could be an issue with the Spitfire performance legato not handling being switched off


@WindcryMusic
I didn't read through all this thread but make sure the UACC KS root note is on C-2. Spitfire sometimes sets it differently for no apparent reason and that would result in the switches not working right. I mean the UACC root note of the instrument you're controlling.


----------



## ptram

Dewdman42 said:


> Yep. But I am not sure if the KEY needs to be unique or what. In your case it’s 1001.


Judging from the tests I've done, it seems that an articulation with ArticulationID = 254 and ID = 1254, placed at the beginning of the list, works as expected (the automatic selection of the first articulation in the list does not produce any effect).

Paolo


----------



## Peter Schwartz

babylonwaves said:


> @WindcryMusic
> I didn't read through all this thread but make sure the UACC KS root note is on C-2. Spitfire sometimes sets it differently for no apparent reason and that would result in the switches not working right. I mean the UACC root note of the instrument you're controlling.



If that's the case, man, that's some high strangeness. I haven't run into that situation but I'll be on the lookout for it. 

Still, there's no particular reason for locking the palette to UACC KS when using ARTzID (which is what @WindcryMusic is using). Barring any weirdness in the programming of the patch itself, it'll work when it's locked to UACC along with the UACC Script. It should also work with the patch totally unlocked, so that it acts like any other keyswitching patch (lowest keyswitch = C-2), in conjunction with any number of the system's other Scripts.


----------



## procreative

ptram said:


> Judging from the tests I've done, it seems that using an Articulation ID = 254 and an ID = 1254, with the articulation at the beginning of the list, it works as expected (the automatic selection of the first articulation in the list does not produce any effect).
> 
> Paolo



This method only works if you record without an articulation selected, or if you record the CCs on a second pass. If you switch articulations and move CCs while recording, the CCs take on the same ID as the notes.

Also if an articulation is already set in the plugin window, the CCs still pick that up even when recorded on a second pass.


----------



## Dewdman42

Having the blank, however, does affect the output switches during playback, which in my view is the bigger problem.

Turn off the set while recording and use the blank first line and you won’t have problems.


----------



## procreative

Dewdman42 said:


> Having the blank, however, does affect the output switches during playback, which in my view is the bigger problem.



I certainly agree having CCs set to "–" avoids unwanted flickering of arts, but adding a "blank" articulation does not create this.



Dewdman42 said:


> Turn off the set while recording and use the blank first line and you won’t have problems.



But then you won't hear the selected articulations while you record the CCs?


----------



## Dewdman42

The blank is more to prevent the problem on output; that's what I'm trying to say. If you have "-" events, they will default to the first articulation keyswitch too. So you need the blank to prevent that from happening.

If you need to use input keyswitches as you're playing, then of course you have to use the articulation set. If you are ok with recording your part without switching the articulations as you play, then scratch the articulation set in order to record everything "-"

We're going in circles now..


----------



## babylonwaves

Peter Schwartz said:


> If that's the case, man, that's some high strangeness. I haven't run into that situation but I'll be on the lookout for it.


It's a bug in their latest Symphonic Strings update (1.3.0b42) and also in the Chamber Strings: all performance legato instruments default to UACC KS C-1 instead of C-2. There are also a handful of older instruments with the same problem that never got corrected.
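A minimal Scripter sketch of one possible workaround for the bug described above: if a patch listens for UACC KS on C-1 instead of C-2, shift incoming polypressure up an octave before it reaches the instrument. This is an illustration only (the note numbers assume Logic's numbering, where C-2 = note 0 and C-1 = note 12), not a Spitfire-endorsed fix.

```javascript
// Hypothetical remap for patches whose UACC KS root sits on C-1 instead of C-2.
var UACC_ROOT = 0;   // C-2, where controllers normally send UACC KS
var BUGGY_ROOT = 12; // C-1, where the affected patches listen

function remapUACCPitch(pitch) {
    // Only the UACC root key is shifted; everything else passes through.
    return (pitch === UACC_ROOT) ? BUGGY_ROOT : pitch;
}

// Scripter entry point (the host supplies PolyPressure and event.send):
function HandleMIDI(event) {
    if (event instanceof PolyPressure) {
        event.pitch = remapUACCPitch(event.pitch);
    }
    event.send();
}
```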


----------



## JonesyXL

Apologies if this has already been answered somewhere in this thread - but is there a definitive solution for getting the logic articulations and CSS working? If so, can anyone spot what mistakes I've made in the art. set edit?

I entered a blank for Articulation ID1, but on playback every CC message resets the articulation to Legato Standard, regardless of the articulation that's set at the start of recording.


----------



## nbd

For me the only 100% working solution was to combine the new Logic articulation system with the Scripter MIDI plugin. The articulation system basically just makes the articulations human-readable instead of 1...n, and the Scripter then sends the CCs for each one before sending the note-on.

I could not find a way to make the CC's or keyswitches work reliably by using only the articulation editor.
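A minimal sketch of the approach nbd describes: a lookup table turns each articulation ID into a key-switch CC value that gets sent just before the note itself. CC number 58 and the table values here are placeholders for illustration, not taken from any particular library.

```javascript
var KS_CC = 58;                         // hypothetical key-switch CC number
var ART_TO_CC = { 1: 0, 2: 10, 3: 20 }; // articulationID -> CC value (placeholder mapping)

function ccValueFor(articulationID) {
    var v = ART_TO_CC[articulationID];
    return (v === undefined) ? null : v;  // null means "no key switch to send"
}

// Scripter entry point: emit the key switch first, then pass the event through.
function HandleMIDI(event) {
    var v = ccValueFor(event.articulationID);
    if (event instanceof NoteOn && v !== null) {
        var ks = new ControlChange;
        ks.channel = event.channel;
        ks.number = KS_CC;
        ks.value = v;
        ks.send();      // key switch goes out first...
    }
    event.send();       // ...then the note (or any other event) itself
}
```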


----------



## JonesyXL

Okay cool, thanks @nbd, I'll look into MIDI scripting. Is it very complex?

I guess a workaround would be to load each articulation on a different MIDI channel into one Kontakt instance. I don't know if that'll affect CPU much. Also, changing mic mixes would be a PITA.

One of the things that I like most about CSS is having all the options for each section in one place.


----------



## rlw

JonesyXL said:


> Apologies if this has already been answered somewhere in this thread - but is there a definitive solution for getting the logic articulations and CSS working? If so, can anyone spot what mistakes I've made in the art. set edit?
> 
> I entered a blank for Articulation ID1, but on playback every CC message resets the articulation to Legato Standard, regardless of the articulation that's set at the start of recording.


You could consider Peter Schwartz's utility "ARTzID", which handles CSS very well.


----------



## cyrilblanc

rlw said:


> Peter Schwartz's utility "ARTzID"


Is it still NOT COMPATIBLE with VSL?
Can it handle Brass and Wind instruments?
I could not find the 83 manufacturer list


----------



## Peter Schwartz

I don't know what you mean... ARTzID and all of my previous systems have featured extensive VSL (and VEPro) support. In fact, my VSL scripts provide functions that exceed the native capabilities of the Vienna Instruments plug-in itself.

ArtzID V2 has scripts that support multitrack recording, velocity-sensitive key switching, and much more. So really, I have no idea where you got the idea that my systems don't support that Library.


----------



## cyrilblanc

Maybe I'm confusing you with another script builder.
Can your system address X, Y and Z matrices in Logic, or do you have to build your own VSL matrices?


----------



## Peter Schwartz

Yes, the Scripts let you individually configure X/Y individually for each instrument in a multiple-matrix setup, and also select matrices (Z). The basic Script lets you address up to 36 cells (any X/Y configuration) and individually set each one to respond to either Velocity or Xfade (polyphonically).


----------



## cyrilblanc

This is great. Can you post a screenshot of the one that deals with the VSL Brass for the Level II matrices?


----------



## Korto

Hello, I'm here because I have the spurious switching problem between articulations (Albion I redux).
I use Logic Pro X and did all the settings in Articulation ID.
I tried the basic KS settings and the UACC KS method described by Spitfire, but the switching problem remains.

The only way I've found of solving the problem is to delete all aftertouch data after recording, which is a bit annoying (because, as one can see in the event list, aftertouch sends articulation switch messages).

In Logic Pro X I can filter the recording of aftertouch data globally but not per track, so it's a radical choice (I would lose the aftertouch sounds of Alchemy, for example).

My question: at this point, is there a solution without having to filter or delete all aftertouch data?

Thank you


----------



## babylonwaves

Korto said:


> My question: at this point, is there a solution without having to filter or delete all aftertouch data?


UACC KS communicates using polypressure messages. Keyboards usually send channel pressure (AKA aftertouch), so unless your keyboard sends polypressure messages on C-2, the aftertouch will not confuse the articulation switching. At least not on my system...


----------



## Dewdman42

If it's just C-2, then just filter out polypressure from that one key. A simple script for that:



Code:


function HandleMIDI(event) {
    if(event instanceof PolyPressure 
            && event.pitch == MIDI.noteNumber("C-2")) {
        return;
    }
    event.send();
}


----------






## cyrilblanc

Peter Schwartz said:


> Hi Cyril, I don't have that specific library.


Hi Peter,
Can you list the libraries available ?


----------



## Peter Schwartz

As this thread is about Logic 10.4 articulation ID in general, let's discuss this in my ARTzID thread.


----------



## Peter Schwartz

cyrilblanc said:


> Why can't you answer on this Logic 10.4 Articulation Discussion ?



Because I feel it's inappropriate to take a general discussion about Logic Articulation ID and potentially hijack it with extended discussion about my commercial product. So I've answered your question in the thread I linked to above.



cyrilblanc said:


> Do you something to hide ?



 Yes.


----------



## Nick Batzdorf

Peter is doing the right thing by not allowing this to turn into a 2-handed backhand inside-out left-handed reverse layout sukahara full nelson sales pitch for some third-party add-on.

Maybe that could turn into a trend for this thread, please?


----------



## cyrilblanc

I do not agree with having my post deleted without being notified; it is the minimum politeness :(

I am confirming that ARTzID is not necessary if you have upgraded to Logic 10.4, until Peter adds templates for VSL instruments


----------



## Mike Greene

cyrilblanc said:


> I do not agree with having my post deleted without being notified; it is the minimum politeness :(
> 
> I am confirming that ARTzID is not necessary if you have upgraded to Logic 10.4, until Peter adds templates for VSL instruments


Cyril, you need to stop. Consider this your notice. If you want to talk about Peter's app, go to the appropriate thread and do it there.


----------



## Dewdman42

I am quite possibly going to be acquiring VSL soon and if I do I will work on some VSL custom Scripter scripts for Logic. It will be a while before I can get to it though. Just FYI. Cyril, your best bet at this point is to roll up your sleeves and write some Scripter scripts yourself to handle what you want. There are brute force (yet simple) ways to program Scripter scripts that will do whatever you want to do in this area. I can offer some pointers to get you started if you provide specific information about what you want to happen.

I feel that Scripter is a good place to handle Articulation ID issues in LPX, because the Articulation Set feature from Apple is missing some crucial functionality to make it truly useful in all but the most simple situations. The Scripter plugin is just about the only place you can access the Articulation ID in LPX and turn it into combinations of key switches and channelizing, etc.. 

Scripter scripts can get big and complicated when you are trying to make them general use, with a GUI and so forth, but if you just want to hard code your rules into a simple script, they are not that complicated...


----------



## procreative

I have found you can channelise articulation IDs in Logic but so far it seems you need to put the channel in both the middle tab and the output tab to get the keyswitches triggering on the right slot in a multi.


----------



## Dewdman42

You can channelize with the articulation set, but CCs, pitch bend and aftertouch are not channelized properly along with notes... so it's almost a moot point. You will end up needing to use a script, and in order to use a script you need the Articulation Set to NOT do anything, otherwise it clears out the articulation ID.

I'm working on a multi purpose channelizing script that will do everything needed to propagate all of the stuff mentioned above... and will also be compatible with Multi port mode.. Stay tuned...


----------



## procreative

Interesting. I have a script that is part of Artz ID that spits CCs across all midi channels but have not fully tested to see if this is an issue even with this.

But I also use automation via a script to drive CCs so maybe not noticed it.


----------



## Dewdman42

So, some things to keep in mind about propagating CCs when channelizing: if you propagate all CCs to all channels, you can impact the CPU and clog the MIDI stream. It's important to do it selectively.

The script I am working on only sends CCs from the source channel to the channelized channels that are CURRENTLY sustaining notes. So if you are playing a monophonic line, the part will be channelized to one other channel at a time, and the source-channel CCs will only be propagated to the channel where a note is playing. Basically, no extra CCs will be present on any channel anywhere.

It will also propagate PitchBend, program change and aftertouch if you want it to. (configurable)

I realize at this point, pretty much nobody is using VEP Multi port mode, but I am also making my script smart enough to handle incoming midi from VEP Multi port mode encoding and it will be able to take any number of input "source" channels, and then channelize them across to the next _n_ number of channels. So for example, with one instance of the script, you can take port 1, channel 1 and have it channelized to 10 midi channels, and then have another source channel that is port 1, channel 11 and have it channelized to 15 channels (going across ports if need be), etc.. So basically if you use VEP Multi Port template, then you funnel many tracks through a single instance of the script and it can have multiple input source channels that get channelized (and CC's propagated), out to as many as 768 midi channels in a single VEP instance.
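The propagation rule described above can be sketched as a small piece of bookkeeping: track how many notes are sustaining on each channel, and copy a source-channel CC only to channels that currently have notes. This is an illustration of the idea only, not Dewdman42's actual script.

```javascript
var activeNotes = {};   // channel -> count of currently sustaining notes

function noteOn(channel) {
    activeNotes[channel] = (activeNotes[channel] || 0) + 1;
}

function noteOff(channel) {
    if (activeNotes[channel]) activeNotes[channel] -= 1;
}

// Channels that a CC arriving on sourceChannel should be copied to.
function targetsFor(sourceChannel) {
    var targets = [];
    for (var ch in activeNotes) {
        if (activeNotes[ch] > 0 && Number(ch) !== sourceChannel) {
            targets.push(Number(ch));
        }
    }
    return targets;
}

// In Scripter, the bookkeeping hooks into HandleMIDI roughly like this:
function HandleMIDI(event) {
    if (event instanceof NoteOn)  noteOn(event.channel);
    if (event instanceof NoteOff) noteOff(event.channel);
    if (event instanceof ControlChange) {
        targetsFor(event.channel).forEach(function (ch) {
            var copy = new ControlChange(event); // clone, retarget, send
            copy.channel = ch;
            copy.send();
        });
    }
    event.send(); // original event still passes through on its own channel
}
```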


----------



## Peter Schwartz

Hi procreative, I've said it before... It is not an issue. Never has been, never will be. No need to spend any time testing it.

I've posted about this before too, but perhaps it bears repeating:

• If you program Logic's articulation sets to output articulation-switching events, any CC messages, pitch bend, etc. will get encoded with the ID you've intended for notes. *There is no way around this.* And the downside of this situation has been examined at length here in this thread.

On the flip side...

• When you don't program articulation sets to output articulation-switching events, and instead you use a Script to provide the articulation-switching messages (in response to ID's) you won't run into any issues.

• You can use articulation sets to directly correlate ID's to MIDI channels for multi-timbral operation.


----------



## Dewdman42

Peter Schwartz said:


> • If you program Logic's articulation sets to output articulation-switching events, any CC messages, pitch bend, etc. will get encoded with the ID you've intended for notes. *There is no way around this.* And the downside of this situation has been examined at length here in this thread.
> 
> On the flip side...
> 
> • When you don't program articulation sets to output articulation-switching events, and instead you use a Script to provide the articulation-switching messages (in response to ID's) you won't run into any issues.
> 
> • You can use articulation sets to directly correlate ID's to MIDI channels for multi-timbral operation. But again, you will run into the issues associated with CC's (etc.) becoming encoded with ID's as well.



The problem is that it's all too easy to end up with notes and the CCs around them not sharing the same articulation ID, and it's not easy to enforce that in LPX as it stands. If you change the articulation IDs of notes in the piano roll, the CCs you recorded at the same time will no longer match. If you add CCs by drawing in the interface, they also won't have the correct articulation ID unless you manually figure out what they need to be, matching the notes around them, and set all the CC events in the event list to match the notes that are sustaining at the time.

That also relies on the articulation ID channelizing both notes and CCs (and pitch bend and aftertouch).

It's actually much better to have notes from a source channel channelized, and then all supporting events such as CC, aftertouch and pitch bend automatically propagated to the same channel the notes are going to, regardless of what articulation ID they have. Most people don't actually want or need articulation IDs encoded on their CCs, and it just confuses matters when they are.

But ideally, a system would also provide for the possibility that if and when CC events are encoded with an articulation ID, explicit channelizing should happen there too.


----------



## Peter Schwartz

We've been down this road before. 

But when Scripting is involved as I described above, it doesn't matter at all what the ID's of CC's (etc.) are. Yeah, it ends up being eye-clutter in the event list. But there is no functional difference between a CC#1 message with (say) an ID of 23 and a CC#1 message with an ID of 3. So you can draw in/play in CC messages to your heart's content and sure, they might end up on different ID's. _But it doesn't matter._


----------



## Dewdman42

Of course, with scripting you can do anything you want, but it depends on what the script is programmed to do, hehe. It can be programmed to matter or not to matter. And if it's programmed not to matter, then how are CCs, pitch bends and aftertouch propagated to the channels where the notes are? That will depend on how the script is programmed to do it.


----------



## Peter Schwartz

Sure, you could write a Script that makes decisions based on the ID's of CC's (etc.) You could use the ID to route a CC to a particular VEPro port, or to transform those messages into other messages. But the practical applications of any of that are extremely few and far between. The CC's channel and number are sufficient to get them to where you need them to go and do what you want them to do (or transform them in-script). The ID can be safely ignored, and doesn't even have to be handled by a Script because (as you know) those ID's never make it out of the Scripter and on to the plugin anyway.


----------



## Dewdman42

That is one approach. That is not the approach my script is taking. My script will ignore artid in CC, PitchBend and aftertouch and propagate them according to the rules I mentioned above.


----------



## procreative

Two scenarios that make this argument irrelevant:

1. If you use a script to use automation to control CCs, then it won't ever be an issue.

2. If you use the Event Input plugin in VEP, then ports don't matter either, as each track in Logic with Event Input acts like an Audio Instrument track, so it has separate Articulation Sets and separate MIDI.

For other scenarios Peter's solution works fine.


----------



## Dewdman42

I wasn't talking about Peter's solution at all; that is in another thread. I'm offering ideas and solutions outside the commercial realm here. As stated, it can all be scripted by anybody who wants to, and when I'm done working on my own script I intend to share it with everyone. It's not done yet...

But anyway, regarding automation in point #1 above, I'm not sure what point you're making. Channelizing, whether driven by automation or by articulation ID, still needs to propagate CC, pitch bend and aftertouch to the channels where the notes are occurring. This is true with either method, and either method can be scripted. A future revision of my script will provide for that possibility as well.

I don't use the VEP Event Input Plugin and for various reasons don't even intend to, so the script I'm working on will not be applicable there. But it will still work with or without VEP multi port mode, so I guess you could use it with Event Input mode of operation also if you want to.


----------



## NoamL

@Dewdman42 thank you for the script on page 8, I've been trying to set up something similar all morning!


----------



## procreative

Dewdman42 said:


> But anyway regarding automation in point#1 above, I'm not sure the point you're making? Channelizing, whether driven by automation or driven by articulation id, still needs to propagate CC, PitchBend and AfterTouch to the channels where the notes are occurring. This is true with either method. Either method can be scripted. A future revision of my script will also provide for that possibility also.



Okay, I think maybe you misunderstood. I am using automation via Smart Controls to trigger MIDI CCs. A MIDI FX script (inspired by something the guy at LogicScripts made) transforms automation into CCs.

So instead of CC appearing in the Event list, there are Fader events.

These don't seem to be affected by the Articulation Set shenanigans or by note data being channelised.

As regards your general point, I agree as is the Logic system may well run into issues.

I am not 100% sold on Event Input yet as I haven't put it to the test with a project. It seems to behave okay so far, but only for instances of VEP that use one stereo output for all slots rather than a Multi Output version. I think it either needs a lot more threads thrown at it, or there is a glitch with it when used that way.


----------



## mc_deli

I'd like to use articulation smart controls to solve an issue I'm having. I'd like to have 2 screens, one has to be holographic... I need a midi rocker pedal to screw some kitchen units together, I need to be present at a 40th birthday party, and watch the final... all at 8pm tomorrow... preferably on one Logic channel, with no whining feedback. A fluid control and a trimmer for well, trimming (I'm getting a bit shaggy, they'll think I'm one of the band), would also be great.


----------



## Dewdman42

Unfortunately, for that you need a flux capacitor


----------



## NoamL

So, @Dewdman42 after reading the entire thread we seem to be working on parallel lines.

I developed your original script into the following script for controlling delay values in Cinematic Studio Strings. Articulation ID 1 = "Legato", and 2 = "Legato (start)", reserved for the first note of a legato phrase. In addition, I am using a -300ms track delay on the track, and then using positive delay (from the Scripter) to achieve net delays like -300, -200, -100, -20 etc.

The delays work very nicely, but for some reason the CC articulation selection doesn't seem to work. That is, when I manually select the legato articulation everything works, but if I select the pizzicato articulation before letting a phrase play, it doesn't automatically switch to legato even though the notes have the legato ArticulationID.

Any thoughts on where I might be screwing up?


Here is the Script. BTW, you & anyone else are free to make use of it.

============



Code:


function HandleMIDI(event) {

if (event.articulationID == 1 ) { // ArticulationID "Legato"
var CSSArt = new ControlChange; CSSArt.number = 58; CSSArt.channel = event.channel;
CSSArt.value = 95; CSSArt.send(); CSSArt.trace(); // senza sord.
CSSArt.value = 10; CSSArt.send(); CSSArt.trace(); // advanced legato
CSSArt.value = 80; CSSArt.send(); CSSArt.trace(); // legato on

if (event instanceof NoteOn && event.velocity <= 64) {event.sendAfterMilliseconds(0);} // slow legato is pulled back -300 ms net
if (event instanceof NoteOn && event.velocity >= 65 && event.velocity <= 100) {event.sendAfterMilliseconds(100);} // medium legato -200ms net
if (event instanceof NoteOn && event.velocity >= 101) {event.sendAfterMilliseconds(200);} // fast legato -100ms net
if (event instanceof NoteOff) {event.sendAfterMilliseconds(150)};} //Note Offs are delayed 150ms

else if (event.articulationID == 2) { // ArticulationID "Legato(starting note)"
var CSSArt = new ControlChange; CSSArt.number = 58; CSSArt.channel = event.channel;
CSSArt.value = 95; CSSArt.send(); // senza sord.
CSSArt.value = 10; CSSArt.send(); // advanced legato
CSSArt.value = 80; CSSArt.send(); // legato on
event.sendAfterMilliseconds(290);} // -10ms delay for the first note

else event.sendAfterMilliseconds(300);} // pass through all other events (CC1 etc) at 0 delay



============

I did verify that actually moving a CC58 controller does control CSS & select articulations, etc... and the traces in the Java console clearly show things like:

[ControlChange channel:1 number:58 [#26 LSB] value:95]

[ControlChange channel:1 number:58 [#26 LSB] value:10]

[ControlChange channel:1 number:58 [#26 LSB] value:80]

[ControlChange channel:1 number:58 [#26 LSB] value:95]

[ControlChange channel:1 number:58 [#26 LSB] value:10]

[ControlChange channel:1 number:58 [#26 LSB] value:80]

[ControlChange channel:1 number:58 [#26 LSB] value:95]

[ControlChange channel:1 number:58 [#26 LSB] value:10]

[ControlChange channel:1 number:58 [#26 LSB] value:80]

[ControlChange channel:1 number:58 [#26 LSB] value:95]

[ControlChange channel:1 number:58 [#26 LSB] value:10]

[ControlChange channel:1 number:58 [#26 LSB] value:80]

so I'm not sure where the hiccup is?


----------



## Dewdman42

I'll ponder your code. I presume for CSS, you're doing this because CSS doesn't have consistent delay times for different articulations, yes? You can make your post above prettier... over on the right side of the editor you can see an icon that looks like a page. click on that to see raw codes, then use the code tag like this:



Code:


[code]
function HandleMIDI(event) {
  // do bla bla
}

[/code]

so you end up with this:



Code:


function HandleMIDI(event) {
  // do bla bla
}


----------



## NoamL

I've narrowed the problem down a bit by throwing away extraneous code... This works:

======


Code:


function HandleMIDI(event) {
if (event.articulationID == 1 ) {
var CSSArt = new ControlChange; CSSArt.number = 58; CSSArt.channel = event.channel;
CSSArt.value = 10; CSSArt.send(); // Advanced Legato
event.send();}
else event.send(); }

=====

But this doesn't, in fact, it seems that Kontakt only picks up the "90" message:

======


Code:


function HandleMIDI(event) {
if (event.articulationID == 1 ) {
var CSSArt = new ControlChange; CSSArt.number = 58; CSSArt.channel = event.channel;
CSSArt.value = 10; CSSArt.send(); // Advanced Legato
CSSArt.value = 90; CSSArt.send(); // Con Sordino
event.send();}
else event.send(); }

=====


----------



## Dewdman42

for one thing you're missing an opening "{" after your else. When you modify code, make sure to hit the "run Script" button and check the console for errors


----------



## Dewdman42

Let me catch up after reformatting your first code so I can read it and what you did in this last post


----------



## NoamL

Thanks, by the way! I'm a newbie to Javascript.


----------



## Dewdman42

OK, here is your code reformatted a bit better so it's easier to read...

Originally I saw a missing "{", but I might not have read it correctly due to formatting. It's hard to tell, because what you sent originally was formatted very poorly and it's hard to track all the curly braces, but I think you MEANT to have the logic shown below, maybe...

I would also optimize it a tiny bit like this, note you can create the CC just once



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) { 

       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send();
       CSSArt.trace(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send();
       CSSArt.trace(); // legato on

      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(0);
      } 
     
      // medium legato -200ms net
      if (event instanceof NoteOn 
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(100);
      } 

      // fast legato -100ms net
      if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(200);
      } 

      //Note Offs are delayed 150ms
      if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(150);
      } 
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {
       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send(); // legato on
       event.sendAfterMilliseconds(290);
   } // -10ms delay for the first note

   // pass through all other events (CC1 etc) at 0 delay
   else {
       event.sendAfterMilliseconds(300);
   } 
}


----------



## Dewdman42

I reformatted it wrong... stay tuned...


----------



## NoamL

In the cleaned up version it looks like the CSSArt is executing when it gets an event of ArticulationID=1, but no other code is executing because of the }, right?

What I want to do is:

Check event's ArticulationID, if it =1 --> send appropriate CC58 to switch the articulation on, then send NoteOn/Off events with a variable delay based on event velocity (and send non-note events through)
Else if ArticulationID=2 --> send appropriate CC58, then send notes with no delay (and send non note events through)
Else if ArticulationID=3 --> send appropriate CC58, and some other instruction (etc)
Else if ArticulationID=4 --> etc

Basically the code as a whole is a massive if 1... else if 2... else if 3... else if 4... checker that executes a different block of code depending on what ArticulationID it detects.

I believe this does that:



Code:


var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) {

       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send();
       CSSArt.trace(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send();
       CSSArt.trace(); // legato on
 
   // slow legato is pulled back -300 ms net
   if (event instanceof NoteOn && event.velocity <= 64) {
       event.sendAfterMilliseconds(0);
   }
 
   // medium legato -200ms net
   if (event instanceof NoteOn
             && event.velocity >= 65 && event.velocity <= 100) {
       event.sendAfterMilliseconds(100);
   }

   // fast legato -100ms net
   if (event instanceof NoteOn && event.velocity >= 101) {
       event.sendAfterMilliseconds(200);
   }

   //Note Offs are delayed 150ms
   if (event instanceof NoteOff) {
       event.sendAfterMilliseconds(150);
   }
 
   else {event.send();}
 
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {
       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send(); // legato on
       event.sendAfterMilliseconds(290);
   } // -10ms delay for the first note

   // pass through all other events (CC1 etc) at 0 delay
   else {
       event.sendAfterMilliseconds(300);
   }
}


----------



## Dewdman42

I corrected my post above...


----------



## Dewdman42

Also, you should add a catch for non-notes with articulation 1 that might come in...

_(note the else block in articulation 1 handling)_



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) {

       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send();
       CSSArt.trace(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send();
       CSSArt.trace(); // legato on

      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(0);
      }
    
      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(100);
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(200);
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(150);
      }
      else {
          event.sendAfterMilliseconds(300);
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {
       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send(); // legato on
       event.sendAfterMilliseconds(290);
   } // -10ms delay for the first note

   // pass through all other events (CC1 etc) at 0 delay
   else {
       event.sendAfterMilliseconds(300);
   }
}


----------



## NoamL

Oh, good catch, thank you!

The cleaned up version of the code still executes the delays flawlessly... that part is working. It's actually really exciting to see CSS work with absolutely no fiddling around moving the note starts... But the actual CC58 messages aren't being received by Kontakt for some reason! It's really weird, I see them in the console log. I'll make a video to show you, hold on...


----------



## Dewdman42

Well, there is another problem with Kontakt that I haven't figured out, which is that when MIDI automation is used, the automation seems to be on a separate queue inside Kontakt or something, so you can't always guarantee Kontakt will process the CC before the notes... IF CSS is using automation or something like that to handle the CCs you're sending. It's not entirely clear to me right now, but I was having problems like that when trying to write a script for Kirk Hunter Spotlight Solo Strings, and I still haven't 100% figured out a workaround.

Send the CCs a bit early... that's all I can say. You have 300ms to work with.


----------



## NoamL

That's a VERY good point. I forgot that the delays worked the other way round too.

Here's a video of the code working (except the CC58 sends)

https://www.dropbox.com/s/2i2sf9m7n2lxhrg/ArticulationSwitcher.mp4?dl=0

What's a good method to send the CCs early? I tried to insert setTimeout() and Thread.sleep() between the CC sends, but Scripter doesn't seem to recognize either one when it gets to that line during execution.


----------



## Dewdman42

-


----------



## Dewdman42

Actually, that is not a good enough solution...what I just sent... because some of your notes need to be sent earlier...before the CC...

So... let me think about it...


----------



## Dewdman42

Try this: change your track delay to -310ms and note the new delay times... I think I have it right, but you might need to tweak a bit.



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) { 

       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send();
       CSSArt.trace(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send();
       CSSArt.trace(); // legato on

      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(10);
      }

      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(110);
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(210);
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(160);
      }
      else {
          event.sendAfterMilliseconds(310);
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {
       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send(); // senza sord.
       CSSArt.value = 10;
       CSSArt.send(); // advanced legato
       CSSArt.value = 80;
       CSSArt.send(); // legato on
       event.sendAfterMilliseconds(300);
   } // -10ms delay for the first note

   // pass through all other events (CC1 etc) at 0 delay
   else {
       event.sendAfterMilliseconds(310);
   } 
}


----------



## NoamL

Yep, I'll research it as well. Thanks for all your help in the meantime. There's no rush on this BTW. It's exciting that we're able to use articulation switching to build our templates. I'm going to mess around with it more during Memorial Day weekend.

EDIT: the latest code you suggested doesn't work. I see what you're trying to do (make the CC messages come before even the slowest notes, hence the extra 10 millisecond delay?) but I think the problem is not that the CC messages are arriving after the notes, the problem is that Kontakt is only receiving & executing one of the CC58 messages. It looks like the last one (out of the three) is the only one that is actually executed. Thus, it's possible to do CC58 articulation selection if you only need to send one CC58 message (like CC58=31, pizzicato). Kontakt does reliably receive and execute that single CC58 before the notes, which is cool. But if you need to send multiple CC58 messages (91 no sordino, 10 advanced sustain, 80 legato on), it doesn't work.

I wonder... maybe it is possible to do it with a combination of CC58 messages and created keyswitch NoteOn messages? I'll experiment...


----------



## Dewdman42

Yeah, that makes even more sense given the Kontakt problem I was having with Kirk Hunter, as it only affected chords with different articulations in the chord... in other words, several CC switches coming in on the same MIDI tick.

I think then what you need to do is delay each one by 1ms. So....



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) { 

       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send();
       CSSArt.trace(); // senza sord.
       CSSArt.value = 10;
       CSSArt.sendAfterMilliseconds(1);
       CSSArt.trace(); // advanced legato
       CSSArt.value = 80;
       CSSArt.sendAfterMilliseconds(2);
       CSSArt.trace(); // legato on

      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(10);
      }

      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(110);
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(210);
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(160);
      }
      else {
          event.sendAfterMilliseconds(310);
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {
       CSSArt.channel = event.channel;
       CSSArt.value = 95;
       CSSArt.send(); // senza sord.
       CSSArt.value = 10;
       CSSArt.sendAfterMilliseconds(1); // advanced legato
       CSSArt.value = 80;
       CSSArt.sendAfterMilliseconds(2); // legato on
       event.sendAfterMilliseconds(300);
   } // -10ms delay for the first note

   // pass through all other events (CC1 etc) at 0 delay
   else {
       event.sendAfterMilliseconds(310);
   } 
}


Something like that. Play with the values until it works; you can probably bring them in closer to 300, you don't need a 10ms buffer there. The main thing is to get each CC event coming into Kontakt on a different MIDI tick. If that doesn't work, try 2ms apart.


----------



## NoamL

This works!



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) {

       CSSArt.channel = event.channel;
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato on.
       var on = new NoteOn;
       on.pitch = 35; //create Bb0, low velocity... 
       on.velocity = 1;
       on.send(); // ...to turn con sordino off
       var off = new NoteOff(on);
       off.sendAfterMilliseconds(1); // end note soon after

      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(0);
      }
   
      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(100);
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(200);
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(150);
      }
     
      //Pass through other events in real time
      else {
          event.sendAfterMilliseconds(300);
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {

       CSSArt.channel = event.channel;
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato on.
       var on = new NoteOn;
       on.pitch = 35; //create Bb0, low velocity... 
       on.velocity = 1;
       on.send(); // ...to turn con sordino off
       var off = new NoteOff(on);
       off.sendAfterMilliseconds(1); // end note soon after
       
       event.sendAfterMilliseconds(290);
   } // -10ms delay for the first note

   // pass through other events in real time
   else {
       event.sendAfterMilliseconds(300);
   }
}


It's not pretty but it works


----------



## Dewdman42

Definitely, if you can avoid sending CC switches to Kontakt, I recommend it. Note keyswitches are much preferable. The only drag is you have to send a NoteOff too, but oh well, if it works it works.

Kontakt has some strange queuing issues... Note that you can optimize your script by creating the NoteOn just once at the top of the script, globally, and reusing it.

You can also make your code prettier by creating some reusable functions and driving them with data... but I digress.


----------



## NoamL

The one you just suggested works also... but it's a bit unreliable; it doesn't always snap on if I'm doing other stuff like moving the mod wheel. I'm guessing that sometimes Kontakt "misses its cue" and doesn't pick up all the CC58 messages. It looks like we've hit a limit of one CC message (probably per CC number) per actual note of music that the code handles. Oh well, that's pretty reasonable! (BTW, if only the output section of the Articulation Sets were more usable....)

I'll share the full CSS patch once I figure out the delay values for all articulations. Thanks a lot for your help this evening. I look forward to seeing what you come up with as well!


----------



## Dewdman42

Also, you can send a NoteOff by setting velocity = 0 on a NoteOn, and it will be counted as a note off.

So try this prettier code



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;
var on = new NoteOn;

function HandleMIDI(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) {
   
      sendSwitches(event);
     
      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          event.sendAfterMilliseconds(0);
      }
   
      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          event.sendAfterMilliseconds(100);
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          event.sendAfterMilliseconds(200);
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          event.sendAfterMilliseconds(150);
      }
     
      //Pass through other events in real time
      else {
          event.sendAfterMilliseconds(300);
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {

       
       event.sendAfterMilliseconds(290);
   } // -10ms delay for the first note

   // pass through other events in real time
   else {
       event.sendAfterMilliseconds(300);
   }
}

function sendSwitches(event) {
       CSSArt.channel = event.channel;
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato on.

       on.pitch = 35; //create Bb0, low velocity... 
       on.velocity = 1;
       on.channel = event.channel;
       on.send(); // ...to turn con sordino off
       on.velocity = 0;
       on.send(); // velocity 0 is sent as the NoteOff
}


----------



## Dewdman42

In general I would say: with Kontakt, use a NoteOn keyswitch instead of a CC whenever possible. If you have to use CCs, just don't send them back to back on the same beat position; you have to give them a little breathing room. NoteOns, on the other hand, CAN be sent back to back without any delay and the ordering will work.
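As a sketch, that spacing rule could be factored into a tiny helper that hands back one delay per CC value (in Scripter each delay would then be passed to sendAfterMilliseconds); the helper name here is made up:

```javascript
// Hypothetical helper: given how many CC messages must go out "at once",
// return staggered delays so each lands on a different MIDI tick.
// 1 ms spacing per the workaround above; try 2 ms if Kontakt still merges them.
function staggeredDelays(count, spacingMs) {
    var delays = [];
    for (var i = 0; i < count; i++) {
        delays.push(i * spacingMs);
    }
    return delays;
}
```

For example, three CC58 values would go out at 0, 1 and 2 ms.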


----------



## NoamL

Wow, that's much more logical. Thank you. I have much to learn 

(btw there needed to be a SendSwitches for the ArtId=2 block of code as well, but I spotted that easily enough)


----------



## Dewdman42

If it were me, I'd probably structure the code so that I have a function that calculates the delay time and returns it as a value, so that HandleMIDI() is a simple function that simply does this:



Code:


function HandleMIDI(event) {
   sendSwitches(event);
   event.sendAfterMilliseconds(getDelay(event));
}


----------



## Dewdman42

Well, actually I didn't take ArtIDs 1 and 2 into account there, but you get the idea. I like to keep HandleMIDI() as simple as possible and box up the clever functionality into discrete functions, in a way that will be easier to understand a year from now.


----------



## Dewdman42

for example...



Code:


// create just once
var CSSArt = new ControlChange;
CSSArt.number = 58;
var on = new NoteOn;

function HandleMIDI(event) {
   sendSwitches(event);
   event.sendAfterMilliseconds(getDelay(event));
}


function sendSwitches(event) {
       CSSArt.channel = event.channel;
       CSSArt.value = 10;
       CSSArt.send();
       CSSArt.trace(); // advanced legato on.

       on.pitch = 35; //create Bb0, low velocity...
       on.velocity = 1;
       on.channel = event.channel;
       on.send(); // ...to turn con sordino off
       on.velocity = 0;
       on.send(); // velocity 0 is sent as the NoteOff
}

function getDelay(event) {

   // ArticulationID "Legato"
   if (event.articulationID == 1 ) {
    
      // slow legato is pulled back -300 ms net
      if (event instanceof NoteOn && event.velocity <= 64) {
          return 0;
      }
  
      // medium legato -200ms net
      else if (event instanceof NoteOn
                && event.velocity >= 65 && event.velocity <= 100) {
          return 100;
      }

      // fast legato -100ms net
      else if (event instanceof NoteOn && event.velocity >= 101) {
          return 200;
      }

      //Note Offs are delayed 150ms
      else if (event instanceof NoteOff) {
          return 150;
      }
    
      //Pass through other events in real time
      else {
          return 300;
      }
   }

   // ArticulationID "Legato(starting note)"
   else if (event.articulationID == 2) {      
       return 290;
   } // -10ms delay for the first note

   // pass through other events in real time
   else {
       return 300;
   }
}


----------



## cyrilblanc

Dewdman42 said:


> Cyril, your best bet at this point is to roll up your sleeves and write some Scripter scripts yourself to handle what you want.


You do not need to write a Scripter script to make it work with VSL instruments, especially since in Scripter you cannot define new articulations in the Score.

*I use the velocity of the articulation change to select the Y value of the Matrix of the VI instrument.*
Then you just need a transformer using a Map in your Environment to convert the velocity to a CC or a note.





For Dimension VI you will need a Z value as well; you can use the MIDI channel of the articulation change to select the Z value of the Matrix.

Let's hope that the Logic team will improve what they have already done


----------



## NoamL

Oh yes, for sure @Dewdman42. I have a part of the code that changes CC1 and velocity according to linear interpolation of a user-specified array, but only when one articulation is active... it's gonna be a mess. So I'll definitely keep your ideas in mind for presenting it as cleanly as possible.


----------



## Dewdman42

I don't completely understand that last bit, but elaborate if you feel like it; I might have some ideas to get around the "mess".


----------



## Dewdman42

Data-driven like that is a much better way to go in general. You could also solve the previous problem we've been talking about with a data-driven table that determines the delay value, for example, instead of a big if/then/else section.

One thing that does suck about LPX Scripter is that there is no way to read that data from a file or include any JavaScript from other files. You just have to put it into an array and make it as pretty as you can, and move it to the end of the file if you can so people don't have to look at it. There are a number of ways to construct the arrays; the way you're doing it above is good, an array of objects, but if you have more dimensions to it, then some other structuring... it's all doable. I wouldn't call that a mess. A lot of musicians won't understand your script anyway... (shrug)

I'm curious to understand more about what you're trying to accomplish with MIDI and these instruments.
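For instance, a data-driven delay table along those lines might look like this (a sketch; the velocity bands and delay values are the CSS numbers from earlier in the thread, and a rule type of "*" matches any event type, so adjust for your library):

```javascript
// Rules are checked top to bottom; first match wins.
var DELAY_RULES = [
    { artID: 1, type: "NoteOn",  maxVel: 64,  delay: 0   }, // slow legato
    { artID: 1, type: "NoteOn",  maxVel: 100, delay: 100 }, // medium legato
    { artID: 1, type: "NoteOn",  maxVel: 127, delay: 200 }, // fast legato
    { artID: 1, type: "NoteOff", maxVel: 127, delay: 150 }, // note offs
    { artID: 2, type: "*",       maxVel: 127, delay: 290 }  // first note
];
var DEFAULT_DELAY = 300; // everything else (CC1 etc.) passes at 0 net

// Look up the delay for an event described by articulation ID, event
// type ("NoteOn"/"NoteOff"/...), and velocity (use 0 for non-notes).
function getDelay(artID, type, velocity) {
    for (var i = 0; i < DELAY_RULES.length; i++) {
        var r = DELAY_RULES[i];
        if (r.artID === artID &&
            (r.type === type || r.type === "*") &&
            velocity <= r.maxVel) {
            return r.delay;
        }
    }
    return DEFAULT_DELAY;
}
```

Adding an articulation then means adding rows, not branches.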


----------



## cyrilblanc

An improved Environment


----------



## NoamL

I have some questions for you experts 

1. Does *NoteOff* need to have ArticulationID? Actually, does NoteOff really need to have any information other than the pitch and sending it on the correct channel? Indeed, is there any potential for massive screw ups if you just don't send NoteOffs at all for articulations like Spiccato and Pizzicato?

2. No doubt a super stupid question but... how do you use ArticulationID with a MIDI *keyboard*? Like, let's say I just selected a track and want to start playing Pizzicatos in, (as opposed to drawing a bunch of notes and selecting them and using the Articulations dropdown in the Piano Roll to make them Pizzicatos). With the MIDI keyboard how do I select which articulation I want to play? To be clear, on my ArticulationSets presets, I have the "Articulations" page filled out with numbered arts, and the "Output" and "Switches" pages are both untouched.


----------



## Dewdman42

NoteOff only needs an articulation ID if something downstream needs it to generate a keyswitch or do something else to the NoteOff. I would hazard a guess that most sample libraries don't care about anything for NoteOff, so it probably doesn't matter most of the time. But that is not to say someone won't do something down the road where they use the articulation ID to do something special to NoteOff events. The articulation ID can be used for any purpose you want. It's just a tag on the event.

In answer to your second question: that's what the Articulation Set feature is for. It listens to incoming keyswitches and encodes subsequent events with an articulation ID, as configured in the Articulation Set.


----------



## Dewdman42

and yes, you ALWAYS want to send the NoteOff for any NoteOn.


----------



## Peter Schwartz

If you're using Scripting to generate keyswitch notes for a patch, you definitely want to send the note offs associated with each note on. It's best practice to do this for sure. I'd suggest sending them anywhere from 5 to 25 milliseconds after the note on for a keyswitch is generated by the Script. This can be easily accomplished using *note_off.sendAfterMilliseconds(x)*, where:

*note_off* is a variable you created to generate a note off message
*x* is the number of milliseconds


----------



## Peter Schwartz

Additionally, when you generate keyswitch note on & note off messages in a Script, you do not need to specifically define an ID for them -- unless you're driving an EXS24 patch with these Script-generated keyswitches. Remember, IDs are internal, Logic-specific additional values that no plugin other than EXS24 can possibly react to. When Logic transmits any type of MIDI message encoded with an ID to a 3rd party plugin, those IDs are never included. Articulation ID is not part of the MIDI spec, so there's no way to communicate it to 3rd party plugins.

So if you have this event...

*Note On, ch1, pitch = 60, vel = 80, ID = 4*

...and it's sent to (say) Play or Kontakt, it gets there looking like a normal MIDI message, sans ID:

*Note On, ch1, pitch = 60, vel = 80*


----------



## NoamL

Got it, thanks.

I would imagine that internally Logic does not know the correct ArticulationID to assign to controller messages if I, for instance, draw some CC1 data underneath the notes?

So, my code has a LastKnownArticulation variable, and I update this every time there's a NoteOn event (including keyswitch notes). Controller data is sent using the last known ArticulationID, which means the code also figures out the right channel to send it to if the instrument has multiple nki's loaded in the same instance of Kontakt...
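A minimal sketch of that bookkeeping in plain JavaScript (the articulation-to-channel map and the function names are hypothetical, just to show the shape):

```javascript
// Hypothetical map from articulation ID to the MIDI channel the
// corresponding nki listens on inside one Kontakt instance.
var CHANNEL_FOR_ART = { 1: 1, 2: 1, 3: 2, 4: 3 };
var lastKnownArticulation = 1; // updated on every NoteOn

// Call on each NoteOn (including generated keyswitch notes);
// returns the channel that NoteOn should go out on.
function noteOnSeen(articulationID) {
    lastKnownArticulation = articulationID;
    return CHANNEL_FOR_ART[lastKnownArticulation];
}

// Controller data carries no reliable ID of its own, so it is
// routed using the articulation of the last NoteOn.
function channelForCC() {
    return CHANNEL_FOR_ART[lastKnownArticulation];
}
```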


----------



## Peter Schwartz

BTW, there are two ways you can generate a note off message in a Script. The first way is to send a Note On message with a velocity of zero.

The other is to specifically define a Note Off message (as I mentioned above) and send it with a velocity of your choosing, though 64 is a good value as it's the default "release velocity" value for note off messages.

But you might save yourself the trouble of defining actual Note Off messages at all, because Logic will always send actual Note Off messages from a Script when you generate Note On messages with a velocity of zero. Example:

If you generate this from a Script...

*Note On, Pitch = 60, Velocity = 0 *

...Logic will output this from the Scripter:

*Note Off, Pitch = 60, Velocity = 64*

So you can save yourself some trouble by generating keyswitch notes in a Script like this:

var note_on = new NoteOn;

Then when you generate your keyswitch notes, you send this:

note_on.pitch = (whatever the pitch needs to be);
note_on.velocity = (whatever the velocity needs to be, but higher than zero);
note_on.send();

The above defines the pitch and velocity for the Note On message of the Script-generated keyswitch note. At this point, to generate the Note Off you don't need to redefine the pitch. Just define the velocity (it's going to be zero). So the above code can be immediately followed with this:

note_on.velocity = 0;
note_on.sendAfterMilliseconds(small delay value);

And when the Scripter outputs this second message, it will output

*Note Off, Pitch = 60, Velocity = 64*


----------



## NoamL

Yes I'm aware. It's a cool feature!


----------



## NoamL

NoamL said:


> So, my code has a LastKnownArticulation variable, and I update this every time there's a NoteOn event (including keyswitch notes). Controller data is sent using the last known ArticulationID, which means the code also figures out the right channel to send it to if the instrument has multiple nki's loaded in the same instance of Kontakt...



To explain this a bit better, in case you think I'm under the misapprehension that ArtID is actually sent to Kontakt, I have a script that goes:

if (articulationID == 1) {do stuff and send note on channel 1}

else if artid's 2 {do stuff and send note on channel 2 }

etc

The ArticulationID information never leaves the Scripter.


----------



## Peter Schwartz

Logic won't keep tabs on the value of your LastKnownArticulation variable and apply that to CC's you draw in. I believe it's the last ID you selected vis-à-vis the keyboard (per your example of using a keyboard as a "MIDI Remote", above) that will be used.

I don't know what your specific application is, but my feeling is that you'd be better off identifying CC's by MIDI channel rather than by ID. MIDI channels are as good an indicator of the intended destination of a CC as anything else, and a whole lot easier to deal with, both in MIDI Automation (MIDI Draw) and in Scripting.


----------



## Dewdman42

I think in general it's a good idea to consider that other people might put other scripts in MIDI FX plugin slots after yours which might look at the articulation ID. So while Kontakt won't see it, other scripts might. Does that mean you always should include it? Dunno, it depends on what events you are adding. Just wanted to point out that there is always the possibility of more scripts doing more stuff before it hits Kontakt.


----------



## NoamL

That's a VERY good point I hadn't considered at all. 

Is there anything super clunky and bad about sending NoteOffs to all 16 MIDI channels? I'm picturing a scenario where you have a long sustain and then a staccato, but the staccato NoteOn comes before the sustain's NoteOff because the composer didn't bother to be accurate. In this case my script would accidentally send the sustain NoteOff on the MIDI channel associated with the staccato articulation.


----------



## cyrilblanc

It is possible to add an "end of note" in the Environment; you must select "------" in the result of a transformer.


----------



## Peter Schwartz

NoamL said:


> Is there anything super clunky and bad about sending NoteOffs to all 16 MIDI channels?



Totally clunky.  You really only want to send Note Offs when required, i.e., when a note is actually released.

FWIW, there are more scenarios than not where it's totally desirable/expected for notes to overlap, even when they're playing different articulations. To put it another way, it usually produces a poor sonic result if notes and the articulations they sound are forced to be monophonic. Pretty much all plugins/patches will produce the sound of multiple notes (each with a different articulation) simultaneously. CSS is one of the few that balks at this in certain situations, such as when trying to get staccato and legato to sound at the same time, but there's not much that can be done with Scripting to improve that behavior. Anyway, I'd suggest developing your script so that composers aren't forced to be exacting in terms of when notes end unless that's absolutely essential to what you're trying to accomplish.


----------



## cyrilblanc

When you convert something to a note, i.e. to change articulation, you need to add a note off, otherwise you run into problems.


----------



## procreative

Just backtracking a little. I tested some more using the Logic system without any scripts, ie Native. I tested with both Sable (SCS) using UACC and 8Dio Anthology using Note Keyswitch.

So I played in a Sustain part recording ID1 into notes as I played in, then on a second pass I recorded Modwheel CC1 data and Vibrato CC2 data.

Then I changed some of the notes to Tremolo and some to Flautando.

What I noticed was glitches during the recording of the CC data, where the CC moves were causing notes to play back using ID1. However, once I stopped recording and played back, everything seemed to play correctly.

Granted there is still GUI madness going on where CC events are switching articulations, but the actual notes seem to play back fine and include the dynamics and vibrato changes.

I don't know if (a) there are some libraries that do glitch on playback but so far not found any and (b) if there is something I am doing differently.

By the way also tried recording notes and CC data together and still plays back okay.

I even added a second pass of notes using different articulation and both sets of notes play back with correct articulations.


----------



## cyrilblanc

It is a good idea to have a fast disk reserved for recording


----------



## procreative

cyrilblanc said:


> It is a good idea to have a fast disk reserved for recording



Not sure what that has to do with anything? Glitches I mention are not streaming glitches but incorrect articulations.


----------



## cyrilblanc

The best is to spread the different loads!
Disk access load is important when playing samples.
Where are your samples?
You need to load them, and you write to disk what you record.


----------



## procreative

cyrilblanc said:


> The best is to spread the different loads !
> Disk access load is important when playing samples
> Where are your samples ?
> You need to load them and you write on disk what you record



Sorry but you still are talking about something else, its nothing to do with CPU/RAM/Streaming etc its about the Logic Articulation Sets and how CCs react with Articulation IDs being embedded in them.

It has nothing to do with samples not playing back due to disk streaming.


----------



## cyrilblanc

99% of the problems with Logic are due to a lack of CPU or disk resources!
It is the first thing to check, and I was giving you tracks to look at!
Forget about what I said! :(


----------



## procreative

Accepted, however this thread from the start has been about issues with articulation switching when combined with CCs that have IDs attached to them. 

I was demonstrating that in my experience the issues were confined to recording and once recorded playback was okay.

The issue most have reported is that MIDI CC data, when recorded on a track with an Articulation Set present, has an Articulation ID attached, the same as whichever ID was set on the note recorded with it.


----------



## NoamL

Peter Schwartz said:


> Totally clunky.  You really only want to send Note Offs when required, i.e., when a note is actually released.
> 
> FWIW, there are more scenarios than not where it's totally desirable/expected for notes to overlap, even when they're playing different articulations. To put it another way, it usually produces a poor sonic result if notes and the articulations they sound are forced to be monophonic. Pretty much all plugins/patches will produce the sound of multiple notes (each with a different articulation) simultaneously. CSS is one of the few that balks at this in certain situations, such as when trying to get staccato and legato to sound at the same time, but there's not much that can be done with Scripting to improve that behavior. Anyway, I'd suggest developing your script so that composers aren't forced to be exacting in terms of when notes end unless that's absolutely essential to what you're trying to accomplish.



Yes, I understand not to invoke monophonic unless it's needed (e.g. simulating legato mode for libraries that don't have legato).

Here's what I'm using right now:



Code:


if (event instanceof NoteOn) {

    // a whole bunch of stuff!!!!!

    // sending the Note On here

    // then update two previously declared GLOBAL variables
    // (no "var" here, or we'd just create new local copies):

    UnkilledPitch = SentPitch;
    UnkilledChannel = SentChannel;
}

else if (event instanceof NoteOff) {

    /* If we receive a NoteOff event, first we take NewNoteOff (initialized as a global var) and give it the right pitch: */

    NewNoteOff.pitch = event.pitch;
    NewNoteOff.velocity = 64;

    // Now we need to know what channel to send it on. Very hopefully:

    if (NewNoteOff.pitch == UnkilledPitch) {
        NewNoteOff.channel = UnkilledChannel;
        UnkilledPitch = null;
        UnkilledChannel = null;
        NewNoteOff.sendAfterMilliseconds(WhateverDelay);
    }

    // but if not, we're shooting blind:

    else {
        UnkilledPitch = null;
        UnkilledChannel = null;
        for (var i = 1; i < 17; i++) {
            NewNoteOff.channel = i;
            NewNoteOff.sendAfterMilliseconds(WhateverDelay);
        }
    }
}


pretty gross! (and excuse my formatting, it's not translating well from SublimeText) Is there a better method?


----------



## NoamL

Thinking about it more carefully, the above method fails instantly when you have simultaneous NoteOns like staccato chords.

I'm guessing the real answer is to have an array of unkilled notes and, as the last step of NoteOn handling, push an UnkilledNote object into the array with UnkilledNote.pitch and UnkilledNote.channel?
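Something like this, sketched with made-up names: each NoteOn pushes a {pitch, channel} record, and each NoteOff claims the oldest matching record, so simultaneous NoteOns (staccato chords) each get their own entry.

```javascript
var unkilledNotes = []; // records of notes still sounding

function rememberNoteOn(pitch, channel) {
    unkilledNotes.push({ pitch: pitch, channel: channel });
}

// Returns the channel the matching NoteOn went out on, or null when the
// NoteOff arrives for a pitch we never saw (the "shooting blind" case).
function claimNoteOff(pitch) {
    for (var i = 0; i < unkilledNotes.length; i++) {
        if (unkilledNotes[i].pitch === pitch) {
            var channel = unkilledNotes[i].channel;
            unkilledNotes.splice(i, 1); // the note is now killed
            return channel;
        }
    }
    return null;
}
```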


----------



## Dewdman42

Pretty much, yeah. Keeping track of notes and note-offs is one of the hardest things to do. It's all doable, but it's not simple. Some of the example scripts supplied by Apple get into that; you can see what they did.


----------



## procreative

Can anybody tell me which libraries are causing playback issues? So far I have not found any that don't work with Articulation Sets and MIDI CC. I'd like to test them (if I have them), as I'm stumped as to why I'm not getting the same issues.


----------



## Peter Schwartz

Keeping track of notes isn't that hard to do. Here's a simple Script that will detect Note On & Off messages. The Script creates an array (active_notes), with 128 elements (one for each of the 128 possible MIDI Notes, numbered from 0 - 127).

Example: when a Note On of C3 (MIDI Note #60) comes in the door, array element #60 will be given a value of 1. When the Note Off for that is detected, array element #60 will be set to zero. You can then test the array elements for the presence of a 1 or a 0 for each note.

var active_notes = new Array(128); // there are 128 possible MIDI notes

for (var g = 0; g < 128; g++) { // this initializes all the values in the array to zero
active_notes[g] = 0;
}

var non, noff; // variables which will be used as flags for incoming note on and note off msgs

function HandleMIDI(e) {
non = e instanceof NoteOn && e.velocity > 0;
noff = (e instanceof NoteOn && e.velocity == 0) || e instanceof NoteOff;

if (non) {
active_notes[e.pitch] = 1; // if a note is turned on, set the array element to 1
}

if (noff) {
active_notes[e.pitch] = 0; // if a note is turned off, set the array element to 0
}

} // (End of HandleMIDI function)

Of course, this code can be condensed down to a smaller form, but this lays it all out very step-by-step.
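For reference, a condensed form might look like the sketch below. The two stand-in constructors at the top are only there so it runs outside Logic (Scripter provides the real NoteOn/NoteOff); delete them when pasting into Scripter:

```javascript
// Stand-ins for Scripter's built-in event classes, for testing outside Logic.
function NoteOn(pitch, velocity) { this.pitch = pitch; this.velocity = velocity; }
function NoteOff(pitch) { this.pitch = pitch; this.velocity = 0; }

var active_notes = new Array(128).fill(0); // one slot per MIDI note, all off

function HandleMIDI(e) {
  // A NoteOn with velocity 0 counts as a NoteOff ("running status" style).
  if (e instanceof NoteOn) active_notes[e.pitch] = e.velocity > 0 ? 1 : 0;
  else if (e instanceof NoteOff) active_notes[e.pitch] = 0;
}
```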


----------



## Dewdman42

Keeping track of notes that are sustaining is not hard, but needing to mess around with delaying notes and delaying the corresponding note-offs, etc., is where it gets more complicated.


----------



## NoamL

@Peter Schwartz that's brilliant! 

@Dewdman42 NoteOff's don't need to be delayed though, right? When I have zero-latency mode on, I'm passing all NoteOffs in real time. And when the playback mode is on, I'm passing them with sendAfterMilliseconds(MasterDelay); (500ms in my script)


----------



## Dewdman42

NoamL said:


> @Peter Schwartz that's brilliant!
> 
> @Dewdman42 NoteOff's don't need to be delayed though, right? When I have zero-latency mode on, I'm passing all NoteOffs in real time. And when the playback mode is on, I'm passing them with sendAfterMilliseconds(MasterDelay); (500ms in my script)



If you are talking about the keyswitch notes, then no... the note-off can be a few milliseconds after the NoteOn. If you're talking about the articulations you are delaying, then yes, you need to delay the matching NoteOff by the same amount you are delaying their NoteOn.


----------



## NoamL

Wait, not sure I understand that. In my CSS script I'm adding a positive delay of 200, 300 or 400 ms to notes (which, against the negative track delay of -500, turns into -300, -200, -100). Are you saying that NoteOffs of these notes also need to be delayed by a velocity-sensitive amount, 200/300/400? I don't understand why that would be the case. Shouldn't NoteOff's be triggered just where their graphical position in the piano roll is? i.e. add 500ms delay so that it cancels out to 0 with the negative track delay?


----------



## NoamL

By the way I also wanted to thank you for spurring me to learn more about object oriented programming. I've read up on it and it's so powerful! Moving all the library-specific scripts to object methods (where CSS_Cellos, CSS_Violins, AdventureStrings_Cellos etc are objects) is much cleaner and smarter than my stupid, VisualBASIC "if else ladder" way of programming!
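The shape of that refactor might look something like this sketch: each library object exposes the same method, so the main handler does a lookup by name instead of walking an if/else ladder. The library names echo the ones above, but the keyswitch offsets here are invented placeholders, not the real CSS or Adventure Strings mappings:

```javascript
// One object per library; all expose the same keyswitchFor() method,
// so dispatch is a property lookup rather than an if/else ladder.
var libraries = {
  CSS_Violins: {
    keyswitchFor: function (articulationID) { return 24 + articulationID; } // placeholder offset
  },
  AdventureStrings_Cellos: {
    keyswitchFor: function (articulationID) { return 12 + articulationID; } // placeholder offset
  }
};

// The main handler only needs the library's name.
function keyswitchFor(libraryName, articulationID) {
  return libraries[libraryName].keyswitchFor(articulationID);
}
```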


----------



## Dewdman42

Yes, absolutely; otherwise the notes that are delayed a lot will have a shorter duration than originally intended.
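The arithmetic behind this is easy to check. If a NoteOn is delayed by one amount and its NoteOff by another, the sounding length changes by the difference; matching delays preserve it. A tiny helper (names are mine) makes the cases concrete:

```javascript
// writtenLengthMs: the note length as drawn in the piano roll.
// onDelayMs / offDelayMs: delays the script adds to the NoteOn and NoteOff.
// The sounding length changes by (offDelayMs - onDelayMs).
function soundingLength(writtenLengthMs, onDelayMs, offDelayMs) {
  var start = onDelayMs;                 // when the note actually begins
  var end = writtenLengthMs + offDelayMs; // when it actually ends
  return end - start;
}
```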


----------



## Dewdman42

Regarding OOP in JavaScript, I just want to say that JavaScript is an extremely flexible language and can do some pretty cool stuff that you actually can't do with C++, Java, and other object-oriented languages that force certain OOP principles on you. Actually, in JavaScript you kind of have to jump through some hurdles to use traditional OOP practices in some cases.

I have found that it's better to keep the script as simple as possible. Sometimes that means using objects: as you noted, big sections of if/then statements are embarrassing code to share with knowledgeable OOP programmers. But on the other hand, a lot of if/then statements are easy for typical musicians to understand, so if you're going to share the script with other musicians, there is a lot to be said for avoiding tricky OOP stuff.

Some simple stuff is good and a great use of JavaScript. You can create objects on the fly in JavaScript and change their schema on the fly as you wish. That can definitely be very handy and easy enough to understand. When you start getting into inheritance and all that, it starts to be a bit academic, and I have found the code becomes a lot longer than it needs to be, and probably not understandable at all by most musicians using LPX. That kind of OOP programming is more useful if you have code that is going to be reused a lot in a framework of some kind. In one-off scripts like Scripter's, doing a lot of proper OOP stuff just to be proper doesn't really gain that much.

But generally, I agree with you: if you can avoid some crazy if/then logic with some simple and smart organization of objects, then ultimately it will be easier to maintain the code. Just remember, most musician scripters will not understand polymorphism AT ALL, and it will be very hard for them to understand OOP code using that approach in JavaScript.
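To illustrate the on-the-fly point: a JavaScript object's shape isn't fixed by a class declaration; you can add and remove properties at any time. The `patch` object here is an invented example:

```javascript
// Start with a bare object, no class or schema declared anywhere.
var patch = { name: "Violins 1" };

patch.channel = 3;            // add a property after creation
patch.keyswitches = [24, 25]; // nested data, added just as easily
delete patch.channel;         // and remove a property again
```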


----------



## NoamL

Hi @Peter Schwartz , I updated your code a bit. This works within code that sends MIDI on different channels (for instruments like Adventure Strings where you need to load multiple NKIs in the same Kontakt instrument and have them listen to different channels).

Thanks again for giving me the core idea!



Code:


var ChannelTracker = new Array(128);
for (var g = 0; g < 128; g++) {
  ChannelTracker[g] = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]; // 16 slots; index 0-15 ~> channel 1-16
}

function NoteOn(pitch, channel) {
  //after calculating and sending the note
  ChannelTracker[pitch][channel - 1] = 1;
}

function NoteOff(pitch) {
  for (var i = 0; i < 16; i++) {
    if (ChannelTracker[pitch][i] == 1) {
      //NoteOffchannel = i + 1;
      //NoteOffsend;
      ChannelTracker[pitch][i] = 0;
    }
  }
}


----------



## Peter Schwartz

Hi @NoamL,

I just saw your post (I've been away for a few days). Though I haven't tried it out, I have just one small suggestion for you...

The names "NoteOn" and "NoteOff" should be treated as reserved words so that they can't be confused (or generate conflicts). That's because these terms are already defined/reserved by Logic's JavaScript -- as found in these typical implementations:

var generated_note_from_my_script = new NoteOn;

...or...

if (event instanceof NoteOn == blah blah blah)...
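So a safer version of the tracker above would simply rename the two functions; prefixing with a verb like "handle" is one easy convention. A sketch (same logic as the earlier post, only the names changed, and the NoteOff handler returns the channels to send on):

```javascript
// handleNoteOn/handleNoteOff can't shadow Scripter's built-in
// NoteOn/NoteOff event constructors.
var channelTracker = [];
for (var g = 0; g < 128; g++) channelTracker.push(new Array(16).fill(0));

function handleNoteOn(pitch, channel) {
  channelTracker[pitch][channel - 1] = 1; // channels are 1-16, indices 0-15
}

// Returns every channel this pitch is currently sounding on,
// clearing the tracker as it goes; send a NoteOff on each one.
function handleNoteOff(pitch) {
  var channels = [];
  for (var i = 0; i < 16; i++) {
    if (channelTracker[pitch][i] === 1) {
      channels.push(i + 1);
      channelTracker[pitch][i] = 0;
    }
  }
  return channels;
}
```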


----------



## Craig Allen

Dewdman wrote this almost a year ago.
I am wondering if the scenario has changed. I'm about to give Logic a whirl, but just picked up Cubase as well. Maybe my time is better spent there with Expression Maps...
Can anyone confirm current status?



Dewdman42 said:


> I feel that Scripter is a good place to handle Articulation ID issues in LPX, because the Articulation Set feature from Apple is missing some crucial functionality to make it truly useful in all but the most simple situations. The Scripter plugin is just about the only place you can access the Articulation ID in LPX and turn it into combinations of key switches and channelizing, etc..


----------



## Dewdman42

As far as I'm concerned, yes. I bought Cubase since then and fooled around with Expression Maps for a while to see what the fuss is all about. Expression Maps in Cubase have some pros and cons, but I think LogicPro still has the better solution because of Scripter. There are some annoying limitations with Expression Maps in Cubase, and LogicPro has bugs in the ArticulationSet feature, so there simply is no perfect solution right now.


----------

