# Logic Scripter—multiport track delay compensation?



## Kent (Aug 12, 2020)

As the native track delay compensation feature is incredibly limited on MIDI tracks in Logic (i.e., -99 to +99 ticks, whose real-time duration changes with tempo), I am trying to brainstorm solutions that work with MIDI tracks connected to an AU3 VEP track. (Obviously the native feature works fine with Instrument and Audio tracks, but that's not what I'm talking about here.)

Does anyone know if it would be possible (even if complex) to use the Scripter, probably set on the AU3 VEP instrument track, to set per-channel/per-port negative delays for the MIDI channels which feed into it? Or would it be agnostic at that point in the chain?

@Dewdman42 @NoamL etc. have you tried anything like this?


----------



## NoamL (Aug 12, 2020)

you can't set it to Milliseconds? (click on the word Delay itself)


----------



## Kent (Aug 12, 2020)

NoamL said:


> you can't set it to Milliseconds? (click on the word Delay itself)


That is correct. It does allow selection of ms, but it doesn't actually do anything...


----------



## Kent (Aug 12, 2020)

Well...this is interesting.

The Auto Delay Compensation is now...working? It was greyed-out before, on my old rig and also in this very template. I'll have to dig into why that is.

But: as long as it's working, I should be good to go.

The specific ms track delay _is_ broken though. Take this moment from a 100% quantized Damage 2 performance print (some RR, so the waveforms are a bit variable, but the transient timing is dead-on):

NO DELAY: A bit behind the beat.

AUTO DELAY: Nicely aligned!

-99 Ticks (@ 120 BPM): As expected, rather in front of the beat.

-250.0 ms: Exactly where the "no delay" print sits, though we would expect it considerably to the left...


----------



## jonnybutter (Aug 12, 2020)

Track delay has been screwed up for a long time in Logic. Not 100% broken, but goofy, buggy, etc. Sure they'll get around to it some eon


----------



## Dewdman42 (Aug 12, 2020)

you can also use expert sleepers latency fixer plugin to accomplish negative track delay by samples or by milliseconds.


----------



## MGdepp (Aug 12, 2020)

jonnybutter said:


> Track delay has been screwed up for a long time in Logic. Not 100% broken, but goofy, buggy, etc. Sure they'll get around to it some eon


Why fix that when you can add some features borrowed from Ableton Live instead ...?


----------



## Dewdman42 (Aug 12, 2020)

PS: Scripter can't set negative track delay. You can apply positive track delays, but no meaningful negative delays, because there is no way for Scripter to report latency to the host.
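To make the constraint concrete, here is a tiny sketch in plain JavaScript (the function name is illustrative, not a real Scripter API): a plugin that cannot report latency to the host can only schedule events at or after "now", so any requested negative delay effectively clamps to zero.

```javascript
// Illustrative only: models why Scripter delays are positive-only.
// A plugin that cannot report latency to the host cannot hand events
// to the host early, so a requested negative delay clamps to 0.
function effectiveDelayMs(requestedMs) {
    return Math.max(0, requestedMs);
}

// A positive delay is honored as-is...
console.log(effectiveDelayMs(50));  // 50
// ...but a negative one cannot move the event earlier.
console.log(effectiveDelayMs(-30)); // 0
```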


----------



## jonnybutter (Aug 12, 2020)

MGdepp said:


> Why fix that when you can add some features borrowed from Ableton Live instead ...?



I actually like a lot of the new features in Logic. And older features. LPX is an incredible application




Dewdman42 said:


> you can also use expert sleepers latency fixer plugin to accomplish negative track delay by samples or by milliseconds.



Thanks Dewdman


----------



## babylonwaves (Aug 13, 2020)

kmaster said:


> the specific ms track delay _is_ broken though. Take this moment from a 100% quantized Damage 2 performance print (some RR, so the waveforms are a bit variable, but the transient timing is dead-on):



@kmaster 

i don't think the track delay parameter is broken at all. it works fine and as expected here. one thing worth noting that might be confusing: you can have a delay in ticks and one in milliseconds at the same time.

also, try not to use a sample library to make those tests, because you don't know how the sample is cut and the instrument is setup etc. try using a synth and e.g. a sinewave or a short noise burst.


----------



## Kent (Aug 13, 2020)

babylonwaves said:


> @kmaster
> 
> i don't think the track delay parameter is broken at all. it works fine and as expected here. one thing worth noting that might be confusing: you can have a delay in ticks and one in milliseconds at the same time.
> 
> also, try not to use a sample library to make those tests, because you don't know how the sample is cut and the instrument is setup etc. try using a synth and e.g. a sinewave or a short noise burst.


This is from a MIDI track? Not an Instrument track?


----------



## babylonwaves (Aug 13, 2020)

kmaster said:


> This is from a MIDI track? Not an Instrument track?


it's a VI (instrument) track


----------



## Kent (Aug 13, 2020)

babylonwaves said:


> it's a VI (instrument) track


Yeah, that's unfortunately not what I'm talking about then. The Instrument track delay functionality is fine.


----------



## Kent (Aug 13, 2020)

babylonwaves said:


> what are you talking about then? i thought you were using Kontakt?


I'm talking about a MIDI track (in this case, one of a MIDI Multi instrument) connected to an Instrument (in this case, a VEP server port Instrument—but it could just as easily be a natively-hosted Kontakt, or whatever). 

Hope that makes more sense.


----------



## babylonwaves (Aug 13, 2020)

kmaster said:


> I'm talking about a MIDI track (in this case, one of a MIDI Multi instrument) connected to an Instrument (in this case, a VEP server port Instrument—but it could just as easily be a natively-hosted Kontakt, or whatever).


sorry, i didn't remember that part. ... you wrote it in your first post. I can't think of a way to test this really quick but if it's not working at all, you should get in touch with Apple. they should know.


----------



## Kent (Aug 13, 2020)

@babylonwaves they do know, it's a fairly well-documented 'bug'...but it can't hurt to add one more complaint to the pile. Just looking for another solution in the meantime...


----------



## Kent (Aug 13, 2020)

Oh! @babylonwaves you did give me an idea that seems to solve one of my problems:

Making the Klopfgeist a track in the Arrange window, and then giving it a positive track delay in ms (it's an Instrument track, so this works), compensates for whatever delay my system is not picking up—looks like it comes to about 22 ms give or take in my case. Now, parts I play in both look and sound "correct" against the grid 

Hopefully this isn't too premature, but it seems like I've stumbled upon a good idea!

@Dewdman42 would Latency Fixer do the same thing?


----------



## Ashermusic (Aug 13, 2020)

I kicked those old MIDI instruments to the curb a long time ago and can’t imagine myself ever using them again.


----------



## Kent (Aug 13, 2020)

There is another possibility here:

My years of marching clarinet have given me a tendency to play considerably, though very consistently, ahead of the beat.

The plot thickens!


----------



## Dewdman42 (Aug 13, 2020)

I have had my doubts about sample accuracy when using environment instruments. I haven’t tested it though.

if the latency is consistent then you can use Latency Fixer. It basically just sits in a mixer strip and reports latency without adding any. So if the environment is adding latency, you just report that amount with Latency Fixer and the PDC engine will do the rest.

I doubt, though, that the environment is adding significant latency; it's the instrument you are using that is doing it. It's possible that using the environment is blocking PDC from doing what it needs to do, in which case Latency Fixer would be blocked also.

why do you feel you need to use the environment approach?


----------



## Kent (Aug 13, 2020)

Dewdman42 said:


> I have had my doubts about sample accuracy when using environment instruments. I haven’t tested it though.
> 
> if the latency is consistent then you can use latency fixer. It basically just sits in a mixer strip and reporte latency without adding any. So if the environment is adding latency then you just report that amount with latency fixer and the pdc engine will do the rest.
> 
> ...


I seem to get the same issues when it's a new blank project with, for example, a Sampler piano—I'm consistently and evenly ahead of the beat unless I consciously react late. So I am not so sure it is a particular instrument that causes it.

The 'Environment approach' allows me to: 
1. add MIDI transformers between the track and the instance so that I can control each instrument the exact same way on the front end
2. have a clear separation between MIDI out and Audio in, which makes it easier to compartmentalize creative tasks
3. take advantage of the default MIDI Volume option on the MIDI track inspector (don't need this for instruments I have articulation sets for, but it's useful for "simple" instruments like drum patches)
4. align in visual/functional style with the way I access my external synths
5. send more MIDI channels to the same VEP instance (if there is a way to do this in the "LPX Approach," I don't think I know about it)
6. avoid the kludgy feel of the "LPX Approach" (i.e., either add stuff through the New Track Wizard, which makes weird and confusing things happen in the Auxes, or make Auxes into track 'instruments,' which AFAIK don't operate too independently from each other). I'm honestly not sure whether those methods are full of bugs or just odd design choices, but I try to stay away from them if I can...


----------



## Dewdman42 (Aug 13, 2020)

please post a project using built in instruments that does what you say so I can check it out...


----------



## Dewdman42 (Aug 13, 2020)

In the meantime....



kmaster said:


> 1. add MIDI transformers between the track and the instance so that I can control each instrument the exact same way on the front end



I hear you. The alternative is to use Scripter without the environment



> 5. send more MIDI channels to the same VEP instance (if there is a way to do this in the "LPX Approach," I don't think I know about it)



You definitely don't need environment for this. With VePro.AU2 you can send up to 16 instrument tracks to one VEP instance. With VePro.AU3 you can send up to 127 source tracks to one VePro instance... Here are some templates I made for AU3 you can try out that do it:

https://gitlab.com/dewdman42/Logic-VEP-MultiPort-templates
Manual instructions for how to create a template like that are here:

https://www.logicprohelp.com/forum/viewtopic.php?f=9&t=143416




> 6. avoid the kludgy feel of the "LPX Approach" (i.e., either add stuff through the New Track Wizard, which makes weird and confusing things happen in the Auxes, or make Auxes into track 'instruments,' which AFAIK don't operate too independently from each other). I'm honestly not sure whether those methods are full of bugs or just odd design choices, but I try to stay away from them if I can...



LPX has 3 fundamental ways to handle multi-timbral instruments such as the VePro plugin, each with pros and cons. Here is a post I made on this subject a few years ago:

https://www.logicprohelp.com/forum/viewtopic.php?f=1&t=132635&p=697195&hilit=multi+timbral#p697195


I tend to find myself using the current Apple-endorsed method, which is basically the approach used by the New Tracks Wizard, though you don't need the Wizard to set it up. You do have to kind of understand how it works, and once you do, it won't seem that kludgey. One of the downsides is the linked volume slider, which annoys many people; it used to annoy me too, but it is that way for a sensible reason and is unlikely to change. I just disable the volume slider (and mute/solo, for the same reason) in my track headers and use the mixer for adjusting stuff like that instead.

See the above forum post about the pros and cons of multi-timbral approaches, all of the approaches have some pros and cons, including the classic old school method you are using. It may be that you have to live with some PDC issues, but I don't want to say for sure that is actually happening without more testing. It may be that PDC is not 100% in some way with the old school approach, either because playback is off or the way the midi is recorded to the track is not quite right, but if you have a simple project that demonstrates it, I'll check it out and try to confirm that.


----------



## Kent (Aug 13, 2020)

Well, I’m willing to re-examine how I do things, since my new rig is much more powerful than my old one and might like certain setups more than others.

for example, my old system hated Playback and Live mode...but my new system hates Playback mode (unless I constrain to 8 or fewer threads, but what’s the fun in having 20 then?)


----------



## Kent (Aug 13, 2020)

Dewdman42 said:


> In the meantime....
> 
> 
> 
> ...


There aren’t too many Scripter-specific resources, are there? It’s basically JS, right? But I’m sure there are some Logic-specific things worth knowing about. I wouldn’t mind having more granular control over things like articulation timings...


----------



## Kent (Aug 13, 2020)

Any good resources you know of specifically for Logic Scripter, @NoamL ? or did you just create Thanos, etc. from your existing JS knowledge?


----------



## NoamL (Aug 13, 2020)

Yes, I just taught myself javascript. Recommend https://www.w3schools.com/js/default.asp and look at some of the default scripts that come with Scripter to learn the things that are specific to the LogicX implementation of it. I found it much easier to understand than the Environment!!

There's not any pro software engineering going on under the hood of Thanos, it only uses if-then statements, switches, very simple use of functions, a few arrays, and a bit of math.


----------



## NoamL (Aug 13, 2020)

The official Javascript reference is good too. Start with functions

Functions - JavaScript | MDN (developer.mozilla.org)

since the bare bones of what the LogicX Scripter does is take some incoming MIDI event and apply a Function to it


----------



## Dewdman42 (Aug 13, 2020)

There are a few resources on the web, but really most people just figure it out through trial and error. In the Logic Pro Help menu there is a user manual for "Effects" where you can find a section on Scripter that explains it almost completely, except for a few hidden features you don't need to know to get started. There are approaches that work which aren't explained in the manual, but for a lot of simple use cases it doesn't really matter.

I'm sure if you dig in, try some simple things step by step, and ask for help here, I will be happy to help, as will others, I'm sure. I also find Scripter to be way more straightforward than the Environment. The Environment is handy for filtering things BEFORE the MIDI hits the sequencer and for certain kinds of tasks, but I usually just avoid it.


----------



## Kent (Aug 14, 2020)

Dewdman42 said:


> In the meantime....
> 
> 
> 
> ...


Testing out the new track wizard method, delays in ms DO work....BUT it seems that the whole "Instrument" (or at least the whole Port—haven't tested further yet) takes whichever track delay is set last.

In other words, if I have Inst 1 Port 1 Ch1 (let's call this Flute) and Inst 1 Port 1 Ch2 (let's call this Oboe) and set Flute to -126.5ms, both Flute and Oboe play back at -126.5ms. If I then set Oboe to +79ms, _even if I do not also change the Flute offset_, both Flute and Oboe play back at +79ms.

Am I just misunderstanding how to properly use this method? It seems like this is a bigger problem than only being able to offset MIDI events by ticks...


----------



## Dewdman42 (Aug 14, 2020)

You appear to be correct. Make sure to submit a bug report to Apple about this. 

And by the way if you use expert sleepers latency fixer you will have the same problem.

so basically I can surmise that the old school midi tracks are the only kind that are truly considered midi tracks by logicpro, and the track delay parameter in that case is a midi parameter (that's why it's only in midi ticks), but since it's an actual midi delay, it's separate for each source midi track.

with the new tracks wizard way, that parameter appears like a track delay but it's actually a mixer channel attribute, I guess, which is why you can use ms. It is basically one setting for the whole mixer channel, for a similar reason that the track volume slider and mute/solo affect the whole channel.

(sigh)

there is still a way to get ms delay per track but you’re not gonna like it.

basically what you do is set the track delay to a large negative value (or use Latency Fixer to do that), which sets one global negative delay for all tracks hitting that one VEP instance. But then you use Scripter to add positive delay back to each midi port/channel appropriately. In some cases you add the same amount of positive delay as your global negative delay setting, to net zero. In other cases you add less positive delay, which results in net negative delay for only that one midi port/channel.

note that scripter itself is global for the whole vepro instance so you have to write it all in one script, handling each port and channel as you wish.

most likely it will be easier using old school tracks and just fudging around with tick-based negative delay until it's close enough.

the advantage of the Scripter approach is that you also have the possibility to look at articulationID and set the net negative delay on a per-articulation basis, in addition to per port/channel. But you have some stuff to learn about Scripter before you will get that done.

This brings us back to trying to figure out why your timing is off with the old school tracks. Do you have a simple LPX project that demonstrates that issue?


----------



## Dewdman42 (Aug 14, 2020)

In case you're feeling adventurous...here is a simple Scripter example which illustrates the possible solution I mentioned above. You can expand or tweak it to your own purpose, or at least learn a little about Scripter in the process...

In this first example, port 1, channel 1 will be sent with a 23ms delay; port 1, channel 2 with a 54ms delay; and all other ports/channels with a 60ms delay.


```
function HandleMIDI(event) {

    // port 1
    if(event.port == 1) {
     
        // channel 1
        if(event.channel == 1) {
            event.sendAfterMilliseconds(23);
            return;
        }
     
        // channel 2
        if (event.channel == 2 ) {
            event.sendAfterMilliseconds(54);
            return;
        }
    }
 
    // all other channels
    event.sendAfterMilliseconds(60);
}
```

So in the above example, if you were, for example, to set the singular-track-delay parameter to -60ms, then channel 1 would be sent 60 - 23 ms early (37ms early) and channel 2 would be sent 6 ms early. The other unconfigured channels would be sent net zero.

Obviously you can copy and paste the *IF* blocks to configure as many port/channels in the midi as you want.
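The net-delay arithmetic above can be sanity-checked with a small helper (plain JavaScript, nothing Scripter-specific; `globalEarlyMs` is the magnitude of the negative track delay, e.g. 60 for a -60ms setting):

```javascript
// Net "early" amount for one port/channel: the global negative track
// delay pulls everything early, and the Scripter delay pushes it back.
function netEarlyMs(globalEarlyMs, scripterDelayMs) {
    return globalEarlyMs - scripterDelayMs;
}

console.log(netEarlyMs(60, 23)); // 37 -> channel 1 plays 37ms early
console.log(netEarlyMs(60, 54)); // 6  -> channel 2 plays 6ms early
console.log(netEarlyMs(60, 60)); // 0  -> unconfigured channels net zero
```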


If you wanted to handle per articulation as well, that would be like this...


```
function HandleMIDI(event) {

    // port 1
    if(event.port == 1) {
     
        // channel 1
        if(event.channel == 1) {
            event.sendAfterMilliseconds(23);
            return;
        }
     
        // channel 2
        if (event.channel == 2 ) {
     
            if(event.articulationID == 1) {
                event.sendAfterMilliseconds(40);
                return;
            }
            if(event.articulationID == 2) {
                event.sendAfterMilliseconds(30);
                return;
            }
         
            event.sendAfterMilliseconds(60);
            return;
        }
    }
 
    // all other channels
    event.sendAfterMilliseconds(60);
}
```

In the above example, channel 2 has articulationID handling: articulation ID 1 ends up 20ms early, articulation ID 2 ends up 30ms early, and any other articulations on channel 2 would be net zero (assuming track delay is set to -60ms).

Hope that makes sense.

_I should note that with the articulationID stuff there can be other complications, such as how to handle CC and pitchbend events when you have different articulations with different amounts of delay; non-note events generally don't have an articulationID, for example, so it can get a fair bit more complicated to manage all that in Scripter, but it's all doable. And yes, this is just like how Thanos does it, but Thanos takes care of those extra complexities and is specifically hard-coded with all the correct delay values for CSS, ready to roll._

There are more elaborate ways in Javascript to avoid a humongous list of *IF* statements: you can take a more data-driven approach, which is easier to tweak and modify than copying and pasting. But it's more complicated, and the above approach is best for starting out with Scripter.
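For a taste, here is a sketch of what such a table-driven version could look like (an illustration using the same example delay values as above, not tested in Scripter):

```javascript
// Sketch of a data-driven version: per-port/channel delays live in one
// lookup table instead of a chain of IF blocks.
var DEFAULT_DELAY_MS = 60;
var DELAYS_MS = {
    "1:1": 23,   // port 1, channel 1
    "1:2": 54    // port 1, channel 2
};

function delayFor(port, channel) {
    var key = port + ":" + channel;
    return (key in DELAYS_MS) ? DELAYS_MS[key] : DEFAULT_DELAY_MS;
}

// In Scripter, HandleMIDI then collapses to a single line:
// function HandleMIDI(event) {
//     event.sendAfterMilliseconds(delayFor(event.port, event.channel));
// }
```

Adding or changing a port/channel is then a one-line edit to the table rather than another copied *IF* block.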


----------



## Kent (Aug 14, 2020)

Dewdman42 said:


> You appear to be correct. Make sure to submit a bug report to Apple about this.
> 
> And by the way if you use expert sleepers latency fixer you will have the same problem.
> 
> ...


Figured as much!



Dewdman42 said:


> there is still a way to get ms delay per track but you’re not gonna like it.
> 
> basically what you do is set the track delay to a large negative value ( or use latency fixer to do that) that will set one global negative delay for all tracks hitting that one vepro instance. But then you will have to use scripter to add positive delay back to each midi port/channel appropriately. In some cases you add the same amount of positive delay as your global negative delay setting, to net zero. In other cases you add less positive delay which results in net negative delay for only that one midi port/Chan.
> 
> ...



I was actually thinking through this last night, before I discovered this "feature" this morning. I think this is the way I'd go...sort of like a meta-mega-Thanos 



Dewdman42 said:


> This brings us back to trying to figure out why your timing is off with the old school tracks. Do you have a simple LPX project that demonstrates that issue?



So there are actually two issues here, and I'm pretty sure they are these:

1. I just naturally play ahead of the beat, probably due to years of playing a slower-speaking bass instrument in concerts and being further outfield in marching band—I feel like I'm lagging unless I'm ~20ms ahead of the beat. I do not think this is a computer issue. (Maybe I just need to get a giant control room with all the spare real estate and studio-building money I have lying around and sit 20 feet from my monitors? hah!)

2. I want to be able to adjust the negative track delay on instruments so they can be quantized to the grid but sound in time. Virtual instruments, by necessity, "play" slightly after the note on.

I do think that these two things are interacting, which is why I'm exploring potential solutions...but I really don't think my computer is "wrong." I'm just as ahead of the beat in a fresh empty project with an EXS Piano as I am in a project with 500+ MIDI instrument instances routed through four or five layers of auxes and effects.


----------



## Dewdman42 (Aug 14, 2020)

see my above javascript examples in case you missed it.

Back to your present situation though, I would really like to understand if using the old school midi tracks is messing with the timing. There are generally two things to worry about, and believe me, LPX does have some timing related bugs, especially when it comes to things like external midi and stuff like that, so I would not put it past it to have a bug in this way, but the big project you mentioned has a lot of other stuff going on which is probably clouding the issue, will comment on that in a sec.

Anyway, the timing issues are twofold:

1. Latency during playback: if Logic detects latency in plugins, it should use Plugin Delay Compensation (PDC) to play exactly on the click.

2. Recording MIDI to tracks: LPX is supposed to be aware of your latency, both from audio devices and from plugins, and do the "right thing" to make sure the events end up on the timeline where you intended. I believe LogicPro goes by the notion that where you heard it is where it should end up on the timeline. In other words, if there was plugin latency causing a delay in sound, it should register the event on the timeline at the point in time when you heard it. During record it calculates the correct position and attempts to do that, taking into account all the latency factors it knows about (knock on wood).

Can it get confused by environment routings in step #2? Entirely possible. And some bugs do exist when using external midi tracks that way.

On top of that, you mentioned 4-5 layers of AUX channels and if you have plugin latency on those tracks it can start to get very confusing.

PDC works in two ways. If you have a latent plugin on an audio or instrument track, then Logic will figure that out during playback and feed the source track to it early, so that the latency is compensated and you hear the audio when you are supposed to hear it. During record it can't feed it early, so you will hear it late.

However, AUX channels and the master output channel are handled differently. If you put plugins with latency on those channels, then Logic basically assumes it's "live" audio that can't be fed early, so it will add delay to all other channels to bring them in sync with the delayed aux channel.

This second way can get confusing if you have a lot of sends going different directions and I have no idea how LPX sorts all that out, but that is basically what its doing.

So when you record tracks with a lot of AUX channels, if any of those AUX channels or the master bus have any plugins with latency, it causes everything to be delayed, including the instrument you are trying to play.. And see point #2 above...in theory it should correct for all that and register the midi event on the track where you HEARD it, but perhaps some use case is not doing that right...or perhaps you are just getting confused by all the PDC happening with your AUX channels.


----------



## Kent (Aug 14, 2020)

Dewdman42 said:


> see my above javascript examples in case you missed it.
> 
> Back to your present situation though, I would really like to understand if using the old school midi tracks is messing with the timing. There are generally two things to worry about, and believe me, LPX does have some timing related bugs, especially when it comes to things like external midi and stuff like that, so I would not put it past it to have a bug in this way, but the big project you mentioned has a lot of other stuff going on which is probably clouding the issue, will comment on that in a sec.
> 
> ...


Yes, but like I said it's the exact same timing on an empty project with a basic instrument, so I'm more inclined to believe it's me than complexity.


----------



## Dewdman42 (Aug 14, 2020)

can you please post a simple empty project with the problem so I can look at it too?


----------



## Dewdman42 (Aug 14, 2020)

Also, when I create a simple project with an old school midi track feeding into environment instrument, which cables to EXS24, the track DOES allow for negative delay by ms. Click on the word "Delay" and you'll see a popup menu to choose the type of delay to use

I see now that this was brought up already earlier in the thread. what do you mean when you say the ms didn't work?

The Auto Compensation is not what I think you need. It is designed to basically mimic what the External Instrument plugin does: it's for when you are connecting raw midi tracks directly to hardware midi ports, so that the sound coming from the external synth will line up with your LPX mixer. Typically the midi needs to be delayed in order to line up with the LPX mixer, which is delayed somewhat by plugin and audio card latency. So this Auto Compensation setting basically adds some delay to midi events to match whatever delay is being caused by the LPX mixer overall, so that in theory, if you were listening to your external hardware through an external hardware mixer (not coming back into LogicPro), it would all sound in sync.

I don't think that's the option you want to try to use in this case, it will just confuse the issue.

The Delay in MS should be working though...??


----------



## Dewdman42 (Aug 14, 2020)

Now tell me how to replicate the audio rendering timing problem you're getting so that I can try to reproduce the problem here.


----------



## Kent (Aug 14, 2020)

Logic Test Projects - Shared with Dropbox (www.dropbox.com)

Here's a very basic Logic project with two tracks, one recorded at 256 and one at 512. I'm more accurate at 512. I really think it's a me thing and not a Logic/computer setup thing, but let me know what you find.


----------



## Kent (Aug 14, 2020)

Dewdman42 said:


> Also, when I create a simple project with an old school midi track feeding into environment instrument, which cables to EXS24, the track DOES allow for negative delay by ms. Click on the word "Delay" and you'll see a popup menu to choose the type of delay to use
> 
> 
> 
> ...


Okay:
1. Quantize a MIDI event or series of events to the grid
2. Using no "Delay" selection, print the audio output, via record, onto an audio track (via a bus from the audio return of the MIDI instrument)
3. Using "Delay in Ticks," -99 ticks, print the audio output, via record, onto an audio track (via a bus from the audio return of the MIDI instrument)
4. Using "Delay in Milliseconds," -500.0 ms, print the audio output, via record, onto an audio track (via a bus from the audio return of the MIDI instrument)
5. Compare the accuracy of the prints to the location on the grid

This is what I did here: https://vi-control.net/community/th...t-track-delay-compensation.97100/post-4617761


----------



## Dewdman42 (Aug 14, 2020)

the dropbox link isn't working...

using delay -500ms is working perfectly for me. Rendered audio is exactly 500ms earlier than the midi event.

it seems to work both with and without the auto compensation version (which I don't think you should be using anyway). I'm rather surprised that it came out the same both ways.


----------



## Kent (Aug 14, 2020)

Dewdman42 said:


> the dropbox link isn't working...
> 
> using delay -500ms is working perfectly for me. Rendered audio is exactly 500ms earlier then the midi event.
> 
> it seems to work both with and without the auto compensation version (which I don't think you should be using anyway). I'm rather surprised that it came out the same both ways.


I’ll look at the Dropbox link in the morning and make sure it’s working.

To clarify: You’re getting the -500 ms delay to work on the midi track itself? Not an instrument track or the audio track?


----------



## Dewdman42 (Aug 14, 2020)

babylonwaves said:


> i don't think the track delay parameter it's broken at all. it works fine and as expected here. one thing worth noting that might be confusing: you can have a delay in ticks and one in milliseconds at the same time.



This is very interesting, and thanks for pointing it out; it is probably kmaster's problem. I did find that both delays are only in effect if the ticks one is the currently selected one. If the ms delay is the currently selected one, then only the ms delay is used. Definitely a bug. Everyone report this. It should be consistent in any case, preferably with just the visible one active at a time.


----------



## Dewdman42 (Aug 14, 2020)

Yes, I used a midi track; it connects to an environment object, which is cabled to a mixer strip hosting EXS24 (which by default plays a sine wave sound). The output of the mixer strip goes to Bus 1, which is used as the input for a new audio track which I record to... It's rendered exactly 500ms early. See my last post quoting babylonwaves' comment from earlier in the thread, though; that is probably why you were seeing confusing results.


----------



## Dewdman42 (Aug 14, 2020)

And to repeat myself, I don't think you should be using the Auto compensation mode; that is designed to compensate for your sound card when using external hardware synths, etc.


----------



## Kent (Aug 15, 2020)

Dewdman42 said:


> Yes I used midi track, it connects to environment object, which is cabled to mixer strip, hosting EXS24 (which by default plays a sine wave sound). The output of the mixer strip goes to BUS1, which is used as the input for a new audio track which I record to.... Its rendered exactly 500ms early. See my last post quoting BabylonWaves comment from earlier in the thread though, that is probably why you were seeing confusing results.


That's not what I'm getting at _all. _









(Dropbox link: "Logic Test Projects", shared via www.dropbox.com)

(There are two projects in here. The one we're interested in is *Test-project-MIDI-Delay-200815*.)

In the Notes of the Project, I added what I should expect to find.

In the Notes of the Print Tracks, I added screenshots and explanations of what my settings were per print.

Screenset 1 shows the Notes, the audio file/Track view, the Arrange window, the Track inspector, and the relevant Environment page.

--

The results were these:
1. No delay offset = dead on the grid
2. -99 Ticks = -99 Ticks
3. -500.0 ms = dead on the grid
4. Auto = impossible (option greyed out)
5. -99 Ticks, but -500.0 ms also selected = -99 Ticks
6. -500.0 ms, but -99 Ticks also selected = dead on the grid


Does this project work differently on your system than it does here?


----------



## Dewdman42 (Aug 15, 2020)

Very strange. It worked perfectly for me last night. Today I loaded your project and it didn't work properly, and I couldn't see any reason why not. I tried to create a new one from scratch and now I can't get a new one to work properly either. No idea what I did differently last night that made it work. If I can figure that out, I'll let you know.

It's possible I accidentally had the track delay for the mixer channel set to -500 ms in addition to the track delay for the midi track. The mixer channel delay, by the way, works perfectly now too, but on the midi track it seems to be ignored.

@babylonwaves, do you have any further ideas about it? You said you had it working?

More in a minute... but anyway, yeah, as of right now I can't get a midi track through the environment to respect negative track delay either.


----------



## Dewdman42 (Aug 15, 2020)

Another thing to keep in mind regarding VePro AU3 is that it's not possible to configure the AU3 midi port in the midi instrument environment object either, only the channel. The "Port" attribute that you see on old-school midi tracks is not the AU3 midi port; it refers to external midi device ports. It's confusing the way they are named, but I just wanted to point that out.

That means that in order to use old-school midi tracks with more than 16 tracks in one VePro instance over AU3, you have to do some squirrelly stuff in the environment to route the other ports. All doable, but it's not totally straightforward; one of the templates I shared earlier is set up like that, if you want to see how it's done.


----------



## Dewdman42 (Aug 15, 2020)

Stay tuned for a Scripter script you might be able to use instead of banging your head against the wall with the old-school midi tracks.


----------



## Dewdman42 (Aug 15, 2020)

I found more bugs related to track delay in LogicPro. I guess this is what you said at the very beginning, and I can confirm it's still a problem. When you create multiple tracks going to a single multi-timbral instrument, the same way NewTracksWizard does it, they each have their own track delay setting, but those settings are not honored, and it's totally unclear which one is; it appears to be the last one created, or something strange like that. Basically, it's not reliable and I would not use it.

This brings us full circle now. I think the best way to handle this kind of situation is by using Scripter in conjunction with *Expert Sleepers'* *LatencyFixer* plugin.

LatencyFixer download here: https://www.expert-sleepers.co.uk/legacy_downloads.html

*Instructions*

Create Instrument Track with VePro.AU3 on it


Add Scripter to that channel and add the script shown at the bottom of this post.


You will need to edit the script to set the known latency (in ms) for each port/channel. For example, the following edit would set port 1, channel 1 to 50 ms latency; port 1, channel 2 to 30 ms latency; and so on. _After editing the script, make sure to press the *Run Script* button._


```
var latencyByChannel = [
 [ 50, 30, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 1
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 2
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 3
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 4
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 5
// ... (ports 6 through 48 continue the same pattern)
];
```

Look on the Scripter GUI and you will see a Lookahead value reported, which should be the largest latency you are configuring across all 768 possible midi port/channels.








Add LatencyFixer to an Audio FX plugin slot on the same channel as the VePro AU3 instrument. Configure it to the same value reported as Lookahead. You will have to convert from ms to seconds in this case (divide by 1000).


To add more tracks, use *New Track with Next Channel* command in LogicPro. More information about building large VePro AU3 templates can be found here: https://www.logicprohelp.com/forum/viewtopic.php?f=9&t=143416


```
/*************************************************************
 * Latency Correction By Port/Channel
 * v1.02
 *
 * Specify the KNOWN latency for each port/channel in the array
 * below.  Use in combination with Expert Sleepers LatencyFixer
 * to create enough lookahead to handle the largest reported
 * latency here.
 *
 * Only NoteOn events will be sent early, all others will remain
 * exactly on time as they are in the region.
 *************************************************************/

var latencyByChannel = [
 [ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 1
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 2
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 3
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 4
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 5
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 6
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 7
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 8
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 9
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 10
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 11
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 12
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 13
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 14
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 15
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 16
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 17
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 18
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 19
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 20
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 21
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 22
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 23
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 24
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 25
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 26
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 27
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 28
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 29
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 30
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 31
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 32
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 33
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 34
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 35
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 36
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 37
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 38
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 39
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 40
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 41
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 42
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 43
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 44
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 45
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 46
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 47
,[ 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] // port 48
];

/************************************************************
 * Scan Array to find largest latency and report as lookahead
 ************************************************************/
var lookahead = 0;
for(let port=1;port<=48;port++) {
    for(let chan=1;chan<=16;chan++) {
        if(latencyByChannel[port-1][chan-1] > lookahead) {
            lookahead = latencyByChannel[port-1][chan-1];
        }
    }
}

/******************************************
 * return boolean indicating NoteOn events
 ******************************************/
Event.prototype.isNoteOn = function() {
    return false;
};
NoteOn.prototype.isNoteOn = function() {
   if(this.velocity <= 0) return false;
   else return true;
};

/******************************************
 *HandleMIDI
 ******************************************/
function HandleMIDI(event) {
    if(event.port == undefined || event.port == 0) {
        event.port = 1;
    }
    
    let latency = latencyByChannel[event.port-1][event.channel-1];
    let delay = lookahead - latency;
    
    // Only send NoteOn events early, all others delay for entire
    // lookahead amount.
    
    if(event.isNoteOn()) {
        event.sendAfterMilliseconds(delay);
    }
    else {
        event.sendAfterMilliseconds(lookahead);
    }
}

/***************
 * GUI
 ***************/
 
var PluginParameters = [];
PluginParameters.push({
    type: "text",
    name: "Lookahead = " + lookahead + " ms"
});
```
That's it. It should work as intended.


----------



## Dewdman42 (Aug 15, 2020)

In the meantime, if I get any clarity on how to make track delay work properly with either old-school midi tracks or new-school multi-timbral tracks, I will let you know. But so far, that appears to be pretty buggy in LogicPro when dealing with multi-timbral instruments. :emoji_angry:


----------



## Dewdman42 (Aug 15, 2020)

Oh, also: if you still want to use old-school midi tracks for other reasons, such as the mute/solo buttons, you still can with this approach. Just make sure not to use the track delay for setting negative delay; always use Expert Sleepers to avoid any LogicPro bug confusion. Let me know if you want some guidance setting up a project with old-school midi tracks. Using that approach you can actually have all 768 tracks going into one VePro instance!


----------



## Dewdman42 (Aug 16, 2020)

I've been thinking about this topic of multi-timbral sample latency correction, including per-articulation correction, a bit overnight, and I want to make some additional comments and observations about the issue and some of the common solutions (none of them 100% perfect).

*Goal*

The goal for this conversation is to figure out how to set things up so that you can quantize notes to the grid and not have to nudge them early to compensate for slow-attack samples, or for poorly constructed sample instruments with late sample start times.

*Solutions*

The main solution usually offered is to use negative track delay for each track.


The secondary solution is to use something like LatencyFixer to induce PDC into starting notes early.


In LogicPro, Scripter can be used to shift midi event times on a case-by-case basis, which, when combined with the above two approaches, can possibly get around bugs in LogicPro and/or provide finer-grained control over the correction.

In thinking this through, I find there are pros and cons to each approach. All three have some potential downsides, and some are easier than others to set up. I will comment on each one now.

*Negative Track Delay*

On the surface this is the easiest solution, and the one most often cited. If the DAW supports it, you can determine how latent the sample instrument is (when it isn't reporting plugin latency) and just use a negative track delay setting to correct for it.

Here are the problems I see with this approach:

In the case of LogicPro, we have identified in this thread some buggy behavior with multi-timbral instruments and perhaps with midi tracks where it simply doesn't work as expected.


Having all midi events start early is not really what you want either. As it turns out, *NoteOff* events specifically should NOT start early. The problem we are correcting for is the sound of the note not starting immediately; when you release the key, though, the note ends immediately. So you actually need *NoteOn* events to be sent early and *NoteOff* events to be executed exactly on time.


Other kinds of midi events also need to not be early. For example, *CC*, *PitchBend*, *Aftertouch*, and probably *ProgramChange* events need to land exactly on the timing you expect, not early, and the negative track delay feature would make all of them too early as well. The real problem is that only *NoteOn* events are late and need to be made earlier; all other midi events need to remain exactly on time.

*Expert Sleepers LatencyFixer Plugin*

This free plugin can be used to report latency to the host. Since we can determine that a particular sample instrument is introducing some start-time latency without actually reporting it to the host, LatencyFixer can simply report that latency, and then plugin delay compensation (PDC) will do the rest. Right?

Yes, that is actually an easy solution; no more LogicPro bugs related to the track delay setting. Additionally, this plugin can theoretically be used inside the VePro mixer itself, so that you can report the latency of each instrument in the VePro mixer and VePro will align all the audio there before bringing it back to LogicPro. Hm... that is interesting.

Still some problems...

As with negative track delay, this will cause all other midi events, including *NoteOff*, to be too early. The real problem we need to solve is making only *NoteOn* events start early; all other events need to be exactly on time.


Because you can only put one instance of LatencyFixer on the instrument channel, you get only one global PDC correction for the whole multi-timbral instrument (the entire VePro instance). With negative track delay, at least, if it actually worked right, you could time-align each source track independently. Unfortunately it doesn't work right anyway. (_However, as noted, you can theoretically use LatencyFixer inside the VePro mixer to align each instrument with a different amount of correction._)

*LogicPro Scripter*

A custom Scripter solution seems like perhaps the best way to deal with this issue, since we can delay different midi events by different amounts. Also it potentially provides a way to have a look at *articulationID* and provide different amounts of latency correction per articulation! Right?

There are problems with this approach too.

While Scripter can easily make sure that *NoteOn* events are early and *NoteOff* events are left on time, there are other problems now, because legato instruments often require fairly precise overlapping of a *NoteOff* with the next *NoteOn* in order to establish legato and play the legato transition. So when you have Scripter monkeying around with *NoteOn* and *NoteOff* timing differently from each other, this may or may not cause problems with certain legato instruments.


As noted earlier, *CC* and other event types need to not be early; they typically need to land exactly on time. But what if a *CC* or *PitchBend* event specifically needs to be set before actually sending the *NoteOn* to the instrument for some reason? This could particularly be an issue if you are using *CC* as an articulation switcher, for example. In that case the *CC* event needs to be sent early too, just like the *NoteOn*. See the dilemma? How do you know when to send these other kinds of midi events early or not? Maybe there is a clever solution, but I haven't thought of it yet. You could simply avoid CC articulation switches at all costs, however, which might get rid of this problem well enough.


When attempting to use a different latency per articulationID, it becomes even more complicated: different amounts of latency correction per articulation, but the keyswitches all need to be in front of the notes (including when the switches are CCs), and meanwhile non-switch midi events need to land exactly on time. And don't forget the legato transition issues, especially now that we're dealing with potentially different timing correction per articulation.

*Summary*

So... the bottom line is that this problem can be a bit complicated, and so far I don't see a simple solution that takes care of it perfectly. All of these problems get more complicated when using VePro, PLAY, or Kontakt in multi-timbral modes. You can get fairly close to the right answer, maybe good enough depending on your project, but there are still some interesting little details to be aware of related to LogicPro bugs, non-note midi events needing correct timing, and potential legato transition problems.


----------



## Kent (Aug 17, 2020)

This is really two issues, so I'm going to start a new thread for one of them so they are not discussed at cross-purposes.

Here it is, a discussion of PDC in Logic: 

https://vi-control.net/community/threads/logic-pro—incorrect-plug-in-delay-compensation.97285/


----------



## SupremeFist (Aug 18, 2020)

MGdepp said:


> Why fix that when you can add some features borrowed from Ableton Live instead ...?


Horses for courses but I am loving the new Step Sequencer for my electro stuff. Might be interesting to use for ostinati too. 

I do wish they'd fix the PDC bug with track automation though.


----------



## Kent (Aug 18, 2020)

SupremeFist said:


> Horses for courses but I am loving the new Step Sequencer for my electro stuff. Might be interesting to use for ostinati too.
> 
> I do wish they'd fix the PDC bug with track automation though.


I think that that's a _third_ problem


----------



## MGdepp (Aug 18, 2020)

I am not talking for or against any features added to Logic, Cubase or any DAW, I just find it sad that almost all of them seem to keep adding stuff before fixing things. Regarding the features added: sure! That is up to personal preference! I just picked that example, because IMO Logic tries to integrate that concept of Ableton Live into their own. Some people might like that, but the more a software tries to be everything, the more it may lead to bugs, problems and an overwhelming number of features. 

I could say the same thing about Cubase and many of their recent additions like pitch correction? I would have preferred, had they just implemented ARA for the market leader Melodyne! They did that later, of course, maybe due to many people asking for it. But it shows again, the DAW developers should focus more on what their target audience wants instead of trying to do things, other programs are better at or great plugins exist to buy for.


----------



## Kent (Nov 18, 2020)

Dewdman42 said:


> I found more bugs related to track delay in LogicPro...i guess this is what you said at the very beginning and I can confirm its still a problem. When you create multiple tracks going to a single multi timbral instrument, the same way NewTracksWizard does it, they each have their own track delay setting, but they are not honored and its totally not clear which one is the one that is honored, it appears to be the last one that was last created...or something strange like that...basically...its not reliable and I would not use it.
> 
> This brings us full circle now. I think the best way to handle this kind of situation is by using Scripter in conjunction with *ExpertSleepers,* *LatencyFixer* plugin.
> 
> ...



Admittedly I am a little fuzzy on the exact order of MIDI flow between the Scripter and Articulation Sets, so this might not work... but along these lines, why couldn't we tie an unused CC's value (say, CC111, or whatever works) to a MIDI negative delay amount, and in the Scripter set the ms multiplier and the MIDI handling functionality? That way we don't have to worry about declaring a huge unlabelled array of various values...?

So, for instance, Library A has staccatos on channel 2 that sound 60 ms behind the noteOn, and Library B has staccatos with keyswitch B-2 that sound 80 ms behind the noteOn. In the Scripter, we could say `ms multiplier = 4`, so it's looking 508 ms ahead; then whenever Articulation Set A sends `channel 2, CC111 val 15`, or Articulation Set B sends `B-2, CC111 val 20`, each staccato would be heard at the correct negative delay.

This could even get more granular with MSB/LSB values, but for now...would this even work?


----------



## Dewdman42 (Nov 18, 2020)

Great minds think alike...

I have another script lying around here that does almost exactly what you just described. I never shared it because I felt it might be too complicated for most people to want to mess with. Also, I have since moved on to trying to make a more all-encompassing script that does "everything" in one script, but it's not done and it's considerably more complicated. Anyway, I prototyped something that does exactly what you just described, so that you can embed the CC into the articulation set, just like you said. I can't remember now if I ran into some little problem or if it was 100% working; I will have to go find it and get back to you. I think it was handling both the MSB/LSB possibility (using two CCs) as well as a multiplier, but the MSB/LSB idea was more difficult, I think, because of the way Articulation Sets work.

For one thing, Articulation Sets will not send a key switch repeatedly if Logic believes it has already been sent and hasn't been trumped by a different one; it's not always clear when it will send the CC or not. So I think I may have run into an issue where that uncertainty made it unreliable. But I can't remember now. I will dig up whatever I had and send it your way to try out.

*Sidenote*

The overriding problem with these Scripter solutions is the lack of a GUI. Simple scripts are fine, but when you get into more complicated ones, and particularly with VePro involved, you could have many tracks funneling through one single script on its way to VePro... so then you need a GUI with dozens or hundreds of data points to edit, which is completely infeasible with Scripter's GUI features.

The alternative is to send JSON data to the script, like I did in the above script as the big-ass array. Imagine one that is even bigger, with a lot more data points. Then you use an external GUI editor to edit that structure. That opens up the possibility of much more control and a really nice GUI, but the script itself would have an arcane JSON data structure to edit until such time as a GUI actually exists. :-(

The CC approach here allows the Articulation Set itself to be that GUI... for this particular thing. But it also relies on the articulation set actually sending all those switches, in exactly the right order, etc. For example, with this approach the delay CC has to be the first key switch, because the other key switches also need to be handled with the same negative track delay as the note itself. That's part of why I didn't share it publicly: too complicated to explain all the precise rules people would need to follow in configuring their articulation set to make it all work. Too many ways to get it wrong and create frustration and chaos on the internet about it.

Anyway, let me see if I can find it; I'll PM it to you to try out.


----------



## Dewdman42 (Nov 20, 2020)

Here's a script to handle latency by CC:

https://gitlab.com/dewdman42/artalign/-/wikis/home

Check out the specific script called *ArtAlignByCC.js*.


----------



## marclawsonmusic (Apr 10, 2021)

Hi @kmaster, did you ever figure out a solution for negative track delay in Logic? I just rebuilt an entire template only to realize that negative track delay does not work with External MIDI tracks. Lucky me!

PS - I am on 10.4.8, but going to upgrade to 10.5.1 and see if that helps.

Surely there's a solution? John Powell is on Logic / VEP and I know he uses negative track delays for everything. 

Also paging @Dewdman42 in case he has any insight.

Thanks in advance


----------



## Dewdman42 (Apr 10, 2021)

Try changing the external midi tracks to use the External Instrument plugin instead of the environment. That should make the negative track delay functional.


----------



## marclawsonmusic (Apr 10, 2021)

Dewdman42 said:


> try changing the external midi tracks to use the external instrument plugin instead of the environment. That should make the negative track delay functional.


Thanks for the quick reply. In order to use this plugin, I guess I need an instrument track instead of a MIDI Instrument from the environment, right?


----------



## Dewdman42 (Apr 10, 2021)

Probably the easiest way would be:


Create a new instrument track.
Put the External Instrument plugin on it.
Right-click to reassign the existing track with your midi data to the new External Instrument track, instead of leaving it assigned to the environment midi instrument object.

After that, the audio from your external synth will come into the External Instrument channel if you want, or you can just leave your existing audio/aux or whatever you were using for the external audio. It doesn't matter, but ultimately you could have the audio come back to that External Instrument channel instead, and then latency compensation will be a little easier to manage too.

Hope that makes sense.

The External Instrument plugin gives proper latency correction with PDC, since it's hosted in the mixer, and you can adjust how much latency needs to be accounted for. Then you should also be able to set negative track delay as needed.


----------



## marclawsonmusic (Apr 10, 2021)

I'm sorry, I'm not following 100%.

Currently, I have...

External MIDI Track (let's say Oboe 1) -> MIDI Instrument (Environment) -> Instrument Channel Strip with VEPro AU3 (Environment).

I add a new Software Instrument Track (in Tracks view, not Environment), and add the External Instrument plugin. (Not sure how to configure that plug-in, though... just leave it as-is?)

Then, I reassign my External MIDI Track (Oboe 1) to the Software Instrument Track with the External Instrument plugin. However, that signal chain is incomplete (nothing is talking to VEPro), so I cabled the Software Instrument Track with External Instrument to my VEPro AU3.

And it still doesn't apply the delay.

This might be a Logic version issue? Like I said, I'm on 10.4.8.


----------



## Dewdman42 (Apr 10, 2021)

ahhhhhh.. sorry I didn't realize you were talking about VePro when you said "External".

This is a bigger problem, and you may not be able to solve it easily. Really, the negative delay is not working that way?


----------



## marclawsonmusic (Apr 10, 2021)

Thanks, @Dewdman42. Which version of Logic are you on by the way? Maybe they fixed this at some point.


----------



## marclawsonmusic (Apr 10, 2021)

Here is a project that demonstrates this. The only track that works correctly is the software instrument with the instrument plugin directly on the track (the yellow track). The multi-timbral setup doesn't work consistently, and the external MIDI doesn't work at all (the light on the track is pinging with the early delay, but the audio doesn't return early).


----------



## Dewdman42 (Apr 10, 2021)

Just ran a little test; sorry, my previous (deleted) comment was wrong. The negative delay seems to only work on instrument tracks, not on external midi tracks feeding instrument tracks. You can still put one huge negative delay on the entire instrument channel... but that is almost certainly not what you want.

That's really annoying actually, might be a bug....


----------



## marclawsonmusic (Apr 10, 2021)

Upgrading to 10.5.1 didn't do anything. I'm running out of options... I will submit a bug report, but I'm curious how someone like John Powell manages this.

@kmaster, did you ever find a solution?


----------



## Dewdman42 (Apr 10, 2021)

I think that's a bug. Even the LogicPro docs seem to say that track delay can be used for "tracks"; there is no mention that it can't be used for external midi tracks. I think this is probably a bug in LogicPro, so make sure to report it. Here's one hypothesis...

In the scenario you are using, there are two relevant objects in the environment: the midi instrument object and the inst channel strip object it is cabled to. Both of those objects have their own delay parameter. If you change the value of the delay on the strip object, that actually takes precedence and works, while the delay on the midi inst object, which is also the one displayed in the track properties inspector on the arrange page, appears to be ignored; I think the delay on the inst strip is somehow overriding it. An internal programming bug, as far as I'm concerned. The one shown in the Track inspector should be the one being used, right?

Anyway, please make sure to submit a bug report. I will have to update the AU3 template docs I made to point out that track delay won't work right with that setup.

how many tracks do you have in your template?

My thoughts are that either you go with the other template approach, which allows only 127 midi tracks per VePro instance but would avoid this particular problem... or you find a different solution for negative track delay. I have a Scripter solution, but it's not easy to set up.


----------



## Dewdman42 (Apr 10, 2021)

If you want to try the Scripter solution, go get this script: https://gitlab.com/dewdman42/artalign/-/blob/master/ArtAlign.js. (Look in the top right corner for the copy-to-clipboard icon, use that to get the script onto the clipboard, and then paste it into Scripter.)

Here are instructions: https://gitlab.com/dewdman42/artalign/-/wikis/ArtAlign.js

Put this script onto the channel where VePro.AU3 is hosted.


You have to tweak the script with the specific channels that have latency.


Once you have done that, the Scripter GUI will display the largest latency out of all the channels you configured. You need that value to configure the negative track delay.


Go to the environment and select the channel strip object that is hosting VePro AU3. Set the negative track delay to the value that was displayed in the Scripter GUI. _(Or use Expert Sleepers' LatencyFixer, which is in some ways easier and more reliable.)_

That should work, try it.

What this will do is delay ALL your tracks by that maximum amount; the global negative delay will offset all tracks back to where they should be, and the script will give just the ones you configured a little more negative offset, as you configured them.

I am working on a more comprehensive solution to this problem that would be easier to use, but unfortunately it's nowhere near done and not a super huge priority for me... but the above should get you through the issue.


----------



## marclawsonmusic (Apr 10, 2021)

Thanks a bunch, @Dewdman42. I just submitted a bug report, but I saw some old posts from 2015 where this was an issue back then. I wonder if it will ever get fixed.

I have about 250-275 tracks in the template. My impetus for doing this new setup was to support the track delays - separate delays for longs, shorts and legatos. I have the VEP side sorted, and even the AU3 has been pretty stable. In fact, everything is set up and working perfectly - with very low CPU usage. But if I can't get the track delay thing working, it's a lost cause.


----------



## marclawsonmusic (Apr 10, 2021)

Dewdman42 said:


> if you want to try scripter solution, go get this script: https://gitlab.com/dewdman42/artalign/-/blob/master/ArtAlign.js. (look in the top right corner for the copy to clipboard icon and use that to get the script on clipboard and then paste it into scripter.


Thanks!!! I'll give it a try. I'm a programmer so I might be able to muddle my way through this.


----------



## Dewdman42 (Apr 10, 2021)

not entirely a lost cause....

Like I said, two options.....


use three VEPro instances... with no more than 127 MIDI tracks each. You can use the other template I made, which has 1270 tracks on 10 VEPro instances, to get started. The downside is that your mix is split across three separate VEPro instances.


Try the script from my last post.


----------



## marclawsonmusic (Apr 10, 2021)

Dewdman42 said:


> not entirely a lost cause....
> 
> 
> ... You can use the other template I made that has 1270 tracks on 10 VePro instances, to get started. ...


I did some testing tonight and think I can use a bunch of Instrument channel strips (instead of MIDI Instruments). I can clone the VEP channel strip over and over, changing ports and channels and then reassign all my External MIDI Instrument tracks to those channel strips.

It appears that delay is recognized on the channel strips, but not on External MIDI Instruments (as we found).

Your 1270 template pointed me in the right direction. At the moment, I am relieved. Thanks, @Dewdman42!


----------



## Dewdman42 (Apr 10, 2021)

I prefer using this other approach in general also.. But....

Make sure you read the instructions on that template thread about how to manually create those kinds of tracks; there is a little so-called "tap-dance" you have to do for each transition to the next port. You can't just clone them all...

it should be pretty straightforward for you though. With that option the negative track delay will work

I also want to point out one more thing though: the Scripter script I sent you will actually do a better job of handling the negative delay than the negative track delay will. Setting it up isn't really that hard, honestly. It will negative-delay only the NoteOns, plus the PC and CC instrument switches that you specify as such. NoteOff, PitchBend, AfterTouch and all other CCs are left without being sent early... which is what you want!
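The event filtering described above can be sketched roughly like this (illustrative names only, not the actual ArtAlign source; treating CC32 as an articulation-switch CC is just an assumed example):

```javascript
// Only events that trigger a sample attack or switch an articulation
// get shifted early; everything else stays on the grid.
const EARLY_TYPES = new Set(['NoteOn', 'ProgramChange']);
const SWITCH_CCS = new Set([32]); // hypothetical articulation-switch CC

function sendsEarly(event) {
  if (EARLY_TYPES.has(event.type)) return true;
  if (event.type === 'ControlChange' && SWITCH_CCS.has(event.number)) return true;
  // NoteOff, PitchBend, AfterTouch, and ordinary CCs are not sent early.
  return false;
}
```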

you can use that script with the Instrument track approach too though...


----------



## Dewdman42 (Apr 10, 2021)

well you could clone them all in the environment perhaps, but you're on your own to figure that out


----------



## marclawsonmusic (Apr 11, 2021)

Yes, I cloned them in the Environment. 

On the plus side, delay works! On the not-so-plus side, I'm back to where every track is linked to the main VEP instrument - so if I mute / solo / fader one of them, it affects them all. That's the whole reason I went with External MIDI tracks in the first place LOL. So that's not going to work.

Your script could be an option, but unfortunately it will be on 24/7. Which means I need to disable while playing in a part, and reactivate during playback. Also, keeping track of the JSON is tricky if I want to make edits to the delays... 

And I'm doing all this because I can't set a delay on External MIDI Tracks - a feature that should just work in Logic. Unreal.

Cubase is starting to look appealing! Or, I just have to go back to the old AU2 way of setting up tons of VEPro instances. What a nightmare.


----------



## Dewdman42 (Apr 11, 2021)

It’s a dilemma, I agree. There is no perfect solution, only a choice of what to compromise.

Here is another trick you can try. Inside VEPro, put an instance of Expert Sleepers Latency Fixer as an audio FX plugin on every channel that is sounding late. This will report that latency to VEPro, and VEPro's PDC engine will do the rest.

With that you don’t need the script and you don’t need negative track delay. Use the environment based template if you want, doesn’t matter


----------



## marclawsonmusic (Apr 11, 2021)

I tried that, but in VEPro, I have a lot of multis... so I can delay the multi, but not the channel within the multi. I might experiment with splitting out the multis in VEPro. Also going to try your script.

As always, thanks.


----------



## Dewdman42 (Apr 11, 2021)

For the script, also note that it has a GUI menu item to enable/disable the delays on one channel while you record, leaving correction active on all other channels. It could be possible to use automation to control those. I haven't done this yet, but in theory we could create MIDI switches that could be used to quickly choose a channel to record on without delay.

The JSON editing, I agree, is not optimal. Scripter's GUI capabilities are too limited to do better than that. But once you get used to it, I don't think it will bother you that much; the only thing is that if you set up a large template and then start changing channels around, that could be a hassle.

The reality is that most musicians are put off by having to edit any JSON whatsoever which is why I have not made a big effort to talk about this script much. It does get the job done though

And yes for the expert sleepers approach you’d have to use multi audio outs from each plugin in order to isolate the latent articulation


----------



## Dewdman42 (Apr 11, 2021)

As a side note, I keep going back and forth between Logic and Cubase because of LogicPro problems like this. I mean, it's entirely unacceptable that in 2021 LogicPro's half-baked multi-timbral operation can't support, out of the box, 768 AU3 MIDI channels with the mute and solo buttons working properly, negative track delay working properly, etc. These problems should have been corrected years ago. I suspect they are difficult to fix because LogicPro is a prettier GUI layered on top of an ancient old-school Logic environment object paradigm, and the complicated linking of those internal objects brings up some of these problems. To really fix it would require a complete re-architecture of LogicPro, which is unlikely to happen.

On the other hand, Cubase has its own share of problems and hassles. I use Scripter a lot and Cubase provides nothing comparable. When you are talking about channelizing articulations, for example, Cubase will not channelize expression events; neither does LogicPro, but I have a script for that, which I can't do in Cubase. There are all kinds of pros and cons on both sides, and it will come down to choosing your compromises.


----------



## Dewdman42 (Apr 11, 2021)

Whoa, this issue made me do a quick test and guess what... the track delay parameter does not work with external MIDI tracks through the environment, and it doesn't work with normal multi-timbral tracks either! Not with AU2 either.

If you set up a 16-part multi-timbral track in Logic, using its very own wizard no less, then try to apply different track delays to each of those 16 parts... they don't work. They are all linked to the same delay setting, so whichever part you last set the delay on is the one affecting all the parts. EVEN THOUGH each of those parts can show a different delay setting in the track inspector, it doesn't matter: internally it's a single linked value, and whichever one was set last is the delay value used.

That is DEFINITELY a bug... kind of related to the external MIDI track issue too in some way, if you ask me. Anyway, I will report this to Apple again as a bug, just to bomb them with heat... but it's been around a long time and I don't expect it to change.

:-(

What this means, bottom line, is that if you intend to use the track delay parameter in LogicPro, it is not possible to use multi-timbral instruments with a different delay setting per part. There is no way to do it; it's just fundamentally broken, not with external MIDI tracks and not with Apple-approved multi-timbral tracks either.

Can be done with Scripter of course. 

And the expert sleepers PDC trick.


----------



## marclawsonmusic (Apr 11, 2021)

Yes, I discovered this as well. It's maddening, because at first it seems to work, but then you realize it was a false positive.

I am experimenting with adding expert sleepers latency plugin inside VEPro. Just need to figure out the best way to set up the signal chain.


----------



## marclawsonmusic (Apr 11, 2021)

I implemented the expert sleepers latency plugin in my VEPro instance and it seems to do the trick. I had to create separate outputs in Kontakt for legatos, longs and shorts (level of granularity for the delay settings), but once I did that I could add the plugin as an effect on each channel. I sum everything before sending back to Logic, so I didn't even touch the Logic side of things.

I did this for only one of my string libraries so there is still a lot of work to be done, but I'm optimistic. I'm going to try the remaining libraries over the coming days. Hopefully, I won't hit another brick wall.

Really appreciate your help (and moral support), @Dewdman42!


----------



## Dewdman42 (Apr 15, 2021)

Here's another new delay compensation script (for LogicPro) I threw together, for anyone who wants to try it. This one uses sliders on a GUI to set each known latency value _(which will be corrected for)_ on a channel-by-channel basis.

You can find the script and docs on how to use it here:

https://gitlab.com/dewdman42/artalign/-/wikis/ArtAlignGui.js


----------



## marclawsonmusic (Apr 16, 2021)

Wow, this is fantastic @Dewdman42. It gives you control at the MIDI channel level - which is much more granular than I have now with the audio outs in VEPro (which is still working by the way). I'll check this out later today. Great work!


----------



## Dewdman42 (Apr 16, 2021)

It also does one special thing no other tool does: it makes the note attacks start early while note releases and CC, PitchBend, and AfterTouch events are left alone.


----------



## lettucehat (Apr 17, 2021)

Catching up on all of this as I am trying to set up a VEPro template making use of both Art Conductor articulation sets and heavy use of negative delay (can't live without it now), and I'm surprised that so many issues/problems are being discussed literally right now! I am shocked it's such a hassle with Logic. @marclawsonmusic which template method are you using the @Dewdman42 script with that's finding good results?

Obviously there have to be compromises, and I'm wondering how I'll have to route and organize things in order to get different delays for different articulations.

Is it possible to get this working with, for example, one Kontakt combined-articulations instrument (rather than a bunch of split patches)? I'm still learning all of this, but I was hoping to cram as many related instruments into one Kontakt multi as possible - say, every staple patch in Cinebrass Core/Pro, such that I don't exceed 16 instruments in that one Kontakt. My original thinking was every instrument gets its own stereo output, and the compromise is not having granular control over mics.

However, looking at this negative delay issue, I'm thinking I might need to route outputs more along the lines of "high brass shorts", "high brass longs", etc, because each instrument (e.g. horn solo) having 2 or 3 outputs is crazy, but the script depends on analyzing the audio outputs to determine delay compensation. Is that more or less correct? Thank you guys for working all this out!

Edit: Correction - I think I was conflating the latency fixer plugin and the script.. I need a nap...


----------



## Dewdman42 (Apr 17, 2021)

Sounds like you figured this out, but the script operates entirely in the MIDI domain. The other approach discussed earlier assigns each articulation to a separate audio out, so that when it hits the VEPro mixer, each isolated audio channel can have Latency Fixer assigned to it. Latency Fixer does nothing to the timing; it just reports whatever latency you set it to back to the host (VEPro). VEPro then does whatever it does to perform plugin delay compensation on each articulation audio channel in the VEPro mixer.

There are pros and cons to either approach


----------



## Dewdman42 (Apr 17, 2021)

What I would do if I were you is create your various busses in the VEPro mixer instead of in Kontakt. The final set of audio outputs from VEPro back to LogicPro can be whatever you want.


----------



## marclawsonmusic (Apr 17, 2021)

lettucehat said:


> @marclawsonmusic which template method are you using the @Dewdman42 script with that's finding good results?


Hi @lettucehat,

Here's what I'm doing right now:

One VEP instance per library. So I have an instance called 'VEP CineBrass'.

In there, I have one Kontakt multi per instrument. So, I've got the following multis (each on its own track in VEPro) for CineBrass:

CB Solo Tpt (port 1)
CB Tpt Ens (port 2)
CB Solo Horn (port 3)
CB Hn Ens (port 4)
CB Solo Tbn (port 5)
CB Tbn Ens (port 6)
... you get the idea
Each of those multis is on a separate port, so I can have up to 16 MIDI channels per Kontakt multi.

In the Logic Environment, I have an Instrument channel strip with the VEP AU3 plugin. I clone this XX times, changing the port number for as many as I need. I think I have 9 ports for CineBrass. Each channel strip clone is set to a different port, which is mapped to the corresponding multi in VEP - CB Solo Tpt, CB Solo Hn, etc.

Also in the Environment, I have a MIDI Instrument for each articulation - e.g. CB Solo Tpt Legato, CB Solo Tpt Longs, CB Solo Tpt Shorts. These MIDI instruments point to the corresponding Instrument channel strip.

Finally, in Tracks view, I have a bunch of External MIDI tracks that I 'reassigned' to each of the MIDI Instruments. I use articulations on these to switch between long, marcato, trill, trem, etc. But this is where my plan fell to pieces... I was going to put negative delay on each of these MIDI tracks (one per articulation), but Logic simply doesn't recognize it. BIG bummer.

To get around this, in VEPro, I set up multiple outputs in each Kontakt instrument. So in CB Solo Tpt, I have legatos going to st.1 (1/2), longs going to st.2 (3/4), and shorts going to st.3 (5/6). In VEPro, I hit the plus button to create two additional audio outputs that correspond to the Kontakt outs. These 3 audio outs are where I put the Expert Sleeper Latency plugin. All audio in each VEPro instance still sums to a single stereo return that feeds back into Logic. You could obviously use multi-out if you wanted.

Still not sure this is all going to work... I just finished setting up all the instruments today and need to do some stress testing with an actual track. If it doesn't work, I'm seriously thinking of switching to Cubase.

PS - Haven't had a chance to test Dewdman's script yet. Need to establish a baseline first.

Hope this helps,
Marc


----------



## Dewdman42 (Apr 17, 2021)

The actual LatencyFixer plugin takes almost no CPU... it doesn't do anything other than report a value back to the host. So that particular aspect should not have any problem with a stress test.

The extra audio-track splitting needed to accommodate it, though, could. Please let us know how that goes.


----------



## lettucehat (Apr 17, 2021)

marclawsonmusic said:


> Hi @lettucehat,
> 
> Here's what I'm doing right now:
> 
> ...


Wow, thank you for the detailed explanation. So I'm strongly assuming this is all using split patches in a one track per articulation setup, right? I think there's still a ton I can emulate in your approach, but unfortunately one of my main priorities is sticking with articulation sets / one instrument one track (as long as the sample library permits). Given how many problems there are already in getting negative track delay to work, I'm guessing there's no _way_ one could make it work by cracking open the all-in-one patches (like Six Horns Articulations) and assigning different delays or outputs to the longs/shorts... right?


----------



## Dewdman42 (Apr 17, 2021)

The approach marclawsonmusic is using can still be used with a single source track. You can specify in the articulation set to "channelize" the notes rather than sending keyswitches. Just set up your articulation set that way.

Then in Kontakt you load a separate instance of the instrument into 16 channel slots of Kontakt, pre-load the articulation in each of those, then use the purge-all-samples command to empty out the RAM and save it as a Kontakt multi.

After that you will have one source track; the articulation set channelizes it, and from there it goes to Kontakt hosting each articulation on a separate channel, and so on...

I still highly recommend you both consider my Scripter script though, and actually this is a good time to use one of my other two related scripts...EventChaser or Channelizer...

The reason is because LogicPro does not _propagate_ CC and PitchBend expression across multiple channels....even when the articulation set is moving the notes to new channels as described.

These two scripts do that. EventChaser assumes the notes are already channelized and propagates the expression events to the other channels of the same port.
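The propagation idea can be sketched in a few lines (a simplification, not the real EventChaser code):

```javascript
// Clone each expression event (CC, PitchBend, etc.) to every channel of
// the port, so channelized notes on any channel all receive the same
// expression data.
function propagate(event) {
  const copies = [];
  for (let ch = 1; ch <= 16; ch++) {
    copies.push({ ...event, channel: ch });
  }
  return copies;
}
```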

Channelizer is interesting because with it it's possible to have more than 16 articulations, spread across multiple MIDI ports. If you have an instrument that needs more than 16 articulations, then Channelizer is really useful.
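A rough sketch of the port/channel math such a script might use (illustrative only, not the actual Channelizer implementation), assuming articulation IDs starting at 1 and 16 channels per port:

```javascript
// Map an articulation ID to a MIDI destination, spilling onto the next
// port once the first 16 channels are used up.
function destinationFor(articulationID) {
  const index = articulationID - 1;     // IDs start at 1
  return {
    port: Math.floor(index / 16) + 1,   // 16 channels per port
    channel: (index % 16) + 1,
  };
}
// Articulation 1 -> port 1, channel 1; articulation 17 -> port 2, channel 1.
```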

I'm doing some updates to all of my scripts right now, but in the next few days I will post a more comprehensive tutorial about how to set things up with these scripts in combination to achieve both channelizing and expression propagation and latency correction... all in the midi domain.

As I was explaining earlier, the other thing about my latency correction scripts is that they correct the NoteOns while not correcting NoteOff, CC, and PitchBend... which is what you want! All the other tricks of using negative track delay or Latency Fixer, etc., are conceptually easy, but they are really like a blunt hammer that just gets things closer; you are still left with a problem where notes will be ending sooner than you realize, and CC and PitchBend will all end up coming in earlier than you realize too.


----------



## lettucehat (Apr 17, 2021)

Dewdman42 said:


> The approach marclawsonmusic is using can still be used with a single source track. You can specify in the articulation Set to "channelize" the notes rather then sending keyswitches. Just setup your articulations set that way.
> 
> Then in kontakt you load a seperate instance of the instrument into 16 channel slots of kontakt, and pre-load the articulation in each of those, then use the purge all samples command to empty out the ram and save it as a Kontakt Multi.
> 
> ...


The channelizing route makes a lot of sense. I actually just did a very inadequate job of explaining the articulation sets thing: what I'm trying to stick with is the very specific Art Conductor sets (since they are well standardized and I already paid for them, haha), which use keyswitches and, on occasion, use channels for libraries that have a lot of articulations. As of now I'm resigned to having only one negative delay setting per library instrument for some libraries, unless there's a way of applying one negative delay (based on the longs) and somehow applying a positive delay inside Kontakt to the shorts. Thankfully, many of my favorite libraries only come split out by articulation, so I can definitely try your scripts there. I mean, I'll use them on the keyswitch instruments too, but unless there's that miracle solution or some other workaround, I'll have to find some compromise between the delay needs of longs and shorts. Once I started working quantized with negative delays it's been hard to go back! I'm at least thankfully not wedded to separating longs and shorts in audio routing.


----------



## Dewdman42 (Apr 17, 2021)

Adjusting latency with keyswitched instruments is very difficult to do. I am working on a very elaborate script for this in the long term too, but nobody will want to use it because it's too complicated. There are many issues that come up when using keyswitched instruments...

With kontakt it should be possible to do some things inside Kontakt multi-script, but I don't know KSP well enough to attempt it and there is really not a general solution.

Bottom line, I know of no general solution for latency correction with keyswitched instruments. There are a few very specific things that have been done for CSS, for example, and a few things like that, but as far as a general approach, it's just not available anywhere... and I have been tweaking on this issue for a while... I can say it's not an easy problem to solve in a general way.

If you want to handle automatic latency correction, I strongly suggest you consider using a channelizing approach.

By the way, you can adapt Art Conductor for that. You just have to go into the output section, remove the keyswitches, and replace them with channel assignments. Haha, but that's throwing away a lot of what you paid for, I get it. But Art Conductor is also using a consistent paradigm on the input-switch side (which you can still use, by the way!). You can use the input switches as defined by Art Conductor to assign articulation IDs to notes on the fly when you record them... then on playback, have the OUTPUT section channelize instead of sending keyswitches...

or even better, just completely blank out the output section and use my channelizer script instead.

but it will mean you have to rework the articulation sets you are using from them in that regard, and it will mean setting up a lot of kontakt multis also in order to receive the channelized midi instead of key switched.

Even for libraries that don't have separate instruments for each articulation, you can just load the same instrument 16 times, purge all samples, and then manually configure each of those to the specific articulation, as if the keyswitch had already been sent. They will just be saved that way in the Kontakt multi.


----------



## marclawsonmusic (Apr 18, 2021)

I don't know enough about the articulation sets you are using, but I am using Logic's built-in articulations on my three MIDI tracks - legatos, longs, shorts. There is no reason I couldn't combine these into a single track - as long as the articulation can output to separate MIDI channels.

You will still need to separate the instruments (inside Kontakt) into shorts, longs, etc. - basically anything you want the negative delay on. And then route those instruments to a separate output.

PS - Some of my articulations use keyswitches and I haven't had problems with those.


----------



## Dewdman42 (Apr 18, 2021)

Sure, the keyswitches can be there, but it has to be channelized in order to do the latency correction, that's all I'm saying. If you try to determine how much latency correction to apply based on the keyswitches, that becomes very difficult to do in a generic way; complicated.

You can use channel to determine latency and generally include keyswitches too, if they are still needed.


----------



## Dewdman42 (Apr 18, 2021)

I will also point out another benefit of channelizing your articulations: you can easily level-balance all of them.


----------



## lettucehat (Apr 18, 2021)

Thank you both again for all of the insight. I think I'm fighting the inevitable necessity of redoing many of my VEPro instances as channelized, at least where I care about negative track delay. I'll see how it goes.


----------



## Bear Market (Apr 24, 2021)

marclawsonmusic said:


> To get around this, in VEPro, I set up multiple outputs in each Kontakt instrument. So in CB Solo Tpt, I have legatos going to st.1 (1/2), longs going to st.2 (3/4), and shorts going to st.3 (5/6). In VEPro, I hit the plus button to create two additional audio outputs that correspond to the Kontakt outs. These 3 audio outs are where I put the Expert Sleeper Latency plugin. All audio in each VEPro instance still sums to a single stereo return that feeds back into Logic. You could obviously use multi-out if you wanted.
> 
> Still not sure this is all going to work... I just finished setting up all the instruments today and need to do some stress testing with an actual track. If it doesn't work, I'm seriously thinking of switching to Cubase.



Did you get a chance to try this out? I'm curious how you find it working since I'm also in the planning stages of setting up a VEP template for Logic.


----------



## marclawsonmusic (Apr 24, 2021)

Bear Market said:


> Did you get a chance to try this out? I'm curious how you find it working since I'm also in the planning stages of setting up a VEP template for Logic.


Sadly, no. I am still in the process of building everything out with the Latency Fixer plugin. Figuring out by ear all the values for shorts, longs, legatos is very time-consuming.

Performance-wise, things are good so far. I had to experiment with VEPro's 'cores per instance' and put that at 1 core per instance (I have an 8-core iMac and 9 instances in VEP). Things are hovering around 20% in VEP and Logic during playback (w/ audio track selected) and it gets as high as 40-50% when I arm a track for recording during playback. 

One plus is the Logic project's file size is only like 2Mb. :emoji_astonished: This with about 285 tracks and full stem routing. My last template was only 80 tracks and about 80Mb in size. I think it's because I'm doing everything in the Environment so there is less bloat behind the scenes.

Anyway, it's a long process and I'm mainly doing it on the weekends so it might take me some time. I want to get a baseline setup with Latency Fixer and then take those values and put them into @Dewdman42's plugin as a second A/B test. My biggest concern is how to handle recording / live input - I know there is a 'play thru port / channel' that can be defined, but I'd have to figure out how to automate that. I'm also not sure about the 'delay Note On but leave everything else as-is'. I know there is a reason for that decision, but it seems counter-intuitive to how Logic handles region delay (the entire region is delayed). So I'm not sure how that will work in practice.

If you get a chance to test, I'd be curious to hear your findings, so please do share. Cheers.


----------



## Bear Market (Apr 24, 2021)

marclawsonmusic said:


> If you get a chance to test, I'd be curious to hear your findings, so please do share. Cheers.


Thanks for your reply! It seems we're pretty much in the same place currently. I'm thinking of using an approach with more VEP instances though. The "live input" issue you describe is on my radar as well. I'll let you know if I stumble upon something useful.


----------



## Dewdman42 (Apr 24, 2021)

marclawsonmusic said:


> I want to get a baseline setup with Latency Fixer and then take those values and put them into @Dewdman42's plugin as a second A/B test.



I can tell you for certain that the script approach will use slightly more CPU than Latency Fixer, but probably not much more. LatencyFixer uses nearly no extra CPU at all. I mean, any plugin uses a bit more CPU just by being inserted, but Latency Fixer is not doing any DSP whatsoever; it just reports a fixed value to the host and that's it. So in terms of the lightest-CPU approach, I think Latency Fixer is probably going to be less CPU... though not much less, because my script doesn't really use that much CPU either, compared to many other plugins.

There is also some concern that you have all that elaborate VEPro audio routing in order to separate the shorts, longs, etc. That could affect CPU a bit also, along with hundreds of Latency Fixer instances.

But with Scripter you have JavaScript to execute. So it's hard to say which way would be better or worse, but I don't expect to see a significant difference either way.




marclawsonmusic said:


> My biggest concern is how to handle recording / live input - I know there is a 'play thru port / channel' that can be defined, but I'd have to figure out how to automate that.



With the pure Latency Fixer approach you are doing now, you don't have to change or automate anything for the live channel. LatencyFixer doesn't actually delay anything; it passes the audio through untouched. All it does is report the latency to the host. LogicPro can't really do anything to correct PDC on your live channel: whatever you play is what you will hear, and if the sample library has a lot of latency, you will hear that latency as you play the part. PDC takes effect when you leave live mode; LogicPro then executes PDC by sending MIDI data from the regions ahead of schedule to the instrument channel, in much the same way as if you were using a negative track delay. It just does it automatically, according to whatever latency value is being reported from LatencyFixer.

So what I'm trying to say is, with the approach you are now trying, you should just be able to record your parts, and on playback they will be adjusted; you shouldn't have to change in and out of live mode per se.

The same goes for if you were instead using negative track delay (if it actually worked right)...when you record the part, the negative track delay would have no effect at all, but during playback it would.

With the Scripter approach it's a little different, because the script imposes a 500 ms lookahead delay in order to do what it needs to do. There is no way around that, and that lookahead delay would be pretty much impossible to have on while recording a particular part. So in the Scripter approach there is a GUI control to select one track that ignores the lookahead; that way you can record your part without hearing the lookahead delay (though you will still hear whatever delay is built into the instrument sample). Then you have to make sure to turn that off for playback. So yea, that is a little bit of a headache honestly.
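The record-mode workaround described above amounts to something like this sketch (hypothetical names; the real script exposes this as a GUI control rather than a variable):

```javascript
// Exempt one "live" channel from the lookahead buffer so recording
// isn't heard 500 ms late; all other channels keep their correction.
const LOOKAHEAD_MS = 500;
let liveChannel = null; // set while recording, cleared for playback

function delayFor(event) {
  return event.channel === liveChannel ? 0 : LOOKAHEAD_MS;
}
```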




marclawsonmusic said:


> I'm also not sure about the 'delay Note On but leave everything else as-is'. I know there is a reason for that decision, but it seems counter-intuitive to how Logic handles region delay (the entire region is delayed). So I'm not sure how that will work in practice.


To be clear I am still undecided on which way I like working better also. I see pros and cons both ways. Having to setup elaborate VePro routings and hundreds of latency fixer instances is not necessarily that easy to setup. On the other hand...less Scripter mumbo-jumbo is always a good thing when it can be avoided. Also, there is the point about having to disable lookahead for the track while you are recording part and then re-enabling it for playback...it can become second nature after a while, but still it is a step that has to be taken.

To explain about leaving "everything else as-is"...let me try to explain that better...

when you use negative track delay or the latency-fixer approach, the entire midi stream is basically being played ahead of time, by a fixed amount. That is kind of a brute force adjustment. That means NoteOn's, NoteOff's, CC events, etc..will all be sent to the instrument early, ahead of the grid.

That makes sense for NoteOn events: if an articulation sample has a slow attack, the attack needs to start ahead of the grid in order to sound like it's on the grid. However, the release of the note needs to happen on the grid, otherwise it might sound like it was terminated early. Same with CC events. If you are drawing an expression curve for CC11, the peak needs to be heard exactly where you see it on your CC automation lane, at the proper time on the grid. There is no slow attack built into how CC expression works; its effect is immediate. So CC events should not be sent to the instrument early; they should be sent at exactly the time they appear on the grid.

Same goes for PitchBend and Aftertouch.

When you use simple negative track delay or Latency Fixer, all those events are sent to the instrument early, by whatever amount you specified to make the slow-attack NoteOn sound on time. But expression events, and probably NoteOffs, will end up sounding too early, because their effects are immediate.

With a Scripter approach we can control which events need to be early, and really it's just NoteOn that needs to be early, due to the slow sample attack. Everything else needs to be exactly on the grid.
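That policy can be sketched as a tiny delay function (an illustration of the principle, not the real script; the 500ms lookahead is the figure used here, and attackMs is a hypothetical per-articulation value):

```javascript
// "Only NoteOn goes early": under a fixed lookahead, a NoteOn is released
// (lookahead - attack) ms after arrival so its slow attack peaks on the
// grid, while NoteOff/CC/PitchBend wait the full lookahead and therefore
// land exactly on the grid.

const LOOKAHEAD_MS = 500;

function delayForEvent(type, attackMs) {
  if (type === "NoteOn") {
    // send early by the attack time, clamped so we never go negative
    return Math.max(0, LOOKAHEAD_MS - attackMs);
  }
  // NoteOff, ControlChange, PitchBend, Aftertouch stay on the grid
  return LOOKAHEAD_MS;
}
```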

When NoteOffs are early, it will sound like the note is ending too early. But this is an area we haven't spent much time thinking or talking about. It's quite possible that many sample libraries have slow release rates that could be taken into consideration separately from slow attack rates; in some samples the NoteOff may also need to be early, but not necessarily by the same amount as the NoteOn!

Short samples typically just end anyway; they don't sustain while you hold a key down, so the NoteOff time doesn't even matter. Where it matters more is sustains, tremolos, trills, anything where the sample holds until the actual NoteOff arrives. Even legatos don't really matter: in a legato transition the next note comes early (a good thing), and that causes the previous note to end early regardless of when its NoteOff arrives. The NoteOff just needs to overlap; whether it overlaps by a lot or a little won't matter, because either way the first note is terminated early to start the legato transition, which is generally what you want, so that the next note reaches its transient peak on the grid.

So, at least in terms of NoteOffs, it may be totally OK for them to be early, because there are more situations where an early NoteOff termination won't matter than situations where a note would audibly end earlier than it should. The Scripter approach eliminates that rare case, but it may not come up often enough to justify the script approach on its own.

On the other hand, CC expression events must not be early! So what about that? That's where Scripter can provide a solution.

Now, that being said, it gets further complicated. If you are using CCs as instrument switches (not as expression), those CC events do need to be early, in order to make sure they arrive ahead of the NoteOns that are being sent early. See what I mean?

And some kinds of expression actually need to be a bit of both. For example, velocity crossfade (as some like to call it), where a certain CC determines which sample layer is played rather than key velocity. In that case there needs to be at least one event of that CC number before the early NoteOn, in order to start the initial attack with the desired sample layer; but while the note is being held, that same CC needs to be on the grid in order to affect the sound at the proper time. This last scenario I do not have scripted yet, and the simple script with faders that you like will not be able to handle it, because it requires specifying which CCs need to be treated that way.

So while the script can help a lot to put CC expression on the grid while sending NoteOns early, it doesn't really account for the scenario above. Right now I think it would be better to have the velocity-crossfade CC on the grid, even if the initial attack starts with the wrong sample layer... but maybe not; maybe that is just too complicated.

The truth is, the only way to handle it properly in Scripter is to have a way to specify, for every instrument and articulation, how it should be handled... and that becomes monumentally more complex.
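For what it's worth, "specify every instrument and articulation" might look something like this as data. This is a hypothetical table shape, not anything that exists yet; the channels, keyswitch names, and timing values are all made up:

```javascript
// Hypothetical per-articulation config keyed by "channel:keyswitch",
// giving separate attack and release lead times plus any CCs that must be
// sent early (e.g. crossfade or switch CCs). Purely illustrative.

const articulationConfig = {
  "1:C-2": { attackMs: 250, releaseMs: 120, earlyCCs: [1] },  // e.g. sustain with CC1 xfade
  "1:D-2": { attackMs: 60,  releaseMs: 0,   earlyCCs: [] },   // e.g. staccato
};

function lookupArticulation(channel, keyswitch) {
  return articulationConfig[channel + ":" + keyswitch]
      || { attackMs: 0, releaseMs: 0, earlyCCs: [] };         // safe pass-through default
}
```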

So what I'm saying is: there is really no perfect solution; there are pros and cons all the way. I could yet make a version of the script that skips the tricky per-event-type timing and just slides all MIDI forward, which would produce essentially the same result as the Latency Fixer trick, but without having to create a big, elaborate VePro routing setup. Maybe that will turn out best, but for the moment I'm still committed to working out the timing issues so that the script really does the right thing, more effectively than brute-force sliding the region early or using brute-force negative track delay.


----------



## marclawsonmusic (Apr 24, 2021)

Dewdman42 said:


> There is also some concern that you have all that elaborate VePro audio routing in order to seperate the shorts, longs, etc. That could effect CPU a bit also, and hundreds of latency fixer instances.


Great post, as always @Dewdman42. I definitely understand the Note On issue better now. Crap, it's probably yet another problem to solve down the line! Sigh.

FWIW - I am not using hundreds of Latency Fixer instances... maybe a few dozen. So it's not having much of an impact. Something like CSB is using basically 7x3 (each patch separated into legato, long and short). Anyway, I'll keep you all posted.


----------



## marclawsonmusic (Apr 24, 2021)

Dewdman42 said:


> With the Scripter approach it's a little different, because the script imposes a 500ms lookahead delay in order to do what it needs to do. There is no way around that, and having that lookahead active while recording a particular part would be pretty much unworkable. So in the Scripter approach there is a GUI control to select one track to ignore the lookahead. That way you can record your part without hearing the lookahead delay (though you will still hear whatever delay is built into the instrument sample). Then you have to make sure to turn that off for playback. So yes, that is a little bit of a headache, honestly.


One thought came to mind... would it be possible to ignore lookahead by looking at the transport 'is playing' instead of a specific MIDI port/channel?

So, if transport is playing, enforce all the delays, but if not... basically operate in 'ignore lookahead' mode?

I saw this code snippet and it got me thinking:

```javascript
function ProcessMIDI() {
    var info = GetTimingInfo();     /* get the timing info from the host */
    if (info.playing)               /* if the transport is playing */
        Trace(info.blockStartBeat); /* print the beat position */
}
```


----------



## Dewdman42 (Apr 24, 2021)

I am also adding, right now as we speak, an option to my ArtAlignGui.js script that puts it into a super-simple mode where it just does the same thing negative track delay does (the brute-force method), but with a fader for every channel.

That would enable you to put all the articulations in VePro on separate MIDI channels, without necessarily having to split out all the extra audio outs, and get exactly the same timing correction with only a single instance of Latency Fixer for the whole VePro instance.

You can give that a try too. There will be pros and cons: the CC expression will all seem early on playback unless you intentionally record it late so that the correction puts it in the right place, hehe.
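In that simple mode, the behavior boils down to one fader per channel feeding a single delay formula. A sketch of the idea (the control names, ranges, and 500ms lookahead are my assumptions, not the actual ArtAlignGui.js parameters):

```javascript
// One slider per MIDI channel; each fader value is applied uniformly to
// every event on that channel, mimicking negative track delay.
// PluginParameters is Scripter's control-surface convention; the names
// and ranges here are illustrative.

var PluginParameters = [];
for (let ch = 1; ch <= 16; ch++) {
  PluginParameters.push({
    name: "Ch " + ch + " correction (ms)",
    type: "lin", minValue: 0, maxValue: 500,
    numberOfSteps: 500, defaultValue: 0,
  });
}

const LOOKAHEAD_MS = 500;

// With a fixed lookahead, a correction of c ms means every event on the
// channel is delayed by (lookahead - c), i.e. it sounds c ms early
// relative to the grid.
function simpleModeDelay(correctionMs) {
  return Math.max(0, LOOKAHEAD_MS - correctionMs);
}
```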



marclawsonmusic said:


> Great post, as always @Dewdman42. I definitely understand the Note On issue better now. Crap, it's probably yet another problem to solve down the line! Sigh.



Well, let's just say there are issues, and sometimes we may end up with some particular library with complications where we can't figure out WTF is going on.

CSS is probably the worst culprit, with its extreme legato latencies. Honestly, I think the only solution for CSS is a dedicated custom script, which I am 80% done with. The more I think about it, the more I'm inclined to just write a custom script for each of the libraries I own and be done with it; I literally wrote that 80%-done version in one evening, so it's not a big deal. But I am still committed to finding a general solution, in order to help everyone else out too.

CSB and CSW will be very similar. I don't own them, but I should be able to copy my CSS script into versions for them pretty easily.

These libraries are way more complicated because they use a different latency for the first note of a legato phrase than for the secondary notes of the phrase. They also use different amounts of latency depending on the velocity of each secondary note, and they have some interesting crossfade-related behavior that I mentioned before.
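To make that tiering concrete, here is the shape of the logic; all numbers below are placeholders, NOT Cinematic Studio's documented values:

```javascript
// Tiered delay for a CSS-style legato patch: the first note of a phrase
// gets one latency, and secondary notes get a latency keyed to their
// velocity (faster transitions at higher velocity). Every figure here is
// a placeholder for illustration.

function legatoDelayMs(isFirstNoteOfPhrase, velocity) {
  if (isFirstNoteOfPhrase) return 100;   // placeholder: phrase-start attack
  // Secondary notes: transition speed (and therefore latency) follows velocity.
  if (velocity >= 100) return 100;       // fast transition (placeholder)
  if (velocity >= 65)  return 250;       // medium transition (placeholder)
  return 330;                            // slow transition (placeholder)
}
```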

I'll respond to your latest question in next post..




marclawsonmusic said:


> FWIW - I am not using hundreds of Latency Fixer instances... maybe a few dozen. So it's not having much of an impact. Something like CSB is using basically 7x3 (each patch separated into legato, long and short). Anyway, I'll keep you all posted.


Even hundreds don't use that much CPU. You're probably using more CPU by splitting the audio.


----------



## Dewdman42 (Apr 24, 2021)

marclawsonmusic said:


> One thought came to mind... would it be possible to ignore lookahead by looking at the transport 'is playing' instead of a specific MIDI port/channel?
> 
> So, if transport is playing, enforce all the delays, but if not... basically operate in 'ignore lookahead' mode?
> 
> ...



A couple of comments about that. First, the ProcessMIDI callback is quite a bit harder on the CPU, so I'm trying to avoid it.

Secondly, what we really need to do here is detect when a MIDI channel is in live mode, where you are trying to play your keyboard and hear sound, generally while recording. During normal playback you always need the correction and lookahead happening; but while recording, you want it off for at least the channel you are recording to.

There is no built-in way in Scripter to distinguish between playing and recording. Plus, you actually want the lookahead turned off even when the transport isn't moving and you're just tinkering with ideas on the keyboard, or clicking notes in the piano roll to hear what you've already placed there. Doing that with the lookahead on is really annoying, so you turn the lookahead off so that you click a note and hear it immediately.

So anyway, no, I haven't figured out a way to automatically detect when to cancel the lookahead. But if I figure out a way (and I'm open to suggestions, keep them coming) I would definitely add it.
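One possible convention, purely as a sketch (Scripter has no built-in facility for this, and the CC number is an arbitrary choice): dedicate an unused CC as a manual bypass switch that a controller button could send.

```javascript
// Manual lookahead-bypass toggle driven by a dedicated CC.
// BYPASS_CC is an assumed-free controller number, not a standard.

const BYPASS_CC = 127;
let lookaheadBypassed = false;

// Returns true if the event was consumed as a bypass toggle; in Scripter's
// HandleMIDI you would return without forwarding the event in that case.
function handleBypassCC(ccNumber, ccValue) {
  if (ccNumber !== BYPASS_CC) return false;
  lookaheadBypassed = ccValue >= 64;   // conventional on/off threshold
  return true;
}
```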


----------



## marclawsonmusic (Apr 24, 2021)

Dewdman42 said:


> A couple of comments about that. First, the ProcessMIDI callback is quite a bit harder on the CPU, so I'm trying to avoid it.
> 
> Secondly, what we really need to do here is detect when a MIDI channel is in live mode, where you are trying to play your keyboard and hear sound, generally while recording. During normal playback you always need the correction and lookahead happening; but while recording, you want it off for at least the channel you are recording to.
> 
> ...


Good point! I didn't think about the scenario of playing along during playback... duh. And I understand the CPU concern as well. 

I really appreciate you throwing your brainpower at this to help the community. It really is a very tricky niche problem and you understand it very well.

Thanks again!


----------



## Dewdman42 (Apr 24, 2021)

I've updated the ArtAlignGui.js script; it now has a checkbox to enable a simple correction mode that more exactly mimics the behavior of simple negative track delay. The main advantage is that you need just one instance of this script, one instance of Latency Fixer, and no extra audio splits in VePro. Actually, you can put both Latency Fixer and Scripter into Logic Pro and keep the VePro mixer clean of anything at all; perhaps easier to set up, IMHO. You can even drag the faders while the project is playing and dynamically hear the latency adjustment change.

check it here: https://gitlab.com/dewdman42/artalign/-/wikis/ArtAlignGui.js


----------



## Dewdman42 (Apr 24, 2021)

Oh wait, I just found a last-minute bug... don't try it quite yet.


----------



## Dewdman42 (Apr 24, 2021)

I don't know; I think I found a strange Logic Pro bug in Scripter, but the script basically works. The bug is related to the GUI not updating itself properly, so you might have to press the `Run Script` button twice when you first load it.

It's very strange. I explained the bug in the troubleshooting section of my GitLab page, so there it is; let me know if you get anywhere with that.

But the actual script itself is working fine...


----------



## robh (Apr 25, 2021)

Dewdman42 said:


> I don't know; I think I found a strange Logic Pro bug in Scripter, but the script basically works. The bug is related to the GUI not updating itself properly, so you might have to press the `Run Script` button twice when you first load it.
> 
> It's very strange. I explained the bug in the troubleshooting section of my GitLab page, so there it is; let me know if you get anywhere with that.
> 
> But the actual script itself is working fine...


I've encountered that bug myself, if I understand what you are describing correctly. The way I got around it was to "get" the parameter, set it to the opposite value, and then set it back again inside the Reset() function. Here's an example from a script I'm working on:
```javascript
function Reset() {
    if (GetParameter("Responsiveness") == 0) {
        SetParameter("Responsiveness", 1);
        SetParameter("Responsiveness", 0);
    } else {
        SetParameter("Responsiveness", 0);
        SetParameter("Responsiveness", 1);
    }
    Responsiveness(GetParameter("Responsiveness")); // handler defined elsewhere in the script
}
```


----------



## Dewdman42 (Apr 25, 2021)

I'll try that later to see if it works. The funny thing is, this bug only comes up when dragging the right edge of the GUI window to make it bigger to show all the controls. When you do that, one of the controls doesn't show until I hit the Run Script button again. But once it's showing, I can resize the GUI small to big to small as much as I want and it's always there.

Conversely, if instead of initially dragging the GUI window bigger I simply scroll down, the control is where it is supposed to be from the get-go.

I can't find any explicable reason for this other than that Scripter must have a bug in the code it uses internally when you resize the window and it needs to repaint the screen, but only the first time.

I don't really want code like that running during every Reset, so I'd probably rather just say "hit the Run Script button twice"... but I will try it out later to see if it somehow magically gets Scripter to paint its resized window correctly.


----------



## robh (Apr 25, 2021)

Dewdman42 said:


> I'll try that later to see if it works. The funny thing is, this bug only comes up when dragging the right edge of the GUI window to make it bigger to show all the controls. When you do that, one of the controls doesn't show until I hit the Run Script button again. But once it's showing, I can resize the GUI small to big to small as much as I want and it's always there.
> 
> Conversely, if instead of initially dragging the GUI window bigger I simply scroll down, the control is where it is supposed to be from the get-go.
> 
> ...


I read your gitlab note after posting and it might be a different thing after all. But who knows, maybe it's related.


----------



## marclawsonmusic (May 22, 2021)

Hey @Dewdman42, I finally had some time to test the script...

I did a baseline comparison to my Latency Fixer template using the same MIDI. I put the same delay values in the script as I did in Latency Fixer. I rendered audio and listened back. Observations:

- Using 'normal' mode (simple correction not checked), there were some strange transitions and jumps/hiccups in the MIDI. I assume this is because only NoteOn is being delayed? Not sure, but I didn't really like how it sounded.
- I changed to 'simple correction' and that render was _much_ closer to my original Latency Fixer version. In fact, it sounded identical, so I decided to put it to a null test. For some reason the audio didn't null out; I had a staccato passage and could hear those notes clearly, so something must have been out of time in one of the renders. I think the issue is with Latency Fixer and the cumulative effect of adding so many instances of that plugin. When I was setting it up, there were times when tracks I had brought into alignment seemed to 'drift' time-wise and get slow again. I thought it was all in my head, but maybe there is something in PDC that adds more headroom as more plugins are deployed? As an example, CSS shorts normally need a 60ms delay (which works fine in your script), but with Latency Fixer I have them at 80ms.

So I decided that Latency Fixer was problematic and started ripping it out and replacing it with your script. I even made the script leaner by removing all but the 'simple correction' logic (all I need), and also put in some new code to toggle 'Record thru' using a CC. I set it up on my strings and it worked great. Woo hoo!

Then I moved on to woodwinds, and that's when everything fell to shit. With the second Scripter instance I started getting audio engine overloads, with all cores firing at 75-100%. This was during a dense passage in the MIDI. I have a total of 9 VEP instances, so if I can't even get 2 to work, I am toast.

I am running a Late 2013 iMac i7 w/ 32GB of RAM. Maybe it's because of my older machine? With Latency Fixer, my VEP instances barely hit 20-25% during playback. Logic does hover around 40-50% during a dense passage, but that's totally OK for me.

So I'm back to using Latency Fixer on everything. I just want to get back to writing anyway... all this workaround stuff really breaks my balls. I'd be happy if Apple fixed the negative delay thing so I didn't have this MacGyver setup over here, but I guess that's life. It does seem like Cubase is better-suited to large templates these days, but I'm not sure I have the patience to learn a new DAW after all this recent pain.

Anyway, I REALLY appreciate your efforts to help solve this. Your script is great - it's a work of genius. But I think Scripter can be heavy, especially on an older computer. Unfortunately, it doesn't look like it will work for me.

Best,
Marc


----------



## Dewdman42 (May 22, 2021)

Latency Fixer doesn't touch MIDI; it just reports the latency, and Logic Pro should send the MIDI early, so I can't think of any reason why that would cause the shorts to sound funny. If you have an example of that you can share, it would be good to know. It could be that using a lot of Latency Fixer instances confused VePro in some way, but I'd need to see what you did to make sure the problem wasn't something on your end.  Only one Latency Fixer per channel strip, right?

Scripter doesn’t actually take much cpu so I’d like to see the project that you said ran into a bottleneck with dense passages. Dense passages in general do take a lot of cpu. Since I see you modified the script then I have no idea what you did or the impact of that either. Are you sure you didn’t have the track selected in live mode?

I would also like to understand what you said sounds funny, like hiccups, and why. Those could be resolvable script bugs if I can find out more about how you were trying to use it and the modifications you made.

Well, anyway, this issue is definitely a PITA. I am leaning towards using a Kontakt script for any Kontakt libraries with this problem, such as CSS. I'd rather use scripts custom-written for each library, and in many cases I don't want to channelize anything, except for PLAY, which remains brain-dead that way.


----------



## Dewdman42 (May 22, 2021)

By the way you can also just go back to using one vepro instance per inst track with negative track delay. That would be a lot of instances though


----------



## marclawsonmusic (May 22, 2021)

I will send audio from my tests tomorrow.

I will also do a test with your script (not my modified one) to see if something I added was the root cause. I just know (as a programmer) that interpreted languages like JS add CPU overhead, and when real-time audio processing is involved this could be a problem. That's why plugins are generally written in a compiled language like C++; compiled languages are so much faster by comparison...

Anyway, let's see what tomorrow brings.

Thanks again,
Marc


----------



## marclawsonmusic (May 22, 2021)

Dewdman42 said:


> By the way you can also just go back to using one vepro instance per inst track with negative track delay. That would be a lot of instances though


250 instances at least! Too many!


----------



## Dewdman42 (May 22, 2021)

In the grand scheme of things, these Scripter scripts are doing very little work compared to all the audio DSP. Scripter also uses a JIT compiler, and the script does not run in real time. That isn't the problem, though a JUCE C++ version could be done later if there is sufficient interest.

If there is a CPU problem, it would be related to all the rescheduling of MIDI events. Every time the script calls sendAfterMilliseconds, it causes the event to be reinserted into Logic Pro's internal queues (using non-JavaScript code, I might add), and that appears to be a heavy operation during dense passages, especially if the track is selected in live mode. You can watch dense passages peg the CPU with or without Scripter when you have dense MIDI going to one VePro instance and that track is selected in live mode.
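The obvious mitigation is to skip the reschedule entirely when an event needs no delay. A sketch (the send/sendAfterMilliseconds names mirror Scripter's Event API; the dispatch wrapper itself is hypothetical):

```javascript
// Only reschedule an event when it actually needs a delay; zero-delay
// events go straight through, skipping the reinsertion into Logic's
// internal queue. The return value is just for observability.

function dispatch(event, delayMs) {
  if (delayMs <= 0) {
    event.send();                        // immediate path, no reschedule
    return "now";
  }
  event.sendAfterMilliseconds(delayMs);  // scheduled path
  return "scheduled";
}
```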

If you want to try to figure it out, or help me figure it out, please send me some PMs.


----------



## marclawsonmusic (May 22, 2021)

Dewdman42 said:


> Latency Fixer doesn't touch MIDI; it just reports the latency, and Logic Pro should send the MIDI early, so I can't think of any reason why that would cause the shorts to sound funny. If you have an example of that you can share, it would be good to know. It could be that using a lot of Latency Fixer instances confused VePro in some way, but I'd need to see what you did to make sure the problem wasn't something on your end.  Only one Latency Fixer per channel strip, right?


PS - I actually put Latency Fixer inside VEPro - not Logic. 

So, when I started, I just had 'CSS Vln 1' as one channel in VEPro... but since I need different delays for legato, longs and shorts, I hit + to add two more audio outs and adjust Kontakt to output to st.1, 2 and 3 (there are 3 copies of CSS Vln 1 in my Kontakt multi).

In VEPro, that results in:

- CSS Vln 1 (legato)
- CSS Vln 1 2 (longs)
- CSS Vln 1 3 (shorts)

All CSS channels in VEPro (Vln 1, Vln 2, Va, Vc and Cb; there are 15 total) route to a single stereo return that goes back to Logic. In Logic, I just have 'CSS', not all the individual channels. So I need to put Latency Fixer in VEPro, not Logic.

Hope that makes sense,
Marc


----------



## marclawsonmusic (May 22, 2021)

Dewdman42 said:


> In the grand scheme of things these scripter scripts are doing very little work compared to all the audio dsp. A jit compiler is also used by scripter. Also the script does not run in real-time. That isn’t the problem, though a juce c++ version could be done later if there is sufficient interest.
> 
> If there is a CPU problem, it would be related to all the rescheduling of MIDI events. Every time the script calls sendAfterMilliseconds, it causes the event to be reinserted into Logic Pro's internal queues (using non-JavaScript code, I might add), and that appears to be a heavy operation during dense passages, especially if the track is selected in live mode. You can watch dense passages peg the CPU with or without Scripter when you have dense MIDI going to one VePro instance and that track is selected in live mode.
> 
> If you want to try to figure it out or help me figure it out, please send me some PM’s


Makes sense. I was not in live mode, by the way... the CPU on core 1 was doing nothing, but the other 7 were pegged to the roof!

Let me test with your script and we'll see if I jacked something up. Also, I just remembered that I had the UI open when I did this. Maybe there is some CPU dedicated to the UI that goes away if you close it?

Cheers


----------



## Dewdman42 (May 22, 2021)

I misspoke that time; I meant in VEPro. You have only one instance of Latency Fixer per VEPro channel, right?


----------



## marclawsonmusic (May 22, 2021)

Dewdman42 said:


> I misspoke that time; I meant in VEPro. You have only one instance of Latency Fixer per VEPro channel, right?


Correct. So, for those 15 channels - one instance of Latency Fixer each. All channels sum to a bus which is the main output back to Logic.


----------



## Dewdman42 (May 22, 2021)

The Scripter UI probably isn't making a big difference, but you never know; some plugin GUIs definitely can slam the CPU, so it's worth a try to close it.

I’m not really sure which script you are using either I have posted several and there have been updates since last time we talked here about it.

I personally have never been able to get any script to peg the CPU, so I'm very curious what you are running into.


----------



## Dewdman42 (May 22, 2021)

marclawsonmusic said:


> PS - I actually put Latency Fixer inside VEPro - not Logic.
> 
> So, when I started I just had 'CSS Vln 1' as one channel in VEPro... but since I need different delays for Legato, Longs and Shorts, I hit + and add two more audio outs and adjust Kontakt to output to st.1,2 and 3. (there are 3 copies of CSS Vln 1 in my Kontakt multi)
> 
> ...



About CSS, also note that the legatos have different latency depending on the velocity of each note in the legato phrase after the first. The script I shared with you earlier does NOT support that. CSS is really a prime example where the easiest thing is probably a custom script. I started working on one and it's 80% there, but then I realized that for CSS, and probably all Kontakt libraries, it would be better to have a custom KSP script that runs inside Kontakt, so that it can work with any DAW. So personally I will be using a Kontakt KSP script for CSS and for my Kirk Hunter stuff, which has similar issues. There are several out there already (the one from Alex Vincent is really good), but I might make my own, and eventually make something for KH and other Kontakt libraries, though I don't have many Kontakt libraries.


----------



## Kent (Sep 2, 2021)

@marclawsonmusic @Dewdman42 There was a period of about 3 months where I only checked VIC briefly once a week or so…and then instead of responsibly catching up on everything I missed when I returned to normal usage I just marked everything as read. Completely missed this whole discussion! Sorry for the inadvertent ghosting.


----------



## marclawsonmusic (Sep 3, 2021)

kmaster said:


> @marclawsonmusic @Dewdman42 There was a period of about 3 months where I only checked VIC briefly once a week or so…and then instead of responsibly catching up on everything I missed when I returned to normal usage I just marked everything as read. Completely missed this whole discussion! Sorry for the inadvertent ghosting.


No problem, Kent. When you responded to my PM (I was getting desperate LOL), I remember you said you were busy.


----------

