Lemur Forum Dead?

Thanks, I'm gonna try this out!

Guess I'll be calling findInStr for the array of tracknames each time a new key is entered, so 'b' would give me all tracknames starting with 'b', and 'ba' would give me only 'bass' (if that's the only matching trackname).

No idea how fast Lemur is at processing iterative functions; guess I'll find out soon enough... :)
 
Right, you'll want to loop through the track-name array and run findInStr() for each track name every time the user input is updated, keep track of which track names return 0 (meaning the track name begins with the user input) or >= 0 (meaning the track name merely contains the user input), and then list those track names in a way that lets the user select among them. So something like:

Code:
if (input != oldInput)
{
        decl i;
        //Delete everything in the autocomplete list first
        for (i = 0; i < sizeof(trackNames); i++)
        {
                if (findInStr(trackNames[i], input) == 0)
                {
                        //Add trackNames[i] to the autocomplete list
                }
        }
        //Remember the input we just handled
        oldInput = input;
}

There shouldn't be any noticeable lag; in my experience, Lemur is reasonably speedy at everything except graphics.

Also note that Lemur's arrays are limited to 256 elements, so you'll run into problems if your track list is longer than that.
 
Imagine 100 different instruments in Cubase or Logic: every time you select a track with one of these instruments, your iPad can mirror the correct set of articulations. Imagine accomplishing this in an app where you have to create a separate "scene" for each set; that's a lot of work. In Lemur you use the same "scene" and just change the text according to the set you selected. Furthermore, it's possible to create these articulation sets in Excel and, via a macro, generate both the expression map / articulation set (Cubase & Logic) and all the variables for Lemur.
Mihkel Zilmer and Marco Di Stefano have created videos that show how to do it, and I will try to create a video showing Excel doing the hard work.
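For anyone curious what that spreadsheet-driven approach looks like outside Excel, here's a minimal Python sketch of the same idea: keep one master list of articulations and generate the pieces each tool needs from it. The CSV layout (columns "name" and "program") and the output formats are assumptions for illustration, not the poster's actual macro.

Code:
# Rough equivalent of the Excel-macro idea: read one list of articulations
# and emit the pieces each tool needs. Column names and output formats are
# assumptions, not the poster's actual workflow.
import csv

def load_articulations(path):
    """Read articulation name/program pairs from a spreadsheet export."""
    with open(path, newline="") as f:
        return [(row["name"], int(row["program"])) for row in csv.DictReader(f)]

arts = load_articulations("articulations.csv")

# Text you could paste into a Lemur string variable (one comma-separated list).
print(",".join(name for name, _ in arts))

# Name -> program number pairs for whatever articulation map you maintain by hand.
for name, program in arts:
    print(f"{program:3d}  {name}")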

I’m interested in seeing if Lemur can add anything to my live performances.
But I’ve got access to 4 Scenes, each with 4 banks of 9 buttons, which I’ve never needed, as that’s 144 articulations.
So some are automated drawbar (Hammond B3) combos, upper C3 Vibrato, etc.

I’ve got so many USB Input/Output maps I could easily add Editors and iPads.
Is this forum a good place to learn Lemur and study its resources?
 
You can find everything you need here:
https://reaticulate.com/

This is most helpful, as I’m considering Reaper as my live host sometime in 2019.
I’m hearing that it’s very efficient when each Kontakt instrument gets its own instance.
It can also create a hybrid mixer for complete MIDI control.
I’ve given up trying to get answers from CueMix/MOTU, UAD Console, RME TotalMix and Lynx Studio’s FPGA-based “hardware” mixer.
I’m still shocked UAD plug-ins don’t have MIDI...
 
You can find everything you need here:
Not everything. For example, there's no info about the 256-character limit for text in a Lemur expression, so some people make patches with abbreviated articulation names, etc. I'm interested in the easiest way to make a Lemur or TouchOSC patch for a couple of instruments with a big list of articulations (for example, the VSL organ has about 100 stops). Which object is better, separate buttons or a multi-switch? I like Radio button mode, where the button for the articulation you selected stays highlighted after you stop pressing it, but I don't see how to get that behaviour with separate buttons; for me it's only possible with a multi-switch.
 
Not everything. For example, there's no info about the 256-character limit for text in a Lemur expression, so some people make patches with abbreviated articulation names, etc. I'm interested in the easiest way to make a Lemur or TouchOSC patch for a couple of instruments with a big list of articulations (for example, the VSL organ has about 100 stops). Which object is better, separate buttons or a multi-switch? I like Radio button mode, where the button for the articulation you selected stays highlighted after you stop pressing it, but I don't see how to get that behaviour with separate buttons; for me it's only possible with a multi-switch.
My suggestion is that you talk to @tack about this. He created and coded Reaticulate (for which I am eternally grateful) and is active on this forum.
 
Unfortunately I'm not familiar with Lemur, but given the design of Reaticulate I'm not sure it's necessary. The idea is that you pick some CC to use to activate articulations (I happen to use CC119) and then bind that CC to one of the "Activate articulations by CC" actions. Then, if you want to trigger the articulation with program 42, you send that CC with value 42 to one of the MIDI devices you have Reaper configured to listen on for control events.

Articulation buttons in your TouchOSC template just need to send out CC119 according to how you've assigned the articulation program numbers for your Reaticulate banks.
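To make that concrete, here's a tiny Python sketch (using the mido library) of what a single articulation button effectively sends. The port name is a placeholder for whichever MIDI device Reaper is listening on for control events, and CC119 is just the CC number mentioned above.

Code:
# A minimal sketch of what one articulation button effectively sends: a single
# CC message (CC119 here) whose value is the Reaticulate program number.
# The port name is an assumption; point it at whichever MIDI device Reaper
# is configured to listen on for control events.
import mido

PORT_NAME = "Reaticulate Control"   # hypothetical virtual/loopback port
ACTIVATE_CC = 119                   # the CC bound to "Activate articulations by CC"

def activate_articulation(program, channel=0):
    """Send the activation CC with the articulation's program number as its value."""
    with mido.open_output(PORT_NAME) as port:
        port.send(mido.Message("control_change",
                               channel=channel,
                               control=ACTIVATE_CC,
                               value=program))

activate_articulation(42)  # e.g. trigger whatever articulation is mapped to program 42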
 
Unfortunately I'm not familiar with Lemur, but given the design of Reaticulate I'm not sure it's necessary. The idea is that you pick some CC to use to activate articulations (I happen to use CC119) and then bind that CC to one of the "Activate articulations by CC" actions. Then, if you want to trigger the articulation with program 42, you send that CC with value 42 to one of the MIDI devices you have Reaper configured to listen on for control events.

Articulation buttons in your TouchOSC template just need to send out CC119 according to how you've assigned the articulation program numbers for your Reaticulate banks.
I've actually found that life is so much easier if you pick up a copy of OSCulator and use it in conjunction with TouchOSC (or whatever) and Reaticulate. It gives you much more flexibility and has an interface that won't drive you nuts (unlike TouchOSC's).
 
I use the hell out of Bome MIDI Translator -- it's great! -- but I don't believe it supports OSC.

But FWIW, I do put Bome between all my controllers and my DAW and do all kinds of crazy translations providing custom functionality, including for articulations. But it's still ultimately translating MIDI to MIDI.
 
@robgb can you expand a bit on how you're using OSCulator?
I just use it as a bridge, as it's much more flexible than TouchOSC's Bridge. When I use the TouchOSC editor, I just create buttons and sliders and don't assign any specific information to them other than a name, with OSC auto checked. I then open it on my tablet and the info is transmitted to OSCulator, coming up as, say, /ARTICULATIONS/TREMOLO (the page and button name in TouchOSC), and in OSCulator I simply assign that to MIDI Prg 12 or however I've set it up in Reaticulate. For CC sliders I assign it to a MIDI CC plus whatever value is needed. This allows me to change things on the fly, if I need to, without having to open up TouchOSC, change it there, download it to my tablet, etc. It takes me about two seconds to make a change in OSCulator.
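For anyone without OSCulator, the same bridging idea can be sketched in Python with the python-osc and mido libraries: listen for the OSC addresses TouchOSC sends and translate each one into the MIDI you'd otherwise assign in OSCulator. The addresses, port numbers, and program numbers below are only illustrative assumptions, not how OSCulator itself works.

Code:
# Not OSCulator itself, just a sketch of the same bridging idea: listen for the
# OSC addresses TouchOSC sends (page/button names) and translate each one into
# the MIDI message you'd assign to it. All names and numbers here are examples.
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

midi_out = mido.open_output("Reaticulate Control")   # hypothetical port name

# OSC address -> MIDI program number, the mapping you'd otherwise build in OSCulator.
PROGRAM_MAP = {
    "/ARTICULATIONS/TREMOLO": 12,
    "/ARTICULATIONS/LEGATO": 1,
}

def on_button(address, *args):
    """Fire the mapped program change when a TouchOSC button is pressed."""
    if args and args[0]:                              # ignore the button-release message
        midi_out.send(mido.Message("program_change", program=PROGRAM_MAP[address]))

dispatcher = Dispatcher()
for addr in PROGRAM_MAP:
    dispatcher.map(addr, on_button)

BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()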
 
I then open it on my tablet and the info is transmitted to OSCulator, coming up as, say, /ARTICULATIONS/TREMOLO (the page and button name in TouchOSC), and in OSCulator I simply assign that to MIDI Prg 12 or however I've set it up in Reaticulate.
Makes sense. I can definitely see the value in shimming a translation layer between TouchOSC and Reaper, especially where OSC is involved.

It'd be interesting to support OSC directly with Reaticulate. I think it's possible given the current state of Reaper's API. But in that case, I think we'd need some standard naming for articulations, much like the (attempted) standardized program numbers for common articulations (based on UACC v2). Alternatively (or by default, absent an alias), I could do a substring match such that /articulations/tremolo will activate the articulation that most closely matches the text "tremolo".
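For what it's worth, that substring-match fallback could look something like this rough Python sketch; the scoring (prefer the shortest name that contains the query) is just a guess at "most closely matches", not anything Reaticulate actually implements.

Code:
# A rough sketch of the substring-match fallback described above: given an OSC
# address like /articulations/tremolo, pick the articulation whose name best
# matches the trailing text. The scoring is a guess, not Reaticulate's behavior.
def match_articulation(osc_address, articulation_names):
    query = osc_address.rsplit("/", 1)[-1].lower()
    candidates = [name for name in articulation_names if query in name.lower()]
    if not candidates:
        return None
    # Prefer the shortest containing name, i.e. the one with the least extra text.
    return min(candidates, key=len)

names = ["tremolo", "measured tremolo", "legato", "sustain"]
print(match_articulation("/articulations/tremolo", names))   # -> "tremolo"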
 
Makes sense. I can definitely see the value in shimming a translation layer between TouchOSC and Reaper, especially where OSC is involved.

It'd be interesting to support OSC directly with Reaticulate. I think it's possible given the current state of Reaper's API. But in that case, I think we'd need some standard naming for articulations, much like the (attempted) standardized program numbers for common articulations (based on UACC v2). Alternatively (or by default, absent an alias), I could do a substring match such that /articulations/tremolo will activate the articulation that most closely matches the text "tremolo".
That would be nice, but, honestly, I love the way it's working for me right now and figure don't mess with success. :) I'm in the midst of setting up the Amadeus library and things are working beautifully. You truly are a coding wizard.
 