# Articulation Switching with Track Focus?



## procreative (Mar 22, 2017)

Just wondering if this is possible. I know there are various iOS apps out there, such as Composer Tools Pro (for Lemur), TouchOSC, and Metagrid, and all have the option to create pads or keys with labels and save them as templates.

I am using Logic Pro X with ArtzID, which uses the MIDI FX Scripter to translate MIDI notes from a second keyboard into Articulation ID numbers, which in turn trigger articulations via keyswitch patches, UACC, or MIDI channels for multis.

All great so far and works fine.

The problem remains that you still have to remember which key press does what on the second keyboard, or play through them until you find the one you want.

I see all these iOS apps have the option of labelled pads. Great, in theory.

Do any of these offer a way to auto-select a template when you select a track in your DAW? I like the idea, but the thought of having to select a template each time seems counterproductive.

Basically, my dream solution would be either to auto-select the iOS template via track focus, or to have the labels draw their text from the Scripter values (probably impossible).


----------



## pmcrockett (Mar 22, 2017)

This is in Reaper, so it may not be useful to you, but I have a script + Lemur combo that auto-reads selected tracks' names into Lemur and presents them as buttons (which in a one-articulation-per-track setup lets me move the selected note to the articulation I want by pressing the appropriate button, and I don't have to customize a bunch of Lemur templates to do it). 

If your DAW has some way of sending info about what track is selected — or even better, the name of the selected track — via MIDI or OSC, you could rig a context-switching Lemur interface pretty easily.

At one point, I actually had a paper strip behind my piano keys where I penciled in what script(s) the keys triggered. Sometimes the old-fashioned way is easiest.


----------



## samphony (Mar 22, 2017)

pmcrockett said:


> This is in Reaper, so it may not be useful to you, but I have a script + Lemur combo that auto-reads selected tracks' names into Lemur and presents them as buttons (which in a one-articulation-per-track setup lets me move the selected note to the articulation I want by pressing the appropriate button, and I don't have to customize a bunch of Lemur templates to do it).
> 
> If your DAW has some way of sending info about what track is selected — or even better, the name of the selected track — via MIDI or OSC, you could rig a context-switching Lemur interface pretty easily.
> 
> At one point, I actually had a paper strip behind my piano keys where I penciled in what script(s) the keys triggered. Sometimes the old-fashioned way is easiest.


Would you share your script and how it works? 

A friend of mine is a Reaper user and is looking for exactly this.


----------



## procreative (Mar 22, 2017)

So I can see that with Composer Tools Pro you can set up several templates, each using different controls or names, and then use environment hacks to switch templates on selecting a track. However, this means the setup has to be predefined.

It troubles me, though, that this costs $149 and requires $24 for Lemur on top (I cannot understand how a Lemur template costs six times as much as the software it's based on). Now, I don't doubt it's got some great features.

But I wondered if any of the other solutions out there, e.g. Metagrid or TouchOSC, can do something similar?

All I need is a grid of buttons that output notes; they can be the same notes every time, as ArtzID is doing the conversion. All I need is a way to change what these buttons are labelled on the fly.

Composer Tools will do it, but it's a big outlay for a relatively simple task, and I'm not convinced I can make it work, as my main studio Mac is a 2009 tower without WiFi. So the MIDI either has to go via the sync cable, a USB MIDI cable like the iRig (more money), or Bluetooth, and the hardware route is likely the only viable one, as I have heard the latency can be the killer.

Any ideas?


----------



## dgburns (Mar 22, 2017)

To my knowledge, it's not possible to select a track in Logic and have that track send out a MIDI message. It is possible, however, to select a track in Logic using a MIDI message, like a note on, etc. You need to set up controller assignments, and then there's a little hack you need to do to get it to work. Problem is, for this method to work, you need to predefine your track layout and not mess with the tracks, or else the method falls apart.

The benefit is that if you have the patience to set up your tracks and create buttons in Lemur for them, you can put the mouse away and call up tracks from Lemur. One button press selects the track and calls up the controls you want for that track. Lemur scripting is needed, so obviously not for everyone. I use it only for orchestral tracks, as they tend to stay static. For the rest of the instrument layout, I just don't bother.


----------



## pmcrockett (Mar 26, 2017)

samphony said:


> Would you share your script and how it works?
> 
> A friend of mine is a Reaper user and is looking for exactly this.


The script itself isn't quite ready to be shared as it's integrated into a larger MIDI editing suite that I'm still tinkering with (I'll probably make a thread about it once it's in a state where others can use it).

From a user perspective, you get a grid of articulation buttons that you can update to reflect the current track/folder selections. You use forward/back buttons to jump the playhead through notes on the selected tracks, and each articulation button is colored according to whether there is a note at the playhead position on that track, whether the note on the track is active for editing, or whether the note on the track is part of a group edit selection. The idea is to make it easy to layer articulations by keeping track of which tracks have a copy of the note and allowing the user to edit length/position/etc. for any one of these copies separately or all of them together, using the mouse as little as possible and not having to mess around with selection or cut/copy/paste in the DAW. I'm currently implementing a system of storing group ID metadata as text events to make the group editing feature work better, and I plan to add support for viewing keyswitched libraries in the same way (though layering would require at least two instances of the keyswitched library).

From a technical perspective, the most complicated aspect of it is that it uses a hacked-together communication protocol that routes MIDI data from a Reaper script through a JSFX plugin in order to get around Reaper's limitations on sending MIDI/OSC messages from scripts. This is what lets me transmit large chunks of data, such as track names, without resorting to abusing Reaper's control surface syncing system (which is important for other aspects of the project besides just track names).
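For the curious, the general shape of such a protocol -- framing arbitrary 7-bit-safe text like a track name inside SysEx packets -- can be sketched in a few lines. This is a hypothetical framing for illustration, not my actual implementation; the 0x7D manufacturer ID, the message tag, and the chunk size are all arbitrary choices:

```python
# A hypothetical SysEx framing for sending text (e.g. a track name) over
# MIDI. Data bytes inside a SysEx message must stay in the 0-127 range,
# so plain ASCII fits; non-ASCII characters are simply dropped here.
# 0x7D is the manufacturer ID reserved for non-commercial use.

MSG_TRACK_NAME = 0x01  # made-up message type tag


def encode_track_name(name: str, chunk_size: int = 32) -> list[bytes]:
    """Split a track name into one or more SysEx packets."""
    data = name.encode("ascii", "ignore")
    chunks = [data[i:i + chunk_size]
              for i in range(0, len(data), chunk_size)] or [b""]
    return [bytes([0xF0, 0x7D, MSG_TRACK_NAME, seq]) + chunk + bytes([0xF7])
            for seq, chunk in enumerate(chunks)]


def decode_track_name(packets: list[bytes]) -> str:
    """Reassemble the name from received packets, using the sequence byte."""
    ordered = sorted(packets, key=lambda p: p[3])
    return b"".join(p[4:-1] for p in ordered).decode("ascii")
```

On the receiving end (Lemur, a Scripter, whatever), the sequence byte lets you reassemble names longer than one packet.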


----------



## procreative (Mar 26, 2017)

Here is the part I am curious about with TouchOSC: I can see MIDI In activity into TouchOSC when I switch tracks.

I have the WiFi TouchOSC Logic CS installed in Logic, so there must be something there transmitting. I just cannot see anything actually happening. But it does make me wonder what message is going into TouchOSC from Logic.


----------



## Peter Schwartz (Mar 26, 2017)

What dgburns said is on the money: if you're setting up a template and you promise to dedicate instrument channels to specific instruments, it's not all that hard to set up a simple environment thingy on the output of each instrument channel (namely, a transformer) to take a MIDI event playing out of that channel and turn it into something that would get TouchOSC to switch layouts. But... _can_ TouchOSC switch layouts in response to MIDI messages of some kind? I dunno; I haven't investigated it that deeply. But if you know for sure it can be done, I can come up with a scheme to make it happen.

Of course, you wouldn't want these messages getting to TouchOSC when a track plays down; otherwise it would switch between layouts constantly. And it might be fun to watch that mayhem for about 30 seconds LOL. But even that can be worked around. So... lemme know.


----------



## procreative (Mar 26, 2017)

I am not sure about the viability of all this. On the one hand, using ArtzID, every track responds to the same set of ID numbers, so one page is all that is needed. On the other hand, these TouchOSC buttons, if just numbered, tell me nothing about what I am triggering, so they are no better than using a second keyboard.

Now, one could make templates for each library where the only thing that changes is how the buttons are labelled, as the actual output is the same each time.

Here are the caveats to that approach:

1. Some libraries have articulations available only on, say, Violins 1 and 2 and not the rest. This means that, unlike for example CSS, some would require either more than one page or labels on the buttons indicating which instruments use that articulation, e.g. Staccatissimo V1 | V2 | V | – | –

2. TouchOSC seems to have a maximum of 25 pages in a template. That might be enough using the second system.

3. Using the second system would require the Remap script for ArtzID: basically having a complete list of all articulations with a common ID map, and remapping each instrument so any empty cells map to nothing in range (a bit like UACC).

Back to TouchOSC. If we go with the second idea above, we would need to route all tracks in a stack (e.g. CSS Violin 1, Violin 2, Viola, Cello and Bass) to the same MIDI message output to TouchOSC. TouchOSC can switch pages on receiving a MIDI message, which can be CC, PC, Note, Poly Pressure, Channel Pressure, Pitch Bend, or System Exclusive.
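As an aside, a raw Program Change is about the simplest of those message types to generate: just two bytes. A minimal sketch, with the caveat that using the program number directly as the page index is an assumption (the real mapping is whatever you assign in the TouchOSC editor):

```python
def page_select_pc(page: int, channel: int = 0) -> bytes:
    """Build a raw MIDI Program Change message to request a TouchOSC page.

    A Program Change is two bytes: the status byte 0xC0 ORed with the
    0-based channel, followed by the 0-127 program number. Here the
    program number is simply used as the page index.
    """
    if not (0 <= page <= 127):
        raise ValueError("program/page number must be 0-127")
    if not (0 <= channel <= 15):
        raise ValueError("channel must be 0-15")
    return bytes([0xC0 | channel, page])
```

A message this small travels fine over any of the hardware routes mentioned above (sync cable, iRig, Bluetooth), not just WiFi.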

Of course, it would mean hard wiring in Logic. Not necessarily a problem for a template, but a job to set up and maintain. Then, like you say, what happens when playing back and switching around the project?

The far simpler (but impossible) task would be to feed the slot names to TouchOSC from the values in the ArtzID script.

Here is info on page switching: https://hexler.net/docs/touchosc-editor-pages


----------



## dgburns (Mar 26, 2017)

Peter Schwartz said:


> What dgburns said is on the money: if you're setting up a template and you promise to dedicate instrument channels to specific instruments  it's not all that hard to set up a simple environment thingy on the output of each instrument channel (namely, a transformer) to take a MIDI event playing out of that channel and turn it into something that would get TouchOSC to switch layouts. But... _can_ TouchOSC switch layouts in response to MIDI messages of some kind? I dunno. I haven't investigated it that deeply. But if you know for sure it can be done, I can come up with the scheme to make it happen.
> 
> Of course, you wouldn't want these messages getting to TouchOSC when a track plays down; otherwise it would switch between layouts constantly. And it might be fun to watch that mayhem for about 30 seconds LOL. But even that can be worked around. So... lemme know.



Peter, you will be more adept inside the Logic environment than I. I can tell you, however, that my explorations inside this whole thing revealed a few discoveries.

TouchOSC is responding to OSC messages, as far as I can guess. I can also say that the Osculator plugin will allow OSC messages to be transmitted from Logic to the outside world. It is designed to allow using Osculator as a middle man between Logic and a device such as Logic Control (Mackie Control), and it was designed to do basic Mackie Control mixer commands. What I was able to hack was the track focus assignment, creating MIDI messages in the controller assignments within Logic. The "hack" is simply replacing the "channel strip" designation of track focus with track index, opening up the ability to select any track in Logic remotely by specifying its track index number, going beyond the Mackie track focus limit of 8 tracks as per the Mackie controller design. So the idea here is to send simple MIDI messages such as note on, CC, or program changes to select specific tracks by their index number in the arrange page. While it worked reasonably well in practice, there were some odd glitches. Not unusual for the controller assignments. I eventually abandoned the whole thing and reverted to my old method, which requires firing AppleScript to click mouse co-ords. Ugly at best.

Logic also has OSC message capability, but I suspect the incoming messages must conform to the internal assignments within Logic, and as far as I can tell, those are not published anywhere, so it's near impossible to use OSC messages inside the controller assignments page at present. Also, the assignments are stored within the app prefs, which is not the best way to handle this kind of setup.

As far as Lemur is concerned, it is much easier to design selecting pages, even containers within pages, from incoming MIDI or OSC messages. I run three iPads, and have the third set up to switch interfaces from buttons on the second one. Fairly simple. The scripting within Lemur allows a number of commands to be sent from one button press if so desired, even to multiple computers or devices on the same network.

-edit-

The whole reason I need more than one iPad is that the screen real estate is so small. I set up the middle iPad to select tracks in Logic, and the third iPad to display the articulations of the instrument I selected, as well as firing an AppleScript command on the Mac to select the track within Logic. All this means you need to think about creating OSC messages that are unique to each instrument within your library collections. As an example, for VSL flute, my OSC message would be ' /select_vsl_wd/flute ', and the third iPad is scripted to respond to such a message by selecting the interface (and possibly a container within that interface) showing the VSL flute's articulations, faders, etc.


----------



## Peter Schwartz (Mar 27, 2017)

Much to reply to. But for now...

procreative, thanks for pointing me to the page with the documentation, and dgburns for your insights. So... at the moment, switching between layouts is a piece of cake. Working on a one-shot situation for selecting a layout comes next...


----------



## procreative (Mar 27, 2017)

Peter Schwartz said:


> Much to reply to. But for now...
> 
> procreative, thanks for pointing me to the page with the documentation, and dgburns for your insights. So... at the moment, switching between layouts is a piece of cake. Working on a one-shot situation for selecting a layout comes next...



You are indeed a genius; I spent way too long the last couple of nights trying all sorts of stuff, to no avail.

TouchOSC responds to PC for pages/tabs, but I could not get that to work (due to my ineptitude in the environment), and I fear this route is very clunky to administer anyway, as it means lots and lots of cabling.

What teased me was that there seemed to be MIDI going into TouchOSC when selecting a track in Logic: even though the received wisdom is that Logic does not send this out natively, there seemed to be something in the TouchOSC plugin that came pre-installed in Logic that passed something on.

I did discover that there were commands reserved for LogicPad (a TouchOSC template) that switched pages. Great! Except it was only programmed for up to 5 pages.

It's a pity TouchOSC does not have a way to auto-load a template; that way you could use just 4-5 pages for each and use the above built-in page switching. But I cannot see any documentation for loading templates, only pages/tabs.

Personally, I'm still not sure how viable TouchOSC is for this in a large template. Even assuming you can use one page per library, by adding notes to each button about which instruments have that slot and using a remapper to move keyswitches in other instruments to make way for "empty" slots, the max page count is 25.


----------



## Peter Schwartz (Mar 27, 2017)

Thank you, but there is no genius involved. I'm just implementing what was on the documentation page you posted the link to.


----------



## storyteller (Mar 27, 2017)

samphony said:


> Would you share your script and how it works?
> 
> A friend of mine is a reaper user and is exactly looking for something like this.


I just implemented this very same functionality in Orchestral Template for Reaper (OTR). There is a TouchOSC interface included with OTR. To give you an idea of what it looks like and how it is implemented, check out http://otr.storyteller.im/touch-osc-template/. The scripts that are called within OTR are specific to the OTR template and workflow, so unfortunately I can't provide those.


----------



## samphony (Mar 27, 2017)

storyteller said:


> I just implemented this very same functionality in Orchestral Template for Reaper (OTR). There is a TouchOSC interface included with OTR. To give you an idea of what it looks like and how it is implemented, check out http://otr.storyteller.im/touch-osc-template/. The scripts that are called within OTR are specific to the OTR template and workflow, so unfortunately I can't provide those.


Thanks! I'll look into your OTR offering.


----------



## Peter Schwartz (Mar 27, 2017)

@procreative, so... I have this working nicely. The procedure is simple enough, I think...

First, you select a track. Then...

• Step 1: Play a note -- any note. That selects the desired* TouchOSC layout. It will automatically switch on your device.

• Step 2: As soon as you select an articulation on your device, the system is prevented from switching layouts until you move to another track.

And when you do select a new track, just repeat step 1.

*How do it know which layout you desire? Simple... There's a little environment thingy ("Layout #") that you connect the channel strip to. Then dial in a number of the layout. Connect its output to another environment thingy that sends MIDI to the TouchOSC Bridge.

Because the Layout # thing is easily copied (just opt-drag) you can create more of them to add them to new channels. They all get cabled to the same destination: "Layout Switcher". And assuming that some of your layouts can be used for multiple libraries, just set the number to be the same -- as shown below -- where channels 1, 4, and 5 all recall the same layout.

There's one other little bit of setup but it's easy and only has to be done once.

*BETA (BUT WORKS!) TOUCH OSC LAYOUT SWITCHER*


----------



## Heinigoldstein (Mar 28, 2017)

Very nice, Peter! Will you make this available for all of us standard mortals?


----------



## procreative (Mar 28, 2017)

Peter Schwartz said:


> @procreative, so... I have this working nicely. The procedure is simple enough, I think...



Very clever, you are an environment guru indeed. 

It's such a shame that TouchOSC has a 25-layout limit, as so many libraries have inconsistent articulations across instruments. For instance, in Hollywood Strings, Violin 2 has Harmonics and Flautando; Violin 1 does not.

So it means having notes on the articulation buttons as to which instruments have that articulation.

Lemur, I think, does not have this limitation, as you can recall templates as well as pages.

You would think 25 tabs would be enough, but you only need 5-6 libraries with inconsistent sections and that's your 25 tabs used up.


----------



## procreative (Mar 28, 2017)

I found this posted in their forums:

"You can change text in labels by sending an OSC messages to a label’s OSC address with a parameter of type String, Float or Int and it will display it."

So I wonder if there might be a way to send the articulation names to TouchOSC? This could completely do away with the need for pages.

I see a label has an OSC address, e.g. /1/label, where 1 is the page.
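For anyone wondering what such a message looks like on the wire, here is a rough sketch of the OSC 1.0 encoding for a single string argument, sent as a UDP datagram to the device running TouchOSC. The address /1/label1, the host, and the default port 8000 are assumptions for illustration:

```python
import socket


def _pad(data: bytes) -> bytes:
    """OSC strings are null-terminated, then padded to a 4-byte multiple."""
    return data + b"\x00" * (4 - len(data) % 4)


def osc_label_message(address: str, text: str) -> bytes:
    """Encode an OSC message with one string argument (OSC 1.0 layout:
    address pattern, then the type-tag string ",s", then the argument)."""
    return _pad(address.encode()) + _pad(b",s") + _pad(text.encode())


def send_label(host: str, text: str, address: str = "/1/label1",
               port: int = 8000) -> None:
    """Fire the datagram at the device running TouchOSC."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_label_message(address, text), (host, port))
```

The catch, of course, is that something outside Logic would have to run this and know the articulation names in the first place.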


----------



## Peter Schwartz (Mar 28, 2017)

Hey All,

Yes, I'll make it available at some point. 

Thoughts on 25 tabs... I don't know if they'll be enough for any one person, but at least it's a starting point. Where there's consistency from one patch to the next (Cinematic Studio Strings, many Vienna patches, and others) you can use the same layout, so there's a saving. But anytime you customize a patch, such as with multi-timbral setups, or as you can do with ARTzID or SkiSwitcher... sure, you'll start to eat through those 25. But there's the flip side of this: programming even just 10 layouts in the TouchOSC editor might test your will to live!

I find that with enough repeated use of a patch I end up memorizing which articulation is associated with which ID. It's just rote learning, not like I have a photographic memory. When I don't remember I refer to the articulation name lists in each ARTzID/SkiSwitcher script viewable with a click on a menu. So for my own use I'll probably only end up making a few layouts, but for those who want total consistency, yeah, 25 might not be enough.

As far as sending articulation names to TouchOSC, that wouldn't be possible in Logic without involving some kind of 3rd party app, and that would be costly to develop. And whenever a 3rd party app is added to the mix, a system can become inherently unstable in the long run. It was nerve-wracking enough when 10.3.0 was released and it broke certain aspects of Logic's native Scripting capability. Imagine what would happen if an update to Logic, and/or this hypothetical 3rd party app, and/or the operating system itself introduced bugs or new incompatibilities? This is why I'm content to keep both ARTzID and SkiSwitcher's functionality contained within the confines of Logic's native capabilities.


----------



## procreative (Mar 28, 2017)

Fair points.

I just mentioned it because it gave the impression that somehow the built-in TouchOSC plugin could relay these messages (if it's even possible in Logic). Probably using controller assignments?


----------



## MIDI Kinetics (Mar 28, 2017)

If I may, I'm not too sure about Logic but it's definitely possible with Cubase. I set this up for a media composer:






The giant middle tablet works like an orchestrator: it record-enables and jumps to a track in Cubase, and then also recalls the preset in Composer Tools (on the left). Yes, he is still using the old versions of Composer Tools (and C_brains), but that's because his writing schedule is completely insane and he can't currently afford to change anything about his setup.

In Cubase, all you have to do is use the Project Logical Editor to create a preset which jumps to the track, then assign the preset to a Generic Remote command. If I get a moment I will look into whether this is possible in Logic, but perhaps the more experienced Logic users can chime in; they may have to actually dig into Logic's control surface SDK to get it working. Cubase's Project Logical Editor plus Generic Remotes make this kind of thing relatively easy, which is why Cubase is so great with tablets.

The only trick to making this work is that when you press your finger DOWN on the orchestrator, it sends the track focus command; when you RELEASE, it sends the recall message. I can't remember exactly why this was necessary. If I _recall_ (ha, no pun intended...) it's because Cubase needed a tiny moment to focus the track and get it record-enabled before the Composer Tools recall command could be sent through. I initially coded in a little delay timer, but it was enough just to send the first command on press and the second command on release.
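The pattern itself is just a tiny two-phase state machine. A rough sketch (not the actual Composer Tools code; the message tuples are placeholders):

```python
# Sketch of the press/release two-phase pattern described above. The
# message tuples are placeholders, not real Cubase or Composer Tools
# commands; the point is only that focus and recall go out on different
# touch events, so the DAW has the whole duration of the press to
# finish focusing and record-enabling the track.

class OrchestratorButton:
    def __init__(self, track_index, send):
        self.track_index = track_index
        self.send = send  # callback that actually transmits a message

    def on_press(self):
        # Phase 1: ask the DAW to focus / record-enable the track.
        self.send(("focus_track", self.track_index))

    def on_release(self):
        # Phase 2: the DAW has had the press duration to settle; now
        # recall the articulation preset on the controller.
        self.send(("recall_preset", self.track_index))
```

Splitting the two commands across the press and release events gave enough of a gap that the explicit delay timer became unnecessary.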

Something I often get asked is whether the labelling in Composer Tools (or any controller) can happen automatically. No, it cannot. Think about it like this: your sample libraries live in their sample hosts -- Kontakt/Vienna Instruments/whatever. The only thing that "knows" what CCs and keyswitches are available is the instrument itself. How would Lemur (or whatever app) get access to that information? It would be up to the sample host developers such as Native Instruments/VSL to implement a mechanism (such as OSC) to broadcast this info to the outside world (hint, hint!). And then on top of that, the sample library devs would themselves be responsible for implementing it so that controllers like Composer Tools Pro could read it. The bottom line is that *someone* has to manually enter the labelling *somewhere*. That's exactly why I created the MIDI Kinetics forum, so that people could share presets (and, as a corollary, the Expression Map Converter for Cubase).

Implementing separate pages rather than populating predefined controls is certainly possible and in fact is built right into Composer Tools Pro (see: User Space), but it's a total nightmare to maintain custom controls. The moment you want to change something you have to manually dig into the code. With something like Composer Tools Pro you can use Multi-Edit and change multiple parameters at once across multiple presets.

Hope this was informative without being too self-serving.
Happy music making,
Michael


----------



## procreative (Mar 28, 2017)

Played around in Logic a bit more. I managed to get TouchOSC buttons to receive data for the ArtzID articulation names from the track in focus.

But (maybe because the articulation values come from a menu) whichever button you press, all buttons display the pressed articulation. I just could not find a way to make the names stick to each button independently.

Maybe someone can figure out a way to script the articulations such that this works?

Basically I put it in learn mode, selected the Standby menu in ArtzID, then added the Value String field: /1/artlabel1 = /pagenumber/textlabel (in TouchOSC).


----------



## Peter Schwartz (Mar 28, 2017)

Very interesting! It's cool that you can at least get that far. Thanks for sharing this.

Caveat: I don't/won't discuss the coding of the scripts, and similarly, the code is not open for discussion in public forums. With that in mind, I will offer this...

Articulation names live in what's called an "array", which is, for all intents and purposes, a list. The individual names aren't available as individual elements; only the list as a whole is available as an element. Thus, I'm not surprised that you're getting all of the names to appear at once on all the buttons.

If I were to code Scripts so that each individual name was individually accessible for this purpose, it would significantly bloat the code. And the long-term cost in CPU and RAM usage wouldn't be worth the short-term tradeoff of the convenience of entering the names solely in the Scripts.


----------



## procreative (Mar 28, 2017)

Yes, I get that it's down to the fact that it's a menu, and why it's done that way. I could create a display fed from the selections, but that's not much better than using Logic Remote with Smart Controls displayed.

The only other way would be having a second script loaded with the array in it, displayed some other way, maybe?


----------



## Peter Schwartz (Mar 28, 2017)

Shooting from the hip here, but... yes, possibly. Thing is, you wouldn't want that kind of Script instantiated on every instrument channel, even in a moderately sized template. CPU + RAM...

In short, there's no easy or economical solution for this within the confines of ARTzID, which, as I mentioned before, lives within the confines of Logic. At least not as far as I can see. But that's just me, and I don't claim to know all the possibilities (to paraphrase Mrs. Throatwarblermangrove, "It would take all the mystery out of life"), so if someone can find a way to do it, that would be awesome!


----------



## dgburns (Mar 28, 2017)

MIDI Kinetics said:


> If I may, I'm not too sure about Logic but it's definitely possible with Cubase. I set this up for a media composer:
> 
> 
> 
> ...



@MIDI Kinetics, yup, I pretty much agree with everything Michael writes here. Cubase rocks for this sort of thing.

You would need to get hold of the Logic SDK, as Michael states, but Apple won't give it up unless you intend to sell a product; they won't release it even with a non-disclosure agreement, as far as I understand. That said, Logic does support OSC messages, so with the proper controller plugin written, all this could be done. I've been down this road, dug deep within the bowels, and came out the other end knowing the can and can't do's on this.

@procreative, my advice to you is to get away from TouchOSC and move to Lemur. Simply stated, it's more capable. Even with Peter's solution, you could make it work. You'd need a discrete MIDI message for every instrument with a unique set of articulations, but it can be done. This is the whole reason I wanted OSC messages, as it is simple to create as many unique ones as you wish. Using MIDI for this task is archaic, given what OSC can do.

Personally, I went ahead and created all the articulation controls I needed for all my libraries. It took a long time; I've been at it for a few years, finessing the layout as new ideas came to light. The current one condenses everything into one iPad, and I can still remotely select a track and switch to view the controls on the same iPad. There is a trade-off between button presses and efficiency of motion: you don't want a system that makes you dig deep into menus, but iPads have small screen real estate.

One final thought: be careful using the controller assignments page in Logic. You should definitely make a copy of your prefs upon quitting Logic, because if you should ever need to refresh your prefs, as is often the case, you will wipe out all your hard work. The controller assignments are stored in the prefs.

good luck


----------



## Peter Schwartz (Mar 28, 2017)

> You should definitely make a copy of your prefs upon quitting Logic, because if you should ever need to refresh your prefs, as is often the case, you will wipe out all your hard work. The controller assignments are stored in the prefs.



Great advice. This would be the .cs prefs file -- different from the regular prefs file:


----------



## procreative (Mar 29, 2017)

Peter Schwartz said:


> Great advice. This would be the .cs prefs file -- different from the regular prefs file:



Yes, I am well aware of what I think is a major flaw that remains, namely the controller assignments prefs. It is very flaky; in fact, I got into the habit of making a copy back when I had my Mackie MCU hooked up via USB (long story short, it's now connected via MIDI, as it seemed to lose its setup frequently, and with it all my custom assignments).

Peter, may I ask something? I noticed your comment about RAM/CPU load from too many MIDI FX scripts. Is this more about the complexity of the scripts or just the number of separate ones on each track? I have 2-4 on virtually every track now (some tracks have CC rerouters to cope with devs' differing choices for some controls, e.g. HWS has dynamics on CC11 in some patches). Should I be worried?


----------



## procreative (Mar 29, 2017)

dgburns said:


> @procreative, my advice to you is to get away from TouchOSC and move to Lemur. Simply stated, it's more capable. Even with Peter's solution, you could make it work. You'd need a discrete MIDI message for every instrument with a unique set of articulations, but it can be done. This is the whole reason I wanted OSC messages, as it is simple to create as many unique ones as you wish. Using MIDI for this task is archaic, given what OSC can do.



Does Lemur have better management of templates? I saw a Lemur-based solution called Composer Tools Pro (a bit pricey for me at the moment, as I don't need most of the other features), and it has template recall as well as pages, which TouchOSC doesn't. Is this a standard Lemur feature?


----------



## dgburns (Mar 29, 2017)

procreative said:


> Does Lemur have better management of templates? I saw a Lemur-based solution called Composer Tools Pro (a bit pricey for me at the moment, as I don't need most of the other features), and it has template recall as well as pages, which TouchOSC doesn't. Is this a standard Lemur feature?



Lemur is way more powerful because you can script; TouchOSC doesn't have that feature. Lemur has more object types that you can use for your template as well. The 'tabbed container' object would allow you to put all your libraries/individual instrument artics on one 'page' (called an 'interface' in Lemur). You can then access each individual tab by creating a script that responds to a MIDI note and switches the container to show that specific tab. You can have a container within a container and still be able to create a script to access it all; it just takes a few more lines in your script. TouchOSC, by contrast, is only one level deep this way, so once one page is full, you need to spill over to another page. Once all your pages are full, you hit the wall.
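Lemur's own scripting language is proprietary, so as a language-neutral illustration, here is a Python sketch of the dispatch such an "On MIDI" script performs: a received note number selects which container and tab to bring to the front. The note numbers, container names, and tab indices below are invented for illustration only.

```python
# Hypothetical mapping from keyswitch note numbers to (container, tab)
# pairs. In Lemur, the actual script would set the selected tab of the
# named tabbed container instead of returning a tuple.
NOTE_TO_TAB = {
    24: ("Strings", 0),  # e.g. strings library, first articulation tab
    25: ("Strings", 1),
    36: ("Brass", 0),
}

def select_tab(note):
    """Return the (container, tab index) to display, or None if the
    note has no tab assigned."""
    return NOTE_TO_TAB.get(note)
```

Nesting containers just means the lookup table carries one more level of indexing; the dispatch idea stays the same.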

The big drawback to Lemur is that the manual sucks balls, and learning how to script is harder than it should be as a result. Not everyone will want to put in the time to learn it.

Composer Tools Pro, in contrast, I think is incredibly cheap given the level of programming skill and the huge amount of time that went into creating it. I just prefer my own layout, but that's personal.

I made the switch from TouchOSC basically because I realized early on the exact same thing you are trying to deal with, namely, "how in heck do I create all my library instrument keyswitches on my iPad" so I don't have to try remembering them all.
I still use TouchOSC for the LogicPad, and prefer it to Logic Remote.


----------



## procreative (Mar 29, 2017)

dgburns said:


> Lemur is way more powerful because you can script; TouchOSC doesn't have that feature. Lemur has more object types that you can use for your template as well. The 'tabbed container' object would allow you to put all your libraries/individual instrument artics on one 'page' (called an 'interface' in Lemur). You can then access each individual tab by creating a script that responds to a MIDI note and switches the container to show that specific tab. You can have a container within a container and still be able to create a script to access it all; it just takes a few more lines in your script. TouchOSC, by contrast, is only one level deep this way, so once one page is full, you need to spill over to another page. Once all your pages are full, you hit the wall.
> 
> The big drawback to Lemur is that the manual sucks balls, and learning how to script is harder than it should be as a result. Not everyone will want to put in the time to learn it.
> 
> ...



Good points, and I will think on this. I guess, though, using ArtzID it feels like a sledgehammer to crack a nut, seeing as the actual output of each button never changes between "templates". All that would be needed, if it were possible, is for the labels of the buttons to change on track focus.

But it's likely this is impossible, as they are contained inside a Scripter menu and thus only one can be accessed at a time, so, as posted earlier, all buttons switch to the latest selected button. So frustrating!

Having to create templates just to have the labels change feels like duplication of work.


----------



## Peter Schwartz (Mar 29, 2017)

@procreative, no, you don't have to be worried. Just be aware that even though MIDI FX are just processing lowly MIDI (i.e., data streams that are never nearly as dense as audio), they still use system resources and consume RAM.


----------



## procreative (Mar 30, 2017)

Okay, exciting news! I stumbled on an app called OSCulator (https://osculator.net), which, by the way, also has its own CS plugin for Logic Pro but uses the TouchOSC protocol.

Basically, instead of using the TouchOSC plugin you use the OSCulator CS plugin, then you run the OSCulator app in the background and configure it to route messages.

When you click a track in Logic, it transmits _/logic/mixer/track/1/focus_ (where 1 = track number). It took me a while to figure out how to program it, but I got it working using the attached settings, where /1 etc. = the name of a tab in TouchOSC (I left them at the defaults 1, 2, 3, etc.).

So now whatever track you select in Logic, the page in TouchOSC switches automatically with no further actions needed and no environment hacks.

Of course, you can route any track to any tab, and you can save the routing as a preset in OSCulator.

Granted, it's a 3rd-party app that might break, but it's been around for over 5 years. I am sure there is a lot more you could do, as it seems to sniff out the command for whatever GUI action you perform in Logic.
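The routing OSCulator performs here is essentially an address translation. As a sketch (not OSCulator's actual internals), the mapping from Logic's track-focus message to a TouchOSC page-select address could look like this in Python, assuming the tabs keep their default names 1, 2, 3, etc.:

```python
def focus_to_page(address):
    """Translate e.g. '/logic/mixer/track/3/focus' into the TouchOSC
    page address '/3'. Returns None for any other OSC address."""
    parts = address.strip("/").split("/")
    if (len(parts) == 5
            and parts[:3] == ["logic", "mixer", "track"]
            and parts[4] == "focus"):
        return "/" + parts[3]
    return None
```

Selecting track 3 in Logic would thus forward `/3` to the iPad, flipping TouchOSC to the tab of that name.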


----------



## dgburns (Mar 30, 2017)

procreative said:


> Okay, exciting news! I stumbled on an app called OSCulator (https://osculator.net), which, by the way, also has its own CS plugin for Logic Pro but uses the TouchOSC protocol.
> 
> Basically, instead of using the TouchOSC plugin you use the OSCulator CS plugin, then you run the OSCulator app in the background and configure it to route messages.
> 
> ...



Doubt you'll be able to focus more than 8 tracks. Obviously you haven't read my posts :(


----------



## procreative (Mar 30, 2017)

Possibly will try later...


----------



## procreative (Mar 30, 2017)

In a word.

Too excited too soon! Firstly, it scrolls up from 1-8 fine, but back down again it gets muddled. Then after 8, it's troublesome. I tried making extra assignments in Logic after 8; 9 worked, but 10 and above...

Tried changing the assignment mode from Fader Bank to Index and to Software Instrument, and there was no change.

Really frustrating, as it has the potential to work.


----------



## DanielBrunelle (May 3, 2017)

dgburns said:


> What I was able to hack was the track focus assignment and create midi messages in the controller assignment within Logic. The "hack" is about simply replacing the "channel strip" designation of track focus with track index, opening up the ability to select any track in Logic remotely by specifying its track index number, going beyond the Mackie track focus limit of 8 tracks, as per the Mackie controller design. So the idea here is to send simple midi messages such as note on, CC or program changes to select specific tracks as per their index number in the arrange page.


@dgburns - Can you provide the step-by-step for this? Despite your warnings, I'd like to give it a try. Thanks!


----------



## dgburns (May 4, 2017)

DanielBrunelle said:


> @dgburns - Can you provide the step-by-step for this? Despite your warnings, I'd like to give it a try. Thanks!



I promise I'll circle back; I just got back to work. Maybe next week...


----------



## dgburns (May 5, 2017)

DanielBrunelle said:


> @dgburns - Can you provide the step-by-step for this? Despite your warnings, I'd like to give it a try. Thanks!



I made an RTF doc with pics in it. I don't remember mentioning it elsewhere, but the reason I abandoned this was that when I went through the whole dog & pony show in one of my templates, loaded up and ready to rock, some channels simply would not respond to the remote MIDI, and it was driving me nuts as there was no apparent reason I could fathom as to why. Also, controller assignments are saved with prefs, and that is a lot of vital setup info to lose if you trash prefs.

All in all, the RIGHT way to do this is to apply for the SDK and get an app made that addresses the internal architecture of Logic, which does in fact support OSC messages. Apple won't give you the SDK unless you plan on selling and being a commercial entity (as I understand it).

I'm just not willing to go that far; I'd rather just write music at this point.


----------



## DanielBrunelle (May 5, 2017)

dgburns said:


> I made an RTF doc with pics in it.



Fantastic, thank you! I just enrolled in the Apple Developer Program. I'll let you know if any of their policies have changed.


----------



## Hans-Peter (May 5, 2017)

Well, the way I remember it is that you can make it work for up to about 160 tracks (by combining bank and track controls). However, for some weird reason, everything that comes after that number won't work reliably. Eventually I gave up implementing it (too much effort for a school setup anyway, and it lacked flexibility).
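The arithmetic behind combining bank and track controls can be sketched quickly. Assuming the Mackie-style layout of 8 faders per bank, a bank-select plus a fader index addresses roughly 20 banks x 8 faders ≈ 160 tracks:

```python
def track_number(bank, fader, faders_per_bank=8):
    """1-based track index reached by selecting 0-based bank `bank`
    and then 0-based fader `fader` within that bank."""
    return bank * faders_per_bank + fader + 1
```

So bank 0, fader 0 is track 1, and bank 19, fader 7 is track 160, which matches the ~160-track ceiling described above.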


----------

