# Charting Kontakt values to dB, useful to all



## argitoth (Aug 25, 2013)

Here's my first chart: Kontakt group volume.

Group volume value to dB
https://docs.google.com/spreadsheet/ccc ... sp=sharing

IMPORTANT: The values are multiples of 15625. Here's why. Say you script 630000 for group volume; that displays as 0.0 dB in Kontakt. If you then click the group volume knob (Kontakt UI, not your script's GUI), the display changes to -0.0 dB. There's a rounding error whenever the scripted value is not a multiple of 15625.

NOT A MULTIPLE OF 15625 (value changes)
Kontakt display = KSP value = measured dB
0.0 dB = 630000 = 0 dB (you enter this value)
-0.0 dB = 629883 = -0.1 dB (you get this value after ui_update)

MULTIPLE OF 15625 (no change in value)
-6.0 dB = 500000 = -6.1 dB (you enter this value)
-6.0 dB = 500000 = -6.1 dB (you get this value after ui_update)
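If the 15625-step grid described above holds across the whole range (an assumption based solely on the examples in this post), a script could snap values onto it before writing them, to avoid the ui_update drift. A minimal Python sketch:

```python
# Snap a scripted group-volume value to the nearest multiple of 15625,
# the step size observed to survive a ui_update round-trip.
STEP = 15625          # observed grid; assumed to hold over the whole range
MAX_VALUE = 1000000   # Kontakt engine-parameter range is 0..1000000

def snap_group_volume(value: int) -> int:
    """Round value to the nearest multiple of STEP, clamped to the EP range."""
    snapped = round(value / STEP) * STEP
    return max(0, min(MAX_VALUE, snapped))

print(snap_group_volume(630000))  # 40.32 steps -> rounds to 40 -> 625000
print(snap_group_volume(500000))  # already 32 * 15625 -> unchanged
```

Note that snapping 630000 lands on 625000, i.e. slightly below the 0.0 dB display, which matches the -0.1 dB drift measured above.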


Velocity to dB
https://docs.google.com/spreadsheet/ccc ... sp=sharing

change_vol value = dB * 1000
0 dB = 0
+6 dB = 6000
-6 dB = -6000
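Since change_vol() works in millidecibels (thousandths of a dB), the mapping above is just a scale factor either way; a quick Python sketch of both directions:

```python
# change_vol() takes its amount in millidecibels (thousandths of a dB),
# so converting a dB figure is a scale-and-round, and back is a divide.

def db_to_change_vol(db: float) -> int:
    """dB -> change_vol amount (millidecibels)."""
    return round(db * 1000)

def change_vol_to_db(amount: int) -> float:
    """change_vol amount (millidecibels) -> dB."""
    return amount / 1000.0

print(db_to_change_vol(6.0))    # 6000
print(db_to_change_vol(-6.0))   # -6000
print(change_vol_to_db(6000))   # 6.0
```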

Any charting requests? Need a certain number of steps, values, precisions? Let me know.


----------



## Mike Greene (Aug 25, 2013)

*Re: Charting kontakt values, useful to all, plz support!*

A few of us also contributed some values in this thread:
http://www.vi-control.net/forum/viewtopic.php?t=19903


----------



## d.healey (Aug 25, 2013)

Or you could use Big Bob's value converters in his math library and it will all be done on the fly.


----------



## argitoth (Aug 25, 2013)

Updated with chart: velocity to dB.

Edit: Can Big Bob's math library give velocity to dB based on velocity modulation %?


----------



## mk282 (Aug 25, 2013)

TotalComposure @ 25.8.2013 said:


> Or you could use Big Bobs value converters in the maths library and it will all be done on the fly.



Right, but lookup tables are MUCH easier on the CPU; one should strive to use them as much as possible to make scripts as efficient as they can be.


Here are some that I did (Excel 2007 needed): http://www.mediafire.com/view/zgwsiu6bh ... tings.xlsx

And here's how I did them: http://www.mediafire.com/download/mbls5 ... quency_(Hz).nki

That's a looooooooooot of tedious group-name typing. But this is how you get the most accurate values for the points where the engine parameter actually changes to the next displayed value. Note that since Kontakt has a group limit of 4096, I sometimes had to use several NKI files to capture all the values (in practice a sane limit is around 2000 groups; beyond that Kontakt gets REALLY sluggish!).
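The group-naming trick effectively harvests, for each displayed value, the raw value at which the display changes. Outside Kontakt the same walk can be sketched in Python, with a made-up display() standing in for get_engine_par_disp(); the real readout has to come from Kontakt itself, so this is only an illustration of the idea:

```python
# Sketch of "record the first raw value for each new display string".
# display() is a HYPOTHETICAL stand-in for Kontakt's get_engine_par_disp():
# here it shows 0..1000000 as a percentage with one decimal place.

def display(raw: int) -> str:
    return f"{raw / 10000:.1f}"

def chart(step: int = 1000) -> dict:
    """Walk the raw range and record the first raw value that produces
    each distinct display string (the same data the named groups capture)."""
    seen = {}
    for raw in range(0, 1000001, step):
        d = display(raw)
        if d not in seen:
            seen[d] = raw
    return seen

table = chart()
print(table["50.0"])  # 500000: first sampled raw value displayed as "50.0"
```

A finer step gives more accurate thresholds at the cost of more probing, which is exactly the group-count trade-off described above.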


BTW I always use 631000 as a default for 0 dB volume. 


Anybody care to continue with this method? There's a whole lot of EPs waiting to be plotted to the T!


----------



## argitoth (Aug 25, 2013)

mk282, I can't even imagine how you got thousands of values like that.

For my velocity-to-dB chart to be as precise as yours, I'd have to do what I did 1000 times: for 100%, 99.9%, 99.8%, etc.

btw, 631000 = 630859 after ui_update


----------



## mk282 (Aug 25, 2013)

^^^^

I suspect that's a bug in ui_update, then. I never use it anyway, so I'm good. 


See the NKI file; it will all become apparent. Yes, it's quite time-devouring to do it like this, but it's the only right way to get absolutely correct plots of EPs against real-life displayed values, IMO.


----------



## argitoth (Aug 25, 2013)

Very strange...

I tried to find the exact threshold input value at which the output becomes something other than 630859... it wasn't consistent. As I went up and down incrementally, the output value would sometimes change and sometimes not, apparently depending on the previous input value... wow, very buggy.

example (just making up numbers): input value of 485923 = 485847 sometimes and 48603 other times after ui_update


----------



## Big Bob (Aug 25, 2013)

*Lookup Tables* or *Math Library*, which should I use? :? The debate seems to continue but, as they say, all that glitters isn't gold. :lol: 

I'll just point out that the Fast mode of the library essentially gives you the best of both worlds. Fast mode employs a generalization of the lookup-table scheme by table-driving the basic functions themselves rather than the EP-specific functions. Once the Log/Exp functions are sped up with a set of tables, all the EP conversion functions based on them speed up too (and lighten the CPU load).

When you run the library in Fast mode, you are essentially using lookup tables to reduce CPU demand; the tables are just created automatically by the library, so you don't have to worry about them. Moreover, you always have the option to use Standard mode when CPU demand isn't an issue, and a simple compile-time switch setting lets you change instantly to Fast mode when that is appropriate.

In all fairness, however, I should mention that there are some EP conversion functions that do not track as accurately as you might like when using simple functions alone. Keep in mind, though, that doing any better with a lookup table may be very difficult. Accuracy of fit depends heavily on the sheer size of the table. Of course, if the total number of data pairs relating the EP to the value is reasonable, you can just use one table entry per point for a 'perfect fit'.

In most cases however, the number of possible points will likely be much greater than the number of 'anchor points' you are willing to commit to the table. In this case you will have to use some type of interpolation between the anchors. Most often scripters seem to use linear interpolation (probably because it is the easiest to code). However, this often results in considerable error in the fit between the anchor points. If your reason for going to a table was 'to fit the curve better' you may have a problem with this approach. To increase accuracy of fit, you may have to add a lot more anchor points and/or use a fancier form of interpolation than you would like.

On the other hand, if a reasonable size table with simple interpolation achieves sufficient accuracy for your purposes, then you should be aware that using the Math Library in Fast mode will be easier and probably perform even better.
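The anchor-point scheme with linear interpolation described above might look like this in Python; the anchor data here is invented purely for illustration, not a measured Kontakt curve:

```python
import bisect

# Sketch of a lookup table with linear interpolation between anchor points.
# The anchor values below are ILLUSTRATIVE, not real measured EP/dB pairs.
ANCHORS_X = [0, 250000, 500000, 750000, 1000000]  # raw EP values
ANCHORS_Y = [-96.0, -18.0, -6.0, 0.0, 6.0]        # dB (made up)

def lookup_db(raw: int) -> float:
    """Piecewise-linear interpolation between the anchor points,
    clamping outside the table's range."""
    if raw <= ANCHORS_X[0]:
        return ANCHORS_Y[0]
    if raw >= ANCHORS_X[-1]:
        return ANCHORS_Y[-1]
    i = bisect.bisect_right(ANCHORS_X, raw) - 1   # segment containing raw
    x0, x1 = ANCHORS_X[i], ANCHORS_X[i + 1]
    y0, y1 = ANCHORS_Y[i], ANCHORS_Y[i + 1]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)

print(lookup_db(750000))  # 0.0, exact anchor hit
print(lookup_db(625000))  # halfway between -6.0 and 0.0 -> -3.0
```

As Bob notes, the error between anchors depends on how curved the real function is over each segment; adding anchors where the curve bends most is the usual remedy.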

*The bottom line is: acquaint yourself with the Fast mode of the library*, because in the long run it may perform better and save you a lot of unnecessary effort. Of course there is that one-time 'learning curve' that seems to scare everyone 'half to death', but remember that the really hard stuff was writing the library. Using the library functions is duck soup by comparison :lol:

Rejoice,

Bob


----------



## d.healey (Aug 26, 2013)

I use Bob's library all the time for such conversions, and even on larger libraries with a lot of controls I've never experienced any slowdown in program response.


----------



## Big Bob (Aug 26, 2013)

> Right, but lookup tables are MUCH easier on CPU, one should strive to use them as much as possible to make the scripts as efficient as they can be.



Hey Mario, can you give me a specific example of a custom table-lookup ep converter that is *MUCH easier on the CPU* than the Math Library counterpart (when running in the Fast mode)? Have you actually run some quantitative comparison tests and so forth?

I can certainly agree that a lookup table would be more efficient than the library's *Standard* mode but, I question whether or not the same is true for the *Fast* math mode. :shock: 

My point is that when you use the Fast mode of the library you *are* already using a lookup table to speed things up so where does all your additional efficiency come from? :? 

Admittedly, generalizations are often a bit less efficient than tightly-coded specific implementations but this should be mostly a second-order effect and not worth the extra effort. So, I would really like to see some quantitative comparisons of a custom table approach versus the Fast math counterpart. :roll: 

Rejoice,

Bob


----------



## mk282 (Aug 26, 2013)

Right, I meant standard mode. TBH I didn't use the fast mode at all - I just rendered what I wanted into a lookup table and used that instead of including the math lib (mainly sin/cos tables).
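Rendering a sin table for inlining, as described here, can be sketched in Python. Old integer-only KSP needs the values scaled to ints before pasting into a script; the SCALE and STEPS choices below are illustrative assumptions, not anyone's actual settings:

```python
import math

# Generate a scaled-integer sine table of the kind scripters inline
# into integer-only KSP. SCALE and STEPS here are arbitrary choices.
SCALE = 1000   # 3 decimal digits of precision
STEPS = 360    # one entry per degree

table = [round(math.sin(math.radians(i)) * SCALE) for i in range(STEPS)]

def fast_sin(deg: int) -> int:
    """Table lookup: returns sin(deg degrees) * SCALE for integer degrees."""
    return table[deg % STEPS]

print(fast_sin(90))   # 1000
print(fast_sin(270))  # -1000

# Emit the table as a paste-ready declaration (KSP-style, illustrative):
print("declare %sin_tbl[" + str(STEPS) + "] := ("
      + ", ".join(str(v) for v in table[:8]) + ", ...)")
```

The same generator works for cos (shift by 90 steps) or any other function worth baking into a table.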


----------



## Big Bob (Aug 26, 2013)

> Right, I meant standard mode. TBH I didn't use the fast mode at all - I just rendered what I wanted into a lookup table and used that instead of including the math lib (mainly sin/cos tables).



Fast mode not only speeds up everything based on Log and Exp functions but it also gives an enormous speed boost to all the trig functions.

So then, why not just use the library in Fast mode instead of building custom tables and writing the routines to access those tables? 

*Maybe the problem is that the Math Library's Fast Math mode has been the best kept secret in the scripter's arsenal* :wink: Either that or I guess everyone just likes to re-invent the wheel :lol: 

Rejoice,

Bob


----------



## mk282 (Aug 26, 2013)

I dislike using external NKA files for that fast mode... So I just inline the array in the script.

Nothing personal. Just business. :D


----------



## Big Bob (Aug 26, 2013)

Picky picky :roll: 

But then different strokes for different folks. :lol:

EDIT: Hmm. I suppose there might be situations where the requirement to provide the .nka files (and link the instrument to them) could be objectionable. So let me ponder this a bit because I might be able to provide a simple option to put the tables in the local data space of the script when that seems more appropriate.

To be continued ... :roll: maybe :lol:


----------



## d.healey (Aug 26, 2013)

Sounds good!


----------



## argitoth (Aug 26, 2013)

Hey mk282, I'm trying to use your method of getting values with your Kontakt LFO frequency patch. I changed it to do instrument volume (ENGINE_PAR_VOLUME with -1, -1, -1) and it's working, except that the values are not being displayed.

What am I doing wrong?

Edit: Oh, and I just noticed you said the group naming was tedious. That could easily be automated by importing samples and auto-placing them (using the sample name as the group name). You can get your sample names via a combination of TextCrawler, Excel/LibreOffice Calc/Google Spreadsheets/whatever spreadsheet, and Bulk Rename Utility... also, I'd be willing to help out and do this stuff myself if you can explain your process. 

Edit 2: Oh, and on the whole math library vs. tables thing... sometimes I just want one value, and it'd be nice to look it up and write it into the script rather than use multiple lines of code to calculate it.


----------



## mk282 (Aug 27, 2013)

Yeah, I basically rename the groups to the actual text string that's displayed to the user, which can then be picked up with get_engine_par_disp()!

I'm halfway through renaming 1900-something groups to Volume knob values now. :D


Not sure how you intend to use samples, because the dot can be confusing, there's the sample extension... and they'd need to be mapped consecutively... It seems like more trouble to me than just group names.


----------



## Big Bob (Aug 27, 2013)

> Oh, and the whole math library vs tables... Sometimes I just want one value, and it'd be nice to look it up and then write it in the script rather than use multiple lines of code to calculate that value.



Of course, using a math function or a lookup table is only necessary when you need a smooth continuum of values, or when you cannot determine at compile time which values you will need.


----------



## argitoth (Aug 27, 2013)

mk282 @ Tue Aug 27 said:


> the dot can be confusing


nope!


mk282 @ Tue Aug 27 said:


> there's the sample extension


 o=< 


mk282 @ Tue Aug 27 said:


> and they'd need to be mapped consecutively...


OK, MAYBE that is a problem. I'm going to try it now.


----------



## argitoth (Aug 27, 2013)

I FAILED!!! :cry: :cry: :cry: CURSE YOOOOOUUU KONTAKT YOU'VE FOILED ME AGAIN!!! :x 

take a look: http://www.elanhickler.com/_/kontakt_auto_sort.gif

problem 1: it creates one group at a time
problem 2: the dash '-' is a delimiter and will not be part of the name


----------



## argitoth (Aug 29, 2013)

wait a minute... wait a minute... this technique would work with % sliders.

100%
99.9%
99.8%
etc.

The problem is still that Kontakt only creates one group at a time... wait... all you need is a mouse-macro program!

right-click
select all
auto

right-click
select all
auto

right-click
select all
auto

Run that 1000 times and you've got yourself 1000 groups for 100.0% down to 0.0%.



EDIT: OH! This would work with anything that does not have negative numbers! Including ms for delay!


----------



## mk282 (Aug 31, 2013)

I already have several NKIs created with various groups named... so it can be used for different parameters.

I have already charted delay time.


----------



## argitoth (Aug 31, 2013)

well mk282, where might I be able to help? o/~


----------



## mk282 (Aug 31, 2013)

My current pipe dream is charting ALL of Kontakt's engine parameters and displaying them as graphs... and ultimately perhaps a webpage where you type the value you want, select the engine parameter, and it spits out the exact engine parameter value.

Here are several NKI templates I'm using, all with variously named groups. I'm sure these aren't enough to chart absolutely ALL parameters, because some EP displayed values have leading spaces on smaller numbers (filter cutoff, LFO frequency...), some don't (pan, envelope curve...), and some are very specific in how they display values (lo-fi sample rate and bit depth, for example)... I'm sure I'll need to make several more such templates.

http://www.mediafire.com/download/oas9f ... otting.rar


----------

