dr.soundsmith
New Member
I recently bought Palette Symphonic Sketchpad and Palette Melodics from Red Room Audio as my entry point into orchestral virtual instruments. I'm currently working on a metal/rock song with orchestral sections, and I'm having a hard time getting the tempo to feel right. The samples seem to take a moment to reach the attack of the note, so if they're perfectly quantized, they sound out of time. Palette lets you adjust the attack of individual articulations, but that doesn't really solve the problem, and it seems to affect basically every articulation.
If I drag the notes to the left a bit, it sounds a little better, but something still seems off. I've found that physically playing the part in on the MIDI keyboard gives the best feel (probably because my brain can compensate a bit for the lag). The problem is, I'm not a great piano player, so it still isn't perfect, and then I have a hard time editing it into shape afterward. With VIs I've usually liked to record from the keyboard and then apply a partial quantize (iterative quantize in Cubase) to get things closer to the grid while keeping a human feel. That doesn't work very well in this case.
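To be concrete about what I mean by partial/iterative quantize, here's a rough sketch of the idea (this is just my understanding of what Cubase does internally, not its actual algorithm, and the grid and strength values are placeholders):

```python
def iterative_quantize(note_times, grid=0.25, strength=0.5):
    """Move each note-on time part of the way toward the nearest grid line.

    note_times: note-on positions in beats
    grid:       quantize resolution in beats (0.25 = a sixteenth in 4/4)
    strength:   0.0 = leave notes alone, 1.0 = snap fully to the grid
    """
    quantized = []
    for t in note_times:
        nearest = round(t / grid) * grid          # closest grid position
        quantized.append(t + (nearest - t) * strength)  # move partway there
    return quantized
```

So a note at beat 0.3 with 50% strength lands at 0.275 instead of snapping all the way to 0.25. The trouble with slow-attack samples is that even the "perfect" grid position is the wrong target, because the audible attack arrives late.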
I'm assuming this is common to many libraries, so I'm looking for advice on how to handle it. I wrote a brief orchestral piece a month ago and it wasn't much of an issue, because everything fit together pretty well, but for rock music, where locking to the grid matters, it's difficult to work with. How do you deal with sample timing issues? Do you have any tricks for getting things in time? Is there a way to quantize to an offbeat position, for example? Or a good way of figuring out how far the notes need to be dragged over?
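For what it's worth, the workaround I've been experimenting with is measuring roughly how late the audible attack lands (by eyeballing the rendered audio against the click) and pre-shifting every note by that amount. A sketch of that idea, where the 60 ms delay is purely a made-up placeholder, not a measured value for Palette:

```python
def shift_earlier(note_times_sec, attack_delay_sec=0.060):
    """Return note-on times moved earlier by an assumed sample attack delay,
    clamped at zero so nothing starts before the project does.

    note_times_sec:   note-on positions in seconds
    attack_delay_sec: measured lag between note-on and audible attack
    """
    return [max(0.0, t - attack_delay_sec) for t in note_times_sec]
```

In practice the delay probably differs per articulation (sustains vs. staccatos), so a single offset is at best an approximation.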
I’m using Cubase. If anyone uses Palette and/or Cubase and has any insights, that would be great, but other input is most welcome.
Thanks for the help!