DAW or notation first

ptram

Senior Member
Each composer seems to have his preferred workflow, starting either with a DAW or a notation program. I fear my workflow is going to be even more complicated.

1. Compose in a DAW with all the available sounds. Make the most realistic-sounding piece without thinking about what you would do with real instruments. Create a piece for your virtual orchestra.

2. When the piece sounds great, export a Music XML file and load it into a notation program. Transcribe the piece you composed into an actual score, something real musicians would be able to play.

3. When the score looks great and ready for publishing, export it to a DAW, and make it a great sounding piece, exactly as you would do by making a mock-up of a classic piece.

An incredibly long work. But it seems technology is not meant to help us do it faster.

Paolo
 
This is why I now limit attempts at DAW-composed pieces. When I write I want to focus on the writing and not all the other stuff... Not to mention my string library cannot play in time, so anything exported is notated wrong due to the delay. I write it on paper or directly into Finale first.
Many people use the workflow you describe successfully, all it makes me want to do is pull my hair out. I would say the more sound effects you use the more it would make sense.
 
I started out as a pencil and paper person, and had assumed that was it for me. How could I ever stay focused and maintain control over what I was doing any other way? Fortunately, I wised up and realized that composing in real time, basically one part at a time, at the DAW, is a much better choice for me in almost every respect.

If I need a score, I simply make one after the fact. Logic isn't *awful* about notation, but if it's truly unsalvageable, it doesn't take that long for me to just transcribe it by hand.
 
This process wouldn't be necessary if you spent a small amount of time learning some basic information about each instrument you're using. For example, if you know each instrument's range and a little about what's technically possible then each part should translate into things that can be performed by humans.
 
Agreed. It's just my meaningless opinion, but I think anyone working with orchestral VI's owes it to themselves to learn at least the rudiments of orchestration, even if they intend to throw it all out the window after.
 
Depends on who I am writing for: if it's for media, then straight to the DAW, and for live players, straight to notation knowing that the live version will sound better than a DAW rendition.
 
At this point, even if I am composing for the concert hall, I'll still put in my notes via a DAW and import the midi into Finale. I wish the notation programs would get with the times regarding a lot of things, especially input.
 
At this point, even if I am composing for the concert hall, I'll still put in my notes via a DAW and import the midi into Finale. I wish the notation programs would get with the times regarding a lot of things, especially input.
Do you mean sound quality or input of notes?
 
Inputting notes. I have saved a lot of time by doing my "input" via the DAW and tweaking it after importing into Finale. This method may not be for everyone.
 
I used to do notation first before I put it into DAW, because I tend to be more imaginative when I see them on paper and imagine the sound in my head instead of hearing the MIDI. But it takes a much longer time to do that. So now I'd just do the voicing sketch on paper and put it into my DAW directly.
 
Years ago I used to write in notation software and play it back there. Not the best sound, but it was cool to use a PC for this, and the ability to change everything with a few clicks, trying weird things, was inspiring.
Then I got a DAW, and libraries, and I started exporting from the notation software to the DAW, but that was very frustrating. Things were lost in the process, things had to be added, and the result wasn't much better unless I changed everything...
Now I only use the DAW, but sometimes I miss the feeling of writing notes instead of painting lines.
IMHO the tools you use to create the music (DAW, notation software, paper, piano... whatever) change what you make, not only how you do it.
 
Thank you everybody for the very interesting insights. I think I feel very much like angeruroth, in missing the immediacy of writing in notation, and not being satisfied by the bad sound rendition of notation software. When writing straight to the score, I prefer to turn the audio preview off, to avoid being misled by what I hear (which is never what I imagine).

Logic still lacks any association between notation symbols and audio. Write a staccato dot on a note, and playback will simply ignore it. Microtonal accidentals are missing entirely. What looks like an easy task for a powerhouse of a program is not even attempted. And notation seems to be conceived for the most basic pop scores, rather than for classical music, even that of centuries ago.

This process wouldn't be necessary if you spent a small amount of time learning some basic information about each instrument you're using.

Or rather, it isn't that easy, since this kind of issue is usually solved by composers in their early youth (in my case, when I wrote my first orchestral piece at 14). Now that we know something about instrument ranges, we still have to deal with things like this:
 

Attachments

  • compl-score.png (22.9 KB)
My process is probably foolish.
I input note by note via mouse in the Cubase Score Editor.
So notation, but within the DAW.
 
Jamie, aren't you giving up the best part of a DAW, that is, the ability to play your music as it should sound? Or are you following a hybrid procedure, with further work done on the raw notes to make them sound more real?

Paolo
 
I wonder if what you do in the Score page of a DAW couldn't be done in a notation program. For example, velocity values and CC data could be automated via expression marks and hairpins (even hidden ones, for the most subtle dynamic changes).

What I wouldn't be able to do is give life to the phrase. I feel it is not just a matter of randomization. There is some other arcane quality in live playing that is not limited to a slightly delayed or anticipated attack. I've never been able to understand it, despite the many texts I've read on laboratory performance analysis.

Paolo
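To make the hairpin idea concrete: a crescendo mark could, in principle, be rendered as a ramp of MIDI CC11 (expression) events. Here is a toy Python sketch. The dynamic-to-CC table and the event density are made-up assumptions for illustration, not any notation program's actual behavior:

```python
# Sketch: render a crescendo hairpin as a ramp of MIDI CC11 (expression) events.
# DYNAMIC_CC is a hypothetical mapping; real programs use their own curves.

DYNAMIC_CC = {"pp": 30, "p": 45, "mp": 60, "mf": 75, "f": 90, "ff": 110}

def hairpin_to_cc(start_dyn, end_dyn, start_beat, end_beat, events_per_beat=8):
    """Return (beat, cc_value) pairs ramping linearly between two dynamics."""
    start_val = DYNAMIC_CC[start_dyn]
    end_val = DYNAMIC_CC[end_dyn]
    n = max(1, int((end_beat - start_beat) * events_per_beat))
    events = []
    for i in range(n + 1):
        t = i / n  # 0.0 at the hairpin's start, 1.0 at its end
        beat = start_beat + t * (end_beat - start_beat)
        value = round(start_val + t * (end_val - start_val))
        events.append((beat, value))
    return events

ramp = hairpin_to_cc("p", "f", start_beat=0.0, end_beat=2.0)
print(ramp[0], ramp[-1])  # (0.0, 45) (2.0, 90)
```

A hidden hairpin for a subtle swell would just be a narrower dynamic pair (say "mp" to "mf") over the same span; the mechanism is identical.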
 
You are absolutely right - I find this very interesting!
For I have noticed that if my composition process involves playing something on the keyboard, although it may result in a livelier performance, it is simply not as good musically. If there is a keyboard in my process then my output becomes slightly limited to keyboard-based conceptions. The composition flows relative to a mental image of a keyboard, not to mention those shapes my hands are comfortable with.
This doesn't happen if it goes simply from my mind to the page. It feels less limited. As though it is avoiding a reductive lens.
I know it's probably silly but I will happily sacrifice a bit of performance fidelity for the sake of improved compositional quality.
But I would never recommend my approach to ANYONE! :)
 
Wow, Jamie, I'm amazed. I tried that (it was one of my reasons to pick Cubase) but I just couldn't work with that editor.
Now I use the piano roll or the Yamaha, but I'll give that another try.
ptram, yes, the sheet editor works like any notation program, and it's integrated with expression maps, so the idea is not bad (everything in the same software).
IMHO, the trick when not playing the keyboard is to listen carefully to the result (how the selected patches actually sound) and change the notes while ignoring the grid. It also helps to use tiny randomizations with layering, and to play some parts on the piano so it feels less machine-made.
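The "tiny randomizations" mentioned above can be sketched as small uniform jitter on note starts and velocities. This is a minimal, made-up illustration (the jitter ranges are arbitrary), and, as noted earlier in the thread, it is a crude stand-in for real performance nuance:

```python
import random

def humanize(notes, timing_jitter=0.01, velocity_jitter=6, seed=None):
    """Apply small random offsets to (start_beat, pitch, velocity) tuples.

    timing_jitter is in beats; velocity stays clamped to the MIDI 1-127 range.
    """
    rng = random.Random(seed)  # seedable for reproducible renders
    out = []
    for start, pitch, velocity in notes:
        new_start = max(0.0, start + rng.uniform(-timing_jitter, timing_jitter))
        new_vel = min(127, max(1, velocity + rng.randint(-velocity_jitter, velocity_jitter)))
        out.append((new_start, pitch, new_vel))
    return out
```

Running the same phrase through `humanize` twice with different seeds gives two slightly different "takes", which is roughly what layering randomized parts achieves in a DAW.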
 