I would say "yes": 16 bits give your DAW a total of 2^16 = 65,536 distinct values to represent the raw audio from -1 to +1. 24 bits give it 2^24 = 16,777,216. In other words, you gain an additional 8 bits of resolution, corresponding to 256 levels between each pair of adjacent 16-bit values.
This allows effects (compression, etc.) to introduce less audible quantization noise. Many inserts do process at a higher internal precision (often 32-bit float) to avoid rounding errors — note this is higher bit depth, not up-sampling, which refers to sample rate. However, your DAW may not, so I see no reason not to take the higher precision.
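To make the resolution difference concrete, here is a minimal sketch (my own illustration, not from the linked article) that quantizes a full-scale sine wave to 16 and 24 bits and measures the resulting signal-to-noise ratio. Theory predicts roughly 6.02 × bits + 1.76 dB for a full-scale sine, so about 98 dB at 16 bits versus about 146 dB at 24 bits.

```python
import math

def quantize(x, bits):
    # Mid-tread quantizer: map [-1, 1] onto 2^bits signed levels and back.
    levels = 2 ** (bits - 1)
    return round(x * levels) / levels

def snr_db(bits, n=100_000, freq=440.0, rate=48_000.0):
    # Measure quantization SNR numerically on a full-scale sine.
    sig = noise = 0.0
    for i in range(n):
        x = math.sin(2 * math.pi * freq * i / rate)
        q = quantize(x, bits)
        sig += x * x
        noise += (q - x) ** 2
    return 10 * math.log10(sig / noise)

print(f"16-bit SNR: {snr_db(16):.1f} dB")  # theory: ~98 dB
print(f"24-bit SNR: {snr_db(24):.1f} dB")  # theory: ~146 dB
```

The ~48 dB gap between the two is exactly the 8 extra bits at ~6 dB per bit, which is the headroom that keeps cumulative rounding from plugin chains below audibility.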
Here's a good link from Wikipedia that explains it in more detail and also discusses how this relates to noise and dynamic range.
Audio interface: I would make sure the audio interface runs at the same bit depth as the DAW, since that reduces the processing needed to hand the audio off from the DAW to the interface.
If you need to deliver in 16 bits (e.g. for CD), you should (imo) down-convert as the very last step, after mastering.