This evening it occurred to me that varying the pitch of a note, while generating its phase from a multiple of the phase of a base tone, might result in artifacts. I'm not certain of this, and cannot yet articulate why I think it could happen, but it seems at least plausible.
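To pin down what I mean, here is a minimal sketch of the scheme as I've described it, with every name (`BASE_FREQ`, `ratio`, and so on) chosen by me for illustration; the real implementation surely differs in its details:

```python
import math

SAMPLE_RATE = 48000.0
BASE_FREQ = 8.1757989156  # an arbitrary low base tone, just for the sketch

def note_locked_to_base(ratio, num_samples):
    """Generate a note whose phase is always a multiple of the base tone's phase."""
    samples = []
    base_phase = 0.0  # in cycles; grows without bound
    for _ in range(num_samples):
        note_phase = ratio * base_phase  # the note is phase-locked to the base
        samples.append(math.sin(2.0 * math.pi * note_phase))
        base_phase += BASE_FREQ / SAMPLE_RATE
    return samples
```

One observation about this sketch: changing `ratio` mid-run would rescale the entire accumulated `base_phase`, not just the current sample's increment. Whether that is what would actually produce audible artifacts is exactly the open question.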
A solution also occurred to me, which is to use the base tone only to generate the initial phase of the note, and from that point on to track its phase independently. That thought led to another complication: when you want the varying pitch to come to rest on a specific tone, the phase of the note may not align with a newly generated note of the same frequency.
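Before getting to that complication, here is the mechanism itself as a sketch (again with names of my own choosing): the base phase is consulted exactly once, to seed the note, and after that the note advances its own accumulator.

```python
import math

SAMPLE_RATE = 48000.0

class Note:
    """A note seeded from the base tone's phase, then tracked independently."""
    def __init__(self, freq, base_phase, ratio):
        self.freq = freq
        # The base tone's only contribution: the note's starting phase.
        self.phase = (ratio * base_phase) % 1.0  # phase in cycles, kept in [0, 1)

    def next_sample(self):
        sample = math.sin(2.0 * math.pi * self.phase)
        self.phase = (self.phase + self.freq / SAMPLE_RATE) % 1.0
        # self.freq may be changed freely between samples to vary the pitch.
        return sample
```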
I first thought about pacing the change in pitch so that it would end up phase-aligned at the target frequency. This would work for scripted compositions, but in live performance there is no way to know what the target frequency will be until it happens.
So it seems as though a better solution would be to cross-fade from the sliding note to a newly generated note that is stable at the target frequency.
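Sketched using the `Note` class above, an equal-power cross-fade over a roughly 10 ms window; the curve and the window length are my guesses at reasonable values, not anything settled:

```python
def crossfade_to_target(sliding, target, fade_samples=480):  # ~10 ms at 48 kHz
    """Fade out the sliding note while fading in the stable target note."""
    for i in range(fade_samples):
        t = i / fade_samples
        # Equal-power curves keep the combined loudness roughly constant.
        yield (math.cos(t * math.pi / 2.0) * sliding.next_sample()
               + math.sin(t * math.pi / 2.0) * target.next_sample())
```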
But if this mechanism (independently tracking the phase of each note after initiating it from the phase of the base frequency) is in place for notes with varying pitch, why not use it for all notes, and stop worrying about whether they will remain at a constant pitch?
Applying this technique to all notes would mean the base tone is used only to initiate new notes. At that point precision is no longer an issue, and we can dispense with 80-bit floats!
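My assumption here (worth stating loudly, since it's mine and not something I've verified against the synth) is that the 80-bit floats were needed because a phase derived from a long-running base accumulator loses fractional resolution as the accumulator grows, while a per-note phase wrapped to [0, 1) every sample never does. A toy comparison in ordinary doubles:

```python
import math

FREQ = 440.0  # an arbitrary note for the comparison

# After an hour, an unwrapped phase accumulator has reached ~1.58 million cycles.
unwrapped_phase = FREQ * 3600.0

# math.ulp gives the gap to the next representable double at a given magnitude.
print(f"resolution at {unwrapped_phase:.3e} cycles: {math.ulp(unwrapped_phase):.3e}")
print(f"resolution in wrapped [0, 1) phase:         {math.ulp(1.0):.3e}")
# ~2.3e-10 cycles versus ~2.2e-16: wrapping buys back about six orders of magnitude.
```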
[7/5/19: The thought that set this all in motion, that varying the pitch of a note while generating its phase from a multiple of the phase of a base tone might result in artifacts (noise), remains a matter of conjecture. I haven't yet hit upon a way of determining whether this is an actual concern. However, eliminating the need for 80-bit floats is sufficient motivation to proceed as though it were established fact.]