
Re: About importance of "phase" in sound recognition

Many or most of these arguments seem to hinge on semantics.  When we talk on the phone, do we hear the
phone transducer or the person speaking on the other end?  Eliminate either and the connection is lost.

Take two sine waves at slightly different frequencies. You hear beats: the signal amplitude rises and falls as the sine waves
"move" in and out of phase.
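A minimal sketch of that beating in Python (NumPy assumed; the 440/444 Hz tones and 8 kHz sample rate are arbitrary illustrative choices):

```python
import numpy as np

fs = 8000                          # sample rate in Hz (arbitrary choice)
t = np.arange(fs) / fs             # one second of samples
f1, f2 = 440.0, 444.0              # two tones 4 Hz apart

# Sum of two sines at nearby frequencies.
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Trig identity: sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2).
# The cos factor is a slowly varying amplitude envelope; the ear
# hears its magnitude pulsing f2 - f1 = 4 times per second (the beats).
carrier = np.sin(2 * np.pi * (f1 + f2) / 2 * t)
envelope = 2 * np.cos(2 * np.pi * (f2 - f1) / 2 * t)
```

The identity makes the semantic point concrete: the same waveform is exactly "two tones drifting in phase" and exactly "one tone with a changing amplitude".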

You can say we are hearing phase (differences, changes), or you can say we are hearing amplitude changes
caused by phase.  I'm sure it gets really tricky with more than two frequencies - another opportunity for aural illusion?

Substituting "onset time" for phase renders another very interesting but incomplete specification.

We could create waves that do not begin at a zero crossing and (incompletely) describe the resulting clicks in
terms of phase, though they would probably be better described in terms of impulse response.
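A rough sketch of that onset click in Python (NumPy assumed; the tone frequency, fade length, and 2 kHz cutoff are arbitrary choices). A tone started at its peak has a step at onset, which spreads energy across the spectrum; a tone started at a zero crossing does not:

```python
import numpy as np

fs = 8000                               # sample rate in Hz (arbitrary)
n = 800                                 # 100 ms of tone
t = np.arange(n) / fs
f = 440.0

def tone(phase):
    x = np.sin(2 * np.pi * f * t + phase)
    x[-200:] *= np.linspace(1, 0, 200)  # fade the tail so only the onsets differ
    return np.concatenate([np.zeros(200), x, np.zeros(200)])

smooth = tone(0.0)        # starts at a zero crossing: no step at onset
clicky = tone(np.pi / 2)  # starts at its peak: a unit step at onset

def hf_energy(x):
    # Energy above 2 kHz, where a pure 440 Hz tone has almost none.
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return np.sum(spec[freqs > 2000] ** 2)

# The stepped onset puts far more broadband energy (the click) into
# the high frequencies than the zero-crossing onset does.
ratio = hf_energy(clicky) / hf_energy(smooth)
```

Calling the difference "phase" is true but incomplete; the broadband spectrum of the step is what the impulse-response view captures directly.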

It's particularly interesting to see long mathematical analyses based on semantic errors or incomplete specifications!
Perhaps we can agree that "Phase creates audible effects".