relative phase information in audition and vision
Here's a different question in the midst of all the high-frequency
discussion. If you'll send your replies to me I'll send a single email to
the list compiling them.
I've been working on some studies examining possible parallels between the
processing of spatial frequency and auditory frequency (e.g., Ivry and
Robertson, 1998), and the issue of phase information was raised to me
recently; a colleague pointed out that while relative phase information
seems to be unimportant when processing multiple auditory frequencies
(e.g., computing pitch), it is very important in vision. If you look at an
image that contains the same spatial frequencies at the same amplitudes,
but with their relative phases disrupted, the image is not coherent.
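The phase-scrambling demonstration my colleague described can be sketched in a few lines of NumPy. This is a toy illustration (the "image" is just a hypothetical bright square, not a real photograph): the amplitude spectrum is kept exactly, only the phase spectrum is replaced with that of a random image, and the spatial structure disappears.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a 64x64 array containing a bright square.
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0

# 2-D Fourier transform: separate amplitude and phase spectra.
F = np.fft.fft2(img)
amplitude = np.abs(F)

# Take the phase spectrum of a random real-valued image instead.
# (Because both images are real, this random phase spectrum has the
# Hermitian symmetry needed for the inverse transform to be real.)
random_phase = np.angle(np.fft.fft2(rng.standard_normal(img.shape)))

# Recombine: identical amplitudes, scrambled phases.
scrambled = np.real(np.fft.ifft2(amplitude * np.exp(1j * random_phase)))

# The amplitude spectrum is unchanged, but the square is gone.
print(np.allclose(np.abs(np.fft.fft2(scrambled)), amplitude))  # True
```

The same manipulation applied to a natural photograph produces the familiar cloud-like noise: every spatial frequency is still present at its original strength, yet nothing recognizable remains.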
My question is whether anyone has ideas about why this is the case. I'm
starting to wonder if it results from the kind of information that phase
might provide about the environment; i.e., perhaps phase information is
important for correctly interpreting the spatial scene in vision, but is
not as critical in audition. Does the indifference of pitch-computation
mechanisms to phase reflect that phase carries little information about
objects in the auditory environment?
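The auditory side of the contrast can be sketched the same way. Below, two harmonic complexes (hypothetical values: a 200 Hz fundamental with five harmonics) are built with identical component frequencies and amplitudes but different starting phases; their waveforms differ, yet their power spectra are identical, which is why they evoke the same pitch.

```python
import numpy as np

fs = 16000                      # sample rate (Hz)
t = np.arange(fs) / fs          # one second of time samples
f0 = 200.0                      # fundamental frequency (Hz)
harmonics = [1, 2, 3, 4, 5]

rng = np.random.default_rng(1)
rand_phases = rng.uniform(0, 2 * np.pi, len(harmonics))

# Same frequencies and amplitudes; cosine (zero) phase vs. random phases.
cosine_complex = sum(np.cos(2 * np.pi * n * f0 * t) for n in harmonics)
random_complex = sum(np.cos(2 * np.pi * n * f0 * t + p)
                     for n, p in zip(harmonics, rand_phases))

# The waveforms differ sample by sample...
print(np.allclose(cosine_complex, random_complex))            # False

# ...but the magnitude spectra are the same.
spec_cos = np.abs(np.fft.rfft(cosine_complex))
spec_rand = np.abs(np.fft.rfft(random_complex))
print(np.allclose(spec_cos, spec_rand, atol=1e-6))            # True
```

Listened to, both complexes have the same 200 Hz pitch despite very different waveform shapes, which is the classic demonstration of (approximate) phase insensitivity in pitch perception.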
Also, are other elements of audition besides pitch perception (e.g.,
timbre) more dependent on relative phase? I would imagine that this is the
case at least for timbres that depend on the attack, as with a trumpet.
Thanks for your suggestions, reference ideas, etc.