> Maybe I could help tidy things up a bit here as someone whose main
> research lies in vision.
Thank you for your timely interjection.
> The argument that coupled oscillations in the visual system are caused
> by high frequency eye movements does not seem very viable. The eye shows
> a tremor at between 30-70Hz, but of very low amplitude (typically around
> 1/4 of the half-height width of the optical point spread function).
> These will not normally lead to changes in the retinal stimulation of a
> sufficiently high contrast to cause measurable temporal effects on the
> visual system.
According to Mathew Alpern:
"there are oscillations... at rates of about 50 Hz and
amplitudes up to about 1 min of arc, together with slower
oscillations that may be of larger amplitude. ... All classes
of movement must help to preserve continuous fading."
According to Woodhouse and Barlow on the matter of spatial acuity:
"The frequency that can be detected at 100% contrast represents
the observer's acuity... In the case of the normal observer the
limit lies between 50-60 cycles per deg. (equivalent to a bar
width of between 37 and 30 s of arc)"
Would you absolutely rule out the possibility that the slower
oscillations which Alpern describes (< 50 Hz), with their larger
amplitude (> 1 min of arc), could transmit jitter into the
brain, e.g. for a stimulus such as a single oriented bar (100%
contrast) in the foveal region with a luminance of, say, 100 to
1000 trolands?
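As a quick back-of-envelope check on the numbers in the Woodhouse
and Barlow quote (the function name below is just for illustration),
a drift of 1 min of arc is about twice the bar width at the acuity
limit:

```python
# Convert a grating's spatial frequency (cycles/deg) to the width
# of one bar (half a cycle) in seconds of arc.
def bar_width_arcsec(cycles_per_deg):
    return 0.5 * (1.0 / cycles_per_deg) * 3600.0

print(bar_width_arcsec(50))  # 36.0 arcsec
print(bar_width_arcsec(60))  # 30.0 arcsec

# Alpern's slower oscillations: amplitude > 1 min of arc = 60 arcsec,
# i.e. larger than the bar width at the acuity limit, so such drift
# would sweep a 100%-contrast bar across more than one bar width.
tremor_arcsec = 1.0 * 60.0
print(tremor_arcsec > bar_width_arcsec(50))  # True
```

(The exact arithmetic gives 36 rather than the quoted 37 s of arc
at 50 cycles/deg, presumably a rounding difference in the source.)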
> The drift of the discussion so far suggests that the auditory system
> maps temporal frequency onto spatial layout. How about temporal
> information (ie envelope information)?
As you may have inferred from previous messages, for the last
few years I have been promoting a theory of the auditory cortex
in which temporal and temporal-frequency information is
represented "spatially", in the form of a set of 3-D
spatio-temporal filters, in a manner analogous to the way in
which the visual cortex represents visual motion.
See for example, Todd, N. P. McAngus (1996). Towards a theory
of the principal monaural pathway: pitch, time and auditory
grouping. In W. Ainsworth and S. Greenberg (Eds.), Proceedings
of the International Workshop on the Auditory Basis of Speech
Perception. Keele, July 1996, pp. 216-221.
In the visual cortical case the spatio-temporal filtering is
modelled as a family of 3-D Gabor filters whose three
dimensions are:
(i) space x;
(ii) space y; and
(iii) time t.
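A minimal sketch of such a filter, assuming the standard separable
form (a Gaussian envelope over x, y and t multiplied by a drifting
cosine carrier; all parameter values here are arbitrary
illustrations, not values from the theory):

```python
import numpy as np

def gabor_3d(nx=16, ny=16, nt=16, fx=0.1, fy=0.0, ft=0.1,
             sigma_s=3.0, sigma_t=3.0):
    """3-D Gabor: Gaussian envelope over (x, y, t) times a
    drifting cosine carrier. The preferred velocity is set by
    the ratio of temporal to spatial carrier frequency."""
    x = np.arange(nx) - nx // 2
    y = np.arange(ny) - ny // 2
    t = np.arange(nt) - nt // 2
    X, Y, T = np.meshgrid(x, y, t, indexing="ij")
    envelope = np.exp(-(X**2 + Y**2) / (2 * sigma_s**2)
                      - T**2 / (2 * sigma_t**2))
    carrier = np.cos(2 * np.pi * (fx * X + fy * Y + ft * T))
    return envelope * carrier

g = gabor_3d()
print(g.shape)  # (16, 16, 16)
```

Convolving an (x, y, t) image sequence with a bank of such filters,
tuned to different orientations and velocities, is the usual
motion-energy picture of visual motion processing.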
In the auditory cortical case the spatio-temporal filtering is
modelled as a family of 3-D causal filters whose three
dimensions are:
(i) the cochleotopic axis (approx. 30 Hz - 10000 Hz);
(ii) the periodotopic axis (approx. 10 Hz - 1000 Hz); and
(iii) the time axis (approx. 0.5 Hz - 100 Hz).
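To make the contrast with the visual case concrete, here is a toy
sketch of the temporal kernel for one point on the periodotopic
axis. I have assumed a gamma-windowed cosine (gammatone-like) shape
purely for illustration; unlike the symmetric Gabor used for
vision, it is causal, i.e. zero for t < 0:

```python
import numpy as np

def causal_kernel(fm, tau=0.05, order=4, fs=1000, dur=0.3):
    """Causal temporal kernel tuned to modulation frequency fm
    (Hz): a gamma envelope t**(order-1) * exp(-t/tau), defined
    for t >= 0 only, times a cosine carrier at fm."""
    t = np.arange(int(dur * fs)) / fs      # t >= 0 only: causal
    env = t**(order - 1) * np.exp(-t / tau)
    env /= env.max()
    return env * np.cos(2 * np.pi * fm * t)

# A few points along the periodotopic axis (the text gives its
# range as approx. 10 - 1000 Hz).
bank = {fm: causal_kernel(fm) for fm in (10, 50, 100)}
print(bank[10].shape)  # (300,)
```

A full 3-D filter would combine such kernels with tuning along the
cochleotopic axis as well, giving the auditory analogue of a
velocity-tuned visual filter.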
The existence of the first two orthogonal "spatial" dimensions
in the cortex has recently been demonstrated by Gerald Langner.
Crudely, one may think of this as a projection or image of the
inferior colliculus (IC) in the cortex. It has been known for
some time that in the IC there is a topographic representation
of periodicity orthogonal to the cochleotopic axis, due to
Schreiner and Langner (1988). This observation has led to a
number of models for 2-D maps, e.g.
Kohlrausch et al. (1992) Temporal resolution and modulation
analysis in models of the auditory system. In M. Schouten (Ed.),
The auditory processing of speech: from sounds to words. New
York: Mouton de Gruyter. pp. 85-98.
Kollmeier and Koch (1994) JASA 95(3), 1593-1602.
The idea which I have been promoting, that one may extend this
to the cortex by the use of 3-D filters analogous to those for
visual motion, is, as you may have gathered from some of the
exchanges, somewhat more controversial. It was developed by
considering the relationship between rhythm and motion (see
e.g. Todd (1995) JASA 97(3), 1940-1949). If you are interested
in any further references, some of which are already posted in
previous messages, then I will happily supply them.