ASA 127th Meeting M.I.T. 1994 June 6-10

5pSP37. Detection of auditory-visual and tactile-visual onset order in a speech-like context.

Robin S. Waldstein

Arthur Boothroyd

Speech & Hear. Sci., Graduate Ctr., City Univ. of New York, 33 W. 42nd St., New York, NY 10036

The abilities of five subjects to perceive the order of auditory-visual and tactile-visual events in a speech-like context were examined. The visual event consisted of a digitized video image of an adult female's face changing from mouth-closed to mouth-open position. The auditory/tactile event consisted of a 250-Hz tone of 500-ms duration, presented binaurally over headphones (auditory) or to the index finger of the right hand via a B&K 4810 Mini-shaker (tactile). Subjects were required to indicate which event came first: the (auditory or tactile) tone onset or the mouth-opening. Tone onset ranged from -100 to +200 ms re mouth-opening, in 20-ms steps. Presentation of relative onsets was randomized, and modality was counterbalanced. Results revealed similar patterns of performance in auditory-visual and tactile-visual tasks. Subjects demonstrated a bias toward perceiving auditory/tactile tone onset before mouth-opening. A delay of about 100 ms was required before subjects began to perceive mouth-opening as occurring before tone onset. This delay is substantially larger than the 20-ms bimodal temporal resolving ability reported for very brief stimuli [I. J. Hirsh and C. E. Sherrick, J. Exp. Psychol. 62, 423–430 (1961)]. [Work supported by NIH Grant No. P01DC00178.]
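The trial schedule described above (tone onsets from -100 to +200 ms relative to mouth-opening in 20-ms steps, randomized within modality, modality order counterbalanced) can be sketched as follows. This is a minimal illustration, not the authors' software: the function names, block structure, and repetition count are assumptions; only the SOA range, step size, randomization, and counterbalancing come from the abstract.

```python
import random

# Tone onset re mouth-opening, -100 to +200 ms in 20-ms steps (from the abstract).
SOAS_MS = list(range(-100, 201, 20))  # 16 onset asynchronies
MODALITIES = ("auditory", "tactile")

def make_block(modality, reps=1, rng=random):
    """Return a randomized list of (modality, soa_ms) trials for one block.
    `reps` (repetitions per SOA) is an illustrative assumption."""
    trials = [(modality, soa) for soa in SOAS_MS for _ in range(reps)]
    rng.shuffle(trials)  # randomize presentation of relative onsets
    return trials

def make_session(first_modality="auditory", reps=1, rng=random):
    """Counterbalance modality order across blocks: half of subjects
    would start with the other modality."""
    order = (first_modality,) + tuple(m for m in MODALITIES if m != first_modality)
    return [make_block(m, reps, rng) for m in order]
```

A negative SOA here means the tone leads the mouth-opening; the reported bias implies that SOAs up to roughly +100 ms were still judged as tone-first.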