ASA 124th Meeting New Orleans 1992 October

2pPP4. Real-time implementation of spatial auditory displays.

Scott Foster

Crystal River Eng., 12350 Wards Ferry Rd., Groveland, CA 95321

Recent advances in VLSI hardware have made possible the construction of real-time systems that can "render" acoustics. These systems use tracking mechanisms to monitor the position and orientation of a listener and produce synthetic binaural sound for that listener using headphones. The most sophisticated of these systems utilize supercomputing performance to render complex acoustic scenarios including transmission loss effects, reflection, nonuniform radiation, Doppler effects, and more, resulting in a compelling and natural presentation. With the advances in algorithms and hardware to produce such simulations, there is a need to develop an extensible protocol for virtual audio. Such a protocol will need to encompass all the acoustic models in use today and those expected to be developed in the near future. It is hoped that this protocol will allow virtual reality developers to utilize virtual audio technology, even as that technology makes dramatic improvements in its capabilities.
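The head-tracked binaural rendering the abstract describes can be illustrated with a toy geometric model: given a tracked head position and orientation, compute a per-ear gain (spherical 1/r transmission loss) and a per-ear propagation delay, whose difference between ears yields the interaural time difference. This is only a minimal sketch of the idea; the function name, the nominal head radius, and the planar two-ear geometry are assumptions for illustration, and real systems of this class instead convolve each source with measured head-related transfer functions and add reflection and Doppler processing.

```python
import math

C = 343.0        # speed of sound in air, m/s
HEAD_R = 0.0875  # nominal head radius, m (assumed value)

def render_gains_delays(src, head_pos, yaw):
    """Toy per-ear rendering parameters for one point source.

    src, head_pos -- (x, y) positions in metres
    yaw           -- tracked head orientation in radians (0 = facing +x)
    Returns {"left"/"right": (gain, delay_seconds)}.
    """
    # The interaural axis is perpendicular to the facing direction.
    ax = math.cos(yaw + math.pi / 2)
    ay = math.sin(yaw + math.pi / 2)
    ears = {
        "left":  (head_pos[0] + HEAD_R * ax, head_pos[1] + HEAD_R * ay),
        "right": (head_pos[0] - HEAD_R * ax, head_pos[1] - HEAD_R * ay),
    }
    out = {}
    for side, (ex, ey) in ears.items():
        r = math.hypot(src[0] - ex, src[1] - ey)
        gain = 1.0 / max(r, HEAD_R)  # spherical transmission loss ~ 1/r
        delay = r / C                # propagation delay; L/R difference = ITD
        out[side] = (gain, delay)
    return out

# Listener at the origin facing +y; source 1 m to the listener's right.
# The right ear is closer, so it receives higher gain and shorter delay.
params = render_gains_delays(src=(1.0, 0.0), head_pos=(0.0, 0.0), yaw=math.pi / 2)
```

Updating these gains and delays every frame from the tracker, and interpolating the delays between frames, is also what produces the Doppler shift mentioned above: a source moving toward the listener shortens the delay over time, raising the perceived pitch.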