
[no subject]

Controlling the directionality of sound has always been
a classic problem in designing flight simulators.  How
does one convey the broad aerodynamic hiss around the nose
of the aircraft as well as the point source of an
engine failure?  Current designs involve almost a
dozen speakers spread around the cockpit.  Not only is such
an installation expensive, but the simulated sound
environments it produces are not always convincing.

Recent commercial demos of stunning 'imaging' of
auditory environments have inspired a research effort at CAE
to control the directionality of simulated sounds more directly.
I am very interested in hearing comments and receiving
references concerning this proposed effort.  Specifically, does
current research in directionality provide the knowledge needed
to process sounds so as to 'place' them in an auditory field
using a small number of speakers?
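As one illustration of the kind of processing in question, below is a
minimal sketch of constant-power amplitude panning, a standard way to
'place' a mono source between two speakers.  The function names and the
0-to-1 pan convention are my own assumptions for the sketch, not
anything from CAE's effort; real spatialization (e.g. HRTF-based
methods) is considerably more involved.

```python
import math

def constant_power_gains(pan):
    """Left/right gains for a pan position in [0.0, 1.0],
    where 0.0 is hard left and 1.0 is hard right.
    The gains satisfy gL**2 + gR**2 == 1, so the total
    radiated power stays constant as the source moves."""
    theta = pan * math.pi / 2.0
    return math.cos(theta), math.sin(theta)

def pan_mono_to_stereo(samples, pan):
    """Apply the pan gains to a mono sample sequence,
    yielding (left, right) sample pairs."""
    g_left, g_right = constant_power_gains(pan)
    return [(s * g_left, s * g_right) for s in samples]
```

For example, a source panned to the center (pan = 0.5) receives equal
gains of about 0.707 on each channel, rather than the 0.5 a naive
linear pan would give; this avoids the perceived loudness dip at the
center.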

Thank you for your time,
Jasba Simpson
Flight Simulator Motion and Sound Systems
CAE Electronics Ltd.