Earl G. Williams
Naval Res. Lab., Code 7137, 4555 Overlook Ave. SW, Washington, DC 20375-5350
A near-field acoustical holography data set typically consists of about 0.5 Gbytes of acoustic pressure data; after routine processing, the total quickly grows to 2.0 Gbytes. Using a recent experiment as an example, this paper shows how a typical data set is handled, processed, and reprocessed to eventually provide insight into the physics of vibration, radiation, and scattering. The nuts and bolts of the processing, optimized through years of experience, will be discussed: how such large data sets are managed; what computers, programs, signal-processing algorithms, and libraries are used; and how the data are visualized, including display devices and computer networks. In the quest for physical understanding, time is of critical importance: it is not only what can be done that counts, but how fast it can be done.
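The abstract does not specify the reconstruction step, but the core of planar near-field acoustical holography processing is an angular-spectrum back-propagation of the measured pressure hologram. The sketch below is a minimal, hypothetical illustration of that FFT-based step (function name, grid parameters, and the absence of k-space regularization are all assumptions, not the author's actual code); in practice the inverse propagator amplifies evanescent components and must be regularized.

```python
import numpy as np

def nah_backpropagate(p_holo, dx, dy, k, dz):
    """Back-propagate a complex pressure hologram p_holo (shape (Ny, Nx))
    from the measurement plane toward the source by a distance dz,
    using the planar angular-spectrum relation of NAH.

    Hypothetical sketch: a real implementation would apply a k-space
    filter to regularize the amplified evanescent components.
    """
    Ny, Nx = p_holo.shape
    # spatial-frequency (wavenumber) grids for the hologram aperture
    kx = 2 * np.pi * np.fft.fftfreq(Nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(Ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    # kz is real for propagating waves, imaginary for evanescent ones
    kz = np.sqrt(np.asarray(k**2 - KX**2 - KY**2, dtype=complex))
    # multiply the angular spectrum by the inverse propagator exp(-i kz dz)
    P = np.fft.fft2(p_holo)
    return np.fft.ifft2(P * np.exp(-1j * kz * dz))
```

For a single propagating plane wave the back-propagation only shifts phase, so the pressure magnitude is preserved, which provides a quick sanity check on the wavenumber bookkeeping.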