From the songs of whales to the crash and hiss of waves, the ocean is a sea of sound. For years, scientists trying to improve traditional underwater-detection techniques, such as sonar, have viewed this background noise as a nuisance.
To Michael Buckingham, however, that noise can shed a whole new "light" on what lies or moves beneath the surface. The professor of ocean acoustics at the Scripps Institution of Oceanography in La Jolla, Calif., is refining a technique he pioneered for using the ocean's ready supply of background noise to yield images of underwater objects.
The approach, dubbed "acoustic daylight," could find applications ranging from giving submerged submarines forward vision to surveying the sea floor and monitoring efforts to lay undersea cable, he says.
The ocean, he explains, is much more transparent to sound than to light. Like light, sound has a characteristic set of frequencies, or wavelengths, which an object can absorb, reflect, and scatter. While working on conventional sonar technologies for the British government in the mid-1980s, Dr. Buckingham says, "it occurred to me that one might be able to use the ambient noise in the ocean like using the natural-light field to take a photograph."
In 1991, he and colleagues at Scripps fielded their first acoustic "lens," which looked like a small satellite dish. The dish-shaped reflector gathered incoming sound and focused it on an underwater microphone, or hydrophone.
The lens was connected to a computer, which turned the signal into an image consisting of a single large rectangle, or pixel, on a screen. The idea, he says, was to see if the noise level changed when divers placed an object in front of the lens. Their target consisted of a plywood sheet 1 meter (about 39 inches) square, covered with neoprene and placed at distances ranging from about 7 to 12 meters (23 to 40 feet) away.
When they aimed the lens at the target, the sound's volume doubled, and the target reflected some frequencies better than others. This gave the team hope that by having the computer assign colors to specific frequencies, they could generate "false-color" images. In principle, he says, "false-color images could allow us to make inferences about the nature of an object."
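The false-color idea can be illustrated with a minimal sketch. The code below is not the team's actual software; it simply shows one plausible scheme, with hypothetical frequency bands, for turning band-averaged noise intensities into an RGB pixel value.

```python
import numpy as np

def false_color_pixel(freqs_hz, intensity,
                      bands=((8_000, 20_000),    # red   (hypothetical band)
                             (20_000, 40_000),   # green (hypothetical band)
                             (40_000, 80_000))): # blue  (hypothetical band)
    """Map reflected-noise intensity in three frequency bands to an RGB value.

    freqs_hz  : frequencies (Hz) at which intensity was measured
    intensity : measured noise intensity at those frequencies
    bands     : (low, high) Hz ranges assigned to red, green, blue
    """
    channels = []
    for lo, hi in bands:
        mask = (freqs_hz >= lo) & (freqs_hz < hi)
        channels.append(intensity[mask].mean() if mask.any() else 0.0)
    rgb = np.array(channels)
    # Normalize so the strongest band appears at full brightness (255).
    if rgb.max() > 0:
        rgb = rgb / rgb.max()
    return tuple(int(round(255 * v)) for v in rgb)

# Toy spectrum in which an object reflects low frequencies more strongly,
# so the resulting pixel is tinted toward red:
freqs = np.linspace(8_000, 80_000, 720)
power = np.where(freqs < 20_000, 2.0, 1.0)
print(false_color_pixel(freqs, power))
```

An object that reflects some bands better than others thus gets a characteristic hue, which is the sense in which false color could, as Buckingham says, support inferences about an object's nature.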
Last summer, Buckingham's team used a souped-up version of the lens, which they named Adonis. It used 126 hydrophones at the focus of a larger dish. More hydrophones allowed the team to generate more pixels on the screen, testing the lens's ability to distinguish shapes. They also improved the signal-processing software to include color and to refine the rough image the lens provided. And they gave the lens an ability to rotate like radar. Although the image is not as sharp as a TV image, "we get nice fluid movement," Buckingham says.
Not everyone is sold on the concept of acoustic daylight. Some researchers, such as Nicholas Makris of the US Naval Research Laboratory, have argued that the range for detecting an object the size of a refrigerator would be limited to a few yards at best, because beyond that range the noise reflected from the object would be weaker than the ocean's background noise level.
The team's initial results looked promising, says Stewart Glegg, a professor of ocean engineering at Florida Atlantic University in Boca Raton, Fla., who spent a summer working with Buckingham at Scripps. "But they just looked where they expected to see something." In his own acoustic-daylight experiments, he says, he set up an array of hydrophones that, although "steerable" electronically, took readings simultaneously from all directions. While it detected its target, that return didn't differ markedly from random fluctuations in sound coming from elsewhere.
Yet Buckingham says the results of the latest experiments, using the rotating lens and targets of different shapes and reflective properties, have been very encouraging. In one instance, a graduate student lowering a sand-filled plastic drum dropped it while the lens was active. Not only did Adonis record the fall, it "saw" the silt kicked up when the drum hit bottom, Buckingham says. "I was astonished!"
So far, the device has been tested on targets up to about 80 meters away. In principle, he says, Adonis could track objects up to a kilometer away. With additional improvements in signal processing, and by placing 1,000 tiny hydrophones at the focus of a slightly smaller dish, the team hopes to generate dramatically sharper images.
Buckingham says his ultimate hope would be to combine an acoustic-daylight system with "intelligent" navigation systems that learn as they go for use in unmanned undersea vehicles. Used as "surveillance cameras," acoustic-daylight devices one day could help pinpoint, for example, sonar-triggered mines.