Computationally Efficient Algorithms for Estimating the Angle of Arrival of Helicopters Using Acoustic Arrays
by Geoffrey Goldman
ARL-TR-4998
September 2009
Approved for public release; distribution unlimited.
NOTICES
Disclaimers
The findings in this report are not to be construed as an official Department of the Army position unless so designated by other authorized documents.
Citation of manufacturers or trade names does not constitute an official endorsement or approval of the use thereof.
Destroy this report when it is no longer needed. Do not return it to the originator.
Army Research Laboratory
Adelphi, MD 20783-1197
ARL-TR-4998
September 2009
Computationally Efficient Algorithms for Estimating the Angle of Arrival of Helicopters Using Acoustic Arrays
Geoffrey Goldman
Sensors and Electron Devices Directorate, ARL
Approved for public release; distribution unlimited.
REPORT DOCUMENTATION PAGE
Form Approved OMB No. 0704-0188
1. REPORT DATE (DD-MM-YYYY)
2. REPORT TYPE
3. DATES COVERED (From - To)
September 2009
Summary
4. TITLE AND SUBTITLE
5a. CONTRACT NUMBER
Computationally Efficient Algorithms for Estimating the Angle of Arrival of Helicopters Using Acoustic Arrays
5b. GRANT NUMBER
5c. PROGRAM ELEMENT NUMBER
6. AUTHOR(S)
Geoffrey Goldman
5d. PROJECT NUMBER 5e. TASK NUMBER
5f. WORK UNIT NUMBER
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES)
U.S. Army Research Laboratory ATTN: RDRL-SER-M 2800 Powder Mill Road Adelphi MD 20783-1197
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES)
8. PERFORMING ORGANIZATION REPORT NUMBER
ARL-TR-4998
10. SPONSOR/MONITORS ACRONYM(S)
12. DISTRIBUTION/AVAILABILITY STATEMENT
Approved for public release; distribution unlimited.
11. SPONSOR/MONITOR'S REPORT NUMBER(S)
13. SUPPLEMENTARY NOTES
14. ABSTRACT
Computationally efficient algorithms were developed to estimate the angle of arrival (AOA) of acoustic signals generated by helicopters using acoustic arrays. The data was transformed into the frequency domain, then AOA algorithms based upon both least squares and digital beamforming were evaluated. The algorithms were tested using data collected on a helicopter at Yuma, AZ, in summer 2007. The AOA ground truth data was calculated using global positioning system (GPS) data collected on the helicopter. The algorithms performed well at estimating the azimuth angle of the helicopter, but they could only obtain good elevation angle estimates over limited time intervals. To try to improve the elevation angle estimate, a multipath model was incorporated into the beamforming algorithm.
15. SUBJECT TERMS
Acoustic, angle of arrival, helicopter
16. SECURITY CLASSIFICATION OF:
a. REPORT
Unclassified
b. ABSTRACT
Unclassified
c. THIS PAGE
Unclassified
17. LIMITATION OF ABSTRACT
UU
18. NUMBER OF PAGES
38
19a. NAME OF RESPONSIBLE PERSON
Geoffrey Goldman
19b. TELEPHONE NUMBER (Include area code)
(301) 394-0882
Standard Form 298 (Rev. 8/98) Prescribed by ANSI Std. Z39.18
Contents
List of Figures
List of Tables
Acknowledgments
1. Introduction
2. Measurements
3. Signal Processing
   3.1 Least Squares
   3.2 Digital Beamforming
   3.3 Multipath Model
4. Results
   4.1 Azimuth Angle Results
   4.2 Computationally Efficient Algorithms
   4.3 Elevation Angle Results
5. Conclusions
6. References
List of Symbols, Abbreviations, and Acronyms
Distribution List
List of Figures
Figure 1. Acoustic array configuration.
Figure 2. Topographical map of the test site at Yuma, AZ.
Figure 3. Spectrum of a helicopter at site 5 on the top microphone.
Figure 4. Acoustic signal of a helicopter near the beginning of the flight.
Figure 5. Acoustic signal of a helicopter near the end of the flight.
Figure 6. Simple multipath model.
Figure 7. Azimuth angle calculated using LSQ for data from three microphones at site 5.
Figure 8. Azimuth angle calculated using W2LSQ for data from three microphones at site 5.
Figure 9. Azimuth angle calculated using LSQ for data from four microphones at site 5.
Figure 10. Azimuth angle calculated using W2LSQ for data from four microphones at site 5.
Figure 11. Azimuth angle calculated using W2LSQ for data from four microphones at site 11.
Figure 12. Elevation angle calculated using W2LSQ for data from three microphones at site 5.
Figure 13. Elevation angle calculated using W2LSQ for data from four microphones at site 5.
Figure 14. Elevation angle calculated using W2BF for data from three microphones at site 5.
Figure 15. Elevation angle calculated using W2BF for data from four microphones at site 5.
Figure 16. Elevation angle estimation using four microphone weighted beamforming for site 11.
Figure 17. Estimated phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 21 and 27 Hz.
Figure 18. Elevation angle using the estimated phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 21 and 27 Hz.
Figure 19. Elevation angle using the average phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 9 and 27 Hz.
Figure 20. Normalized power from each microphone at site 5.
Figure 21. Normalized power from each microphone and telemetry data collected on the helicopter.
List of Tables
Table 1. Location of acoustic arrays.
Table 2. AOA algorithm parameters.
Table 3. Median of azimuth angle error using three microphones.
Table 4. Median of azimuth angle error using four microphones.
Table 5. Standard deviation of azimuth angle error using three microphones.
Table 6. Standard deviation of azimuth angle error using four microphones.
Table 7. Median of absolute value of the azimuth angle error using three microphones.
Table 8. Median of absolute value of azimuth angle error using four microphones.
Table 9. Root-mean-square (RMS) of the median of absolute value of azimuth angle error for sites 2, 4, 5, and 6.
Table 10. Median of absolute value of the azimuth angle error using three microphones.
Table 11. Median of absolute value of azimuth angle error using four microphones.
Acknowledgments
I would like to thank Chris Reiff, Latasha Solomon, Stephen Tenney, and Mike Scanlon for collecting the acoustic data and providing software to read it. I would also like to thank Duong Tran-Luu and Ragu Damarla for their helpful discussion on signal processing.
1. Introduction
The U.S. Army is interested in detecting and tracking helicopters with acoustic arrays for military applications. Helicopters' acoustic signatures are ideal for long-range detection and tracking. They have relatively large amplitudes, and the power spectrum of their acoustic signatures is dominated by lower frequencies, which propagate in the atmosphere with minimal attenuation. Acoustic sensors have several desirable properties compared with other sensors. They are passive, low power, inexpensive, and omnidirectional, and can be sampled using low-speed data acquisition systems. In addition to tracking helicopters, the Army is interested in detecting and localizing acoustical signals from events such as gun blasts and shock waves. Interference from helicopters can degrade this capability. If the helicopter can be characterized and tracked, then its acoustic signature can be separated from other events using spatial and frequency-based filtering techniques.
The acoustic signature of a helicopter is a function of the observation angle, helicopter speed and maneuvers, and the environment. The main rotor and tail rotor are the primary sources of acoustic energy generated by a helicopter. In general, the spectrum energy of the main rotor and tail rotor are at different frequencies to prevent them from reinforcing and resonating (1). Other noise sources include the engines, the drive shafts, and gear meshing.
The acoustic energy or noise from the main rotor can be divided into several categories (2). Thickness noise is caused by the rotor blades displacing a volume of air. It is determined by the shape and motion of the blades and is focused primarily in the plane of the rotor and forward of the helicopter. The relative airspeed of a rotor blade is cyclical; it is higher on the advancing side and lower on the retreating side. Loading noise is caused by the fluctuating lift forces on each blade and is focused primarily below the rotor blades.
There can be substantial interaction between the main and tail rotors, the airframe body, and their wakes (3). As the helicopter advances, a vortex wake is generated behind each blade. The next blade may pass very close to the wake and produce blade-vortex interaction (BVI) and blade slap noise. This can result in a local increase in lift and increased acoustic energy. The advancing side of the BVI noise is directed downward and forward, while the retreating side is directed downward and rearward. The BVI can produce local supersonic interaction, which results in high-speed impulse (HSI) noise. HSI noise is also generated when the blade tips approach the speed of sound. This can occur during high-speed cruising or during maneuvers such as take-off. HSI noise is typically directed in the rotor plane forward of the helicopter. The helicopter also produces lower amplitude broadband noise through the turbulence created by the rotor, the wake of the rotor, and blade self-noise.
The signature of a moving helicopter is affected by its Doppler shift. The frequency of the acoustic signal is changed by the radial velocity of the helicopter (relative to an observer) divided by the wavelength of the signal. For example, the Doppler shift for a 20-Hz acoustic signal produced by a helicopter traveling at a radial velocity of 100 m/s is approximately 6 Hz. The acoustic signal of a helicopter is also affected by the environment. The propagation of the signal is dispersive, and the signal refracts as temperature and humidity profiles change in the atmosphere. Wind, turbulence, multipath, and diffraction all affect the measured signal. Also, there can be significant seismic/acoustic coupling (4). An acoustic signal can generate a seismic wave that propagates along the Earth's surface, then reradiates into the atmosphere. Multipath and seismic/acoustic coupling are of great concern, because they cannot easily be corrected for in post-processing. Multipath effects have been extensively studied in the field of communications. The measured signal is typically modeled using delayed and attenuated replicas of the ideal transmitted signal. Then, the signal is coherently reconstructed using techniques such as a Rake filter. This approach cannot easily be used for helicopter signatures because the ideal signal is unknown, and for a microphone elevated by a fraction of a wavelength, the multipath time delays are also a fraction of the Nyquist sampling interval. Techniques will be developed to recover the acoustic signature using a single-bounce multipath model; however, the effect of seismic/acoustic coupling is ignored.
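The Doppler arithmetic in the example above can be checked with a minimal Python sketch; the 340 m/s speed of sound is the nominal value used elsewhere in this report, and the variable names are purely illustrative.

# Doppler shift of a 20-Hz tone from a helicopter closing at 100 m/s,
# using f_shift = v_r / wavelength = v_r * f / c.
c = 340.0    # assumed propagation speed of sound (m/s)
f = 20.0     # emitted tone frequency (Hz)
v_r = 100.0  # radial velocity of the helicopter (m/s)

wavelength = c / f
f_shift = v_r / wavelength
print(f"Doppler shift: {f_shift:.1f} Hz")  # ~5.9 Hz, i.e., approximately 6 Hz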
2. Measurements
Acoustic measurements of a helicopter were made at Yuma, AZ, in summer 2007. The position of the helicopter was recorded and time stamped using a global positioning system (GPS)/inertial navigation system (INS) receiver. The microphone configuration for an individual acoustic array is shown in figure 1. Each microphone is 1 m from the origin. The three microphones in the xy-plane are spaced apart by 60°, and one microphone is located on the z-axis at a height of 1 m. For ground operation, the array was mounted at a height of 1 m. The acoustic data was sampled at a rate of 1.016 kHz with 24-bit analog-to-digital converters (ADCs).
Figure 1. Acoustic array configuration.
Four acoustic arrays were placed on the ground and one array was mounted on an aerostat at a height of approximately 720 m. The array on the aerostat was mounted upside down. Its position and orientation on the aerostat were measured as a function of time. Table 1 shows the locations of the arrays. These locations are plotted on a topographical map using the software Topo USA 4.0 (figure 2). The location of the array on the aerostat is approximate, since it changed as a function of time due to the wind and solar heating.
Table 1. Location of acoustic arrays.
Site Number     Northing (m)   Easting (m)   Relative Height (m)
2 (ground)      3638017        0768457       6
3 (ground)      3638825        0769214       6
5 (ground)      3637364        0769212       9
6 (ground)      3639290        0768367       9
11 (aerostat)   3637570        0769240       717
Figure 2. Topographical map of the test site at Yuma, AZ.
The temporal and spectral characteristics of the signals were examined as the helicopter flew close to the sensors. Figure 3 shows the discrete Fourier transform (DFT) of the data measured at site 5, calculated using 2.5 s of data in each processing interval and plotted in decibels. The spectral lines are prominent at the expected frequencies. The occasional loss of power is probably the result of destructive interference from multipath. Figures 4 and 5 show the acoustic data for the elevated microphone as a function of time. The impulse-like signal seen in these figures is caused by the main rotor blades and is sharper in figure 5 than in figure 4. This may be due to more dispersion in the atmosphere at the earlier measurement time. The smaller and higher frequency signal seen in these figures is caused by the tail rotor. These results indicate that the temporal and spectral properties of the signal vary significantly over time. Although the signature of a helicopter is very complex, over small observation angles it can be considered to be a periodic, time-stationary process. This suggests that the data can be characterized in the frequency domain using a small number of data points with minimal loss of information.
Figure 3. Spectrum of a helicopter at site 5 on the top microphone.
Figure 4. Acoustic signal of a helicopter near the beginning of the flight.
Figure 5. Acoustic signal of a helicopter near the end of the flight.
3. Signal Processing
Computationally efficient algorithms were developed to estimate the angle of arrival (AOA) from each acoustic array. The AOA calculation was performed in the frequency domain using both least squares and digital beamforming approaches. The data was converted to the frequency domain using a fast Fourier transform (FFT), which has a computational complexity of n log(n), where n is the number of points in the FFT. For the frequencies of interest, the data could be decimated by a factor of 5 using boxcar averaging to reduce the number of points in the FFT. In the data analysis, a more conservative approach was used and the data was decimated by a factor of 2. Least squares algorithms are more computationally efficient than beamforming algorithms. Least squares algorithms require a matrix inverse, which can be performed using singular value decomposition (SVD), which has a computational complexity of 4m²n + 8mn² + 9n³, where m and n are the number of rows and columns of the matrix, respectively. The computational complexity of the digital beamforming algorithm is m·n·o, where o is the number of possible angles in the search area. For large search areas, o can be large and dominate the processing requirements. To improve the estimate of the elevation angle, a multipath model was incorporated into the digital beamforming algorithm. This algorithm required slightly more processing power.
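As a rough illustration of the preprocessing described above, the following Python/NumPy sketch decimates the data by boxcar averaging, applies a windowed FFT, and selects the K largest distinct spectral peaks. It is a sketch only: the function name, the peak-selection details, and the default parameter values (drawn from table 2 and section 4) are illustrative assumptions, not the original processing code.

import numpy as np

def preprocess(x, fs, decim=2, nfft=1280, n_peaks=6, min_sep_hz=5.0, fmin=9.5, fmax=100.0):
    """Boxcar-average decimation, windowed FFT, and selection of the largest distinct peaks."""
    # Boxcar averaging: average non-overlapping groups of `decim` samples.
    n = (len(x) // decim) * decim
    xd = x[:n].reshape(-1, decim).mean(axis=1)
    fsd = fs / decim

    # Windowed FFT (a Kaiser window with beta = 2 is used in the report's analysis).
    w = np.kaiser(len(xd), 2.0)
    X = np.fft.rfft(xd * w, nfft)
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fsd)

    # Restrict to the band of interest and pick the largest peaks that are
    # separated by at least `min_sep_hz` from each other.
    band = (freqs >= fmin) & (freqs <= fmax)
    idx = np.flatnonzero(band)[np.argsort(np.abs(X[band]))[::-1]]
    peaks = []
    for k in idx:
        if all(abs(freqs[k] - freqs[p]) >= min_sep_hz for p in peaks):
            peaks.append(k)
        if len(peaks) == n_peaks:
            break
    return freqs[peaks], X[peaks]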
The algorithms are based upon classical array signal processing theory. For an acoustic array in the far-field, the time delay between the ith microphone and a reference microphone in an array is given by

δτ_i,0 = a^T (P_i − P_0) / c    (1)

where the vector a is the direction of arrival of the signal, T denotes transpose, P_i and P_0 are the locations of the ith microphone and the reference microphone (indexed by 0), and c is the propagation speed (5).
The relative time delay of the signal received on the ith microphone can be estimated from its DFT. The Fourier transform of the signal y(t) that is delayed by τ_i is given by

F(y(t − τ_i)) = Y(ω) e^{jωτ_i}    (2)

where F denotes the Fourier transform, t denotes time, Y(ω) is the Fourier transform of y(t), and ω denotes frequency. The results from equation 2 indicate that a time delay in the signal results in an additional phase shift in the Fourier transform of the signal that is proportional to ω.
3.1 Least Squares
The AOA can be estimated by minimizing the weighted squared error of the differential time delays across the microphone array. The relative time delays between the microphones in an array for a given frequency can be estimated by multiplying the results from equation 2 for the ith microphone by the complex conjugate of the results from the jth microphone, then taking the inverse tangent of the result. The resulting equation is given by
δτ_i,j(ω_k) = ω_k (τ_i − τ_j)    (3)

where ω_k is the frequency associated with the kth largest bin in the DFT of the signal.
The algorithms developed in this report are based upon using the data associated with the K largest distinct bins of the average of the DFT results over all the microphones. Also, the speed of sound is assumed to be known. The weighted least squares solution based upon equation 1 is given by
( ) aˆ=
H TWH
1
H
TW
∆τ
(4)
where
H
= 1c ( P1P2,...,P3
P4),...,( P1P2,...,P3
 P
4)
T
,
(5)
( ) ( ) ∆τ =[ δτ 1,2(w1),...,δτ 3,4(w1) ,..., δτ 1,2(wK ),...,δτ 3,4(wK ) ]T ,
(6)
and where the elements in H and ∆τ correspond to the i and j microphone index of 12, 13, 14, 23, 24, and 34, which are repeated K times for each frequency; W is a weighting matrix; and 1 denotes matrix inverse (6). The matrix inverse in equation 4 can be solved using a pseudo inverse algorithm such as the one used by Matlab.
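A minimal Python/NumPy sketch of the weighted least squares solution in equation 4 is shown below, assuming the differential time delays are formed from the DFT phase differences of equations 2 and 3. The sign conventions, the unit normalization of the result, and the choice of weights (e.g., proportional to ω_k² for the W2LSQ variant) are assumptions left to the caller; the names are illustrative.

import numpy as np
from itertools import combinations

def wls_aoa(P, Y, omegas, c=340.0, weights=None):
    """Weighted least squares AOA estimate (sketch of equation 4).

    P       : (M, 3) microphone positions in meters
    Y       : (K, M) complex DFT values at the K selected frequency bins
    omegas  : (K,) angular frequencies (rad/s)
    weights : optional (K * npairs,) diagonal weights; identity if None
    """
    M = P.shape[0]
    pairs = list(combinations(range(M), 2))       # microphone pairs 12, 13, 14, 23, 24, 34
    H_rows, dtau = [], []
    for k, wk in enumerate(omegas):
        for (i, j) in pairs:
            # Differential phase -> differential time delay (equations 2 and 3).
            dphi = np.angle(Y[k, i] * np.conj(Y[k, j]))
            dtau.append(dphi / wk)
            H_rows.append((P[i] - P[j]) / c)      # rows of H (equation 5)
    H = np.asarray(H_rows)
    dtau = np.asarray(dtau)                       # delta-tau vector (equation 6)
    W = np.eye(len(dtau)) if weights is None else np.diag(weights)
    # a_hat = (H^T W H)^{-1} H^T W dtau, solved with a pseudo inverse for robustness.
    a_hat = np.linalg.pinv(H.T @ W @ H) @ (H.T @ W @ dtau)
    return a_hat / np.linalg.norm(a_hat)          # normalized to a unit direction vector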
For Gaussian measurement errors, the optimal weighting matrix is the inverse of the covariance matrix associated with the estimated differential time delays, and it achieves the Cramer-Rao lower bound (CRLB) (7). For high signal-to-noise ratio (SNR) and independent and identically distributed (iid) zero mean Gaussian noise, the variance of the estimated phase in the DFT can be approximated by
σ_φ² = 0.5 / SNR    (7)

where the estimated phase is also a Gaussian random variable (8). The covariance of the differential time estimates, C_τ, can be computed by dividing the variance of the estimated phase computed in equation 7 by 2ω². For the measured data, the noise was assumed to be small relative to the signal and a constant value independent of frequency. This results in the SNR only being a function of the received signal strength. These assumptions were used to calculate C_τ. For the least squares solution, C_τ was set to the identity matrix. Several other non-optimal weighting schemes were also investigated and will be discussed in the next section.
The azimuth and elevation angle of the target of interest are related to a by

a = [sin(θ)cos(φ), sin(θ)sin(φ), cos(θ)]^T    (8)

where φ corresponds to the azimuth angle and (π/2 − θ) corresponds to the elevation angle of the target. The azimuth angle can be computed from â using

φ̂ = tan^{−1}(â(2)/â(1))    (9)

and the elevation angle can be computed using

θ̂ = cos^{−1}(â(3))    (10)

or

θ̂ = α sin^{−1}(â(1)/cos(φ̂)) + (1 − α) sin^{−1}(â(2)/sin(φ̂))    (11)

where

α = cos²(φ̂).    (12)

Surprisingly, equation 11 produced better agreement with the measured data than equation 10, so it was used in the analysis. The covariance matrix associated with â is given by (8)

C_a = (H^T C_τ^{−1} H)^{−1}.    (13)
This matrix, in conjunction with the nonlinear mapping of a to angle, can be used to estimate the azimuth and elevation error.
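The mapping from the estimated direction vector â to azimuth and elevation angles (equations 9 through 12) can be sketched as follows; the use of â(2)/sin(φ̂) in the second term of equation 11 is the reading reconstructed here from equation 8, and the small-denominator guards are an added convenience.

import numpy as np

def angles_from_a(a_hat):
    """Azimuth and elevation (deg) from the estimated direction vector (equations 9-12 sketch)."""
    phi = np.arctan2(a_hat[1], a_hat[0])                 # equation 9 (atan2 resolves the quadrant)
    # theta_std = np.arccos(np.clip(a_hat[2], -1, 1))    # equation 10 (standard form, unused here)
    alpha = np.cos(phi) ** 2                             # equation 12
    # Blended estimate of equation 11, using only the in-plane components of a_hat.
    t1 = np.arcsin(np.clip(a_hat[0] / np.cos(phi), -1, 1)) if abs(np.cos(phi)) > 1e-6 else 0.0
    t2 = np.arcsin(np.clip(a_hat[1] / np.sin(phi), -1, 1)) if abs(np.sin(phi)) > 1e-6 else 0.0
    theta = alpha * t1 + (1 - alpha) * t2
    azimuth = np.degrees(phi)
    elevation = 90.0 - np.degrees(theta)                 # elevation = pi/2 - theta
    return azimuth, elevation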
3.2 Digital Beamforming
The AOA was also calculated using digital beamforming with several different weighting schemes. The algorithm processed the data associated with the K largest bins of the DFT results averaged over the microphones in each array. These results were summed over M microphones and K frequencies. The algorithm includes a weighting function given by B_i(ω). The AOA estimate was calculated by selecting the spherical angle given by (φ,θ) in a rectangular grid that resulted in a maximum value for

α(φ,θ) = Σ_{k=1}^{K} B_i(ω_k) | Σ_{m=1}^{M} y_m(ω_k) e^{jω_k a(φ,θ)^T P_m / c} |    (14)

where

B_i(ω_k) = ω_k^i / Σ_{k=1}^{K} ω_k^i.    (15)
This algorithm corresponds to the solution for the maximum likelihood (ML) estimator for iid Gaussian noise. Similar to the weighted least squares solution, the estimator based upon i=2 achieves the CRLB given the previous assumptions on the data. This algorithm does not include the effect of multipath on the measured signal.
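A brute-force Python/NumPy sketch of the weighted beamforming search of equations 14 and 15 is given below. The magnitude of the coherent sum across microphones, the sign of the steering-phase exponent, and the 1° search grid are assumptions made for illustration.

import numpy as np

def beamform_aoa(P, Y, omegas, c=340.0, i_weight=2, az_step=1.0, el_step=1.0):
    """Frequency-domain beamforming AOA search (sketch of equations 14 and 15).

    P: (M, 3) microphone positions, Y: (K, M) DFT values at the K selected peaks,
    omegas: (K,) angular frequencies. Returns (azimuth, elevation) in degrees.
    """
    B = np.asarray(omegas, dtype=float) ** i_weight
    B = B / B.sum()                                    # equation 15 weighting
    best_score, best_az, best_el = -np.inf, None, None
    for az in np.arange(0.0, 360.0, az_step):
        for el in np.arange(0.0, 90.0 + el_step, el_step):
            phi, theta = np.radians(az), np.radians(90.0 - el)
            a = np.array([np.sin(theta) * np.cos(phi),
                          np.sin(theta) * np.sin(phi),
                          np.cos(theta)])              # direction vector (equation 8)
            score = 0.0
            for k, wk in enumerate(omegas):
                steer = np.exp(1j * wk * (P @ a) / c)  # per-microphone steering phases
                score += B[k] * np.abs(np.dot(Y[k], steer))
            if score > best_score:
                best_score, best_az, best_el = score, az, el
    return best_az, best_el

For a full-sky search, the nested loops dominate the cost (the m·n·o complexity noted in section 3), which is why the least squares variants are cheaper.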
3.3 Multipath Model
Figure 6 illustrates a simple model for multipath, which is based upon the signal having a single bounce on a flat Earth with propagation that is described by ray tracing for signals in the far-field. The microphone is at a height H above the ground, and a complex reflection coefficient that is potentially frequency dependent is given by ρ(ω), which can be approximated using empirical data. The signals propagating along the direct and indirect paths sum to generate the signal measured at the microphone.
Figure 6. Simple multipath model.
This model can be incorporated into a digital beamforming algorithm as shown in equation 16 using
γ(φ,θ) = Σ_{k=1}^{K} B_i(ω_k) | Σ_{m=1}^{M} y_m(ω_k) e^{jω_k a(φ,θ)^T P_m / c} [1 + ρ(ω) e^{jω_k H csc(θ) / c}] |.    (16)

The AOA was estimated by selecting the spherical coordinates (φ,θ) that resulted in a maximum value in equation 16.
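A sketch of how the single-bounce factor of equation 16 modifies the steering term of the beamformer is shown below; the per-microphone placement of the factor, the csc(θ) path-difference term, and the appearance of ω_k inside the exponent follow the reconstruction of equation 16 above and should be treated as assumptions.

import numpy as np

def multipath_steer(a, P, wk, theta, h_mic, rho, c=340.0):
    """Steering term with the single-bounce multipath factor of equation 16 (sketch).

    a      : (3,) candidate direction vector for (phi, theta)
    P      : (M, 3) microphone positions
    wk     : angular frequency (rad/s)
    theta  : polar angle of the candidate direction (rad), assumed away from 0
    h_mic  : microphone height above the reflecting ground plane (m)
    rho    : complex reflection coefficient at this frequency
    """
    direct = np.exp(1j * wk * (P @ a) / c)                              # direct-path phases
    bounce = 1.0 + rho * np.exp(1j * wk * h_mic / (np.sin(theta) * c))  # 1 + rho * e^{j w H csc(theta)/c}
    return direct * bounce

This factor would replace the steering term in the beamforming search sketched earlier, and the candidate (φ, θ) that maximizes γ is selected as before.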
4. Results
The acoustic data was processed to obtain AOA estimates at the five acoustic arrays and compared to ground truth data. The acoustic data was only processed at locations where the SNR was high. The AOA algorithms were run with the parameters shown in table 2. The data was restricted to frequencies between 9.5 and 100 Hz. There was energy at higher frequencies, but including this data in the signal processing did not have much of an impact on the results. The algorithms used the six highest peaks in the power spectrum that had a minimum frequency separation of 5 Hz from each other. The processing interval for the data was approximately 2.5 s, and the data was decimated by a factor of 2 before the DFT. A 1280-point FFT was performed on the data, which was smoothed using a Kaiser window with β=2. The AOA algorithms used either the bottom three microphones or all four microphones shown in figure 1.
Table 2. AOA algorithm parameters.
Parameter                          Value
Minimum frequency                  9.5 Hz
Maximum frequency                  100 Hz
Number of peaks                    6
Minimum separation in frequency    5 Hz
Speed of sound (c)                 340 m/s
Integration time                   2.5 s
Decimation factor                  2
Number of microphones              3 or 4
The difference between the estimated angle of the helicopter and the angle based upon ground truth was computed and analyzed. The acoustic delay due to the finite speed of sound was incorporated into the GPS results. No other atmospheric effects were included in the calculations. First, the results for the azimuth angle estimation will be presented graphically and then quantitatively.
4.1 Azimuth Angle Results
Figures 7–11 show the results for several different algorithms using data collected at sites 5 and 11 and using either the bottom three or all four microphones in the array. The ranges from the acoustic array to the helicopter were smallest at site 5. The angle estimates obtained at site 11 were corrected using data from both a digital compass that was attached to the acoustic array and an INS that was attached to the aerostat. In figures 7–11, the blue circles denote the estimated azimuth angle based upon acoustic data, the red x's denote the estimated azimuth angle based upon GPS data, and the black dots denote the normalized range from the acoustic array to the helicopter based upon GPS data. Ideally, the circles and the x's should be co-located. In the following figures and tables, results based upon the least squares solution are denoted LSQ, the beamforming results are denoted BF, and the weighted least squares and weighted beamforming results are denoted WiLSQ and WiBF, respectively, where the weight factor is based upon the value of i selected in equation 15.
Figure 7. Azimuth angle calculated using LSQ for data from three microphones at site 5.
Figure 8. Azimuth angle calculated using W2LSQ for data from three microphones at site 5.
Figure 9. Azimuth angle calculated using LSQ for data from four microphones at site 5.
Figure 10. Azimuth angle calculated using W2LSQ for data from four microphones at site 5.
Figure 11. Azimuth angle calculated using W2LSQ for data from four microphones at site 11.
Several statistics were calculated to quantitatively describe the difference between the estimated azimuth angle of the helicopter and the azimuth angle based upon ground truth data. The median azimuth angle error, the standard deviation of the azimuth angle error, and the median of the absolute value of the unbiased azimuth angle error were computed and tabulated for eight different algorithms and two array configurations. The third test statistic was calculated using

Median(|Δφ_t − Median(Δφ_t)|)    (17)

where Δφ_t is the difference between the acoustic estimated azimuth angle and the GPS estimated azimuth angle for the tth processing interval. This statistic will not be significantly affected by outliers or unknown biases. The results for these test statistics are shown in tables 3–8 and a summary of the results is shown in table 9.
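The robust error statistic of equation 17 is straightforward to compute; in the sketch below, the wrapping of the raw angle differences into [−180°, 180°) is an added convenience not described in the report.

import numpy as np

def robust_azimuth_error(estimated_deg, truth_deg):
    """Median of the absolute value of the unbiased azimuth error (equation 17)."""
    d = np.asarray(estimated_deg) - np.asarray(truth_deg)
    d = (d + 180.0) % 360.0 - 180.0          # wrap differences into [-180, 180)
    return np.median(np.abs(d - np.median(d)))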
Table 3. Median of azimuth angle error using three microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
2             0.19   0.09    0.01    0.04    0.04   0.04   0.14   0.18
4             3.23   1.44    1.15    1.07    1.41   1.30   1.06   1.50
5             1.98   1.04    0.96    -0.89   1.12   1.03   0.81   0.68
6             13.4   1.69    0.57    1.90    1.80   0.67   2.52   3.32
11            3.31   2.90    2.47    2.48    3.17   2.22   2.32   3.21
Table 4. Median of azimuth angle error using four microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF      W1BF   W2BF   W3BF
2             0.19   0.05    0.21    0.21    -0.06   0.06   0.01   0.14
4             3.35   1.44    1.19    1.07    1.50    1.38   1.12   1.03
5             2.56   1.04    1.07    0.99    1.05    0.77   0.68   0.64
6             13.6   1.70    0.46    1.82    1.91    0.64   2.40   3.28
11            8.65   1.52    1.59    1.78    2.11    2.80   2.75   3.17
Table 5. Standard deviation of azimuth angle error using three microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
2             16.9   2.26    2.06    2.08    2.55   2.12   2.17   2.29
4             14.3   1.56    1.46    1.53    1.99   1.60   1.62   1.72
5             42.1   4.68    4.37    4.29    5.54   4.60   4.40   4.35
6             33.6   7.02    4.53    3.64    10.6   6.75   4.15   4.03
11            11.3   6.41    12.     13.7    14.6   13.9   14.4   14.4
Table 6. Standard deviation of azimuth angle error using four microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
2             15.4   2.58    3.06    3.98    2.70   2.14   2.15   2.20
4             7.79   1.52    1.45    1.51    1.86   1.58   1.64   1.74
5             35.1   6.70    12.8    20.8    5.29   4.48   4.26   4.25
6             30.1   6.97    4.64    3.81    10.5   6.42   5.74   5.58
11            12.7   11.2    12.3    13.9    8.57   9.41   10.0   10.4
Table 7. Median of absolute value of the azimuth angle error using three microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
2             2.88   1.17    1.22    1.09    1.35   1.25   1.21   1.22
4             1.56   0.95    0.93    0.93    1.05   0.99   0.93   0.99
5             7.14   2.56    2.34    2.33    2.52   2.37   2.30   2.36
6             6.13   2.97    2.27    1.99    2.73   2.05   1.71   1.65
11            6.77   3.96    3.85    4.03    8.30   7.59   7.41   7.87
Table 8. Median of absolute value of azimuth angle error using four microphones.
(all entries in degrees)

Site Number   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
2             3.12   1.27    1.18    1.22    1.39   1.23   1.21   1.23
4             1.59   0.90    0.94    0.94    1.02   0.94   0.94   1.00
5             7.24   2.63    2.39    2.39    2.65   2.52   2.35   2.41
6             6.17   2.98    2.33    1.95    2.86   1.93   1.78   1.65
11            7.10   5.56    5.77    5.77    4.96   5.37   6.00   6.22
Table 9. Root-mean-square (RMS) of the median of absolute value of azimuth angle error for sites 2, 4, 5, and 6.
(all entries in degrees)

Number of Mics   LSQ    W1LSQ   W2LSQ   W3LSQ   BF     W1BF   W2BF   W3BF
3                4.98   2.10    1.80    1.69    2.04   1.76   1.62   1.64
4                5.07   2.13    1.83    1.72    2.13   1.77   1.66   1.66
The results in tables 3 and 4 indicate that there are small biases in the estimate of the azimuth angle for each array, probably due to factors such as alignment, wind, or microphone calibration errors. Site 6 had an additional preprocessing angular correction of 12° that is not seen in the results. The arrays on the ground had biases of approximately 1°, but the array on the aerostat had biases of 2–3°. The ground truth data on the aerostat was not as reliable as the ground truth on the ground, so there is an additional source of error for the results at site 11.
The results for the standard deviation of the azimuth error are shown in tables 5 and 6. For Gaussian measurement errors, the relationship between the standard deviation and the median of the absolute value (tables 7 and 8) of the angular errors should be in a ratio of approximately 1 to 0.67. The standard deviation results have significantly larger errors, indicating that the Gaussian model is not correct. The standard deviation results are probably driven by outliers and are not a robust statistic. The results in tables 7 and 8 should provide a more descriptive statistic.
The results in tables 7 and 8 are summarized in table 9 to allow for direct comparison of the algorithms. The RMS error for each algorithm was computed for the median of the absolute value of the azimuth angle error for sites 2, 4, 5, and 6. Site 11 was not included, because it had significantly larger errors. This test statistic is commonly used for estimating the error in data that is added together with independent but not identically distributed one-dimensional (1-D) Gaussian distributions.
The results in table 9 indicate that three microphones performed slightly better than four microphones. This result is not intuitive. One possible explanation for this result is multipath effects. The three microphones at the same height should experience approximately the same effects due to multipath, while the one microphone at an elevated height should experience different effects due to multipath. Multipath primarily affects the elevation angle estimate, so while the signal received at the bottom three microphones may generate an incorrect elevation angle estimate, it should be consistently incorrect and have a minimal impact on the azimuth angle estimate. The elevated microphone may also experience more wind effects; however, the noise should be negligible for large signals. The acoustic/seismic coupling will also be different for the elevated microphone as compared to the other microphones.
The results in table 9 indicate that the W2BF algorithm using three microphones performed the best. For Gaussian measurement errors, the performance of the W2LSQ and W2BF algorithms should both be optimal and identical. The weighting function with i=2 in equation 15 had the best performance, as expected, but i=3 was almost as good. This may indicate that another noise source independent of the helicopter may be present at lower frequencies. The performance of the weighted beamforming algorithms was slightly better than the performance of the weighted least squares algorithms. The better performance of the weighted beamforming algorithms is not surprising, since the least squares algorithms required a matrix inversion, which may increase the impact of array calibration errors. However, the improvement was only approximately 0.2°, much smaller than the errors associated with aligning the arrays (a couple of degrees). The LSQ algorithm performed the worst. It is the least computationally intensive algorithm and had no weighting function to account for varying SNR or frequency. The beamforming algorithm with only an amplitude weighting had the second worst performance.
4.2 Computationally Efficient Algorithms
Techniques were developed to increase the computational efficiency of the AOA algorithms. The computational complexity of the algorithms based upon least squares can be reduced by combining the differential time estimates on each microphone combination calculated using equation 1. One technique is to use a weighted average of the differential time delays at each frequency. A second technique is to use the median value of the differential time delays at each frequency. Since the previous algorithms were evaluated using six peaks, the middle peaks (index 3 or 4) with the larger weighting coefficient calculated using equation 15 with i=2 were selected. These techniques reduced the matrix inverse required in equation 5 from 18 x 18 to 6 x 6.
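One plausible Python sketch of the reduction described above is to collapse the per-pair differential time delays across the K frequencies, by either an ω^i-weighted mean or a median, before the least squares solve; the exact selection rule used in the report (e.g., favoring the middle peaks) is paraphrased here, so treat the details as assumptions.

import numpy as np

def combine_pair_delays(dtau_by_freq, omegas, method="median", i_weight=2):
    """Collapse differential time delays across frequencies before the LS solve (sketch).

    dtau_by_freq : (K, npairs) differential time delays, one row per selected frequency
    omegas       : (K,) angular frequencies
    Returns an (npairs,) vector, so the least squares system shrinks from
    K * npairs rows to npairs rows.
    """
    dtau_by_freq = np.asarray(dtau_by_freq, dtype=float)
    if method == "median":
        return np.median(dtau_by_freq, axis=0)
    w = np.asarray(omegas, dtype=float) ** i_weight     # same omega^i weighting as equation 15
    w = w / w.sum()
    return w @ dtau_by_freq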
The digital beamforming algorithm can be reduced in computational complexity if we assume the peaks associated with regularly spaced harmonics will always be selected and that they are coherent in phase. However, there is no guarantee of coherence, so this approach was not pursued. The results for taking the mean and the median of the differential phase for both three and four microphones are shown in tables 10 and 11. Again, the results are summarized in the last row by taking the RMS of the results for sites 2, 4, 5, and 6 for each algorithm and array configuration.
Table 10. Median of absolute value of the azimuth angle error using three microphones.
(all entries in degrees)

Site Number   Previous Alg. W2LSQ   Mean Alg. W2LSQ   Median Alg. W2LSQ
2             1.22                  1.36              1.35
4             0.93                  0.97              1.08
5             2.34                  2.70              2.59
6             2.27                  4.69              2.51
11            3.85                  4.72              5.69
RMS (2–6)     1.80                  2.83              2.00
Table 11. Median of absolute value of azimuth angle error using four microphones.
(all entries in degrees)

Site Number   Previous Alg. W2LSQ   Mean Alg. W2LSQ   Median Alg. W2LSQ
2             1.18                  1.42              1.16
4             0.94                  0.94              1.01
5             2.39                  2.74              2.60
6             2.33                  4.48              2.19
11            5.77                  5.10              14.3
RMS (2–6)     1.83                  2.76              1.87
The results in tables 10 and 11 indicate that, as expected, the errors in the mean W2LSQ algorithm were larger than the errors in the W2LSQ algorithm (not averaged). However, the errors in the median W2LSQ algorithm were only slightly larger than the errors in the W2LSQ algorithm. Also, the median W2LSQ algorithm results for four microphones were better than the results for three microphones. This suggests that there may have been some outliers in the data associated with the four microphones that were eliminated using the median operation. This further suggests that future algorithm development may benefit from using estimators based upon non-parametric statistics.
4.3 Elevation Angle Results
Elevation angle results were calculated with the digital beamforming and least squares algorithms using either three or four microphones and compared graphically with ground truth. The results vary as a function of time, so no quantitative analysis was performed. The best agreement was achieved for data measured at sites 5 and 11, which are presented in this section. For the least squares solution, the elevation angle was calculated using equation 11, which produced better results than equation 10. Results are presented in figures 12–16 for a weighting based upon i=2 in equation 15, the weighting that produced the best agreement for the azimuth angle data.
Figure 12. Elevation angle calculated using W2LSQ for data from three microphones at site 5.
Figure 13. Elevation angle calculated using W2LSQ for data from four microphones at site 5.
Figure 14. Elevation angle calculated using W2BF for data from three microphones at site 5.
Figure 15. Elevation angle calculated using W2BF for data from four microphones at site 5.
Figure 16. Elevation angle estimation using four microphone weighted beamforming for site 11.
Figures 12–14 and 16 indicate that there is no agreement in the elevation angle results for times between approximately 0–250 s and there is relatively good agreement for times between 250–420 s. Figure 15 indicates that there is no agreement for the elevation angle calculated using the weighted beamforming algorithm with four microphones. The weighted least squares algorithm using four microphones produced better results, but it used the nonstandard technique described in equation 11 to estimate the elevation angle. This technique minimized the impact of the elevated microphone on the elevation angle estimate. Results were also shown for the array on the aerostat. Surprisingly, the array on the ground had performance similar to the array on the aerostat even though the multipath environment and atmospheric propagation were very different. An explanation for these results is currently being investigated. One possible factor not included in the AOA estimation algorithms is multipath effects.
A multipath model was included in the beamforming algorithm and elevation angles were computed using equation 16. Initially, a reflection coefficient of 1 was used in the model, but this did not produce good results. Better results should be obtained if the correct reflection coefficient is used. However, determining the reflection coefficient is a complicated task. Since the correct elevation angle is known, the reflection coefficient that results in the algorithm computing the correct elevation angle can be determined by setting the amplitude to a constant and letting the phase vary from −180° to 180°. The phase that produced the best match was selected. The results are shown in figure 17 for a single spectral peak for frequencies between 21 and 27 Hz for a reflection coefficient with an amplitude of 1. The results based upon harmonics at higher frequencies were almost random and were not included. The results for frequencies below 27 Hz are reasonably consistent, but not close to the anticipated result of a reflection coefficient phase of 0°. The estimated phases of the reflection coefficients have a dependency on range. This may be caused by a violation of the assumption of the flat Earth model.
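The phase search described above can be sketched as a simple grid search; here score_fn is a hypothetical wrapper that runs the multipath beamformer of equation 16 with a given complex reflection coefficient and returns the resulting elevation estimate, and the 5° grid spacing is an assumed value.

import numpy as np

def fit_reflection_phase(score_fn, true_elevation_deg, phase_step_deg=5.0):
    """Grid search for the reflection-coefficient phase (amplitude fixed at 1) that best
    reproduces the known elevation angle (sketch)."""
    best_phase, best_err = None, np.inf
    for ph in np.arange(-180.0, 180.0 + phase_step_deg, phase_step_deg):
        rho = np.exp(1j * np.radians(ph))              # unit-amplitude reflection coefficient
        err = abs(score_fn(rho) - true_elevation_deg)  # elevation error for this phase
        if err < best_err:
            best_phase, best_err = ph, err
    return best_phase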
Figure 17. Estimated phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 21 and 27 Hz.
The reflection coefficients calculated in figure 17 were substituted in equation 16 for each coherent processing interval (CPI), and the elevation angle was computed using a single spectral peak for frequencies between 21 and 27 Hz. The results are shown in figure 18. They are an improvement over the algorithms that did not include multipath effects; however, the results were still poor for the first half of the data. These results are not surprising, since the algorithm is not very sensitive to multipath at low frequencies.
Figure 18. Elevation angle using the estimated phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 21 and 27 Hz.
The beamforming algorithm with the multipath model was rerun using the average of the computed reflection coefficient for two spectral peaks between 9 and 27 Hz. This is a more realistic simulation compared to the previous calculations. The results are shown in figure 19. These results are poor at all times and indicate that the model used to describe the propagation of the signal is not adequate. The assumptions of straight-line propagation, a constant reflection coefficient, or reflection off a flat Earth may not be valid.
Figure 19. Elevation angle using the average phase of the reflection coefficient based upon a simple multipath model and ground truth data for frequencies between 9 and 27 Hz.
Statistics associated with the acoustic signatures of the helicopter were examined to help understand the elevation angle results. Figure 20 shows the calibrated power of the four microphones at site 5. Microphone 1 is elevated 1.0 m above microphones 2–4. For data at times less than 250 s, the power of microphone 1 is less than the power of microphones 2–4. After 250 s, the power is approximately equal. After 250 s, the elevation angle results were good for the estimation algorithms using three microphones. These results indicate that a simple test can be designed to determine when the estimated elevation angles are good. The signature of the acoustic data changes at approximately 250 s. At times after 250 s, the impulse from the helicopter blades is clearly defined, as seen in figure 5. At times before 250 s, the impulse from the helicopter blade is not clearly defined, as seen in figure 4. This result suggests that the propagation through the atmosphere was more dispersive at times before 250 s.
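The simple test mentioned above could take the form of a power-balance check between the elevated microphone and the three lower microphones; the sketch below is one hypothetical form, and the 3-dB threshold is an assumed value, not one taken from the report.

import numpy as np

def elevation_estimate_usable(power_mics, threshold_db=3.0):
    """Flag the elevation estimate as usable when the elevated microphone (index 0 here)
    receives roughly the same power as the three lower microphones (sketch)."""
    p = np.asarray(power_mics, dtype=float)
    deficit_db = 10.0 * np.log10(np.mean(p[1:]) / p[0])   # how far mic 1 is below the others
    return deficit_db < threshold_db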
Figure 20. Normalized power from each microphone at site 5.
The state of the helicopter was investigated as a potential cause of the fluctuating power across the microphone array. The location and orientation of the helicopter were overlaid on the results, as shown in figure 21. The normalized range is denoted "rg" in black, the normalized azimuth angle is denoted by "az" in green, the normalized elevation angle is denoted by "el" in cyan, and the normalized heading is denoted by "head" in yellow. This data shows no obvious correlation between the state of the helicopter and the elevation angle results.
Figure 21. Normalized power from each microphone and telemetry data collected on the helicopter.
Changing environmental conditions are a possible cause for the fluctuating power of the microphones. The environmental logs indicate that it was over 100 °F with wind gusts of up to 10 kts. Wind gusts may have played a role in the changing signature of the helicopter. More detailed environmental data is available, but it was not analyzed in this report.
5. Conclusions
Algorithms were developed and evaluated to estimate the AOA of acoustic signals generated by a helicopter. Data were collected at Yuma, AZ, in summer 2007 and analyzed for five acoustic arrays. The AOA calculations were performed in the frequency domain using both least squares and digital beamforming approaches. Six peaks in the Fourier spectrum were used for frequencies between 9.5 and 100 Hz. The processing power requirements for the efficient least squares algorithm were significantly less than the requirements for the original least squares algorithms. The processing requirements for the beamforming algorithms were the largest.
The results indicate that the azimuth angle could be estimated to a precision of approximately 1–2°, but the elevation angle estimation results were inconsistent. For applications that track a single helicopter, the median W2LSQ algorithm had good performance and reduced processing requirements. For applications that may track more than one helicopter, the algorithms based upon least squares are inadequate using the current scheme of selecting peaks. The W2BF algorithm using three microphones had the best performance of the algorithms evaluated and can potentially track more than one helicopter. However, for multiple targets, the frequency range should be changed to exclude or deemphasize the lower frequencies and emphasize the higher frequencies.
Both the W2LSQ and W2BF algorithms should obtain optimal performance for known Gaussian noise; however, the beamforming algorithms performed slightly better. This may be the result of the matrix inverse required in the least squares solution, which can amplify small calibration errors in the observation matrix.
Surprisingly, the azimuth angle estimation results summarized in table 10 that were processed using three microphones were slightly better than those processed using four microphones. The reason may be that multipath and/or seismic/acoustic coupling effects were cancelled in the processing using three microphones, but were not cancelled in the processing using four microphones. However, without a clear understanding of the phenomenology responsible for these results, I do not recommend eliminating the top microphone in the array.
Visual observation indicated the elevation angle results were reasonable for data collected between 250–420 s. The beamforming algorithms only worked when the algorithms used three microphones, not four. This requirement limits the algorithms to targets not in the plane of the microphone array. The characteristics of the data collected on the elevated microphone changed between the 0–250 s and 250–420 s time intervals. The normalized power of the signal was smaller compared to the lower microphones, and the propagation of the signal from the helicopter looked more dispersive during the 0–250 s time interval. The underlying phenomenology for this behavior is still being investigated. To improve the elevation angle estimate, a multipath model was incorporated into the beamforming algorithm. The algorithm assumed multipath could be modeled with a single bounce, a constant reflection coefficient, straight-line propagation, a flat Earth, and incident angles that were not near grazing. This algorithm did not work well. A more detailed analysis is needed to understand its deficiencies.
6. References
1. Santa Maria, O. L.; Farassat, F.; Morris, P. J. Two-dimensional Fourier Transform Analysis of Helicopter Flyover Noise. American Helicopter Society 55th Annual Forum, Montreal, Quebec, May 25–27, 1999.
2. Wikipedia. Helicopter noise reduction, last modified 22 April 2009. http://en.wikipedia.org/wiki/Helicopter_noise_reduction (accessed 2009).
3. ShawburyNIGS_AnnexB_Aircraft_Noise.pdf. http://www.mod.uk/NR/rdonlyres/0609B906328C44E78EE149F122CC11B0/0/ (accessed 2009).
4. Burgett, R.; Sabatier, J. M. Analysis of Air and Mechanically Coupled Ground Vibrations. Proc. SPIE 2008, 6963.
5. Van Trees, H. L. Optimum Array Processing; Wiley Interscience, 2002; p 29.
6. Lawson, C.; Hanson, R. Solving Least Squares Problems; Prentice-Hall: Englewood Cliffs, NJ, 1974.
7. Kay, S. M. Fundamentals of Statistical Signal Processing: Estimation Theory; Prentice-Hall, 1993.
8. Shanmugan, K. S.; Breipohl, A. M. Random Signals: Detection, Estimation and Data Analysis; John Wiley & Sons, 1988; p 323.
List of Symbols, Abbreviations, and Acronyms
1-D     one-dimensional
ADC     analog-to-digital converter
AOA     angle of arrival
BVI     blade-vortex interaction
CPI     coherent processing interval
CRLB    Cramer-Rao lower bound
DFT     discrete Fourier transform
FFT     fast Fourier transform
GPS     global positioning system
HSI     high-speed impulse
iid     independent and identically distributed
INS     inertial navigation system
ML      maximum likelihood
RMS     root-mean-square
SNR     signal-to-noise ratio
SVD     singular value decomposition
NO. OF COPIES ORGANIZATION
1 ELEC
ADMNSTR DEFNS TECHL INFO CTR ATTN DTIC OCP 8725 JOHN J KINGMAN RD STE 0944 FT BELVOIR VA 22060-6218
1 HC
US ARMY RSRCH LAB ATTN RDRL CIM G T LANDFRIED BLDG 4600 ABERDEEN PROVING GROUND MD 21005-5066
3 HCS
US ARMY RSRCH LAB ATTN IMNE ALC HRR MAIL & RECORDS MGMT ATTN RDRL CIM L TECHL LIB ATTN RDRL CIM P TECHL PUB ADELPHI MD 20783-1197
1 DARPA ATTN IXO S WELBY 3701 N FAIRFAX DR ARLINGTON VA 22203-1714
1 CD
OFC OF THE SECY OF DEFNS ATTN ODDRE (R&AT) THE PENTAGON WASHINGTON DC 20301-3080
1 US ARMY RSRCH DEV AND ENGRG CMND ARMAMENT RSRCH DEV AND ENGRG CTR ARMAMENT ENGRG AND TECHNLGY CTR ATTN AMSRD AAR AEF T J MATTS BLDG 305 ABERDEEN PROVING GROUND MD 21005-5001
NO. OF COPIES ORGANIZATION
1 PM TIMS, PROFILER (MMS-P) AN/TMQ-52 ATTN B GRIFFIES BUILDING 563 FT MONMOUTH NJ 07703
1 COMMANDER US ARMY RDECOM ATTN AMSRD AMR W C MCCORKLE 5400 FOWLER RD REDSTONE ARSENAL AL 35898-5000
1 US GOVERNMENT PRINT OFF DEPOSITORY RECEIVING SECTION ATTN MAIL STOP IDAD J TATE 732 NORTH CAPITOL ST NW WASHINGTON DC 20402
4 US ARMY RSRCH LAB ATTN RDRL SER M G GOLDMAN ATTN RDRL SES P V MIRELLI L SOLOMON T TRAN-LUU ADELPHI MD 20783-1197
TOTAL: 15 (1 ELEC, 1 CD, 13 HCS)