Matched-Field Performance Prediction with Model Mismatch
Abstract
Matched-field estimation is known to be sensitive to mismatch between the assumed replica of the acoustic field and the actual field. A method of interval errors (MIE) is proposed to predict the mean-squared error (MSE) performance of multisnapshot, multifrequency maximum-likelihood matched-field estimation under model mismatch. The source signal is assumed to be deterministic and unknown. Global errors are predicted by deriving exact expressions for the pairwise error probabilities under model mismatch and combining them through the union bound. Local errors are approximated via a Taylor expansion of the MSE. Numerical examples demonstrate the accuracy of the method.
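The MIE combination of global and local error contributions described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's derivation: the sidelobe locations, pairwise error probabilities, and local variance below are hypothetical placeholders, and in practice the probabilities would come from the exact mismatched pairwise-error expressions and the local variance from the Taylor expansion of the MSE.

```python
import numpy as np

def mie_mse(theta_true, sidelobes, pairwise_probs, local_var):
    """Method-of-interval-errors MSE prediction (illustrative sketch).

    theta_true    : true source parameter (e.g. range)
    sidelobes     : ambiguity-surface sidelobe locations (candidate
                    global-error estimates)
    pairwise_probs: pairwise error probabilities P_k for each sidelobe,
                    summed here as a union bound on the global-error event
    local_var     : small-error variance about theta_true, obtained in the
                    paper from a Taylor expansion of the MSE
    """
    sidelobes = np.asarray(sidelobes, dtype=float)
    p = np.asarray(pairwise_probs, dtype=float)
    p_global = p.sum()  # union bound on the probability of a global error
    # Global errors contribute squared distances to the sidelobes; local
    # errors contribute the small-error variance, weighted by the
    # probability that no global error occurs.
    return float(p @ (sidelobes - theta_true) ** 2
                 + (1.0 - p_global) * local_var)

# Illustrative numbers only (not taken from the paper):
mse = mie_mse(theta_true=5.0,
              sidelobes=[3.2, 7.1],
              pairwise_probs=[0.02, 0.01],
              local_var=1e-3)
```

The predicted MSE interpolates between the local (small-error) regime at high SNR, where the pairwise error probabilities vanish, and the global (sidelobe-dominated) regime below threshold.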