Comparison of time delay estimators in medical ultrasound
Time delay estimation (TDE) is a common operation in ultrasound signal processing. A variety of TDE algorithms have been developed and applied in medical ultrasound, sonar, radar, and other fields. In this paper we analyze the performance of the widely used normalized correlation, normalized covariance, sum of absolute differences (SAD), sum of squared differences (SSD), hybrid-sign correlation, and polarity-coincidence correlation estimators, as well as the Meyr-Spies method. These techniques were simulated on ultrasound radio frequency (RF) data under a variety of conditions. We show how parameters, including kernel window size, signal decorrelation, and signal-to-noise ratio (SNR), affect the accuracy of the delay estimate. Simulation results are also compared with the theoretical performance limit set by the Cramér-Rao lower bound (CRLB). Results show that for high SNR, high signal correlation, and large kernel size, all of the tested algorithms closely match the theoretical bound, with nearly identical performance. As conditions degrade, their performances begin to differ, with normalized correlation, normalized covariance, and SSD typically outperforming the other algorithms.
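As a minimal illustration of two of the estimator families compared here, the sketch below applies a windowed normalized cross-correlation and an SSD search to a synthetic delayed signal. It is not the paper's code: the signal model, kernel size, SNR, and search range are illustrative assumptions chosen only to show how a kernel window is matched against candidate lags.

```python
# Sketch of windowed normalized cross-correlation and SSD delay estimation
# on synthetic data; all parameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic band-limited noise standing in for an RF line.
n = 2000
ref = np.convolve(rng.standard_normal(n), np.hanning(16), mode="same")

true_delay = 7                      # integer sample delay to recover
snr_db = 20.0                       # assumed SNR for the noisy copy
delayed = np.roll(ref, true_delay)
delayed += rng.standard_normal(n) * np.std(ref) * 10 ** (-snr_db / 20)

kernel = 64                         # kernel (window) size in samples
start = 1000                        # window location in the reference line
max_lag = 20                        # candidate-delay search range
template = ref[start:start + kernel]

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-length windows."""
    a = a - a.mean()
    b = b - b.mean()
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

lags = np.arange(-max_lag, max_lag + 1)
ncc_vals = [ncc(template, delayed[start + lag:start + lag + kernel])
            for lag in lags]
ssd_vals = [np.sum((template - delayed[start + lag:start + lag + kernel]) ** 2)
            for lag in lags]

print("true delay:", true_delay)
print("NCC estimate:", lags[int(np.argmax(ncc_vals))])  # maximize correlation
print("SSD estimate:", lags[int(np.argmin(ssd_vals))])  # minimize squared error
```

Under these benign conditions (high SNR, fully correlated signals, a reasonably long kernel) both estimators recover the same integer lag, consistent with the observation that the compared algorithms behave nearly identically until SNR, correlation, or kernel size is reduced.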