This paper presents, for the first time, the exact theoretical solution to the problem of maximum-likelihood (ML) estimation of time-varying delay d(t) between a random signal s(t) received at one point in the presence of uncorrelated noise, and the time-delayed, scaled version as(t - d(t)) of that signal received at another point in the presence of uncorrelated noise. The signal is modeled as a sample function of a nonstationary Gaussian random process, and the observation interval is arbitrary. The analysis of this paper represents a generalization of that of Knapp and Carter [1], who derived the ML estimator for the case that the delay is constant, d(t) = d₀, the signal process is stationary, and the received processes are observed over the infinite interval (-∞, ∞). We show that the ML estimator of d(t) can be implemented in any of four canonical forms which, in general, are time-varying systems. We also show that our results reduce to a generalized cross correlator for the special case treated in [1]. Copyright © 1987 by the Institute of Electrical and Electronics Engineers, Inc.
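The generalized cross correlator mentioned in the abstract estimates a constant delay d₀ by cross-correlating the two received waveforms and locating the peak. The following is a minimal sketch of that idea (plain cross-correlation, with no prefiltering and a simulated Gaussian signal; all signal parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4096            # number of samples (illustrative)
true_delay = 25     # constant delay d0, in samples (illustrative)
a = 0.8             # attenuation of the second received signal

# Sample function of a zero-mean Gaussian signal process, plus
# uncorrelated noise at each of the two receivers.
s = rng.standard_normal(n)
x1 = s + 0.1 * rng.standard_normal(n)
x2 = a * np.roll(s, true_delay) + 0.1 * rng.standard_normal(n)

# Cross-correlate the two observations; the lag at the correlation
# peak is the delay estimate.
corr = np.correlate(x2, x1, mode="full")
lags = np.arange(-n + 1, n)
d_hat = int(lags[np.argmax(corr)])
print(d_hat)
```

A generalized cross correlator would additionally prefilter x1 and x2 (e.g. to weight frequency bands by coherence) before correlating; the plain peak search above is the unweighted special case.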


Electrical and Computer Engineering

Document Type

Article - Journal


Publication Date

01 Jan 1987