The time delay estimation problem associated with an ensemble of misaligned, repetitive signals is revisited. Each observed signal is assumed to be composed of an unknown, deterministic signal corrupted by white Gaussian noise. This paper shows that maximum likelihood (ML) time delay estimation can be viewed as the maximization of an eigenvalue ratio, where the eigenvalues are obtained from the ensemble correlation matrix. A suboptimal, one-step time delay estimator, based on one of the eigenvectors of the inter-signal correlation matrix, is proposed for initializing the ML estimator. With this approach, the ML estimates can be determined without the need for an intermediate estimate of the underlying, unknown signal. Simulations based on respiratory flow signals show that the variance of the time delay estimation error for the eigenvalue-based method is almost the same as that of the ML estimator. By initializing the maximization with the one-step estimates, rather than using the ML estimator alone, the computation time is reduced by a factor of 5M when using brute-force maximization (M denoting the number of signals in the ensemble), and by a factor of about 1.5 when using particle swarm maximization. It is concluded that eigenanalysis of the ensemble correlation matrix not only provides valuable insight into how signal energy, jitter, and noise influence the estimation process, but also leads to a one-step estimator that can pave the way for a substantial reduction in computation time.
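The eigenvalue-ratio view of ML alignment can be illustrated with a minimal numerical sketch: when the candidate delays are correct, the realigned ensemble correlation matrix is close to rank one, so the ratio of its largest eigenvalue to its trace peaks. The signal model, pulse shape, noise level, and brute-force integer-delay grid below are illustrative assumptions, not the paper's actual simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed synthetic ensemble: one deterministic pulse, M = 3 circularly
# shifted copies in white Gaussian noise (a stand-in for the paper's
# repetitive respiratory flow signals).
N = 64
t = np.arange(N)
s = np.exp(-0.5 * ((t - N // 2) / 4.0) ** 2)   # unknown underlying signal
true_delays = [0, 3, -2]                        # first signal taken as reference
X = np.stack([np.roll(s, d) + 0.01 * rng.standard_normal(N)
              for d in true_delays])

def eig_ratio(shifts):
    """Largest-eigenvalue-to-trace ratio of the ensemble correlation
    matrix after undoing the candidate shifts; approaches 1 when the
    realigned signals are identical up to noise (rank-one structure)."""
    A = np.stack([np.roll(x, -tau) for x, tau in zip(X, shifts)])
    C = A @ A.T                                 # M x M ensemble correlation matrix
    lam = np.linalg.eigvalsh(C)                 # eigenvalues in ascending order
    return lam[-1] / lam.sum()

# Brute-force search over integer delays for signals 2 and 3 (signal 1
# fixed as reference), maximizing the eigenvalue ratio.
grid = range(-5, 6)
best = max(((0, a, b) for a in grid for b in grid), key=eig_ratio)
print(best[1:])   # recovers the true relative delays at this noise level
```

In this toy setting the exhaustive grid search stands in for the paper's maximization strategies; the point is only that the eigenvalue ratio is an alignment criterion that never requires an explicit estimate of the underlying signal.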