Coherence Bandwidth Loss in Transionospheric Radio Propagation
Abstract:
In this report a theoretical model is developed that predicts the single-point, two-frequency coherence function for transionospheric radio waves. The theoretical model is compared with measured complex frequency correlation coefficients obtained from the seven equispaced, phase-coherent UHF signals transmitted by the Wideband satellite. The theory and data are in excellent agreement. The theory depends critically on the power-law index, and the frequency coherence data clearly favor the comparatively small spectral indices that have been consistently measured from the Wideband satellite phase data. A model for estimating the pulse delay jitter induced by the coherence bandwidth loss is also developed and compared with the actual delay jitter observed on synthesized pulses obtained from the Wideband UHF comb; the measured jitter is in good agreement with the theory. The results presented in this report, which are based on an asymptotic theory, are compared with the more commonly used quadratic theory. The model developed and validated in this report can be used to predict the effects of coherence bandwidth loss in disturbed nuclear environments, and simple formulas for the resultant pulse delay jitter are derived that can be used in predictive codes.
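As an illustration of the quantity being measured, the following minimal Python sketch estimates the complex frequency correlation coefficient between two phase-coherent tone records, defined as the normalized cross-correlation <s1 s2*> / sqrt(<|s1|^2><|s2|^2>). The simulated scintillation model and the inter-tone correlation parameter rho are illustrative assumptions only; they are not the report's data or method.

```python
import numpy as np

rng = np.random.default_rng(0)

def freq_corr(s1, s2):
    """Complex frequency correlation coefficient between two
    received complex tone records s1 and s2."""
    num = np.mean(s1 * np.conj(s2))
    den = np.sqrt(np.mean(np.abs(s1) ** 2) * np.mean(np.abs(s2) ** 2))
    return num / den

# Illustrative stand-in for two Wideband comb tones: correlated
# complex perturbations with an assumed inter-tone correlation rho.
n = 4096
common = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
indep1 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
indep2 = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
rho = 0.8  # assumed correlation between the two tones
s1 = np.sqrt(rho) * common + np.sqrt(1 - rho) * indep1
s2 = np.sqrt(rho) * common + np.sqrt(1 - rho) * indep2

print(abs(freq_corr(s1, s2)))  # magnitude should be near rho = 0.8
```

In practice the coefficient is computed for every tone pair in the comb, giving the measured decorrelation as a function of frequency separation that the theoretical two-frequency coherence function is fit against.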