In the traditional model comparison procedure, two nested structural models are hypothesized to be equivalent under some restrictions, e.g., equality constraints. A strict null hypothesis is then evaluated by statistical tests to decide whether to accept or reject the restrictions that differentiate the models. We propose instead to evaluate model close match, using the distance between two models in terms of the Kullback-Leibler (1951) Information Criterion, either as important supplementary information or as a criterion for nested structural model comparison. Building on the results of Vuong (1989) and Yuan, Hayashi and Bentler (2005), we develop ADF-like generalized RMSEA tests for inference on model closeness. Simulation studies show that the proposed tests perform robustly and as desired despite severe nonnormality across several examples once the sample size reaches 150, and their relevance to educational research is illustrated with models for TOEFL data. Consequently, a two-stage procedure that combines the traditional nested model comparison with the additional inferential information regarding model close match is suggested to improve the typical practice of structural model modification.
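To make the model-distance idea concrete, the sketch below computes the Kullback-Leibler divergence between two multivariate normal distributions using its standard closed form. This is only an illustration of measuring distance between two distributional models, not the paper's specific test statistic; the function name `kl_mvn` and the example covariance matrices are hypothetical.

```python
import numpy as np

def kl_mvn(mu0, Sigma0, mu1, Sigma1):
    """KL(N0 || N1) for two multivariate normals, via the standard
    closed form: 0.5 * [tr(S1^-1 S0) + (m1-m0)' S1^-1 (m1-m0)
                        - k + ln(det S1 / det S0)]."""
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    Sigma0, Sigma1 = np.asarray(Sigma0, float), np.asarray(Sigma1, float)
    k = mu0.shape[0]                       # dimension of the distributions
    Sigma1_inv = np.linalg.inv(Sigma1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(Sigma1_inv @ Sigma0)      # trace term
        + diff @ Sigma1_inv @ diff         # mean-difference (Mahalanobis) term
        - k                                # dimension correction
        + np.log(np.linalg.det(Sigma1) / np.linalg.det(Sigma0))  # log-det term
    )

# A model identical to itself has zero distance; a model with a
# different covariance has strictly positive distance.
mu = [0.0, 0.0]
S0 = [[1.0, 0.3], [0.3, 1.0]]
S1 = [[1.5, 0.0], [0.0, 1.5]]
print(kl_mvn(mu, S0, mu, S0))  # → 0 (up to rounding)
print(kl_mvn(mu, S0, mu, S1) > 0)
```

The divergence is zero exactly when the two distributions coincide and grows as they separate, which is what makes it usable as a graded measure of "close match" rather than a binary accept/reject decision.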