A Note on Approximation of Likelihood Ratio Statistic in Exploratory Factor Analysis

In normal-theory exploratory factor analysis, the likelihood ratio (LR) statistic plays an important role in evaluating the goodness-of-fit of the model. In this paper, we derive an approximation of the LR statistic. The approximation is then used to show explicitly that the expectation of the LR statistic agrees with the degrees of freedom of the asymptotic chi-square distribution.


Introduction
Factor analysis [1] [2] is used in various fields to study the interdependence among a set of observed variables by postulating underlying factors. We consider the model of exploratory factor analysis in the form

Σ = ΛΛ′ + Ψ, (1)

where Σ is the p × p covariance matrix of the observed variables, Λ is a p × m matrix of factor loadings, and Ψ is a diagonal matrix of error variances with diagonal elements ψ_ii. Under the assumption of multivariate normal distributions for the observations, the parameters are estimated with the method of maximum likelihood, and the goodness-of-fit of the model can be judged by using the likelihood ratio (LR) test for testing the null hypothesis Σ = ΛΛ′ + Ψ for a specified m against the alternative that Σ is unconstrained. From the theory of LR tests, the degrees of freedom, ν, of the asymptotic chi-square distribution is the difference between the numbers of free parameters in the alternative model and the null model. In (1), Σ remains unchanged if Λ is replaced by ΛT for any m × m orthogonal matrix T. Hence,

ν = p(p + 1)/2 − {pm + p − m(m − 1)/2} = {(p − m)² − (p + m)}/2. (2)
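As a quick numerical sketch of (2), the short Python function below (our own illustration; the name efa_df is not from the paper) counts the free parameters of the two models and confirms the closed-form expression:

```python
def efa_df(p: int, m: int) -> int:
    """Degrees of freedom of the LR test in exploratory factor analysis.

    Alternative model: an unconstrained p x p covariance matrix with
    p(p+1)/2 free parameters. Null model: p*m loadings plus p error
    variances, minus m(m-1)/2 for the rotational indeterminacy
    Lambda -> Lambda T with T orthogonal.
    """
    free_alternative = p * (p + 1) // 2
    free_null = p * m + p - m * (m - 1) // 2
    return free_alternative - free_null

# The difference of parameter counts agrees with the closed form
# nu = {(p - m)^2 - (p + m)}/2 from (2):
for p, m in [(6, 2), (9, 3), (12, 4)]:
    assert efa_df(p, m) == ((p - m) ** 2 - (p + m)) // 2
```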

Approximation of the LR Statistic
Let S be the usual unbiased estimator of Σ based on a random sample of size N = n + 1 from the multivariate normal population N_p(μ, Σ) with Σ = ΛΛ′ + Ψ. For the existence of consistent estimators, we assume that the solution Ψ of Σ = ΛΛ′ + Ψ is unique. A necessary condition for the uniqueness of the solution Ψ, up to multiplication on the right of Λ by an orthogonal matrix, is that each column of ΛT has at least three non-zero elements for every non-singular matrix T ([3], Theorem 5.6). This condition implies, in particular, that p ≥ 3. The maximum Wishart likelihood estimators Λ̂ and Ψ̂ are defined as the values of Λ and Ψ that minimize

F(Σ) = log|Σ| − log|S| + tr(SΣ⁻¹) − p (3)

with Σ = ΛΛ′ + Ψ. Then, Λ̂ and Ψ̂ can be shown to be the solutions of the following equations:

Σ̂⁻¹(Σ̂ − S)Σ̂⁻¹Λ̂ = 0, (4)

diag{Σ̂⁻¹(Σ̂ − S)Σ̂⁻¹} = 0, (5)

where Σ̂ = Λ̂Λ̂′ + Ψ̂. The motivation behind the minimization of (3) is that n times the minimum value, n F(Σ̂), is the LR statistic described in the previous section. Under (4) and (5),

tr(SΣ̂⁻¹) = p. (6)

From the second-order Taylor formula, we have an approximation of the LR statistic as

n F(Σ̂) ≈ (n/2) tr[{(S − Σ̂)Σ̂⁻¹}²] (7)

by virtue of (5) [1] [2]. While the approximation on the right-hand side of (7) shows how the LR statistic is related to the sum of squares of standardized residuals [4], it does not enable us to investigate the distributional properties of the LR statistic. To overcome this difficulty, we express the LR statistic as a function of U = S − Σ. Let Ψ̃ and Λ̃ denote the terms of Ψ̂ − Ψ and Λ̂ − Λ linear in the elements of U. Then we have the following proposition.
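The discrepancy function (3) and its second-order approximation (7) can be checked numerically. The sketch below (our own, using NumPy; the matrices are arbitrary) verifies that F vanishes at Σ = S and that, for Σ near S, the quadratic form on the right-hand side of (7) agrees with F up to higher-order terms:

```python
import numpy as np

def F(Sigma, S):
    """Wishart ML discrepancy F(Sigma) = log|Sigma| - log|S| + tr(S Sigma^{-1}) - p."""
    p = S.shape[0]
    _, logdet_Sigma = np.linalg.slogdet(Sigma)
    _, logdet_S = np.linalg.slogdet(S)
    return logdet_Sigma - logdet_S + np.trace(S @ np.linalg.inv(Sigma)) - p

rng = np.random.default_rng(0)
p = 4
A = rng.standard_normal((p, p))
S = A @ A.T + p * np.eye(p)        # a generic positive-definite "sample" covariance

# F attains its minimum value 0 at Sigma = S
min_value = F(S, S)                 # numerically ~0

# Second-order Taylor check: F(Sigma) ~ (1/2) tr[{(S - Sigma) Sigma^{-1}}^2]
E = 1e-3 * rng.standard_normal((p, p))
Sigma = S + (E + E.T) / 2           # small symmetric perturbation of S
R = (S - Sigma) @ np.linalg.inv(Sigma)
quad = 0.5 * np.trace(R @ R)
err = abs(F(Sigma, S) - quad)       # discrepancy is third-order small
```

The error term `err` shrinks cubically as the perturbation is scaled down, which is exactly the behavior the second-order Taylor formula predicts.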
Proposition 1. An approximation of the LR statistic is given by

n F(Σ̂) ≈ (n/2) tr[{(U − Σ̃)Σ⁻¹}²] = (n/2)[tr{(UΣ⁻¹)²} − tr{Σ̃Σ⁻¹UΣ⁻¹}], (8)

where Σ̃ = Λ̃Λ′ + ΛΛ̃′ + Ψ̃ denotes the part of Σ̂ − Σ linear in the elements of U.

Proof. By substituting S = Σ + U, Λ̂ = Λ + Λ̃, and Ψ̂ = Ψ + Ψ̃ into (4) and (5) and considering only linear terms, we have

Σ⁻¹(Σ̃ − U)Σ⁻¹Λ = 0, (11)

diag{Σ⁻¹(Σ̃ − U)Σ⁻¹} = 0, (12)

where Σ̃ = Λ̃Λ′ + ΛΛ̃′ + Ψ̃. Replacing S − Σ̂ with U − Σ̃ and Σ̂⁻¹ with Σ⁻¹ in (7) changes the right-hand side only by terms of higher order in the elements of U, which gives the first expression in (8). Moreover, it follows from (11) and (12) that tr{(U − Σ̃)Σ⁻¹Σ̃Σ⁻¹} = 0: the transpose of (11) gives Λ′Σ⁻¹(U − Σ̃)Σ⁻¹ = 0, which annihilates the Λ̃Λ′ and ΛΛ̃′ parts of Σ̃, while (12) together with the diagonality of Ψ̃ annihilates the Ψ̃ part. Expanding the quadratic form then yields the second expression in (8), thus establishing the desired result.

Evaluating Expectation
For the purpose of demonstrating the usefulness of the derived approximation, we show explicitly that the expectation of (8) agrees with the degrees of freedom, ν, in (2) of the asymptotic chi-square distribution. We evaluate the expectation of (8) by using

E(u_ij u_kl) = (σ_ik σ_jl + σ_il σ_jk)/n,

where u_ij and σ_ij denote the (i, j)-th elements of U and Σ, respectively. From this moment formula, the expectation of the first term in (8) can be shown to be

E[(n/2) tr{(UΣ⁻¹)²}] = p(p + 1)/2. (15)

To evaluate the expectation of the second term in (8), we need to express Ψ̃ in terms of U. Let the symbol ∘ denote the Hadamard product of matrices, and define Ξ as in [5]. If Ξ is positive definite, then the linear system (13) can be solved for Ψ̃ in terms of U [3]. An expression of ψ̃_ii is given in (14), where Θ_i is a diagonal matrix whose diagonal elements are the i-th column (row) of Ξ⁻¹. Using (14) together with the moment formula above, where δ_ij is the Kronecker delta, the expectation of the second term in (8) can be evaluated as

E[−(n/2) tr{Σ̃Σ⁻¹UΣ⁻¹}] = −{pm + p − m(m − 1)/2}. (18)

By combining (15) and (18), we obtain the desired result: E[n F(Σ̂)] ≈ p(p + 1)/2 − {pm + p − m(m − 1)/2} = ν.
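The claim that the first term of (8) has expectation p(p + 1)/2 can be checked by simulation. The following Monte Carlo sketch (our own; the choice of Σ, n, and the number of replications is arbitrary) draws S as a rescaled Wishart matrix and averages (n/2) tr{(UΣ⁻¹)²} with U = S − Σ:

```python
import numpy as np

p, n, reps = 4, 50, 2000
Sigma = 0.3 * np.ones((p, p)) + 0.7 * np.eye(p)   # an arbitrary positive-definite Sigma
L = np.linalg.cholesky(Sigma)
Sigma_inv = np.linalg.inv(Sigma)

rng = np.random.default_rng(42)
vals = np.empty(reps)
for r in range(reps):
    Y = rng.standard_normal((n, p)) @ L.T   # n i.i.d. N_p(0, Sigma) rows
    S = Y.T @ Y / n                          # S = W/n with W ~ Wishart_p(n, Sigma)
    R = (S - Sigma) @ Sigma_inv              # standardized residual U Sigma^{-1}
    vals[r] = 0.5 * n * np.trace(R @ R)      # (n/2) tr{(U Sigma^{-1})^2}

target = p * (p + 1) / 2                     # = 10 for p = 4
```

Because the moment formula E(u_ij u_kl) = (σ_ik σ_jl + σ_il σ_jk)/n holds exactly for S distributed as W/n, the Monte Carlo average of `vals` should match `target` for any n, not only asymptotically.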