Optimal Sufficient Dimension Reduction in Regressions with Categorical Predictors
Though partial sliced inverse regression (partial SIR; Chiaromonte et al. [2002. Sufficient dimension reduction in regressions with categorical predictors. Ann. Statist. 30, 475-497]) extended the scope of sufficient dimension reduction to regressions with both continuous and categorical predictors, its requirement of homogeneous predictor covariances across the subpopulations restricts its application in practice. When this condition fails, partial SIR may provide misleading results. In this article, we propose a new estimation method via a minimum discrepancy approach that does not require this restriction. Our method is optimal in terms of asymptotic efficiency, and its statistic for testing the dimension of the partial central subspace always has an asymptotic chi-squared distribution. It also allows us to test predictor effects: we obtain an asymptotic chi-squared test of the conditional independence hypothesis that the response is independent of a selected subset of the continuous predictors given the remaining predictors.
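To make the setting concrete, the following is a minimal, illustrative sketch of a partial-SIR-style estimator: within each level of the categorical predictor, a SIR kernel (the covariance of slice means of the continuous predictors) is formed, and the kernels are pooled across subpopulations weighted by subpopulation size. All function names (`slice_mean_cov`, `partial_sir`) and the simple eigen-decomposition estimator are assumptions for illustration only; the paper's optimal minimum discrepancy estimator is more involved than this naive pooling.

```python
import numpy as np

def slice_mean_cov(X, y, n_slices=5):
    """SIR kernel: covariance of within-slice means of X, slicing on y."""
    n, p = X.shape
    xbar = X.mean(axis=0)
    M = np.zeros((p, p))
    # Sort observations by the response and split into slices.
    for idx in np.array_split(np.argsort(y), n_slices):
        m = X[idx].mean(axis=0) - xbar
        M += (len(idx) / n) * np.outer(m, m)
    return M

def partial_sir(X, y, w, d=1, n_slices=5):
    """Toy partial-SIR sketch (not the paper's optimal estimator):
    pool within-subpopulation SIR kernels, weighted by subpopulation
    size, and solve a generalized eigenproblem against the pooled
    within-subpopulation predictor covariance."""
    n, p = X.shape
    M = np.zeros((p, p))
    S = np.zeros((p, p))
    for c in np.unique(w):
        mask = (w == c)
        f = mask.sum() / n
        M += f * slice_mean_cov(X[mask], y[mask], n_slices)
        S += f * np.cov(X[mask], rowvar=False)
    # Symmetrize the generalized eigenproblem M v = lambda S v.
    Li = np.linalg.inv(np.linalg.cholesky(S))
    vals, vecs = np.linalg.eigh(Li @ M @ Li.T)
    beta = Li.T @ vecs[:, np.argsort(vals)[::-1][:d]]
    return beta / np.linalg.norm(beta, axis=0)  # unit-norm directions
```

Note that each subpopulation contributes its own covariance estimate to the pooled matrix `S`, so no homogeneity of predictor covariances is imposed in this sketch; the eigenvectors corresponding to the largest eigenvalues estimate a basis for the reduced predictor subspace.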
X. M. Wen and R. D. Cook, "Optimal Sufficient Dimension Reduction in Regressions with Categorical Predictors," Journal of Statistical Planning and Inference, Elsevier, Jan 2006.
The definitive version is available at http://dx.doi.org/10.1016/j.jspi.2006.05.008
Keywords and Phrases
inverse regression; minimum discrepancy approach; partial SIR; partial central subspace
© 2006 Elsevier. All rights reserved.