Optimal Sufficient Dimension Reduction in Regressions with Categorical Predictors


Though partial sliced inverse regression (partial SIR; Chiaromonte et al. [2002. Sufficient dimension reduction in regressions with categorical predictors. Ann. Statist. 30, 475-497]) extended the scope of sufficient dimension reduction to regressions with both continuous and categorical predictors, its requirement of homogeneous predictor covariances across the subpopulations restricts its application in practice. When this condition fails, partial SIR may provide misleading results. In this article, we propose a new estimation method via a minimum discrepancy approach that does not require this restriction. Our method is optimal in terms of asymptotic efficiency, and its test statistic for the dimension of the partial central subspace always has an asymptotic chi-squared distribution. The method also makes it possible to test predictor effects: we obtain an asymptotic chi-squared test of the conditional independence hypothesis that the response is independent of a selected subset of the continuous predictors given the remaining predictors.
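To illustrate the inverse-regression machinery on which the article builds, the sketch below implements plain SIR: standardize the predictors, slice on the response, and eigen-decompose the weighted covariance of the slice means. This is only the continuous-predictor building block, not the authors' minimum discrepancy estimator; partial SIR extends it by slicing within each level of the categorical predictor. The function name, arguments, and slicing scheme are illustrative choices, not from the paper.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sketch of sliced inverse regression (SIR).

    Standardizes X, slices the data on y, and eigen-decomposes the
    weighted covariance of the standardized slice means; the leading
    eigenvectors (back-transformed) estimate the dimension-reduction
    directions.
    """
    n, p = X.shape
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Inverse square root of the predictor covariance via eigendecomposition
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_half = V @ np.diag(w ** -0.5) @ V.T
    Z = (X - mu) @ Sigma_inv_half
    # Slice by the order of y and accumulate the kernel matrix
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, vecs = np.linalg.eigh(M)
    eta = vecs[:, ::-1][:, :n_dirs]
    return Sigma_inv_half @ eta
```

In a simulated single-index model y = X'beta + error, the leading estimated direction should be nearly proportional to beta; the heterogeneous-covariance setting the article addresses is exactly where applying this slicing naively across subpopulations can mislead.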


Mathematics and Statistics

Keywords and Phrases

inverse regression; minimum discrepancy approach; partial SIR; partial central subspace

Document Type

Article - Journal

© 2006 Elsevier, All rights reserved.

Publication Date

01 Jan 2006