A Nonparametric Approach to 3D Shape Analysis from Digital Camera Images -- II
Abstract
With the wide availability of digital cameras and high-quality smartphone cameras, the world is awash in digital images. These cameras are most often uncalibrated, meaning that one has no knowledge of the internal camera parameters, such as focal length and optical center, nor of the external camera parameters, which describe the camera's position relative to the imaged scene. In the absence of additional information about the scene, one cannot recover many common types of geometric information; for instance, one cannot calculate distances between points or angles between lines. One can, however, determine the lesser-known projective shape of a scene. Projective shape data do not lie in a Euclidean space ℝp, which presents special challenges, since the overwhelming majority of statistical methods are designed for data in ℝp. Furthermore, we consider 3D projective shapes of contours extracted from digital camera images, which lie in an infinite-dimensional projective shape space and as such present an especially novel setting for doing statistics. We develop a novel nonparametric hypothesis testing method for the mean change from matched sample contours. Our methodology is applied to the two-sample problem for 3D projective shapes of contours extracted from digital camera images.
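The abstract's ingredients, extrinsic means and a nonparametric bootstrap for a matched-sample mean-change test, can be illustrated with a finite-dimensional toy analogue. The sketch below uses the unit sphere in place of the paper's infinite-dimensional projective shape space, and it is not the authors' actual procedure: the function names and the resampling scheme are illustrative assumptions. The extrinsic mean here is the Euclidean sample average projected back onto the manifold, and the bootstrap resamples matched pairs to approximate the distribution of the distance between the two samples' extrinsic means.

```python
import numpy as np

def extrinsic_mean(points):
    """Extrinsic mean on the unit sphere: the Euclidean average
    of the points, projected back onto the sphere.
    (Toy analogue of the extrinsic mean used for shape data.)"""
    m = points.mean(axis=0)
    return m / np.linalg.norm(m)

def bootstrap_mean_distance(x, y, n_boot=2000, seed=0):
    """Bootstrap distribution of the distance between the extrinsic
    means of matched samples x and y (both shape (n, 3), rows on the
    unit sphere). Rows are resampled jointly, preserving the matching.
    Illustrative sketch only, not the method of the cited paper."""
    rng = np.random.default_rng(seed)
    n = len(x)
    dists = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample matched pairs
        dists[b] = np.linalg.norm(extrinsic_mean(x[idx])
                                  - extrinsic_mean(y[idx]))
    return dists
```

A test would then compare the observed mean distance against a chosen quantile of this bootstrap distribution (or, in the neighborhood-hypothesis formulation named in the keywords, against a small tolerance rather than exact equality).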
Recommended Citation
M. Qiu et al., "A Nonparametric Approach to 3D Shape Analysis from Digital Camera Images -- II," Journal of Applied Statistics, vol. 46, no. 15, pp. 2677-2699, Taylor & Francis Ltd., May 2019.
The definitive version is available at https://doi.org/10.1080/02664763.2019.1610163
Department(s)
Mathematics and Statistics
Keywords and Phrases
Extrinsic means; infinite-dimensional data; neighborhood hypothesis test; nonparametric bootstrap
International Standard Serial Number (ISSN)
0266-4763; 1360-0532
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2019 Taylor & Francis Ltd., All rights reserved.
Publication Date
01 May 2019