Towards 3D Human Shape Recovery Under Clothing

We present a learning-based method for estimating clothing fitness and recovering the underlying human body shape from clothed 3D scans.

Our approach maps clothed human geometry to a geometry image (clothed-GI). To achieve reliable alignment across different clothing types, we extend the parametric human model with skeleton detection and warping. For each pixel on the clothed-GI, we extract a multi-channel feature vector (color, position, normal) and train a modified conditional GAN for per-pixel fitness prediction on a comprehensive 3D clothing dataset.
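The per-pixel feature extraction above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel order (color, position, normal) and the geometry-image resolution are assumptions, and the `assemble_clothed_gi` helper is hypothetical.

```python
import numpy as np

def assemble_clothed_gi(color, position, normal):
    """Stack per-pixel color, position, and normal maps into a
    multi-channel clothed-GI feature image of shape (H, W, 9).

    The (color, position, normal) channel ordering is an assumption;
    the paper only states that these three cues are combined per pixel.
    """
    for name, img in (("color", color), ("position", position), ("normal", normal)):
        if img.shape[-1] != 3:
            raise ValueError(f"{name} map must have 3 channels, got {img.shape}")
    # Concatenate along the channel axis to form the feature vector per pixel.
    return np.concatenate([color, position, normal], axis=-1)

# Toy example: a 4x4 geometry image with random per-pixel attributes.
H, W = 4, 4
color = np.random.rand(H, W, 3)     # RGB appearance
position = np.random.rand(H, W, 3)  # 3D surface position
normal = np.random.rand(H, W, 3)    # surface normal
feat = assemble_clothed_gi(color, position, normal)
print(feat.shape)  # (4, 4, 9)
```

A tensor of this shape would then serve as the conditional input to the per-pixel fitness predictor.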

Our technique significantly improves body shape prediction accuracy, for both loose-fitting and tight-fitting clothing. We further demonstrate applications in human/clothing segmentation and virtual try-on with high visual fidelity.

Paper · Revised version