SMPLicit: Topology-aware Generative Model for Clothed People

In this paper we introduce SMPLicit, a novel generative model to jointly represent body pose, shape and clothing geometry. In contrast to existing learning-based approaches that require training specific models for each type of garment, SMPLicit can represent in a unified manner different garment topologies (e.g. from sleeveless tops to hoodies and to open jackets), while controlling other properties like the garment size or tightness/looseness. We show our model to be applicable to a large variety of garments including T-shirts, hoodies, jackets, shorts, pants, skirts, shoes and even hair. The representation flexibility of SMPLicit builds upon an implicit model conditioned on the SMPL human body parameters and a learnable latent space which is semantically interpretable and aligned with the clothing attributes. The proposed model is fully differentiable, allowing for its use in larger end-to-end trainable systems. In the experimental section, we demonstrate that SMPLicit can be readily used for fitting 3D scans and for 3D reconstruction from images of dressed people. In both cases we go beyond the state of the art by retrieving complex garment geometries, handling situations with multiple clothing layers and providing a tool for easy outfit editing. To stimulate further research in this direction, we will make our code and model publicly available at http://www.iri.upc.edu/people/ecorona/smplicit/.
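The core idea of an implicit garment model conditioned on body parameters and a latent clothing code can be illustrated with a minimal sketch. The sketch below is purely hypothetical: the function name, latent-code dimensions and network shape are illustrative assumptions, not the paper's actual architecture; it only shows the general pattern of an MLP mapping a 3D query point, a clothing latent code and SMPL shape parameters to an unsigned distance to the garment surface.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu_layer(x, w, b):
    # single fully-connected layer with ReLU activation
    return np.maximum(x @ w + b, 0.0)

def implicit_garment(p, z, beta, params):
    """Hypothetical sketch: unsigned distance at 3D query points `p`,
    conditioned on a clothing latent code `z` and SMPL shape `beta`."""
    n = p.shape[0]
    x = np.concatenate([p,
                        np.broadcast_to(z, (n, z.shape[0])),
                        np.broadcast_to(beta, (n, beta.shape[0]))], axis=1)
    h = relu_layer(x, params["w1"], params["b1"])
    d = h @ params["w2"] + params["b2"]
    return np.abs(d).squeeze(-1)  # unsigned distance, >= 0

# toy dimensions (illustrative): 3D point + 8-D latent + 10-D SMPL shape
d_in, d_h = 3 + 8 + 10, 32
params = {"w1": rng.standard_normal((d_in, d_h)) * 0.1,
          "b1": np.zeros(d_h),
          "w2": rng.standard_normal((d_h, 1)) * 0.1,
          "b2": np.zeros(1)}

pts = rng.standard_normal((5, 3))   # query points around the body
z = rng.standard_normal(8)          # clothing latent code
beta = np.zeros(10)                 # SMPL shape parameters
dists = implicit_garment(pts, z, beta, params)
```

Because every operation here is differentiable, gradients could flow from the predicted distances back to the latent code and body parameters, which is what makes such a model usable inside larger end-to-end trainable systems.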