Automatic Body Measurement by Neural Networks

Size prediction and garment customization are two main goals of body measurement for garment design. Traditional body measurement, which involves measuring by hand and trying on clothes in person, is time-consuming and not cost-efficient. With the help of 3D body scanners and neural networks, body measurement can be made fast and accurate, reducing the cost. This paper introduces neural network models that predict body sizes and the measurements used to customize clothes from various kinds of body data. Three kinds of input data are used: raw 3D point clouds of human bodies, key body locations, and estimated body measurements. Raw point clouds are collected by scanning the participants' bodies, and key body locations and estimated measurements are computed automatically by existing software from the point clouds. Manual measurements are then taken on the participants to obtain the size labels and the measurements needed for garment customization, which serve as the ground-truth output values. Different network structures are used for the different kinds of input data. The results show that neural networks can achieve decent performance in predicting the measurements needed for making clothes. While the results of the three models are comparable, the feed-forward network model with estimated measurements achieves the best result in numerical measurement prediction. For size label prediction, the models using estimated measurements achieve similar results, while the CNN model with key body locations and the simplified PointNet model applied to raw point clouds are unable to achieve high accuracy. These initial attempts show the potential of neural networks for body measurement. The models can be further improved with a larger amount of data to make them production-ready.
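
To make the best-performing setup concrete, the sketch below shows a minimal feed-forward regressor that maps software-estimated measurements to ground-truth tailoring measurements. It is only an illustration under assumed dimensions: the number of estimated inputs, the number of target measurements, the layer widths, and the training loop are hypothetical, not the exact configuration used in the paper.

```python
# Minimal sketch (assumed dimensions, not the paper's exact model):
# a feed-forward network regressing estimated measurements to
# manually taken ground-truth measurements.
import torch
import torch.nn as nn

class MeasurementRegressor(nn.Module):
    def __init__(self, n_estimated=20, n_targets=10, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_estimated, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_targets),  # predicted measurements (e.g., in cm)
        )

    def forward(self, x):
        return self.net(x)

model = MeasurementRegressor()
loss_fn = nn.MSELoss()  # regression loss on numerical measurements
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One hypothetical training step on a batch of 8 scanned subjects
estimated = torch.randn(8, 20)       # software-estimated measurements (placeholder data)
ground_truth = torch.randn(8, 10)    # manually taken measurements (placeholder data)
optimizer.zero_grad()
loss = loss_fn(model(estimated), ground_truth)
loss.backward()
optimizer.step()
```

For size label prediction, the same backbone could be followed by a classification head trained with a cross-entropy loss; the CNN on key body locations and the simplified PointNet on raw point clouds would replace the fully connected layers with their respective feature extractors.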