Abstract
The youth of today are much more technologically adept than previous generations. This has driven many technological innovations, and these innovations have in turn altered our perception of the world. The film, gaming, and clothing industries have been forced to embrace new technology to satisfy consumer demand for a more realistic virtual experience. These industries all rely on 3D avatars to create virtual depictions of the human body. In the gaming and film industries, the accuracy of the avatar is not critical: the avatars used in computer games and in CGI sequences of films have soft contours, which look visually appealing but are not necessarily true to the human form. By contrast, the apparel industry needs very accurate avatars that represent each person's body shape, so that virtual fitting can achieve realistic, well-fitted garments. This article describes the methods used to build such an avatar and compares draping across the following scenarios: the real scanned body with a real garment, the real scanned body with a virtual garment, and the virtual avatar with a virtual garment. This research helps to show how body shape affects virtual fit.
1. Introduction
The clothing industry has long struggled to create patterns that fit the human body well. In the past, the industry did not have an anthropometric database to aid the development of patterns that would fit the representative body measurements and shapes of the time. With the advent of 3D scanning technology and statistical analysis software, it is now possible to quickly analyze body measurements and shapes by ethnic group, age, gender, continent, country, and even county, and to rapidly extract representative body measurements and shapes from these data. Therefore, in the present day, the effort needed to keep body measurement data up to date for the apparel industry is significantly lower. Many clothing companies are switching from physical mannequins to virtual avatars, since virtual technology offers more flexibility to quickly update the avatar with new measurements and reduces production costs. Recent advancements in avatar technology have led to the development of “soft” avatars, which simulate the compressibility of human body tissue such as breasts and fat pockets [1]. This new technology helps in the development of compression clothing, such as sportswear. It could also have applications in the medical industry, for example, bespoke bras for women who have had breast cancer, or knitted sleeves that reduce the appearance of collagen scars caused by burns. By using computer-generated scan data together with the soft avatar model, it is possible to design a sleeve that applies the correct amount of pressure based on the severity of the burn [2].
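Reference [2] and related work on compression textiles commonly size such products using Laplace's law, which relates the interface pressure to the fabric tension and the radius of the body part. The article itself does not spell out this relationship, so the following is only an illustrative sketch with assumed example values.

```python
# Illustrative only: Laplace's law for a roughly cylindrical limb.
# P = T / r, with T the fabric tension per unit length (N/m) and r the limb radius (m).
MMHG_PER_PA = 1 / 133.322  # pascals to millimetres of mercury

def interface_pressure_mmhg(tension_n_per_m: float, limb_radius_m: float) -> float:
    """Interface pressure predicted by Laplace's law (simplified model)."""
    return (tension_n_per_m / limb_radius_m) * MMHG_PER_PA

# Example (assumed values): 100 N/m of tension on a limb of 4 cm radius
# gives 100 / 0.04 = 2500 Pa, i.e. roughly 18.8 mmHg.
print(round(interface_pressure_mmhg(100, 0.04), 1))
```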
The apparel industry currently works with avatars that have smooth contours, much like those used in the gaming and film industries. These avatars may be visually appealing, but they are not necessarily true to the real human form, and this causes issues when such a virtual 3D body is used to design accurate clothing patterns. Online clothing companies use smooth avatars so that customers can visualize how they will look in a particular garment. The smoothness of the avatar tends to make the garment look more flattering than it is in reality, giving the customer a misleading expectation of how they will look in the real garment. This could be a factor in return rates, as disappointed customers return garments that do not fit as well as they did on the online avatar.
![Figure 1. Example of medical compression sleeves to treat burn scars [3].](/document/doi/10.2478/aut-2021-0015/asset/graphic/j_aut-2021-0015_fig_001.jpg)
Example of medical compression sleeves to treat burn scars [3].
Many online retailers now offer a personalized fit option, where the customer supplies information including age, height, bust, waist, and hip measurements. These data help the software to generate a smooth, symmetrical avatar. This, however, does not cater to people with disabilities or to older people whose posture is affected; a smooth, symmetrical avatar is unsuitable because it cannot correctly simulate the fit of clothes in these cases. In the future, it should be possible for disabled people to use the avatar software to depict the real shape of their body and accurately assess the fit of clothing, instead of using an automatically generated avatar created from a database.
Research has been undertaken on clothing creation for the disabled body. That study proposed a new technique for fitting 3D garments on a disabled body, which allows the fit of the garment to be easily adjusted. It used morphological contours of the virtual body to manage the position of the garment by means of ease points and lines. This method requires considerable skill and experience in the placement of XY reference planes. The authors suggested that further work should focus on the creation of a parametric mannequin, controlled by the shape of the spine, to account for atypical body shapes [4].
First, the body was scanned with a 3D laser scanner in several shots, with the person maintaining the same posture (Figure 2a–d). Then, using Rapidform software, the separate shots were realigned (Figure 2e), rotated, and merged to obtain a complete scanned body (Figure 2f) [4].
![Figure 2. Procedure to measure an atypical body by using a 3D body scanner (a, b, c, d) and Rapidform software (e, f) [4].](/document/doi/10.2478/aut-2021-0015/asset/graphic/j_aut-2021-0015_fig_002.jpg)
Procedure to measure an atypical body by using a 3D body scanner (a, b, c, d) and Rapidform software (e, f) [4].
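The realignment and merging step shown in Figure 2e and 2f is, in essence, rigid registration of partial scans. The sketch below illustrates the underlying idea with a minimal iterative-closest-point loop in NumPy/SciPy; it is not the Rapidform algorithm, and the point arrays, choice of reference scan, and iteration count are assumptions made purely for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src points onto dst (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # cross-covariance of the centred point sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, iterations=50):
    """Iteratively align an (N, 3) source scan to a target scan via nearest-neighbour matching."""
    tree = cKDTree(target)
    aligned = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(aligned)           # closest target point for every source point
        R, t = best_rigid_transform(aligned, target[idx])
        aligned = aligned @ R.T + t
    return aligned

# Hypothetical usage: align each partial shot to a chosen reference shot, then merge.
# merged = np.vstack([reference_scan] + [icp(shot, reference_scan) for shot in other_shots])
```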
The human body is not perfectly symmetrical. When the body is scanned using a 3D scanner such as TC2, independent measurements can be obtained for the left and right sides of the body. These data can be used to create very accurate patterns tailored to that person. When using 3D virtual fitting software, the avatar measurements can be adjusted, but the avatar is always symmetrical. Another issue with 3D virtual fitting software is that only a finite number of key measurements are used to define the overall body shape, and all other measurements are estimated from these. When the body is scanned using a 3D scanner, the entire body is digitally mapped; therefore, all measurements are accurate.
The body measurements obtained from TC2 and Size Stream scanners are very accurate; however, when the noise is removed from the avatar (by reducing polygons), the scanner software produces a smooth avatar that does not accurately match the body that was scanned. Therefore, the scanner-generated avatar cannot be relied on for accurate fitting of clothing [5].
![Figure 3. Real 3D body scans image and avatar generated by scan software [5].](/document/doi/10.2478/aut-2021-0015/asset/graphic/j_aut-2021-0015_fig_003.jpg)
Real 3D body scans image and avatar generated by scan software [5].
2. Limitations of virtual fitting software avatars
Figure 4 shows an original scan taken from the Size Stream body scanner. The body scan image contains a lot of noise, resulting in an “orange peel” effect, where there are many bumps on the avatar surface. In addition, it is clearly visible that the insides of the upper thighs are joined, and that the insides of the upper arms and the torso are also joined. People who are scanned are usually wearing underwear that, if too tight, causes fat pockets to form around the waist and/or breast area.

Figure 4. The body scan of a woman aged between 45 and 60.
Current 3D virtual fitting software developers rely on a generic avatar to represent the human form. An algorithm takes the measurements obtained from the scan and manipulates the shape of the generic avatar to fit these measurements. The issue with this technique is that the real body shape, with all its intricacies, is not accurately captured. This is especially true for older people, as the generic avatar appears to be based on a younger person's body with smooth contours, firm features, and a straight posture. The avatar generated by current 3D virtual fitting software is perfectly symmetrical; however, this is not the case for the real human body. For people with highly asymmetrical bodies, real body scans could be used to generate bespoke clothes patterns with a far superior fit compared to the symmetrical patterns created by a generic avatar.
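Commercial fitting software does not document how its generic avatar is morphed to the supplied measurements, but the general idea of driving a fixed template from a few key girths can be illustrated as below. This is a deliberately simplified sketch (per-height radial scaling of an assumed z-up vertex array, with illustrative heights and girths); real parametric avatars rely on far richer statistical body models, which is precisely why they remain smooth and symmetrical.

```python
import numpy as np
from scipy.spatial import ConvexHull

def girth_at(verts, height, band=0.01):
    """Approximate girth of a z-up vertex array at a given height (metres) from the
    convex-hull perimeter of a thin horizontal slice (assumes arms are away from the torso)."""
    slice_xy = verts[np.abs(verts[:, 2] - height) < band][:, :2]
    ring = slice_xy[ConvexHull(slice_xy).vertices]            # boundary points in CCW order
    return np.sum(np.linalg.norm(np.roll(ring, -1, axis=0) - ring, axis=1))

def morph_to_measurements(verts, targets):
    """targets maps key heights (e.g. bust, waist, hip) to target girths in metres.
    Each vertex is scaled radially about the vertical body axis by a factor interpolated
    between the key heights, so the template's girths match the supplied measurements."""
    heights = np.array(sorted(targets))
    factors = np.array([targets[h] / girth_at(verts, h) for h in heights])
    axis_xy = verts[:, :2].mean(axis=0)                       # rough vertical body axis
    f = np.interp(verts[:, 2], heights, factors)              # per-vertex scale factor
    morphed = verts.copy()
    morphed[:, :2] = axis_xy + (verts[:, :2] - axis_xy) * f[:, None]
    return morphed

# Hypothetical usage with a generic template mesh (heights and girths are invented):
# new_verts = morph_to_measurements(template_verts, {1.30: 1.02, 1.02: 0.84, 0.88: 1.06})
```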
The apparel industry is rapidly adopting virtual fitting in the development of clothing. The challenge for virtual fitting is to replicate the quality of fit achieved during real fitting on a real person. To achieve this, it is important for the virtual avatar to be as lifelike as possible, which means it should have all the real contours and asymmetry of the real human body [8].
Recent developments have made it possible for 3D avatars to simulate human movement and posture. However, a current limitation of the software makes it impossible to apply the simulation features available for 3D avatars to the raw-data body-scan avatar. This means that the real body scans need to be manipulated and converted into a format that the software understands before movements and the necessary postures can be applied [6].
Figure 5 shows Optitex avatars in different poses. These poses are predefined within the software and can only be used with the Optitex generic avatar.

Figure 5. Optitex avatars in various poses.
Unfortunately, Optitex and other fitting software do not allow the user to assign a pose to a raw-data, body-scan avatar; instead, the user must use separate software, such as DAZ Studio, to add animation to the avatar [7, 8, 9].
3. Methodology for creating a realistic custom avatar
The initial aim is to create an avatar which will represent the exact shape of the human subject. To achieve this, the subject was scanned using the TC2 scanner to create an accurate representative avatar. Figure 6 elucidates the process involved in the creation of the avatar.

Figure 6. Scheme of the process of building a realistic representative avatar.
The custom avatar was created for a size 16 woman. The initial raw-scan-data avatar was loaded into Geomagic Design X. This software was used to clean up the wireframe mesh between the upper arms and torso and between the inner thighs. Certain details of the body, such as hair and underwear, were edited out by deleting the unwanted polygons from the mesh and then creating ‘bridges,’ which were then used to fill in the leftover holes. Figure 7 shows a section that was deleted from the head of the avatar mesh.

Figure 7. The section deleted from the avatar mesh using Rapidform XOR3.

Figure 8. Raw scan data avatar after editing using Rapidform XOR3.
Once all the unwanted areas have been removed and the mesh rebuilt, any remaining holes in the mesh need to be filled to close it. The decimate function is then used to reduce the number of polygon faces, which in turn lessens the “orange peel” effect and smooths out the surface of the avatar.
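The interactive clean-up described above was performed in Geomagic Design X / Rapidform XOR3. A rough programmatic equivalent is sketched below using the open-source Open3D library instead (an assumption; it is not the software used in this work), with illustrative file names and an illustrative selection region.

```python
import numpy as np
import open3d as o3d

# Load the raw body-scan mesh (file name is illustrative).
mesh = o3d.io.read_triangle_mesh("raw_scan.obj")

# Delete unwanted geometry (e.g. hair above the scalp line); the region below is
# purely illustrative, whereas the paper selects the unwanted polygons interactively.
verts = np.asarray(mesh.vertices)
unwanted = verts[:, 2] > 1.62                      # everything above an assumed height in metres
mesh.remove_vertices_by_mask(unwanted)

# In the paper, the resulting holes are closed with manually built 'bridges'.
# Recent Open3D versions offer an automatic alternative via the tensor API:
# mesh = o3d.t.geometry.TriangleMesh.from_legacy(mesh).fill_holes().to_legacy()

# Decimate to reduce the polygon count, then smooth to soften the "orange peel" noise.
mesh = mesh.simplify_quadric_decimation(target_number_of_triangles=40_000)
mesh = mesh.filter_smooth_taubin(number_of_iterations=10)
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("cleaned_scan.obj", mesh)
```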
The hands, feet, face, and ears are rendered poorly on the Size Stream raw-data-scan avatar. When participants are scanned, they are required to hold onto bars on either side of the scanner to keep their posture straight. As a result, the hands are clenched on the raw body scan and the fingers cannot be seen. In addition, the feet on the raw body scan lack detail owing to the limited resolution of the scanner cameras, which are unable to resolve small features such as the toes.
These features need to be reconstructed using wrapping software such as Wrap 3.4.
Figure 9 shows the hands and feet of the avatar before and after they were reconstructed using the Wrap 3.4 software.

Figure 9. Hands and feet reconstructed using Wrap 3.4.
To reconstruct the head, feet, and hands, the raw-data avatar needs to be loaded into Wrap 3.4. The software has a built-in gallery of template bodies, hands, heads, faces, and so on. For this experiment, the female body template was used in conjunction with the raw body-scan avatar. Using the wrap function, these two avatars were combined to form a custom avatar with a realistic face, hands, and feet, as shown in Figure 10.

Figure 10. Custom avatar generated using the wrap function in Wrap 3.4.

Figure 11. Body scan using the Size Stream scanner and virtual fitting simulation.
It is also possible to add skin texture and hair using Wrap 3.4, if needed. This was not required for this research, since the avatar was created only to aid clothing fitting and is not intended to provide a cosmetic appearance.
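The wrap function itself is proprietary to Wrap 3.4, but its principle, deforming a clean template mesh (with proper fingers, toes, and facial features) until it takes on the shape of the raw scan, can be illustrated with a very small non-rigid registration loop. The sketch below is a conceptual approximation only, not the algorithm used by Wrap 3.4; the pull/smooth weights and iteration count are arbitrary assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def wrap_template(template_verts, template_neighbors, scan_points,
                  iterations=30, pull=0.5, smooth=0.5):
    """Alternately pull every template vertex toward its nearest scan point (data term)
    and relax it toward the average of its mesh neighbours (smoothness term), so the
    template keeps its clean topology while adopting the scanned body shape.
    template_neighbors[i] is the list of vertex indices adjacent to vertex i."""
    tree = cKDTree(scan_points)
    v = template_verts.copy()
    for _ in range(iterations):
        _, idx = tree.query(v)
        v = v + pull * (scan_points[idx] - v)                          # move toward the scan
        lap = np.array([v[nbrs].mean(axis=0) for nbrs in template_neighbors])
        v = v + smooth * (lap - v)                                     # keep the surface smooth
    return v

# Hypothetical usage: wrapped = wrap_template(template_verts, neighbors, scan_vertices)
```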
4. Real fitting versus virtual fitting using the actual human body
An experiment was undertaken to determine how the body shape and body asymmetry affect real and virtual draping. A female aged 50 was scanned wearing a basic dress and then scanned again in the same standing position without a dress. The mechanical properties (bending and stiffness) of the material were tested and input into the virtual fitting software.
To determine the difference between real and virtual draping, both the real-scan-with-real-dress and the real-body-with-virtual-dress avatars were sliced vertically through the centre, first from the front and then from the side. The gap between the garment and the body was measured at five locations, starting from the waist and moving down the body in 10 cm steps (Figure 12).

Figure 12. Garment gap measurements.
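In this work the gap measurements were taken manually from the sliced scans, but the same procedure can be expressed programmatically. The sketch below assumes the trimesh library, metric units, a z-up coordinate system with the y-axis pointing toward the front of the body, and illustrative file names and waist height; none of these details come from the article.

```python
import numpy as np
import trimesh

def contour_points(mesh, plane_origin, plane_normal):
    """Cut the mesh with a plane and return the 3D points of the resulting contour."""
    section = mesh.section(plane_origin=plane_origin, plane_normal=plane_normal)
    return np.asarray(section.vertices)

def gap_at_height(body_pts, garment_pts, height, front=True, tol=0.005):
    """Horizontal garment-body gap at a given height on the front (max y) or back (min y)
    side of a side-view slice, i.e. the kind of quantity reported in Table 1 (here in metres)."""
    pick = max if front else min
    body_y = pick(p[1] for p in body_pts if abs(p[2] - height) < tol)
    garment_y = pick(p[1] for p in garment_pts if abs(p[2] - height) < tol)
    return abs(garment_y - body_y)

# Hypothetical usage: slice both meshes with the vertical plane x = 0 (side view) and
# measure the front gap at the waist and then every 10 cm down the body.
# body, dress = trimesh.load("body_scan.obj"), trimesh.load("body_with_dress.obj")
# body_c = contour_points(body, [0, 0, 0], [1, 0, 0])
# dress_c = contour_points(dress, [0, 0, 0], [1, 0, 0])
# waist_z = 1.02                                     # assumed waist height in metres
# gaps = [gap_at_height(body_c, dress_c, waist_z - 0.10 * i) for i in range(5)]
```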
For each garment–body gap measurement, a delta was calculated as the difference between the two measurements being compared, for example, the body scan measurement minus the corresponding 3D virtual fitting measurement. The results are summarized in Table 1.
Table 1. Body scan garment gap measurements versus 3D virtual fitting garment gap measurements

| View | Measurement position | Body scan (mm) | 3D virtual fitting on body scan (mm) | Delta (mm) | Body scan (mm) | 3D virtual fitting on Browzwear avatar (mm) | Delta (mm) | 3D virtual fitting on body scan (mm) | 3D virtual fitting on Browzwear avatar (mm) | Delta (mm) |
|---|---|---|---|---|---|---|---|---|---|---|
| Front view | Waist left side | 16.35 | 13.2 | 3.15 | 16.35 | 8.35 | 8 | 13.2 | 8.35 | 4.85 |
| Front view | Waist right side | 8.8 | 13.3 | −4.5 | 8.8 | 8.2 | 0.6 | 13.3 | 8.2 | 5.1 |
| Front view | Hip left side | 18.65 | 4.05 | 14.6 | 18.65 | 12.4 | 6.25 | 4.05 | 12.4 | −8.35 |
| Front view | Hip right side | 10 | 5 | 5 | 10 | 11.85 | −1.85 | 5 | 11.85 | −6.85 |
| Front view | Low hip left side | 13.2 | 2.1 | 11.1 | 13.2 | 13.7 | −0.5 | 2.1 | 13.7 | −11.6 |
| Front view | Low hip right side | 2.35 | 1 | 1.35 | 2.35 | 12.6 | −10.25 | 1 | 12.6 | −11.6 |
| Front view | Thigh left side | 9.5 | 3.75 | 5.75 | 9.5 | 10 | −0.5 | 3.75 | 10 | −6.25 |
| Front view | Thigh right side | 8.05 | 10.8 | −2.75 | 8.05 | 8.05 | 0 | 10.8 | 8.05 | 2.75 |
| Front view | Low thigh left side | 28.25 | 25 | 3.25 | 28.25 | 13.75 | 14.5 | 25 | 13.75 | 11.25 |
| Front view | Low thigh right side | 21.4 | 33.75 | −12.35 | 21.4 | 11.2 | 10.2 | 33.75 | 11.2 | 22.55 |
| Side view | Waist front | 8.4 | 0 | 8.4 | 8.4 | 17.05 | −8.65 | 0 | 17.05 | −17.05 |
| Side view | Waist back | 12.75 | 58.5 | −45.75 | 12.75 | 10.8 | 1.95 | 58.5 | 10.8 | 47.7 |
| Side view | Hip front | 2.85 | 0 | 2.85 | 2.85 | 0 | 2.85 | 0 | 0 | 0 |
| Side view | Hip back | 16 | 17.85 | −1.85 | 16 | 0 | 16 | 17.85 | 0 | 17.85 |
| Side view | Low hip front | 35.75 | 46.7 | −10.95 | 35.75 | 30.9 | 4.85 | 46.7 | 30.9 | 15.8 |
| Side view | Low hip back | 2.35 | 3.75 | −1.4 | 2.35 | 4.05 | −1.7 | 3.75 | 4.05 | −0.3 |
| Side view | Thigh front | 26.85 | 40 | −13.15 | 26.85 | 25.35 | 1.5 | 40 | 25.35 | 14.65 |
| Side view | Thigh back | 50 | 58.15 | −8.15 | 50 | 45 | 5 | 58.15 | 45 | 13.15 |
| Side view | Low thigh front | 28.7 | 36.95 | −8.25 | 28.7 | 40 | −11.3 | 36.95 | 40 | −3.05 |
| Side view | Low thigh back | 62.6 | 50.95 | 11.65 | 62.6 | 45 | 17.6 | 50.95 | 45 | 5.95 |
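As a concrete reading of Table 1, the snippet below reproduces the three delta values of the front-view “waist left side” row; each delta is simply the first measurement of the pair minus the second.

```python
# Values from Table 1, front view, waist left side (all in mm).
body_scan_gap = 16.35        # real scanned body wearing the real dress
virtual_on_scan = 13.20      # virtual dress draped on the real body scan
virtual_on_avatar = 8.35     # virtual dress draped on the Browzwear avatar

print(round(body_scan_gap - virtual_on_scan, 2))      # 3.15 mm (real fitting vs virtual fitting on the scan)
print(round(body_scan_gap - virtual_on_avatar, 2))    # 8.0 mm  (real fitting vs virtual fitting on the avatar)
print(round(virtual_on_scan - virtual_on_avatar, 2))  # 4.85 mm (virtual on scan vs virtual on avatar)
```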
A small value for delta indicates that the real and virtual fittings are similar at that location on the avatar. Table 1 shows significant delta values at several locations, from which it can be inferred that there is a marked difference between real-life fitting and virtual fitting. The high delta values obtained in this experiment suggest that the CAD/CAM system does not accurately simulate draping when compared to real draping; in particular, there is a −45.75 mm delta in the waist-back gap measurement.
5. Conclusion
It is now possible to achieve a realistic avatar with real body shape contours, including the face, hands, and feet. This is a step closer to achieving virtual fitting that emulates real-world fitting on the real human body.
The apparel industry is becoming more reliant on virtual fitting software such as Browzwear, Optitex, and Lectra. The end user of such software should not be limited to the inbuilt generic avatar that comes with the software; the software package should also allow the user to load their own custom body-shape avatar. Once the raw data are loaded, the software should be able to automatically create a true-to-life avatar with a realistic head, face, hands, and feet, as well as, more importantly, real body shape contours such as fat pockets and instances of body asymmetry.
Comparing the real scanned body with a real dress to the real scanned body with a virtual dress, the average delta between them is lower than the average delta between the real scanned body with the virtual dress and the virtual avatar with the virtual dress. The comparison of the real scanned body with the real dress against the real scanned body with the virtual dress shows that the draping algorithm is not entirely accurate, especially around the lumbar region of the back. The software is capable of simulating draping to an extent, but it is still not completely accurate.

Material mechanical properties such as bending and stiffness can have a dramatic effect on the draping characteristics of a garment. The results obtained in this article suggest that fabric mechanical properties are either ignored or not correctly factored into the algorithms used to simulate cloth draping within current CAD/CAM software packages. To accurately simulate cloth draping, and ultimately improve the quality of fit of virtually fitted garments, the effects of material mechanical properties need to be correctly calculated and incorporated into the cloth simulation algorithm.

However, the largest contributing factor to the error of virtual fitting appears to be the inaccuracy of the shape of the software-created avatar. Figure 12 shows a large discrepancy between the real-shape and virtual-shape avatars, especially around the buttocks. The inaccuracy of the virtual avatar shape, combined with an inexact draping algorithm, will cause inaccuracies in the fitting of virtual clothing, and this inaccuracy could be severe enough to cause a poor fit of the clothing production sample.
References
[1] Zakaria, N., Gupta, D. Anthropometry, Apparel Sizing and Design. Second edition. ISBN 978-0-08-102605-2.
[2] Ilska, A., Kowalski, K., Klonowa, M., Kowalski, T. M., Sujka, W. (2016). Issues regarding the design of textile compression products for small body circumferences. Fibres & Textiles in Eastern Europe, 24, 116–120. doi: 10.5604/12303666.1221745.
[3] https://www.centrumflebologii.pl/pl/wyroby-uciskowe/odziez-plaskodziana.
[4] Bruniaux, P., Cichocka, A., Frydrych, I. (2016). 3D digital methods of clothing creation for disabled people. Fibres & Textiles in Eastern Europe, 5, 125–131. doi: 10.5604/12303666.1215537.
[5] Balach, M., Cichocka, A., Frydrych, I., Kinsella, M. (2020). Initial investigation into real 3D body scanning versus avatars for the virtual fitting of garments. Autex Research Journal, 20, 128–132. doi: 10.2478/aut-2019-0037.
[6] Guan, P., Reiss, L., Hirshberg, D. A., Weiss, A., Black, M. J. (2012). DRAPE: Dressing any person. ACM Transactions on Graphics (TOG), 31(4), 1–10. doi: 10.1145/2185520.2185531.
[7] Lin, S. H., Johnson, R. R., Kang, J. Y. (2018). Fitting simulation evaluation on personalized avatars. Journal of Textile Engineering and Fashion Technology, 4(2), 123–128.
[8] Bonetti, F., Warnaby, G., Quinn, L. (2017). Augmented reality and virtual reality in physical and online retailing: A review, synthesis and research agenda. In: Augmented Reality and Virtual Reality, pp. 119–132.
[9] Suh, K. S., Kim, H., Suh, E. K. (2011). What if your avatar looks like you? Dual-congruity perspective for avatar use. MIS Quarterly, 35(3), 711–729.
© 2021 Monika Balach et al., published by De Gruyter
This work is licensed under the Creative Commons Attribution 4.0 International License.