CC BY 4.0 license | Open Access | Published by De Gruyter, June 29, 2023

Automatic Identification of Wrist Position in a Virtual Environment for Garment Design

  • Yu Chen, Qinhua Hou, Yan Hong and Weihong Gao
From the journal AUTEX Research Journal

Abstract

With the development of virtual reality, computer-aided design has shown its strength in the garment product development process. The wrist position corresponds to the styloid process of the ulna (SPU). Because the SPU is usually not located at the thinnest part of the forearm, identifying the wrist position correctly is important in 3D clothing design. This paper proposes a method that uses a fixed-step search algorithm, built on existing proportion methods, to determine the position of the SPU. The accuracy and efficiency of the proposed method have been validated on 100 samples by comparison with existing methods. The method can be fully applied to a virtual 3D-to-2D garment prototyping process and supports the automation of this process.

1 Introduction

Virtual reality (VR) has shown its strength in the fashion industry, especially when applied to garment product development [19]. In general, there are three main applications of this technology in the fashion industry: (1) virtual 3D-to-2D garment prototyping [5, 8, 27, 43], (2) virtual 3D try-on [12, 23, 28], and (3) virtual fashion show demonstration [14]. Applying VR to the fashion industry leads to more sustainable design processes, because it reduces costs in the development process and allows fast validation of design ideas, which increases the success rate of fashion products and reduces stocks.

Against this background, virtual 3D-to-2D garment prototyping has recently attracted more and more attention from researchers [11, 15, 16, 29, 38]. It aims at developing 2D garment patterns based on a 3D garment prototype generated in a virtual environment [1, 9, 13]. The 3D-to-2D prototyping can be performed entirely in a virtual environment and involves customers in the design process [3, 12]. It has been shown that this process has great potential to become fully automatic [15].

However, one of the major obstacles to its automation is that key human body feature points cannot yet be identified automatically [6, 25, 40]. These feature points provide a reference both for 3D prototyping and for evaluating the design effect in a virtual fashion show demonstration [24, 25, 41]. Automatic identification of key human body feature points can contribute greatly to the automation of the whole garment design process in two ways [34, 39]: (1) the whole 3D-to-2D prototyping process can be made fully automatic, and (2) individualized customization can be fully automated based on existing 3D scanning technology and automatic manufacturing systems [36, 42].

Automatic identification of human body feature points is a systematic task [33], because different feature points are defined in different ways [20]. Most researchers in this area began by analyzing the wrist position identified from the SPU [22, 32]. The wrist positions identified from the SPU are feature positions on the human body that are widely used by fashion designers in the garment product design process, especially for individualized garment design [18]. The relationship between the wrist positions identified from the SPU and the simulated garment surface in a virtual environment [4] is shown in Figure 1. These wrist positions determine the appropriate sleeve length and the cuff circumference of some categories of garments, such as shirts, coats, jackets, and wet suits [17, 30]. They are also the basis for determining the correct wearing position of individualized wearable equipment, such as watches and wristbands [2, 7, 21]. In addition, they are used to locate wrist joints in the skeletons produced by 3D virtual human skeleton generation systems.

Figure 1. The relationship between the wrist positions identified with styloid process of the ulna (SPU) and the garment surface.

Based on the literature review, there are mainly three methods for locating the wrist position from the SPU. The first is based on the proportions of the human body (Method 1) [10]. In this method, researchers use statistical analysis of body proportions to calculate the position of the SPU. For example, statistical studies based on Method 1 have concluded that the SPU lies at about 80% of the length of the entire arm. Clearly, this method is not precise enough for the fashion industry, since the variety of human body morphologies is not well considered.

The second method defines the desired wrist area for fashion design as the part of the forearm with the smallest circumference (Method 2) [34, 35]. Researchers first create a set of equidistant planes perpendicular to the central axis of the forearm in the virtual space. This set of planes is then used to cut the human forearm in the virtual environment, yielding a set of cross-sections of the wrist surface. By calculating the circumference of each cross-section, the desired SPU is determined as the cross-section with the smallest circumference. However, this method is imprecise because its definition of the SPU differs from what is required; in practice, the obtained SPU is often close to the palm.

The third method is based on image analysis of pictures of the forearm taken at different angles (Method 3) [26, 31, 37]. Its general principle is to locate the SPU in the image, so the method relies heavily on the quality of the forearm pictures. In the real identification process, however, it is difficult to capture the carpal bones at an appropriate angle, which introduces considerable uncertainty.

In this research, we propose an automatic identification method for the SPU that can be performed entirely in a virtual environment. The method is developed from a combination of Method 1 and Method 2. It starts by identifying a rough wrist position (RPWP) following the idea of Method 1. It then defines a target search area of about 7.5 cm on the forearm, centered on the RPWP. Within this area, a set of equidistant planes perpendicular to the central axis of the forearm is created in the virtual space, following Method 2, and used to cut the arm to obtain a set of cross-sections of the wrist surface. By calculating the circumferences of these cross-sections, the desired wrist position is determined from the cross-section with the smallest circumference.
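The two-stage idea can be sketched in a few lines. This is an illustrative sketch rather than the authors' implementation: it assumes cross-section girths have already been sampled along the arm's central axis, and the function name and array layout are our own.

```python
import numpy as np

def rough_wrist_search(axis_positions, girths, arm_length):
    """Stage 1 (Method 1): the rough wrist position sits at 80% of the
    full arm length from the shoulder.  Stage 2 (Method 2): within a
    search window spanning 20% of the arm length around that rough
    position, take the cross-section with the smallest girth.

    axis_positions : distances (cm) from the shoulder along the arm's
                     central axis, one entry per sampled cross-section.
    girths         : matching cross-section circumferences (cm).
    """
    rough = 0.8 * arm_length                    # Method 1 proportion
    half_window = 0.1 * arm_length              # 20% window, centred
    inside = np.abs(axis_positions - rough) <= half_window
    candidates = np.where(inside)[0]
    best = candidates[np.argmin(girths[candidates])]
    return axis_positions[best], girths[best]
```

The later fixed-step checks described in Section 3 refine this result further; this sketch covers only the initial minimal-girth search.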

The novelties of the proposed automatic identification method of the SPU include the following aspects: (1) the proposed method builds on existing methods, whose advantages are fully exploited to ensure precision; (2) the proposed method can be realized fully automatically, avoiding manual involvement; and (3) the proposed method will benefit fashion intelligence for automatic fashion product and fashion accessory development.

The rest of the paper is structured as follows. In Section 2, the overall working scheme and related methods of the proposed automatic identification method of the SPU are provided; Section 3 presents different case implementations to explain the working process of the proposed system; Section 4 discusses the related data and investigates the precision of the proposed method. Comparisons between the proposed method and other existing methods are also provided. Finally, a conclusion is provided in Section 5.

2 Literature Review

2.1 Review of wrist position identification methods

As presented in the introduction, three methods for locating the SPU can be found in the existing literature. To make the problem clear, this section first gives a brief introduction to each of them.

2.1.1 Method 1: locating the SPU based on the proportion of the wrist [34, 24]

In this method, researchers first collected a certain number of human body samples using 3D scanning technology. Second, the SPUs of these samples were manually marked by experts (professional fashion researchers, fashion designers, and pattern makers) in the virtual environment. Third, the researchers calculated the proportional position of the SPU on each arm. Through statistical analysis, they took the average of these proportions as a general value for the position of the SPU on the human arm.

The problem is that the result of this method is based on the average proportion over the selected samples; the variety of human body morphologies is not considered. Consequently, errors always occur in real applications. The result provides a reference for the SPU position, but in most cases it cannot be used directly as an accurate result.

In conclusion, this method could not be widely used in the fashion industry due to its lack of precision.

2.1.2 Method 2: locating the SPU based on the smallest circumference

This method (Method 2) defines the desired wrist area for fashion design as the part of the forearm with the smallest circumference. To detect the smallest circumference, researchers first create a set of equidistant planes perpendicular to the central axis of the forearm. This set of planes is then used to cut the forearm, yielding a set of cross-sections of the wrist surface. By calculating the circumference of each cross-section, the desired SPU is determined as the cross-section with the smallest circumference. The general process of this method is shown in Figure 2.
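The girth comparison at the heart of Method 2 reduces to computing the perimeter of each cross-section curve. A minimal sketch, assuming each cross-section is available as an ordered loop of 3D boundary vertices (the function name is ours):

```python
import numpy as np

def cross_section_girth(loop_vertices):
    """Perimeter of a closed cross-section, given its boundary vertices
    in order as an (n, 3) array; the last vertex connects back to the
    first to close the loop."""
    v = np.asarray(loop_vertices, dtype=float)
    edges = np.roll(v, -1, axis=0) - v   # consecutive edge vectors, wrapped
    return float(np.linalg.norm(edges, axis=1).sum())
```

Method 2 then simply ranks the girths returned for each cutting plane and keeps the smallest.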

Figure 2. (a) The result of the smallest circumference (marked with green) obtained by Method 2, and (b) examples of the circumferences detected by different cut planes [36, 37].

This method adapts better to diverse human body morphologies, but its precision is still insufficient. First, its definition of the SPU differs from what is desired. Second, the distances between the cutting planes are fixed, so the true smallest circumference may fall between two planes and be missed.

2.1.3 Method 3: locating the SPU based on image analysis

This third method (Method 3) is based on image analysis. Pictures of the forearm at different angles are captured first; then, through an image analysis process, the desired SPU is identified. The general working process of Method 3 is shown in Figure 3. Since the method relies on 2D pictures of the forearm, and it is not easy to obtain such pictures at good angles in real applications, the precision of the final result cannot be guaranteed.

Figure 3. (a) the real human body, (b) scanned 3D human body, (c) the captured human body silhouette and related human body feature points, and (d) the captured feature points of the wrist.

2.2 Definition of wrist position

In this research, the research subject is the wrist position on the arm. To explain the subject clearly, an unambiguous definition of the wrist position is essential. As explained above, there are two different definitions: (1) the curve on the forearm with minimal circumference, and (2) the wrist position defined by the SPU.

This research is aimed at the development of garment products such as shirts, coats, and jackets, as well as individualized wearable equipment (watches and wristbands). The wrist position defined by the SPU is more meaningful here, because this position is able to:

  1. Provide a reference for the development of fashion mannequins. The position is regarded as the end of the mannequin's arm, which determines the arm length.

  2. Provide a reference for the development of garment products. This position directly determines the position of cuffs. It is also relevant for diving suits: a wrong cuff position will greatly affect wearing comfort and waterproofness.

  3. Provide a reference for the development of individualized wearable equipment. This position provides more accurate data about the wearing position of equipment such as watches and wristbands.

In real situations, the wrist position defined as the forearm curve with minimal circumference sometimes cannot provide a reference for product development. Figure 4 shows three cases in which the curve with minimal circumference lies on the forearm but not between the SPU and the palm; such a position is not useful for product development. From these cases, we can see that the minimal-circumference definition of the wrist position is not applicable to all body morphologies.

Figure 4. Different cases of the wrist position defined by the curve on the forearm, which has minimal circumference.

3 Research Method

The proposed method locates the desired SPU automatically with ensured precision. The working flow chart of the proposed automatic identification method of the wrist position identified with the SPU is shown in Figure 5.

Figure 5. Working flow chart of the proposed automatic identification method of the wrist position identified with the SPU.

This method starts with the identification of a rough desired wrist position search area (RDWPSA) using the general principle of Method 1, as discussed above. The process is shown in Figure 6. We first locate a rough wrist position identified with the rough styloid process of the ulna (RSPU) using the proportion value developed in Method 1 (the black plane in Figure 6c), defined as the position at 80% of the whole arm length from the shoulder point. The full arm length is defined as the distance from the shoulder point to the fingertip of the middle finger.

Figure 6. Identification of a rough desired wrist position search area (RDWPSA) based on Method 1.

Then, we define an RDWPSA on the forearm and/or the palm, centered on the RSPU (Figure 6d). The length of the central axis of the RDWPSA is defined as 20% of the full arm length (the red area in Figure 6d). The proportion of 20% is chosen so that the search area stays between the forearm line and the fingertip of the middle finger. In fact, based on an analysis of 200 samples in our research, the length of the central axis of the average search area is around 7.5 cm.

After this process, using the general principle of Method 2, the minimal girth within the RDWPSA can be obtained (see the green circle in Figure 2a). We define this minimal girth of the RDWPSA as "Min." Based on the Min position, we define an area, Area1, on the forearm, extending 2.5 cm from the Min position toward the palm. The value of 2.5 cm is one-third of the length of the central axis, which is around 7.5 cm.

Then, following the working process of Method 2, the maximum girth in Area1 can be located. We define the position in Area1 with maximum girth as Max1 and calculate Max1/Min. If Max1/Min is greater than 1.1, the position of Max1 is on the palm; in this case, the Min position is regarded as the desired SPU, and the process finishes. The threshold of 1.1 was developed through statistical analysis of 200 human body samples.

If Max1/Min is smaller than 1.1, the position of Max1 is on the forearm rather than the palm. In this case, we define Area2, which extends another 2.5 cm toward the palm from the end of Area1. As before, we identify the maximum girth in Area2 as Max2 and calculate Max2/Min. If Max2/Min is greater than 1.15, the position of Max2 is on the palm, which means the position of Max1 is the desired SPU, and the process finishes. The threshold of 1.15 was also developed through statistical analysis of 200 human body samples.

If Max2/Min is smaller than 1.15, the position of Max2 is on the forearm rather than the palm. In this case, as in the previous step, we search another 2.5 cm beyond Area2, defined as Area3, and take the maximum girth of Area3 as Max3. If Max3 is greater than Max2, the position of Max2 is the desired SPU; otherwise, the position of Max3 is regarded as the desired SPU.
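The fixed-step decision cascade above can be written compactly. This is a sketch of the published rules only; the thresholds 1.1 and 1.15 are the values the text derives from 200 samples, and the function name and argument layout are our own:

```python
def select_spu(min_girth, max1, max2=None, max3=None):
    """Return which detected section is taken as the SPU (Scenarios I-IV).

    min_girth : Min, the smallest girth in the RDWPSA (cm).
    max1-max3 : largest girths of Area1, Area2, Area3 (cm); later areas
                are only measured when the earlier ratio tests fail.
    """
    if max1 / min_girth > 1.1:    # Scenario I: Max1 is already on the palm
        return "Min"
    if max2 / min_girth > 1.15:   # Scenario II: Max2 is on the palm
        return "Max1"
    if max3 > max2:               # Scenario III: girth rises past Max2
        return "Max2"
    return "Max3"                 # Scenario IV: girth keeps decreasing
```

Applied to the girths reported in the four case studies of Section 4, this cascade returns Min, Max1, Max2, and Max3 respectively.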

3.1 Related concepts and tools of the proposed system

1 Distance between two points in three dimensions

Let M1 = (l1, m1, u1) and M2 = (l2, m2, u2) be two points in a virtual space. The distance between them is:

(1) d(M1, M2) = √((l1 − l2)² + (m1 − m2)² + (u1 − u2)²)

2 Definition of a line passing through two points in three dimensions

In three dimensions, lines are frequently described by parametric equations:

(2) x = x0 + at, y = y0 + bt, z = z0 + ct,

where x, y, and z are all functions of the independent variable t, which ranges over the real numbers; and (x0, y0, z0) is any point on the line. a, b, and c are related to the slope of the line, such that the vector (a, b, c) is parallel to the line.

If we have two vertices (x0, y0, z0) and (x1, y1, z1), the direction vector (a, b, c) can be taken as

(3) (x1 − x0, y1 − y0, z1 − z0).

3 Normalization of a vector

The normalized vector û of a nonzero vector u=(a, b, c) is the unit vector in the direction of u, i.e.,

(4) û = u/|u| = (a, b, c)/√(a² + b² + c²),

where |u| is the norm (or length) of u.

4 Definition of a plane passing through a point with a given normal in three dimensions

If we have one point (x0, y0, z0) and a normal (a, b, c), we can construct a plane as

(5) Ax+By+Cz+D=0,

where A=a, B=b, C=c, and D=−(ax0 + by0 + cz0).

5 Definition of the center of a circle curve

The center of this cross-section curve (approximately an oval) is defined as the average of its sampled points:

(6) vCenter = (Σᵢ₌₁ⁿ vertexᵢ)/n

(7) = (vertex1 + vertex2 + … + vertexn)/n,

where n is the number of points on the curve.
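The five tools above map directly onto a few lines of vector arithmetic. A sketch of Equations (1)–(7) using NumPy; the function names are ours:

```python
import numpy as np

def distance(m1, m2):
    """Eq. (1): Euclidean distance between two 3-D points."""
    return float(np.linalg.norm(np.asarray(m1, float) - np.asarray(m2, float)))

def direction(p0, p1):
    """Eq. (3): direction vector of the line through two vertices,
    usable as (a, b, c) in the parametric form of Eq. (2)."""
    return np.asarray(p1, float) - np.asarray(p0, float)

def normalize(u):
    """Eq. (4): unit vector in the direction of a nonzero vector u."""
    u = np.asarray(u, dtype=float)
    return u / np.linalg.norm(u)

def plane(point, normal):
    """Eq. (5): coefficients (A, B, C, D) of the plane through `point`
    with normal (a, b, c), where D = -(a*x0 + b*y0 + c*z0)."""
    a, b, c = normal
    return float(a), float(b), float(c), -float(np.dot(normal, point))

def curve_center(vertices):
    """Eqs. (6)-(7): centroid of the n points sampled on a
    cross-section curve."""
    return np.mean(np.asarray(vertices, dtype=float), axis=0)
```

These helpers are the only geometric primitives the worked examples in Section 4 require.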

4 Experiments

To validate the proposed method, we applied it to 100 samples collected from 2007 to 2017. Of these, 20 samples were scanned with a TC square scanner, 70 with a Human Solutions scanner (2011), and the remaining 10 with a self-made scanner. There are 88 adult male samples and 12 adult female samples. The ages range from 18 to 69 years, with approximately half between 30 and 50. Heights vary from 150 cm to 194 cm, with approximately half between 160 cm and 179 cm. Body weights range from 45 kg to 112 kg, with approximately half between 60 kg and 80 kg. Based on Section 2.2, there are in general four scenarios for the proposed method. In this section, four representative cases, one per scenario, are discussed to validate its effectiveness.

4.1 Experiment I: Presentation of the working process of Scenario I

We randomly selected a sample processed under Scenario I as the object of this experiment (Case Study I). The selected sample is Christof (male, scan date 2017-02-13, age 34, scanner "Human Solutions 2011," height: 174.52 cm, waist girth: 74.19 cm, hip girth: 91.61 cm). The working process for Christof is as follows:

Step 1: Detection of the shoulder point, middle finger vertex, and full arm length

We first detected the coordinates of the shoulder point and the end vertex (middle finger vertex) manually in the virtual space. For Christof, the shoulder point is (17.067, 52.418, −4.234), and the end vertex is (31.263, −21.304, −1.445). The distance between these two points is regarded as the full arm length. Based on Equation (1), the full arm length is 75.128 cm (Figure 7a).

Figure 7. Identification of a rough desired wrist position search area (RDWPSA) based on Method 1.

Step 2: Definition of the rough arm central axis

In this step, we consider the line passing through the shoulder point and the end vertex (middle finger vertex) as a rough arm central axis. It is only a rough axis, since it cannot be guaranteed that the scanned arm was perfectly straight during scanning. We will use this rough central axis to obtain an accurate one in a later step (Figure 7a).

Also, based on Equation (3), the direction vector of the rough arm central axis is (31.263 − 17.067, −21.304 − 52.418, −1.445 − (−4.234)) = (14.196, −73.722, 2.789). After normalization using Equation (4), the direction vector becomes (0.189, −0.981, 0.037). Based on Equation (2), the parametric equation of the line is then

(x, y, z) = (17.067, 52.418, −4.234) + t(0.189, −0.981, 0.037),

that is,

x = 17.067 + 0.189t, y = 52.418 − 0.981t, z = −4.234 + 0.037t, t ∈ [0, 75.128].
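As a quick numeric check of Steps 1 and 2, the quoted coordinates reproduce the reported arm length and direction vector (a verification sketch, not part of the method itself):

```python
import numpy as np

shoulder = np.array([17.067, 52.418, -4.234])
middle_finger = np.array([31.263, -21.304, -1.445])

d = middle_finger - shoulder            # Eq. (3): direction vector
arm_length = float(np.linalg.norm(d))   # Eq. (1): full arm length
unit = d / arm_length                   # Eq. (4): normalization

print(round(arm_length, 3))             # 75.128 cm, as in Step 1
print(np.round(unit, 3))                # [ 0.189 -0.981  0.037]
```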

Step 3: Definition of the rough center of the rough wrist position identified with the RSPU

Afterward, we took 80% of the length of the rough arm central axis from the shoulder point, obtaining the point (28.424, −6.559, −2.003) on the axis at a distance of 60.102 cm from the shoulder point (Figure 7b). This point is regarded as the rough center of the rough wrist position identified with the RSPU.

Step 4: Development of a cut plane at the rough RSPU center and identification of the real center of the rough wrist position identified with the RSPU

We then took the point (28.424, −6.559, −2.003) and used the direction vector of the rough arm central axis as the normal to develop a cut plane, named Plane C (Figure 7c). Based on Equation (5), Plane C can be expressed as:

11.357x+58.977y+2.231z+705.176=0.

Subsequently, a cross-section is obtained by intersecting the arm surface with Plane C. We then calculated the real center of this cross-section; the real central axis passes through this center, and the center of the rough wrist position identified with the RSPU can then be identified along it.

The shape of the cross-section is close to an oval. Using Equation (6), the obtained center is (28.842, −6.599, −2.003) (Figure 7d).

Step 5: Definition of the rough forearm line position, development of a cut plane based on the RSPU, and identification of the real center of the rough forearm line position

Using the same idea as in Steps 3 and 4, we obtained the rough center of the cross-section as (26.598, 0.847, 0.678), at 70% of the length of the rough arm central axis from the shoulder point (Figure 8). We then developed a cut plane (Plane D) through this rough center, using the same normal, as

9.937x+51.605y1.952z+221.934=0.

Figure 8. Definition of the rough (left) and real (right) central axis.

After that, a set of cross-sections is obtained and, based on Equation (6), the real center of the rough forearm line position is (27.931, −6.458, 3.182).

Step 6: Definition of the real arm central axis

After this, we can define the real arm central axis: the line passing through the real center of the RSPU (26.598, 0.847, 0.678) and the real center of the rough forearm line (27.931, −6.458, 3.182).

Then, based on Equation (2), the equation of the real arm central axis can be expressed as:

x = 26.598 + 0.170t, y = 0.847 − 0.932t, z = 0.678 + 0.319t.

Subsequently, the RDWPSA can be defined with t ∈ [0, 7.836]; the length of the central axis of the RDWPSA is 7.836 cm.
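A set of equidistant cut planes can now be swept along this axis. The following sketch shows one way to generate them from the parametric axis above; the sampling density of 80 planes is our own assumption, not a value from the paper:

```python
import numpy as np

origin = np.array([26.598, 0.847, 0.678])   # start of the real axis
axis = np.array([0.170, -0.932, 0.319])     # direction of the real axis

n_planes = 80                               # assumed sampling density
planes = []
for t in np.linspace(0.0, 7.836, n_planes): # sweep the RDWPSA
    point = origin + t * axis               # point on the central axis
    d = -float(axis @ point)                # Eq. (5): D coefficient
    planes.append((axis[0], axis[1], axis[2], d))
```

Each tuple (A, B, C, D) defines one cut plane Ax + By + Cz + D = 0 perpendicular to the arm axis; intersecting the arm mesh with each plane yields the cross-sections whose girths are compared in the next step.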

Step 7: Definition of Min

A set of cut planes was developed, each taking the arm central axis as its normal, and a set of cross-sections was obtained. The circumferences of these cross-sections were measured, and the cross-section with the minimum circumference was defined as Min. In this case of Christof, the circumference of Min is 13.2540 cm.

Step 8: Definition of Area1 and Max1

Based on the position of Min, Area1 can be developed, extending from Min toward the fingers over a distance of 10% of the full arm length (Figure 9). Then Max1, defined as the maximum circumference in Area1, can be obtained; in this case, the circumference of Max1 is 16.4014 cm.

Figure 9. Definition of Area1 and Max1 of Case I.

In this case, Max1 girth/Min girth is 1.2375, which is greater than 1.1. This indicates that the position of Max1 is on the palm and that Min is the desired SPU. The process for Christof is finished.

4.2 Experiment II: Presentation of the working process of Scenario II

We randomly selected a sample processed under Scenario II as the object of this experiment (Case Study II). The selected sample is Guest01_20130416 (male, age 47, scan date 2013-04-16, scanner "Human Solutions 2011," height: 168 cm, waist girth: 84.8 cm, hip girth: 90.76 cm). Using the same operations as Steps 1–7 of Case Study I, we obtained Min with a girth of 15.1531 cm (see Figure 10), as well as Max1 with a girth of 15.8209 cm.

Figure 10. Definition of Area1 and Max1 of Case II.

In this case, Max1/Min was 1.0441, smaller than 1.1, indicating that the position of Max1 was not on the palm. We therefore continued searching and obtained Max2 with a girth of 19.5324 cm. Max2/Min was 1.289, greater than 1.15, indicating that the position of Max2 was on the palm and that Max1 was the desired SPU. The process for Guest01_20130416 finished.

4.3 Experiment III: Presentation of the working process of Scenario III

We randomly selected a sample processed under Scenario III as the object of this experiment (Case Study III). The selected sample is Guest_04_20120924 (male, age 36, scan date 2012-09-24, scanner "Human Solutions 2011," height: 189 cm, waist girth: 100.7 cm, hip girth: 109.1 cm). Using the same operations as Steps 1–7 of Case Study I, we obtained Min with a girth of 18.0465 cm (see Figure 11), as well as Max1 with a girth of 18.565 cm.

Figure 11. Definition of Min and Max1 of Case III.

As shown in Figure 12, Max1/Min was 1.0287 in this case, smaller than 1.1, indicating that the position of Max1 was not on the palm. We continued searching and obtained Max2 with a girth of 20.2068 cm; Max2/Min was 1.1197, smaller than 1.15, indicating that Max2 was not on the palm either. We continued searching and obtained Max3 with a girth of 23.7226 cm. Max3/Min was 1.3145, greater than 1.15, indicating that the position of Max3 was on the palm and that Max2 is the desired SPU. The process for Guest_04_20120924 finished.

Figure 12. Definition of Max1, Max2, and Max3 of Case III.

4.4 Experiment IV: Presentation of the working process of Scenario IV

We selected a sample processed under Scenario IV as the object of this experiment (Case Study IV). The selected sample is Vincent_2007_0212_Male (male, age 27, scan date 2007-02-12, scanner "TC square," height: 163 cm, waist girth: 68.3 cm, hip girth: 81.6 cm). This model was scanned long ago, and due to the technical limitations of the scanner at the time, there were some problems with the scanned hand. This sample was chosen to test the robustness of our method.

Using the same operations as Steps 1–7 of Case Study I, we obtained Min with a girth of 12.43 cm (see Figure 13), as well as Max1 with a girth of 12.33 cm.

Figure 13. Definition of Min, Max1, Max2 and Max3 of Case IV.

In this case, Max1/Min was 0.992, smaller than 1.1, indicating that the position of Max1 was not on the palm. We continued searching and obtained Max2 with a girth of 12.23 cm; Max2/Min was 0.983, smaller than 1.15, indicating that Max2 was not on the palm either. We continued and obtained Max3 with a girth of 12.01 cm. Max3/Min was 0.966, smaller than 1.15, indicating that Max3 was not on the palm; since Max3 is smaller than Max2, Max3 is the desired SPU. The process for Vincent_2007_0212_Male finished.

5 Comparison and discussion

To validate the proposed method, we also carried out validation experiments with Methods 1 and 2 from the literature, on the same 100 samples used in our study.

5.1 Validation experiment design

In these validation experiments, we first defined the desired wrist position identified with the SPU manually, through group discussions among a group of experienced fashion designers, for each of the 100 samples. The SPUs defined in this process were approved by all the designers in the group.

After that, we carried out the experiments for locating the SPU using Methods 1 and 2 separately. Method 3 was not compared in the validation experiment, since its general working principle is not relevant to the method proposed by us. Then, we calculated the Euclidean distances between the centers of the manually defined SPUs and those obtained by Methods 1, 2, and our method.

5.1.1 Data analysis

Using Equation (1), the distances between the centers of the manually defined SPU and those of Methods 1, 2, and our method can be obtained. We then averaged these distances over all 100 samples. The final averaged distances are shown in Table 1.

Table 1. Average distances between the manually defined SPU and the SPU defined by Methods 1, 2, and the proposed method, over all samples.

Method 1    Method 2    Proposed method
1.782 cm    1.247 cm    0.583 cm

From Table 1, the average distance of our method is 0.583 cm, which is much smaller than the results of Method 1 (1.782 cm) and Method 2 (1.247 cm). It can be concluded that our method is more accurate than the other two methods. The proposed method takes advantage of the existing Methods 1 and 2, which underlies this accuracy.

Table 2 presents the maximum and minimum values of the distances between the manually defined SPU and the SPU defined by Methods 1, 2, and our method over all samples. These data also indicate that our method outperforms the other methods.

Table 2. Maximum/minimum distances between the manually defined SPU and the SPU defined by Methods 1, 2, and the proposed method, over all samples.

Method 1          Method 2          Proposed method
3.730/0.304 cm    2.375/0.270 cm    1.721/0.050 cm

In addition, the average distance obtained by Method 2 (1.247 cm) is smaller than that of Method 1, indicating that Method 2 is in general more accurate than Method 1. This is because the working principle of Method 2 preserves more information about human morphology than that of Method 1.

6 Conclusion

In this research, we focused on the automatic identification of human body feature points and developed an automatic identification method for the wrist position, identified with the SPU. The proposed method takes advantage of existing methods, and validation showed it to be more accurate than those methods. It is a fully automatic method performed in a virtual environment, which avoids manual involvement. The proposed method strengthens the role of virtual reality in garment product development and increases the level of automation of computer-aided design. In all, it will benefit fashion intelligence for automatic fashion product and fashion accessory development. Future work will develop methods for identifying other human body feature points using the same idea.

Conflicts of Interest: The authors declare no conflict of interest.

Acknowledgments

This research was funded by “The Program for Professor of Special Appointment (Eastern Scholar) at Shanghai Institutions of Higher Learning”, Funding code TP2017074 and QD2018040.

References

[1] Agarwal G, Koehl L, Zeng X, et al. (2009) An intelligent system for supporting design of fashion oriented personalized fabric products.

[2] Baek S-Y and Lee K. (2012) Parametric human body shape modeling framework for human-centered product design. Computer-Aided Design 44: 56–67.

[3] Bartle A, Sheffer A, Kim VG, et al. (2016) Physics-driven pattern adjustment for direct 3D garment editing. ACM Transactions on Graphics 35: 50:1–50:11.

[4] Camba JD, Contero M and Company P. (2016) Parametric CAD modeling: An analysis of strategies for design reusability. Computer-Aided Design 74: 18–31.

[5] Chaw Hlaing E, Krzywinski S and Roedel H. (2013) Garment prototyping based on scalable virtual female bodies. International Journal of Clothing Science & Technology 25: 184–197.

[6] Chen M and Li X. (2011) Algorithm for extracting end feature point of 3D scanning human body model. International Congress on Image & Signal Processing.

[7] Chen Z, Tang K and Joneja A. (2013) Fast and automatic identification of contours for girth measurement on 3D human models with variant postures. Computer-Aided Design and Applications 10: 321–337.

[8] Chen X, Tao X, Zeng X, et al. (2015) Control and optimization of human perception on virtual garment products by learning from experimental data. Knowledge-Based Systems 87: 92–101.

[9] Dong M, Liu K, Hong Y, et al. (2018) A body measurements and sensory evaluation-based classification of lower body shapes for developing customized pants design. Industria Textila 69: 111–117.

[10] Hall WS. (1896) The changes in the proportions of the human body during the period of growth. The Journal of the Anthropological Institute of Great Britain & Ireland 25: 21–46.

[11] Hong Y, Bruniaux P, Zeng X, et al. (2018a) Design and evaluation of personalized garment block for atypical morphology using the knowledge-supported virtual simulation method. Textile Research Journal 88: 1721–1734.

[12] Hong Y, Bruniaux P, Zeng X, et al. (2017a) Virtual reality-based collaborative design method for designing customized garment for disabled people with scoliosis. International Journal of Clothing Science 29: 226–237.

[13] Hong Y, Bruniaux P, Zeng X, et al. (2018b) Visual-simulation-based personalized garment block design method for physically disabled people with scoliosis (PDPS). Autex Research Journal 18: 35–45.

[14] Hong Y, Bruniaux P, Zhang J, et al. (2018c) Application of 3D-to-2D garment design for atypical morphology: a design case for physically disabled people with scoliosis. Industria Textila 69: 59–64.

[15] Hong Y, Zeng X, Bruniaux P, et al. (2018d) Garment opening position evaluation using kinesiological analysis of dressing activities: Case study of physically disabled people with scoliosis (PDPS). Textile Research Journal 88: 2303–2318.

[16] Hong Y, Zeng X, Bruniaux P, et al. (2017b) Interactive virtual try-on based three-dimensional garment block design for disabled people of scoliosis type. Textile Research Journal 87: 1261–1274.

[17] Hong Y, Zeng X, Bruniaux P, et al. (2017c) Collaborative 3D-to-2D tight-fitting garment pattern design process for scoliotic people. Fibres & Textiles in Eastern Europe.

[18] Huang J, Kwok T-H and Zhou C. (2019) Parametric design for human body modeling by wireframe-assisted deep learning. Computer-Aided Design 108: 19–29.

[19] Jian S, Peng L and Wei J. (2013) 3D garment design of the computer virtual reality environment. Applied Mechanics and Materials 484–485: 1041–1044.

[20] Jiang L, Ye J, Sun L, et al. (2019) Transferring and fitting fixed-sized garments onto bodies of various dimensions and postures. Computer-Aided Design 106: 30–42.

[21] Kim S, Jeong Y, Lee Y, et al. (2010) 3D pattern development of tight-fitting dress for an asymmetrical female manikin. Fibers and Polymers 11: 142–146.

[22] Krzywinski S and Siegmund J. (2017) 3D product development for loose-fitting garments based on parametric human models. IOP Conference Series: Materials Science and Engineering. IOP Publishing, 152006.

[23] Kulinska M, Bruniaux P, Ainamo A, et al. (2016) Virtual mannequins and garment parametrization. Uncertainty Modelling in Knowledge Engineering and Decision Making: Proceedings of the 12th International FLINS Conference. World Scientific, 984–989.

[24] Kobayashi M, Berger RA, Nagy L, et al. (1997) Normal kinematics of carpal bones: a three-dimensional analysis of carpal bone motion relative to the radius. Journal of Biomechanics 30: 787–793.

[25] Krzywinski S and Siegmund J. (2017) 3D product development for loose-fitting garments based on parametric human models. IOP Conference Series: Materials Science and Engineering. IOP Publishing, 152006.

[26] Kulinska M, Bruniaux P, Ainamo A, et al. (2016) Virtual mannequins and garment parametrization. Uncertainty Modelling in Knowledge Engineering and Decision Making: Proceedings of the 12th International FLINS Conference. World Scientific, 984–989.

[27] Leong I-F, Fang J-J and Tsai M-J. (2007) Automatic body feature extraction from a marker-less scanned human body. Computer-Aided Design 39: 568–582.

[28] Lin Y-L and Wang M-JJ. (2011) Automated body feature extraction from 2D images. Expert Systems with Applications 38: 2585–2591.

[29] Liu Y-J, Zhang D-L and Yuen MM-F. (2010) A survey on CAD methods in 3D garment design. Computers in Industry 61: 576–593.

[30] Magnenat-Thalmann N, Kevelham B, Volino P, et al. (2011) 3D web-based virtual try on of physically simulated clothes. Computer-Aided Design and Applications 8: 163–174.

[31] Meng Y, Wang CC and Jin X. (2012) Flexible shape control for automatic resizing of apparel products. Computer-Aided Design 44: 68–76.

[32] Montagna G, Carvalho H, Catarino AP, et al. (2012) Ergonomic garment design for a seamless instrumented swimsuit. 4th International Conference on Applied Human Factors and Ergonomics (AHFE). USA Publishing, 1204–1213.

[33] Ouellet S and Michaud F. (2018) Enhanced automated body feature extraction from a 2D image using anthropomorphic measures for silhouette analysis. Expert Systems with Applications 91: 270–276.

[34] Pietka E, Kaabi L, Kuo ML, et al. (1993) Feature extraction in carpal-bone analysis. IEEE Transactions on Medical Imaging 12: 44–49.

[35] Vitali A and Rizzi C. (2017) A virtual environment to emulate tailor's work. Computer-Aided Design and Applications 14: 671–679.

[36] Wang CCL. (2005) Parameterization and parametric design of mannequins. Computer-Aided Design 37: 83–98.

[37] Wang CCL. (2013) Digital Human Body.

[38] Wang CC, Chang TK and Yuen MM. (2003a) From laser-scanned data to feature human model: a system based on fuzzy logic concept. Computer-Aided Design 35: 241–253.

[39] Wang CC, Wang Y, Chang TK, et al. (2003b) Virtual human modeling from photographs for garment industry. Computer-Aided Design 35: 577–589.

[40] Wang CC, Wang Y and Yuen MM. (2003c) Feature based 3D garment design through 2D sketches. Computer-Aided Design 35: 659–672.

[41] Wang CC, Wang Y and Yuen MM. (2005) Design automation for customized apparel products. Computer-Aided Design 37: 675–691.

[42] Zhang D, Liu Y, Wang J, et al. (2018) An integrated method of 3D garment design. The Journal of the Textile Institute 109: 1595–1605.

[43] Zhang D, Wang J and Yang Y. (2014) Design 3D garments for scanned human bodies. Journal of Mechanical Science and Technology 28: 2479–2487.

[44] Zhang Y, Wang CC and Ramani K. (2016) Optimal fitting of strain-controlled flattenable mesh surfaces. The International Journal of Advanced Manufacturing Technology 87: 2873–2887.

[45] Zhong Y, Jiang J, Wang Z, et al. (2011) 3D garment prototyping from 2D drawings.

Published Online: 2023-06-29

© 2021 Yu Chen et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
