Open Access. Published by De Gruyter under a CC BY 4.0 license, October 25, 2023.

Automatic Recognition of Density and Weave Pattern of Yarn-Dyed Fabric

Jun Xiang and Ruru Pan
From the journal AUTEX Research Journal

Abstract

Under the small-batch, multi-item production mode, the recognition of yarn-dyed fabric patterns is a crucial task in the textile industry. In this article, an automatic recognition system based on pixel-level features is proposed to recognize the density, the weave pattern, and the color pattern. In this system, the fabric images are captured by a scanner. First, a method based on the Hough transform is used to correct the skew of the yarns, both warp and weft. Second, the yarns and nodes are located in the enhanced images with a brightness-projection method, from which the density can be calculated. Then, the type of each node is identified based on its boundary information, yielding the weave pattern. Finally, the fuzzy C-means algorithm is used to determine the color of each node, giving the color pattern of the yarn-dyed fabric. Experimental results demonstrate that the proposed recognition system is effective for detecting the structural parameters of yarn-dyed fabric.

1. Introduction

As small-batch, multi-item production has gradually become the dominant mode in the textile industry, textile companies are in urgent need of a rapid, automatic, and accurate method for analyzing the structural parameters and weave patterns of fabric. In the traditional manual method, these parameters are identified by inspectors with a textile magnifying glass, which is time-consuming and labor-intensive. Along with this trend, the recognition of fabric parameters, such as density and weave pattern, has become a research hotspot. Many feasible algorithms [1,2,3,4,5,6] based on image analysis have been proposed to recognize fabric density and pattern. Generally, these methods can be classified into two categories: hand-crafted feature-based methods and deep learning-based methods.

Low-level features, such as frequency- and time-domain features of the image, are often used to represent the texture information of the fabric. In the frequency domain, the peaks in the Fourier transform of the fabric image represent the periodic structure formed by the warp and weft yarns. Automatic detection [7,8,9,10] of fabric density can be achieved by locating these peaks and counting the number of located yarns. In the time domain, pixels on a yarn have higher gray levels, while pixels in the gaps between yarns reflect less light and thus show lower gray levels. In ref. [11], gray projection was proposed to locate the nodes and yarns. Pixel-level features, such as geometrical features [9,12,13] and statistical features [14,15], are then extracted to classify the nodes. However, these methods focus mainly on white or single-colored fabrics.

Recently, a significant breakthrough has been achieved in image analysis by moving from low-level feature-based algorithms to deep learning-based end-to-end frameworks. Convolutional neural networks (CNNs) are widely applied to pattern recognition tasks. In the textile field, many researchers have adopted CNN methods to address visual tasks such as fabric image retrieval [16,17,18], weave pattern recognition [1,19,20,21], and defect detection [22,23,24]. For fabric weave pattern recognition, some researchers [20,25] converted the task into a classification task. Meng et al. [1], by contrast, regarded pattern recognition as a multi-task problem that requires both locating and classifying the nodes. Technologically speaking, fabric density and pattern recognition involve many sub-tasks, such as regression, classification, object detection, and color measurement. This complexity makes an end-to-end CNN model difficult to train. Moreover, CNN methods require a lot of labeled data to drive learning, but ground-truth annotations are difficult to obtain in quantity. This is the main reason why there are still few studies that address the weave pattern problem with deep learning.

With the high demands on the appearance of clothes, yarn-dyed fabrics play an increasingly important role in the textile market. Due to the mutual influence of the various colors in an image of yarn-dyed fabric, it is difficult for current methods to achieve good performance. Pan et al. [26] proposed a method for automatic detection of structural parameters of yarn-dyed fabric based on the Hough transform and the fuzzy C-means (FCM) algorithm. Xin et al. [27] used an active grid model to recognize the color pattern of yarn-dyed fabric. However, if the yarn-dyed fabric has weft skew, these methods may not work well. Building on this previous work, this article proposes an efficient method to recognize the structural parameters of yarn-dyed fabric, including density, weave pattern, and colored yarn arrangement. The proposed method can help companies analyze fabric parameters more quickly, resulting in more efficient production. The following sections introduce the proposed method in detail.

2. Fabric image acquisition

Woven fabrics are produced by interlacing two perpendicular sets of yarns: vertically running warps and horizontally running wefts. The crossing states of warps and wefts are called interlacing nodes, which are classified into two types: a warp interlacing node denotes a node where a warp resides on top of a weft, and a weft interlacing node refers to a node where a weft passes above a warp. Generally, the weave pattern consists of the recurrence of a basic weave repeat. Yarn-dyed fabric, a common type of woven fabric, is woven from yarns of two or more different colors.

The stability of the acquisition environment, including lighting conditions, scale, and resolution, is essential for fabric pattern recognition. A scanner meets this requirement well, being both cheap and convenient. In this study, a Canon 9000F Mark II scanner is used to capture the yarn-dyed fabric image in the RGB (red, green, blue) model. The light source of this scanner is a white light-emitting diode, which guarantees a stable capture environment. When acquiring fabric images, the resolution is an important parameter. Too high a resolution increases the amount of computation in the recognition process, slowing recognition; too low a resolution causes inaccurate recognition. Assuming a warp (or weft) density of 800 threads/10 cm and a maximum tightness of 80%, the gap between yarns is 0.025 mm. To ensure recognition accuracy, the gap should occupy at least two pixels in the scanned image. Under this condition, the required resolution is at least 2,032 dpi. Therefore, in this study, the yarn-dyed fabric images are captured at a resolution of 2,400 dpi.
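
To make the arithmetic explicit, the sketch below reproduces the resolution bound from the stated density and tightness; it is a plain worked example of the numbers above, not part of the recognition pipeline.

```python
# Worked example of the resolution bound derived in the text.
density_per_10cm = 800                     # warp (or weft) density: threads per 10 cm
tightness = 0.80                           # maximum cover factor of the yarns

spacing_mm = 100.0 / density_per_10cm      # centre-to-centre yarn spacing: 0.125 mm
gap_mm = spacing_mm * (1.0 - tightness)    # inter-yarn gap: 0.025 mm

pixels_per_gap = 2                         # the gap should span at least 2 pixels
pixel_mm = gap_mm / pixels_per_gap         # required pixel size: 0.0125 mm
min_dpi = 25.4 / pixel_mm                  # 25.4 mm per inch -> 2,032 dpi

print(f"gap = {gap_mm:.3f} mm, minimum resolution = {min_dpi:.0f} dpi")
```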

As shown in Figure 1, the image captured at 2,400 dpi contains 946 pixels and 26.5 warp threads per 1 cm. The warp density of the fabric in the image is thus 67.3 threads/inch, and each warp yarn occupies about 35.7 pixels, which is sufficient for accurate recognition. This calibration serves as the benchmark for the subsequent recognition of fabric structural parameters. When acquiring fabric images, the edge of the fabric should be avoided, and the surface of the fabric should be clean and flat.

Figure 1. Calibration of pixels corresponding to 1 cm of fabric.

3. Fabric image correction

When capturing fabric images, it is difficult to ensure that the warp and weft yarns remain perfectly horizontal and vertical, and any inclination of the yarns in the image generally causes large errors in the recognition of fabric parameters. To improve recognition performance, skew correction is therefore required.

In this study, the process of skew correction is shown in Figure 2. We first enhance the fabric image: (1) convert the acquired RGB image to HSV (hue, saturation, value) space; (2) extract the brightness component V and perform histogram equalization on it. Since the interlacing nodes of the fabric are closely related to brightness, we use the brightness component to detect the tilt angle of the fabric. Histogram equalization can be expressed as:

(1) $s_k = T(r_k) = \frac{L-1}{MN}\sum_{j=0}^{k} n_j, \quad k = 0, 1, \ldots, L-1,$

where MN is the total number of pixels in the image, L is the number of gray levels, and n_j is the number of pixels whose gray level is r_j. Figure 3 presents the effect of histogram equalization: Figure 3(a) shows the brightness component corresponding to the fabric image in Figure 4, and Figure 3(b) shows the result enhanced by histogram equalization. The interlacing nodes in the enhanced image are more prominent.
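
The enhancement step can be sketched in a few lines of Python with OpenCV. The file name fabric.png is a hypothetical placeholder for a scanned sample:

```python
import cv2

# Minimal sketch: extract the brightness (V) channel in HSV space and
# equalize its histogram, as in equation (1).
bgr = cv2.imread("fabric.png")              # OpenCV loads images as BGR
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
v = hsv[:, :, 2]                            # brightness component V
v_eq = cv2.equalizeHist(v)                  # histogram equalization, eq. (1)
```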

Figure 2. The process of skew correction.

Figure 3. Brightness enhancement of fabric image. (a) The brightness component V and (b) enhanced V component.

Figure 4. A sample of captured fabric images.

Then, Otsu's method is applied to segment the interlacing nodes of the fabric. Due to the complex colors in the yarn-dyed fabric, applying this method directly to threshold the entire image leaves many interlacing nodes unrecognized, as shown in Figure 5(a). This study therefore first divides the image into multiple blocks, as shown in Figure 5(b), and then applies Otsu's method to threshold each block. The number of blocks is not fixed but depends on the image size; in this study, each block contains at least 2,500 pixels (50 px × 50 px). Figure 5(c) presents the segmentation results using this strategy. Compared with Figure 5(a), many previously missed interlacing nodes are recognized. After obtaining the binary image, the Canny operator is used to detect edges, and the detection result is shown in Figure 5(d). The promising result demonstrates the effectiveness of the proposed method for interlacing node extraction.
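
A minimal sketch of the block-wise Otsu strategy follows, reusing the equalized V channel v_eq from the previous sketch; the Canny thresholds (50, 150) are illustrative assumptions, not values from the paper:

```python
import cv2
import numpy as np

def blockwise_otsu(gray, block=50):
    """Apply Otsu's threshold independently to each block x block tile,
    so differently colored regions do not share one global threshold."""
    binary = np.zeros_like(gray)
    h, w = gray.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block]
            _, binary[y:y + block, x:x + block] = cv2.threshold(
                tile, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return binary

binary = blockwise_otsu(v_eq)        # v_eq: equalized V channel from above
edges = cv2.Canny(binary, 50, 150)   # edge map of the interlacing nodes
```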

Figure 5. Interlacing node extraction of fabric images. (a) The result of global threshold segmentation, (b) blocked result, (c) result of local threshold segmentation, and (d) the result of edge detection.

The aforementioned method detects most of the interlacing nodes in the fabric image. Two morphological operations, erosion and dilation, are then used to enhance the features of the interlacing nodes: the kernel is (4, 1) when detecting warp inclination and (1, 4) when detecting weft inclination. We apply the Hough transform to detect lines in the binary image. The Hough transform can detect thousands of straight lines in the image; we filter for the lines that pass through more interlacing nodes by limiting the angle range and the length of the lines. The detection results are shown in Figure 6. Finally, we take the mean inclination angle of the filtered lines as the inclination angle of the yarns in the image. The proposed method detects a warp inclination angle of 2.16° (90 − θ) for the sample fabric. After obtaining the warp inclination angle, we rotate the image by 90 − θ to obtain an image in which the warps are vertical, as shown in Figure 7(a).
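
The angle estimation might be sketched as follows; the Hough accumulator threshold (200) and the ±10° angle window are illustrative assumptions, and the rotation sign may need flipping depending on the coordinate convention:

```python
import cv2
import numpy as np

# Sketch of warp-angle estimation: emphasize vertically aligned node
# columns with a (4, 1) erosion/dilation, detect lines with the Hough
# transform, and average the angles of near-vertical lines.
kernel = np.ones((4, 1), np.uint8)           # use (1, 4) for weft detection
mask = cv2.dilate(cv2.erode(edges, kernel), kernel)

lines = cv2.HoughLines(mask, 1, np.pi / 180, threshold=200)
angles = []
if lines is not None:
    for rho, theta in lines[:, 0]:
        deg = np.degrees(theta)
        if deg > 90:
            deg -= 180                       # map to (-90, 90]
        if abs(deg) < 10:                    # keep near-vertical lines only
            angles.append(deg)
warp_skew = float(np.mean(angles)) if angles else 0.0

h, w = mask.shape
M = cv2.getRotationMatrix2D((w / 2, h / 2), warp_skew, 1.0)
warp_corrected = cv2.warpAffine(bgr, M, (w, h))  # warps now vertical
```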

Figure 6. Hough transformation of the fabric image. (a) Spectrogram of Hough space and (b) straight lines detected in the fabric image.

Figure 7. Results of warp and weft correction. (a) Result of warp correction and (b) result of weft correction.

When the warp yarns are vertical, we use an affine transformation to correct the weft inclination. The warp-corrected image undergoes the aforementioned steps, including threshold segmentation, edge detection, morphological processing (kernel: (1, 4)), and the Hough transform, to obtain the weft tilt angle. Using this angle, we perform an affine transformation to correct the weft. The weft correction result of the sample fabric is shown in Figure 7(b).
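
One way to realize this step is a vertical shear, sketched below under the assumption that the measured weft tilt is phi_deg; unlike a rotation, a shear that only shifts columns vertically leaves the already corrected warps vertical:

```python
import cv2
import numpy as np

def correct_weft(img, phi_deg):
    """phi_deg: residual weft inclination measured after warp correction."""
    h, w = img.shape[:2]
    s = np.tan(np.radians(phi_deg))
    # y' = y - s*x (plus an offset to keep the image roughly centred)
    M = np.float32([[1, 0, 0],
                    [-s, 1, s * w / 2]])
    return cv2.warpAffine(img, M, (w, h))

weft_corrected = correct_weft(warp_corrected, 1.3)  # 1.3° is a made-up angle
```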

4. Detection of the warp and weft yarns

In a yarn image, the yarn axis reflects more light and thus appears brighter, whereas the two sides of the yarn reflect less light and appear darker [28]. The yarns in a yarn-dyed fabric image show a similar pattern: from the yarn axis to the gap between yarns, the gray level gradually decreases. This property can be used to detect the yarns in the fabric image.

For accurate recognition, we adopt an efficient method to strengthen the gaps between the yarns. First, we use a relative total variation (RTV)-based smoothing method [29] to remove the texture details of the image. The smoothing can be expressed as:

(2) $\arg\min_{S}\; \sum_{p}\left[ (S_p - I_p)^2 + \lambda\left( \frac{\mathcal{D}_x(p)}{\mathcal{L}_x(p)+\varepsilon} + \frac{\mathcal{D}_y(p)}{\mathcal{L}_y(p)+\varepsilon} \right)\right],$

where the term (S_p − I_p)² keeps the result from deviating wildly from the input. The texture-removal effect is introduced by the new regularizer D_x(p)/(L_x(p) + ε) + D_y(p)/(L_y(p) + ε), which is called RTV. λ is a weight controlling the degree of smoothness, and ε is a small positive number that avoids division by zero; the division is element-wise. More details can be found in ref. [29]. The smoothing result for the yarn-dyed fabric image is shown in Figure 8(a): the uneven brightness distribution of the yarns is weakened or eliminated. This result can also be used for color segmentation. By dividing the original image I element-wise by the smoothed image I_s, we obtain the gap-enhanced image shown in Figure 8(b).
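
The division step can be sketched as below. The RTV solver of ref. [29] is assumed unavailable here, so a heavy Gaussian blur stands in for the smoothed image I_s purely to illustrate the enhancement; with the real RTV output the code is identical:

```python
import cv2
import numpy as np

# Gap enhancement by element-wise division I / I_s: gaps are darker than
# their smoothed surroundings, so their ratio drops below 1.
gray = cv2.cvtColor(weft_corrected, cv2.COLOR_BGR2HSV)[:, :, 2]
I = gray.astype(np.float32) + 1.0                  # avoid zeros
I_s = cv2.GaussianBlur(I, (0, 0), sigmaX=8)        # stand-in for RTV output
ratio = I / (I_s + 1e-6)
gap_enhanced = cv2.normalize(ratio, None, 0, 255,
                             cv2.NORM_MINMAX).astype(np.uint8)
```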

Figure 8. Detection results of the warp and weft yarns. (a) Smoothed fabric image, (b) gap-enhanced fabric image, (c) brightness projection in the weft, (d) brightness projection in the warp, and (e) the detection result of yarn layout.

Automatic interlacing node recognition can be achieved using the brightness information of the fabric image. After skew correction, all yarns can be regarded as straight stripes lying in the vertical and horizontal directions, and a gridding model is constructed to separate the warp yarns and the weft yarns. As mentioned at the beginning of this section, the brightness distribution of the fabric image has an obvious gradient. In descending order of intensity, the regions of the image are: the axial lines of the yarns, the marginal zones of the yarns, and the transition regions between yarns. An intensity minimum occurs in the transition region, so the minimum intensity can be found by projection in the vertical and horizontal directions. With this method, the yarns can be detected and the interlacing nodes located. Let B(i, j) be the brightness value of the pixel in the ith row and jth column of the image. The fabric image can then be mapped into two independent one-dimensional signals using the following equations:

(3) $B(i) = \sum_{j} B(i, j),$

(4) $B(j) = \sum_{i} B(i, j),$

where B(i) is the accumulated brightness of the pixels in the ith row and B(j) is the accumulated brightness of the pixels in the jth column of the fabric image. For the gap-enhanced image, we first convert it to HSV color space and extract the brightness component V, which is then mapped into the two one-dimensional signals. The mapping results for the sample fabric image are shown in Figure 8(c) and (d).

The valleys of the curve correspond to the weft gap positions in the fabric image, whereas the peaks represent the axes of the yarns. The case of the warp yarns is similar, except that the warp yarns lie in the vertical direction. Generally speaking, in an ideal smooth curve, B(i) can be regarded as a peak if B(i) > B(i − 1) and B(i) > B(i + 1), and as a valley if B(i) < B(i − 1) and B(i) < B(i + 1). In practice, however, the pixel intensities do not follow such an ideal distribution. As mentioned before, one yarn occupies about 36 pixels, so we use a sliding window with a 30-pixel pitch in the warp and weft directions to search for local minima, requiring the distance between two adjacent local minima to be greater than 30 pixels. The local minima found in this way are taken as the gaps between the yarns in the fabric image, as shown in Figure 8(e). Based on this result, we can measure the warp density and weft density of the fabric automatically.
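
A compact sketch of the projections and the valley search, applied to the gap-enhanced image from the previous sketches:

```python
import numpy as np
from scipy.signal import find_peaks

# Equations (3) and (4) plus the valley search. find_peaks on the negated
# projection with distance=30 enforces the 30-pixel minimum spacing
# between adjacent gaps described above.
B = gap_enhanced.astype(np.float64)

row_proj = B.sum(axis=1)    # B(i): one value per image row -> wefts
col_proj = B.sum(axis=0)    # B(j): one value per image column -> warps

weft_gaps, _ = find_peaks(-row_proj, distance=30)   # valleys between wefts
warp_gaps, _ = find_peaks(-col_proj, distance=30)   # valleys between warps

print(f"located {len(warp_gaps)} warp gaps and {len(weft_gaps)} weft gaps")
```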

5. Inspection of the density of yarn-dyed fabric

Once the yarns are recognized, the density of the yarn-dyed fabric can be computed from the image resolution using the following equation:

(5) $D = \frac{N \cdot S_{\mathrm{dpi}}}{M},$

where D denotes the density of the yarn-dyed fabric in threads/inch; M is the number of pixels spanned between the first and last yarns; N is the number of (warp or weft) yarns in the fabric image; and S_dpi is the resolution, here S_dpi = 2,400. With this definition, the density of the sample fabric shown in Figure 4 is D_warp = 79.9 threads/inch (M = 2,228, N = 74) and D_weft = 59.2 threads/inch (M = 1,580, N = 39).
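
Equation (5) reduces to a one-line helper; plugging in the sample numbers reproduces densities close to the values reported above (small rounding differences aside):

```python
# Equation (5): density in threads/inch from yarn count, pixel span, and dpi.
def density(n_yarns, m_pixels, s_dpi=2400):
    return n_yarns * s_dpi / m_pixels

print(density(74, 2228))   # warp of the sample fabric, ~80 threads/inch
print(density(39, 1580))   # weft of the sample fabric, ~59 threads/inch
```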

6. Detection of the color pattern

The node type depends on the relative position of the weft and warp where they interlace: a node is defined as a warp node when the warp is on top; otherwise, it is a weft node. In this article, the node types are determined from the quadrilateral boundary of the interlacing nodes, as proposed in ref. [30]. The matrix representation of the node types is shown in Figure 9(a).

Figure 9. The recognition results of the node type and color pattern of the sample fabric. (a) The matrix representation of node type and (b) color pattern of the yarn-dyed fabric image.

In the yarn-dyed fabric, color information is an important feature. The color feature of each interlacing node is represented by the following equation:

(6) $V = (\bar{R}, \bar{G}, \bar{B}),$

where

(7) $\bar{R} = \frac{1}{hw}\sum_{i=1}^{h}\sum_{j=1}^{w} r(i, j),$

(8) $\bar{G} = \frac{1}{hw}\sum_{i=1}^{h}\sum_{j=1}^{w} g(i, j),$

(9) $\bar{B} = \frac{1}{hw}\sum_{i=1}^{h}\sum_{j=1}^{w} b(i, j),$

where h and w are the height and width of the corresponding interlacing node, and r(i, j), g(i, j), and b(i, j) are the red, green, and blue components at point (i, j). In other words, each interlacing node is represented by the average R, G, and B values of all its pixels. However, the colors at the junctions between yarns affect each other, producing outliers in some interlacing nodes. Assume that the color of each pixel in an interlacing node obeys a three-dimensional Gaussian distribution, c = (r, g, b) ~ N(μ, Σ). The probability density function can be expressed as:

(10) $f(c) = \frac{1}{(2\pi)^{3/2}\,|\Sigma|^{1/2}} \exp\left( -\frac{(c-\mu)^{\mathrm{T}} \Sigma^{-1} (c-\mu)}{2} \right),$

where Σ is the covariance matrix and μ is the vector of mean values of the R, G, and B components. We select the pixels within the 95% confidence region as statistical samples to represent the color feature of each interlacing node.
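
This outlier filtering can be sketched as below, assuming each node is available as an h × w × 3 RGB patch; for a 3D Gaussian, the 95% confidence region corresponds to a squared Mahalanobis distance below the chi-squared quantile with three degrees of freedom:

```python
import numpy as np
from scipy.stats import chi2

# Node color feature (equations (6)-(9)) with outlier removal: pixels whose
# squared Mahalanobis distance to the node mean exceeds chi2.ppf(0.95, 3)
# (about 7.81) are discarded before re-averaging.
def node_color(patch):
    """patch: h x w x 3 RGB array covering one interlacing node."""
    px = patch.reshape(-1, 3).astype(np.float64)
    mu = px.mean(axis=0)                               # mean R, G, B
    cov = np.cov(px, rowvar=False) + 1e-6 * np.eye(3)  # regularized covariance
    d = px - mu
    d2 = np.einsum("ij,jk,ik->i", d, np.linalg.inv(cov), d)
    keep = d2 <= chi2.ppf(0.95, df=3)                  # 95% confidence region
    return px[keep].mean(axis=0)                       # mean of inlier pixels
```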

The FCM clustering method is used to classify the nodes based on the color features extracted from them. FCM is an unsupervised method that minimizes an objective function by organizing the data into clusters; we follow the method proposed in ref. [26] for node classification. However, the number of yarn colors in the fabric must be known first. For automatic detection, Pan et al. [26] introduced the cluster validity criterion VMPC [31] to find the optimal cluster number. The VMPC indices of the sample yarn-dyed fabric are shown in Table 1; they indicate that the sample fabric contains five colors of yarn.

Table 1. VMPC indices for yarn-dyed fabric image

c      2      3      4      5       6      7      8      9      10
VMPC   0.657  0.696  0.769  0.841*  0.713  0.642  0.616  0.609  0.596

Note: The best result is marked with an asterisk.

FCM (c = 5) is then used to classify the nodes by their color features. The color of each node is replaced by the corresponding cluster center, the boundary pixels are removed from the image, and the nodes are drawn as color rectangles of equal size. As shown in Figure 9(b), the color pattern can be obtained with this method. The layout of the color yarns can also be obtained by combining the node types with the color pattern.
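
The cluster-number search and the final classification might be sketched with scikit-fuzzy as follows. The array features (one RGB feature per node, e.g., from node_color() above) is an assumption, and VMPC is computed here from the standard partition coefficient returned by cmeans, following Dave's modified partition coefficient [31]:

```python
import numpy as np
import skfuzzy as fuzz   # scikit-fuzzy

# `features` is assumed to be an (n_nodes, 3) array of node colors;
# skfuzzy's cmeans expects a features-by-samples layout.
data = features.T

best_c, best_vmpc = 2, -np.inf
for c in range(2, 11):
    _, _, _, _, _, _, fpc = fuzz.cluster.cmeans(
        data, c=c, m=2.0, error=1e-5, maxiter=300, seed=0)
    vmpc = 1.0 - c / (c - 1.0) * (1.0 - fpc)   # modified partition coefficient
    if vmpc > best_vmpc:
        best_c, best_vmpc = c, vmpc

cntr, u, _, _, _, _, _ = fuzz.cluster.cmeans(
    data, c=best_c, m=2.0, error=1e-5, maxiter=300, seed=0)
labels = np.argmax(u, axis=0)     # hard cluster label of each node
node_colors = cntr[labels]        # replace each node by its cluster centre
```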

7. Summary

For a yarn-dyed sample, the scanner is first used to obtain its image. Since the inclination of the yarns in the fabric image generally causes errors in the recognition of fabric parameters, the method introduced in Section 3 is used to correct the warp and weft yarns. For accurate detection of the yarns, a brightness-projection method locates the gaps between the yarns and thus recognizes the yarn layout; the results are then used to measure the density of the yarn-dyed fabric. The types of the interlacing nodes are recognized with the method proposed in ref. [30], and FCM classifies the nodes by the color features extracted from them. Combining the clustering result with the weave pattern yields the color pattern of the yarn-dyed fabric.

Table 2 and Figure 10 present the actual parameters and detection results of three yarn-dyed fabric samples. The promising results demonstrate the effectiveness of the proposed method for the recognition of yarn-dyed fabric.

Figure 10. Recognition results of two other samples. (a) Original images, (b) corrected images, (c) weave pattern, and (d) color pattern.

Table 2. Actual parameters and detection parameters of the three samples (densities in threads/inch)

Name   Size (px)        Actual threads   Actual density   Recognized threads   Recognized density   Colors
       Warp     Weft    Warp    Weft     Warp    Weft     Warp      Weft       Warp      Weft
a      2,228    1,580   74      39       80      59       74        39         79.9      59.2       5
b      2,494    1,684   101     45       97      64       101       45         97.2      64.1       6
c      1,940    1,332   60      39       74      70       60        39         74.2      70.3       6

To demonstrate the superiority of the proposed method, we compare it with several existing methods on this topic: Pan et al. [26], Guo [5], Xin et al. [27], Iqbal Hussain et al. [2], and Meng et al. [1]. In this experiment, we implemented each method following the authors' papers and tested it on 100 samples of yarn-dyed fabric. The error rate [1] is employed to evaluate recognition performance. The experimental results are presented in Table 3. The proposed method achieves error rates of 2.1%, 1.2%, 1.4%, and 0.6% on warp threads, weft threads, warp density, and weft density, respectively, outperforming the other methods. Among the compared methods, that of Iqbal Hussain et al. [2] performs best; our method still reduces the error rate by about 2 percentage points. The experimental results show that the proposed method clearly improves the identification of yarn-dyed fabrics, which demonstrates its superiority.

Table 3. Comparison of experimental results with other methods (error rates, %)

Methods                     Threads          Density
                            Warp     Weft    Warp     Weft
Pan et al. [26]             7.4      5.2     3.9      4.2
Guo [5]                     8.3      6.5     5.1      4.9
Xin et al. [27]             6.8      6.3     4.3      3.9
Iqbal Hussain et al. [2]    4.5      3.4     3.5      3.1
Meng et al. [1]             23.5     43.3    34.9     43.8
Proposed                    2.1      1.2     1.4      0.6

We also used the method proposed in ref. [1] to recognize the yarn layout of the yarn-dyed fabrics; the results are presented in Figure 11. The poor results have two main causes: (1) the model was trained on a small dataset, which limits its generalization; and (2) the image resolution differs from that used in the original study. These results suggest that, when data are limited, low-level feature-based methods are more effective than learning-based methods.

Figure 11. The recognition results of yarn-dyed fabric using the method proposed in ref. [1]. (a) Original image, (b) warp location, (c) weft location, and (d) warp node location.

8. Conclusion

Aiming at the recognition of the structural parameters of yarn-dyed fabric, an effective method is proposed based on previous work, especially refs [1,26,30]. In the recognition system, the fabric images are captured by a scanner at a resolution of 2,400 dpi. The inclination angles of the warps and wefts are detected by the Hough transform, and rotation and affine transformations are then used to correct the warp and weft skew, respectively. A smoothing method enhances the yarns and the gaps between them to improve the accuracy of yarn location, and the density of the yarn-dyed fabric is calculated from the detection results by equation (5). The edge information in the four directions of an interlacing node is the main basis for judging whether it is a warp node or a weft node. The number of yarn colors obtained from cluster validity analysis and the extracted color features are used to classify the nodes with the FCM algorithm. Based on the clustering results, the color pattern of the yarn-dyed fabric is obtained.

The proposed method has some limitations: it may not work for jacquard fabrics or multi-layer fabrics. However, it can still be used to measure the density of such fabrics.

Funding information: This work was supported in part by the National Natural Science Foundation of China under Grant no. 61976105.

Conflict of interest: Authors state no conflict of interest.

References

[1] Meng, S., Pan, R., Gao, W., Zhou, J., Wang, J., He, W. (2021). A multi-task and multi-scale convolutional neural network for automatic recognition of woven fabric pattern. Journal of Intelligent Manufacturing, 32(4), 1147–1161. doi:10.1007/s10845-020-01607-9

[2] Iqbal Hussain, M. A., Khan, B., Wang, Z., Ding, S. (2020). Woven fabric pattern recognition and classification based on deep convolutional neural networks. Electronics, 9(6), 1048. doi:10.3390/electronics9061048

[3] Pan, R., Gao, W., Liu, J., Wang, H. (2011). Automatic recognition of woven fabric pattern based on image processing and BP neural network. The Journal of the Textile Institute, 102(1), 19–30. doi:10.1080/00405000903430255

[4] Pan, R., Gao, W., Liu, J., Wang, H. (2010). Automatic recognition of woven fabric patterns based on pattern database. Fibers and Polymers, 11(2), 303–308. doi:10.1007/s12221-010-0303-6

[5] Guo, Y., Ge, X., Yu, M., Yan, G., Liu, Y. (2019). Automatic recognition method for the repeat size of a weave pattern on a woven fabric image. Textile Research Journal, 89(14), 2754–2775. doi:10.1177/0040517518801197

[6] Anila, S., Rani, K. S. S., Saranya, B. (2018). Fabric texture analysis and weave pattern recognition by intelligent processing. Journal of Telecommunication, Electronic and Computer Engineering (JTEC), 10(1–13), 121–127.

[7] Xu, B. (1996). Identifying fabric structures with fast Fourier transform techniques. Textile Research Journal, 66(8), 496–506. doi:10.1177/004051759606600803

[8] Lachkar, A., Gadi, T., Benslimane, R., D'orazio, L., Martuscelli, E. (2003). Textile woven-fabric recognition by using Fourier image-analysis techniques: Part I: a fully automatic approach for crossed-points detection. Journal of the Textile Institute, 94(3–4), 194–201. doi:10.1080/00405000308630608

[9] Lachkar, A., Benslimane, R., D'orazio, L., Martuscelli, E. (2005). Textile woven fabric recognition using Fourier image analysis techniques: Part II–texture analysis for crossed-states detection. Journal of the Textile Institute, 96(3), 179–183. doi:10.1533/joti.2004.0069

[10] Matsuyama, T., Miura, S.-I., Nagao, M. (1983). Structural analysis of natural textures by Fourier transformation. Computer Vision, Graphics, and Image Processing, 24(3), 347–362. doi:10.1016/0734-189X(83)90060-9

[11] Soltany, M., Zadeh, S. T., Pourreza, H.-R. (2011). Fast and accurate pupil positioning algorithm using circular Hough transform and gray projection. International Conference on Computer Communication and Management, 211, 556–561.

[12] Jeong, Y. J., Jang, J. (2005). Applying image analysis to automatic inspection of fabric density for woven fabrics. Fibers and Polymers, 6(2), 156–161. doi:10.1007/BF02875608

[13] Jeon, B. S., Bae, J. H., Suh, M. W. (2003). Automatic recognition of woven fabric patterns by an artificial neural network. Textile Research Journal, 73(7), 645–650. doi:10.1177/004051750307300714

[14] Kuo, C. F. J., Shih, C. Y., Lee, J. Y. (2004). Automatic recognition of fabric weave patterns by a fuzzy C-means clustering method. Textile Research Journal, 74(2), 107–111. doi:10.1177/004051750407400204

[15] Ajallouian, F., Tavanai, H., Palhang, M., Hosseini, S., Sadri, S., Matin, K. (2009). A novel method for the identification of weave repeat through image processing. The Journal of The Textile Institute, 100(3), 195–206. doi:10.1080/00405000701660244

[16] Deng, D., Wang, R., Wu, H., He, H., Li, Q., Luo, X. (2018). Learning deep similarity models with focus ranking for fabric image retrieval. Image and Vision Computing, 70, 11–20. doi:10.1016/j.imavis.2017.12.005

[17] Xiang, J., Zhang, N., Pan, R., Gao, W. (2019). Fabric image retrieval system using hierarchical search based on deep convolutional neural network. IEEE Access, 7, 35405–35417. doi:10.1109/ACCESS.2019.2898906

[18] Xiang, J., Zhang, N., Pan, R., Gao, W. (2020). Fabric retrieval based on multi-task learning. IEEE Transactions on Image Processing, 30, 1570–1582. doi:10.1109/TIP.2020.3043877

[19] Meng, S., Pan, R., Gao, W., Zhou, J., Wang, J., He, W. (2019). Woven fabric density measurement by using multi-scale convolutional neural networks. IEEE Access, 7, 75810–75821. doi:10.1109/ACCESS.2019.2922502

[20] Puarungroj, W., Boonsirisumpun, N. (2019). Recognizing hand-woven fabric pattern designs based on deep learning. Advances in Computer Communication and Computational Sciences, 19, 325–336. doi:10.1007/978-981-13-6861-5_28

[21] Wang, F., Liu, H., Sun, F., Pan, H. (2019). Fabric recognition using zero-shot learning. Tsinghua Science and Technology, 24(6), 645–653. doi:10.26599/TST.2018.9010095

[22] Liu, J., Wang, C., Su, H., Du, B., Tao, D. (2019). Multistage GAN for fabric defect detection. IEEE Transactions on Image Processing, 29, 3388–3400. doi:10.1109/TIP.2019.2959741

[23] Ouyang, W., Xu, B., Hou, J., Yuan, X. (2019). Fabric defect detection using activation layer embedded convolutional neural network. IEEE Access, 7, 70130–70140. doi:10.1109/ACCESS.2019.2913620

[24] Jun, X., Wang, J., Zhou, J., Meng, S., Pan, R., Gao, W. (2021). Fabric defect detection based on a deep convolutional neural network using a two-stage strategy. Textile Research Journal, 91(1–2), 130–142. doi:10.1177/0040517520935984

[25] Xiao, Z., Liu, X., Wu, J., Geng, L., Sun, Y., Zhang, F., et al. (2018). Knitted fabric structure recognition based on deep learning. The Journal of The Textile Institute, 109(9), 1217–1223. doi:10.1080/00405000.2017.1422309

[26] Pan, R., Gao, W., Liu, J., Wang, H. (2010). Automatic detection of the layout of color yarns for yarn-dyed fabric via a FCM algorithm. Textile Research Journal, 80(12), 1222–1231. doi:10.1177/0040517509355349

[27] Xin, B., Hu, J., Baciu, G., Yu, X. (2009). Investigation on the classification of weave pattern based on an active grid model. Textile Research Journal, 79(12), 1123–1134. doi:10.1177/0040517508101459

[28] Liu, J., Yamaura, I., Gao, W. (2006). Discussing reflecting model of yarn. International Journal of Clothing Science and Technology, 18(2), 129–141. doi:10.1108/09556220610645775

[29] Xu, L., Yan, Q., Xia, Y., Jia, J. (2012). Structure extraction from texture via relative total variation. ACM Transactions on Graphics (TOG), 31(6), 1–10. doi:10.1145/2366145.2366158

[30] Zhong, P., Shi, Y., Chen, X., Tan, Q., Zhang, C. (2013). Research on digital intelligent recognition method of the weave pattern of fabric based on the redundant information. Fibers and Polymers, 14(11), 1919–1926. doi:10.1007/s12221-013-1919-0

[31] Dave, R. N. (1996). Validating fuzzy partitions obtained through c-shells clustering. Pattern Recognition Letters, 17(6), 613–623. doi:10.1016/0167-8655(96)00026-8

Published Online: 2023-10-25

© 2022 Jun Xiang et al., published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
