
    Detection of Residual Yarn in Bobbin Based on Odd Partial Gabor Filter and Multi-Color Space Hierarchical Clustering

2023-12-28 09:11:40

ZHANG Jin (張瑾), ZHANG Tuanshan (張團善), SHENG Xiaochao (盛曉超), HUYAN Pengfei (呼延鵬飛)

    1 School of Textile Science and Engineering, Xi’an Polytechnic University, Xi’an 710048, China

    2 School of Mechanical and Electrical Engineering, Xi’an Polytechnic University, Xi’an 710048, China

Abstract: In an automatic bobbin management system that simultaneously detects bobbin color and residual yarn, a composite texture segmentation and recognition operation based on an odd partial Gabor filter and multi-color space hierarchical clustering is proposed. Firstly, the parameter-optimized odd partial Gabor filter is used to distinguish bobbin and yarn textures, to explore suitable Gabor parameters for yarn bobbins, and to accurately discriminate the frequency characteristics of yarns and texture. Secondly, multi-color clustering segmentation using color spaces such as red, green, blue (RGB) and CIELUV (LUV) solves the problems of over-segmentation and segmentation errors, which are caused by the difficulty of accurately representing the complex and variable color information of yarns in a single color space and by the low contrast between the target and the background. Finally, the segmented bobbin is combined with the odd partial Gabor edge recognition operator to further distinguish bobbin texture from yarn texture and to locate the position and size of the residual yarn. Experimental results show that the method is robust in identifying complex textures, damaged and dyed bobbins, and multi-color yarns. The residual yarn identification distinguishes texture features from residual yarns well and can be transferred to the detection and differentiation of other complex textures, significantly outperforming traditional methods.

    Key words:residual yarn detection; Gabor filter; image segmentation; multi-color space hierarchical clustering

    0 Introduction

    Winding is the final step in the ring-spinning process, which transfers the spun yarns from the spinning bobbin into a large package form containing a considerable length of yarns for weaving and knitting. Since the bobbins may not be completely withdrawn from the winder, it is necessary to detect the amount of residual yarn and classify the bobbin according to the results. Currently, textile companies have the following three problems with the handling of bobbins. Firstly, there are many types of bobbins and the sorting situation is complicated. Secondly, manual sorting is inefficient and costly. Finally, the intermixing of yarns and yarn-free bobbins makes sorting difficult. How to effectively sort out these mixed bobbins is an urgent problem for textile companies today. The residual yarn inspection technology combined with machine vision not only excels in line inspection efficiency, but also has an irreplaceable position in terms of inspection accuracy and precision, and it has become the mainstream product line inspection technology today. Figure 1 shows bobbins in the factory application process.

Non-contact methods of residual yarn detection have emerged, in which the data are mainly collected with a color sensor or camera to detect the color of the bobbin. By comparing the bobbin with a preset value for the same type of empty bobbin, and using a support vector machine or neural network to segment the color, the yarn region is extracted. These solutions rely heavily on the color difference between the bobbin and the yarn: they are easily misled when the two colors are close together and are not suitable for detecting very small amounts of yarn. In addition, the large variety of yarns and bobbins and their complex combinations of colors, textures, and shapes make it difficult to detect the amount of residual yarn on the bobbin with simple image techniques.

Texture image segmentation is an important tool for solving this problem and is one of the main research hotspots in graphics and computer vision. There are many algorithms for texture image segmentation. These algorithms first extract the texture features of the image and then perform image segmentation. The commonly used algorithms can be divided into three categories. (1) Filter-based algorithms, represented by the Gabor filter[1] and the wavelet filter[2], can obtain good results when combined with level set algorithms based on energy functions. (2) Algorithms based on cluster analysis[3-5] are represented by the fusion of Gabor, steerable and other filters with color information: they first extract texture features with filters and then fuse the color information to finally achieve the segmentation of texture images. (3) Level set algorithms based on energy functions, represented by the regional model[6], build segmentation models and texture features. The third category first establishes segmentation models, then extracts texture features based on local binary patterns (LBPs)[7], local ternary patterns (LTPs)[8], the Gabor filter, the structure tensor[9], the local Chan-Vese (LCV) method based on the extended structure tensor[10], a tensor structure consisting of multiple features[11], or the local similarity factor (RLSF)[12], and finally segments the texture images. All the above methods are designed for medical and remote sensing images; they are too complex and insufficiently robust for application in real-time systems.

As a linear filter, the Gabor filter has a frequency and direction representation very close to the band-pass characteristics and directivity of the human visual system, and it is widely used in edge detection[13], texture segmentation[14], defect detection[15-17] and other fields. Considering the color difference between the bobbin and yarns as well as the difference in surface texture, a parameter-optimized multi-directional and multi-scale filter bank is designed to filter the two-dimensional image signal, extract the edges of the wrapped yarn, and perform preliminary calculation and segmentation of the residual yarn by using the antisymmetry of the odd part of the Gabor filter.

Texture segmentation starts with color segmentation, and there are two main problems in color image segmentation: choosing a suitable color space and choosing a suitable segmentation method. The choice of the color feature space depends on the specific image and segmentation method. Currently, no single color space can replace the others and suit the segmentation of all color images[18]. Many scholars have used more complex feature selection and clustering techniques and then improved the final segmentation results with sophisticated optimization methods. Several segmentation techniques that combine specific theories, methods, and tools have emerged, such as graph theory-based methods[19-20], wavelet-domain hidden Markov models[21], mean shift[22], and other information fusion strategies that have been used for image segmentation.

Combining the same segmentation method or different segmentation methods over multiple color and feature spaces can effectively solve the two main problems of color image segmentation mentioned above and improve the segmentation effect. The combined method is simple and does not require complex segmentation theories or models. Previous studies[23-24] verified the feasibility and effectiveness of this strategy in medical image segmentation, remote sensing image segmentation, and natural scene image segmentation. However, there are relatively few research results on feature fusion in image segmentation due to the particular difficulties associated with features[25-26].

In this paper, we use multiple color spaces for clustering, each dealing with its linear part, and then synthesize the segmentation results. In terms of color space selection, there are six color spaces to choose from: red, green and blue (RGB); hue-saturation-value (HSV); brightness, in-phase, quadrature-phase (BIQ); XYZ (an international standard); CIELAB (LAB) (luminosity, a, b); and LUV (to further improve and unify color evaluation methods, the International Commission on Illumination proposed the unified LUV color space, where L represents the luminance of an object, and U and V are chromaticities). Firstly, through experiments, we select two color spaces, RGB and LUV, and use a clustering method to initially segment the enhanced RGB and LUV images. Secondly, we use a second clustering to fuse the two initial segmentation results and obtain the fused segmented image. Finally, the segmentation result is obtained by region merging, which effectively solves the over-segmentation and mis-segmentation problems in natural image segmentation.

The overall structure of this paper is as follows. Section 1 introduces the experimental software and hardware configuration. Section 2 describes the detection algorithm, which is divided into three processes: bobbin image acquisition and extraction of the main regions; the specific method of yarn edge extraction and segmentation based on the Gabor filter; and the color space fusion algorithm based on multi-color space hierarchical clustering and segmentation, combined with Gabor edge detection, to achieve residual yarn detection. Sections 3 and 4 present the experiments and their analysis, and Section 5 concludes the paper.

    1 Experimental Device

The experimental system contains the following equipment. A charge coupled device (CCD) camera MV-GED500C-T (Shenzhen Minvision, China) is used with a resolution of 2 448 × 2 048 pixels, 9 frames per second, and an Ethernet interface, which can meet the needs of the experiment and actual production applications. For the lens, an industrial lens with a focal length of 25 mm (Zhejiang Dahua Technology Co., Ltd., China) is adopted. The illumination source (Hangzhou Hikvision Digital Technology Co., Ltd., China) is a downward-inclined white LED positive light source with a color temperature of 6 500 K. The experimental platform is shown in Fig.2. The computer operating system is Windows 7, the processor is an i5-2500K@3.3 GHz, the memory is 4 GB, and the graphics card is a GeForce GTX 750. The framework is Visual Studio, and the OpenCV version is 3.4.3.

    According to the process of bobbin inspection, the whole system can be divided into a bobbin transfer module in the first stage, the bobbin online inspection module in the middle stage and the bobbin management module in the last stage. In the first stage, the messy bobbin needs to be pre-sorted, so that the bobbin can be placed more neatly before entering the online inspection in the middle stage to improve the inspection efficiency. The bobbin online inspection module is the core part of the pipeline management system, and the inspection result is directly related to the effect of the later bobbin management module. The bobbin online inspection module mainly realizes the detection of the bobbin with yarns and the color classification detection of the bobbin without yarns.

Among them, yarn-containing bobbin detection precedes yarn-free bobbin color classification detection, and yarn-containing bobbin detection includes broken bobbin detection, end detection and yarn-containing detection. The color classification detection of yarn-free bobbins requires classification based on the head and body color of the bobbin. The bobbin online inspection system analyzes each bobbin image captured from the conveyor belt with the corresponding image processing algorithms to calculate the pixel length, end pixel width, edge detection, and color classification of the bobbin. According to the results of calculation and analysis, the system communicates with the embedded master computer to separate the broken-end, inverted-end and yarn-containing bobbins from the conveyor belt and send them to the corresponding boxes. A qualified bobbin is then processed for color analysis and moves on to the next module. The process diagram is shown in Fig.3.

Since the system is based on a fixed color background, in order to effectively preserve the target area, it is necessary to remove the support frame, background and other factors, and to apply a pixel mask to each captured image to obtain a binary image containing only 0 and 1 values. In the initially segmented binary image, each pixel with a value of 0 is checked against the similarity threshold. If it is within this range, the pixel value is reset according to the processing relationship, and the desired color image is obtained. The whole process is depicted in Figs. 4-6.

    The bobbin is positioned according to the optical band to effectively cut the experimental map and get the region of interest. The bilateral filtering can effectively filter out the noise in the image, make the contour of the bobbin clearer, effectively retain the details of the image, and extract the region of interest (ROI).
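The edge-preserving smoothing described above can be illustrated with a minimal pure-Python sketch of a bilateral filter on a grayscale array (the window radius and the two σ values are illustrative assumptions, not the system's actual settings):

```python
import math

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Edge-preserving smoothing: each output pixel is a weighted mean of its
    neighbours, weighted by both spatial distance and intensity difference."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, norm = 0.0, 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny = min(max(y + dy, 0), h - 1)   # clamp at the border
                    nx = min(max(x + dx, 0), w - 1)
                    ws = math.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                    wr = math.exp(-((img[ny][nx] - img[y][x]) ** 2)
                                  / (2 * sigma_r ** 2))
                    acc += ws * wr * img[ny][nx]
                    norm += ws * wr
            out[y][x] = acc / norm
    return out

# A step edge stays sharp while flat regions are smoothed.
edge = [[0] * 4 + [200] * 4 for _ in range(8)]
smoothed = bilateral_filter(edge)
```

Because the range weight collapses across the 0/200 step, the contour is preserved while noise within each flat region would be averaged away, which is exactly why it suits ROI extraction here.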

    2 Gabor Detection Algorithm

    2.1 Image acquisition of bobbin

As shown in Fig.7(a), the camera captures an image of the bobbin directly. Horizontal correction of the image is needed to crop the main area of the bobbin. First, the smallest enclosing rotated rectangle of the bobbin-axis region is drawn with OpenCV, and its rotation angle is then obtained with the corresponding built-in OpenCV function. The angle of the bobbin is

    (1)

    2.2 Gabor filter feature extraction

2.2.1 Extraction of yarn edges using odd partial Gabor filter

Gabor filter features have been widely used as effective texture descriptors for target detection and recognition. In the spatial domain, a two-dimensional Gabor filter is usually obtained by multiplying a sinusoidal plane wave by a Gaussian kernel function. Gabor filters are self-similar: any Gabor filter can be obtained by scaling and rotating the mother wavelet. In practice, Gabor filters can extract relevant features from images at different orientations and different scales in the frequency domain. For the odd part of a two-dimensional Gabor filter, the functional expression is

G(x, y) = exp[−(x′² + γ²y′²)/(2σ²)]·sin(ωx′ + ψ)    (2)

where (x, y) is the pixel coordinate; x′ = xcos θ + ysin θ; y′ = −xsin θ + ycos θ; θ denotes the direction of the parallel stripes of the Gabor function, taking values from 0° to 360°; ω is the central frequency; σ is the variance of the Gaussian distribution; ψ is the phase of the sinusoidal function, and in general, ψ = 0; γ is the aspect ratio, i.e., the spatial aspect ratio, which determines the ellipticity of the Gabor function. When γ = 1, the shape is circular, and when γ < 1, the shape is elongated in the direction of the parallel stripes. σ cannot be set directly. It varies with the bandwidth of the filter's half-response spatial frequency (defined as b). The relationship between them is

σ/λ = (1/π)·√(ln 2/2)·(2^b + 1)/(2^b − 1)    (3)

where σ and λ determine the waveform of the odd part of the Gabor filter. When the product of the two is constant, the center frequency only affects the effective response region, while the waveform remains constant. Depending on the number of pixels occupied by a single yarn in the image, a filter bank consisting of filters with the same center frequency is designed, and each bank contains filters with different orientations. In the experiments, σ/λ is set to 1.2, the horizontal direction is ±90°, the number of groups is 3, and the center frequencies are 4.8, 5.6 and 6.4 Hz, respectively. The filter banks are convolved with each RGB channel component separately. The maximum value is taken over different directions at the same center frequency, and the minimum value is taken between different center frequencies to form the filter output. The processing effect of the filter is shown in Fig.8. After parameter optimization, the filter can suppress the texture of the cylinder wall and enhance the gray gradient of the bobbin at the same time.
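The odd-part kernel and the max-over-orientation / min-over-frequency combination rule just described can be sketched in pure Python (a minimal illustration, not the tuned production filter; the kernel size, ω values, γ and σ below are assumptions chosen for demonstration):

```python
import math

def odd_gabor_kernel(size, omega, theta, sigma, gamma=0.5):
    """Odd (sine-phase) Gabor kernel: a Gaussian envelope times a sine carrier.
    Antisymmetric about the axis x' = 0, so it responds strongly to step edges."""
    half = size // 2
    k = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xp = x * math.cos(theta) + y * math.sin(theta)
            yp = -x * math.sin(theta) + y * math.cos(theta)
            env = math.exp(-(xp * xp + gamma * gamma * yp * yp)
                           / (2 * sigma * sigma))
            row.append(env * math.sin(omega * xp))
        k.append(row)
    return k

def convolve_at(img, ker, cy, cx):
    """Response of one kernel centred on (cy, cx), clamping at the border."""
    half = len(ker) // 2
    s = 0.0
    for j, row in enumerate(ker):
        for i, w in enumerate(row):
            y = min(max(cy + j - half, 0), len(img) - 1)
            x = min(max(cx + i - half, 0), len(img[0]) - 1)
            s += w * img[y][x]
    return s

def bank_response(img, cy, cx, omegas, thetas, sigma):
    """Combination rule from the text: maximum over orientations sharing a
    centre frequency, then minimum across the centre frequencies."""
    per_freq = []
    for om in omegas:
        per_freq.append(max(convolve_at(img, odd_gabor_kernel(9, om, th, sigma),
                                        cy, cx) for th in thetas))
    return min(per_freq)

# Antisymmetric kernel oriented at 90 deg, plus the bank response on a
# horizontal step edge, which should be clearly non-zero.
k = odd_gabor_kernel(9, 0.8, math.pi / 2, 2.0)
img = [[0] * 9 for _ in range(4)] + [[200] * 9 for _ in range(5)]
r = bank_response(img, 4, 4, omegas=[0.6, 0.8], thetas=[math.pi / 2], sigma=2.0)
```

The min-across-frequencies step keeps only structures that respond at every scale in the bank, which suppresses fine cylinder-wall texture while retaining the broader yarn edges.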

2.2.2 Filter waveform optimization

The waveform and output response of a two-dimensional Gabor filter vary with σ, ω, θ, etc. Due to the bandpass characteristics of the two-dimensional Gabor filter, it is necessary to use a multi-center-frequency and multi-direction filter bank and to process the input signal according to fineness, so that the filter produces a greater response at the edge of the cylinder while suppressing other textures. To enhance the edge detection of the odd part of the Gabor filter, it is necessary to make its waveform have a more pronounced step feature at the axis x = 0. This step also gives a larger output response of the filter G(x, y). Let E be the integral of the response on the central-axis side of G(x, y).

    (4)

    After the integral operation, we can get

    (5)

where I(·) represents the imaginary part of the function and L(·) represents the error function. Equation (5) shows that if σ is known, then E has a Dawson distribution with ωx (the frequency at x). The inflection point of the Dawson integral occurs at approximately 0.924 14. It can be seen that when E is the maximum value, the following equation can be obtained.

    (6)

Figure 9 shows the filter output response curves with λ/σ for the test images. It can be seen that the actual peak occurs around λ/σ = 6.4 and the actual output value is consistent with the theoretical output value.
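The quoted peak location of the Dawson integral can be checked numerically. The sketch below evaluates F(x) = e^(−x²)·∫₀ˣ e^(t²) dt by the trapezoidal rule and grid-searches its maximum (the grid range, step and rule choice are arbitrary illustrative assumptions):

```python
import math

def dawson(x, n=1000):
    """Dawson function F(x) = exp(-x^2) * integral_0^x exp(t^2) dt,
    with the integral approximated by the trapezoidal rule on n panels."""
    if x == 0:
        return 0.0
    h = x / n
    s = 0.5 * (1.0 + math.exp(x * x))   # endpoint terms f(0) and f(x)
    for i in range(1, n):
        t = i * h
        s += math.exp(t * t)
    return math.exp(-x * x) * s * h

# Locate the maximum of F on [0.5, 1.5] by a fine grid search; the text's
# quoted point of the Dawson integral lies at x ~ 0.924 14.
best = max((dawson(i / 1000.0), i / 1000.0) for i in range(500, 1501))
peak_x = best[1]
```

This confirms the constant used in the parameter optimization: the maximum of the Dawson function sits near 0.924, which fixes the relation between σ and the centre frequency when E is maximized.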

2.2.3 Filter angle selection

Figure 10 shows the ideal variation curve of the output response of the odd part of the filter with θ. The maximum response of the filter is obtained when θ = 90°. To verify the theoretical results, a total of five filters were selected to filter the test images in the range of 0° to 180° at 45° intervals. It can be seen that the filters work best when their axes coincide with the step edge lines. When θ is 90° and the centrosymmetric angle is 270°, these two angles produce the ideal filtering effect. For a vertical bobbin, too many filters in different directions reduce the operating efficiency and increase the noise.

2.2.4 Filter center frequency selection

    Fig.1 Bobbins in factory: (a) yarn bobbins in textile mills; (b) multi-color and yarn-containing bobbins that require sorting

    Fig.2 Experimental platform

Fig.3 Process diagram: (a) yarn identification and detection system; (b) image acquisition device

    Fig.4 Image of bobbin

    Fig.7 Region of interest automatic process of image: (a) bobbin image; (b) main area of the extraction; (c) main area of bobbin

    Fig.8 Gabor rendering: (a) bobbin backbone area; (b) Gabor filtering effect of odd part

    Fig.9 Output response curve with λ/σ

    Fig.10 Output response curve with θ

    Fig.11 Theoretical output curve with respect to σ/d

    2.3 Fusion segmentation of multi-color spaces

The RGB model is the most common and basic color model in digital image processing. In the RGB color space, any color can be represented as a weighted mixture of the three primary components red, green and blue. The RGB color space, also known as the additive color space, is characterized by poor color uniformity. This color space model is shown in Fig.12(a). In the LUV color space, for general images the luminance L ranges from 0 to 100, and the values of U and V lie between −100 and 100: +U is red, −U is green, +V is yellow, and −V is blue. The LUV color space model is shown in Fig.12(b).

    Fig.12 Different color spaces: (a) RGB color space; (b) LUV color space

LUV is obtained from XYZ through a nonlinear transformation. The specific equations are

L = 116(Y/Yn)^(1/3) − 16, Y/Yn > (6/29)³; L = (29/3)³(Y/Yn), otherwise    (7)

u′ = 4X/(X + 15Y + 3Z), v′ = 9Y/(X + 15Y + 3Z), U = 13L(u′ − u′n), V = 13L(v′ − v′n)    (8)

where Yn is 1.0; u′ and v′ describe the test color stimulus; u′n and v′n are the corresponding values of the reference white.
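These relations correspond to the standard CIE 1976 L*u*v* conversion; a minimal sketch follows, assuming the D65 white point with Yn = 1.0 as in the text:

```python
def xyz_to_luv(X, Y, Z, Xn=0.95047, Yn=1.0, Zn=1.08883):
    """CIE 1976 L*u*v* from CIE XYZ (D65 reference white assumed, Yn = 1.0)."""
    def uv_prime(x, y, z):
        # chromaticity coordinates u', v' of a stimulus
        d = x + 15.0 * y + 3.0 * z
        return (4.0 * x / d, 9.0 * y / d) if d else (0.0, 0.0)

    yr = Y / Yn
    if yr > (6.0 / 29.0) ** 3:
        L = 116.0 * yr ** (1.0 / 3.0) - 16.0   # cube-root branch
    else:
        L = (29.0 / 3.0) ** 3 * yr             # linear branch near black
    up, vp = uv_prime(X, Y, Z)
    unp, vnp = uv_prime(Xn, Yn, Zn)
    return L, 13.0 * L * (up - unp), 13.0 * L * (vp - vnp)

# The reference white maps to L* = 100 with u* = v* = 0.
L, u, v = xyz_to_luv(0.95047, 1.0, 1.08883)
```

The near-uniformity of LUV is what makes Euclidean distances between (L, U, V) triples a reasonable clustering metric, unlike raw RGB distances.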

    The proposed method of fusion and segmentation of multi-color space based on hierarchical clustering is shown in Fig.13.

    Fig.13 Multi-color space fusion and segmentation method flow

The specific steps are as follows.

1) The input image is enhanced and represented in the RGB and LUV color spaces respectively.

2) The first fuzzy C-means clustering is performed on each color space representation to obtain the initial segmentation results RRGB and RLUV with K1 categories.

3) The local class-labeled histogram features of RRGB and RLUV are extracted and serialized into a fusion feature vector, and then the second fuzzy C-means clustering is performed to obtain a fusion segmentation result Sfusion.

4) Region merging is conducted on Sfusion to obtain the final segmentation result S.
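The clustering used in steps 2) and 3) is fuzzy C-means; the following is a minimal 1-D pure-Python sketch (the deterministic initialisation at the data extremes and the toy intensity data are illustrative choices, not the paper's):

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Fuzzy C-means on 1-D data: alternate membership and centre updates."""
    # Deterministic initialisation: spread centres between the data extremes.
    lo, hi = min(points), max(points)
    centers = [lo + (hi - lo) * i / (c - 1) for i in range(c)]
    for _ in range(iters):
        # membership u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = []
        for p in points:
            d = [abs(p - ck) + 1e-12 for ck in centers]  # avoid division by 0
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        # centre update: membership-weighted mean with weights u^m
        centers = [sum(U[k][i] ** m * points[k] for k in range(len(points))) /
                   sum(U[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return sorted(centers)

# Two well-separated intensity clusters yield centres near 10 and 200.
data = [8, 9, 10, 11, 12, 198, 199, 200, 201, 202]
centers = fuzzy_c_means(data, c=2)
```

Unlike hard K-means, each pixel carries a graded membership to every cluster, which is what allows the second-stage clustering to fuse soft evidence from the two color spaces.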

2.3.1 Fusion and segmentation method flow based on hierarchical clustering

    In color image segmentation, choosing an appropriate color space is a difficult problem because it is difficult to represent complex natural scene images in a single-color space. Different color space representations can be seen as images with different channels provided by different sensors. An information fusion strategy combining complementary information from multi-color spaces is an effective way to improve the segmentation effect. Through the segmentation experiment of multiple groups of natural scene images, two color spaces, RGB and LUV, are selected to represent the segmented image and perform fusion segmentation. Image enhancement can highlight the light and dark changes in the image, and enhance the contrast between the background and the target, thereby effectively improving the segmentation effect of grayscale images. It is found that color image segmentation after enhancement processing can highlight the contours of the original image, reduce the number of segmentation blocks and improve the segmentation effect. The mathematical morphology image enhancement method is a simple and effective image enhancement method that can obtain a description of the structural features of an image by the influence of structural elements on the image.

    (9)

2.3.2 Initial segmentation based on fuzzy C-means clustering

In general, the color histogram has a high degree of freedom. Taking the RGB color space as an example, the color histogram has 256³ degrees of freedom. Here, each color component is uniformly quantized to P levels, so the color histogram has P³ degrees of freedom. For any pixel X in the image, the normalized local color histogram h1 in a window R centered on X is computed.

h1(i, R) = ni/Nw, i = 1, 2, …, P³    (10)

where ni is the number of window pixels falling into the ith bin, and Nw is the number of pixels in the window.
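The quantised local colour histogram can be sketched as follows (P = 4 and the window contents are illustrative; the bin-index layout is one natural choice, not prescribed by the paper):

```python
def local_color_histogram(pixels, P=4):
    """Normalised colour histogram of a window: each RGB component is uniformly
    quantised to P levels, giving P**3 bins instead of 256**3."""
    hist = [0] * (P ** 3)
    for r, g, b in pixels:
        # map 0..255 onto 0..P-1 per component
        qr, qg, qb = (min(v * P // 256, P - 1) for v in (r, g, b))
        hist[qr * P * P + qg * P + qb] += 1
    n = len(pixels)
    return [c / n for c in hist]

# A window of identical reddish pixels puts all mass in one bin.
h = local_color_histogram([(200, 10, 10)] * 9)
```

Quantisation shrinks the feature from 256³ to P³ dimensions, which keeps the per-pixel histograms cheap enough for the window-by-window clustering above.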

2.3.3 Fusion of the initial segmentation of multi-color space

The fuzzy C-means clustering method is used to fuse the two initial segmentation results of K1 categories, and the fusion segmentation result Sfusion with K2 different categories is obtained.

For the two initial segmentation results, the feature vectors of the local class-labeled histograms centered on pixel x are extracted respectively, and the class-labeled histograms h2 are calculated in the window Rx.

h2(j, Rx) = nj/Nw, j = 0, 1, …, K1 − 1    (11)

where nj is the number of pixels in the window labeled as (j + 1), and Nw is the number of pixels in the window Rx. Then, the two local class-labeled histograms are concatenated and normalized to obtain the fused local class-labeled histogram h2(Rx) with a vector dimension of 2K1.
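The concatenate-and-renormalise step can be sketched as follows (K1 = 3 and the two label windows are illustrative toy inputs):

```python
def label_histogram(labels, K):
    """Normalised histogram of class labels 0..K-1 inside a window."""
    h = [0] * K
    for l in labels:
        h[l] += 1
    n = len(labels)
    return [x / n for x in h]

def fused_feature(labels_rgb, labels_luv, K1):
    """Concatenate the RGB-space and LUV-space label histograms of the same
    window and renormalise, giving the 2*K1-dimensional fusion feature."""
    f = label_histogram(labels_rgb, K1) + label_histogram(labels_luv, K1)
    s = sum(f)
    return [x / s for x in f]

# Same window, two different initial segmentations -> one fused feature.
feat = fused_feature([0, 0, 1, 2], [1, 1, 1, 0], K1=3)
```

The second fuzzy C-means then clusters these 2K1-dimensional vectors, so a pixel's final class depends jointly on how both color spaces labelled its neighbourhood.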

2.3.4 Region merging

Since the clustering results are pixel-based, it is necessary to perform region merging on Sfusion to achieve a more complete description of the target. In the segmentation results, the distance between a region R and an adjacent region Ro is denoted as Dmerging(R, Ro),

Dmerging(R, Ro) = Σc∈C DB[h(R), h(Rx)], DB[h(R), h(Rx)] = [1 − Σi=1..Nb √(h(i, R)h(i, Rx))]^(1/2)    (12)

where C = {RGB, LUV}; h(R) represents the normalized local color histogram of the region R; h(Rx) represents the normalized local color histogram of a pixel x in the neighborhood Ro; DB[h(R), h(Rx)] represents the Bhattacharyya distance of the histograms; h(i, R) and h(i, Rx) represent the occurrence frequencies of the ith bin in the histograms of R and Rx respectively; and Nb represents the number of bins in the histogram. By calculating the distance between R and all adjacent regions of R, the nearest adjacent region Rmin is obtained. If the distance Dmerging(R, Rmin) is less than the threshold T, then R is merged into Rmin. In the experiments of this paper, the segmentation of three different residual yarn amounts achieves a very good segmentation effect, which can effectively restrain the texture of the bobbin and eliminate the influence of reflection. The experimental results are shown in Fig.14.
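The Bhattacharyya-based merging distance summed over the two color spaces can be sketched as follows (the histogram values are illustrative, and merging would compare the result against the threshold T):

```python
import math

def bhattacharyya(h1, h2):
    """Bhattacharyya distance between two normalised histograms:
    sqrt(1 - sum_i sqrt(h1_i * h2_i))."""
    bc = sum(math.sqrt(a * b) for a, b in zip(h1, h2))
    return math.sqrt(max(0.0, 1.0 - bc))

def merging_distance(hists_r, hists_ro):
    """Sum the per-colour-space distances, here keyed 'RGB' and 'LUV'."""
    return sum(bhattacharyya(hists_r[c], hists_ro[c]) for c in ("RGB", "LUV"))

# Identical region histograms give distance 0, so the regions would merge.
same = {"RGB": [0.5, 0.5], "LUV": [0.25, 0.75]}
d = merging_distance(same, same)
```

Summing over both spaces means two regions merge only when they look alike in RGB and LUV simultaneously, which is what suppresses the reflective bright band being kept as a separate region.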

Fig.14 Segmentation effect: (a) unsegmented image; (b) segmentation result image

    3 Experiment and Analysis

In this study, we select 150 bobbins as the test image set, as shown in Fig.15. We test them in the LUV color space and, following the judgment rules of the human eye, generally use hue, saturation, and luminance. For simplicity, we take saturation as the variable, so the set is divided into four cases. In the first case, the saturation and luminance of the bobbin and yarns are completely distinguished. In the second case, the saturation of the bobbin and yarns is similar, but the hue is different. In the third case, the hue of the bobbin and yarns is similar, but the saturation is different. In the fourth case, the bobbin and yarns differ little in both hue and saturation, the industry phenomenon of "the same bobbin and the same yarn". In all four cases, the bobbin texture, slight stains, etc. should be considered. From the classification point of view, bobbin texture belongs to the fourth category.

    Fig.15 Images for testing

Regarding the clustering, theoretically two clusters are enough: one for the color of the bobbin itself and the other for the color of the yarn. In practice, however, the bobbin is reflective and the bright band counts as a cluster, and the bobbin texture counts as another, so there are four clusters in total; this increases the burden on the computer and is not suitable for low-cost situations. Thus three clusters are used. The Gabor filter is used for texture processing. The idea is that at the boundary of a cluster, the binarized Gabor image is checked. If the yarn shape is satisfied and the width is greater than that of the texture, it is yarn. If both are texture, further clustering of the texture is performed.

An experiment with the yellow bobbin is carried out, because the saturation of a yellow bobbin is as high as that of a white bobbin, and the identification of the yellow bobbin is therefore the best test set to validate the system. The main reasons are as follows.

1) The RGB of yellow is (255, 255, 0) and the RGB of crimson is (255, 0, 255). These two colors are easily confused in color clustering, which means that colors easily distinguished by the naked eye can be numerically very close.

    2)Yellow is more likely to be contaminated and fade.

3) Yellow (255, 255, 0) is visually close to white, as shown in Fig.16. Yellow and light white yarns, which are indistinguishable by normal methods, are theoretically extremely susceptible to interference from the blue color component, resulting in a white color.

    Fig.16 Test process

    Fig.17 Comparison curves of different algorithms

This process is also the block diagram of our software. From the diagram, we can see that three methods are used. Firstly, the Gabor filter is used to get the approximate position of the yarns, as seen in Fig.16. The texture and yarns are mixed and the exact position of the yarns cannot be distinguished, so other means are needed. Secondly, RGB three-component clustering is used, and the green component yields a possible judgment of the yarn, but stain interference needs to be excluded. Thirdly, three-component clustering in the LUV color space determines the location of the yarn from the L component; this "L" judgment is more realistic. If the U and V components are connected on the Y-axis at the same time, it can also be considered as having yarn. The results of the three methods are fused with the method of this paper to derive the yarns and their specific location. The accuracy of the method requires tuning the fusion parameters, which is not developed here. Therefore, yarns and texture that are very close to each other can be differentiated and processed. The situation is ideal when the filter runs at a speed of 50 bobbins per second, as shown in Fig.16.

To test the accuracy and robustness of the detection algorithm in this paper, different colors of yarns and different kinds of bobbins are selected for the test groups, which are shown in Table 1. Five colors of white, black, blue, red and gray yarns with linear densities of 4.0 tex and 10.0 tex are selected. The true positive rate (TPR) of the residual yarn skeleton is used as the evaluation index, which is defined as

QTPR = ETP/A(TP+FN)    (13)

where QTPR represents the accuracy rate, with values in the range [0, 1]; ETP represents the number of bobbins that actually carry residual yarns and are also detected as carrying residual yarns; A(TP+FN) represents the total number of bobbins that actually carry residual yarns, i.e., those correctly detected plus those missed as empty. The higher the value, the better the classification effect. The test results corresponding to Table 1 are shown in Table 2. Figure 17 shows the comparison curves, from which it can be seen that the accuracy of the traditional algorithm is lower than 65% while that of the algorithm in this paper is higher than 80%.
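The evaluation index of Eq. (13) reduces to a simple ratio; a minimal sketch (the counts below are illustrative, not the paper's measured results):

```python
def true_positive_rate(tp, fn):
    """Q_TPR = E_TP / A_(TP+FN): residual-yarn bobbins correctly detected,
    over all bobbins that actually carry residual yarn (detected + missed)."""
    total = tp + fn
    return tp / total if total else 0.0

# e.g. 42 of 50 residual-yarn bobbins detected
q = true_positive_rate(42, 8)
```

Note that false positives (empty bobbins flagged as carrying yarn) do not enter this index; it measures only how few residual-yarn bobbins are missed.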

    Table 1 Test groups

    Table 2 Accuracy of test results

4 Experimental Verification on Hard-to-Distinguish Cases

    To further validate the method of this paper, the following are a few cases that are more difficult for the human eye to distinguish.

    4.1 Texture judgment

The bobbin in Fig.18 is very prone to error even when sorted manually. Here, the Gabor filter alone does not work as well as required, as shown in Fig.18(c), so color clustering must be considered to complete the yarn judgment. In the LUV color space, the L component acts most obviously; in the RGB color space, the blue component acts most obviously, so the state marker of the yarn can be judged after fusion.

    Fig.18 Texture judgment process: (a) camera’s capture; (b) bobbin automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    4.2 Loose yarn judgment

This case is less common but still exists in a certain percentage. Figure 19 shows an example where the yarn is not horizontal along the Y-axis but at a certain angle. With the adaptive Gabor filter used in Fig.18, the yarn contour is still distinguished, but when the yarn is close to the bobbin, the texture features are confused with the yarn features. In this identification, the concept of convex hulls is utilized, as shown in Fig.19(g).

    Fig.19 Yarn judgment process: (a) camera’s capture; (b) automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    5 Conclusions

In this paper, the odd partial Gabor filter and a compound texture segmentation operator based on multi-color space hierarchical clustering are used to detect residual yarns. Yarn segmentation is realized by optimizing the design of the Gabor filter banks and adjusting the parameters to maximize the odd-part amplitude in the band-pass range. To solve the problem of the specific width, the most suitable center frequency is explored. By setting a reasonable filter combination, frequencies inconsistent with the yarn direction are removed, the noise is suppressed and the detection efficiency is improved.

At the same time, the method combines the fusion segmentation based on RGB and LUV color space hierarchical clustering to solve the over-segmentation and mis-segmentation problems caused by the low color contrast between the target and the background. For image segmentation, image enhancement techniques are introduced so that the segmented image better reflects the contours of the original image and highlights the parts of interest.

    The results show that the algorithm can accurately detect yarn bobbins of different colors and brightness, and its optimization strategy provides a theoretical reference for research on non-contact bobbin sorting.
