
    Detection of Residual Yarn in Bobbin Based on Odd Partial Gabor Filter and Multi-Color Space Hierarchical Clustering

2023-12-28 09:11:40

ZHANG Jin (張瑾), ZHANG Tuanshan (張團善), SHENG Xiaochao (盛曉超), HUYAN Pengfei (呼延鵬飛)

    1 School of Textile Science and Engineering, Xi’an Polytechnic University, Xi’an 710048, China

    2 School of Mechanical and Electrical Engineering, Xi’an Polytechnic University, Xi’an 710048, China

Abstract: For an automatic bobbin management system that simultaneously detects bobbin color and residual yarn, a composite texture segmentation and recognition method based on an odd partial Gabor filter and multi-color space hierarchical clustering is proposed. Firstly, a parameter-optimized odd partial Gabor filter is used to distinguish bobbin texture from yarn texture, to explore suitable Gabor parameters for yarn bobbins, and to accurately discriminate the frequency characteristics of yarns and texture. Secondly, multi-color clustering segmentation using the red-green-blue (RGB) and CIELUV (LUV) color spaces solves the problems of over-segmentation and segmentation errors, which are caused by the difficulty of accurately representing the complex and variable color information of yarns in a single color space and by the low contrast between the target and the background. Finally, the segmented bobbin is combined with the odd partial Gabor edge recognition operator to further distinguish bobbin texture from yarn texture and to locate the position and size of the residual yarn. Experimental results show that the method is robust in identifying complex textures, damaged and dyed bobbins, and multi-color yarns. The residual yarn identification distinguishes texture features from residual yarns well, transfers to the detection and differentiation of other complex textures, and is significantly better than traditional methods.

    Key words:residual yarn detection; Gabor filter; image segmentation; multi-color space hierarchical clustering

    0 Introduction

Winding is the final step in the ring-spinning process; it transfers the spun yarns from the spinning bobbin into a large package containing a considerable length of yarn for weaving and knitting. Since the yarns may not be completely unwound from the bobbins at the winder, it is necessary to detect the amount of residual yarn and classify the bobbins according to the result. Currently, textile companies face three problems in handling bobbins. Firstly, there are many types of bobbins and the sorting situation is complicated. Secondly, manual sorting is inefficient and costly. Finally, the intermixing of yarn-carrying and yarn-free bobbins makes sorting difficult. How to sort these mixed bobbins effectively is an urgent problem for textile companies today. Residual yarn inspection combined with machine vision not only excels in in-line inspection efficiency but also holds an irreplaceable position in inspection accuracy and precision, and it has become the mainstream production-line inspection technology. Figure 1 shows bobbins in a factory application.

Non-contact methods of residual yarn detection have emerged, in which data are collected by a color sensor or camera to detect the color of the bobbin. By comparing the bobbin with a preset value for the same type of empty bobbin, and using a support vector machine or a neural network to segment the color, the yarn region is extracted. These solutions rely heavily on the color difference between the bobbin and the yarn: they are easily misled when the two colors are close, and they are not ideal for detecting very small amounts of yarn. In addition, the large variety of yarns and bobbins and their complex combinations of color, texture, and shape make it difficult to detect the amount of residual yarn on the bobbin with simple image techniques.

Texture image segmentation is an important tool for solving this problem and is one of the main research hotspots in graphics and computer vision. Many algorithms exist for texture image segmentation; they first extract the texture features of the image and then perform segmentation. The commonly used algorithms fall into three categories. (1) Filter-based algorithms, represented by the Gabor filter [1] and the wavelet filter [2], can achieve good results when combined with level set algorithms based on energy functions. (2) Algorithms based on cluster analysis [3-5], represented by the fusion of Gabor, Steer and other filters with color information, first extract texture features with filters and then fuse the color information to achieve the segmentation of texture images. (3) Level set algorithms based on energy functions, represented by regional models [6], build the segmentation model together with texture features. This third class first establishes segmentation models, then extracts texture features based on local binary patterns (LBPs) [7], local ternary patterns (LTPs) [8], the Gabor filter, the structure tensor [9], the local Chan-Vese (LCV) method based on an extended structure tensor [10], a tensor structure consisting of multiple features [11], or the local similarity factor (RLSF) [12], and finally segments the texture images. All the above methods target medical and remote sensing images; they are too complex and insufficiently robust for application in real-time systems.

As a linear filter, the Gabor filter has a frequency and orientation representation very close to the frequency selectivity and directivity of the human visual system, and it is widely used in edge detection [13], texture segmentation [14], defect detection [15-17] and other fields. Considering the color difference between the bobbin and the yarns as well as the difference in surface texture, a parameter-optimized multi-directional and multi-scale filter bank is designed to filter the two-dimensional image signal, extract the edges of the wound yarn, and perform a preliminary calculation and segmentation of the residual yarn by using the antisymmetry of the odd part of the Gabor filter.

Texture segmentation starts with color segmentation, and there are two main problems in color image segmentation: choosing a suitable color space and choosing a suitable segmentation method. The choice of color feature space depends on the specific image and segmentation method; currently, no single color space can replace the others and suit the segmentation of all color images [18]. Many scholars have used more complex feature selection and clustering techniques and then improved the final segmentation results with sophisticated optimization methods. Several segmentation techniques that combine specific theories, methods, and tools have emerged, such as graph-theory-based methods [19-20], wavelet-domain hidden Markov models [21], mean shift [22], and other information fusion strategies used for image segmentation.

Combining the same segmentation method or different segmentation methods over multiple color and feature spaces can effectively solve the two main problems of color image segmentation mentioned above and improve the segmentation result. The combined method is simple and does not require complex segmentation theories or models. The authors of Refs. [23-24] verified the feasibility and effectiveness of this strategy in medical, remote sensing, and natural scene image segmentation. However, there are relatively few research results on feature fusion in image segmentation because of the particular difficulties associated with the features [25-26].

In this paper, we use multiple color spaces for clustering, each handling its linear part, and then synthesize the segmentation results. For color space selection there are six candidates: red, green and blue (RGB); hue-saturation-value (HSV); brightness, in-phase, quadrature-phase (BIQ); XYZ (an international standard); CIELAB (LAB) (luminosity, a, b); and LUV (to further improve and unify color evaluation methods, the International Commission on Illumination proposed the unified LUV color space, where L represents the luminance of an object and U and V are chromaticities). Firstly, through experiments, we select two color spaces, RGB and LUV, and use a clustering method to initially segment the enhanced RGB and LUV images. Secondly, we use a second clustering step to fuse the two initial segmentation results into a fused segmented image. Finally, the segmentation result is obtained by region merging, which effectively solves the over-segmentation and mis-segmentation problems in natural image segmentation.

The overall structure of this paper is as follows. Section 1 introduces the experimental software and hardware configuration. Section 2 presents the detection algorithm, which is divided into three parts: bobbin image acquisition and extraction of the main regions; the specific method of yarn edge extraction and segmentation based on the Gabor filter; and the color space fusion algorithm based on multi-color space hierarchical clustering and segmentation, combined with Gabor edge detection, to achieve residual yarn detection. Section 3 presents the experiments and analysis, Section 4 further verifies the method on cases that are hard to distinguish, and Section 5 concludes the paper.

    1 Experimental Device

The experimental system contains the following equipment. A charge-coupled device (CCD) camera, MV-GED500C-T (Shenzhen Minvision, China), is used, with a resolution of 2 448 × 2 048 pixels, 9 frames per second, and an Ethernet interface, which meets the needs of the experiment and of actual production applications. For the lens, an industrial lens with a focal length of 25 mm (Zhejiang Dahua Technology Co., Ltd., China) is adopted. The illumination source (Hangzhou Hikvision Digital Technology Co., Ltd., China) is a downward-inclined white LED front light with a color temperature of 6 500 K. The experimental platform is shown in Fig.2. The computer runs Windows 7 with an i5-2500K@3.3 GHz processor, 4 GB of memory, and a GeForce GTX 750 graphics card. The framework is Visual Studio, and the OpenCV version is 3.4.3.

According to the bobbin inspection process, the whole system can be divided into a bobbin transfer module in the first stage, a bobbin online inspection module in the middle stage, and a bobbin management module in the last stage. In the first stage, the disordered bobbins are pre-sorted so that they enter the middle-stage online inspection in a neater arrangement, which improves inspection efficiency. The bobbin online inspection module is the core of the pipeline management system, and its result directly determines the effect of the later bobbin management module. The online inspection module mainly performs the detection of bobbins with yarns and the color classification of bobbins without yarns.

Yarn-carrying bobbin detection precedes the color classification of yarn-free bobbins, and it includes broken-bobbin detection, end detection and yarn-presence detection. The color classification of yarn-free bobbins is based on the head and body colors of the bobbin. The online inspection system analyzes each bobbin image captured from the conveyor belt with the corresponding image processing algorithms to compute the pixel length, end pixel width, edge detection, and color classification of the bobbin. According to the results of this calculation and analysis, the system communicates with the embedded master computer to separate broken-end, inverted-end and yarn-carrying bobbins from the conveyor belt and send them to the corresponding boxes. A qualified bobbin is processed for color analysis and moves on to the next module. The process diagram is shown in Fig.3.

Since the system works against a fixed-color background, to preserve the target area effectively it is necessary to remove the support frame, the background, and other factors: each captured image is pixel-masked to obtain a binary image containing only the values 0 and 1. In the initially segmented binary image, each pixel is judged against a similarity threshold; if it falls within the threshold range, its value is reset according to the processing relationship, and the desired color image is obtained. The whole process is depicted in Figs. 4-6.
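The masking step described above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the background color, tolerance, and function names are assumptions.

```python
import numpy as np

def mask_background(img, bg_color, tol):
    """Return a 0/1 mask: 1 where a pixel differs from the fixed background
    color by more than `tol` in at least one channel (i.e., the target)."""
    diff = np.abs(img.astype(int) - np.asarray(bg_color, dtype=int))
    return (diff.max(axis=-1) > tol).astype(np.uint8)

def keep_target(img, mask):
    """Zero out background pixels, keeping only the target region."""
    return img * mask[..., None]
```

Against a fixed-color background this simple per-channel test is usually enough; a real system would add morphological cleanup of the mask.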

The bobbin is located according to the optical band so that the experimental image can be cropped effectively to the region of interest (ROI). Bilateral filtering removes noise from the image, makes the contour of the bobbin clearer, and preserves image details, allowing the ROI to be extracted.

    2 Gabor Detection Algorithm

    2.1 Image acquisition of bobbin

As shown in Fig.7(a), the camera captures an image of the bobbin directly. Horizontal correction of the image is needed before cropping the main area of the bobbin. First, the minimum-area bounding rectangle of the bobbin region is obtained with OpenCV's built-in function; the rotation angle of the bobbin is then

    (1)
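Equation (1) is not recoverable from the source. As a rough numpy stand-in for the angle returned by OpenCV's minimum-area rectangle, the tilt of the bobbin can be estimated from the principal axis of its foreground pixels (PCA); this is an illustrative substitute, not the paper's formula.

```python
import numpy as np

def bobbin_angle_deg(mask):
    """Estimate the bobbin's tilt from a binary mask via PCA of the
    foreground pixel coordinates (a stand-in for cv2.minAreaRect)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)
    # Principal axis = eigenvector of the covariance with the largest eigenvalue
    cov = pts.T @ pts
    w, v = np.linalg.eigh(cov)
    ax = v[:, np.argmax(w)]
    return float(np.degrees(np.arctan2(ax[1], ax[0])))
```

The returned angle is defined up to 180°, which is sufficient for horizontal correction by rotation.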

    2.2 Gabor filter feature extraction

2.2.1 Extraction of yarn edges using odd partial Gabor filter

Gabor filter features are widely used as effective texture descriptors for target detection and recognition. In the spatial domain, a two-dimensional Gabor filter is obtained by multiplying a sinusoidal plane wave with a Gaussian kernel. Gabor filters are self-similar: any Gabor filter can be obtained by scaling and rotating a mother wavelet. In practice, Gabor filters can extract relevant features from images at different orientations and scales in the frequency domain. For the odd part of the two-dimensional Gabor filter, the functional expression is

G(x, y) = exp[−(x′² + γ²y′²)/(2σ²)]·sin(ωx′ + ψ)    (2)

where (x, y) is the pixel coordinate; x′ = xcos θ + ysin θ; y′ = −xsin θ + ycos θ; θ denotes the direction of the parallel stripes of the Gabor function, taking values from 0° to 360°; ω is the central frequency; σ is the variance of the Gaussian distribution; ψ is the phase of the sinusoidal function, and in general ψ = 0; γ is the aspect ratio, i.e., the spatial aspect ratio, which determines the ellipticity of the Gabor function. When γ = 1, the shape is circular, and when γ < 1, the shape is elongated in the direction of the parallel stripes. σ cannot be set directly; it varies with the bandwidth of the filter's half-response spatial frequency (defined as b). The relationship between them is

σ/λ = (1/π)·√(ln 2/2)·(2^b + 1)/(2^b − 1)    (3)

where λ is the wavelength of the sinusoidal factor; σ and λ determine the waveform of the odd part of the Gabor filter. When their product is constant, the center frequency only affects the effective response region, while the waveform remains constant. Depending on the number of pixels occupied by a single yarn in the image, a filter bank consisting of filters with the same center frequency is designed, and each bank contains filters with different orientations. In the experiments, σ/λ is set to 1.2, the directions are ±90° with respect to the horizontal, the number of groups is 3, and the center frequencies are 4.8, 5.6 and 6.4 Hz, respectively. The filter banks are convolved with each RGB channel component separately. The maximum value is taken over the different directions at the same center frequency, and the minimum value is taken across the center frequencies to form the filter output. The processing effect is shown in Fig.8. After parameter optimization, the filter suppresses the texture of the cylinder wall while enhancing the gray gradient of the bobbin.
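The kernel of Eq. (2) and the max-over-direction / min-over-frequency combination described above can be sketched in numpy. Kernel size, σ, and ω values below are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

def odd_gabor_kernel(ksize, sigma, omega, theta, gamma=1.0):
    """Odd (antisymmetric) Gabor kernel: Gaussian envelope times
    sin(omega * x'), i.e. the sine-phase filter with psi = 0."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xp = x * np.cos(theta) + y * np.sin(theta)
    yp = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xp ** 2 + (gamma * yp) ** 2) / (2 * sigma ** 2)) * np.sin(omega * xp)

def correlate_same(img, k):
    """Same-size cross-correlation (for an antisymmetric kernel, true
    convolution would only flip the sign of the response)."""
    H, W = img.shape
    kh, kw = k.shape
    ip = np.pad(img, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode="edge")
    out = np.zeros((H, W))
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * ip[i:i + H, j:j + W]
    return out

def bank_response(img, freq_params, thetas, ksize=15):
    """Max over orientations at each center frequency, then min across
    center frequencies, as described for the yarn-edge filter bank."""
    per_freq = []
    for sigma, omega in freq_params:
        resp = [correlate_same(img, odd_gabor_kernel(ksize, sigma, omega, th))
                for th in thetas]
        per_freq.append(np.max(resp, axis=0))
    return np.min(per_freq, axis=0)
```

In production, OpenCV's built-in Gabor kernel and filtering routines would replace the explicit loops for speed.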

2.2.2 Filter waveform optimization

The waveform and output response of the two-dimensional Gabor filter vary with σ, ω, θ, etc. Because of the band-pass characteristics of the two-dimensional Gabor filter, a multi-center-frequency, multi-direction filter bank is needed, processing the input signal according to yarn fineness so that the filter responds strongly to the edge of the cylinder while suppressing other textures. To enhance the edge detection of the odd-part Gabor filter, its waveform should have a more pronounced step feature at the axis x = 0; this step also yields a larger output response of the filter G(x, y). Let E be the integral of the response on one side of the central axis of G(x, y):

E = ∫₀^∞ exp[−x²/(2σ²)]·sin(ωx)dx    (4)

    After the integral operation, we can get

E = √(π/2)·σ·exp(−σ²ω²/2)·I[L(iσω/√2)]    (5)

where I(·) represents the imaginary part of the function and L(·) represents the error function. Equation (5) shows that if σ is known, then E follows a Dawson function of ωx (the frequency at x). The maximum of the Dawson function occurs at approximately 0.924 14. Therefore, when E reaches its maximum value, the following equation is obtained.

σω = √2 × 0.924 14 ≈ 1.31    (6)

Figure 9 shows the filter output response curves with λ/σ for the test images. The actual peak occurs around λ/σ = 6.4, and the actual output value is consistent with the theoretical output value.
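The Dawson-type dependence of E can be checked numerically by evaluating the one-sided integral of the odd-part response for a sweep of ω at fixed σ; the peak should occur near σω = √2 × 0.924 14 ≈ 1.307. The σ value and the grid below are illustrative.

```python
import numpy as np

def edge_response(sigma, omega, xmax=50.0, n=20001):
    """E = integral_0^inf exp(-x^2/(2 sigma^2)) * sin(omega x) dx,
    evaluated by the trapezoidal rule (the tail beyond xmax is negligible)."""
    x = np.linspace(0.0, xmax, n)
    f = np.exp(-x ** 2 / (2 * sigma ** 2)) * np.sin(omega * x)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

sigma = 2.0
omegas = np.linspace(0.05, 2.0, 400)
E = np.array([edge_response(sigma, w) for w in omegas])
best_sigma_omega = sigma * omegas[np.argmax(E)]
# Dawson's function peaks at z ~ 0.924 14, so sigma*omega ~ sqrt(2)*0.924 14
```

This confirms the location of the theoretical optimum; the experimentally observed λ/σ ≈ 6.4 in Fig.9 reflects the authors' own frequency convention and tuning.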

2.2.3 Filter angle selection

Figure 10 shows the ideal variation curve of the output response of the odd-part filter with θ. The maximum response is obtained when θ = 90°. To verify this, five filters were selected to filter the test images over the range 0° to 180° at 45° intervals. The filters perform best when their axes coincide with the step edge lines: θ = 90° and its centrosymmetric angle 270° both produce the ideal filtering effect. For a vertical bobbin, too many filters in different directions reduce operating efficiency and increase noise.

2.2.4 Filter center frequency selection

    Fig.1 Bobbins in factory: (a) yarn bobbins in textile mills; (b) multi-color and yarn-containing bobbins that require sorting

    Fig.2 Experimental platform

Fig.3 Process diagram: (a) yarn identification and detection system; (b) image acquisition device

    Fig.4 Image of bobbin

    Fig.7 Region of interest automatic process of image: (a) bobbin image; (b) main area of the extraction; (c) main area of bobbin

    Fig.8 Gabor rendering: (a) bobbin backbone area; (b) Gabor filtering effect of odd part

    Fig.9 Output response curve with λ/σ

    Fig.10 Output response curve with θ

    Fig.11 Theoretical output curve with respect to σ/d

    2.3 Fusion segmentation of multi-color spaces

The RGB model is the most common and basic color model in digital image processing. In the RGB color space, any color can be represented as a weighted mixture of the three primary components red, green and blue. The RGB color space, also known as the additive color space, is characterized by poor color uniformity. This color space model is shown in Fig.12(a). For general images, the luminance L ranges from 0 to 100, and the values of U and V lie between -100 and 100: +U is red, -U is green, +V is yellow, and -V is blue. The LUV color space model is shown in Fig.12(b).

    Fig.12 Different color spaces: (a) RGB color space; (b) LUV color space

The LUV components are obtained from XYZ through a nonlinear transformation. The specific equations are

u′ = 4X/(X + 15Y + 3Z), v′ = 9Y/(X + 15Y + 3Z)    (7)

L = 116(Y/Yn)^(1/3) − 16 for Y/Yn > 0.008 856, L = 903.3(Y/Yn) otherwise; U = 13L(u′ − u′n), V = 13L(v′ − v′n)    (8)

where Yn is 1.0; u′ and v′ describe the test color stimulus; and u′n and v′n are the chromaticities of the reference white point.
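The XYZ-to-LUV conversion can be written directly from these relations. The D65 white-point chromaticities used as defaults below are an assumption, since the paper only states Yn = 1.0.

```python
import numpy as np

def xyz_to_luv(X, Y, Z, Yn=1.0, upn=0.1978, vpn=0.4683):
    """CIE 1976 LUV from XYZ. Yn = 1.0 as in the text; the reference-white
    chromaticities u'_n, v'_n default to D65 values (an assumption)."""
    denom = X + 15.0 * Y + 3.0 * Z
    up = 4.0 * X / denom          # u' of the test color stimulus
    vp = 9.0 * Y / denom          # v'
    t = Y / Yn
    L = 116.0 * t ** (1.0 / 3.0) - 16.0 if t > (6.0 / 29.0) ** 3 else 903.3 * t
    return L, 13.0 * L * (up - upn), 13.0 * L * (vp - vpn)
```

For whole images, OpenCV's color conversion routines perform the same transformation channel-wise with internal scaling.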

    The proposed method of fusion and segmentation of multi-color space based on hierarchical clustering is shown in Fig.13.

    Fig.13 Multi-color space fusion and segmentation method flow

The specific steps are as follows.

1) The enhanced image is represented in the two color spaces RGB and LUV.

2) The first fuzzy C-means clustering is performed in each color space to obtain the two initial segmentation results R_RGB and R_LUV, each with K1 classes.

3) The local class-labeled histogram features of R_RGB and R_LUV are extracted and serialized into a fused feature vector, and the second fuzzy C-means clustering is performed to obtain the fused segmentation result S_fusion.

4) Region merging is conducted on S_fusion to obtain the final segmentation result S.
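Both clustering stages rely on fuzzy C-means. A minimal numpy sketch of the core update follows; the fuzzifier m, iteration count, and initialization are illustrative defaults, not the paper's settings.

```python
import numpy as np

def fuzzy_cmeans(X, k, m=2.0, iters=50, seed=0):
    """Minimal fuzzy C-means. X: (n, d) feature vectors. Returns cluster
    centers C and membership matrix U, where U[i, j] is the degree of
    sample i in cluster j (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), k))
    U /= U.sum(axis=1, keepdims=True)
    p = 1.0 / (m - 1.0)
    for _ in range(iters):
        W = U ** m
        C = (W.T @ X) / W.sum(axis=0)[:, None]            # weighted centers
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1) + 1e-12
        inv = d2 ** -p                                    # membership update
        U = inv / inv.sum(axis=1, keepdims=True)
    return C, U
```

In the pipeline above, the first pass clusters per-pixel color vectors in each space, and the second pass clusters the concatenated label-histogram features.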

2.3.1 Fusion and segmentation method flow based on hierarchical clustering

In color image segmentation, choosing an appropriate color space is difficult because complex natural scene images are hard to represent in a single color space. Different color space representations can be seen as images with different channels provided by different sensors, so an information fusion strategy combining complementary information from multiple color spaces is an effective way to improve segmentation. Through segmentation experiments on several groups of natural scene images, the two color spaces RGB and LUV are selected to represent the segmented image and perform fusion segmentation. Image enhancement can highlight light and dark changes in the image and increase the contrast between background and target, thereby effectively improving the segmentation of grayscale images. Color image segmentation after enhancement highlights the contours of the original image, reduces the number of segmentation blocks, and improves the result. Mathematical morphology image enhancement is a simple and effective method that describes the structural features of an image through the action of structural elements on the image.

    (9)

2.3.2 Initial segmentation based on fuzzy C-means clustering

In general, the color histogram has a high degree of freedom. Taking the RGB color space as an example, the color histogram has 256³ degrees of freedom. Here, each color component is uniformly quantized to P levels, so the color histogram has P³ degrees of freedom. For any pixel X in the image, the normalized local color histogram h1 is computed in a window R centered on X.

h1(i, R) = n_i/N_w, i = 1, 2, …, P³    (10)

where n_i is the number of pixels in window R whose quantized color falls into bin i, and N_w is the number of pixels in the window.
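The quantized local histogram can be sketched as follows; window size and quantization level P are illustrative, and border handling is simplified by clipping.

```python
import numpy as np

def local_color_histogram(img, cx, cy, win, P=8):
    """Normalized local color histogram h1 in a win x win window centered
    on (cx, cy): each RGB channel is uniformly quantized to P levels,
    giving P**3 bins."""
    h = win // 2
    patch = img[max(cy - h, 0):cy + h + 1, max(cx - h, 0):cx + h + 1]
    q = (patch.astype(int) * P) // 256              # per-channel level 0..P-1
    bins = (q[..., 0] * P + q[..., 1]) * P + q[..., 2]
    hist = np.bincount(bins.ravel(), minlength=P ** 3).astype(float)
    return hist / hist.sum()
```

With P = 8 this reduces the feature from 256³ to 512 bins per window, which keeps the first clustering stage tractable.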

2.3.3 Fusion of the initial segmentation of multi-color space

The fuzzy C-means clustering method is used to fuse the two initial segmentation results of K1 categories, and the fused segmentation result S_fusion with K2 categories is obtained.

For the two initial segmentation results, the feature vectors of the local class-labeled histograms centered on pixel x are extracted, and the class-labeled histograms h2 are calculated in the window R_x.

h2(j) = n_j/N_w, j = 0, 1, …, K1 − 1    (11)

where n_j is the number of pixels in the window labeled as class (j+1), and N_w is the number of pixels in the window R_x. The two local class-labeled histograms are then concatenated and normalized to obtain the fused local class-labeled histogram h2(R_x), whose dimension is the sum of the two label counts.

2.3.4 Regional consolidation

Since the clustering results are pixel-based, region merging must be performed on S_fusion to obtain a more complete description of the target. In the segmentation results, the distance between a region R and an adjacent region R_o is denoted as D_merging(R, R_o),

D_merging(R, R_o) = Σ_{c∈C} D_B[h_c(R), h_c(R_x)],  D_B[h(R), h(R_x)] = √(1 − Σ_{i=1}^{N_b} √(h(i, R)·h(i, R_x)))    (12)

where C = {RGB, LUV}; h(R) represents the normalized local color histogram of region R; h(R_x) represents the normalized local color histogram of a pixel x in the neighboring region R_o; D_B[h(R), h(R_x)] is the Bhattacharyya distance between the histograms; h(i, R) and h(i, R_x) are the frequencies of bin i in the histograms of R and R_x, respectively; and N_b is the number of bins in the histogram. By calculating the distance between R and all of its adjacent regions, the nearest adjacent region R_min is obtained. If the distance D_merging(R, R_min) is less than a threshold T, then R is merged into R_min. In the experiments of this paper, the segmentation of three different residual yarn quantities achieves a very good result, effectively restraining the bobbin texture and eliminating the influence of reflection. The experimental results are shown in Fig.14.
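The merging criterion sums a Bhattacharyya distance over the two color spaces. A minimal sketch, with histograms passed as per-space dicts (a representation chosen here for clarity, not taken from the paper):

```python
import numpy as np

def bhattacharyya(h1, h2):
    """Bhattacharyya distance between two normalized histograms:
    sqrt(1 - sum_i sqrt(h1[i] * h2[i]))."""
    return float(np.sqrt(max(0.0, 1.0 - np.sum(np.sqrt(h1 * h2)))))

def merging_distance(hists_R, hists_Ro):
    """D_merging over the color spaces C = {RGB, LUV}: dicts map a
    space name to its normalized histogram for each region."""
    return sum(bhattacharyya(hists_R[c], hists_Ro[c]) for c in hists_R)
```

Two identical regions give distance 0; regions with disjoint histograms give the maximum of 1 per color space, so a threshold T between 0 and 2 controls how aggressively regions merge.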

    Fig.14 Segmentation effect: (a) unsegmented image; (b)segmentation result image

    3 Experiment and Analysis

In this study, 150 bobbins are selected as the test image set, as shown in Fig.15. They are tested in the LUV color space, using chromaticity, saturation, and luminance in line with the judgment rules of the human eye. For simplicity, saturation is used as the variable, giving four cases. In the first, the saturation and luminance of the bobbin and yarns are completely distinct. In the second, the saturation of the bobbin and yarns is similar but the hue differs. In the third, the hue of the bobbin and yarns is similar but the saturation differs. In the fourth, the bobbin and yarns differ little in both hue and saturation, the industry phenomenon of "the same bobbin and the same yarn". In all four cases, bobbin texture, slight stains, etc. should be considered; from the classification point of view, bobbin texture belongs to the fourth category.

    Fig.15 Images for testing

Regarding the clustering, two clusters are theoretically enough: one for the color of the bobbin itself and one for the color of the yarn. In practice, however, the bobbin is reflective, so the bright band counts as a cluster, and the bobbin texture counts as another, giving four clusters in total; this increases the computational burden and is not suitable for low-cost situations, so three clusters are used. The Gabor filter is used for texture processing. The idea is that at the boundary of a cluster, the Gabor-binarized image is checked: if the yarn shape is satisfied and the width is greater than that of the texture, it is yarn; if both are texture, the texture is clustered further.

An experiment with the yellow bobbin is carried out because the yellow bobbin combines high saturation with a luminance close to that of the white bobbin, making yellow bobbin identification the best test set for validating the system. The main reasons are as follows.

1) The RGB of yellow is (255, 255, 0) and the RGB of magenta is (255, 0, 255). These two colors share the same component sum and are easily confused in a simple component-wise comparison, which means that colors easily distinguished by the naked eye can be mathematically very close.

2) Yellow is more likely to be contaminated and to fade.

3) Yellow (255, 255, 0) can appear visually close to white, as shown in Fig.16. Yellow bobbins and light white yarns, which are indistinguishable by normal methods, are theoretically highly susceptible to interference from the blue color component, resulting in a white appearance.

    Fig.16 Test process

    Fig.17 Comparison curves of different algorithms

This process is also the block diagram of our software, from which it can be seen that three methods are used. Firstly, the Gabor filter is used to get the approximate position of the yarns, as seen in Fig.16; the texture and yarns are mixed and the exact position of the yarns cannot be distinguished, so other means are needed. Secondly, RGB three-component clustering is used: the green component yields a possible yarn judgment, but stain interference must be excluded. Thirdly, three-component clustering in the LUV color space determines the location of the yarn from the L component; this "L" judgment is closer to reality. If the U and V components are simultaneously connected on the Y-axis, the bobbin can also be considered to carry yarn. The results of the three methods are fused with the method of this paper to derive the yarns and their specific locations. The accuracy of the method requires tuning the fusion parameters, which is not developed here. In this way, yarns and textures that are very close to each other can be differentiated and processed. The performance is ideal when the filter runs at a speed of 50 bobbins per second, as shown in Fig.16.

To test the accuracy and robustness of the detection algorithm, yarns of different colors and different kinds of bobbins are selected as test groups, as shown in Table 1. Yarns in five colors (white, black, blue, red and gray) with linear densities of 4.0 tex and 10.0 tex are selected, and the true positive rate (TPR) of the residual yarn skeleton is used as the evaluation index, defined as

Q_TPR = E_TP/A_(TP+FN)    (13)

where Q_TPR represents the accuracy rate, with a value range of [0, 1]; E_TP is the number of samples that actually carry residual yarns and are detected as such; and A_(TP+FN) is the total number of bobbins that actually carry residual yarns, i.e., those correctly detected plus those missed and reported as empty. The higher the value, the better the classification. The test results corresponding to Table 1 are shown in Table 2. Figure 17 shows the comparison curves, from which it can be seen that the accuracy of the traditional algorithm is below 65% while that of the proposed algorithm is above 80%.
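Read this way, Eq. (13) is the standard true positive rate; a minimal sketch with 0/1 labels (1 = residual yarn, 0 = empty):

```python
def true_positive_rate(actual, predicted):
    """Q_TPR = E_TP / A_(TP+FN): among bobbins that actually carry residual
    yarn (label 1), the fraction detected as carrying yarn."""
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    return tp / (tp + fn)
```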

    Table 1 Test groups

    Table 2 Accuracy of test results

    4 Experimental Verification of Indistinguishability

To further validate the proposed method, a few cases that are difficult even for the human eye to distinguish are presented below.

    4.1 Texture judgment

The bobbin in Fig.18 is very prone to error even when sorted manually. Here the Gabor filter alone does not work as well as required, as shown in Fig.18(c), so color clustering must be used to complete the yarn judgment. In the LUV color space, the L component acts most clearly; in the RGB color space, the blue component acts most clearly, so the state marker of the yarn can be judged after fusion.

    Fig.18 Texture judgment process: (a) camera’s capture; (b) bobbin automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    4.2 Loose yarn judgment

This case is less common but still occurs in a certain percentage of bobbins. Figure 19 shows an example where the yarn is not horizontal along the Y-axis but lies at an angle. With the adaptive Gabor filter used in Fig.18, the yarn contour is still distinguished, but when the yarn is close to the bobbin, the texture features are confused with the yarn features. In this identification, the concept of convex hulls is used, as shown in Fig.19(g).

    Fig.19 Yarn judgment process: (a) camera’s capture; (b) automatic identification; (c) Gabor filter; (d) L-component clustering; (e) U-component clustering; (f) V-component clustering; (g) R-component clustering; (h) G-component clustering; (i) B-component clustering

    5 Conclusions

In this paper, an odd partial Gabor filter and a multi-color space hierarchical clustering composite texture segmentation operator are used to detect residual yarns. Yarn segmentation is realized by optimizing the design of the Gabor filter banks and adjusting the parameters to maximize the odd-part amplitude within the band-pass range. To handle the specific yarn width, the most suitable center frequency is explored. By setting a reasonable filter combination, frequencies inconsistent with the yarn direction are removed, noise is suppressed and detection efficiency is improved.

At the same time, the method combines fusion segmentation based on RGB and LUV color space hierarchical clustering to solve the over-segmentation and mis-segmentation caused by the low color contrast between the target and the background. Image enhancement techniques are introduced into the color image segmentation so that the segmented image better reflects the contours of the original image and highlights the parts of interest.

    The results show that the algorithm can accurately detect yarn bobbins of different colors and brightness levels, and its optimization strategy provides a theoretical reference for research on non-contact bobbin sorting.
