
    Boundary-aware texture region segmentation from manga

    Computational Visual Media, 2017, Issue 1


    Xueting Liu1,2, Chengze Li1,2, and Tien-Tsin Wong1,2 ()

    Due to the lack of color in manga (Japanese comics), black-and-white textures are often used to enrich the visual experience. With the rising need to digitize manga, segmenting texture regions from manga has become an indispensable basis for almost all manga processing, from vectorization to colorization. Unfortunately, such texture segmentation is not easy, since textures in manga are composed of lines and exhibit similar features to structural lines (contour lines). So currently, texture segmentation is still manually performed, which is labor-intensive and time-consuming. To extract a texture region, various texture features have been proposed for measuring texture similarity, but precise boundaries cannot be achieved since boundary pixels exhibit different features from inner pixels. In this paper, we propose a novel method which also adopts texture features to estimate texture regions. Unlike existing methods, the estimated texture region is only regarded as an initial, imprecise texture region. We expand the initial texture region to the precise boundary based on local smoothness via a graph-cut formulation. This allows our method to extract texture regions with precise boundaries. We have applied our method to various manga images, and satisfactory results were achieved in all cases.

    manga; texture segmentation

    1 Introduction

    Manga is a world-wide popular form of entertainment enjoyed by people of all ages (see Fig. 1). Nowadays, with the development of electronic devices, more and more people read manga on electronic devices such as computers, tablets, and even cellphones. With the power of electronic devices, manga can be presented in a visually enriched form by adding color, motion, or stereoscopic effects. There is thus a rising trend in the manga industry to convert legacy manga books into digital versions. During the digitization process, one major challenge is to segment texture regions from manga as a basis for various applications such as vectorization and colorization. However, texture region segmentation is not easy for manga, since textures in manga are composed of lines. This leads to the difficulty of discriminating structural lines from textures, and further to the difficulty of identifying the precise boundary of each texture region. Therefore, texture region segmentation is still performed manually in the manga industry. As one may imagine, this process is quite tedious and time-consuming.

    Fig. 1 Manga images.

    To help identify and classify textures, various texture features have been proposed in the computer vision field [1]. The similarity of the texture features of two pixels shows whether these two pixels are inside the same texture region. Texture segmentation or texture classification techniques can be further applied to extract texture regions based on similarity of texture features. However, since texture features are analyzed within a local neighborhood, pixels in the interior of a texture region (e.g., the blue box in Fig. 2) and pixels near the boundary of a texture region (e.g., the red and orange boxes in Fig. 2) exhibit different texture features (see Fig. 2(c)), even though they belong to the same texture region. Thus, pixels near texture boundaries may be mistakenly regarded as not belonging to that texture region. This leads to imprecise boundaries of the segmented texture region if only similarity of texture features is considered (see Fig. 3).

    To resolve the boundary issue, we note that texture smoothing techniques are quite powerful in suppressing local textures while still preserving sharp boundaries. One might therefore first smooth the input manga image using a texture smoothing method, and then perform intensity-based image segmentation to extract texture regions. However, a texture region may have a spatially varying texture (e.g., the background region in Fig. 4(a) changes from dark to light vertically). Furthermore, texture smoothing is also incapable of differentiating textures with similar overall intensities (as in Fig. 4(b)). Therefore, we still need to analyze texture features in order to robustly handle spatial-varying textures and textures with similar intensities.

    Fig. 3 Texture region segmentation based on Gabor features. Note that a precise boundary cannot be achieved.

    Fig. 4 (a) A background region with a spatial-varying texture. (b) Different textures may have similar intensities after smoothing.

    In this paper, we propose a novel, user-interaction-based texture region segmentation system, which integrates texture feature analysis and texture smoothing techniques in order to segment texture regions with precise boundaries from manga images. The user may draw one or several strokes inside the region or object to be segmented. Our system automatically estimates a region mask from the user input. To do so, we first summarize the texture features of the user-drawn strokes, and estimate an initial texture region having similar texture features to the user-drawn strokes. In this step, we adopt a conservative similarity measurement for estimating the initial texture regions in order to guarantee that all pixels inside the initial texture region lie within the user-specified region. We then expand the initial texture region to the precise boundaries using a graph-cut formulation. We formulate the accumulated smoothness difference from the initial texture region as the data cost, and the local smoothness difference as the smoothness cost. Using this formulation, we can extract the user-specified region with a precise boundary.

    We demonstrate the effectiveness of our texture region segmentation system on a set of manga images containing various types of textures, including regular textures, irregular textures, and spatial-varying textures. Our contributions can be summarized as follows:

    • We propose a novel user-interaction-based system for extracting texture regions with precise boundaries.

    • Our system can handle user strokes drawn across multiple different textures simultaneously.

    2 Related work

    While only a little existing research is tailored to extracting texture regions from manga, extensive research has been done on identifying and segmenting textures in natural photos. We can roughly classify this related research into three categories: feature-based texture segmentation, regular texture analysis, and texture smoothing.

    Feature-based texture segmentation. Texture features are based on statistical models describing local characteristics of point neighborhoods. The statistical model of a texture feature usually covers a range of texture properties, such as size, aspect ratio, orientation, brightness, and density [3]. Various texture features have been proposed to describe and model textures, including Gabor filters [4], filter bank responses [5], random field models, wavelet representations, and so on. In particular, the Gabor filter was adopted by Ref. [6] to analyze textures in manga, and is still considered to be the state-of-the-art method. Texture features are utilized in various applications such as texture classification, segmentation, and synthesis. Texture segmentation methods fall into two categories, supervised [8] and unsupervised [6, 7]. In particular, Ref. [6] proposed to segment textures in manga images via a level-set method based on Gabor features. However, as we have stated, although feature-based texture segmentation methods can identify textures well and differentiate between textures, precise region boundaries cannot be achieved, since boundary pixels exhibit different texture features from interior pixels. In contrast, our method can extract texture regions with precise boundaries.

    Regular texture analysis. For regular or near-regular texture patterns, attempts have been made to detect and analyze the regularity of the textures based on spatial relationships [9–11]. In particular, Liu et al. [12] considered how to detect and remove fences in natural images. A stream of research has also considered de-screening, i.e., detecting and smoothing halftone textures; see, e.g., Ref. [13]. Kopf and Lischinski [14] discussed how to extract halftone patterns in printed color comics by modeling dot patterns. Very recently, Yao et al. [15] considered how to extract textures from manga by modeling three specific texture primitives: dots, stripes, and grids. However, these methods can only handle a small set of pre-defined regular textures. In comparison, our method can handle regular or irregular, and even spatial-varying, textures of the kind that exist in real manga images.

    Texture smoothing. In order to differentiate textures and structures, various edge-preserving texture smoothing methods have been proposed, such as the total variation regularizer [16–19], bilateral filtering [20–22], local histogram-based filtering [23], weighted least squares [24], extrema extraction and extrapolation [25], L0 gradient minimization [26], and relative total variation (RTV) [2]. While these texture smoothing methods may suppress local oscillations based on local information, they are incapable of identifying a texture region or differentiating between two textures. This is because these methods do not model textures or structures, so they do not have a higher-level understanding of the semantics of the textures. In this paper, we thus utilize texture features to identify textures, but we incorporate texture smoothing techniques to identify sharp texture boundaries.

    3 Overview

    The input to our system consists of a manga image (see Fig. 5(a)) and one or more user-specified strokes (see Fig. 5(b)). To extract regions with textures similar to the ones identified by the user-specified strokes, we first summarize the texture features of the pixels belonging to the strokes. The texture features we use are Gabor features, which are also used by Ref. [6] in a manga colorization application. Since textures may vary spatially, and one user-specified stroke may go across several texture regions (see Fig. 5(b)), we summarize several main texture features inside each stroke by clustering. Then we calculate the similarity between the texture feature of each pixel in the manga image and the clustered main texture features, to form a texture similarity map (see Fig. 5(c)). In this map, the intensity of a pixel indicates the similarity of its texture feature to those of the user-specified strokes. Based on the computed texture similarities, we then obtain one or several initial texture regions using a graph-cut formulation (see Fig. 5(d)). Our initial texture region extraction method is detailed in Section 4.

    Fig. 5 System overview (image size: 778×764, loading time: 3.51 s, processing time: 0.58 s).

    We then expand the initial texture regions to their precise boundaries. To do so, we first obtain a smoothed image from the input image using texture smoothing techniques (see Fig. 5(e)). Amongst all existing texture smoothing techniques, we found that the RTV metric proposed by Ref. [2] is the most effective at smoothing textures while preserving sharp structures in manga images. In the smoothed image, if two neighboring pixels have close intensity values, these two pixels are very likely to be inside the same texture region. Conversely, if two neighboring pixels have a jump in intensity values, they are very likely to be inside two different regions. Therefore, we can diffuse the initial texture regions using local intensity continuity of the smoothed image to obtain a diffusion map (see Fig. 5(f)). This diffusion map shows how smoothly each pixel is connected to the initial regions. If a pixel has a low diffusion value, it is smoothly connected to the initial region, so it is very likely to be inside the same texture region. Conversely, a pixel with a high diffusion value is very likely to lie outside the texture region. Finally, we extract the precise texture region based on the diffusion map using another graph-cut formulation (see Figs. 5(g) and 5(h)). Our region expansion method is detailed in Section 5.

    We have evaluated our system with various manga,and the results are shown in Section 6.We also show how users can easily adjust the retrieved texture region using a single parameter.

    4 Initial region extraction

    Given an input manga image, and one or more user-specified strokes, we first extract the initial regions with textures similar to the user-specified strokes. To do so, we first summarize the texture features of the pixels inside the strokes. Then we obtain a texture similarity map which gives the texture similarity between each pixel in the image and the summarized texture features. The initial regions are then extracted using a graph-cut formulation.

    4.1 Texture feature summarization

    To judge whether two pixels have similar textures, we use statistical features in the Gabor wavelet domain [27], which have already proved useful in differentiating textures in manga [6]. A Gabor feature vector is an M×N-dimensional vector, where M is the number of scales and N is the number of orientations used in the Gabor feature analysis. In this paper, we fix the numbers of scales and orientations to M = 4 and N = 6 respectively in all our experiments. Therefore, for a pixel p in the manga image, its Gabor feature vector G_p describing the local texture around p is 24-dimensional.
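    The Gabor feature extraction described above can be sketched as follows. This is a minimal illustration rather than the exact filter bank of Ref. [27]: the frequency spacing, envelope widths, and kernel sizes below are assumed values, and only the real (cosine) part of each filter is used.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(size, sigma, theta, freq):
    """Real part of a Gabor filter: a Gaussian envelope modulated
    by a cosine wave of the given frequency and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinates
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * freq * xr)

def gabor_features(image, scales=4, orientations=6):
    """Per-pixel Gabor feature volume of shape (H, W, scales * orientations):
    with 4 scales and 6 orientations, each pixel p gets a 24-dim vector G_p."""
    H, W = image.shape
    feats = np.empty((H, W, scales * orientations))
    i = 0
    for s in range(scales):
        freq = 0.25 / (2 ** s)      # assumed dyadic frequency spacing
        sigma = 2.0 * (2 ** s)      # envelope widens with scale
        size = 2 * int(2 * sigma) + 1
        for o in range(orientations):
            theta = o * np.pi / orientations
            k = gabor_kernel(size, sigma, theta, freq)
            # response magnitude as the per-pixel feature entry
            feats[..., i] = np.abs(convolve(image, k, mode='nearest'))
            i += 1
    return feats
```

    With M = 4 and N = 6 as in the text, `gabor_features` returns a 24-channel volume, one feature vector G_p per pixel, which the similarity computation in Section 4.2 then operates on.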

    Given a set of user-specified strokes U = {u_1, u_2, ...}, we could calculate a main texture feature for these strokes by averaging the texture features of all pixels inside the strokes as

    G_U = (1/|U|) Σ_{p∈U} G_p

    where |U| is the cardinality of U (here U is regarded as the set of all stroke pixels). However, a textured area may have a spatial-varying texture, or a single user-specified stroke may go across multiple textured areas at the same time. For example, the moon in Fig. 5 cannot be specified by a single texture feature vector. Therefore, we represent the textures determined by the user-specified strokes using multiple texture feature vectors. To extract the most representative textures for the user-specified strokes, we use the k-means clustering method to cluster the texture features of all pixels inside the strokes into k groups as T_U = {T_1, ..., T_k}. These satisfy:

    T_U = argmin Σ_{i=1}^{k} Σ_{p∈U_i} ‖G_p − T_i‖²

    where U_i is the set of stroke pixels assigned to the i-th cluster.
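    The stroke-feature summarization can be sketched with a plain Lloyd's k-means over the Gabor vectors of the stroke pixels. The deterministic spread initialization below is an assumption made for reproducibility; any standard k-means initialization would serve.

```python
import numpy as np

def summarize_stroke_features(stroke_feats, k=3, iters=20):
    """Cluster the Gabor feature vectors of all stroke pixels into k
    representative textures T_U = {T_1, ..., T_k} via Lloyd's k-means."""
    X = np.asarray(stroke_feats, dtype=float)          # (n_pixels, 24)
    # deterministic initialization: centers spread over the stroke pixels
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # assign each pixel's feature vector to its nearest representative
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each representative to the mean of its assigned features
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers
```

    The returned rows are the representative feature vectors T_1, ..., T_k against which every image pixel is compared in Section 4.2.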

    4.2 Initial region extraction via graph-cut

    From the summarized representative texture feature vectors of the user-specified strokes, we then calculate the texture similarity value between each pixel p and the representative textures T_U as

    M_p = max_{1≤i≤k} exp(−‖G_p − T_i‖)

    Pixels with higher texture similarity are more likely to be inside the texture region specif i ed by the user.

    Using the calculated texture similarity, we extract an initial texture region via a graph-cut formulation. There are two terminal nodes, the source and the sink, in our graph. Each pixel p in the image corresponds to a non-terminal node n_p, which is connected to both source and sink. If the graph-cut result connects a non-terminal node (pixel) to the source, this pixel lies inside the initial regions. Otherwise, this pixel lies outside the initial regions. Each edge connecting a non-terminal node and a terminal node is associated with a data cost:

    D(n_p, source) = M_p,  D(n_p, sink) = 1 − M_p

    For every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q by an edge. Each edge connecting two non-terminal nodes is associated with a smoothness cost which measures our confidence that these two neighboring pixels should be assigned the same label, and therefore belong to the same texture region. We model the smoothness cost as the magnitude of the difference of the texture feature vectors of these two nodes n_p and n_q:

    S(n_p, n_q) = ‖G_p − G_q‖

    Intuitively speaking, if two neighboring pixels p and q have similar textures, the smoothness cost S(n_p, n_q) should be low, and there is a high probability for them to be assigned the same label, and so belong to the same texture region.

    After constructing the graph, we can obtain the optimal cut through an optimization process which minimizes the energy function:

    E(u) = Σ_p D(n_p, u_p) + w_c Σ_{(p,q)} S(n_p, n_q)·δ(u_p ≠ u_q)

    where u_p ∈ {source, sink} is the label of n_p, and w_c weights the data cost and smoothness cost. We experimentally set w_c to 1 in all our experiments. The pixels labeled as source after graph-cut form the initial regions. Since other regions might also have similar patterns to the user-specified region, we remove regions that do not intersect the user-specified strokes. For regions that contain spatial-varying textures, different strokes may lead to different initial regions (see Fig. 6) and affect the subsequent expansion. The user-specified strokes should go across all the different textures in order to achieve good segmentation results.
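    The energy minimized by this first graph cut can be made concrete by evaluating it for a given binary labeling, which makes the roles of the data and smoothness terms explicit. The exact cost expressions below are assumptions consistent with the description above: inside-labeled pixels pay 1 minus their similarity, outside-labeled pixels pay their similarity, and label changes across 4-connected neighbors pay the feature-difference magnitude weighted by w_c. A max-flow/min-cut solver would search for the labeling minimizing this energy.

```python
import numpy as np

def initial_region_energy(labels, similarity, feats, wc=1.0):
    """Energy of a binary labeling (1 = source/inside, 0 = sink/outside)
    over an (H, W) image with per-pixel similarity map M and per-pixel
    feature volume feats of shape (H, W, F)."""
    labels = np.asarray(labels)
    # data term: inside pixels pay (1 - M_p), outside pixels pay M_p
    data = np.where(labels == 1, 1.0 - similarity, similarity).sum()
    # smoothness term: feature-difference magnitude, charged only where
    # the label changes between 4-connected neighbors (Potts-style)
    smooth = 0.0
    for axis in (0, 1):
        diff = np.linalg.norm(np.diff(feats, axis=axis), axis=-1)
        cut = np.diff(labels, axis=axis) != 0
        smooth += diff[cut].sum()
    return data + wc * smooth
```

    On a toy similarity map, the labeling that agrees with the similarity evidence has lower energy than a labeling that floods the whole image, which is exactly the preference the min-cut encodes.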

    5 Initial region expansion

    Fig. 6 Initial region extraction from strokes. Top: several user-specified strokes. Bottom: corresponding extracted initial regions.

    Starting from the extracted initial regions, we now show how we expand the regions to their precise boundaries. In short, we first smooth the input image and diffuse the initial regions based on the smoothed image. Then we extract the final texture region with a precise boundary via another graph-cut formulation.

    5.1 Smoothness map construction

    To smooth a manga image, we have experimentally determined that the relative total variation method proposed by Xu et al. [2] performs best among the large pool of existing methods. This is mainly because this measure is more tolerant of high-contrast textures than other methods. This method has two input parameters, λ and σ. Here, λ is a weighting factor which we set to 0.015 in all our experiments, and σ is the local window size for measuring local oscillations (textures). In our experiments, we found that σ = 5 works best for most textures, but when the texture is sparser, we may need to assign σ a higher value. Some texture smoothing results for manga images are shown in Figs. 4 and 5(e).

    In the smoothed image, if two neighboring pixels have similar intensity values, it is very likely that they are in the same texture region, and vice versa. We also observe that texture regions in manga images are usually enclosed by black boundary lines. Therefore, we can judge whether a pixel is likely to be inside the user-specified region by measuring whether this pixel is smoothly connected to the initial regions. Here, by smoothly connected, we mean that there exists a path from this pixel to the initial regions along which the intensity values of the pixels change smoothly. Formally, given the initial regions R and a pixel p outside the initial region, we define a path from R to p as a sequence of pixels h(R,p) = {q_1, ..., q_l} where ‖q_{i+1} − q_i‖_1 = 1, q_1 ∈ R, and q_l = p. Here, ‖·‖_1 is the L1-norm operator, so consecutive pixels in the path are 4-connected neighbors. We can measure whether p is smoothly connected to R along a path h(R,p) by accumulating the intensity differences along this path:

    S(h(R,p)) = Σ_{i=1}^{l−1} |J_{q_{i+1}} − J_{q_i}|

    where J is the smoothed manga image. Since there is more than one path from p to R, we can measure whether p is smoothly connected to R by taking the minimal smoothness value over all possible paths:

    F_p = min_{h(R,p)} S(h(R,p))

    In a practical implementation, we compute the above smoothness values via a diffusion process. More concretely, we first construct a diffusion map F by setting pixels inside the initial regions to 0 and pixels outside the initial regions to +∞. Then we iteratively update the smoothness value of each pixel based on its surrounding pixels using:

    F_p ← min(F_p, min_{q∈N(p)} (F_q + |J_p − J_q|))

    where N(p) is the set of 4-connected neighbors of p.

    We visualize the diffusion process in Fig. 7.
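    The diffusion process above can be sketched as an iterative relaxation over the image grid. The vectorized update below is an illustrative implementation, not the paper's code: each pass takes the minimum over the four neighbor directions, and the wrap-around entries introduced by `np.roll` are masked out so that only true 4-connected neighbors contribute.

```python
import numpy as np

def diffusion_map(J, init_mask, iters=200):
    """Relax F_p = min(F_p, min_{q in N(p)} F_q + |J_p - J_q|) over a
    smoothed image J, starting from F = 0 inside the initial regions
    (init_mask True) and +inf elsewhere."""
    F = np.where(init_mask, 0.0, np.inf)
    for _ in range(iters):
        prev = F.copy()
        for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
            Fq = np.roll(F, shift, axis=axis)
            Jq = np.roll(J, shift, axis=axis)
            cand = Fq + np.abs(J - Jq)
            # invalidate the row/column that wrapped around the border
            idx = 0 if shift == 1 else -1
            if axis == 0:
                cand[idx, :] = np.inf
            else:
                cand[:, idx] = np.inf
            F = np.minimum(F, cand)
        if np.array_equal(prev, F):   # converged: no value changed
            break
    return F
```

    On an image with two flat regions separated by an intensity jump, F stays near 0 inside the seeded region and rises by the jump magnitude across the boundary, which is exactly the behavior the final graph cut exploits.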

    5.2 Final region extraction via graph-cut

    While we could extract a final texture region by thresholding the diffusion map, we have found that naive thresholding generally leads to bumpy and leaky boundaries. To avoid these issues, we formulate another graph to extract the final region. As in the previous graph-cut formulation, each pixel p is represented by a non-terminal node n_p, which is connected to two terminal nodes, the source and the sink. If the graph-cut result labels a non-terminal node (pixel) as connected to the source, this pixel is inside the final texture region; otherwise, it is outside. The data cost associated with each edge connecting a terminal node and a non-terminal node measures how likely this pixel is to be smoothly connected to the initial region based on the smoothness map, and is expressed as

    D(n_p, source) = exp(−F_p/σ_s),  D(n_p, sink) = 1 − exp(−F_p/σ_s)

    Fig. 7 The diffusion process.

    where σ_s is empirically set to 0.05 in all our experiments. Intuitively speaking, if a pixel p has a low smoothness value F_p, D(n_p, source) should be relatively high, and there is a high probability that n_p will be connected to the source. Similarly, for every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q with an edge. The smoothness cost associated with each such edge measures how likely the two neighboring pixels are to have the same label, and is given by

    S(n_p, n_q) = exp(−|J_p − J_q|)

    Intuitively, if the intensity values of two neighboring pixels are similar in the smoothed image, there is a high probability that they are in the same texture region. Finally, we solve this graph-cut problem by minimizing the following energy function:

    E(u) = Σ_p D(n_p, u_p) + w_v Σ_{(p,q)} S(n_p, n_q)·δ(u_p ≠ u_q)

    where u_p ∈ {source, sink} is the label of n_p, and w_v weights the data and smoothness costs. We empirically set w_v to 0.25 in all our experiments. After graph-cut, the pixels assigned to the source form the final texture region.
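    The cost terms feeding this second graph cut can be sketched directly from the diffusion map and the smoothed image. The exponential forms below are assumptions: this extracted text states the intuition and σ_s = 0.05, but not the exact formulas, so the sketch reconstructs them so that a low diffusion value F_p yields a high source affinity and similar smoothed intensities yield an expensive-to-cut neighbor edge.

```python
import numpy as np

def final_costs(F, J, sigma_s=0.05):
    """Terminal and neighbor costs for the final graph cut, built from the
    diffusion map F and the smoothed image J (assumed exponential forms)."""
    to_source = np.exp(-F / sigma_s)   # high when p is smoothly connected
    to_sink = 1.0 - to_source          # complementary sink affinity
    # neighbor costs: high when smoothed intensities are close, so cutting
    # between similar pixels is expensive and boundaries fall on jumps
    s_vert = np.exp(-np.abs(np.diff(J, axis=0)))
    s_horz = np.exp(-np.abs(np.diff(J, axis=1)))
    return to_source, to_sink, s_vert, s_horz
```

    Any standard max-flow/min-cut solver can then take these as terminal and pairwise capacities; the min cut tends to follow intensity jumps in J, where the neighbor cost is smallest.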

    5.3 User control

    Given a set of user-specified strokes, while our system quite stably extracts texture regions based on a set of pre-defined parameters, we also allow user control. We let the user control the final region by adjusting a single parameter z ∈ [−1, 1]. The smaller z is, the smaller the final texture region will be, and vice versa. We achieve this by incorporating z into the graph-cut formulation; in particular, the data cost is re-defined as

    where P(v, c1, c2) is a piecewise function defined as

    If z is set to −1, all pixels are labeled as sink and the extracted region is empty. If z is set to 1, all pixels are labeled as source and the extracted region is the whole image. The default value of z is 0. We show an example of parameter tuning in Fig. 8. Even though the user can control the extracted region with this parameter, our method is quite stable; in fact, the extracted region is constant for z ∈ [−0.6, 0.6] in this case.

    6 Results and discussion

    6.1 Validation

    To validate the effectiveness of our method, we have applied it to manga images with a variety of texture patterns, including regular patterns (e.g., as in Fig. 9), near-regular patterns (e.g., as in Fig. 10), and irregular patterns (e.g., as in Figs. 5 and 11). We also compare our results with two state-of-the-art methods tailored for manga colorization [6] and manga vectorization [15], respectively.

    Figure 9(a) shows a manga image of a cat with a dot pattern. While the feature-based method [6] fails to detect precise boundaries of the texture regions (see Fig. 9(b)), the primitive-based method [15] correctly detects the regions by formulating a specific dot pattern model (see Fig. 9(c)). However, Yao et al.'s method makes very strong assumptions about the primitives in the textures, so it can only handle textures such as dots, stripes, and grids well. In comparison, we make no assumption about the primitives in the textures, but our method can still achieve results similar to those in Ref. [15] by use of texture feature analysis and smoothness diffusion (see Fig. 9(d)). Figure 10 shows another comparison between the results of Ref. [6] and our method. While both their method and ours analyze the texture of the user-specified stroke, their method only uses a single texture feature to represent the whole stroke. Therefore, their method is incapable of finding texture regions that are spatial-varying (see Fig. 10(b)). In contrast, our method can handle spatial-varying textures well (see Fig. 10(c)).

    Fig. 8 User control of the extracted region via a single parameter z.

    Fig. 10 ‘Boy’ (1712×907, loading time: 9.72 s, processing time: 2.07 s).

    Fig. 11 ‘Cars’ (762×843, loading time: 4.03 s, processing time: 1.38 s).

    In Figs. 5 and 11 the user specifies texture regions with large spatial variation, especially in Fig. 11. By using a set of texture feature vectors to represent the user-specified strokes, our method successfully extracts the texture regions with precise boundaries. Since a manga image may contain multiple textures, we also allow the user to specify several sets of strokes indicating different texture regions. The user-specified strokes are processed sequentially so that the user can control the extracted regions more easily. We show three examples in Figs. 12–14, where each input image contains multiple different textures including solid-color regions, regular texture regions, and irregular texture regions. Our method achieves good results in all cases.

    6.2 Timing statistics

    Fig. 12 ‘Poker’ (838×1210, loading time: 7.05 s, processing time: 4.72 s).

    Fig. 13 ‘Basketball’ (1251×1013, loading time: 7.97 s, processing time: 6.27 s).

    Fig. 14 ‘Astonished’ (1251×1013, loading time: 7.10 s, processing time: 5.36 s).

    All of our experiments were conducted on a PC with a 2.7 GHz CPU and 64 GB memory; all tests used single-threaded, unoptimized code, and no GPU. We break down the computational time for each example into two parts, the loading time and the processing time (given in the caption of each figure). The loading time is the time spent immediately after the user loads an image into the system, and can be regarded as the offline computation time. The processing time is the time spent by the system returning the extracted region after the strokes have been drawn, and can be regarded as the online computation time. Whenever the user draws a new stroke or adjusts the control parameter, only the online parts need to be re-executed. We observe that the total computation time depends strongly on the resolution of the input image.

    6.3 Limitations

    One of our limitations concerns the latent assumption that the boundaries between two regions are sharp and smooth. Currently, we cannot handle blurred boundaries well. Furthermore, if the boundary of a region is quite spiky (e.g., the shock balloons in Fig. 14(a)), our current graph-cut formulation will result in a smoothed boundary (e.g., the blue regions in Fig. 14(b)). Our method also cannot separate neighboring regions if they are visually inseparable. For example, in Fig. 14(a), the black boundary of the top left shock balloon is connected to the black background of the second panel. Furthermore, the boundary in the smoothed image may deviate by one or two pixels from the original boundary due to limitations of the texture smoothing technique. In this case, we may also fail to extract the precise boundaries of the texture regions.

    7 Conclusions

    In this paper, we have proposed a novel system to extract texture regions with precise boundaries. Our method starts from an input image and a set of user-specified strokes, and extracts initial regions containing pixels with similar textures, using Gabor wavelets. However, texture features such as Gabor wavelets cannot provide precise boundaries. We therefore smooth the original image via a texture smoothing technique, and refine the initial regions based on the smoothed image. Our method outperforms existing methods in extracting precise boundaries, especially for spatial-varying textures.

    While our method currently assumes hard boundaries, we could adopt matting techniques instead of the current graph-cut formulation to recover regions with alpha values. We also note that identification of regions depends highly on the semantics of the image content, and introducing perception-based edge extraction techniques could help extract more precise boundaries.

    Acknowledgements

    This project was supported by the National Natural Science Foundation of China (Project No. 61272293) and the Research Grants Council of the Hong Kong Special Administrative Region under the RGC General Research Fund (Project Nos. CUHK14200915 and CUHK14217516).

    [1]Tuytelaars,T.;Mikolajczyk,K.Local invariant feature detectors:A survey.Foundations and Trends in Computer Graphics and Vision Vol.3,No.3,177–280, 2008.

    [2]Xu,L.;Yan,Q.;Xia,Y.;Jia,J.Structure extraction from texture via relative total variation. ACM Transactions on Graphics Vol.31,No.6,Article No.139,2012.

    [3]Julesz,B.Textons,the elements of texture perception, and their interactions.Nature Vol.290,91–97,1981.

    [4]Weldon,T.P.;Higgins,W.E.;Dunn,D.F.Effi cient Gabor f i lter design for texture segmentation.Pattern Recognition Vol.29,No.12,2005–2015,1996.

    [5]Varma,M.;Zisserman,A.Texture classif i cation: Are f i lter banks necessary?In:Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition,Vol.2,II-691-8,2003.

    [6]Qu,Y.;Wong,T.-T.;Heng,P.-A.Manga colorization. ACM Transactions on Graphics Vol.25,No.3,1214–1220,2006.

    [7]Hofmann,T.;Puzicha,J.;Buhmann,J.M. Unsupervised texture segmentation in a deterministic annealing framework.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.20,No.8,803–818,1998.

    [8]Paragios,N.;Deriche,R.Geodesic active regions for supervised texture segmentation.In:Proceedings of the 7th IEEE International Conference on Computer Vision,Vol.2,926–932,1999.

    [9]Hays,J.;Leordeanu,M.;Efros,A.A.;Liu, Y.Discovering texture regularity as a higher-order correspondence problem.In:Computer Vision–ECCV 2006.Leonardis,A.;Bischof,H.;Pinz,A.Eds. Springer Berlin Heidelberg,522–535,2006.

    [10]Liu,Y.;Collins,R.T.;Tsin,Y.A computational model for periodic pattern perception based on frieze and wallpaper groups.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.26,No.3,354–371,2004.

    [11]Liu,Y.;Lin,W.-C.;Hays,J.Near-regular texture analysis and manipulation.ACM Transactions on Graphics Vol.23,No.3,368–376,2004.

    [12]Liuy,Y.;Belkina,T.;Hays,J.H.;Lublinerman,R. Image de-fencing.In:Proceedings of IEEE Conference on Computer Vision and Pattern Recognition,1–8, 2008.

    [13]Siddiqui,H.;Boutin,M.;Bouman,C.A.Hardwarefriendly descreening.IEEE Transactions on Image Processing Vol.19,No.3,746–757,2010.

    [14]Kopf,J.;Lischinski,D.Digital reconstruction of halftoned color comics.ACM Transactions on Graphics Vol.31,No.6,Article No.140,2012.

    [15]Yao,C.-Y.;Hung,S.-H.;Li,G.-W.;Chen,I.-Y.; Adhitya,R.;Lai,Y.-C.Manga vectorization and manipulation with procedural simple screentone.IEEE Transactions on Visualization and Computer Graphics Vol.23,No.2,1070–1084,2017.

    [16]Aujol,J.-F.;Gilboa,G.;Chan,T.;Osher,S.Structuretexture image decomposition—Modeling,algorithms, and parameter selection.International Journal of Computer Vision Vol.67,No.1,111–136,2006.

    [17]Meyer,Y.Oscillating Patterns in Image Processing and Nonlinear Evolution Equations:The Fifteenth Dean Jacqueline B.Lewis Memorial Lectures. American Mathematical Society,2001.

    [18]Rudin,L.I.;Osher,S.;Fatemi,E.Nonlinear total variation based noise removal algorithms.Physica D: Nonlinear Phenomena Vol.60,Nos.1–4,259–268, 1992.

    [19]Yin,W.;Goldfarb,D.;Osher,S.Image cartoontexture decomposition and feature selection using the total variation regularized L1functional.In: Variational,Geometric,and Level Set Methods in Computer Vision.Paragios,N.;Faugeras,O.;Chan, T.;Schn¨orr,C.Eds.Springer Berlin Heidelberg,73–84,2005.

    [20]Durand,F.;Dorsey,J.Fast bilateral f i ltering for the display of high-dynamic-range images.ACM Transactions on Graphics Vol.21,No.3,257–266, 2002.

    [21]Fattal,R.;Agrawala,M.;Rusinkiewicz,S.Multiscale shape and detail enhancement from multi-light image collections.ACM Transactions on Graphics Vol.26, No.3,Article No.51,2007.

    [22]Paris,S.;Durand,F.A fast approximation of the bilateral filter using a signal processing approach.In: Computer Vision–ECCV 2006.Leonardis,A.;Bischof, H.;Pinz,A.Eds.Springer Berlin Heidelberg,568–580, 2006.

    [23]Kass,M.;Solomon,J.Smoothed local histogram filters.ACM Transactions on Graphics Vol.29,No. 4,Article No.100,2010.

    [24]Farbman,Z.;Fattal,R.;Lischinski,D.;Szeliski, R.Edge-preserving decompositions for multi-scale tone and detail manipulation.ACM Transactions on Graphics Vol.27,No.3,Article No.67,2008.

    [25]Subr,K.;Soler,C.;Durand,F.Edge-preserving multiscale image decomposition based on local extrema.ACM Transactions on Graphics Vol.28,No. 5,Article No.147,2009.

    [26]Xu,L.;Lu,C.;Xu,Y.;Jia,J.Image smoothing via L0 gradient minimization.ACM Transactions on Graphics Vol.30,No.6,Article No.174,2011.

    [27]Manjunath,B.S.;Ma,W.-Y.Texture features for browsing and retrieval of image data.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.18,No.8,837–842,1996.

    Chengze Lireceived his B.S.degree from University of Science and Technology of China in 2013.He is currently a Ph.D.student in the Department of Computer Science and Engineering,the Chinese University of Hong Kong.His research interests include computer vision,pattern recognition,and high-performance computing.

    Tien-Tsin Wongreceived his B.Sc., M.Phil.,and Ph.D.degrees in computer science from the Chinese University of Hong Kong in 1992,1994,and 1998,respectively.He is currently a professor in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. His main research interests include computer graphics, computational manga,precomputed lighting,imagebased rendering,GPU techniques,medical visualization, multimedia compression,and computer vision.He received the IEEE Transactions on Multimedia Prize Paper Award 2005 and the Young Researcher Award 2004.

    Open AccessThe articles published in this journal are distributed under the terms of the Creative Commons Attribution 4.0 International License(http:// creativecommons.org/licenses/by/4.0/),which permits unrestricted use,distribution,and reproduction in any medium,provided you give appropriate credit to the original author(s)and the source,provide a link to the Creative Commons license,and indicate if changes were made.

    Other papers from this open access journal are available free of charge from http://www.springer.com/journal/41095. To submit a manuscript,please go to https://www. editorialmanager.com/cvmj.

    Xueting Liureceived her B.Eng. degree from Tsinghua University and Ph.D.degree from the Chinese University of Hong Kong in 2009 and 2014,respectively.She is currently a postdoctoral research fellow in the Department of Computer Science and Engineering,the Chinese University of Hong Kong.Her research interests include computer graphics,computer vision,computational manga and anime,and non-photorealistic rendering.

    1 The Chinese University of Hong Kong,Hong Kong, China.E-mail:X.Liu,xtliu@cse.cuhk.edu.hk;C.Li, czli@cse.cuhk.edu.hk;T.-T.Wong,ttwong@cse.cuhk. edu.hk().

    2 Shenzhen Research Institute,the Chinese University of Hong Kong,Shenzhen,China.

    Manuscript received:2016-09-09;accepted:2016-12-20
