
    Boundary-aware texture region segmentation from manga

2017-06-19 19:20:12 · Xueting Liu, Chengze Li, and Tien-Tsin Wong
Computational Visual Media, 2017, Issue 1


Xueting Liu¹,², Chengze Li¹,², and Tien-Tsin Wong¹,² (✉)

Due to the lack of color in manga (Japanese comics), black-and-white textures are often used to enrich the visual experience. With the rising need to digitize manga, segmenting texture regions from manga has become an indispensable basis for almost all manga processing, from vectorization to colorization. Unfortunately, such texture segmentation is not easy, since textures in manga are composed of lines and exhibit similar features to structural lines (contour lines). So currently, texture segmentation is still manually performed, which is labor-intensive and time-consuming. To extract a texture region, various texture features have been proposed for measuring texture similarity, but precise boundaries cannot be achieved since boundary pixels exhibit different features from inner pixels. In this paper, we propose a novel method which also adopts texture features to estimate texture regions. Unlike existing methods, the estimated texture region is only regarded as an initial, imprecise texture region. We expand the initial texture region to the precise boundary based on local smoothness via a graph-cut formulation. This allows our method to extract texture regions with precise boundaries. We have applied our method to various manga images, and satisfactory results were achieved in all cases.

manga; texture segmentation

    1 Introduction

Manga is a world-wide popular form of entertainment enjoyed by people of all ages (see Fig. 1). Nowadays, with the development of electronic devices, more and more people read manga on electronic devices such as computers, tablets, and even cellphones. With the power of electronic devices, manga can be presented in a visually enriched form by adding color, motion, or stereoscopic effects. There is thus a rising trend in the manga industry to convert legacy manga books into digital versions. During the digitization process, one major challenge is to segment texture regions from manga as a basis for various applications such as vectorization and colorization. However, texture region segmentation is not easy for manga, since textures in manga are composed of lines. This leads to the difficulty of discriminating structural lines from textures, and further to the difficulty of identifying the precise boundary of each texture region. Therefore, texture region segmentation is currently still performed manually in the manga industry. As one may imagine, this process is quite tedious and time-consuming.

    Fig.1 Manga images.

To help identify and classify textures, various texture features have been proposed in the computer vision field [1]. The similarity of the texture features for two pixels shows whether these two pixels are inside the same texture region. Texture segmentation or texture classification techniques can be further applied to extract texture regions based on similarity of texture features. However, since texture features are analyzed within a local neighborhood, pixels in the interior of a texture region (e.g., the blue box in Fig. 2) and pixels near the boundary of a texture region (e.g., the red and orange boxes in Fig. 2) exhibit different texture features (see Fig. 2(c)), even though they belong to the same texture region. Thus, pixels near texture boundaries may be mistakenly regarded as not belonging to that texture region. This will lead to imprecise boundaries of the segmented texture region if only similarity of texture features is considered (see Fig. 3).

To resolve the boundary issue, we noticed that texture smoothing techniques are quite powerful in suppressing local textures while still preserving sharp boundaries. Using texture smoothing methods, one could suggest first smoothing the input manga image, and then performing intensity-based image segmentation to extract texture regions. However, a texture region may have spatial-varying textures (e.g., the background region in Fig. 4(a) changes from dark to light vertically). Furthermore, texture smoothing is also incapable of differentiating textures with similar overall intensities (as in Fig. 4(b)). Therefore, we still need to analyze texture features in order to robustly handle spatial-varying textures and textures with similar intensities.

Fig. 3 Texture region segmentation based on Gabor features. Note that a precise boundary cannot be achieved.

Fig. 4 (a) A background region with a spatial-varying texture. (b) Different textures may have similar intensities after smoothing.

In this paper, we propose a novel, user-interaction-based texture region segmentation system, which integrates texture feature analysis and texture smoothing techniques in order to segment texture regions with precise boundaries from manga images. The user may draw one or several strokes inside the region or object to be segmented. Our system automatically estimates a region mask from the user input. To do so, we first summarize the texture features of the user-drawn strokes, and estimate an initial texture region having similar texture features to the user-drawn strokes. In this step, we adopt a conservative similarity measurement for estimating the initial texture regions, in order to guarantee that all pixels inside the initial texture region lie within the user-specified region. We then expand the initial texture region to the precise boundaries using a graph-cut formulation. We formulate the accumulated smoothness difference from the initial texture region as the data cost, and the local smoothness difference as the smoothness cost. Using this formulation, we can extract the user-specified region with a precise boundary.

We demonstrate the effectiveness of our texture region segmentation system on a set of manga images containing various types of textures, including regular textures, irregular textures, and spatial-varying textures. Our contributions can be summarized as follows:

• We propose a novel user-interaction-based system for extracting texture regions with precise boundaries.

• Our system can handle user strokes drawn across multiple different textures simultaneously.

    2 Related work

While only a little existing research is tailored to extracting texture regions from manga, extensive research has been done on identifying and segmenting textures in natural photos. We can roughly classify this related research into three categories: feature-based texture segmentation, regular texture analysis, and texture smoothing.

Feature-based texture segmentation. Texture features are based on statistical models describing local characteristics of point neighborhoods. The statistical model of a texture feature usually covers a range of texture properties, such as size, aspect ratio, orientation, brightness, and density [3]. Various texture features have been proposed to describe and model textures, including various Gabor filters [4], filter bank responses [5], random field models, wavelet representations, and so on. In particular, the Gabor filter was adopted by Ref. [6] to analyze textures in manga, and is still considered to be the state-of-the-art method. Texture features are utilized in various applications such as texture classification, segmentation, and synthesis. Texture segmentation methods fall into two categories, supervised [8] and unsupervised [6, 7]. In particular, Ref. [6] proposed to segment textures in manga images via a level-set method based on Gabor features. However, as we have stated, although feature-based texture segmentation methods can identify textures well and differentiate between textures, precise region boundaries cannot be achieved, since boundary pixels exhibit different texture features from interior pixels. In contrast, our method can extract texture regions with precise boundaries.

Regular texture analysis. For regular or near-regular texture patterns, attempts have been made to detect and analyze the regularity of the textures based on spatial relationships [9–11]. In particular, Liu et al. [12] considered how to detect and remove fences in natural images. A stream of research has also considered de-screening, i.e., detecting and smoothing halftone textures; an in-depth survey of de-screening can be found in Ref. [13]. Kopf and Lischinski [14] discussed how to extract halftone patterns in printed color comics by modeling dot patterns. Very recently, Yao et al. [15] considered how to extract textures from manga by modeling three specific texture primitives: dots, stripes, and grids. However, these methods can only handle a small set of pre-defined regular textures. In comparison, our method can handle regular or irregular, and even spatial-varying, textures of the kind that exist in real manga images.

Texture smoothing. In order to differentiate textures and structures, various edge-preserving texture smoothing methods have been proposed, such as the total variation regularizer [16–19], bilateral filtering [20–22], local histogram-based filtering [23], weighted least squares [24], extrema extraction and extrapolation [25], L0 gradient minimization [26], and relative total variation (RTV) [2]. While these texture smoothing methods may suppress local oscillations based on local information, they are incapable of identifying a texture region or differentiating between two textures. This is because these methods do not model textures or structures, so they lack a higher-level understanding of the semantics of the textures. In this paper, we thus utilize texture features to identify textures, but we incorporate texture smoothing techniques to identify sharp texture boundaries.

    3 Overview

The input to our system includes a manga image (see Fig. 5(a)) and one or more user-specified strokes (see Fig. 5(b)). To extract regions with similar textures to the ones identified by the user-specified strokes, we first summarize the texture features of the pixels belonging to the strokes. The texture features we use are Gabor features, which are also used by Ref. [6] in a manga colorization application. Since textures may spatially vary, and one user-specified stroke may go across several texture regions (see Fig. 5(b)), we summarize several main texture features inside each stroke by clustering. Then we calculate the similarity between the texture feature of each pixel in the manga image and the clustered main texture features, to form a texture similarity map (see Fig. 5(c)). In this map, the intensity of a pixel indicates the similarity of its texture feature to those of the user-specified strokes. Based on the computed texture similarities, we then obtain one or several initial texture regions using a graph-cut formulation (see Fig. 5(d)). Our initial texture region extraction method is detailed in Section 4.

Fig. 5 System overview (image size: 778×764, loading time: 3.51 s, processing time: 0.58 s).

We then expand the initial texture regions to their precise boundaries. To do so, we first obtain a smoothed image from the input image using texture smoothing techniques (see Fig. 5(e)). Amongst all existing texture smoothing techniques, we found that the RTV metric proposed by Ref. [2] is the most effective at smoothing textures while preserving sharp structures in manga images. In the smoothed image, if two neighboring pixels have close intensity values, these two pixels are very likely to be inside the same texture region. Conversely, if two neighboring pixels have a jump in intensity values, they are very likely to be inside two different regions. Therefore, we can diffuse the initial texture regions using local intensity continuity of the smoothed image to obtain a diffusion map (see Fig. 5(f)). This diffusion map shows how smoothly each pixel is connected to the initial regions. If a pixel has a low diffusion value, it is smoothly connected to the initial regions, so it is very likely to be inside the same texture region. Conversely, a pixel with a high diffusion value is very likely to lie outside the texture region. Finally, we extract the precise texture region based on the diffusion map using another graph-cut formulation (see Figs. 5(g) and 5(h)). Our region expansion method is detailed in Section 5.

We have evaluated our system with various manga, and the results are shown in Section 6. We also show how users can easily adjust the retrieved texture region using a single parameter.

    4 Initial region extraction

Given an input manga image and one or more user-specified strokes, we first extract the initial regions with similar textures to the user-specified strokes. To do so, we first summarize the texture features of the pixels inside the strokes. Then we obtain a texture similarity map which gives the texture similarity between each pixel in the image and the summarized texture features. The initial regions are then extracted using a graph-cut formulation.

    4.1 Texture feature summarization

To judge whether two pixels have similar textures, we use statistical features in the Gabor wavelet domain [27], which have already proved useful in differentiating textures in manga [6]. A Gabor feature vector is an M×N-dimensional vector, where M is the number of scales and N is the number of orientations used in the Gabor feature analysis. In this paper, we fix the numbers of scales and orientations to M = 4 and N = 6 respectively in all our experiments. Therefore, for a pixel p in the manga image, its Gabor feature vector G_p describing the local texture feature around p is 24-dimensional.
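As a concrete illustration, per-pixel Gabor feature vectors of this dimensionality can be computed as below. This is only a sketch: the kernel size and the wavelength/sigma progression across scales are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(ksize, sigma, theta, lambd, gamma=0.5):
    """Real-valued Gabor kernel (illustrative parameterization)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lambd)

def gabor_features(gray, scales=4, orientations=6, ksize=15):
    """Per-pixel Gabor response magnitudes: M x N = 4 x 6 = 24 dims,
    matching the dimensionality in the paper. The wavelength doubling
    per scale and the sigma/wavelength ratio are assumptions."""
    g = np.asarray(gray, dtype=np.float64)
    responses = []
    for s in range(scales):
        lambd = 4.0 * (2 ** s)          # wavelength doubles per scale
        sigma = 0.56 * lambd            # common sigma/wavelength ratio
        for o in range(orientations):
            theta = o * np.pi / orientations
            k = gabor_kernel(ksize, sigma, theta, lambd)
            responses.append(np.abs(fftconvolve(g, k, mode='same')))
    return np.stack(responses, axis=-1)  # shape: H x W x 24
```

Feature vectors of two pixels can then be compared directly, e.g., by the Euclidean distance between their 24-dimensional rows.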

Given a set of user-specified strokes U = {u1, u2, ...}, we could calculate a main texture feature for these strokes by averaging the texture features of all pixels inside the strokes as T̄ = (1/|U|) Σ_{p∈U} G_p, where |U| is the cardinality of U. However, a textured area may have a spatial-varying texture, or a single user-specified stroke may go across multiple textured areas at the same time. For example, the moon in Fig. 5 cannot be specified by a single texture feature vector. Therefore, we represent the textures determined by the user-specified strokes using multiple texture feature vectors. To extract the most representative textures for the user-specified strokes, we use the k-means clustering method to cluster the texture features of all pixels inside the strokes into k groups T_U = {T1, ..., Tk}, which minimize the within-cluster sum of squared distances between the pixel feature vectors and their cluster's representative vector.
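The clustering step above can be sketched as plain Lloyd's k-means over the Gabor features of the stroke pixels. The value of k, the initialization, and the iteration count below are illustrative assumptions; the paper does not state them.

```python
import numpy as np

def stroke_texture_clusters(features, stroke_mask, k=3, iters=20, seed=0):
    """Cluster the feature vectors of stroke pixels into k representative
    texture vectors T_U = {T1, ..., Tk} via Lloyd's k-means.

    features    : H x W x D per-pixel feature array (e.g., D = 24 Gabor dims).
    stroke_mask : H x W bool array, True where the user strokes lie.
    """
    X = features[stroke_mask]                      # (n_stroke_pixels, D)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each stroke pixel to its nearest representative vector
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # move each representative to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers
```

The returned rows play the role of T1, ..., Tk when computing the texture similarity map.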

    4.2 Initial region extraction via graph-cut

From the summarized representative texture feature vectors of the user-specified strokes, we then calculate a texture similarity value between each pixel p and the representative textures T_U.

Pixels with higher texture similarity are more likely to be inside the texture region specified by the user.

Using the calculated texture similarity, we extract an initial texture region via a graph-cut formulation. There are two terminal nodes, the source and the sink, in our graph. Each pixel p in the image corresponds to a non-terminal node n_p, which is connected to both source and sink. If the graph-cut result connects a non-terminal node (pixel) to the source, this pixel lies inside the initial regions; otherwise, it lies outside the initial regions. Each edge connecting a non-terminal node and a terminal node is associated with a data cost:

For every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q by an edge. Each edge connecting two non-terminal nodes is associated with a smoothness cost which measures our confidence that these two neighboring pixels should be assigned the same label, and therefore belong to the same texture region. We model the smoothness cost as the magnitude of the difference of the texture feature vectors of these two nodes n_p and n_q:

Intuitively speaking, if two neighboring pixels p and q have similar textures, the smoothness cost S(n_p, n_q) should be low, and there is a high probability for them to be assigned the same label, and so belong to the same texture region.

After constructing the graph, we can obtain the optimal cut through an optimization process which minimizes the energy function
E(u) = Σ_p D(n_p, u_p) + w_c Σ_{(p,q)} S(n_p, n_q) · 1[u_p ≠ u_q]

where u ∈ {source, sink} is the label, and w_c weights the data cost against the smoothness cost. We experimentally set w_c to 1 in all our experiments. The pixels labeled as source after graph-cut form the initial regions. Since other regions might also have similar patterns to the user-specified region, we remove regions that do not intersect the user-specified strokes. For regions that contain spatial-varying textures, different strokes may lead to different initial regions (see Fig. 6) and affect the subsequent expansion. The user-specified strokes should go across all the different textures in order to achieve good segmentation results.
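The two-label graph-cut described above can be sketched as an s-t min-cut on a 4-connected grid. The `grid_mincut` helper below is an assumption of this sketch, solved with SciPy's integer-capacity max-flow rather than the Boykov-Kolmogorov-style solver typically used with graph-cut; the integer scaling exists only because SciPy requires int32 capacities.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import maximum_flow

def grid_mincut(src_cap, sink_cap, wc, smooth_h, smooth_v, scale=1000):
    """Two-label min-cut on a 4-connected H x W pixel grid.

    src_cap[y, x]  : capacity of the source->pixel t-link (cut when the
                     pixel ends up on the sink side).
    sink_cap[y, x] : capacity of the pixel->sink t-link.
    smooth_h[y, x] : n-link capacity between (y, x) and (y, x+1).
    smooth_v[y, x] : n-link capacity between (y, x) and (y+1, x).
    Returns an H x W bool mask, True for pixels on the source side.
    """
    H, W = src_cap.shape
    n = H * W
    S, T = n, n + 1                                # terminal node indices
    rows, cols, caps = [], [], []

    def add_edge(u, v, c):
        rows.append(u); cols.append(v); caps.append(int(round(c * scale)))

    for y in range(H):
        for x in range(W):
            p = y * W + x
            add_edge(S, p, src_cap[y, x])          # data term (t-links)
            add_edge(p, T, sink_cap[y, x])
            if x + 1 < W:                          # n-links, both directions
                add_edge(p, p + 1, wc * smooth_h[y, x])
                add_edge(p + 1, p, wc * smooth_h[y, x])
            if y + 1 < H:
                add_edge(p, p + W, wc * smooth_v[y, x])
                add_edge(p + W, p, wc * smooth_v[y, x])

    g = csr_matrix((caps, (rows, cols)), shape=(n + 2, n + 2), dtype=np.int32)
    flow = maximum_flow(g, S, T).flow
    residual = (g - flow).tocsr()
    # BFS from the source over positive residual capacity -> source side of cut
    seen = np.zeros(n + 2, dtype=bool)
    seen[S] = True
    stack = [S]
    while stack:
        u = stack.pop()
        row = residual.getrow(u)
        for v, c in zip(row.indices, row.data):
            if c > 0 and not seen[v]:
                seen[v] = True
                stack.append(v)
    return seen[:n].reshape(H, W)
```

For the initial-region stage, `src_cap` would come from the texture similarity map, `sink_cap` from its complement, and the n-link weights from the feature difference S(n_p, n_q), with w_c = 1 as in the text.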

    5 Initial region expansion

Fig. 6 Initial region extraction from strokes. Top: several user-specified strokes. Bottom: corresponding extracted initial regions.

Starting from the extracted initial regions, we now show how we expand the regions to their precise boundaries. In short, we first smooth the input image and diffuse the initial regions based on the smoothed image. Then we extract the final texture region with a precise boundary via another graph-cut formulation.

    5.1 Smoothness map construction

To smooth a manga image, we have experimentally determined that the relative total variation (RTV) method proposed by Xu et al. [2] performs best among the large pool of existing methods. This is mainly because this measure is more tolerant of high-contrast textures than other methods. The method has two input parameters, λ and σ. Here, λ is a weighting factor, which we set to 0.015 in all our experiments; σ is the local window size for measuring local oscillations (textures). In our experiments, we found that σ = 5 works best for most textures, but when the texture is sparser, we may need to assign σ a higher value. Some texture smoothing results for manga images are shown in Figs. 4 and 5(e).

In the smoothed image, if two neighboring pixels have similar intensity values, it is very likely that they are in the same texture region, and vice versa. We also observe that texture regions in manga images are usually enclosed by black boundary lines. Therefore, we can judge whether a pixel is likely to be inside the user-specified region by measuring whether this pixel is smoothly connected to the initial regions. Here, by smoothly connected, we mean that there exists a path from this pixel to the initial regions along which the intensity values of the pixels change smoothly. Formally, given the initial regions R and a pixel p outside the initial regions, we define a path from R to p as a sequence of pixels h(R, p) = {q1, ..., qL}, where q1 ∈ R, qL = p, and consecutive pixels are 4-connected, i.e., ‖q_{i+1} − q_i‖₁ = 1. Here, ‖·‖₁ is the L1-norm operator. We can measure whether p is smoothly connected to R along a path h(R, p) by accumulating the intensity differences along this path:
D(h(R, p)) = Σ_{i=1}^{L−1} |J(q_{i+1}) − J(q_i)|

where J is the smoothed manga image. Since there is more than one path from p to R, we can measure whether p is smoothly connected to R by taking the minimal smoothness value over all possible paths:
F_p = min_{h(R,p)} D(h(R, p))

In a practical implementation, we compute the above smoothness values via a diffusion process. More concretely, we first construct a diffusion map F by setting pixels inside the initial regions to 0 and pixels outside the initial regions to +∞. Then we iteratively update the smoothness value of each pixel based on its surrounding pixels using
F_p ← min(F_p, min_{q∈N(p)} (F_q + |J(p) − J(q)|))
where N(p) denotes the 4-connected neighbors of p.

We visualize the diffusion process in Fig. 7.
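The diffusion process can be sketched as iterated relaxation over 4-neighbors until the map stops changing; the sweep schedule and stopping rule below are illustrative assumptions, since the paper does not specify them.

```python
import numpy as np

def diffusion_map(J, init_mask, max_iters=500):
    """Diffuse the initial regions over the smoothed image J.

    Iteratively relaxes F_p = min(F_p, min_q F_q + |J(p) - J(q)|)
    over 4-neighbors, starting from F = 0 inside the initial regions
    and +inf outside, until convergence.
    """
    J = np.asarray(J, dtype=np.float64)
    F = np.where(init_mask, 0.0, np.inf)
    for _ in range(max_iters):
        prev = F.copy()
        # relax from each of the four neighbor directions via array shifts
        F[1:, :] = np.minimum(F[1:, :], F[:-1, :] + np.abs(J[1:, :] - J[:-1, :]))
        F[:-1, :] = np.minimum(F[:-1, :], F[1:, :] + np.abs(J[:-1, :] - J[1:, :]))
        F[:, 1:] = np.minimum(F[:, 1:], F[:, :-1] + np.abs(J[:, 1:] - J[:, :-1]))
        F[:, :-1] = np.minimum(F[:, :-1], F[:, 1:] + np.abs(J[:, :-1] - J[:, 1:]))
        if np.array_equal(prev, F):
            break
    return F
```

Pixels with a low value in the returned map are smoothly connected to the initial regions; a jump in J (a region boundary) adds its full magnitude to every path crossing it.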

    5.2 Final region extraction via graph-cut

While we could extract a final texture region by thresholding the diffusion map, we have found that naive thresholding generally leads to bumpy and leaky boundaries. To avoid these issues, we formulate another graph to extract the final region. As in the previous graph-cut formulation, each pixel p is formulated as a non-terminal node n_p, and is connected to two terminal nodes, the source and the sink. If the graph-cut result labels a non-terminal node (pixel) as connected to the source, this pixel is inside the final texture region; otherwise, it is outside. The data cost associated with each edge connecting a terminal node and a non-terminal node measures how likely this pixel is to be smoothly connected to the initial region based on the smoothness map, and is expressed as

Fig. 7 The diffusion process.

where σ_s is empirically set to 0.05 in all our experiments. Intuitively speaking, if a pixel p has a low smoothness value F_p, the data cost D(n_p, source) should be relatively high, and there is a high probability that n_p will be connected to the source. Similarly, for every pair of (4-connected) neighboring pixels p and q in the image, we connect n_p and n_q with an edge. The smoothness cost associated with this edge measures how likely the two neighboring pixels are to have the same label; its associated cost is

Intuitively, if the intensity values of two neighboring pixels are similar in the smoothed image, there is a high probability that they are in the same texture region. Finally, we solve this graph-cut problem by minimizing the following energy function
E(u) = Σ_p D(n_p, u_p) + w_v Σ_{(p,q)} S(n_p, n_q) · 1[u_p ≠ u_q]

where u ∈ {source, sink} is the label, and w_v weights the data and smoothness costs. We empirically set w_v to 0.25 in all our experiments. After graph-cut, pixels assigned to the source form the final texture region.

    5.3 User control

Given a set of user-specified strokes, while our system quite stably extracts texture regions based on a set of pre-defined parameters, we also allow user control. We let the user control the final region by adjusting a single parameter z ∈ [−1, 1]. The smaller z is, the smaller the final texture region will be, and vice versa. We achieve this by incorporating z into the graph-cut formulation; in particular, the data cost is re-defined as

where P(v, c1, c2) is a piecewise function defined as

If z is set to −1, all pixels are labeled as sink and the extracted region is empty. If z is set to 1, all pixels are labeled as source and the extracted region is the whole image. The default value of z is 0. We show an example of parameter tuning in Fig. 8. Even though the user can control the extracted region with this parameter, our method is quite stable: in this case, the extracted region is constant for z ∈ [−0.6, 0.6].

    6 Results and discussion

    6.1 Validation

To validate the effectiveness of our method, we have applied it to manga images with a variety of texture patterns, including regular patterns (e.g., as in Fig. 9), near-regular patterns (e.g., as in Fig. 10), and irregular patterns (e.g., as in Figs. 5 and 11). We also compare our results with two state-of-the-art methods tailored for manga colorization [6] and manga vectorization [15], respectively.

Figure 9(a) shows a manga image of a cat with a dot pattern. While the feature-based method [6] fails to detect precise boundaries of the texture regions (see Fig. 9(b)), the primitive-based method [15] correctly detects the regions by formulating a specific dot pattern model (see Fig. 9(c)). However, Yao et al.'s method makes very strong assumptions about the primitives in the textures, so it can only handle textures such as dots, stripes, and grids well. In comparison, we make no assumption about the primitives in the textures, but our method can still achieve similar results to those in Ref. [15] through texture feature analysis and smoothness diffusion (see Fig. 9(d)). Figure 10 shows another comparison between the results of Ref. [6] and our method. While both their method and ours analyze the texture of the user-specified stroke, their method only uses a single texture feature to represent the whole stroke. Therefore, their method is incapable of finding texture regions that are spatial-varying (see Fig. 10(b)). In contrast, our method can handle spatial-varying textures well (see Fig. 10(c)).

    Fig.8 User control of the extracted region via a single parameter z.

Fig. 10 'Boy' (1712×907, loading time: 9.72 s, processing time: 2.07 s).

Fig. 11 'Cars' (762×843, loading time: 4.03 s, processing time: 1.38 s).

In Figs. 5 and 11 the user specifies texture regions with large spatial variation, especially in Fig. 11. By using a set of texture feature vectors to represent the user-specified strokes, our method successfully extracts the texture regions with precise boundaries. Since a manga image may contain multiple textures, we also allow the user to specify several sets of strokes indicating different texture regions. The user-specified strokes are processed sequentially so that the user can control the extracted regions more easily. We show three examples in Figs. 12–14, where each input image contains multiple different textures including solid-color regions, regular texture regions, and irregular texture regions. Our method achieves good results in all cases.

    6.2 Timing statistics

Fig. 12 'Poker' (838×1210, loading time: 7.05 s, processing time: 4.72 s).

Fig. 13 'Basketball' (1251×1013, loading time: 7.97 s, processing time: 6.27 s).

Fig. 14 'Astonished' (1251×1013, loading time: 7.10 s, processing time: 5.36 s).

All of our experiments were conducted on a PC with a 2.7 GHz CPU and 64 GB memory; all tests used single-threaded, unoptimized code, and no GPU. We break down the computational time for each example into two parts, the loading time and the processing time (given in the caption of each figure). The loading time is the time spent immediately when the user loads an image into the system, and can be regarded as the offline computation time. The processing time is the time spent when the system returns the extracted region after the strokes have been drawn, and can be regarded as the online computation time. Whenever the user draws a new stroke or adjusts the control parameter, only the online part needs to be re-executed. We observe that the total computation time depends strongly on the resolution of the input image.

    6.3 Limitations

One of our limitations concerns the latent assumption that the boundaries between two regions are sharp and smooth. Currently, we cannot handle blurred boundaries well. Furthermore, if the boundary of a region is quite spiky (e.g., the shock balloons in Fig. 14(a)), our current graph-cut formulation will result in a smoothed boundary (e.g., the blue regions in Fig. 14(b)). Our method also cannot separate neighboring regions if they are visually inseparable. For example, in Fig. 14(a), the black boundary of the top-left shock balloon is connected to the black background of the second panel. Furthermore, the boundary in the smoothed image may deviate by one or two pixels from the original boundary due to limitations of the texture smoothing technique. In such cases, we may also fail to extract the precise boundaries of the texture regions.

    7 Conclusions

In this paper, we have proposed a novel system to extract texture regions with precise boundaries. Our method starts from an input image and a set of user-specified strokes, and extracts initial regions containing pixels with similar textures, using Gabor wavelets. However, texture features such as Gabor wavelets cannot provide precise boundaries. We further smooth the original image via a texture smoothing technique, and refine the initial regions based on the smoothed image. Our method outperforms existing methods in extracting precise boundaries, especially for spatial-varying textures.

While our method currently assumes hard boundaries, we could adopt matting techniques instead of the current graph-cut formulation to restore regions with alpha values. We also note that identification of regions depends highly on the semantics of the image content, and introducing perception-based edge extraction techniques could help extract more precise boundaries.

    Acknowledgements

This project was supported by the National Natural Science Foundation of China (Project No. 61272293), and the Research Grants Council of the Hong Kong Special Administrative Region under the RGC General Research Fund (Project Nos. CUHK14200915 and CUHK14217516).

    [1]Tuytelaars,T.;Mikolajczyk,K.Local invariant feature detectors:A survey.Foundations and Trends in Computer Graphics and Vision Vol.3,No.3,177–280, 2008.

    [2]Xu,L.;Yan,Q.;Xia,Y.;Jia,J.Structure extraction from texture via relative total variation. ACM Transactions on Graphics Vol.31,No.6,Article No.139,2012.

    [3]Julesz,B.Textons,the elements of texture perception, and their interactions.Nature Vol.290,91–97,1981.

[4] Weldon, T. P.; Higgins, W. E.; Dunn, D. F. Efficient Gabor filter design for texture segmentation. Pattern Recognition Vol. 29, No. 12, 2005–2015, 1996.

[5] Varma, M.; Zisserman, A. Texture classification: Are filter banks necessary? In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Vol. 2, II-691-8, 2003.

    [6]Qu,Y.;Wong,T.-T.;Heng,P.-A.Manga colorization. ACM Transactions on Graphics Vol.25,No.3,1214–1220,2006.

    [7]Hofmann,T.;Puzicha,J.;Buhmann,J.M. Unsupervised texture segmentation in a deterministic annealing framework.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.20,No.8,803–818,1998.

    [8]Paragios,N.;Deriche,R.Geodesic active regions for supervised texture segmentation.In:Proceedings of the 7th IEEE International Conference on Computer Vision,Vol.2,926–932,1999.

    [9]Hays,J.;Leordeanu,M.;Efros,A.A.;Liu, Y.Discovering texture regularity as a higher-order correspondence problem.In:Computer Vision–ECCV 2006.Leonardis,A.;Bischof,H.;Pinz,A.Eds. Springer Berlin Heidelberg,522–535,2006.

    [10]Liu,Y.;Collins,R.T.;Tsin,Y.A computational model for periodic pattern perception based on frieze and wallpaper groups.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.26,No.3,354–371,2004.

    [11]Liu,Y.;Lin,W.-C.;Hays,J.Near-regular texture analysis and manipulation.ACM Transactions on Graphics Vol.23,No.3,368–376,2004.

[12] Liu, Y.; Belkina, T.; Hays, J. H.; Lublinerman, R. Image de-fencing. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, 1–8, 2008.

[13] Siddiqui, H.; Boutin, M.; Bouman, C. A. Hardware-friendly descreening. IEEE Transactions on Image Processing Vol. 19, No. 3, 746–757, 2010.

    [14]Kopf,J.;Lischinski,D.Digital reconstruction of halftoned color comics.ACM Transactions on Graphics Vol.31,No.6,Article No.140,2012.

    [15]Yao,C.-Y.;Hung,S.-H.;Li,G.-W.;Chen,I.-Y.; Adhitya,R.;Lai,Y.-C.Manga vectorization and manipulation with procedural simple screentone.IEEE Transactions on Visualization and Computer Graphics Vol.23,No.2,1070–1084,2017.

[16] Aujol, J.-F.; Gilboa, G.; Chan, T.; Osher, S. Structure-texture image decomposition—Modeling, algorithms, and parameter selection. International Journal of Computer Vision Vol. 67, No. 1, 111–136, 2006.

    [17]Meyer,Y.Oscillating Patterns in Image Processing and Nonlinear Evolution Equations:The Fifteenth Dean Jacqueline B.Lewis Memorial Lectures. American Mathematical Society,2001.

    [18]Rudin,L.I.;Osher,S.;Fatemi,E.Nonlinear total variation based noise removal algorithms.Physica D: Nonlinear Phenomena Vol.60,Nos.1–4,259–268, 1992.

[19] Yin, W.; Goldfarb, D.; Osher, S. Image cartoon-texture decomposition and feature selection using the total variation regularized L1 functional. In: Variational, Geometric, and Level Set Methods in Computer Vision. Paragios, N.; Faugeras, O.; Chan, T.; Schnörr, C. Eds. Springer Berlin Heidelberg, 73–84, 2005.

[20] Durand, F.; Dorsey, J. Fast bilateral filtering for the display of high-dynamic-range images. ACM Transactions on Graphics Vol. 21, No. 3, 257–266, 2002.

    [21]Fattal,R.;Agrawala,M.;Rusinkiewicz,S.Multiscale shape and detail enhancement from multi-light image collections.ACM Transactions on Graphics Vol.26, No.3,Article No.51,2007.

    [22]Paris,S.;Durand,F.A fast approximation of the bilateral f i lter using a signal processing approach.In: Computer Vision–ECCV 2006.Leonardis,A.;Bischof, H.;Pinz,A.Eds.Springer Berlin Heidelberg,568–580, 2006.

    [23]Kass,M.;Solomon,J.Smoothed local histogram f i lters.ACM Transactions on Graphics Vol.29,No. 4,Article No.100,2010.

    [24]Farbman,Z.;Fattal,R.;Lischinski,D.;Szeliski, R.Edge-preserving decompositions for multi-scale tone and detail manipulation.ACM Transactions on Graphics Vol.27,No.3,Article No.67,2008.

    [25]Subr,K.;Soler,C.;Durand,F.Edge-preserving multiscale image decomposition based on local extrema.ACM Transactions on Graphics Vol.28,No. 5,Article No.147,2009.

    [26]Xu,L.;Lu,C.;Xu,Y.;Jia,J.Image smoothing via L0gradient minimization.ACM Transactions on Graphics Vol.30,No.6,Article No.174,2011.

    [27]Manjunath,B.S.;Ma,W.-Y.Texture features for browsing and retrieval of image data.IEEE Transactions on Pattern Analysis and Machine Intelligence Vol.18,No.8,837–842,1996.

    Chengze Li received his B.S. degree from the University of Science and Technology of China in 2013. He is currently a Ph.D. student in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. His research interests include computer vision, pattern recognition, and high-performance computing.

    Tien-Tsin Wong received his B.Sc., M.Phil., and Ph.D. degrees in computer science from the Chinese University of Hong Kong in 1992, 1994, and 1998, respectively. He is currently a professor in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. His main research interests include computer graphics, computational manga, precomputed lighting, image-based rendering, GPU techniques, medical visualization, multimedia compression, and computer vision. He received the IEEE Transactions on Multimedia Prize Paper Award 2005 and the Young Researcher Award 2004.

    Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

    Other papers from this open access journal are available free of charge from http://www.springer.com/journal/41095. To submit a manuscript, please go to https://www.editorialmanager.com/cvmj.

    Xueting Liu received her B.Eng. degree from Tsinghua University and Ph.D. degree from the Chinese University of Hong Kong in 2009 and 2014, respectively. She is currently a postdoctoral research fellow in the Department of Computer Science and Engineering, the Chinese University of Hong Kong. Her research interests include computer graphics, computer vision, computational manga and anime, and non-photorealistic rendering.

    1 The Chinese University of Hong Kong, Hong Kong, China. E-mail: X. Liu, xtliu@cse.cuhk.edu.hk; C. Li, czli@cse.cuhk.edu.hk; T.-T. Wong, ttwong@cse.cuhk.edu.hk ().

    2 Shenzhen Research Institute, the Chinese University of Hong Kong, Shenzhen, China.

    Manuscript received: 2016-09-09; accepted: 2016-12-20