
Self-Dependent Locality Preserving Projections Based on a Transformed-Space Neighborhood Graph


Qiao Lishan 1,2, Zhang Limei 1,2, Sun Zhonggui 2

(1. Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China; 2. School of Mathematical Sciences, Liaocheng University, Liaocheng 252000, China)

    INTRODUCTION

In practice, high-dimensional data such as face images and gene expression microarrays are frequently encountered. Dimensionality reduction (DR) is a principal way to mine and understand such high-dimensional data by mapping them into another (usually low-dimensional) space. Classical DR methods include PCA, ICA, LDA, and so on [1]. However, these methods cannot discover the nonlinear structure in data. To address this issue, in the past decade researchers have developed many nonlinear manifold learning algorithms such as LLE [2], ISOMAP [3] and Laplacian eigenmaps [4]. They give more flexibility in data modeling, but generally suffer from high computational cost and the so-called "out-of-sample" problem. Ref. [5] showed that although such nonlinear techniques perform well on selected artificial data sets, they generally do not yet outperform the traditional PCA on real-world tasks.

In recent years, there has been increasing interest in linearized locality-oriented DR methods, e.g., the locality preserving projections (LPP) [6], the neighborhood preserving embedding (NPE) [7], the unsupervised discriminant projection (UDP) [8], and the sparsity preserving projections (SPP) [9]. On one hand, these algorithms are linear in nature, thus avoiding the "out-of-sample" problem involved in nonlinear manifold learning. On the other hand, they model the local neighborhood structure in data, and generally achieve better performance than typical global linear methods such as PCA. According to Ref. [10], almost all existing locality-oriented methods essentially share a similar objective function and differ mainly in how the neighborhood graph is constructed. Therefore, without loss of generality, LPP is chosen here, by virtue of its popularity and typicality, to develop the algorithm and demonstrate the idea, though the idea in this paper can be easily and naturally extended to other locality-oriented DR methods.

As a representative locality-oriented DR algorithm, LPP has been widely used in many practical problems, such as face recognition [11]. Despite its unsupervised nature, LPP has potential discriminative power through preserving the local geometry of data. However, the neighborhood graph underlying LPP is defined (Eq. (2)) based on the original data points, and must remain fixed once constructed. As a result, the performance of LPP generally relies heavily on how well the nearest neighbor criterion works in the original space [12]. In addition, the so-defined neighborhood graph suffers seriously from the difficulty of parameter selection, i.e., the neighborhood size k and the Gaussian kernel width ε.

To address these problems, this paper proposes a novel DR algorithm, called the self-dependent LPP (sdLPP), which is based on the observation that the nearest neighbor criterion usually works well in the LPP transformed space. Firstly, LPP is performed based on the typical neighborhood graph. Then, a new neighborhood graph is constructed in the LPP transformed space and LPP is repeated. Furthermore, a new criterion, called the improved Laplacian score, is developed as an empirical reference for discriminative power and as the stopping condition of the iteration. Finally, experiments on several publicly available UCI and face data sets verify the feasibility and effectiveness of the proposed algorithm with promising results.

    1 BRIEF REVIEW OF LOCALITY PRESERVING PROJECTIONS

Given a set of data points X = [x_1, x_2, …, x_n], x_i ∈ R^d, LPP aims at seeking a set of projection directions that preserve the local geometric structure of the data. The objective function of LPP is defined as follows

$$\min_{W}\ \sum_{i,j}\left\|W^{\mathrm T}x_i-W^{\mathrm T}x_j\right\|^{2}S_{ij}\qquad(1)$$

where W ∈ R^{d×d′} (d′ < d) is the projection matrix and S = (S_{ij})_{n×n} is the adjacency weight matrix defined as

$$S_{ij}=\begin{cases}\exp\!\left(-\|x_i-x_j\|^{2}/\varepsilon\right), & x_i\in N_k(x_j)\ \text{or}\ x_j\in N_k(x_i)\\[2pt] 0, & \text{otherwise}\end{cases}\qquad(2)$$

where N_k(x_j) denotes the k nearest neighbors of x_j.

To avoid a degenerate solution, a constraint involving the diagonal matrix D = diag(d_11, d_22, …, d_nn) is imposed, where d_ii is the row sum of S (or column sum, since S is symmetric). Then, with a simple reformulation, LPP can be rewritten in a compact trace ratio form [6]

$$\min_{W}\ \frac{\operatorname{tr}\!\left(W^{\mathrm T}XLX^{\mathrm T}W\right)}{\operatorname{tr}\!\left(W^{\mathrm T}XDX^{\mathrm T}W\right)}\qquad(3)$$

where L = D − S is the graph Laplacian [6]. Eq. (3) is a typical non-convex trace ratio problem [13], and it is often solved approximately through the generalized eigenvalue problem X L Xᵀ w = λ X D Xᵀ w, whose eigenvectors w associated with the smallest eigenvalues constitute the columns of W.
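To make the construction concrete, the following is a minimal NumPy/SciPy sketch of the graph of Eq. (2) and the eigenproblem behind Eq. (3). The function names (build_graph, lpp), the column-per-sample layout and the small ridge added for numerical stability are illustrative choices, not part of the original formulation.

```python
import numpy as np
from scipy.linalg import eigh

def build_graph(X, k, eps):
    """Adjacency weight matrix S of Eq.(2): heat-kernel weights on k-nearest-neighbor pairs.
    X holds one sample per column (d x n)."""
    n = X.shape[1]
    d2 = np.sum((X[:, :, None] - X[:, None, :]) ** 2, axis=0)   # pairwise squared distances
    knn = np.argsort(d2, axis=1)[:, 1:k + 1]                    # k nearest neighbors, excluding self
    S = np.zeros((n, n))
    for j in range(n):
        S[knn[j], j] = np.exp(-d2[knn[j], j] / eps)
    return np.maximum(S, S.T)                                   # x_i in N_k(x_j) or x_j in N_k(x_i)

def lpp(X, S, d_out):
    """Solve X L X^T w = lambda X D X^T w and keep the eigenvectors of the d_out smallest eigenvalues."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    A, B = X @ L @ X.T, X @ D @ X.T
    vals, vecs = eigh(A, B + 1e-8 * np.eye(B.shape[0]))         # generalized eigenproblem; ridge keeps B positive definite
    return vecs[:, :d_out]                                      # projection matrix W (d x d_out)
```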

    2 SELF-DEPENDENT LOCALITY PRESERVING PROJECTIONS

    2.1 Motivation

According to Refs. [6,14], the locality preserving power of LPP is potentially related to its discriminating power. As a result, the 1-NN classifier in the LPP transformed space generally achieves better performance than the baseline scenario (i.e., performing the 1-NN classifier in the original space without any transformation). In fact, this has also been observed in the experiments of many research works [6,9,14] related to LPP.

Since the neighborhood graph and the 1-NN classifier are both closely related to the nearest neighbor criterion, a natural idea is that a neighborhood graph constructed in the LPP transformed space includes more discriminative information than one constructed in the original space. Therefore, this paper updates the neighborhood graph in the previous LPP transformed space, and then repeats LPP. The corresponding algorithm is called the self-dependent LPP (sdLPP), since it only depends on LPP itself instead of resorting to extra tricks or tools. This paper further develops an improved Laplacian score (iLS) to serve as the stopping condition of the iteration, and proposes a specific algorithm.

    2.2 Improved Laplacian score

The original Laplacian score (LS) was introduced to measure the importance of features (or variables) [15]. Different from the classical Fisher score [16], which can only work in the supervised scenario, LS can work in supervised, unsupervised and even semi-supervised scenarios. Although LS aims at feature selection, it can naturally be extended to feature extraction or dimensionality reduction. However, typical LS is based on an artificially predefined neighborhood graph, and it becomes a constant once the specific projection directions are given. So the reliability of LS relies heavily on the single pre-fixed neighborhood graph, and LS cannot directly be used as the iteration termination condition in the proposed algorithm.

An improved Laplacian score (iLS) is defined as follows

$$L(W,S)=\frac{\operatorname{tr}\!\left(W^{\mathrm T}XDX^{\mathrm T}W\right)}{\operatorname{tr}\!\left(W^{\mathrm T}XLX^{\mathrm T}W\right)}\qquad(4)$$

The iLS shares a similar mathematical expression with the objective function of Eq. (3) and the typical LS [15], but it differs from them remarkably. In typical LS (or, more generally, most existing locality-oriented DR algorithms), the adjacency weight matrix S of the graph is fixed in advance, while in the proposed iLS, S is variable; that is, the iLS is a joint function with respect to W and S.
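As a concrete reference, a small sketch of the iLS computation follows, using the trace ratio form written in Eq. (4), under which larger values indicate better locality preservation. The function name and the column-per-sample matrix layout carry over from the earlier sketch and are assumptions of this illustration.

```python
import numpy as np

def improved_laplacian_score(W, X, S):
    """iLS of Eq.(4), treated as a joint function of the projection W and the graph weights S.
    X is d x n (one sample per column), W is d x d_out."""
    D = np.diag(S.sum(axis=1))        # vertex degrees (row sums of S)
    L = D - S                         # graph Laplacian
    Y = W.T @ X                       # data in the current transformed space
    return np.trace(Y @ D @ Y.T) / np.trace(Y @ L @ Y.T)   # larger -> locality better preserved
```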

    2.3 sdLPP Algorithm

Based on the above discussion, the sdLPP algorithm is given as follows:

(1) As in LPP and other locality-oriented DR algorithms, the data points are first projected into a PCA transformed subspace to remove the null space of XXᵀ and avoid the possible singularity problem. Without loss of generality, X is still used to denote the data points in the PCA transformed subspace.

(2) Construct the initial neighborhood graph G(X, S) with appropriate neighborhood size k and Gaussian kernel width ε in Eq. (2); then calculate the projection matrix W by solving the generalized eigenvalue problem X L Xᵀ w = λ X D Xᵀ w, and finally calculate the iLS, L_old = L(W, S), with the current W and S.

(3) Update S in the previous LPP transformed space with the same parameters k and ε; then repeatedly compute the new projection matrix W and the iLS, L_new = L(W, S), with the new W and S.

(4) If L_new > L_old, let L_old = L_new and turn to Step (3); otherwise, stop and return W.

It is easy to see that the sdLPP algorithm is simple. It should be pointed out that the iLS is not an accurate indicator of discriminative power, since the proposed algorithm works completely in the unsupervised scenario. Therefore, in the experiments, this paper first performs ten iterations and then decides whether to continue for another ten iterations according to the iLS. This trick is used to avoid an unexpected stop due to possible fluctuations of the iLS. Of course, a well-designed heuristic strategy is worthy of deeper study in the future.
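A minimal sketch of this loop, reusing build_graph, lpp and improved_laplacian_score from the earlier sketches, might look as follows; min_iters encodes the ten-iteration warm-up mentioned above, and max_iters is an illustrative safeguard rather than part of the original procedure.

```python
def sdlpp(X, k, eps, d_out, min_iters=10, max_iters=100):
    """sdLPP of Section 2.3: alternately rebuild the graph in the transformed space and redo LPP."""
    S = build_graph(X, k, eps)                            # Step (2): initial graph in the original space
    W = lpp(X, S, d_out)
    score_old = improved_laplacian_score(W, X, S)
    for it in range(max_iters):
        S = build_graph(W.T @ X, k, eps)                  # Step (3): graph built from the transformed data
        W_new = lpp(X, S, d_out)                          # repeat LPP with the updated graph
        score_new = improved_laplacian_score(W_new, X, S)
        if it >= min_iters and score_new <= score_old:    # Step (4): stop once the iLS stops increasing
            break
        W, score_old = W_new, score_new
    return W
```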

    3 EXPERIMENTS

In this section, the proposed algorithm is compared with LPP through an illustrative example, clustering experiments and face recognition experiments.

    3.1 Illustrative example

Firstly, this paper visually shows how and why the algorithm works on the widely used Wine data set from the UCI machine learning repository. Wine has 13 features, 3 classes and 178 instances. A main characteristic of this data set is that its last feature has a large range and variance relative to the other features. As a result, this feature plays a dominant role in the data distribution. This generally challenges typical locality-oriented DR methods including LPP, since the neighborhood graph is fixed in advance and depends heavily on the last feature due to its large range and variance in the original space.

    (1)Data visualization

In particular, Fig. 1(a) shows the 2-D projections of Wine by typical LPP, whose adjacency graph is constructed with neighborhood size k = min{n_1, n_2, n_3} − 1, where n_i is the number of samples from the ith class, and with the heat kernel width ε set as the mean norm of the data [17] (the influence of these parameters on the ultimate performance is discussed below). Under the LPP transformation, the three classes overlap. Then, the proposed sdLPP is performed. Fig. 1(b) shows the improved Laplacian score at each iteration; generally speaking, it presents an increasing tendency over the iterations. Figs. 1(c-f) give the 2-D projections by sdLPP after 1, 5, 10 and 20 iterations, respectively, which show that the three classes are gradually separated from each other in the subspace. This illustrates that the graph updating strategy can potentially benefit the subsequent learning task.

    Fig.1 Data visualization results and iLS of sdLPP at each iteration on Wine data set

(2) Sensitivity to parameters k and ε

Model selection for unsupervised learning is one of the classical challenges in machine learning and pattern recognition. Fig. 2 shows the performances of LPP and sdLPP on the Wine data set with different parameter values. In the experiment, 25 samples per class are randomly selected for training and the remaining ones for test. The classification accuracies are averaged over 20 training/test splits, and the best results at certain dimensions are plotted. In particular, Fig. 2(a) shows the classification accuracies using graphs with different neighborhood sizes k and fixed heat kernel width ε₀, where k is traversed from 1 to 50 and ε₀ is set as the mean norm of the training samples [17]. Fig. 2(b) shows the accuracies using graphs with different kernel widths ε and fixed k, where ε is chosen from {2⁻¹⁰ε₀, …, 2⁰ε₀, …, 2¹⁰ε₀} and k is set to 24, i.e., 25 − 1.
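For reference, the parameter grids described above could be built as in the small sketch below; the Wine training matrix is a random placeholder, and ε₀ follows the mean-norm rule of Ref. [17].

```python
import numpy as np

X_train = np.random.randn(13, 75)                      # placeholder for the 13-feature Wine training data (d x n)
eps0 = np.mean(np.linalg.norm(X_train, axis=0))        # epsilon_0: mean norm of the training samples
k_grid = np.arange(1, 51)                              # neighborhood sizes 1..50, with the width fixed at eps0
eps_grid = [2.0 ** p * eps0 for p in range(-10, 11)]   # kernel widths 2^-10*eps0 ... 2^10*eps0, with k = 24
```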

Fig. 2 Classification accuracies on Wine using graphs with different parameter values

As shown in Fig. 2, with a pre-fixed graph, typical LPP generally suffers a serious performance drop when given an improper parameter value. In contrast, sdLPP is not so sensitive to the setting of the initial parameters, because the graph becomes better during the subsequent updating process.

    3.2 Clustering

In what follows, this paper performs clustering experiments on ten widely used UCI data sets, including Iris, Wine, Wdbc, and so on. The statistical parameters of all the data sets are summarized in Table 1.

Table 1 Statistical parameters of the data sets used for clustering

In the experiments, the normalized mutual information (NMI) [17] is adopted to evaluate the clustering performance. The NMI measure is defined as

$$\mathrm{NMI}(A,B)=\frac{I(A,B)}{\sqrt{H(A)\,H(B)}}\qquad(5)$$

where A and B are the random variables of the cluster memberships from the ground truth and from the output of the clustering algorithm, respectively; I(A, B) is the mutual information between A and B; and H(A) and H(B) are the entropies of A and B, respectively.
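A self-contained sketch of this measure, written directly from the definition above (mutual information over the geometric mean of the entropies), is given below; the function name is illustrative.

```python
import numpy as np

def nmi(a, b):
    """NMI of Eq.(5) for two integer labelings a (ground truth) and b (clustering output)."""
    a, b = np.asarray(a), np.asarray(b)
    ua, ub = np.unique(a), np.unique(b)
    pa = np.array([np.mean(a == u) for u in ua])          # marginal distribution of A
    pb = np.array([np.mean(b == v) for v in ub])          # marginal distribution of B
    h_a, h_b = -np.sum(pa * np.log(pa)), -np.sum(pb * np.log(pb))
    mi = 0.0
    for i, u in enumerate(ua):
        for j, v in enumerate(ub):
            p = np.mean((a == u) & (b == v))              # joint probability of the pair (u, v)
            if p > 0:
                mi += p * np.log(p / (pa[i] * pb[j]))
    return mi / np.sqrt(h_a * h_b)
```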

On all the data sets, k-means clustering is performed in the original space (baseline), the LPP transformed space and the sdLPP transformed space, respectively. For LPP and sdLPP, the neighborhood size k is set to min{n_1, n_2, …, n_c}, where n_i is the number of training samples from the ith class; the heat kernel parameter ε is set as the mean norm of the training samples according to the scheme used in Ref. [17]. The experiments are repeated 50 times, and the mean performances and the corresponding subspace dimensions are reported in Table 2.
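A sketch of this protocol on Wine (a single run, with an illustrative 2-D subspace) might look as follows; load_wine and KMeans come from scikit-learn, while build_graph, lpp, sdlpp and nmi refer to the earlier sketches.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_wine

data = load_wine()
X, y = data.data.T, data.target                        # X is d x n to match the earlier sketches
c = len(np.unique(y))
k = min(np.bincount(y))                                # neighborhood size from the smallest class size
eps = np.mean(np.linalg.norm(X, axis=0))               # heat kernel width: mean sample norm [17]
subspaces = {
    "baseline": X,
    "LPP": lpp(X, build_graph(X, k, eps), d_out=2).T @ X,
    "sdLPP": sdlpp(X, k, eps, d_out=2).T @ X,
}
for name, Z in subspaces.items():
    labels = KMeans(n_clusters=c, n_init=10).fit_predict(Z.T)   # k-means expects one sample per row
    print(name, nmi(y, labels))
```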

Table 2 Performance (NMI) comparisons for the clustering task

The results show that: (1) the performance of k-means can generally be improved after DR, e.g., on data sets ①—⑤, ⑧ and ⑨, which illustrates that the locality preserving power is potentially related to discriminative power; (2) on a very few data sets (⑥ and ⑦), the DR algorithms do not help the clustering, which is due to serious overlap of the data or inappropriately assigned parameter values for the neighborhood graph; (3) the sdLPP algorithm remarkably outperforms typical LPP on most of the data sets, which illustrates that the neighborhood graph updating can potentially benefit the subsequent discriminant task.

    3.3 Face recognition

Typical LPP has been successfully used in face recognition, where it appears as the popular Laplacianface [11]. This paper experimentally compares the proposed algorithm with Laplacianface on two publicly available face databases: AR and extended Yale B.

    (1)Database description

In the experiments, a subset of the AR face database provided and preprocessed by Martinez [18] is used. This subset contains 1 400 face images corresponding to 100 persons (50 men and 50 women), where each person has 14 different images with illumination changes and expressions. The original resolution of these face images is 165×120; for computational convenience, they are resized to 33×24. Fig. 3(a) gives the 14 face images of one person taken in two sessions with different illuminations and expressions. The extended Yale B database [19] contains 2 414 front-view face images of 38 individuals. For each individual, about 64 pictures were taken under various laboratory-controlled lighting conditions. In the experiments, cropped images with a resolution of 32×32 are used. Fig. 3(b) gives some face images of one person from this database.

    (2)Experimental setting

For the AR database, the face images taken in the first session are used for training, and the images taken in the second session are used for test. For extended Yale B, l (l = 10, 20 and 30, respectively) samples per individual are randomly selected for training and the remaining ones for test. The ultimate performance is the mean over 10 random training/test splits.

Firstly, the data are projected into a PCA transformed subspace, which is calculated from the training samples and keeps 98% of the energy; then, LPP and sdLPP are performed in this subspace, and the 1-NN classifier, chosen for its simplicity, effectiveness and efficiency, is used to evaluate the recognition rates on the test data. As a baseline, the recognition rates of the 1-NN classifier on the raw data are also given.
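A sketch of this pipeline is given below, where PCA and KNeighborsClassifier come from scikit-learn and sdlpp, lpp and build_graph refer to the earlier sketches; the images are assumed to be vectorized one per row, and d_out is an illustrative subspace dimension rather than a value from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def recognition_rate(X_train, y_train, X_test, y_test, k=1, d_out=100):
    """Recognition pipeline: PCA keeping 98% energy, then sdLPP (or LPP), then a 1-NN classifier."""
    pca = PCA(n_components=0.98).fit(X_train)                   # keep 98% of the energy
    A = pca.transform(X_train).T                                # d x n_train, matching the earlier sketches
    B = pca.transform(X_test).T                                 # d x n_test
    eps = np.mean(np.linalg.norm(A, axis=0))                    # kernel width: mean norm of training samples
    W = sdlpp(A, k, eps, d_out)                                 # or lpp(A, build_graph(A, k, eps), d_out)
    clf = KNeighborsClassifier(n_neighbors=1).fit((W.T @ A).T, y_train)
    return clf.score((W.T @ B).T, y_test)                       # recognition rate on the test images
```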

    Fig.3 Face images of one person

    (3)Parameter selection

For LPP and sdLPP, the model parameters include the neighborhood size k and the kernel width ε. In the experiments, ε is empirically set as the mean norm of the training samples [17], and k is determined by searching over a large range of candidate values and reporting the best results.

    (4)Experimental results and discussion

Fig. 4 shows the recognition rate curves of the different methods on the AR and extended Yale B (l = 10) databases. For AR, the best recognition rates of the baseline, LPP and sdLPP are 74.57%, 79.00% and 81.71%, respectively, where the best neighborhood size is k = 1. For extended Yale B, the best performances and corresponding dimensions are reported in Table 3. For l = 10, 20 and 30, the best neighborhood size k is 1, 1 and 2, respectively.

    From the experimental results,the following observations are obtained:

Table 3 Performance comparisons on the extended Yale B database

    Fig.4 Recognition rate curves based on different methods

(1) Typical LPP and the proposed sdLPP achieve better performance than the baseline method. This further illustrates that locality preserving DR algorithms can encode potential discriminating information, even in the unsupervised scenario.

(2) The proposed sdLPP consistently outperforms LPP on the face databases used. This illustrates that sdLPP actually benefits from the graph updating process.

    4 CONCLUSIONS

This paper develops a novel LPP algorithm with an adjustable neighborhood graph. As a summary, several favorable properties of the algorithm are enumerated below.

(1) sdLPP is self-dependent. It does not require extra tools or incidental machinery, but works directly on off-the-shelf LPP. So sdLPP is very simple and analytically tractable, and it naturally inherits some good characteristics from the original LPP. For example, it avoids the "out-of-sample" problem involved in manifold learning.

(2) sdLPP is not as sensitive to the neighborhood size k and Gaussian kernel width ε as typical LPP, because the neighborhood graph becomes better and better during the subsequent updating process.

    (3)sdLPP can potentially use the discriminative information lying in both the original space and the transformed space,since the graph in sdLPP is adjustable instead of being fixed beforehand as in LPP.This can potentially help the subsequent learning task.

(4) The idea behind sdLPP is quite general. It can easily and naturally be applied to many other graph-based DR methods with only slight modifications.

It is worthwhile to point out that the proposed algorithm, including the improved Laplacian score, is completely unsupervised. Although unsupervised DR methods do not require the efforts of human annotators, reliable supervised information generally helps to achieve better discriminative power. In the future, we expect to further improve the proposed algorithm by absorbing available label information and to extend it to the semi-supervised scenario.

[1] Hastie T. The elements of statistical learning: data mining, inference, and prediction [M]. 2nd ed. New York: Springer, 2009.

[2] Roweis S T, Saul L K. Nonlinear dimensionality reduction by locally linear embedding [J]. Science, 2000, 290(5500): 2323-2326.

[3] Tenenbaum J B. Mapping a manifold of perceptual observations [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 1998: 682-688.

[4] Belkin M, Niyogi P. Laplacian eigenmaps for dimensionality reduction and data representation [J]. Neural Computation, 2003, 15(6): 1373-1396.

[5] van der Maaten L J P, Postma E O, van den Herik H J. Dimensionality reduction: a comparative review [EB/OL]. http://ict.ewi.tudelft.nl/~lvandermaaten/Publications-files/JMLR-Paper.pdf, 2009-10.

[6] He X F, Niyogi P. Locality preserving projections [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2003: 153-160.

[7] He X F, Cai D, Yan S C, et al. Neighborhood preserving embedding [C]//IEEE International Conference on Computer Vision (ICCV). Washington, DC, USA: IEEE Computer Society, 2005: 1208-1213.

[8] Yang J, Zhang D, Yang J Y, et al. Globally maximizing, locally minimizing: unsupervised discriminant projection with applications to face and palm biometrics [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(4): 650-664.

[9] Qiao L, Chen S, Tan X. Sparsity preserving projections with applications to face recognition [J]. Pattern Recognition, 2010, 43(1): 331-341.

[10] Yan S C, Xu D, Zhang B Y, et al. Graph embedding and extensions: a general framework for dimensionality reduction [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(1): 40-51.

[11] He X F, Yan S C, Hu Y X, et al. Face recognition using Laplacianfaces [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(3): 328-340.

[12] Chen H T, Chang H W, Liu T L. Local discriminant embedding and its variants [C]//IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Washington, DC, USA: IEEE Computer Society, 2005: 846-853.

[13] Wang H, Yan S C, Xu D, et al. Trace ratio vs. ratio trace for dimensionality reduction [C]//IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Washington, DC, USA: IEEE Computer Society, 2007: 1-8.

[14] Cai D, He X F, Han J W, et al. Orthogonal Laplacianfaces for face recognition [J]. IEEE Transactions on Image Processing, 2006, 15(11): 3608-3614.

[15] He X, Cai D, Niyogi P. Laplacian score for feature selection [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2005: 507-514.

[16] Bishop C M. Pattern recognition and machine learning [M]. New York: Springer, 2006.

[17] Wu M, Scholkopf B. A local learning approach for clustering [C]//Neural Information Processing Systems (NIPS). Cambridge, MA, USA: MIT Press, 2006: 1529-1536.

[18] Martinez A M, Kak A C. PCA versus LDA [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001, 23(2): 228-233.

[19] Lee K C, Ho J, Kriegman D J. Acquiring linear subspaces for face recognition under variable lighting [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005, 27(5): 684-698.
