
    Multiple Kernel Clustering Based on Self-Weighted Local Kernel Alignment

    2019-11-07
    Computers, Materials & Continua, 2019, Issue 10

    Chuanli Wang, En Zhu, Xinwang Liu, Jiaohua Qin, Jianping Yin and Kaikai Zhao

    Abstract: Multiple kernel clustering based on local kernel alignment has achieved outstanding clustering performance by applying local kernel alignment to each sample. However, we observe that most existing works assume that every local kernel alignment contributes equally to clustering performance, while in fact the local kernel alignments on different samples contribute differently. This assumption can therefore harm clustering performance. To solve this issue, we design a multiple kernel clustering algorithm based on self-weighted local kernel alignment, which learns a proper weight for each local kernel alignment. Specifically, we introduce a new optimization variable, the weight, to denote the contribution of each local kernel alignment to clustering performance; the weights, kernel combination coefficients and cluster membership are then alternately optimized under the kernel alignment framework. In addition, we develop a three-step alternating iterative algorithm to solve the resulting optimization problem. Extensive experiments on five benchmark data sets have been conducted to evaluate the clustering performance of the proposed algorithm. The experimental results clearly demonstrate that the proposed algorithm outperforms typical multiple kernel clustering algorithms, which illustrates its effectiveness.

    Keywords: Multiple kernel clustering, kernel alignment, local kernel alignment, self-weighted.

    1 Introduction

    On the one hand, kernel-based clustering algorithms are simple and effective [Filippone, Camastra, Masulli et al. (2008); Tzortzis and Likas (2009)]; on the other hand, many typical clustering algorithms, such as spectral clustering and non-negative matrix factorization clustering, can be interpreted from the perspective of kernels [Dhillon, Guan and Kulis (2007); Ding, He, Simon et al. (2005)]. Therefore, kernel-based clustering has been a research hotspot in various applications [Gnen and Margolin (2014); Li, Qin, Xiang et al. (2015)]. Compared with a single kernel, multiple kernels can provide more useful and complementary information for clustering [Cai, Nie and Huang (2013); Cai, Jiao, Zhuge et al. (2018); Hou, Nie, Tao et al. (2017)]. Multiple kernel clustering (MKC) has consequently attracted more and more attention, and many MKC algorithms and variants have been proposed recently [Han, Yang, Yang et al. (2018); Du, Zhou, Shi et al. (2015)].

    MKC algorithms aim to improve clustering performance by jointly optimizing a group of kernel combination coefficients and the cluster membership [Liu, Dou, Yin et al. (2016)]. In light of the difference in optimization framework, existing MKC algorithms can be roughly grouped into two categories. The spirit of the first category is that the single kernel in the clustering objective function is replaced with a combined kernel, and the optimal kernel combination coefficients and cluster membership are solved under the clustering framework. The algorithms in this category mainly include multiple kernel K-means [Huang, Chuang, Chen et al. (2012)], multiple kernel fuzzy C-means [Chen, Chen, Lu et al. (2011)], robust multiple kernel K-means [Du, Zhou, Shi et al. (2015)], optimal neighborhood clustering [Liu, Zhou, Wang et al. (2017)], etc. The idea of the second category, instead, is that the cluster membership is viewed as a pseudo label and put into the objective of kernel alignment [Wang, Zhao and Tian (2015)], a widely used learning criterion in supervised learning; the optimal kernel combination coefficients and pseudo label are then optimized under the multiple kernel learning framework. Along this line, Lu et al. [Lu, Wang, Lu et al. (2014)] proposed centered kernel alignment for multiple kernel clustering, Liu et al. [Liu, Dou, Yin et al. (2016)] proposed kernel alignment maximization for clustering, and Li et al. [Li, Liu, Wang et al. (2016)] proposed multiple kernel clustering based on local kernel alignment. Our work in this paper pays close attention to the clustering algorithms of the second category.

    Among the algorithms of the second category, multiple kernel clustering based on local kernel alignment (LKAMKC) obtains prominent clustering performance by using local kernels to exploit the local structure information of data. Concretely, the sum of the objectives of all local kernel alignments is defined as the optimization objective of LKAMKC; that is, local kernel alignment is conducted on each sample.

    Although LKAMKC has achieved significant clustering performance, we observe that most existing works assume that each local kernel alignment contributes equally to clustering performance; that is, each local kernel alignment is considered equally throughout the clustering process. Obviously, this assumption does not take the differences between local kernel alignments into account, which can hinder the improvement of clustering performance. To address this issue, we propose a multiple kernel clustering algorithm based on self-weighted local kernel alignment. In detail, we introduce a weight variable to denote the contribution of each local kernel alignment to clustering performance, and then the weight of each local kernel alignment, the kernel combination coefficients and the cluster membership are jointly optimized. The proposed algorithm improves clustering performance by imposing the learned weight on each local kernel alignment. We then develop a three-step alternating iterative optimization algorithm to solve the new optimization problem. Extensive experiments on five benchmark data sets have been conducted to evaluate the clustering performance of the proposed algorithm. The experimental results clearly show that the proposed algorithm outperforms the compared methods, which illustrates its effectiveness.

    2 Related work

    In this section, we review the related work on kernel alignment and local kernel alignment for multiple kernel clustering.

    2.1 Kernel alignment for multiple kernel clustering

    Suppose a data set with n samples has m kernel matrices {Kp}, p = 1, …, m, and the data set needs to be divided into k clusters. Let H and Kμ denote the relaxed cluster membership matrix and the combined kernel, respectively. Kμ can be calculated as:
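    The combination formula itself did not survive extraction. In the MKKM-style formulations this paper builds on (Liu et al.), the combined kernel is conventionally written with squared coefficients; the following is a reconstruction under that assumption, not a verbatim copy of Eq. (1):

    $$K_\mu = \sum_{p=1}^{m} \mu_p^{2}\, K_p$$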

    where μp ≥ 0 denotes the combination coefficient of kernel matrix Kp in Kμ.

    According to Lu et al. [Lu, Wang, Lu et al. (2014)], HH⊤ can be regarded as a pseudo ideal kernel matrix. By substituting HH⊤ for the true ideal kernel, the objective of kernel alignment for multiple kernel clustering (KAMKC) can be expressed as:
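    Eq. (2) is missing from this copy; based on the surrounding description (alignment between the combined kernel and the pseudo ideal kernel, under the constraints stated below), it presumably takes the form (a reconstruction):

    $$\max_{H,\,\mu}\ \langle K_\mu,\ HH^{\top}\rangle_F \quad \text{s.t.}\ H^{\top}H=I_k,\ \ \mu^{\top}\mathbf{1}_m=1,\ \ \mu_p\ge 0$$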

    where 〈·,·〉F denotes the Frobenius inner product of two matrices and μ = [μ1, …, μm]. H⊤H = Ik means H satisfies the orthogonality constraint, and μ⊤1m = 1 means μ satisfies the one-norm constraint.

    Because Eq. (2) is too complicated to optimize directly, Liu et al. [Liu, Dou, Yin et al. (2016)] not only theoretically discussed the connection between KAMKC and multiple kernel K-means (MKKM) but also derived an easier, equivalent optimization objective of KAMKC based on MKKM. The new optimization formula of KAMKC can be written as:
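    Eq. (3) also did not survive extraction; the MKKM-equivalent form of KAMKC derived by Liu et al. is conventionally written as (a reconstruction):

    $$\min_{H,\,\mu}\ \operatorname{Tr}\!\big(K_\mu\,(I_n - HH^{\top})\big) \quad \text{s.t.}\ H^{\top}H=I_k,\ \ \mu^{\top}\mathbf{1}_m=1,\ \ \mu_p\ge 0$$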

    2.2 Local kernel alignment for multiple kernel clustering

    As seen from Eq. (2) or Eq. (3), KAMKC only utilizes the global structure information of the kernel while ignoring its local structure information. Local kernel alignment for multiple kernel clustering (LKAMKC) enhances the clustering performance by exploiting the local structure of each sample with local kernels.

    Replacing the global Kμ, H and M with their local counterparts Kμ(i), H(i) and M(i), respectively, the objective of local kernel alignment (LKA) on the i-th sample can be written as:
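    The per-sample objective is missing here; assuming the regularizer μ⊤M(i)μ referred to in Section 3.2 and a neighborhood of size τ, the LKA objective on the i-th sample plausibly reads (a reconstruction):

    $$\min_{H,\,\mu}\ \operatorname{Tr}\!\big(K_\mu^{(i)}\,(I_\tau - H^{(i)}H^{(i)\top})\big) + \lambda\,\mu^{\top}M^{(i)}\mu$$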

    By accumulating the objectives of all the local kernel alignments, the objective of LKAMKC can be written as:
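    Eq. (8) is likewise missing; summing the per-sample objectives gives, under the same assumptions as above (a reconstruction):

    $$\min_{H,\,\mu}\ \sum_{i=1}^{n}\Big[\operatorname{Tr}\!\big(K_\mu^{(i)}(I_\tau - H^{(i)}H^{(i)\top})\big) + \lambda\,\mu^{\top}M^{(i)}\mu\Big] \quad \text{s.t.}\ H^{\top}H=I_k,\ \ \mu^{\top}\mathbf{1}_m=1,\ \ \mu_p\ge 0$$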

    3 Multiple kernel clustering based on self-weighted local kernel alignment

    3.1 The proposed formulation

    As shown in Eq. (8), LKAMKC considers each local kernel alignment equally, inappropriately ignoring the differences between them. Thus, the contribution of each LKA to clustering performance is not properly exploited, which can hinder the improvement of clustering performance. To address this issue, we introduce a new weight variable to denote the contribution of each local kernel alignment to clustering performance. The new optimization variable and the existing optimization variables in Eq. (8) are jointly optimized. By imposing a weight on each local kernel alignment in Eq. (8), the formulation of the proposed multiple kernel clustering algorithm can be written as:
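    Eq. (9) did not survive extraction. Since Section 3.2 derives an analytic w from a convex quadratic subproblem, the weights plausibly enter squared; under that assumption, the proposed formulation reads (a reconstruction):

    $$\min_{H,\,\mu,\,w}\ \sum_{i=1}^{n} w_i^{2}\Big[\operatorname{Tr}\!\big(K_\mu^{(i)}(I_\tau - H^{(i)}H^{(i)\top})\big) + \lambda\,\mu^{\top}M^{(i)}\mu\Big] \quad \text{s.t.}\ H^{\top}H=I_k,\ \ \mu^{\top}\mathbf{1}_m=1,\ \ w^{\top}\mathbf{1}_n=1,\ \ w_i\ge 0$$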

    where w = [w1, w2, …, wn] holds the weight of each local kernel alignment, and w⊤1n = 1 means w must satisfy the one-norm constraint.

    3.2 Optimization

    Although the proposed algorithm introduces a new variable, the optimization problem in Eq. (9) can still be solved. Specifically, we propose a three-step alternating iterative method to optimize Eq. (9).

    (i) Optimizing H when μ and w are given

    Suppose the other two optimization variables are given beforehand; then Eq. (9) can be translated into the following optimization problem.
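    Eq. (10) is missing from this copy; using the neighborhood-selection matrices S(i) that appear in the proof of Theorem 2 (so that the local quantities are cut out of the global ones), the H-subproblem can plausibly be collected into a single trace maximization (a reconstruction):

    $$\max_{H^{\top}H=I_k}\ \operatorname{Tr}\!\big(H^{\top}VH\big), \qquad V=\sum_{i=1}^{n} w_i^{2}\, S^{(i)}K_\mu^{(i)}S^{(i)\top}$$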

    Eq. (10) is a standard kernel k-means problem, and the optimal H consists of the k eigenvectors corresponding to the k largest eigenvalues of V.
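    This eigenvector step can be sketched in a few lines; the snippet below is illustrative, not the authors' implementation (NumPy's `eigh` is used since V is symmetric):

```python
import numpy as np

def update_H(V, k):
    """Return the k eigenvectors of symmetric V with the largest
    eigenvalues, stacked as columns (the relaxed membership H)."""
    eigvals, eigvecs = np.linalg.eigh(V)  # eigenvalues in ascending order
    return eigvecs[:, -k:]                # columns for the k largest
```

    By construction the columns are orthonormal, so the constraint H⊤H = Ik holds exactly.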

    (ii) Optimizing μ when H and w are given

    If H and w are fixed, Eq. (9) is equivalent to a quadratic programming problem in μ.

    Eq.(11)can be effectively solved by existing off-the-shelf packages.
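    Simplex-constrained QPs of this kind are routine for generic solvers; the sketch below uses SciPy's SLSQP as a stand-in for the off-the-shelf packages mentioned above, with a hypothetical m×m matrix `Z` collecting the quadratic terms of Eq. (11):

```python
import numpy as np
from scipy.optimize import minimize

def update_mu(Z):
    """Solve min_mu 0.5 * mu^T Z mu  s.t.  mu >= 0, sum(mu) = 1."""
    m = Z.shape[0]
    mu0 = np.full(m, 1.0 / m)  # start from uniform coefficients
    cons = {"type": "eq", "fun": lambda mu: mu.sum() - 1.0}
    res = minimize(lambda mu: 0.5 * mu @ Z @ mu, mu0,
                   bounds=[(0.0, None)] * m,
                   constraints=cons, method="SLSQP")
    return res.x
```

    For a diagonal Z this reproduces the closed-form simplex solution, which is a convenient sanity check.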

    (iii) Optimizing w when H and μ are given

    If H and μ are fixed, Eq. (9) is equivalent to the following optimization problem.
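    Eq. (12) did not survive extraction; writing ai for the per-sample cost (the bracketed term of the weighted objective, evaluated at the current H and μ), the w-subproblem plausibly reads (a reconstruction consistent with the analytic solution discussed next):

    $$\min_{w}\ \sum_{i=1}^{n} w_i^{2}\,a_i \quad \text{s.t.}\ w^{\top}\mathbf{1}_n=1,\ \ w_i\ge 0$$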

    Clearly, if ai is greater than zero, Eq. (12) is a convex quadratic programming problem with an analytic solution. By applying the KKT conditions to Eq. (12), the globally optimal wi can be computed as follows.
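    The closed form itself is missing; applying the KKT conditions to a problem of the form min_w Σi wi²ai subject to w⊤1n = 1 (the reconstruction assumed above) gives

    $$w_i=\frac{1/a_i}{\sum_{j=1}^{n} 1/a_j},$$

    so samples whose local alignment cost ai is already small receive larger weights, and the non-negativity constraint is satisfied automatically when every ai > 0.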

    To prove that ai is greater than zero, we only need to prove that the other term of ai is greater than zero, because μ⊤M(i)μ must be greater than zero since M(i) is a positive definite matrix.

    Proof: H⊤H = Ik, and HH⊤H = H because H has orthonormal columns. Let hi denote the i-th column of matrix H, where 1 ≤ i ≤ k. Clearly, HH⊤hi = hi, which shows that HH⊤ has k eigenvalues equal to one and n−k eigenvalues equal to zero. Likewise, In − HH⊤ has n−k eigenvalues equal to one and k eigenvalues equal to zero, so In − HH⊤ is a positive semi-definite matrix. In addition, Kμ is a positive definite kernel matrix; therefore, the corresponding trace term is greater than zero.

    Proof: We have A(i) = A(i)A(i) and A(i) = A(i)⊤ since A(i) = S(i)S(i)⊤. Thus A(i) − A(i)HH⊤A(i) = A(i)(In − HH⊤)A(i). Let y be an arbitrary vector. Clearly, y⊤A(i)(In − HH⊤)A(i)y ≥ 0 because (y⊤A(i))⊤ = A(i)y and In − HH⊤ is a positive semi-definite matrix, as justified by Theorem 1. Therefore, A(i) − A(i)HH⊤A(i) is a positive semi-definite matrix and, correspondingly, the associated term is non-negative.

    3.3 Analysis of convergence

    In the proposed algorithm, the neighborhood of samples is crucial, yet it is difficult to define exactly during clustering. To simplify the optimization problem, we keep the neighborhood of samples fixed during the whole optimization process. By doing so, Eq. (10) is a standard kernel k-means optimization problem, and Eq. (11) and Eq. (12) are convex quadratic programming problems; each subproblem is solved optimally. Besides, the objective of the proposed algorithm has a lower bound. Therefore, the proposed clustering algorithm is convergent. The experimental results below illustrate the convergence of the proposed algorithm.

    We use Algorithm 1 to describe the implementation of the proposed algorithm, where t is the iteration counter. The input of Algorithm 1 includes the kernel matrices, the number k of clusters, the regularization parameter λ and the convergence threshold θ. The output includes the relaxed clustering membership H, the kernel combination coefficients μ and the weight w of each local kernel alignment. The convergence condition of Algorithm 1 is that the difference of the last two objective values is less than θ.

    Algorithm 1 Multiple Kernel Clustering based on Self-weighted Local Kernel Alignment

    Input: kernel matrices, number of clusters k, regularization parameter λ, convergence threshold θ.

    Output: H, μ and w.

    Initialize A(i) for all samples according to a τ-nearest-neighbors criterion.
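    The alternation in Algorithm 1 can be sketched generically; the driver below is an illustrative skeleton (not the authors' MATLAB implementation) that takes the three update steps and the objective as callables and applies the stopping rule described above:

```python
def alternate_optimize(step_H, step_mu, step_w, objective,
                       H, mu, w, theta=1e-6, max_iter=100):
    """Three-step alternating optimization: update H, mu and w in turn
    until the objective changes by less than the threshold theta."""
    prev = objective(H, mu, w)
    for t in range(max_iter):
        H = step_H(mu, w)
        mu = step_mu(H, w)
        w = step_w(H, mu)
        obj = objective(H, mu, w)
        if abs(prev - obj) < theta:
            break
        prev = obj
    return H, mu, w
```

    Because each step solves its subproblem exactly, the objective is non-increasing across iterations, which is what makes the threshold test a valid stopping rule.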

    4 Experiment

    In this section, we conduct a large number of experiments to evaluate the clustering performance of the proposed algorithm. Moreover, we compare the proposed algorithm with many state-of-the-art MKC algorithms proposed recently.

    4.1 Data sets

    To conveniently and convincingly evaluate the clustering performance of the proposed algorithm, five benchmark data sets from multiple kernel learning are adopted in our experiments. They are Yale (https://vismod.media.mit.edu/vismod/classes/mas62200/datasets/), Digital (https://ss.sysu.edu.cn/py/), ProteinFold (https://mkl.ucsd.edu/dataset/protein-fold-prediction), Movement (https://archive.ics.uci.edu/ml/datasets/Libras+Movement) and Caltech102 (https://mkl.ucsd.edu/dataset/). The numbers of samples, kernels and classes of these data sets are listed in Tab. 1.

    Table 1: The details of the data sets in our experiments

    4.2 Compared algorithms

    Local kernel alignment for multiple kernel clustering (LKAMKC) [Li, Liu, Wang et al. (2016)] is a strong baseline since the proposed clustering algorithm directly extends it. In addition, the compared algorithms include many related state-of-the-art multiple kernel clustering algorithms. Details of the compared algorithms are as follows: multiple kernel K-means (MKKM) [Huang, Chuang, Chen et al. (2012)], localized multiple kernel K-means (LMKKM) [Gnen and Margolin (2014)], robust multiple kernel K-means (RMKKM) [Du, Zhou, Shi et al. (2015)], co-regularized spectral clustering (CRSC) [Kumar and Daumé (2011)], robust multi-view spectral clustering (RMSC) [Xia, Pan, Du et al. (2014)], robust multiple kernel clustering (RMKC) [Zhou, Du, Shi et al. (2015)], kernel alignment for multiple kernel clustering (KAMKC) [Liu, Dou, Yin et al. (2016)], and optimal kernel clustering with multiple kernels (OKMKC) [Liu, Zhou, Wang et al. (2017)].

    4.3 Experiment setup

    For the Movement data set, 12 kernel matrices are computed according to Zhou et al. [Zhou, Du, Shi et al. (2015)], and the kernel matrices of the other data sets are downloaded from the respective websites. To eliminate scale differences between kernels, we make the diagonal elements of all kernel matrices equal to one by applying centering and scaling to the kernels [Cortes, Mohri and Rostamizadeh (2013)].
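    The scaling step, which makes every diagonal entry equal to one, amounts to cosine-normalizing each kernel. A minimal sketch follows; it shows only the scaling and omits the centering step of Cortes et al.:

```python
import numpy as np

def unit_diagonal(K):
    """Scale kernel K so that K'[i, i] = 1 for all i:
    K'[i, j] = K[i, j] / sqrt(K[i, i] * K[j, j])."""
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)
```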

    The LKAMKC algorithm and the proposed algorithm have the same two parameters: the number of neighbors τ and the regularization parameter λ. For the neighborhood of samples, we respectively select the first kernel, the second kernel, the third kernel and the average kernel to measure it, and the optimal τ is obtained by grid search over [0.05, 0.1, …, 0.95]·n, where n is the number of samples. For the regularization parameter λ, the optimal value is chosen by grid search over [2^-10, …, 2^10]. For the other compared algorithms, their parameters are set according to the methods used in the corresponding references.

    To objectively evaluate the performance of the clustering algorithms, in all experiments we use the true number of classes as the number of clusters, and we adopt clustering accuracy (ACC), normalized mutual information (NMI) and purity as the indicators of clustering performance. All simulations of the proposed and compared algorithms are carried out in a MATLAB 2013b environment on a Windows 8 operating system. To reduce the effect of randomness caused by K-means as much as possible, we repeat each experiment 30 times and report the best result.
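    Of the three indicators, ACC and NMI are standard; purity, for reference, assigns each cluster the count of its majority true class. A minimal sketch (illustrative, not the evaluation code used in the paper):

```python
from collections import Counter

def purity(y_true, y_pred):
    """Fraction of samples belonging to the majority true class
    of their assigned cluster."""
    total = 0
    for c in set(y_pred):
        members = [t for t, p in zip(y_true, y_pred) if p == c]
        total += Counter(members).most_common(1)[0][1]
    return total / len(y_true)
```

    Note that purity is invariant to how clusters are labeled, which is why it is suitable for comparing clusterings against ground truth.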

    Table 2: Clustering performance of all algorithms on all data sets

    4.4 Experimental results

    Tab. 2 reports the best experimental results of the proposed algorithm and all compared algorithms, and Tab. 3 reports more detailed comparison results between the proposed algorithm and LKAMKC. In all experiments, the neighborhood of samples is fixed but the criterion used to measure it is adjustable. In Tab. 2, LKAMKC and the proposed algorithm use the average combination kernel to measure the neighborhood of samples. In Tab. 3, LKAMKC-K1, LKAMKC-K2, LKAMKC-K3 and LKAMKC-A denote LKAMKC adopting the first kernel, the second kernel, the third kernel and the average combination kernel, respectively, to measure the neighborhood of samples; likewise, Proposed-K1, Proposed-K2, Proposed-K3 and Proposed-A denote the proposed algorithm adopting these four criteria. From Tab. 2, we make the following observations.

    Table 3: The detailed comparison between the proposed algorithm and LKAMKC

    The clustering algorithms that utilize local kernels, namely LKAMKC and the proposed algorithm, significantly outperform the compared ones, which do not; among the latter, OKMKC demonstrates the best performance. Taking ACC as an example, the proposed algorithm exceeds OKMKC by 4.02%, 10.92%, 3.53%, 3.47% and 6.29% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively. Similar conclusions can be drawn in terms of NMI and purity. This clearly indicates the importance of the local geometrical structure of data for clustering.

    In terms of the performance indicators ACC, NMI and purity, the proposed algorithm obtains the best clustering performance on all data sets. Taking ACC as an example, it exceeds LKAMKC, a strong baseline since the proposed algorithm directly extends it, by 0.99%, 3.23%, 3.53%, 3.01% and 4.13% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively. The excellent performance of the proposed algorithm in terms of NMI and purity can also be seen in Tab. 2, where similar observations can be made. This clearly shows the superiority of suitably weighting the local kernel alignments.

    From Tab. 3, we can draw the following points:

    Both LKAMKC and the proposed algorithm are sensitive to the neighborhood of samples. Taking Digital as an example, both LKAMKC and the proposed algorithm achieve better performance when using the third kernel to measure the neighborhood of samples than when using the first kernel.

    Using the average kernel to measure the neighborhood of samples achieves better performance than using a single kernel. Taking ACC as an example, Proposed-A and LKAMKC-A exceed Proposed-K1 and LKAMKC-K1 by 0.06% and 0.08%, 3.45% and 3.20%, 5.75% and 2.65%, 4.51% and 2.85%, and 0.73% and 0.78% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively, which also shows that the combined kernel contains more information about the neighborhood of samples than a single kernel does.

    No matter which criterion for the neighborhood of samples is chosen, the proposed algorithm is always better than LKAMKC. Taking ACC as an example, Proposed-K1 exceeds LKAMKC-K1 by 1.82%, 1.35%, 2.62%, 0.97% and 2.19% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively, which again confirms the superiority and effectiveness of the proposed algorithm.

    4.5 Parameter selection and convergence

    When applying the proposed algorithm to cluster data, two parameters, the number τ of nearest neighbors and the regularization parameter λ, need to be set manually. Tab. 3 has analyzed the effect of the neighborhood of samples on clustering performance. To evaluate the stability of the parameter λ, we select the average kernel to measure the neighborhood of samples, fix τ, and carry out a series of experiments on all data sets. The experimental results of the proposed algorithm and a baseline, the best result of LKAMKC with the same setting, are both drawn in Fig. 1. From Fig. 1, the following observations can be made.

    1) The clustering performance of the proposed algorithm on all data sets is stable when the parameter λ varies over a wide range. 2) For Yale, λ = 2^-1 is a watershed: if λ is less than the watershed, the ACC of the proposed algorithm is higher than the baseline; otherwise it is lower. 3) For Digital and Caltech102, λ also has a watershed, but differently: if λ is less than the watershed, the ACC of the proposed algorithm is lower than the baseline; otherwise it is higher. 4) For ProteinFold and Movement, the ACC of the proposed algorithm is better than the baseline when λ varies over a bounded range; for instance, when 2^-4.5 ≤ λ ≤ 2^8.5, the curve of the proposed algorithm lies above the baseline.

    To validate the convergence of the proposed algorithm, we record its objective value at each iteration with the parameters τ and λ fixed. Fig. 2 plots the objective value of the proposed algorithm against the iteration number. As seen from Fig. 2, the objective value decreases monotonically with the iterations, and the algorithm converges quickly, in fewer than eleven iterations, which confirms its convergence from the experimental point of view.

    Figure 1:The performance of the proposed algorithm with regard to parameter λ

    Figure 2:The convergence of the proposed algorithm

    5 Conclusions and future work

    In this paper, we propose a multiple kernel clustering algorithm based on self-weighted local kernel alignment, which improves clustering performance by more rationally exploiting the contribution of each local kernel alignment. A convergent three-step alternating optimization algorithm is developed to solve the resulting optimization problem. Extensive experiments on five benchmark data sets validate the effectiveness and superiority of the proposed algorithm.

    As shown by Eq. (8) and Eq. (9), both LKAMKC and the proposed algorithm utilize all local kernels for clustering. However, if the number of samples is large, clustering algorithms based on local kernel alignment are very time-consuming. Therefore, a fast version of the proposed algorithm that is suitable for big data sets [Xiao, Wang, Liu et al. (2018)] is worth studying in the future.

    Acknowledgement: This work was supported by the National Key R&D Program of China (No. 2018YFB1003203), the National Natural Science Foundation of China (Nos. 61672528, 61773392, 61772561), the Educational Commission of Hunan Province, China (No. 14B193) and the Key Research & Development Plan of Hunan Province (No. 2018NK2012).
