
    Multi-Label Learning Based on Transfer Learning and Label Correlation

Kehua Yang, Chaowei She, Wei Zhang, Jiqing Yao and Shaosong Long

Computers, Materials & Continua, 2019, Issue 10

Abstract: In recent years, multi-label learning has received a great deal of attention. However, most existing methods consider only global label correlation or only local label correlation. In fact, global and local label correlations can appear in real-world situations at the same time, and we should not restrict ourselves to pairwise labels while ignoring high-order label correlation. In this paper, we propose a novel and effective method for multi-label learning called GLLCBN. First, we obtain the global label correlation by exploiting label semantic similarity. Then, we analyze the pairwise labels in the label space of the data set to acquire the local correlation. Next, we build an initial label dependency model from the global and local label correlations. After that, we use graph theory, probability theory and Bayesian networks to eliminate redundant dependency structure in this initial model and obtain the optimal label dependency model. Finally, we obtain a feature extraction model by adjusting the Inception V3 convolutional neural network through transfer learning and combine it with the GLLCBN model to achieve multi-label learning. Experimental results show that our proposed model performs better than other multi-label learning methods on the evaluation metrics.

Keywords: Bayesian networks, multi-label learning, global and local label correlations, transfer learning.

    1 Introduction

Nowadays, we live in an information age. An instance often cannot be described by a single label, so it is frequently associated with more than one class label. For example, an image can be annotated with several labels [Su, Chou, Lin et al. (2011)], a piece of music can belong to many genres [Turnbull, Barrington, Torres et al. (2008)], and a text can reflect different themes [Ueda and Saito (2002)]. Therefore, multi-label classification attracts more and more researchers.

Multi-label learning algorithms fall into two categories [Zhang and Zhou (2007)]: problem transformation and algorithm adaptation. Problem transformation is a straightforward strategy whose main idea is to convert the multi-label problem into one or more traditional single-label problems; algorithms include Binary Relevance (BR) [Boutell, Luo, Shen et al. (2004)], Pruned Problem Transformation (PPT) [Read, Pfahringer and Holmes (2009)] and so on. Algorithm adaptation instead adapts a single-label classification algorithm to the multi-label setting; classic algorithms include the C4.5 decision tree [Quinlan (1992)], Multi-label Dimensionality reduction via Dependence Maximization (MDDM) [Zhang and Zhou (2010)], Multi-label Informed Latent Semantic Indexing (MLSI) [Yu, Yu and Tresp (2005)] and so on.

Label correlation can provide important information for multi-label classification. For example, "blue sky" and "white cloud" are highly co-occurring labels, while "sunny" and "black clouds" are highly exclusive. If "ocean" and "sailboat" appear at the same time, it is highly likely that the "fish" label will also be included, while the "desert" label will not. However, most existing methods focus mainly on globally shared label characteristics and ignore label correlations that hold only within local subsets of the data. For example, "Jack Ma" is associated with "Alibaba" in an IT-company data set [Liu, Peng and Wang (2018)], but this relation is weak at the global level. According to the above analysis, it is therefore more practical and comprehensive to consider both global and local label correlations in multi-label classification.

In multi-label learning, each instance carries a multi-dimensional label vector. If instances are annotated purely by hand, humans may overlook labels that they do not know or care little about, or follow the guidance of an algorithm designed to reduce labeling cost [Huang, Chen and Zhou (2015)]. Some labels may therefore be missing from the training set, which is a form of weakly supervised learning, and subjectivity in labeling is unavoidable. As a result, missing labels lead to label imbalance in the data set, which makes estimating label correlations more difficult and can hurt performance.

In this paper, we propose a novel and effective method called Bayesian Networks with Global and Local Label Correlation (GLLCBN). The main idea of GLLCBN is to use global label semantic correlation together with the local label correlation of the data set to balance label correlation and reduce the impact of label noise. First, the probability of each individual label is estimated from the data set. Similarly, we estimate the probabilities between pairwise labels from the data set. Then, the global label correlation matrix is constructed from label semantic similarity. After that, based on the probability information obtained in the first three steps, an initial Bayesian network topology is constructed to capture high-order label correlation. Redundant edges (label correlations) in the network structure are then removed using graph theory and probability theory, yielding the GLLCBN model. Finally, an initial label prediction is obtained by adjusting and training the Inception V3 model via transfer learning, and this prediction is combined with GLLCBN to achieve multi-label classification.

The remainder of this paper is organized as follows. Section 2 introduces related work on multi-label learning. Section 3 presents our proposed algorithm in detail. Section 4 reports experiments that verify the performance of the proposed method. Finally, conclusions and future work are given in Section 5.

    2 Related work

In recent years, multi-label learning has been extensively studied and many methods have been proposed [Zhang and Zhou (2007)]. In addition, the role of label correlation has gradually become a focus for researchers. Methods can be divided into three categories according to the order of label correlation they exploit [Zhang and Zhang (2010)].

First-order methods convert a multi-label classification task into multiple independent single-label classifiers. For example, the classic BR [Boutell, Luo, Shen et al. (2004)] trains a separate classifier for each label independently. The advantage of this approach is its simplicity, but it ignores label correlation. Second-order methods exploit correlations between pairs of labels. For example, CLR [Brinker (2008)] converts the multi-label classification problem by analyzing pairwise label correlations and establishing a label ranking. Although considering pairwise correlations brings some efficiency improvement, multi-label problems generally have high label dimensionality, so we should not restrict ourselves to pairwise labels; this motivates high-order methods. High-order methods analyze correlations among higher-order combinations of labels rather than only pairs. For example, MLLRC [Huang and Zhou (2012)] solves the multi-label classification problem by exploiting the rank of the label matrix. The advantage of high-order methods is that they capture the intrinsic connections among labels and strengthen label dependencies, but the correlation analysis is harder and the resulting correlation structure is more complicated.

Labeling an instance may also lead to label imbalance because of subjectivity. For example, the actual labels of the image in Fig. 1 should include "bull", "mountain" and "road". With manual labeling, the left picture may be annotated in the order "bull", "mountain", "road", while the right picture may be annotated as "mountain", "road", "bull"; sometimes the "bull" label is even lost because of visual effects. GLOCAL [Zhu, Kwok and Zhou (2017)] indicated that missing labels and label order are influential factors for multi-label classification.

Figure 1: Image annotation

In summary, the study of multi-label classification should consider not only global label correlation but also local label correlation, so that a more balanced and comprehensive view of label correlation can be obtained.

    3 The proposed approach

In this section, the proposed GLLCBN approach is presented in detail. First, we define the notation and analyze global and local label correlations to obtain the GLLCBN model. Second, we combine an Inception V3 model [Szegedy, Vanhoucke, Ioffe et al. (2016)] adjusted by transfer learning with GLLCBN to achieve multi-label classification.

    3.1 Preliminaries

Multi-label classification differs from single-label classification in that its label output is high-dimensional, so we first fix the notation. Let D = R^d be the d-dimensional sample space and L = {l1, l2, ..., lm} the label set, where m is the number of labels in the data set. On the one hand, the correspondence between instances and labels is defined as Q = {(Ni, Mi) | i = 1, 2, ..., n}, where n is the total number of instances and Ni ∈ D is a d-dimensional feature vector; we write Ni = (ni1, ni2, ..., nid) for the feature vector of a sample instance. On the other hand, we denote by M ∈ {-1, +1}^(n×m) the sample label matrix, where Mi is the label vector associated with Ni; each element Mij = +1 if the i-th instance has the j-th label, and Mij = -1 otherwise.
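
As a concrete illustration of this notation, the following short NumPy sketch builds a toy feature matrix N and a ±1 label matrix M; the shapes and values are invented purely for illustration.

import numpy as np

# Toy illustration of the notation above (values are made up).
n, d, m = 4, 3, 5            # n instances, d-dimensional features, m labels

# Feature vectors N_i in D = R^d, stacked row-wise.
N = np.random.rand(n, d)

# Sample label matrix M in {-1, +1}^(n x m):
# M[i, j] = +1 if instance i carries label l_j, otherwise -1.
M = -np.ones((n, m), dtype=int)
M[0, [0, 2]] = 1             # instance 0 has labels l1 and l3
M[1, [1, 2, 4]] = 1          # instance 1 has labels l2, l3 and l5

# The data set Q = {(N_i, M_i) | i = 1, ..., n} as (feature, label) pairs.
Q = list(zip(N, M))
print(len(Q), Q[0][1])       # 4, array([ 1, -1,  1, -1, -1])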

    3.2 Label correlation

Label correlation contains potentially important information for the multi-label classification problem, so analyzing it is an essential part of our approach [Punera, Rajan and Ghosh (2005)]. However, this analysis poses certain difficulties, and how to address them has become a research direction of its own. In order to analyze label correlation more reasonably and comprehensively, we handle both the local correlation within the data set and the global correlation of label semantics.

    3.2.1 Local label correlation

We derive local label correlation from the data set. Since the data are random, different labels occur with different probabilities. We denote by P = [p(l1), p(l2), ..., p(lm)] the occurrence probabilities of the labels, where m is the total number of labels and p(li) is the probability of the i-th label in the data set. Since label correlation is at least second-order, we also need the probabilities of pairwise labels. The local label correlation is defined as

p(li|lj) = T(Nli,lj) / T(Nlj),    (1)

where li and lj are single labels in the label set. T(Nlj) denotes the number of sample instances carrying label lj; to avoid an undefined expression, if T(Nlj) equals 0 then p(li|lj) is set to 0. Similarly, T(Nli,lj) is the number of sample instances that carry both labels li and lj. We use p(li|lj) and p(lj|li) to denote the two directions of pairwise label correlation. It is important to note that pairwise label correlation is not a symmetric relation, that is, in general

p(li|lj) ≠ p(lj|li).    (2)

For example, consider the data set shown in Tab. 1:

Table 1: Data set

According to the table, p(lA|lB) ≠ p(lB|lA), so Eq. (2) holds.
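
To make the count-based estimate concrete, the sketch below computes p(li|lj) for every ordered label pair from a ±1 label matrix, treating the zero-count case exactly as described above. The small label matrix is made up for illustration.

import numpy as np

def local_label_correlation(M):
    """Estimate p(li|lj) = T(Nli,lj) / T(Nlj) from a {-1,+1} label matrix M (n x m)."""
    Y = (M == 1)                                   # boolean indicator of label presence
    counts = Y.sum(axis=0).astype(float)           # T(Nlj): instances carrying label lj
    co = Y.T.astype(float) @ Y.astype(float)       # co[i, j] = T(Nli,lj)
    P = np.zeros_like(co)
    nonzero = counts > 0                           # if T(Nlj) = 0, p(li|lj) is defined as 0
    P[:, nonzero] = co[:, nonzero] / counts[nonzero]
    return P                                       # P[i, j] approximates p(li|lj)

# Made-up label matrix: 5 instances, 3 labels (A, B, C)
M = np.array([[ 1,  1, -1],
              [ 1, -1, -1],
              [ 1,  1,  1],
              [-1,  1, -1],
              [ 1, -1,  1]])
P = local_label_correlation(M)
print(P[0, 1], P[1, 0])   # 0.667 vs 0.5: p(lA|lB) != p(lB|lA), consistent with Eq. (2)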

    3.2.2 Global label correlation

We obtain global correlation by analyzing word similarity. The correlation between words is currently derived mainly from their contextual semantics: word vectors are used to judge the correlation between two words, and Word2vec [Mikolov, Chen, Corrado et al. (2013)] is a classic algorithm for this. For example, the words "man" and "woman" are highly relevant because they are used in similar contexts: in many sentences the position occupied by "man" can also be filled by "woman". We therefore define W = [W1, W2, ..., Wm]^T ∈ [0,1]^(m×m) as the word correlation matrix, where W1 = [w(l1|l1), w(l1|l2), ..., w(l1|lm)] is the vector of pairwise word correlations for label l1 and w(li|lj) is the word correlation probability between labels li and lj. The entries are defined as

w(li|lj) = 1 if i = j, and otherwise the Word2vec semantic similarity between the label words li and lj, scaled into [0, 1].    (3)

As Eq. (3) shows, each label is perfectly correlated with itself, so the diagonal value is 1; a small value means that the label correlation is low, and a large value the opposite.
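
The following sketch shows one plausible way to realize the global correlation matrix W with Word2vec similarities. The choice of the pretrained "word2vec-google-news-300" vectors from gensim and the clipping of cosine similarity into [0, 1] are our assumptions; the paper only states that Word2vec-style semantic similarity is used.

import numpy as np
import gensim.downloader as api

labels = ["desert", "ocean", "sunset", "mountains", "trees"]   # label names of the data set
kv = api.load("word2vec-google-news-300")                      # downloads pretrained vectors

m = len(labels)
W = np.eye(m)                                                  # w(li|li) = 1, as in Eq. (3)
for i in range(m):
    for j in range(m):
        if i != j:
            # cosine similarity can be slightly negative; clip it into [0, 1]
            W[i, j] = max(0.0, float(kv.similarity(labels[i], labels[j])))
print(np.round(W, 2))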

    3.3 GLLCBN model

According to the analysis in Section 3.2, we now have both the global and the local label correlations. Their relationship is defined as

E(li|lj) = λ1 · p(li|lj) + λ2 · w(li|lj),    (4)

where li and lj are a label pair and λ1, λ2 ∈ [0,1] with λ1 + λ2 = 1 are trade-off parameters controlling the weight between the local and global label correlations. E(li|lj) is then the comprehensive label correlation.

Obtaining E(li|lj) for every ordered pair is not yet enough, because the global and local correlations produce a cyclic relationship between each pair of labels: for labels li and lj we have both E(li|lj) and E(lj|li). Such a symmetric situation is ambiguous, since it is impossible to tell which of the two labels depends more strongly on the other. To resolve this, the ambiguous dependency of each pair is eliminated by keeping only the stronger direction:

lx1|lx2 = arg max { E(li|lj), E(lj|li) },    (5)

where li and lj are a label pair and lx1|lx2 is the finalized label dependency.

For example, consider the structure in Fig. 2, where a circle represents a label (e.g., A, B) and the edges between circles carry the probabilities of the pairwise label correlations (e.g., E(lB|lA), E(lA|lB)).

Figure 2: Correlation between labels A and B

According to Eq. (5), correlations follow the principle of maximum value, so the structure of Fig. 2 is optimized to the structure of Fig. 3, which indicates that label A depends more strongly on label B.

Figure 3: Optimized correlation between labels A and B
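
A minimal sketch of Eq. (4) and Eq. (5), assuming the comprehensive correlation is a convex combination of the local and global matrices and that, for each pair, only the stronger direction is kept as a directed dependency (which of λ1 and λ2 multiplies which term is our assumption):

import numpy as np

def comprehensive_correlation(P, W, lam1=0.5, lam2=0.5):
    """E(li|lj) as a convex combination of local p(li|lj) and global w(li|lj); lam1 + lam2 = 1."""
    assert abs(lam1 + lam2 - 1.0) < 1e-9
    return lam1 * P + lam2 * W

def orient_pairwise_edges(E):
    """Keep only the stronger of E(li|lj) and E(lj|li), following the maximum-value principle of Eq. (5)."""
    m = E.shape[0]
    edges = {}
    for i in range(m):
        for j in range(i + 1, m):
            if E[i, j] >= E[j, i]:
                edges[(j, i)] = E[i, j]    # li depends on lj (edge lj -> li)
            else:
                edges[(i, j)] = E[j, i]    # lj depends on li (edge li -> lj)
    return edges

# Example with the P and W matrices from the earlier sketches:
# E = comprehensive_correlation(P, W, 0.6, 0.4); dag_edges = orient_pairwise_edges(E)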

As shown in Fig. 4, it is also worth noting the case where there are multiple reachable paths from one label to another. Eq. (6) is used to determine the label dependencies when multiple reachable paths exist.

Figure 4: Multiple reachable paths between labels

Eq. (6) can be written in the equivalent form of Eq. (7), where k1, k2, ... are the intermediate label nodes on the path from label li to lj. The proof of Eq. (7) is as follows.

Proof. First, with reference to Fig. 4, N(A), N(B), N(C), N(A,B), N(A,C), N(B,C) and N(A,B,C) denote numbers of sample instances; p(lC|lA), p(lB|lA) and p(lC|lB) are the local pairwise label correlation probabilities; and w(lB|lA), w(lC|lA) and w(lC|lB) are the label semantic correlation probabilities. All of these values are known from the analysis in Eq. (1) to Eq. (5).

Second, according to graph theory and probability theory, we have the following definitions:

To prove E(lC|lA) ≥ (E(lB|lA), E(lC|lB)), according to Eq. (8) and Eq. (9) we only need to prove an inequality between the corresponding sample counts.

We observe that the data set guarantees N(A,C) ≥ N(A,B,C), because the set of instances carrying labels A, B and C is contained in the set carrying labels A and C, which has the larger scope. By the same logic, N(A) ≥ N(A,B), and the values of λ1 and λ2 on both sides of the equation are the same, so they require no additional consideration. In addition, w(lABC|lAB) = w(lB|lA) × w(lC|lB) × w(lC), where w(lC) equals 1, thus w(lABC|lAB) = w(lB|lA) × w(lC|lB).

According to the above analysis, only if the sample data set satisfies this count condition does E(lC|lA) ≥ (E(lB|lA), E(lC|lB)) hold; otherwise the opposite is true. Therefore Eq. (7) is proved.

According to Eq. (7), redundant reachable paths in the label dependency structure can be eliminated, but the following two cases require special handling.

Case 1. E(li|lj) = E(li|lk1|lk2|...|lj)

According to the principle of maximum label correlation, Fig. 4 is then optimized as shown in Fig. 5.

Figure 5: Optimized GLLCBN model

Case 2. E(li|lj) ≠ E(li|lk1|lk2|...|lj)

In this case Fig. 4 does not need to be changed: because pairwise label correlations may still exist along the path, the intermediate nodes cannot be eliminated and the correlation structure of each intermediate node should be retained.

In summary, the directed graphical model of GLLCBN is constructed by analyzing label correlations and building a Bayesian network [Friedman, Linial, Nachman et al. (2000)]. The GLLCBN model optimizes the label correlation structure, facilitates extraction of the potential association information between labels, and reduces the impact of label imbalance in the sample data set.
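
The redundant-edge elimination can be sketched as follows. We assume, loosely following the proof above, that the correlation carried by a multi-hop path is the product of its edge correlations, and that in Case 1 it is the direct edge that is discarded; neither assumption is stated explicitly in the text, so this is an illustration rather than the authors' exact procedure.

def best_path_weight(edges, src, dst, skip_edge, visited=None):
    """Largest product of edge weights over paths src -> dst that avoid skip_edge."""
    if visited is None:
        visited = {src}
    best = 0.0
    for (u, v), w in edges.items():
        if u == src and (u, v) != skip_edge and v not in visited:
            cand = w if v == dst else w * best_path_weight(edges, v, dst, skip_edge, visited | {v})
            best = max(best, cand)
    return best

def prune_redundant_edges(edges):
    pruned = dict(edges)
    for (u, v), w in edges.items():
        alt = best_path_weight(edges, u, v, skip_edge=(u, v))
        if alt >= w > 0:              # Case 1: an indirect path is at least as strong
            del pruned[(u, v)]        # Case 2 (w > alt): keep the direct edge
    return pruned

# Made-up example: A -> B -> C plus a weaker direct edge A -> C.
edges = {("A", "B"): 0.8, ("B", "C"): 0.9, ("A", "C"): 0.6}
print(prune_redundant_edges(edges))   # {('A', 'B'): 0.8, ('B', 'C'): 0.9}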

    3.4 Adjustment of Inception V3 model

Convolutional neural networks (CNNs) play a very important role in image classification research [Song, Hong, Mcloughlin et al. (2017)]. There are many excellent CNN models, such as AlexNet [Krizhevsky, Sutskever and Hinton (2012)], VGGNet [Russakovsky, Deng, Su et al. (2015)], ResNet [He, Zhang, Ren et al. (2015)] and GoogleNet [Szegedy, Liu, Jia et al. (2014)]. Among them, Inception V3 [Szegedy, Vanhoucke, Ioffe et al. (2016)], created by Google, is a very portable and highly reusable model. We therefore use transfer learning to adjust the Inception V3 model for the multi-label classification problem. Three adjustments are needed. First, since Inception V3 was originally trained for single-label classification but our images carry multiple labels, the label storage of the input data must be treated as multi-dimensional rather than as a single label. Second, for applicability it is necessary to remove the top-level structure and add new customized layers; we add a fully connected layer of 1024 nodes connected to the last pooling layer. Finally, since the softmax layer of Inception V3 outputs 1000 nodes (the ImageNet data set [Deng, Dong, Socher et al. (2009)] has 1000 categories), we modify the last layer of the network so that its number of output nodes equals the number of label types in our data set, which allows label classification to be achieved through our model.
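
A minimal tf.keras sketch of these adjustments is given below: the ImageNet-pretrained Inception V3 is loaded without its top, a 1024-node fully connected layer is attached after the last pooling layer, and the output layer is replaced by one node per label. The sigmoid output, the binary cross-entropy loss and the frozen base are our assumptions for the multi-label setting; the paper does not specify them.

import tensorflow as tf

NUM_LABELS = 5                                    # desert, ocean, sunset, mountains, trees

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False                            # transfer learning: reuse pretrained features

x = tf.keras.layers.GlobalAveragePooling2D()(base.output)   # last pooling layer
x = tf.keras.layers.Dense(1024, activation="relu")(x)       # added fully connected layer
outputs = tf.keras.layers.Dense(NUM_LABELS, activation="sigmoid")(x)

model = tf.keras.Model(inputs=base.input, outputs=outputs)
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(multi_label=True)])
model.summary()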

    4 Experiments

In this section, to evaluate the performance of GLLCBN, we describe the multi-label data set used in the experiments, the evaluation metrics for multi-label classification and the algorithms compared against the GLLCBN model. Finally, the experimental results and their analysis are presented.

    4.1 Data sets

To verify the performance of GLLCBN, we chose the open-source data set collected by Nanjing University, available at http://lamda.nju.edu.cn/files/miml-image-data.rar. The data set contains 2,000 landscape images and five labels (desert, ocean, sunset, mountains, trees); each instance has on average about two labels. The original archive contains a file called miml_data in Matlab .mat format, which holds three parts: bags.mat, targets.mat and class_name.mat. The first part can be ignored, as it is not needed here. The second part defines the labels of each image as a 5×2000 matrix: each column represents the labels of one image, a value of 1 indicates the presence of the label, -1 indicates its absence, and the label order matches class_name.mat. We convert this matrix into a .txt file so that it can be used for the follow-up training of the Inception V3 model. The last part lists all possible label names in the data set.
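
The label preprocessing can be sketched with scipy as below, assuming the .mat file exposes the 5×2000 ±1 matrix under a variable named "targets"; the exact file and variable names are assumptions based on the description above.

from scipy.io import loadmat

data = loadmat("miml_data.mat")                # assumed file name
targets = data["targets"]                      # assumed variable name; shape (5, 2000), values +1/-1

# Convert to one line per image: space-separated names of the labels present.
class_names = ["desert", "ocean", "sunset", "mountains", "trees"]
with open("labels.txt", "w") as f:
    for col in targets.T:                      # each column describes one image
        present = [name for name, v in zip(class_names, col) if v == 1]
        f.write(" ".join(present) + "\n")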

    4.2 Performance evaluation

Since an instance has multiple labels in multi-label classification, the predicted label set may be only a subset of the actual label set. To evaluate the performance of the GLLCBN model, we select five widely used evaluation metrics [Gibaja and Ventura (2015)].

Hamming Loss expresses the degree of inconsistency between the predicted labels and the actual labels; its expression is given in Eq. (10).

Coverage evaluates how far, on average, one needs to go down the ranked list of labels in order to cover all ground-truth labels; its expression is given in Eq. (11).

Ranking Loss evaluates the average fraction of mis-ordered label pairs and is defined in Eq. (12).

Average Precision represents the average accuracy of the predicted label sets, as in Eq. (13).

Average Predicted Time expresses the average time needed to predict each instance, measured in seconds, as in Eq. (14).

Note that for four of these metrics, Hamming Loss, Coverage, Ranking Loss and Average Predicted Time, a smaller value means better performance, whereas for Average Precision a larger value means better performance.
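
For reference, scikit-learn provides implementations of four of these five metrics; their normalization may differ slightly from the paper's Eqs. (10)-(13) (for example, sklearn's coverage_error counts labels and is not reduced by one), so the sketch below is indicative rather than an exact reproduction.

import time
import numpy as np
from sklearn.metrics import (hamming_loss, coverage_error,
                             label_ranking_loss,
                             label_ranking_average_precision_score)

y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])      # made-up ground truth
y_score = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.3], [0.6, 0.7, 0.4]])
y_pred = (y_score >= 0.5).astype(int)

print("Hamming Loss     :", hamming_loss(y_true, y_pred))
print("Coverage         :", coverage_error(y_true, y_score))
print("Ranking Loss     :", label_ranking_loss(y_true, y_score))
print("Average Precision:", label_ranking_average_precision_score(y_true, y_score))

start = time.time()
_ = (y_score >= 0.5)                                       # stand-in for model prediction
print("Avg Predicted Time (s):", (time.time() - start) / len(y_true))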

    4.3 Comparative algorithms

To validate the effectiveness of the GLLCBN model, we compare it with the following state-of-the-art multi-label learning algorithms:

1. Binary Relevance (BR) [Boutell, Luo, Shen et al. (2004)] is a first-order method. Its main idea is to train a binary linear SVM classifier independently for each label.

2. Calibrated Label Ranking (CLR) [Brinker (2008)] is a second-order method. Its main idea is to establish a label ranking by analyzing pairwise labels.

3. Multi-label Learning Using Local Correlation (ML-LOC) [Huang and Zhou (2012)] is a high-order method. Its main idea is to analyze local label correlation by encoding instance features.

4. Random k-Labelsets (RAKEL) [Tsoumakas, Katakis and Vlahavas (2011)] is a high-order method. Its main idea is to transform the multi-label classification problem into several multi-class learning problems by exploiting high-order global label correlation.

    All compared algorithms are summarized in Tab.2.

Table 2: Compared methods

    4.4 Experimental results

In our experiments, we randomly use 30%, 50% and 70% of the data as the training set and the remaining data as the test set. The experimental results are shown in Tab. 3 to Tab. 5. All of the compared methods were run in a Python or Matlab environment.
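
The random splits can be reproduced, for example, with scikit-learn's train_test_split; X and Y below are placeholders for the image features and the label matrix.

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(2000, 128)               # placeholder feature matrix
Y = np.random.choice([-1, 1], size=(2000, 5))   # placeholder +1/-1 label matrix

for ratio in (0.3, 0.5, 0.7):
    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, train_size=ratio, random_state=0)
    print(ratio, X_tr.shape, X_te.shape)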

Table 3: Performance evaluation of the different algorithms when 30% of the data set is randomly selected for training: mean ± std (rank)

Table 4: Performance evaluation of the different algorithms when 50% of the data set is randomly selected for training: mean ± std (rank)

Table 5: Performance evaluation of the different algorithms when 70% of the data set is randomly selected for training: mean ± std (rank)

    4.5 Experimental analysis

From the experimental results in Section 4.4, we draw the following observations:

1. When the training set is 30% of the data, the BR algorithm has some advantages in Hamming Loss, Coverage, Ranking Loss, Average Precision and Average Prediction Time, and the algorithm using global label correlation is superior to the algorithms using local label correlation.

2. When the training set is 50% of the data, BR still achieves better Hamming Loss than the label-correlation algorithms. In terms of Coverage, Ranking Loss, Average Precision and Average Prediction Time, the CLR, ML-LOC and GLLCBN algorithms, which consider local label correlation, have an advantage over the RAKEL algorithm, which considers only global label correlation.

3. When the training set is 70% of the data, the advantages of BR are gradually overtaken by the other algorithms. The CLR algorithm performs best on Hamming Loss. Among the label-correlation algorithms, GLLCBN achieves better Coverage, Ranking Loss and Average Precision than ML-LOC and RAKEL, but its Average Prediction Time is longer.

4. Taken together, these three observations show that as the training set grows, the advantage of the label-correlation algorithms gradually emerges, indicating that label correlation has a real influence on the multi-label classification problem.

    5 Conclusion and future work

How to mine potential label correlation information remains a worthwhile direction for future multi-label classification research. In this paper we propose a novel and effective approach named GLLCBN for multi-label learning. In the GLLCBN model, nodes represent the label space and edges represent the comprehensive global and local label correlation. We first obtain an initial model by analyzing the labels, the global semantic relevance and the local label correlation of the data set (building the node association graph), and then use probability theory, Bayesian networks and graph theory to optimize the label dependency graph (eliminating redundant edges), thereby constructing the label-dependent network that we call the GLLCBN model. Finally, multi-label classification is solved by combining the initial predictions of the Inception V3 model with the GLLCBN model. Experimental results show that the proposed approach is effective under the chosen performance evaluation.

In the future, we will consider optimizing the performance of the proposed method on data sets with large-scale label spaces and applying the approach to more diverse multi-label data sets.

Acknowledgement: The authors gratefully acknowledge support from the National Key R&D Program of China (No. 2018YFC0831800) and the Innovation Base Project for Graduates (Research of Security Embedded System).
