
    A nearest neighbor search algorithm of high-dimensional data based on sequential NPsim matrix①

    2016-12-05
    High Technology Letters, 2016, No. 3
    關(guān)鍵詞:文法


    Li Wenfa (李文法)*, Wang Gongming**, Ma Nan*, Liu Hongzhe*

    * To whom correspondence should be addressed. E-mail: liwenfa@buu.edu.cn. Received on Dec. 16, 2015

    (*Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing 100101, P.R. China) (**National Laboratory of Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, P.R. China)

    Problems exist in similarity measurement and index tree construction, which affect the performance of nearest neighbor search of high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Combining these innovations, a nearest neighbor search algorithm of high-dimensional data based on the sequential NPsim matrix is proposed and compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms, and its searching speed is more than a thousand times theirs. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.

    nearest neighbor search, high-dimensional data, similarity, indexing tree, NPsim, KD-tree, SR-tree, Munsell

    0 Introduction

    Nearest neighbor search looks for the several points that are nearest to a given point[1], and is widely used in text clustering, recommendation systems, multimedia retrieval, sequence analysis, etc. Generally speaking, data whose dimensionality is more than 20 is regarded as high-dimensional data[2]. Traditional nearest neighbor search algorithms may fail on high-dimensional data because of the curse of dimensionality[3]. Thus, nearest neighbor search of high-dimensional data has become a challenging but useful issue in data mining, and it has already been researched to a certain extent. With the locality-sensitive hashing algorithm[4], a high-dimensional vector is mapped onto an address space so that previously similar points remain close to each other with high probability, which overcomes the equidistance of high-dimensional data. But its applicability is limited, because the same hash functions are applied to all data and the differences among data are neglected. To solve this problem, self-taught hashing (STH)[5] was proposed by Zhang et al.: a similarity matrix is built first, and matrix decomposition and eigenvalue solution are carried out subsequently, but this has large time and space complexity. The iDistance[6] and the vector approximation file (VA-File)[7] provide suitable indexing structures; however, their query cost is very high.

    In essence, similarity measurement and index tree construction determine the performance of nearest neighbor search of high-dimensional data, so solving the problems in both aspects is very important. At present, most similarity measurement methods for high-dimensional data ignore the relative difference of properties, noise distribution, weight and other factors, and are valid only for a small number of data types[8]. The Psim(X,Y) function considers the above factors[8] and is applicable to all kinds of data types, but it cannot compare similarities under different dimensionalities because its range depends on the spatial dimensionality. Thus, the NPsim(X,Y) function is proposed to solve this problem by restricting the range to [0,1]. The defects of index trees in construction and query are made up for by the sequential NPsim matrix, and this method is easy to parallelize. Assuming the data set contains M points in n-dimensional space, the time complexity of building the sequential NPsim matrix is O(M²·n), which parallelization reduces to O(M·n); the time complexity of nearest neighbor search itself is O(1). The algorithm is compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. The experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms. The construction of the sequential NPsim matrix is time-consuming, but its searching speed is more than a thousand times that of the others, and the construction time can be reduced dramatically by parallelization. Thus, its overall performance is better than that of the others.

    1 Related work

    In recent years, similarity measurement and index tree construction have been researched to a certain extent, but insufficiencies still exist.

    To solve the problem in similarity measurement, the Hsim(X,Y) function[9] was proposed by Yang; it is better than traditional methods but neglects relative difference and noise distribution. The Gsim(X,Y) function[10] was proposed according to the relative difference of properties in different dimensions, but weight discrepancy is ignored. The Close(X,Y) function[11], based on the monotone decrease of e^(-x), can overcome the influence of components in dimensions whose variances are larger, but relative difference is not considered, so it is affected by noise. The Esim(X,Y) function[12] was proposed by improving the Hsim(X,Y) and Close(X,Y) functions. In each dimension, the Esim(X,Y) component is positively correlated with the value in that dimension, and all dimensions are divided into two parts: normal dimensions and noisy dimensions. In a noisy dimension, noise occupies the majority; when the noise is similar to or larger than the values in normal dimensions, this method becomes invalid. The secondary measurement method[13] calculates similarity by virtue of property distribution, space distance, etc., but neglects noise distribution and weight, and is also time-consuming. Projection nearest neighbor was proposed by Hinneburg[14] to attack the problem in higher-dimensional space by dimension reduction, but it is hard to find the right quality criterion function. In high-dimensional space, Yi found[8] that the difference in noisy dimensions is large no matter how similar the data are; this difference occupies a large share of the similarity calculation, which makes the distances between all points similar. Therefore, the Psim(X,Y) function[8] was proposed to eliminate the influence of noise by analyzing the differences among all dimensions. Experimental results indicate that this method is suitable for all kinds of data types, but its range is [0,n], where n is the dimensionality, so similarities under different dimensionalities cannot be compared.

    There are two kinds of index trees used in high-dimensional space: index trees based on vector space and index trees based on metric space. The typical example of the former is the R-tree[15]. It is a natural extension of the B-tree to high-dimensional space and can solve the data searching problem. However, the R-tree suffers from sibling node overlap, multiple queries, low utilization, etc. Therefore, extensions of the R-tree have been proposed, such as the R+-tree, R*-tree, and cR-tree. A common structure of the latter kind is the VP-tree[16], which is a binary search tree suitable for large-scale data search. But it is a static tree into which nodes cannot be inserted or deleted, and its distance calculation is time-consuming. The MVP-tree[17] improves on the VP-tree and decreases the cost of distance calculation, but its time complexity in the creation and query stages is higher than that of the VP-tree. The M-tree[18] is a hash index represented by a B-tree and has high searching efficiency; however, it can only carry out single-value searching, not range searching. The SA-tree[19] is created according to the distances between leaf nodes and the root node, but it is a completely static tree into which nodes cannot be inserted or deleted.

    2 Key technology

    2.1 Similarity measurement

    In n-dimensional space, the set S={S1, S2, …, SM} is composed of M points Si={Si1, Si2, …, Sij, …, Sin}, where i=1,2,…,M, j=1,2,…,n, and Sij is the jth property of Si. Assuming that X and Y are any two points in set S, Sim(X,Y) denotes the similarity between X and Y.

    Sim(X,Y) is usually measured with a distance function; typical methods include Manhattan distance, Euclidean distance, etc. However, with the increase of dimensionality, the nearest and farthest distances become nearly the same[2], so these methods are invalid in high-dimensional space. To solve this problem, several methods have been proposed, such as Hsim(X,Y), Gsim(X,Y), Close(X,Y), and Esim(X,Y), yet they are valid only for limited types of data[2]. Psim(X,Y) is suitable for a variety of data types, but its range depends on the spatial dimensionality, so similarities under different dimensionalities cannot be compared. While maintaining its effect, Psim(X,Y) is normalized as

    NPsim(X,Y) = (1/E(X,Y)) · Σ(i=1..n) δ(Xi,Yi)·(1 − |Xi − Yi|/(mi − ni))    (1)

    where Xi and Yi are the components in the ith dimension, and δ(Xi,Yi) is a discriminant function: if Xi and Yi are in the same interval [ni, mi], then δ(Xi,Yi)=1; otherwise δ(Xi,Yi)=0. E(X,Y) is the number of intervals in which the components of X and Y are the same. It can be seen that the range of NPsim(X,Y) is [0,1]. The above is an outline of NPsim; a detailed introduction can be found in Ref.[8].
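    Since Eq.(1) had to be reconstructed from the surrounding definitions, the following Python sketch shows one reading of NPsim: a Psim-style sum normalized by E(X,Y). The interval boundaries `edges` and the boundary handling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def npsim(x, y, edges):
    """Sketch of NPsim(X, Y): sum the per-dimension Psim terms for dimensions
    whose components share an interval, then normalize by E(X, Y).
    edges[i] holds the ascending interval boundaries for dimension i."""
    total, e = 0.0, 0
    for i, (xi, yi) in enumerate(zip(x, y)):
        g = len(edges[i]) - 1                      # number of intervals G
        bx = min(max(np.searchsorted(edges[i], xi, side="right") - 1, 0), g - 1)
        by = min(max(np.searchsorted(edges[i], yi, side="right") - 1, 0), g - 1)
        if bx != by:                               # delta(X_i, Y_i) = 0
            continue
        n_i, m_i = edges[i][bx], edges[i][bx + 1]  # shared interval [n_i, m_i]
        e += 1                                     # contributes to E(X, Y)
        if m_i > n_i:
            total += 1.0 - abs(xi - yi) / (m_i - n_i)
    return total / e if e else 0.0
```

    Note that npsim(x, x, edges) evaluates to 1 and mismatched dimensions contribute nothing, which matches the stated range [0,1].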

    To validate this method, records with dimensionalities of 10, 30, 50, 100, 150, 200, 250, 300, 350, and 400 are generated with the normrnd() function of Matlab; the number of records in every dimensionality is 1000. After that, the relative difference between the farthest and nearest neighbors is calculated with

    Relative difference = (Dmaxn − Dminn) / Davgn    (2)

    where Dmaxn, Dminn, and Davgn are the maximal, minimal, and average similarities in n-dimensional space, respectively[20].
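    The distance-concentration effect that Eq.(2) measures can be reproduced with a small NumPy experiment; standard-normal data stands in for Matlab's normrnd(), and Euclidean distance is used as the measure, so the numbers are illustrative rather than the paper's exact figures.

```python
import numpy as np

def relative_difference(dim, n_points=1000, seed=0):
    """Relative difference of Eq.(2), computed with Euclidean distances
    from one query point to n_points standard-normal points."""
    rng = np.random.default_rng(seed)
    data = rng.standard_normal((n_points, dim))
    query = rng.standard_normal(dim)
    d = np.linalg.norm(data - query, axis=1)   # distances to the query
    return (d.max() - d.min()) / d.mean()      # (Dmax - Dmin) / Davg

# The spread of distances shrinks as dimensionality grows:
for dim in (10, 100, 400):
    print(dim, round(relative_difference(dim), 3))
```

    Running this shows the relative difference falling steadily with the dimensionality, which is exactly why plain distance functions lose discriminating power in high-dimensional space.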

    According to the characteristics of the results, the similarity measurement methods are divided into two kinds. The first kind includes Manhattan distance, Euclidean distance, Hsim(X,Y), Gsim(X,Y), Close(X,Y), and Esim(X,Y); the second includes Psim(X,Y) and NPsim(X,Y). The result is shown in Fig.1. It can be seen that the relative difference of the second kind of methods is two to three orders of magnitude larger than that of the first. Therefore, the performance advantage of the second kind of methods is obvious.

    Fig.1 Relative difference of various similarity measurement methods

    The numbers of Psim(X,Y)≥1 in different dimensionalities are shown in Table 1. The number of Psim(X,Y) values computed in every dimensionality is 1000×1000=1000000. Thus 6%~15% of the results are greater than or equal to 1, which makes it impossible to compare similarities across different dimensionalities. This problem does not exist in the NPsim(X,Y) function.

    Table 1 Number of Psim(X,Y) ≥ 1 in different dimensionalities

    2.2 K nearest neighbor search

    For any point St in set S, 1≤t≤M, search for the set Rt ⊆ S composed of K points that meets the following requirement:

    ∀r∈Rt: NPsim(St, r) ≥ max{NPsim(St, s) | s∈S ∧ s∉Rt}

    Rt is the K nearest neighbor (KNN) set of St, and the process of generating Rt is called KNN search.

    3 Nearest neighbor search algorithm

    3.1 Whole framework

    The whole framework is shown in Fig.2. First of all, the NPsim matrix is generated. After that, the sequential NPsim matrix is produced by sorting the elements in each row of the NPsim matrix in descending order.

    Fig.2 Whole framework of the proposed algorithm

    3.2 Execution flow

    1) Construction of NPsim matrix

    The NPsim matrix is generated with the following steps.

    Step 1 The M points Si, i=1,2,…,M, are stored in the M×n matrix DataSet.

    Step 2 The elements in every column of DataSet are sorted in ascending order to generate the matrix SortDataSet.

    Step 3 The elements in each column of SortDataSet are divided into G=⌈θ·n⌉ intervals, so the number of elements in every interval is T=⌈M/G⌉. Meanwhile, the endpoints of every interval are saved into the matrix FirCutBound.

    Step 4 The interval number of each element of DataSet is determined according to the matrix FirCutBound and saved into the interval number matrix FirNumSet.

    Step 5 The M×M matrix SameDimNum is generated. For any two points Sp and Sq, the number of intervals in which the components of Sp and Sq are the same is calculated and saved into the matrix element SameDimNum[p][q].

    Step 6 The matrix SortDataSet is divided along the columns again. The number of intervals is G′=G−1 and there are T′=⌈M/G′⌉ elements in each interval. After that, the endpoints of every interval are saved into the matrix SecCutBound.

    Step 7 The interval number matrix SecNumSet is produced in the same way as in Step 4.

    Step 8 The matrix SameDimNum is updated. For any points Sp and Sq, if the interval numbers of their components in one dimension are different in Step 3 but the same in Step 7, then SameDimNum[p][q]=SameDimNum[p][q]+1.

    Step 9 The M×M matrix NPsimMat is built according to the results from Step 3 to Step 8. The NPsim information of Sp and Sq (1≤p, q≤M) is stored into NPsimMat[p][q], which includes three parts: the subscripts p and q, and NPsim(Sp, Sq).
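    The interval-counting core of Steps 3-8 can be sketched as follows. The value of θ, the equal-frequency (rank-based) cut, and the label arithmetic are assumptions made for illustration; the authors' code may differ in details.

```python
import numpy as np

def same_dim_counts(data, theta=0.5):
    """Sketch of SameDimNum: for every pair of points, count the dimensions
    in which they fall into the same interval, using the first partition
    (G intervals) and the staggered second partition (G-1 intervals)."""
    m, n = data.shape
    g = max(int(np.ceil(theta * n)), 3)            # G = ceil(theta * n)
    ranks = data.argsort(axis=0).argsort(axis=0)   # per-column ranks 0..M-1
    count = np.zeros((m, m), dtype=int)
    for j in range(n):
        lab1 = ranks[:, j] // int(np.ceil(m / g))        # Step 3/4 labels
        lab2 = ranks[:, j] // int(np.ceil(m / (g - 1)))  # Step 6/7 labels
        # Step 5 plus the Step 8 update: same interval in either partition
        same = (lab1[:, None] == lab1[None, :]) | (lab2[:, None] == lab2[None, :])
        count += same
    return count
```

    The second, offset partition compensates for points that lie just on either side of a cut in the first partition, which is the purpose of Steps 6-8.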

    2) Sorting for NPsim matrix

    The sequential NPsim matrix is produced with the following steps.

    Step 1 The M×M matrix SortNPsimMat is produced as a copy of NPsimMat.

    Step 2 The elements in each row of SortNPsimMat are sorted in descending order of NPsim.

    With the increase of the column number, the elements in the pth row become smaller and smaller, which means the distance between Sp and the corresponding point becomes farther and farther.

    3) Nearest neighbor search

    The KNN of Siis found out as follows.

    Step 1 The first K elements in the ith row are selected.

    Step 2 The points among them different from Si are the expected result.

    3.3 Time complexity analysis

    This algorithm is separated into two stages: construction of the sequential NPsim matrix, and KNN search. There are two steps in the first stage; their time complexity is analyzed as follows.

    (1) Construction of NPsim matrix

    In this step, four kinds of matrices are produced. The first is SortDataSet, produced by sorting the elements in every column; its time complexity is O(M·logM·n). The second consists of FirCutBound and SecCutBound, generated by visiting all elements of DataSet, so the time complexity is O(M·n). The third consists of FirNumSet and SecNumSet, obtained by locating the interval number of each element; the corresponding time complexity is O(M²·n). The fourth is SameDimNum, produced by comparing the elements per column; the time complexity of this operation is O(M²·n). Finally, the NPsim components are calculated and summed up to the whole NPsim value.

    (2) Sorting for NPsim matrix

    The elements in every row of NPsimMat are sorted in descending order of NPsim. Since each of the M rows contains M elements, the time complexity is O(M²·logM).

    To sum up the above analysis, the time complexity of the construction stage is O(M²·n) + O(M²·logM) = O(M²·n).

    In the course of searching, only the first K elements in the ith row are visited, so the corresponding time complexity is O(1).

    4 Experiment

    The proposed algorithm includes two stages (construction and searching), which must also be present in any algorithm selected for comparison. For the nearest neighbor search algorithm based on KD-tree, the KD-tree is built first and searching is carried out subsequently; the algorithm based on SR-tree is similar. Thus, these two algorithms are selected for the following experiment.

    4.1 Data introduction

    The Munsell Color-Glossy set was proposed by the American colorist Munsell and has been revised repeatedly by the American National Standards Institute (ANSI) and the Optical Society. It is one of the standard color sets and includes 1600 colors, each represented as HV/C, where H, V, and C stand for hue, value (brightness), and chroma (saturation), respectively.

    The spectral reflectances of all colors in the Munsell Color-Glossy set are downloaded from the Spectral Color Research group (http://www.uef.fi/fi/spectral/home). Each is a vector containing 401 spectral reflectance values at different wavelengths, which is regarded as high-dimensional data.

    4.2 Overview

    First of all, the running times of constructing the sequential NPsim matrix, the KD-tree, and the SR-tree are measured. After that, KNN search for a given point in the Munsell color cube is carried out. The locations of the given point and its neighbors should be close or contiguous.

    Assuming that HV/C and HKVK/CK are the given point and a corresponding neighbor respectively, the Munsell distance between them is

    Distance = |HK − H| + |VK − V| + |CK − C|    (3)
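    Eq.(3) is a city-block distance over the three Munsell coordinates. A minimal sketch, assuming hue has already been mapped onto a numeric scale (the paper does not spell that mapping out here):

```python
def munsell_distance(p, q):
    """Eq.(3): sum of absolute coordinate differences between two Munsell
    colors given as (H, V, C) triples with numeric hue values (assumed)."""
    return sum(abs(a - b) for a, b in zip(p, q))

# e.g. 5BG3/2 against a neighbor 7.5BG4/6: |7.5-5| + |4-3| + |6-2|
```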

    On one hand, the neighbor colors returned by the three algorithms are compared. On the other hand, with the increase of K, the construction and searching times of the different algorithms are measured and analyzed.

    4.3 Result

    The proposed algorithm and the traditional algorithms based on KD-tree and SR-tree are implemented in the experiment, and the results are compared in terms of accuracy and speed. No parallel strategy is used in the following experiment. The hardware includes an AMD Athlon(tm) II X2-250 processor and 4GB Kingston memory; the software environment is the Windows XP operating system and Microsoft Visual Studio 2008.

    1) Accuracy analysis

    The Munsell color 5BG3/2 shown in Fig.3 is selected for KNN search. The searching results for K=6 are shown in Tables 2, 3, and 4. In Table 2, the KNN distance is the NPsim value, while in Tables 3 and 4 it is the Euclidean distance. It can be seen that the colors found by the proposed method are closer to the given color, whereas some found by the other methods differ obviously from 5BG3/2, such as 5B3/4 in Table 3 and 7.5BG4/6 in Table 4. In addition, the Munsell distances of the proposed method are less than those of the others. In some cases, the Munsell distances of predecessors, nearest neighbors, and successors are not in ascending order, which is called the reverse phenomenon; 10BG3/2 in Table 2, 5B3/4 in Table 3, and 7.5BG4/6 in Table 4 are typical examples. The numbers of nearest neighbors with the reverse phenomenon in Tables 2, 3, and 4 are 2, 4, and 4 respectively, which indicates that the stability of the proposed method is better than that of the others.

    Fig.3 Color of 5BG3/2

    Table 2 KNN result of the proposed algorithm

    Table 3 KNN result of KD-tree algorithm

    Table 4 KNN result of SR-tree algorithm

    2) Speed analysis

    The construction times of the indexing structures are shown in Table 5. It can be seen that the construction time of the sequential NPsim matrix is about ten times that of the KD-tree or SR-tree.

    Table 5 Construction time of different indexing structures

    With the increase of the K value, the average searching times for the KNN of 1000 selected Munsell colors are shown in Fig.4. The experimental results indicate that the searching time of the proposed method is on the order of 10^-6 s, while that of the other methods is on the order of 10^-2 s. That is to say, the searching speed of the proposed method is more than a thousand times that of the others.

    Although the construction of the sequential NPsim matrix is slow, its searching speed is fast, and the matrix can be stored on disk and loaded quickly at any time. So the overall performance of the sequential NPsim matrix is better than that of the KD-tree and SR-tree for nearest neighbor search of high-dimensional data.

    Fig.4 Average searching time of KNN with different algorithms

    5 Conclusion

    Nearest neighbor search of high-dimensional data is the foundation of high-dimensional data processing. The problems existing in similarity measurement and index tree construction affect the performance of traditional nearest neighbor search algorithms. The NPsim function and the sequential NPsim matrix are designed to solve these problems, and they are combined into a new nearest neighbor search algorithm. To validate the proposed algorithm, it is compared with the traditional algorithms based on KD-tree and SR-tree on the Munsell Color-Glossy set. The results show that the accuracy and searching speed of the proposed method are better than those of the two other methods.

    However, the construction of the sequential NPsim matrix is slower than that of the KD-tree and SR-tree. The reason is that the time complexity of constructing the sequential NPsim matrix is O(M²·n), while those of constructing the KD-tree and SR-tree are both O(M·log₂M·n). As described in Section 3.2, the operations generating different rows of the sequential NPsim matrix are independent of each other and thus easy to parallelize, whereas parallelizing the construction of an index tree is hard. The time complexity can therefore be reduced from O(M²·n) to O(M·n) by parallelization. Thus, parallelizing the construction of the sequential NPsim matrix is future work.

    [1] Jegou H, France L R R, Douze M, et al. Product quantization for nearest neighbor search. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 33(1): 117-128

    [2] Ericson K, Pallickara S. On the performance of high dimensional data clustering and classification algorithms. Future Generation Computer Systems, 2013, 29(4): 1024-1034

    [3] Bellman R. Dynamic Programming. Princeton, New Jersey: Dover Publications Inc, 2010. 152-153

    [4] Andoni A, Indyk P. Near-optimal hashing algorithms for approximate nearest neighbor in high dimensions. Communications of the ACM, 2008, 51(1): 117-122

    [5] Zhang D, Wang J, Cai D, et al. Self-taught hashing for fast similarity search. In: Proceedings of the ACM SIGIR 2010. New York: ACM Press, 2010. 18-25

    [6] Jagadish H V, Ooi B C, Tan K L, et al. iDistance: an adaptive B+-tree based indexing method for nearest neighbor search. ACM Transactions on Database Systems, 2005, 30(2): 364-397

    [7] Heisterkamp D R, Peng J. Kernel vector approximation files for relevance feedback retrieval in large image databases. Multimedia Tools and Applications, 2005, 26(2): 175-189

    [8] Yi L H. Research on Clustering Algorithm for High Dimensional Data: [Ph.D dissertation]. Qinhuangdao: Institute of Information Science and Engineering, Yanshan University, 2011. 28-30

    [9] Yang F Z, Zhu Y Y. An efficient method for similarity search on quantitative transaction data. Journal of Computer Research and Development, 2004, 41(2): 361-368

    [10] Huang S D, Chen Q M. On clustering algorithm of high dimensional data based on similarity measurement. Computer Applications and Software, 2009, 26(9): 102-105

    [11] Shao C S, Lou W, Yan L M. Optimization of algorithm of similarity measurement in high-dimensional data. Computer Technology and Development, 2011, 21(2): 1-4

    [12] Wang X Y, Zhang H Y, Shen L Z, et al. Research on high dimensional clustering algorithm based on similarity measurement. Computer Technology and Development, 2013, 23(5): 30-33

    [13] Jia X Y. A high dimensional data clustering algorithm based on twice similarity. Journal of Computer Applications, 2005, 25(B12): 176-177

    [14] Hinneburg A, Aggarwal C C, Keim D A. What is the nearest neighbor in high dimensional spaces. In: Proceedings of the VLDB 2000. Birmingham: IEEE Computer Society, 2000. 506-515

    [15] Berkhin P. A survey of clustering data mining techniques. In: Grouping Multidimensional Data: Recent Advances in Clustering. Berlin: Springer-Verlag, 2006. 25-71

    [16] Nielsen F, Piro P, Barlaud M. Bregman vantage point trees for efficient nearest neighbor queries. In: Proceedings of the IEEE International Conference on Multimedia and Expo 2009. Birmingham: IEEE Computer Society, 2009. 878-881

    [17] Hetland M L. The basic principles of metric indexing. Studies in Computational Intelligence, 2009, 242: 199-232

    [18] Kunze M, Weske M. Metric trees for efficient similarity search in large process model repositories. Lecture Notes in Business Information Processing, 2011, 66: 535-546

    [19] Navarro G. Searching in metric spaces by spatial approximation. The VLDB Journal, 2002, 11(1): 28-46

    [20] Aggarwal C C, Yu P S. The IGrid index: reversing the dimensionality curse for similarity indexing in high dimensional space. In: Proceedings of ACM SIGKDD 2000. New York: ACM Press, 2000. 119-129

    Li Wenfa, born in 1974. He received his Ph.D. degree from the Graduate University of Chinese Academy of Sciences in 2009, and his B.S. and M.S. degrees from PLA Information Engineering University in 1998 and 2003, respectively. His research interests include information security, data analysis and mining, etc.

    10.3772/j.issn.1006-6748.2016.03.002

    ① Supported by the National Natural Science Foundation of China (No. 61300078), the Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (No. CIT&TCD201504039), the Funding Project for Academic Human Resources Development in Beijing Union University (No. BPHR2014A03, Rk100201510), and the "New Start" Academic Research Projects of Beijing Union University (No. Hzk10201501).
