
    Fast Chinese syntactic parsing method based on conditional random fields


    HAN Lei(韓磊), LUO Sen-lin(羅森林), CHEN Qian-rou(陳倩柔), PAN Li-min(潘麗敏)

    (School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China)



    A fast method for phrase structure grammar analysis is proposed based on conditional random fields (CRF). The method trains several CRF classifiers to recognize the phrase nodes at different levels, and connects the recognized phrase nodes bottom-up to construct the syntactic tree. On the basis of the Beijing forest studio Chinese tagged corpus, two experiments are designed to select the training parameters and to verify the validity of the method. The results show that the method takes 78.98 ms to train and 4.63 ms to test on a Chinese sentence of 17.9 words on average. The method offers a new way to parse phrase structure grammar for Chinese, with good generalization ability and fast speed.

    phrase structure grammar; syntactic tree; syntactic parsing; conditional random field

    Chinese syntactic parsing is one of the significant components of natural language processing. With the large-scale development of the Chinese Treebank[1], the corpus-based techniques that have been successful for English have been applied extensively to Chinese. The two main approaches are traditional statistical parsing and deterministic parsing.

    Traditional statistical approaches build models to obtain the unique tree with the highest PCFG probability (the Viterbi tree)[2-9]. These algorithms must calculate the PCFG probability of every possible tree, so the Viterbi tree can be obtained only at the expense of testing time. To reduce testing time, deterministic parsing has emerged as an attractive alternative to probabilistic parsing, offering accuracy just below the state of the art in syntactic analysis of English while running in linear time. Yamada and Matsumoto[10] proposed a method for analyzing word-word dependencies in a deterministic bottom-up manner using support vector machines (SVM), and achieved over 90% accuracy of word-word dependency. Sagae and Lavie[11] used a basic bottom-up shift-reduce algorithm, but employed a classifier instead of a grammar to determine parser actions, and achieved R/P of 87.54%/87.61%. Cheng et al.[12] presented a deterministic dependency structure analyzer for Chinese that implements two algorithms (the Yamada and Nivre algorithms) and two sorts of classifiers (SVM and maximum entropy models). Encouraging results have also been shown in applying deterministic models to Chinese parsing. Wang et al.[13] presented a classifier-based deterministic parser for Chinese constituency parsing. Their parser computes parse trees bottom-up in one pass, uses classifiers to make shift-reduce decisions, and achieves R/P of 88.3%/88.1%.

    In this paper, a faster deterministic parsing method based on conditional random fields (CRF) is proposed to reduce the runtime of parsing. The method applies CRF++[14] as the classifier training and testing software to label the phrase nodes at each syntactic level for each segment. The labeled nodes are then combined to construct the phrase structure syntactic tree.

    1 Method

    Based on the part-of-speech (POS) tagging results, the method labels the phrase and sentence nodes (both referred to as nodes) of each segment to construct a syntactic tree. The principle of the method is shown in Fig. 1.

    Fig. 1 Principle of the method

    1.1 Data processing

    Data processing contains two parts, i.e. data training and data testing. The aim of data training is to change the data format from the POS and syntax tagging results to one word per line. The inputs of data training are the POS and syntax tagging results, and the outputs are nodes in the one-word-per-line format. Take the sentence “姚明說范甘迪是核心。” (Yao Ming said that Van Gundy is the core.) as an example. The data training inputs are the POS and syntax tagging results, which are “0^姚明(Yao Ming)/nr 1^說(said)/v 2^范甘迪(Van Gundy)/nr 3^是(is)/v 4^核心(the core)/n 5^。/w” and “[dj [np 0^姚明/nr] [vp 1^說/v [dj [np 2^范甘迪/nr] [vp 3^是/v 4^核心/n]]] 5^。/w]”. The data training outputs are shown in Tab. 1, and the syntactic tree is shown in Fig. 2.

    Tab. 1 Format of one word per line for “姚明說范甘迪是核心”

    Fig. 2 Syntactic tree of “姚明說范甘迪是核心”

    To explain the process of data training, the concept of syntactic level (hereinafter referred to as level) is introduced. The node closest to the segment is defined to be in the 1st level (1L) of the segment, and the next closest node is in the 2nd level (2L). The farther a node is from the segment, the higher its level. The root node of the syntactic tree is in the last level of the segment. It follows that different segments may have different numbers of levels. Take the segment “核心” (the core) in Fig. 2 as an example: the phrase node vp is in the 1st level, the dj that is not the root node is in the 2nd level, the vp farther from the segment is in the 3rd level (3L), and the dj of the root node is in the 4th level (4L). In addition, the maximum level number among the segments in a sentence equals the level number of the sentence.
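    The level concept above can be sketched in code: for each segment (leaf), collect its ancestor nodes with the closest one first. This is a minimal illustration; the nested-tuple tree representation is an assumption, built here from the example tree of Fig. 2.

```python
# Sketch of the "syntactic level" concept: a segment's 1st-level node is its
# closest ancestor, and its last level is the root. Tree for
# "姚明說范甘迪是核心。": nodes are (label, children), leaves are strings.
tree = ("dj", [
    ("np", ["姚明"]),
    ("vp", [
        "說",
        ("dj", [
            ("np", ["范甘迪"]),
            ("vp", ["是", "核心"]),
        ]),
    ]),
    "。",
])

def levels_per_segment(node, ancestors=()):
    """Return {segment: [1st-level node, 2nd-level node, ...]}."""
    if isinstance(node, str):                      # a leaf = one segment
        return {node: list(reversed(ancestors))}   # closest ancestor first
    label, children = node
    out = {}
    for child in children:
        out.update(levels_per_segment(child, ancestors + (label,)))
    return out

print(levels_per_segment(tree)["核心"])  # ['vp', 'dj', 'vp', 'dj']
```

    For “核心” this reproduces the levels stated above: vp (1L), dj (2L), vp (3L), dj (4L).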

    In Tab. 1, the first column shows the word segmentation result of the example sentence, the second column shows the POS tag of each segment, the third column contains the segment order (SO), i.e. the order of the segment in the sentence, and the remaining columns contain all the nodes in the syntactic tree, including phrase nodes and sentence nodes.

    Data testing changes the POS tagging results into the format required for the subsequent recognition. Its inputs are the POS tagging results, and its outputs are the POS tags in the format of the first three columns in Tab. 1.
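    This conversion can be sketched as follows (a minimal illustration; the function name and the exact handling of the `^` and `/` separators are assumptions based on the tagged example in Sec. 1.1):

```python
# Sketch of data testing's format conversion: a POS tagging result such as
# "0^姚明/nr 1^說/v ..." becomes the first three columns of Tab. 1
# (word, POS, segment order).
def to_columns(tagged):
    rows = []
    for token in tagged.split():
        order, rest = token.split("^", 1)   # "0^姚明/nr" -> "0", "姚明/nr"
        word, pos = rest.rsplit("/", 1)     # "姚明/nr"  -> "姚明", "nr"
        rows.append([word, pos, order])
    return rows

print(to_columns("0^姚明/nr 1^說/v"))  # [['姚明', 'nr', '0'], ['說', 'v', '1']]
```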

    1.2 Model training and testing

    Model training trains the recognition models for labeling each level. Its inputs are the tagged syntactic trees and the feature templates; its outputs are the recognition models for each level. The model for labeling the nodes in the 1st level is defined as the 1st model, the model for labeling the nodes in the 2nd level as the 2nd model, and so on. In the data set used for testing, the maximum level number of sentences is 14, so at least 14 models need to be trained. Take the sentence in Tab. 1 as an example: it needs 4 models to label its levels. To train the 2nd model, the inputs are the first 5 columns in Tab. 1 and the output is the model for labeling the nodes in the 2nd level. Each model has its own feature template and training parameters.
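    The per-level training can be sketched as a sequence of CRF++ invocations (assuming CRF++'s `crf_learn` command-line tool with its `-c` and `-f` options; the template and file naming scheme here is invented for illustration):

```python
# Sketch of cascaded model training: the k-th model is trained on the first
# 3+k columns of the one-word-per-line data (word, POS, order, levels 1..k-1
# as features; level k as the label to predict).
MAX_LEVEL = 14  # maximum syntactic level in the data set

def train_commands(c=0.4, f=1):
    """Build one crf_learn command per level (commands only, not executed)."""
    cmds = []
    for k in range(1, MAX_LEVEL + 1):
        cmds.append([
            "crf_learn",
            "-c", str(c),           # CRF++ hyper-parameter c
            "-f", str(f),           # CRF++ feature frequency cutoff f
            f"template_level{k}",   # per-level feature template (hypothetical name)
            f"train_level{k}.txt",  # training columns for level k (hypothetical name)
            f"model_level{k}",      # output model for level k (hypothetical name)
        ])
    return cmds

print(train_commands()[0])
```

    Each command would be run once per level, producing the 14 models used at test time.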

    Model testing labels the nodes of each segment from the 1st level to the highest level. Its inputs are the POS tagging results, and its outputs are the syntactic tree in the one-word-per-line format. Take the sentence in Tab. 1 as an example. The inputs are the first three columns and the 1st, 2nd, 3rd and 4th models. Firstly, the method uses the first three columns and the 1st model to label the nodes (np, vp, np, vp, vp, dj) in the 1st level. Secondly, it uses the first three columns, the labeled nodes in the 1st level and the 2nd model to label the nodes in the 2nd level. Thirdly, it uses the first three columns, the labeled nodes in the 1st and 2nd levels and the 3rd model to label the nodes in the 3rd level. Finally, based on all the labeled nodes and the 4th model, the output of model testing is obtained as shown in Tab. 1.
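    The cascade can be sketched as a loop that appends one label column per level (stand-in prediction functions are used here in place of actual CRF++ models, which would be queried via `crf_test`):

```python
# Sketch of the cascaded labeling loop: the model for each level appends one
# column of node labels, and that column becomes an input feature for the
# next level's model.
def parse_columns(rows, models):
    """rows: [[word, POS, order], ...]; models: {level: predict function}.
    Extends each row with one label column per level and returns rows."""
    for level in sorted(models):
        labels = models[level](rows)      # sees all columns added so far
        for row, label in zip(rows, labels):
            row.append(label)
    return rows

rows = [["姚明", "nr", "0"], ["說", "v", "1"], ["范甘迪", "nr", "2"],
        ["是", "v", "3"], ["核心", "n", "4"], ["。", "w", "5"]]
# Stand-in 1st-level model returning the labels stated in the text;
# the 2nd to 4th models would be plugged in the same way.
models = {1: lambda r: ["np", "vp", "np", "vp", "vp", "dj"]}
print(parse_columns(rows, models)[4])  # ['核心', 'n', '4', 'vp']
```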

    1.3 Constructing syntactic tree

    Constructing the syntactic tree is the inverse process of data processing. Take the sentence in Tab. 1 as an example. Firstly, the nodes from each segment up to the node in its highest level are connected as a single path. Secondly, the identical root nodes of dj are combined. Thirdly, the identical child nodes of dj are combined. The same combination is then applied to the children of every node, down to the POS level. The process is shown in Fig. 3.
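    A simplified sketch of this merging step: each segment carries its node path read from the root down, and consecutive segments sharing the same node at a given depth are merged under one node. This simplification assumes that adjacent segments with the same label at a depth belong to the same node, which holds for the example sentence.

```python
# Sketch of tree construction by merging per-segment node paths (root first).
def build_tree(items):
    """items: [(word, [labels, root first])] -> nested (label, children)."""
    label = items[0][1][0]            # all items here share the top label
    children, i = [], 0
    while i < len(items):
        word, path = items[i]
        if len(path) == 1:            # path consumed: attach the word here
            children.append(word)
            i += 1
        else:                         # group consecutive segments sharing
            nxt, group, j = path[1], [], i        # the same next node
            while (j < len(items) and len(items[j][1]) > 1
                   and items[j][1][1] == nxt):
                group.append((items[j][0], items[j][1][1:]))
                j += 1
            children.append(build_tree(group))
            i = j
    return (label, children)

# Node paths for the example sentence, read from Tab. 1 right to left.
items = [("姚明", ["dj", "np"]), ("說", ["dj", "vp"]),
         ("范甘迪", ["dj", "vp", "dj", "np"]), ("是", ["dj", "vp", "dj", "vp"]),
         ("核心", ["dj", "vp", "dj", "vp"]), ("。", ["dj"])]
```

    Calling `build_tree(items)` reproduces the nested tree of Fig. 2.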

    Fig. 3 Process of constructing syntactic tree

    2 Experiment

    Two experiments are designed to choose the CRF++ training parameters and to verify the efficiency, stability and precision of the method: one selects the parameters and the other verifies the method. To choose the training parameters of the 14 models, parameter selection applies the grid method to find the best c and f. The value of c ranges from 0.2 to 6 with a step of 0.2, and the value of f ranges from 1 to 5 with a step of 1. Method verification uses the parameters with the highest precision for labeling the nodes at each level to train the models, and then the syntactic trees are verified in both open and closed testing based on 10-fold cross-validation. The training and testing times are also recorded.
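    The grid over c and f can be sketched as follows; `evaluate` stands in for training a model with a given (c, f) pair and measuring its labeling precision:

```python
# Sketch of the grid search over the CRF++ parameters c and f:
# c ranges over 0.2, 0.4, ..., 6.0 and f over 1, ..., 5.
def grid_search(evaluate):
    best = (None, None, -1.0)
    for ci in range(1, 31):
        c = round(0.2 * ci, 1)        # 0.2 to 6.0 in steps of 0.2
        for f in range(1, 6):         # 1 to 5 in steps of 1
            p = evaluate(c, f)        # precision for this (c, f) pair
            if p > best[2]:
                best = (c, f, p)
    return best                       # (best c, best f, best precision)
```

    For the 1st model, this search yields c = 0.4 and f = 1 in the experiments below.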

    2.1 Data set and hardware

    The data set consists of 10 000 Chinese tagged syntactic sentences in the Beijing forest studio Chinese tagged corpus (BFS-CTC)[15-16]. The average number of words in a sentence is 17.9, the maximum is 53, the minimum is 3, and the maximum syntactic level number in the BFS-CTC is 14.

    The main hardware most relevant to the computing speed is as follows.

    ① CPU: Intel® Xeon® X5650 @ 2.67 GHz (2 CPUs).

    ② Memory: 8 GB.

    ③ System type: Windows Server 2008 R2 Enterprise SP1, 64 bit.

    2.2 Evaluation

    2.3 Results

    2.3.1 Parameters selecting

    Take the 1st model training as an example to show the process of parameter selection. A total of 18 features are used for the 1st model training, including the POS, the POS of the former segment, etc., as shown in Tab. 2. Based on these features, partial results are shown in Fig. 4; the best result is 90.47% for node labeling in the 1st level when c equals 0.4 and f equals 1. The process of parameter selection for the other 13 models is the same.

    2.3.2 Method verifying

    Method verification contains a closed test and an open test. In the closed test, the P, R and F values of the 10 experimental groups are shown in Fig. 5. The average P, R and F values are 76.32%, 81.25% and 0.787 1 respectively. In the open test, the results are shown in Fig. 6. The average P, R and F values are 73.27%, 78.25% and 0.756 8.
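    The reported averages are consistent with the balanced F-measure F = 2PR/(P + R) (assumed here; the text does not state the formula explicitly):

```python
# Check that the reported F values follow from the reported P and R under
# the balanced F-measure F = 2PR / (P + R).
def f_measure(p, r):
    return 2 * p * r / (p + r)

print(round(f_measure(0.7632, 0.8125), 4))  # closed test -> 0.7871
print(round(f_measure(0.7327, 0.7825), 4))  # open test   -> 0.7568
```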

    Tab. 2 Features used in training the 1st-level model

    Fig. 4 Partial results of parameter selection

    Fig.5 P, R and F in closed test

    Fig. 6 P, R and F in open test

    Under the same conditions, recent results of the PCFGs[9] are shown in Tab. 4. The time in the table includes rule collection and testing for one experimental group. In Tab. 4, P, R and F are calculated using PARSEVAL[17]. The time to test a sentence for PCFG, L-PCFG, S-PCFG and SL-PCFG is 1 847.1 ms, 1 425.9 ms, 205.8 ms and 82 ms in the closed set, and 1 850.2 ms, 1 412.2 ms, 178.3 ms and 68.1 ms in the open set. In comparison with previous work, the time of the proposed method is very competitive: it is about 22 times faster than the PCFGs and gives almost equal results on all three measures. In particular, although SL-PCFG is the fastest in Tab. 4 and its time almost equals the total time of the proposed method in the closed test, the proposed method gives significantly better results in the open test, as shown in Tab. 4.

    Tab. 3 Time of training and testing in closed and open sets (ms)

    Tab.4 Results of PCFGs

    Some other previous related works are shown in Tab. 5. These models used the Penn Chinese Treebank. The average runtimes of Levy and Manning's model, Bikel's model and the stacked-classifier model to analyze a sentence in the Penn Chinese Treebank are 117.7 ms, 776.6 ms and 64.1 ms respectively. The runtime is calculated by dividing the time reported in Ref. [13] by the number of sentences used in the experiment. Because of the differences in the data sets, these results cannot be compared directly with the proposed method, but they are still a significant reference. They show that previous works were not fast enough until machine learning algorithms were applied to the parser. The fastest result in Tab. 5 costs 64.1 ms to analyze a sentence, while the proposed method costs 4.63 ms; however, the three measures of the proposed method are much lower.

    Tab. 5 Previous related works

    In a word, the proposed method is fast at testing a sentence and stable in both closed and open tests. In addition, the syntactic structure may fail to be analyzed by the PCFGs and the previous works in Tab. 5, while no such failure occurs with the proposed method. However, the P of the proposed method is not high, possibly because the nodes and the number of syntactic levels predicted by the classifiers cannot all be correct.

    3 Conclusion

    A fast method based on conditional random fields is proposed for syntactic parsing. To construct a syntactic tree, the method first recognizes the nodes bottom-up, from the node nearest each segment to the root node; it then combines the identical predicted nodes top-down, from the root node to the segments. Based on the BFS-CTC, two experiments are designed to illustrate the process of choosing the training parameters and to verify the validity of the method. In addition, the results of the method are compared with the PCFGs. The results show that the method is faster in training and testing. The method is a new way for syntactic analysis; owing to the generalization ability of machine learning, it can be used for applications that do not need high precision but require fast speed.

    [1] Xue N, Xia F, Chiou F, et al. The Penn Chinese TreeBank: phrase structure annotation of a large corpus [J]. Natural Language Engineering, 2005, 11(2): 207-238.

    [2] Bikel D M, Chiang D. Two statistical parsing models applied to the Chinese Treebank[C]∥Proceedings of the second workshop on Chinese language processing: held in conjunction with the 38th Annual Meeting of the Association for Computational Linguistics-Volume 12. Stroudsburg, PA, USA: Association for Computational Linguistics, 2000: 1-6.

    [3] Bikel D M. On the parameter space of generative lexicalized statistical parsing models [D]. Philadelphia: University of Pennsylvania, 2004.

    [4] Chiang D, Bikel D M. Recovering latent information in treebanks[C]∥Proceedings of the 19th international conference on Computational linguistics-Volume 1, Stroudsbury, PA, USA: Association for Computational Linguistics, 2002: 1-7.

    [5] Levy R, Manning C. Is it harder to parse Chinese, or the Chinese Treebank?[C]∥Proceedings of the 41st Annual Meeting on Association for Computational Linguistics-Volume 1. Stroudsburg, PA, USA: Association for Computational Linguistics, 2003: 439-446.

    [6] Xiong D, Li S, Liu Q, et al. Parsing the Penn Chinese Treebank with semantic knowledge[M]∥Natural Language Processing-IJCNLP 2005. Berlin, Heidelberg: Springer, 2005: 70-81.

    [7] Jiang Zhengping. Statistical Chinese parsing [D]. Singapore: National University of Singapore, 2004.

    [8] Mi H T, Xiong D Y, Liu Q. Research on strategies for integrating Chinese lexical analysis and parsing[J]. Journal of Chinese Information Processing, 2008, 22(2): 10-17. (in Chinese)

    [9] Chen Gong, Luo Senlin, Chen Kaijiang, et al. Method for layered Chinese parsing based on subsidiary context and lexical information [J]. Journal of Chinese Information Processing, 2012, 26(01): 9-15. (in Chinese)

    [10] Yamada H, Matsumoto Y. Statistical dependency analysis with support vector machines [J]. Machine Learning, 1999, 34(1-3): 151-175.

    [11] Sagae K, Lavie A. A classifier-based parser with linear run-time complexity [C]∥Proceedings of the Ninth International Workshop on Parsing Technology (IWPT 2005). Stroudsburg, PA, USA: Association for Computational Linguistics, 2005: 125-132.

    [12] Cheng Y, Asahara M, Matsumoto Y. Machine learning-based dependency analyzer for Chinese [J]. Journal of Chinese Language and Computing, 2005, 15(1): 13-24.

    [13] Wang M, Sagae K, Mitamura T. A fast, accurate deterministic parser for Chinese [C]∥Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics (COLING/ACL 2006). Stroudsburg, PA, USA: Association for Computational Linguistics, 2006: 425-432.

    [14] Lafferty J, McCallum A, Pereira F. Conditional random fields: probabilistic models for segmenting and labeling sequence data [C]∥Proceedings of the Eighteenth International Conference on Machine Learning. San Francisco, CA, USA: Morgan Kaufmann, 2001: 282-289.

    [15] Luo Senlin, Liu Yingying, Feng Yang, et al. Method of building BFS-CTC: a Chinese tagged corpus of sentential semantic structure [J]. Transactions of Beijing Institute of Technology, 2012, 32(03): 311-315. (in Chinese)

    [16] Liu Yingying, Luo Senlin, Feng Yang, et al. BFS-CTC: a Chinese corpus of sentential semantic structure [J]. Journal of Chinese Information Processing, 2013, (27): 72-80. (in Chinese)

    [17] Charniak E. Statistical parsing with a context-free grammar and word statistics [C]∥the Fourteenth National Conference on Artificial Intelligence and Ninth Conference on Innovative Applications of Artificial Intelligence, Providence, Rhode Island, 1997.

    (Edited by Cai Jianying)

    DOI: 10.15918/j.jbit1004-0579.201524.0414

    CLC number: TP 391.1 Document code: A Article ID: 1004-0579(2015)04-0519-07

    Received 2014-05-03

    Supported by the Science and Technology Innovation Plan of Beijing Institute of Technology (2013)

    E-mail: panlimin_bit@126.com
