
    Deep Feature Fusion Model for Sentence Semantic Matching

Xueping Peng and Ruoyu Zhang

Computers, Materials & Continua, No. 11, 2019

Abstract: Sentence semantic matching (SSM) is a fundamental task in natural language processing applications such as question answering and machine translation. The latest SSM research benefits from deep learning techniques, incorporating attention mechanisms to semantically match given sentences. However, fully capturing the semantic context without losing significant features during sentence encoding is still a challenge. To address this challenge, we propose a deep feature fusion model and integrate it into the most popular deep learning architecture for the sentence matching task. The integrated architecture mainly consists of an embedding layer, a deep feature fusion layer, a matching layer and a prediction layer. In addition, we compare commonly used loss functions and propose a novel hybrid loss function that integrates MSE and cross entropy, using a confidence interval and threshold setting to preserve the indistinguishable instances during training. To evaluate our model performance, we experiment on two real-world public data sets: LCQMC and Quora. The experimental results demonstrate that our model outperforms most existing advanced deep learning models for sentence matching, benefiting from the enhanced loss function and the deep feature fusion model for capturing semantic context.

Keywords: Natural language processing, semantic matching, deep learning.

    1 Introduction

Sentence semantic matching (SSM) is a fundamental problem in many natural language processing tasks, such as natural language inference [Mueller and Thyagarajan (2016); Liu, Sun, Lin et al. (2016); Wang, Hamza and Florian (2017); Gong, Luo and Zhang (2017)], question answering (QA) [Qiu and Huang (2015); Tan, Santos, Xiang et al. (2015); Zhang, Zhang, Wang et al. (2017)] and machine translation [Bahdanau, Cho and Bengio (2014)]. For example, a frequently asked question (FAQ) based QA system often organises question-answer pairs into tuples (qi, ai) first, then tries to figure out which question in the question-answer pairs is semantically most similar to the given query sentence by SSM algorithms. If (qi, ai) is the optimally matched question-answer pair, the answer sentence ai will be the target answer sentence for the FAQ-based QA system.
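To make this FAQ matching procedure concrete, here is a minimal Python sketch (not from the paper's released code) of retrieving an answer by scoring the query against every stored question with an SSM model; ssm_score is a hypothetical stand-in for any such scoring function.

from typing import Callable, List, Tuple

def answer_query(query: str,
                 qa_pairs: List[Tuple[str, str]],
                 ssm_score: Callable[[str, str], float]) -> str:
    # Score the query against every stored question q_i and return the
    # answer a_i paired with the best-matching question.
    _, best_answer = max(qa_pairs, key=lambda qa: ssm_score(query, qa[0]))
    return best_answer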

The last decade has witnessed an amazing development of deep learning; many academic and industry giants have put great effort into innovative deep learning models and applications. No doubt many NLP research tasks such as SSM benefit from innovative deep learning models as well. In deep learning based NLP, most research focuses on sentence encoding to obtain better sentence feature vectors and better feature interaction matching [Kim (2014); Mou, Peng, Li et al. (2015); Mueller and Thyagarajan (2016)]. Let us take the two most popular deep learning model families, convolutional neural networks (CNNs) and recurrent neural networks (RNNs), as examples to detail this a bit further. CNNs have been widely applied in QA [Qiu and Huang (2015); Zhang, Zhang, Wang et al. (2017)] and text classification tasks [Kim (2014); Conneau, Schwenk, Barrault et al. (2016)]. CNNs are very advantageous for extracting sequence features and sentence encoding [Blunsom, Grefenstette and Kalchbrenner (2014); Hu, Lu, Li et al. (2014); Yin and Schütze (2015); Zhang, Zhang, Wang et al. (2017); Bai, Kolter and Koltun (2018)]. RNNs are a time-aware sequential deep learning model family, able to transmit neural unit values from previous time states to the current neural units to optimise feature weights. The RNN's memory function is advantageous for handling contextual sequential data. Obviously, a textual sentence can be considered as a word sequence, so RNNs can easily be applied to textual tasks. In short, RNNs are excellent for sentence encoding through sequential data modelling [Hochreiter and Schmidhuber (1997); Gers and Schmidhuber (2000); Cho, Van Merriënboer, Gulcehre et al. (2014); Jozefowicz, Zaremba and Sutskever (2015); Tan, Santos, Xiang et al. (2015); Greff, Srivastava, Koutník et al. (2017)]. However, neither CNN nor RNN models can fully capture all features in the feature extraction or encoding process. This feature loss problem further causes semantic context loss when understanding the semantics of very long sentences.

In order to solve the semantic information loss problem, and to better remember the key semantic context when understanding textual sentences, attention mechanisms have been incorporated into the NLP research area. It has already been proved that attention mechanisms can greatly contribute to neural network based machine translation [Bahdanau, Cho and Bengio (2014)] and sequence encoding [Yin, Schütze, Xiang et al. (2015); Yang, Yang, Dyer et al. (2016); Lin, Shen, Liu et al. (2016); Wang, Hamza and Florian (2017); Gong, Luo and Zhang (2017); Kim, Hong, Kang et al. (2018)] through a better mechanism for retaining semantic context.

Considering the excellent semantic context capturing ability of the attention mechanism, our work follows this research stream and proposes a hybrid approach, named the deep feature fusion model, to further memorize semantic features. Our hybrid deep feature fusion model consists of multiple separate sequence encoding approaches and an aggregation component that integrates the different encoding outcomes. Another motivation is to design an innovative loss function to prevent over-fitting, which is essential in modelling. In most existing deep learning applications, cross entropy is commonly used as the loss function to train models. This approach, derived from maximum likelihood estimation, tends to push outputs toward 0 or 1 even in the presence of input noise, which can cause over-fitting. However, to the best of our knowledge, there is very little work on designing new loss functions. This paper also contributes a new alternative loss function incorporating a confidence interval and threshold setting.

    The main contributions are summarized as follows:

● We proposed a novel sentence encoding method named deep feature fusion to better capture semantic context via the integration of multiple sequence encoding approaches.

● We integrated the deep feature fusion approach into the most popular deep learning architecture for the sentence matching task. The new architecture mainly consists of an embedding layer, a deep feature fusion layer, a matching layer and a prediction layer.

● We proposed a new loss function considering confidence interval and threshold setting, which preserves the loss caused by fuzzy instances and focuses more on indistinguishable instances.

● We evaluated our approach on the common Chinese semantic matching corpus LCQMC and the public English semantic matching corpus Quora. The results demonstrate that our approach outperforms other published advanced models on LCQMC and achieves second place on the Quora corpus, which shows the effectiveness of our proposed method.

● We also open sourced our models on GitHub (https://github.com/XuZhangp/Work2019_DFF_SSM) to benefit the whole NLP community.

The rest of the paper is structured as follows. We introduce the related work about sentence semantic matching in Section 2, and propose our new deep feature fusion model and architecture in Section 3. Section 4 presents the empirical experimental results, followed by the conclusion in Section 5.

    2 Related work

Sentence pair modeling has received extensive attention in the last decade. Many complex natural language processing tasks can be simplified into sentence semantic matching tasks. For example, information retrieval matches query terms to documents, and a QA system matches the query sentence to the given questions within question-answer pairs.

Probably a decade ago, SSM research mainly focused on latent semantic analysis and basic syntactic similarity calculation [Das and Smith (2009); Surdeanu, Ciaramita and Zaragoza (2011); Bär, Biemann, Gurevych et al. (2012); Meng, Lu, Zhang et al. (2018); Lu, Wu, Jian et al. (2018)]. With the advent of more competitive deep learning techniques, much more attention also turned to deep learning based SSM [Zhang, Lu, Ou et al. (2019)]. For example, deep structured semantic models [Huang, He, Gao et al. (2013)] and siamese networks [Mueller and Thyagarajan (2016)] simply encoded two sentences via a fully connected CNN or RNN, then calculated sentence matching similarity without considering the local phrase structure existing in the sentences. Further, Wan et al. proposed using a BiLSTM to encode the query sentence and the candidate sentence, and then calculated whether the LSTM hidden layer outputs matched. BiLSTM contributed significantly to the SSM research area because it is capable of handling temporal relationships between sentences, capturing long-term word dependencies, and examining the meaning of each word in different contexts [Wan, Lan, Guo et al. (2016)]. Pang et al. constructed three superposed matching matrices to consider the word-word relationships between sentences. As in image applications, a CNN is then applied to extract significant features from these matching matrices [Pang, Lan, Guo et al. (2016)]. To further retain long-term context, Bai et al. proposed the Temporal Convolutional Network (TCN) [Bai, Kolter and Koltun (2018)]. Experiments showed that TCN based sentence encoding not only memorizes long-term context more realistically, but also outperforms LSTM [Hochreiter and Schmidhuber (1997)].

To further improve matching performance, interactive mechanisms such as attention have emerged to mine the connections between different words in sentences using more elaborate structures. For example, Wang et al. applied four different methods to consider the interaction between sentences [Wang, Hamza and Florian (2017)], based on the idea that sequential sentences should not be considered in one direction only. Tomar et al. modified the input representation of the decomposable attention model, using n-gram character embedding instead of word embedding. They then pre-trained all model parameters on Paralex [Fader, Zettlemoyer and Etzioni (2013)], a noisy automatically collected corpus of question paraphrases, and fine-tuned the parameters on the Quora dataset [Tomar, Duque, Täckström et al. (2017)]. Gong et al. proposed a complex deep neural network [Gong, Luo and Zhang (2017)] to consider the interactions between different words in the same sentence and to retain the original features through the DenseNet [Huang, Liu, Van Der Maaten et al. (2017)] model. Kim et al. proposed a densely connected co-attentive RNN, which preserves the original information from the lowest level to the highest level. In each block of the stacked RNN, the interaction between two sentences is achieved through co-attention. Because the stacked RNN rapidly increases the number of parameters, an autoencoder [Kim, Hong, Kang et al. (2018)] is also used to compress them. Subramanian et al. tried to combine the learning objectives of different sentence representations into a single multi-task framework [Subramanian, Trischler, Bengio et al. (2018)].

Although the above deep learning models have already achieved good performance on sentence encoding, two challenges remain to be addressed. First, the common RNN or CNN methods still have problems capturing long-term context. Second, the attention mechanism and multi-granularity matching strategies might lose features during the sentence encoding process. Regarding these challenges, our proposed deep feature fusion model and extended deep learning architecture demonstrate clear advantages in sentence encoding.

    3 Deep feature fusion model

This section details our proposed deep feature fusion model, starting with the model architecture, followed by its sub-modules, including the embedding layer, deep feature fusion layer, matching layer, prediction layer and the improved loss function.

Figure 1: Model architecture of sentence matching

    3.1 Model architecture

We first introduce the model architecture in Fig. 1. As shown, this model architecture includes multiple connected layers: an embedding layer, a deep feature fusion layer, a matching layer and a prediction layer. Given input sentences, we first embed words and phrases through the embedding layer, then pass the output of the embedding layer to the deep feature fusion layer to extract and fuse semantic features. This feature fusion layer is our key contribution to improving the semantic encoding performance. After semantic feature extraction, the encoding output of the deep feature fusion module is fed into the matching layer, followed by a decoding process in the prediction layer via a sigmoid function.

    3.2 Embedding layer

The embedding layer converts tokens such as words or phrases into embedding vectors first, then constructs a sentence matrix representation from the word and phrase embedding vectors. Multiple embedding approaches, for example a pre-trained word embedding corpus, can be applied to map tokens (words and phrases) to embedded vectors. For the experiments on LCQMC, we use a word and character embedding approach that embeds tokens by randomly initializing the word (character) vectors. For Quora, we apply pre-trained 300-dimensional word embedding vectors from GloVe [Pennington, Socher and Manning (2014)] to map sentence tokens to high-dimensional embedding vectors.
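As an illustration of the two embedding strategies described above (randomly initialised word/character vectors for LCQMC versus pre-trained GloVe vectors for Quora), here is a hedged Keras sketch; vocab_size, max_len and glove_matrix are assumed placeholders, not values from the paper.

import numpy as np
from tensorflow.keras.layers import Embedding
from tensorflow.keras.initializers import Constant

vocab_size, max_len, dim = 50000, 30, 300

# LCQMC: word (character) vectors are randomly initialised and learned during training.
lcqmc_embedding = Embedding(vocab_size, dim, input_length=max_len)

# Quora: 300-dimensional pre-trained GloVe vectors are loaded as initial weights.
glove_matrix = np.random.rand(vocab_size, dim)  # placeholder for the real GloVe matrix
quora_embedding = Embedding(vocab_size, dim,
                            embeddings_initializer=Constant(glove_matrix),
                            input_length=max_len)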

    3.3 Deep feature fusion layer

This architecture includes a key encoding process named deep feature fusion, as shown in Fig. 2. The deep feature fusion process starts by taking the embedding matrix produced by the embedding layer. The output of the embedding layer for the SSM task consists of two token embedding matrices, i.e., P = [p1, ..., pM] and Q = [q1, ..., qM], representing the two sentences to be matched. We then apply LSTM layers and Dense layers wrapped in TimeDistributed to encode the embedding matrix. To reach the best performance, we separately apply LSTM and Dense twice on a sentence embedding matrix. The encoding then results in four different outputs in two categories: LSTMs and Denses.

Figure 2: Deep feature fusion

For a given sentence P, the two LSTM outputs can be calculated using Eq. (1) and Eq. (2); the Dense outputs are calculated similarly through Eq. (3) and Eq. (4).
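Since the displayed equations are not reproduced here, the following is a plausible reconstruction of Eqs. (1)-(4) based on the description above, with LSTM_1, LSTM_2, Dense_1 and Dense_2 denoting the two independent LSTM and TimeDistributed Dense encoders:

\begin{align}
  L_1^P &= \mathrm{LSTM}_1(P), \qquad L_2^P = \mathrm{LSTM}_2(P) \tag{1, 2}\\
  D_1^P &= \mathrm{Dense}_1(P), \qquad D_2^P = \mathrm{Dense}_2(P) \tag{3, 4}
\end{align}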

The final step of the deep feature fusion model is to aggregate the encoding outputs using two simple operations: ADD and Concatenate. The ADD operation aggregates the results with the same data structure, Eq. (5) for the LSTMs and Eq. (6) for the Denses. After the ADD operation on same-structure data, we then perform a Concatenate operation, shown in Eq. (7), to hybridize the LSTM and Dense outcomes.
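Correspondingly, a hedged reconstruction of the aggregation step, Eqs. (5)-(7), is:

\begin{align}
  F_L^P &= L_1^P + L_2^P \tag{5}\\
  F_D^P &= D_1^P + D_2^P \tag{6}\\
  F^P   &= \bigl[\,F_L^P \,;\, F_D^P\,\bigr] \tag{7}
\end{align}

where + denotes the element-wise ADD operation and [ ; ] denotes concatenation.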

In this way, the embedding vector is further enhanced by the ADD operation within same-dimension data, and the memorized semantic context is increasingly captured by the Concatenate operation across different-dimension data. As with sentence P, the same encoding methods are applied to the matching sentence Q. The deep feature fusion outputs are then forwarded to the upper matching layer.
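The following is a minimal Keras sketch of the deep feature fusion encoder applied to one sentence's embedding matrix, consistent with the description above; layer sizes follow Section 4.2, but the exact hyper-parameters and dropout placement of the released implementation may differ.

from tensorflow.keras import layers

def deep_feature_fusion(embedded, units=300, dropout=0.1):
    # embedded: tensor of shape (batch, seq_len, embed_dim) from the embedding layer.
    # Two separate LSTM encodings of the same sentence (Eqs. 1-2).
    lstm_1 = layers.LSTM(units, return_sequences=True)(embedded)
    lstm_2 = layers.LSTM(units, return_sequences=True)(embedded)
    # Two TimeDistributed Dense encodings (Eqs. 3-4).
    dense_1 = layers.TimeDistributed(layers.Dense(units, activation='relu'))(embedded)
    dense_2 = layers.TimeDistributed(layers.Dense(units, activation='relu'))(embedded)
    # ADD the outputs of the same type (Eqs. 5-6), then concatenate the two streams (Eq. 7).
    lstm_sum = layers.Add()([lstm_1, lstm_2])
    dense_sum = layers.Add()([dense_1, dense_2])
    fused = layers.Concatenate()([lstm_sum, dense_sum])
    # Dropout of 0.1 as in Section 4.2; its exact placement is an assumption.
    return layers.Dropout(dropout)(fused)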

    3.4 Matching layer

Taking the output of the deep feature fusion layer as input, the matching layer applies three different matching strategies to calculate the similarity/dissimilarity between the two sentence vectors P and Q as tensors. The first is to calculate the absolute distance between the two sentence vectors, as shown in Eq. (8) (see the Keras backend at https://github.com/keras-team/keras/blob/master/keras/backend/tensorflowbackend.py). The second is to multiply the vectors together using Eq. (9) (see https://github.com/keras-team/keras/blob/master/keras/layers/merge.py). The last strategy is to calculate the cosine value for the two vectors using Eq. (10). Finally, the three calculated tensors are forwarded to the next prediction layer.
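Since the equations themselves are not shown here, one plausible reading of the three matching operations, Eqs. (8)-(10), for the fused sentence representations P and Q is:

\begin{align}
  m_1 &= \lvert P - Q \rvert \tag{8}\\
  m_2 &= P \odot Q \tag{9}\\
  m_3 &= \cos(P, Q) = \frac{P \cdot Q}{\lVert P \rVert\,\lVert Q \rVert} \tag{10}
\end{align}

where ⊙ denotes element-wise multiplication.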

    3.5 Prediction layer

The prediction layer itself is a deep neural network (DNN) classifier, including multiple sub-layers to fully extract sentence matching features from the lower matching layer. In the DNN based prediction layer, the encoding and matching modules map the input to the hidden feature space, and the fully connected network maps the learned distributed feature representation to the sample label space. This prediction layer consists of three dense sub-layers, with 600 dimensions for the first two dense layers and 1 dimension for the last. Each dense layer is followed by dropout, ReLU and normalization modules. The last dense sub-layer performs classification using a sigmoid activation function.
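A hedged Keras sketch of this prediction head is given below; the ordering of the dropout, ReLU and normalization modules after each dense sub-layer is an assumption based on the text.

from tensorflow.keras import layers

def prediction_layer(matching_features, dropout=0.5):
    # Three dense sub-layers of widths 600, 600 and 1, as described above.
    x = matching_features
    for width in (600, 600):
        x = layers.Dense(width)(x)
        x = layers.Dropout(dropout)(x)
        x = layers.Activation('relu')(x)
        x = layers.BatchNormalization()(x)
    # Final 1-dimensional dense layer with sigmoid for binary classification.
    return layers.Dense(1, activation='sigmoid')(x)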

    3.6 Improved loss function

As a popular evaluation metric, the mean square error (MSE) in Eq. (11) measures the average of the squares of the errors, that is, the average squared difference between the true values ytrue and the estimated values ypred. MSE values closer to zero are better.
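Taking Eq. (11) to be the standard mean squared error over N training instances:

\begin{equation}
  L_{\mathrm{MSE}} = \frac{1}{N}\sum_{i=1}^{N}\bigl(y_{\mathrm{true}}^{(i)} - y_{\mathrm{pred}}^{(i)}\bigr)^{2} \tag{11}
\end{equation}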

As we know, the gradient of the MSE loss is larger when the loss value is higher, and it drops as the loss approaches zero, making it more accurate at the end of training. But one disadvantage of using MSE is that its partial derivative is very small when the output probability is close to 0 or close to 1, which may cause the partial derivative to almost vanish when the model first starts training. This results in a very slow learning rate at the beginning of training. This issue can be partly solved by fitting the ones_like distribution (https://github.com/keras-team/keras/blob/master/keras/backend/tensorflowbackend.py) as shown in Eq. (12),

where r > 0 and k ∈ [0, 1].

However, cross entropy in Eq. (13), as an alternative loss function, does not have this problem. Cross entropy is a very popular loss function in machine learning research, used to measure the similarity between predicted values and the original values. Coupled with the sigmoid function, cross entropy is able to solve the above MSE problem of a very slow learning rate when the gradient decreases, because the learning rate can be controlled by the output error.
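Taking Eq. (13) to be the standard binary cross entropy:

\begin{equation}
  L_{\mathrm{CE}} = -\frac{1}{N}\sum_{i=1}^{N}\Bigl[y_{\mathrm{true}}^{(i)}\log y_{\mathrm{pred}}^{(i)} + \bigl(1 - y_{\mathrm{true}}^{(i)}\bigr)\log\bigl(1 - y_{\mathrm{pred}}^{(i)}\bigr)\Bigr] \tag{13}
\end{equation}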

To avoid the over-fitting problem during training, we update the cross entropy loss function in Eq. (14) by introducing a unit step function θ(x), where the step function θ(x) and the correction term λ(ytrue, ypred) are given in Eq. (15) and Eq. (16).

We can see that L_new_cross_entropy adds the correction term λ(ytrue, ypred) to the commonly used cross entropy. For a positive sample, that means ytrue = 1, and we have λ(1, ypred) = 1 - θ(ypred - m). In this situation, if ypred > m, then λ(1, ypred) = 0 and the cross entropy is automatically 0 (reaching its minimum value). Otherwise, if ypred < m, then λ(1, ypred) = 1. So, for our proposed new cross entropy, if the prediction for a positive sample is bigger than m, it won't be updated because the minimum value is already reached; otherwise, if it is smaller than m, it will be updated. Similarly, for a negative sample, if the output is smaller than 1-m, it won't be updated; otherwise, if it is bigger than 1-m, it will continue to be updated. To improve classification performance and speed up training, we combine the MSE, the modified MSE and our proposed cross entropy together as our final hybrid loss function, shown in Eq. (17),
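Because the displayed equations are missing from this version of the text, the following is only a hedged reconstruction of Eqs. (14) and (17) consistent with the description above, where θ(x) is the unit step function, m is the threshold, and L_MSE' denotes the modified MSE of Eq. (12):

\begin{align}
  \lambda(y_{\mathrm{true}}, y_{\mathrm{pred}}) &= y_{\mathrm{true}}\bigl(1 - \theta(y_{\mathrm{pred}} - m)\bigr)
    + (1 - y_{\mathrm{true}})\,\theta\bigl(y_{\mathrm{pred}} - (1 - m)\bigr) \\
  L_{\mathrm{new\,CE}} &= \lambda(y_{\mathrm{true}}, y_{\mathrm{pred}}) \cdot L_{\mathrm{CE}} \tag{14}\\
  L_{\mathrm{hybrid}} &= p\,L_{\mathrm{MSE}} + q\,L_{\mathrm{MSE'}} + (1 - n)\,L_{\mathrm{new\,CE}} \tag{17}
\end{align}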

where n = p + q; 1 - n > 0; and p, q, n ∈ [0, 1].

The following experiments on public data sets also demonstrate that our proposed loss function outperforms the others.

    4 Experiments and results

This section starts by introducing the public data sets and our experimental settings for evaluating the proposed model, followed by result comparisons.

    4.1 Data sets

We evaluated our model on two public data sets, LCQMC and Quora. LCQMC is a large-scale Chinese question matching corpus released to the public by Liu et al. [Liu, Chen, Deng et al. (2018)], and Quora is one of the largest English question pair corpora, released to the public by Chen et al. [Chen, Zhang, Zhang et al. (2018)].

For the LCQMC data set, we pre-process the sentences using Chinese word segmentation, because Chinese does not have spaces between words as English does. For the experiments on LCQMC, we apply the Word2vec technique [Mikolov, Chen, Corrado et al. (2013)] to train word vectors. For the experiments on Quora, we keep an approach consistent with Wang et al. [Wang, Hamza and Florian (2017)], using pre-trained word vectors (300D GloVe 840B) [Pennington, Socher and Manning (2014)].

    4.1.1 LCQMC

LCQMC is a generic corpus mainly for intent matching, containing a training set of 238,766 question pairs, a development set of 8,802 question pairs, and a test set of 12,500 question pairs. Each instance consists of two parts: a sentence pair comprising two sentences, and a binary matching label indicating whether the two sentences match. To better illustrate this data set, we randomly select a few examples, shown in Tab. 1. From these example sentence pairs, we can clearly see that two matched sentences should be semantically similar.

Table 1: LCQMC corpus examples

S4: 一只蜜蜂停在日歷上(猜一成語) (EN: A bee sits on a calendar (guess an idiom))
S5: 一盒香煙不拆開 能存放多久? (EN: How long can an unopened box of cigarettes be stored?)
S6: 一條沒拆封的香煙能存放多久。 (EN: How long can an unopened carton of cigarettes be stored?)
S7: 什么是智能手環 (EN: What is a smart bracelet?)
S8: 智能手環有什么用 (EN: What is the use of a smart bracelet?)
Labels: YES / NO

    4.1.2 Quora

The Quora corpus contains over 400,000 question pairs, and each question pair is annotated with a binary label indicating whether the two questions are paraphrases of each other. To better understand the data set, we also randomly choose some examples, shown in Tab. 2.

Table 2: Quora corpus examples

4.2 Experimental settings

To allow reproduction of the experimental outcomes, we set up the experiments as follows. In the embedding layer, the embedding dimension is set to 300. In the deep feature fusion module, we also set the dimension to 300. In the prediction layer, the widths of the three dense layers are 600, 600 and 1. Dropout is 0.1 in the deep feature fusion layer and 0.5 in the prediction layer. Both the deep feature fusion and prediction layers use ReLU as the activation function, except that the last dense layer applies a sigmoid function for classification. We also tested different parameter combinations for the loss function; the optimal parameter values for the two data sets are as follows. For Quora, p and q are both set to 0.15 and m to 0.6. For LCQMC, we set p and q both to 0.35 and m to 0.7.
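For convenience, the hyper-parameters above can be summarized in a small Python configuration sketch (names are illustrative, not taken from the released code):

CONFIG = {
    "embedding_dim": 300,
    "fusion_dim": 300,
    "prediction_dense_widths": (600, 600, 1),
    "dropout_fusion": 0.1,
    "dropout_prediction": 0.5,
    "activation": "relu",        # sigmoid on the final dense layer
    "loss_params": {
        "quora": {"p": 0.15, "q": 0.15, "m": 0.6},
        "lcqmc": {"p": 0.35, "q": 0.35, "m": 0.7},
    },
}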

    4.3 Proposed model variations

● DFF_m is the baseline model. This model uses deep feature fusion to extract sentence eigenvalues, as in Section 3.3, and the interactive matching model, as in Section 3.4. It applies the MSE loss function shown in Eq. (11).

● DFF_im is similar to DFF_m, but with the improved MSE loss function as in Eq. (12).

● DFF_c is similar to DFF_m as well, but uses the cross-entropy loss function as in Eq. (13).

● DFF_ic is similar to DFF_m, but uses the improved cross-entropy loss function as in Eq. (14).

● DFF_o is also similar to DFF_m, but with the hybrid loss function integrating MSE and cross entropy as in Eq. (17).

Since the LCQMC data set is a Chinese corpus, we can encode sentences at the character level or at the word level. Therefore, each of the above five models can be further customized into two sub-models; for example, DFF_m becomes DFF_m-char and DFF_m-word. The Quora data set is encoded only at the word level.

    4.4 Experiments on LCQMC

A comparison of our work with some of the existing work, such as WMD, CBOW, CNN, BiLSTM and BiMPM from Liu et al. [Liu, Chen, Deng et al. (2018)], is shown in Tab. 3. Our model DFF_o clearly outperforms the existing models WMD, CBOW, CNN, BiLSTM and BiMPM at both the character and word levels.

Table 3: Experimental results on LCQMC

Model         Precision  Recall  F1-score  Accuracy
DFF_c-char    76.27      95.29   84.67     82.81
DFF_c-word    75.89      95.32   84.46     82.51
DFF_ic-char   77.23      94.72   85.05     83.41
DFF_ic-word   77.02      94.63   84.88     83.19
DFF_o-char    78.58      93.88   85.51     84.15
DFF_o-word    77.69      94.08   85.06     83.53

4.4.1 Comparison with existing models

At the character level, compared with WMD, CBOW, CNN, BiLSTM and BiMPM, our best model DFF_o improves the precision metric by 11.58%, 12.08%, 11.48%, 11.18% and 0.98%, recall by 12.68%, 11.08%, 8.28%, 2.88% and -0.02%, F1-score by 12.11%, 11.71%, 10.31%, 8.01% and 0.51%, and accuracy by 13.55%, 13.55%, 12.35%, 10.65% and 0.75%, respectively.

We also compare with the models WMD, Cwo, Cngram, Dedt, Scos, CBOW, CNN, BiLSTM and BiMPM at the word level. The comparison shows that precision is improved by 13.29%, 16.59%, 25.39%, 31.19%, 17.59%, 9.79%, 9.29%, 7.09% and -0.01%, recall by 17.28%, 10.48%, 4.78%, 7.68%, 5.38%, 4.18%, 9.48%, 4.78% and 0.58%, F1-score by 14.26%, 14.46%, 19.06%, 24.56%, 13.46%, 7.66%, 9.36%, 6.14% and 0.16%, and accuracy by 23.53%, 12.83%, 22.33%, 31.23%, 13.23%, 9.83%, 10.73%, 7.43% and 0.23%, respectively.

From Tab. 3, we can clearly see that our model works best at both the word and character levels on the precision, recall, F1-score and accuracy metrics.

    4.4.2 Comparison with model variations

In this subsection, we compare our proposed model variations with different loss functions. The models again have word-level and character-level versions. First, at the character level, compared to DFF_m-char, DFF_im-char, DFF_c-char and DFF_ic-char, the model DFF_o-char improves precision by 1.64%, 1.26%, 2.31% and 1.35%, F1-score by 0.67%, 0.5%, 0.84% and 0.46%, and accuracy by 1.09%, 0.75%, 1.34% and 0.74%. At the word level, compared to DFF_m-word, DFF_im-word, DFF_c-word and DFF_ic-word, the model DFF_o-word improves precision by 1.86%, 1.18%, 1.85% and 0.67%, F1-score by 0.93%, 0.41%, 0.6% and 0.18%, and accuracy by 1.32%, 0.67%, 1.02% and 0.34%. The comparison results clearly show that the improved hybrid loss function achieves the best performance. In addition, the outcome also shows that character encoding is better than word encoding.

    4.5 Experiments on Quora

We also evaluate our models on the Quora data set, with results shown in Tab. 4. We compare with the most advanced models available today: CNN, LSTM, L.D.C and BiMPM from Wang et al. [Wang, Hamza and Florian (2017)], FFNN and DECATT from Tomar et al. [Tomar, Duque, Täckström et al. (2017)], and DIIN from Gong et al. [Gong, Luo and Zhang (2017)]. Experimental results show that our proposed models are still competitive on the Quora data set.

Table 4: Experimental results on Quora

    5 Conclusion

We proposed a new deep neural network based sentence matching model and achieved strong performance on two public data sets, LCQMC and Quora. In this model, we proposed an innovative sentence encoding structure named deep feature fusion to better capture a sentence's eigenvalues. Meanwhile, we also proposed a hybrid loss function to better determine the confidence interval and threshold setting for classification performance improvement. The experiments demonstrated that our proposed model outperforms the most advanced available models, which is attributable to the proposed deep feature fusion module. Additionally, we compared our model variations with different loss functions. The comparison showed that the proposed hybrid loss function integrating MSE and cross entropy performs well on the two public data sets.

Acknowledgement: This research work is supported by the National Natural Science Foundation of China under Grant No. 61502259, the National Key R&D Program of China under Grant No. 2018YFC0831704 and the Natural Science Foundation of Shandong Province under Grant No. ZR2017MF056.
