
    A Visual-Based Gesture Prediction Framework Applied in Social Robots

IEEE/CAA Journal of Automatica Sinica, 2022, No. 3

Bixiao Wu, Junpei Zhong, and Chenguang Yang

Abstract—In daily life, people use their hands in various ways for most daily activities. There are many applications based on the position, direction, and joints of the hand, including gesture recognition, gesture prediction, robotics and so on. This paper proposes a gesture prediction system that uses hand joint coordinate features collected by the Leap Motion to predict dynamic hand gestures. The model is applied to the NAO robot to verify the effectiveness of the proposed method. First of all, in order to reduce the jitter or jumps generated in the process of data acquisition by the Leap Motion, a Kalman filter is applied to the original data. Then some new feature descriptors are introduced: the length feature, angle feature and angular velocity feature are extracted from the filtered data. These features are fed into a long short-term memory recurrent neural network (LSTM-RNN) in different combinations. Experimental results show that the combination of coordinate, length and angle features achieves the highest accuracy of 99.31%, and it can also run in real time. Finally, the trained model is applied to the NAO robot to play the finger-guessing game. Based on the predicted gesture, the NAO robot can respond in advance.

I. INTRODUCTION

CURRENTLY, computers are becoming more and more popular, and the demand for human-robot interaction is increasing. People pay more attention to research on new technologies and methods applied to human-robot interaction [1]–[3]. Making human-robot interaction as natural as daily human-human interaction is the ultimate goal. Gestures have always been considered an interactive technology that can provide computers with more natural, creative and intuitive input. Gestures have different meanings in different disciplines. In terms of interaction design, the difference between using gestures and using a mouse and keyboard is obvious, i.e., gestures are more acceptable to people. Gestures are comfortable and less limited by interactive devices, and they can provide more information. Compared with traditional keyboard and mouse control methods, the direct control of the computer by hand movement has the advantage of being natural and intuitive.

Gesture recognition [4] refers to the process of recognizing the representation of dynamic or static gestures and translating them into meaningful instructions. It is an extremely significant research direction in the area of human-robot interaction technology. Methods for realizing gesture recognition can be divided into two types: visual-based [5], [6] gesture recognition and non-visual-based gesture recognition. The study of non-vision approaches began in the 1970s. Non-vision methods usually take advantage of wearable devices [7] to track or estimate the orientation and position of fingers and hands. Gloves are very common devices in this field, and they contain sensory modules with a wired interface. The advantage of gloves is that their data do not need to be preprocessed. Nevertheless, they are very expensive for virtual reality applications. They also have wires, which makes them uncomfortable to wear. With the development of technology, current research on non-visual gesture recognition is mainly focused on EMG signals [8]–[11]. However, EMG signals are greatly affected by noise, which makes them difficult to process.

Vision-based gesture recognition is less intrusive and contributes to a more natural interaction. It refers to the use of cameras [12]–[16], such as the Kinect [17], [18] and the Leap Motion [19], [20], to capture images of gestures. Algorithms are then used to analyze and process the acquired data to extract gesture information, so that the gesture can be recognized. Because it is natural and easy to use, it has become the mainstream approach to gesture recognition. However, it is also a very challenging problem.

By using the results of gesture recognition, the subsequent gesture of performers can be predicted. This process could be called gesture prediction, and it has wider applications. In recent years, with the advent of deep learning, many deep neural networks (DNN) have been applied to gesture prediction. Zhang et al. [21] used an RNN model to predict gestures from raw sEMG signals. Wei et al. [22] combined a 3D convolutional residual network and a bidirectional LSTM network to recognize dynamic gestures. Kumar et al. [23] proposed a multimodal framework based on hand features captured from Kinect and Leap Motion sensors to recognize gestures, using a hidden Markov model (HMM) and a bidirectional long short-term memory (LSTM) model. The LSTM [24] has become an effective model for solving learning problems related to sequence data. Hence, inspired by these previous works, we adopt the LSTM to predict gestures in our proposed framework.

Fig. 1. Pipeline of the proposed approach.

In gesture prediction methods, hand key point detection is one of the most important steps. In the early stage of technological development, researchers mainly used color filters to segment the hands to achieve detection. However, this type of method relies on skin color, and the detection performance is poor when the hand is in a complex scene. Therefore, researchers proposed detection methods based on 3D hand key points. The goal of 3D hand key point estimation is to locate the 3D coordinates of hand joints in a frame of a depth image, and it is mostly used in virtual immersive games, interactive tasks [25], [26], and so on. The Leap Motion is a device for vision-based 3D data extraction. It can extract the positions of the hand joints, the hand orientation and the speed of the fingertip movements. Recently, the Leap Motion has often been used by researchers for gesture recognition and prediction. For example, some scholars use it to recognize American sign language (ASL) [27], [28] with high recognition accuracy. Moreover, Zeng et al. [29] proposed a gesture recognition method based on deterministic learning and joint calibration of the Leap Motion, and Marin et al. [30] developed a method combining the Leap Motion and Kinect to calculate different features of the hand, obtaining a higher accuracy. In this work, we use the hand key point data detected by the Leap Motion to predict gestures and utilize the recognition results to play the finger-guessing game. This game contains three gestures: rock, paper and scissors. The winning rules of the game are: scissors beats paper, paper beats rock, and rock beats scissors. Based on these rules, this paper proposes a method to judge gestures in advance, before the player has completed the action.

The combination of the Leap Motion and the LSTM significantly improves human-robot interaction. The Leap Motion can track each joint of the hand directly, which enables gesture recognition and prediction. Moreover, compared with other devices, the Leap Motion has higher localization precision. On the other hand, the LSTM can solve prediction problems well in most cases, and it is one of the important algorithms of deep learning (DL). This work combines the strengths of the LSTM and the Leap Motion to predict gestures. The Leap Motion captures 21 three-dimensional joint coordinates in each frame, and the LSTM network is used to train and test these features. The novel contributions of this work are as follows:

1) A method for predicting gestures based on the LSTM is proposed. The gesture data are collected by the Leap Motion.

2) In order to reduce or eliminate the jitter or jumps generated in the process of acquiring data with the Leap Motion, the Kalman filter is applied, which solves this problem effectively.

3) We propose a reliable feature extraction method, which extracts coordinate features, length features, angle features and angular velocity features, and combines these features to predict gestures.

4) We apply the trained model to the NAO robot and make it play the finger-guessing game with players, which effectively verifies the real-time performance and accuracy of the proposed approach.

The rest of this paper is structured as follows: in Section II, the data processing pipeline is described. In Section III, the experiments of this work are introduced in detail and the effectiveness of the method is verified. Finally, Section IV gives a summary. The framework of this paper is shown in Fig. 1.

II. DATA PROCESSING

A. Leap Motion Controller

The structure of the Leap Motion is not complicated, as shown in Fig. 2. The main part of the device consists of two cameras and three infrared LEDs. They track infrared light with a wavelength of 850 nanometers, outside the visible spectrum. Compared with other depth cameras, such as the Kinect, the information obtained from the Leap Motion is limited (only a few key points rather than complete depth information) and it works in a smaller three-dimensional area. However, the data acquired with the Leap Motion are more accurate. Moreover, the Leap Motion provides software that can recognize some movement patterns, such as swipe and tap. Developers can access functions of the Leap Motion through its application programming interface (API) to create new applications. For example, they can obtain information about the position and length of the user's hand to recognize different gestures.

Fig. 2. The structure of the Leap Motion.

Even though the manufacturer declares that the position measurement accuracy of the Leap Motion is around 0.01 mm, [31] shows that it is in fact about 0.2 mm for static measurements and 0.4 mm for dynamic measurements. Moreover, the finger joint coordinates extracted by the Leap Motion exhibit jitter or even jumps, which could affect the accuracy of the experimental results. In order to reduce or eliminate these phenomena, this work takes advantage of the Kalman filter to correct the predicted positions of the hand joints.

B. Data Acquisition

Each finger is labeled with a name: thumb, index, middle, ring, and pinky, and each consists of four bones (except the thumb). As shown in Fig. 3, the phalanges of a finger include the metacarpal, proximal phalanx, middle phalanx, and distal phalanx. In particular, the thumb has only three phalanges, one fewer than the other fingers. In the algorithm design, we set the length of the thumb metacarpal bone to 0 to guarantee that all five fingers have the same number of phalanges, which simplifies programming. In this work, the main data acquired by the Leap Motion are as follows (a minimal per-frame record is sketched after the list):

1) Number of Detected Fingers: Num ∈ [1, 5] is the number of fingers detected by the Leap Motion.

2) Position of the Finger Joints: P_i, i = 1, 2, ..., 20, contains the three-dimensional position of each finger joint. The Leap Motion provides a one-to-one map between coordinates and finger joints.

3) Palm Center: P_c = (x_0, y_0, z_0) represents the three-dimensional coordinates of the center of the palm in 3D space.

4) Fingertip Movement Speed: V represents the speed of each fingertip detected by the Leap Motion in the three spatial directions.
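For illustration only, the per-frame measurements listed above can be grouped into a simple record; the field names below are hypothetical and are not taken from the Leap Motion SDK.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandFrame:
    """One frame of hand data as used in this work (hypothetical field names)."""
    num_fingers: int             # Num in [1, 5], number of detected fingers
    joints: List[Vec3]           # P_i, i = 1..20, 3-D positions of the finger joints
    palm_center: Vec3            # P_c = (x_0, y_0, z_0), 3-D position of the palm center
    fingertip_speed: List[Vec3]  # V, 3-D velocity of each of the five fingertips
```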

Fig. 3. Definition of the endoskeleton in the Leap Motion.

C. Kalman Filter

1) Problem Formulation: In the process of gesture changes, the fingertips have the largest range of motion and jitter or jump more easily than the other joints; therefore, the Kalman filter is used to process the fingertip data. Compared with other filters, such as the particle filter and the Luenberger observer, the Kalman filter has sufficient accuracy and can effectively remove Gaussian noise. In addition, its low computational complexity meets the real-time requirements of this work. Therefore, the Kalman filter is adopted.

Suppose that the current fingertip position obtained by the Leap Motion is P_t and the speed is V_t. The Kalman filter assumes that these two variables obey a Gaussian distribution, each with mean μ and variance σ². For clarity, X_t denotes the best estimate at time t, and Y_t denotes its covariance matrix. The equations of X_t and Y_t are as follows:
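In the standard Kalman formulation assumed here (the block names below are our notation), the state estimate stacks the fingertip position and speed, and Y_t is its covariance matrix:

$$
X_t=\begin{bmatrix} P_t \\ V_t \end{bmatrix},\qquad
Y_t=\operatorname{cov}(X_t)=\begin{bmatrix}\Sigma_{PP} & \Sigma_{PV}\\ \Sigma_{VP} & \Sigma_{VV}\end{bmatrix}.
$$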

2) The Prediction Process: We need to predict the current state (time t) according to the state of the last time (time t−1). This prediction process can be described as follows:

where Δt is the time interval, which depends on the data acquisition rate of the Leap Motion, and α is the rate of speed change.

The matrix F_t is used to represent the prediction matrix, so (5) can be represented as follows:

and through the basic operation of covariance, Y_t can be expressed as the following equation:
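A standard constant-velocity prediction step consistent with this description is shown below; the exact form of F_t is not given in the surviving text, so the matrix is an assumption, and a process-noise term is often added to the covariance update:

$$
X_t = F_t X_{t-1},\qquad
Y_t = F_t Y_{t-1} F_t^{\top},\qquad
F_t=\begin{bmatrix}1 & \Delta t\\ 0 & \alpha\end{bmatrix}.
$$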

3) Refining the Estimate With Measurements: From the measured sensor data, the current state of the system can be roughly estimated. However, due to uncertainty, some states may be closer to the real state than the measurements acquired directly from the Leap Motion. In this work, the covariance R_t is used to express the uncertainty (such as the sensor noise), and the mean value of the measurement distribution is defined as Z_t.

Now there are two Gaussian distributions, one around the predicted value and the other around the measured value. Therefore, the two Gaussian distributions are multiplied to calculate the optimal estimate between the predicted value and the value measured by the Leap Motion, as shown in the following equations:

where μ_0, σ_0 represent the mean and variance of the predicted values, respectively; μ_1, σ_1 represent the mean and variance of the measured values, respectively; and μ′, σ′ represent the mean and variance of the calculated values, respectively.

By substituting (8) into (9), we can get the following equations:

The common part of (10) and (11) is denoted by k.

Therefore, (10) and (11) can be converted as follows:
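In the standard scalar form assumed here (writing σ² for the variance), multiplying the two Gaussians gives the common factor k and the fused mean and variance:

$$
k=\frac{\sigma_0^{2}}{\sigma_0^{2}+\sigma_1^{2}},\qquad
\mu' = \mu_0 + k\,(\mu_1-\mu_0),\qquad
\sigma'^{2} = \sigma_0^{2} - k\,\sigma_0^{2}.
$$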

4) Integrate All the Equations: In this section, the equations of the Kalman filter used in this paper are integrated.

There are two Gaussian distributions: the prediction part

    and the measurement part

    We then put them into (13) and (14) to get the following equation:

And according to (12), the Kalman gain is as follows:

Next, the above three formulas are simplified. On both sides of (17) and (18), we left-multiply by the inverse of F_t; on both sides of (18), we also right-multiply by the inverse of the corresponding matrix. We then obtain the following simplified equations:

The result is the new optimal estimate of the data collected by the Leap Motion, which, together with its covariance, is fed into the next prediction and update step, iterating continuously. Through the above steps, the data collected by the Leap Motion become more accurate.
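As a minimal sketch of the filtering described above (not the authors' implementation), the following constant-velocity Kalman filter smooths one fingertip coordinate stream; the noise values and the α and Δt settings are illustrative assumptions.

```python
import numpy as np

def kalman_smooth(positions, dt=1.0 / 60.0, alpha=1.0,
                  process_noise=1e-3, measure_noise=0.16):
    """Smooth a 1-D sequence of fingertip positions (e.g., the x coordinate).

    positions     : raw measurements from the Leap Motion
    dt            : assumed sampling interval of the device
    alpha         : rate of speed change used in the prediction matrix
    measure_noise : measurement variance, roughly (0.4 mm)^2 for dynamic use
    """
    F = np.array([[1.0, dt], [0.0, alpha]])   # prediction matrix F_t
    H = np.array([[1.0, 0.0]])                # only the position is measured
    Q = process_noise * np.eye(2)             # process noise (assumed)
    R = np.array([[measure_noise]])           # measurement noise R_t

    x = np.array([[positions[0]], [0.0]])     # state X_t = [position, speed]
    P = np.eye(2)                             # covariance Y_t

    smoothed = []
    for z in positions:
        # Prediction step
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step with measurement z (K is the Kalman gain)
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        smoothed.append(float(x[0, 0]))
    return smoothed
```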

D. Feature Extraction

Now, after filtering the original data, we analyze four features acquired from the filtered data. These features are introduced in the rest of this section.

● Coordinate: The x, y, and z coordinates of the hand joints obtained by the Leap Motion.

● Length: The distance from each fingertip to the center of the hand.

● Angle: The angle between the proximal phalanx and the intermediate phalanx of each finger (except the thumb).

● Angular velocity: The rate of change of the joint angles.

1) Coordinate Feature: As shown in Fig. 4(a), this feature set represents the positions of the finger joints in three-dimensional space. The original data take the Leap Motion as the coordinate origin, as shown in Fig. 5. As the hand moves, the obtained data can change considerably, which has a certain impact on the experimental results. For the purpose of eliminating the influence of different coordinate systems, the coordinate origin is changed to the palm center, as shown in Fig. 4(a). Taking the palm as the reference plane, the direction from the palm center to the root of the middle finger is the positive direction of the y-axis. The positive direction of the x-axis is the direction perpendicular to the y-axis, pointing to the right. The z-axis passes through the coordinate origin, perpendicular to this plane.

The positive direction of the y-axis in the new coordinate system can be represented by the following vector:

Similarly, the positive direction of the x-axis in the new coordinate system can be expressed by the following vector:

And the positive direction of the z-axis in the new coordinate system can be expressed by the following vector:

The coordinate representation in the new coordinate system is

Fig. 4. Four different types of features extracted from the Leap Motion.

Fig. 5. The coordinate system of the Leap Motion.

where (x′_i, y′_i, z′_i) represents the new coordinate after coordinate conversion, and i = 1, 2, ..., 20 indexes the points corresponding to the finger joints. Through the above equations, we can obtain new coordinates with the palm center as the origin of the coordinate system. Because each three-dimensional coordinate is an array of length 3, the actual dimension of the coordinate feature is 3 × 20 = 60.
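The coordinate conversion above can be sketched as follows, assuming the raw joint positions, palm center, middle-finger root and palm normal are already available as numpy arrays (the names and the handedness convention are our own choices, not the paper's):

```python
import numpy as np

def to_palm_frame(joints, palm_center, middle_root, palm_normal):
    """Convert 20 joint positions from Leap Motion coordinates to a
    palm-centered frame (y toward the middle-finger root, z along the
    palm normal, x completing the frame in the palm plane).

    joints      : (20, 3) array of joint positions P_i
    palm_center : (3,) array, P_c
    middle_root : (3,) array, position of the root of the middle finger
    palm_normal : (3,) array, normal of the palm plane (assumed roughly
                  perpendicular to the palm-center -> middle-root direction)
    """
    y_axis = middle_root - palm_center
    y_axis /= np.linalg.norm(y_axis)
    z_axis = palm_normal / np.linalg.norm(palm_normal)
    x_axis = np.cross(y_axis, z_axis)        # perpendicular to y, in the palm plane
    x_axis /= np.linalg.norm(x_axis)
    R = np.stack([x_axis, y_axis, z_axis])   # rows are the new basis vectors
    return (joints - palm_center) @ R.T      # (20, 3): 60 values per frame
```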

2) Length Feature: As shown in Fig. 4(b), this feature refers to the length from each fingertip to the center of the palm. The coordinates of the joints collected by the Leap Motion are used to calculate the length information. Since the fingertips are the most variable joints, (31) is used to calculate the distance between the palm center and the fingertips,

where i = 4, 8, 12, 16, 20 represents the points corresponding to the fingertips in Fig. 4(b), and the dimension of the length feature is 5.
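A corresponding sketch for the length feature (the fingertip indices follow Fig. 4(b); the distance is assumed to be the plain Euclidean norm):

```python
import numpy as np

FINGERTIP_IDS = [4, 8, 12, 16, 20]   # i = 4, 8, 12, 16, 20 in Fig. 4(b)

def length_feature(joints, palm_center):
    """Distance from each fingertip to the palm center; returns a 5-D feature.

    joints are indexed from 1 to 20 in the paper, so index i maps to row i-1 here.
    """
    tips = np.asarray([joints[i - 1] for i in FINGERTIP_IDS])
    return np.linalg.norm(tips - palm_center, axis=1)
```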

3) Angle Feature: As shown in Fig. 4(c), this feature represents the angle between the proximal phalanx and the intermediate phalanx of each finger (except the thumb); the angle extracted from the thumb is between the intermediate phalanx and the distal phalanx. The calculation process is as follows:

4) Angular Velocity Feature: As shown in Fig. 4(d), this feature represents the rate of change of the joint angles, as shown in the following equation:

where t is the current time and Δt is the time interval, which depends on the Leap Motion's sampling time. The dimension of the angular velocity feature is 5.
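A sketch of the angle and angular-velocity features, assuming each finger's bone direction vectors are available; the arccos form below is a standard way to compute the joint angle and is not necessarily the paper's exact equation:

```python
import numpy as np

def angle_feature(proximal_dirs, intermediate_dirs):
    """Angle between the proximal and intermediate phalanx of each finger.

    proximal_dirs, intermediate_dirs : (5, 3) arrays of bone direction vectors
    (for the thumb, pass the intermediate and distal directions instead).
    Returns a 5-D feature in radians.
    """
    a = proximal_dirs / np.linalg.norm(proximal_dirs, axis=1, keepdims=True)
    b = intermediate_dirs / np.linalg.norm(intermediate_dirs, axis=1, keepdims=True)
    cos_theta = np.clip(np.sum(a * b, axis=1), -1.0, 1.0)
    return np.arccos(cos_theta)

def angular_velocity_feature(theta_t, theta_prev, dt):
    """Rate of change of the joint angles between consecutive frames (5-D)."""
    return (theta_t - theta_prev) / dt
```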

E. Gesture Prediction

The method in the previous section produces four different features, and each feature represents some information related to the performed gesture. In this section, the LSTM network [24] used for gesture prediction is described in detail. The internal structure of the LSTM is shown in Fig. 6, where x_t denotes the input of the LSTM network and h_t denotes its output. f_t, i_t, and o_t are the forget gate, input gate, and output gate variables, respectively. The subscripts t and t−1 represent the current and previous time steps. c_t and c̃_t are the memory cell state and the memory gate, respectively. σ_lstm and tanh denote the sigmoid and hyperbolic tangent activation functions, as shown in (36) and (37).

    The relevant parameters of the LSTM can be calculated by the following equations:
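In the standard LSTM formulation consistent with the description above, the gate and state updates and the activation functions are:

$$
\begin{aligned}
f_t &= \sigma_{lstm}(W_f[h_{t-1}, x_t] + b_f),\\
i_t &= \sigma_{lstm}(W_i[h_{t-1}, x_t] + b_i),\\
\tilde{c}_t &= \tanh(W_c[h_{t-1}, x_t] + b_c),\\
c_t &= f_t * c_{t-1} + i_t * \tilde{c}_t,\\
o_t &= \sigma_{lstm}(W_o[h_{t-1}, x_t] + b_o),\\
h_t &= o_t * \tanh(c_t),
\end{aligned}
\qquad
\sigma_{lstm}(z)=\frac{1}{1+e^{-z}},\quad
\tanh(z)=\frac{e^{z}-e^{-z}}{e^{z}+e^{-z}}.
$$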

Fig. 6. The internal structure of the LSTM.

Fig. 7. Collecting data for the gesture "paper" using the Leap Motion.

where the subscripts f, i, o, and c refer to the parameters of the forget gate, input gate, output gate and memory cell, respectively. The parameters W_f, W_i, W_c, and W_o denote the weight matrices of the corresponding subscripts. Similarly, b_f, b_i, b_c, and b_o represent the corresponding biases of the LSTM network. The notation * denotes the element-wise product between vectors.

In the process of data collection, the change of gestures can be divided into three stages, as shown in Fig. 7. For a clearer description, we take the process of turning rock into paper as an example to explain these three stages:

1) The Original Stage: As shown in Figs. 7(a) and 7(b), the gestures at this stage are close to the original state, that is, the gesture is similar to rock.

2) The Intermediate Stage: As shown in Figs. 7(c) and 7(d), the gestures at this stage change significantly compared to the original stage, that is, the five fingers clearly show different degrees of openness.

3) The Completion Stage: As shown in Figs. 7(e) and 7(f), the gestures at this stage are close to completion, that is, the gesture tends toward paper.

Since different players perform actions at different speeds, each action contains 2–6 frames. For the purpose of uniformity, the sequence length T of the LSTM is set to 4, that is, 4 frames of data are input into the LSTM network for prediction. This process is shown in Fig. 8. The input layer of the LSTM network consists of the features obtained from the Leap Motion. These features are the coordinate, length, angle and angular velocity features calculated in Section II-D, and their dimensions are 60, 5, 5, and 5, respectively. In addition, the hidden layer of the LSTM network contains 100 nodes. The output of the LSTM network is the result of gesture prediction, with a dimension of 3, that is, rock, scissors and paper. With the LSTM network, we can predict the gestures accurately, and the classification results are sent to a social robot for interaction and reaction.
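The deep learning framework used by the authors is not stated; as an illustrative sketch, a PyTorch model matching the dimensions above (T = 4 frames, 100 hidden units, 3 output classes, and an input size of 70 for the coordinate + length + angle combination) could look like this:

```python
import torch
import torch.nn as nn

class GesturePredictor(nn.Module):
    """LSTM classifier: 4 frames of hand features -> rock / scissors / paper."""
    def __init__(self, input_size=70, hidden_size=100, num_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, T=4, input_size) sequence of per-frame feature vectors
        out, _ = self.lstm(x)
        return self.fc(out[:, -1, :])   # classify from the last time step

model = GesturePredictor()
dummy = torch.randn(8, 4, 70)           # a batch of 8 four-frame sequences
logits = model(dummy)                   # (8, 3) scores for rock, scissors, paper
```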

Fig. 8. The process of the LSTM for predicting gestures.

III. EXPERIMENT

A. Experimental Setup

In this section, the performance and efficiency of the proposed framework are tested. The experiments were carried out on a laptop with an Intel Core i5-6200U CPU. The dynamic gestures of rock, paper and scissors were collected from five different players, and each player repeated each gesture 300 times at fast, medium, and slow speeds, for a total of 4500 data samples. The experimental results of the network trained with the four features and their combinations are compared.

B. Kalman Filter

In Section II-C, the Kalman filter was introduced in detail. In this section, it is verified by an experiment, in which the measured positions are obtained directly from the Leap Motion. The Kalman filter is used to process the original coordinate data to make the processed data closer to the real values. As can be seen from Fig. 9, the processed data are much smoother.

Fig. 9. Data processed by the Kalman filter.

C. Experimental Result

According to the description in Section II-D, we extract the three-dimensional coordinate feature, length feature, angle feature, and angular velocity feature from the filtered data and train on them. Figs. 10 and 11 show the accuracy of the features using the classification algorithm of Section II-E.

Using the three-dimensional positions of the finger joints, the accuracy of gesture prediction is 97.93%. The length feature and the angle feature achieve accuracies of 95.17% and 93.79%, respectively. The angular velocity feature has lower performance, with an accuracy of 79.31%; it is strongly affected by the speed of the player's movement, so on its own it is not sufficient for accurate prediction.

The combination of multiple features enriches the input of the neural network and, in some cases, may improve the prediction performance. As can be seen from Fig. 11, the combination of coordinate, length and angle features achieves the highest accuracy of 99.31%, better than any of the three features alone. These results suggest that different features represent different attributes of the hand and include complementary information.

Fig. 10. The experimental results of the four features.

Fig. 11. The experimental results of the combinations of the four features.

We examine whether the proposed method is able to achieve real-time gesture recognition and prediction. As shown in Figs. 12 and 13, the method proposed in this work can predict the gestures of the finger-guessing game very well. For example, when the player's gesture changes from rock to paper, the proposed method can predict that the player's gesture is paper before all fingers are fully open. In addition, we also verify the prediction results of the proposed method from different angles of the Leap Motion to the hand, as shown in Fig. 14.

D. Application

Fig. 12. The prediction process of turning rock into paper.

Fig. 13. The prediction process of turning rock into scissors.

Fig. 14. Predicted results from different angles of the Leap Motion to the hand.

In order to further prove the effectiveness of the proposed method, the trained network is applied to the humanoid robot NAO, as shown in Fig. 15. The NAO is an autonomous, programmable humanoid robot designed by Aldebaran Robotics [32]. It is 573.2 mm tall and weighs 4.5 kg. It has two cameras, voice recognition and voice synthesis, and is powered by a LiPo battery. In addition, it is equipped with four microphones, two sonar emitters and receivers, two IR emitters and receivers, and three tactile sensors on the top of the head.

In this work, we mainly use the NAO robot's left hand to play the finger-guessing game with the player. As shown in Fig. 16, the NAO robot has only three fingers, and they are linked. Therefore, we define a fully open robot hand as paper, a half-open hand as scissors, and a clenched hand as rock.

Then, the trained model is applied to the NAO robot, and the experimental results are shown in Fig. 17. The Leap Motion is used to predict gestures, and then the computer sends the results to the NAO robot, so that the NAO robot can win or lose the game through some simple judgments.
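As a rough sketch of the robot side (assuming the NAOqi Python SDK; the IP address, speed value, and the mapping from the LHand opening value to the three gestures are our own placeholder choices):

```python
from naoqi import ALProxy  # NAOqi Python SDK (Python 2)

NAO_IP, NAO_PORT = "192.168.1.10", 9559          # placeholder address
motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)

# LHand opening value: 0.0 = clenched (rock), 0.5 = half open (scissors), 1.0 = open (paper)
HAND_OPENING = {"rock": 0.0, "scissors": 0.5, "paper": 1.0}

def respond(predicted_gesture):
    """Show the gesture that beats the player's predicted gesture."""
    winning = {"rock": "paper", "paper": "scissors", "scissors": "rock"}
    reply = winning[predicted_gesture]
    motion.setAngles("LHand", HAND_OPENING[reply], 0.3)  # 0.3 = fraction of max speed
    return reply
```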

Fig. 17. The experimental results with the NAO robot.

IV. CONCLUSION

Fig. 15. Experimental equipment and platform.

Fig. 16. The NAO robot and the rock-paper-scissors gestures.

In this paper, a gesture prediction framework based on the Leap Motion is proposed. In the process of data acquisition by the Leap Motion, jumps or jitter may occur. Therefore, the Kalman filter is used to address these problems. Then, based on the original coordinate features collected by the Leap Motion, we extract three new features, namely, the length feature, angle feature and angular velocity feature. The LSTM network is used to train the model for gesture prediction. In addition, the trained model is applied to the NAO robot to verify the real-time performance and effectiveness of the proposed method.
