
    Service Robot Localization Based on Global Vision and Stereo Vision

    2012-02-07

    YU Qing-xiao (于清曉), YAN Wei-xin (閆維新), FU Zhuang (付 莊), ZHAO Yan-zheng (趙言正)*

    1 State Key Laboratory of Mechanical System and Vibration,Shanghai Jiaotong University,Shanghai 200240,China

    2 State Key Laboratory of Robotics and System,Harbin Institute of Technology,Harbin 150001,China

    Introduction

    Mobile robot localization and object pose estimation in a working environment have been central research activities in mobile robotics. A solution to mobile robot localization needs to address two main problems[1]: the robot must have a representation of the environment, and the robot must have a representation of its belief regarding its pose in this environment. Sensors are the basis for addressing both problems. Many of the sensors available to mobile robots are introduced in Ref.[1], along with their basic principles and performance limitations. Based on this introduction, ultrasonic sensors[2,3], goniometers[4], laser range finders[5,6], and charge coupled device (CCD) cameras[7,8] are the sensors commonly applied in mobile robot localization projects to gather high-precision information for the robot's pose estimation.

    Most mobile robot localization approaches can be broadly classified into three major types: metric, topological, and hybrid. Metric approaches[9,10] are useful when the robot must know its position accurately in terms of metric coordinates. The state of the robot can also be represented in a more qualitative manner by using a topological map[11,12], but these methods confront many difficulties when the environment changes. A vision-based localization and mapping algorithm using the scale-invariant feature transform (SIFT) was applied to mobile robot localization and map building in Ref.[13]. Miro et al.[14] used a binocular vision system to generate the disparity map by feature matching and realized self-localization and self-navigation. Grzyb et al.[15] adopted stereo images and biologically inspired algorithms to accurately estimate the pose, size, and shape of the target object. Greggio et al.[16] applied stereo cameras and a real-time tracking algorithm to realize pattern recognition and evaluate the 3D position of a generic spatial object. Chinellato et al.[17] proposed a visual analysis, based on a computational model of distance and orientation estimation inspired by human visual mechanisms, to extract the features and pose of the object.

    Considering positioning cost and computational efficiency, a hierarchical vision-based localization method is proposed in this paper to provide the absolute coordinates of the robot. The proposed algorithm estimates the real-time coordinates well and achieves ±2 cm positioning accuracy, ensuring that the robot can successfully grab the object.

    1 Hierarchical Positioning Method

    1.1 Global vision-based coarse localization

    The estimation of the mobile robot's pose is a fundamental problem, which can be roughly divided into two classes[13]: methods for keeping track of the robot's pose and methods for global pose estimation. Most of the present research has concentrated on the first class, which assumes that the initial pose of the robot is known[18]. In this study, both methods are used to position the mobile robot in the initial stage. The global vision-based localization method is used to calculate the initial position of the robot and periodically update its current coordinates, and the dead-reckoning method is also used to localize the robot as a supplementary method. In the following part, the global vision-based localization method is explained in detail.

    One monocular camera mounted on the ceiling is used to acquire the global image and transmit it to the robot through the wireless network. The image is first processed to remove lens distortion. Then, the color segmentation method is used to locate the robot in the image and distinguish the color mark on its head in the Hue-Saturation-Value (HSV) color space. After the color mark is distinguished and extracted, its coordinates can be used to calculate the coordinates of the robot, according to the global vision model shown in Fig.1.
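
The HSV thresholding step above can be sketched as follows. The thresholds, the pixel-list input format, and the function name are illustrative assumptions (the paper does not specify them; a real implementation would operate on whole image arrays):

```python
import colorsys

def find_color_mark(pixels, hue_tol=0.05, s_min=0.5, v_min=0.3):
    """Return the centroid of pixels whose HSV values match a red color
    mark (hue within hue_tol of 0, wrapping around 1). Each pixel is an
    illustrative (x, y, r, g, b) tuple with 8-bit channels."""
    hits = []
    for x, y, r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        # Red hue wraps around 0/1, so accept both ends of the hue circle.
        if (h <= hue_tol or h >= 1.0 - hue_tol) and s >= s_min and v >= v_min:
            hits.append((x, y))
    if not hits:
        return None
    cx = sum(p[0] for p in hits) / len(hits)
    cy = sum(p[1] for p in hits) / len(hits)
    return cx, cy
```

The centroid of the thresholded pixels then serves as the mark's image coordinates for the geometric model of Fig.1.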

    In Fig.1, (0, 0) denotes the origin of the camera coordinate system; H_C denotes the height of the camera relative to the floor; H_R denotes the height of the robot; (X_RC, Y_RC) is the center coordinate of the robot; (X_HC, Y_HC) is the center coordinate of the color mark.

    According to the geometric relationship, the following equation can be obtained:

    Then, the following results can be calculated as:
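
The equations themselves are not reproduced in this excerpt. From the similar-triangle geometry of Fig.1, Eqs. (1) and (2) presumably take the following form (a reconstruction, consistent with the worked example in Section 2.2, where (−68.50, −311.33) maps to (−39.14, −177.90), i.e., a ratio of (H_C − H_R)/H_C ≈ 0.571):

```latex
\frac{X_{HC}}{X_{RC}} = \frac{Y_{HC}}{Y_{RC}} = \frac{H_C}{H_C - H_R} \tag{1}
```

```latex
X_{RC} = \frac{H_C - H_R}{H_C}\, X_{HC}, \qquad
Y_{RC} = \frac{H_C - H_R}{H_C}\, Y_{HC} \tag{2}
```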

    Fig.1 The global vision camera model

    From Eq.(2), we know that (X_RC, Y_RC) depends on the values of H_C, H_R, and (X_HC, Y_HC). But H_C and H_R are invariant in this study, so (X_RC, Y_RC) depends only on (X_HC, Y_HC).

    Four coordinate systems are under consideration: the global coordinate system (GCS), the robot coordinate system (RCS), the sensor coordinate system (SCS), and the virtual global coordinate system (VGCS)[19]. So, the appropriate transformation between VGCS and SCS should be acquired to calculate the absolute coordinates of the robot. According to the geometric relationship in Fig.1, the homogeneous transformation matrix T between OXYZ and O′X′Y′Z′ can be obtained as:

    Then, the coordinates of the robot in VGCS can be calculated as:

    In this study, VGCS coincides with GCS except for a difference in scale. So, the absolute coordinates of the robot can be obtained as:

    where K_height and K_width are the scale factors relative to the VGCS; m and n are the shift distances of VGCS relative to SCS.

    In this way, the robot can determine its initial position wherever it is placed and periodically update its current coordinates. Furthermore, real-time positioning of the robot can be realized in the indoor environment with the help of the dead-reckoning method.
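
The coarse-localization arithmetic of this subsection can be sketched as below. The similar-triangle projection follows Fig.1; the scale-and-shift form of the SCS-to-GCS conversion and all parameter names are assumptions (the intermediate equations are not reproduced in this excerpt), and the heights 2.8 m and 1.2 m in the usage note are illustrative values chosen only because their ratio matches the worked example in Section 2.2:

```python
def robot_floor_coords(mark_xy, h_cam, h_robot):
    """Project the color-mark coordinates onto the floor plane using the
    similar-triangle geometry of Fig. 1: the robot center lies along the
    camera ray at the fraction (H_C - H_R) / H_C of the mark's offset."""
    k = (h_cam - h_robot) / h_cam
    return mark_xy[0] * k, mark_xy[1] * k

def to_global(coords_scs, k_width, k_height, m, n):
    """Convert SCS coordinates to absolute GCS coordinates via scale
    factors and the VGCS shift distances m, n. The shift-then-scale
    order is an assumption."""
    x, y = coords_scs
    return k_width * (x + m), k_height * (y + n)
```

With h_cam = 2.8 and h_robot = 1.2, `robot_floor_coords((-68.50, -311.33), 2.8, 1.2)` reproduces the (−39.14, −177.90) figures of Section 2.2.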

    1.2 Stereo vision-based precise localization

    When the service robot moves into the area where the object is placed under the guidance of the global vision, the binocular vision-based localization method is adopted to provide the high positioning accuracy needed to successfully grab the object for the customer. In this study, color segmentation and shape-based matching[20] are separately applied to detect the object, and the extracted results are fused to obtain the accurate object region. The color segmentation method has been discussed. Below, we describe the shape-based matching method.

    The advantage of shape-based matching is its great robustness and flexibility. Instead of using the gray values, features along contours are extracted and used for model generation and matching. The method is invariant to changes in illumination and variations of object gray values, and it also allows the object to be rotated and scaled. The process of shape-based matching is divided into two distinct phases: the training phase and the recognition phase. In the first phase, the template model is created and trained. In the second phase, the model is used to find and localize the object in the image. The basic concept of shape-based image matching is shown in Fig.2.
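
The train-then-recognize structure can be sketched with a simplified shape descriptor. Note that this uses Hu moment invariants as a stand-in for the (unspecified) contour features of Ref.[20], so it illustrates the two-phase structure rather than the paper's exact method:

```python
import numpy as np

def hu_moments(mask):
    """First two Hu invariants of a binary shape mask (translation-,
    scale-, and rotation-invariant), a simplified contour-shape
    descriptor used here in place of the trained template features."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)
    xbar, ybar = xs.mean(), ys.mean()
    def mu(p, q):                      # central moments
        return (((xs - xbar) ** p) * ((ys - ybar) ** q)).sum()
    def eta(p, q):                     # normalized central moments
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    return np.array([n20 + n02, (n20 - n02) ** 2 + 4 * n11 ** 2])

def shape_distance(model_mask, scene_mask):
    """Recognition phase: compare the trained model's invariants with a
    candidate region's invariants (smaller distance = better match)."""
    return float(np.abs(hu_moments(model_mask) - hu_moments(scene_mask)).sum())
```

A translated copy of the model shape scores a distance of (numerically) zero, while a differently proportioned shape scores higher, which is the invariance property the paragraph describes.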

    Fig.2 The basic concept of shape-based image matching

    Using the above methods, the regions of interest (ROI) are separately obtained. Then, the following information fusion method is adopted to find the expected region of the object. Suppose that image regions A are obtained using the color segmentation method and image regions B are obtained using the shape-based matching method. The center points of image regions A and B are all extracted and compared one by one. If the difference between the center coordinates of one A region and the center coordinates of one B region is less than the default constant ε, the image area belongs to the expected regions of interest. Otherwise, the image area is a falsely detected region. Then, the expected regions of interest can be obtained as:

    where (x_i, y_i) are the center coordinates of the i-th sub-region of image regions A; (x_j, y_j) are the center coordinates of the j-th sub-region of image regions B.
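
A minimal sketch of this fusion rule follows. Whether the comparison uses per-axis differences or Euclidean distance is not specified in the excerpt; per-axis comparison is assumed here:

```python
def fuse_regions(centers_a, centers_b, eps=5.0):
    """Keep the regions whose color-segmentation center (from A) lies
    within eps of some shape-matching center (from B); the rest are
    treated as false detections, per the fusion rule of Eq. (6)."""
    fused = []
    for (xi, yi) in centers_a:
        for (xj, yj) in centers_b:
            if abs(xi - xj) < eps and abs(yi - yj) < eps:
                fused.append((xi, yi))
                break                   # one confirming B region suffices
    return fused
```

A region detected by only one of the two methods is thus discarded, which is what suppresses the error-detected regions.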

    According to Eq.(6), the expected regions can be obtained to find and match the object. After the plate region is obtained, the normalized cross-correlation (NCC) matching method[21] is adopted to obtain the depth image. Then, the 3D coordinates of the color mark, which is on the object, can be calculated using the binocular vision system. To localize the mobile robot, the homogeneous transformation matrix between the binocular coordinate system and RCS should first be obtained. Figure 3 shows the relationship between the binocular coordinate system and RCS.
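
The NCC score and a disparity search along the epipolar line can be sketched as below. The window size, search range, and function names are illustrative; a real stereo matcher operates on 2-D windows and adds the left-right consistency check mentioned in Section 2.3:

```python
import numpy as np

def ncc(window_a, window_b):
    """Normalized cross-correlation between two equally sized patches:
    the similarity score used for stereo matching (1.0 = identical up
    to gain and offset)."""
    a = window_a - window_a.mean()
    b = window_b - window_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_disparity(left_row, right_row, x, win, max_d):
    """Scan candidate disparities along one epipolar line and keep the
    one with the highest NCC score (a minimal 1-D sketch)."""
    ref = left_row[x:x + win]
    scores = [ncc(ref, right_row[x - d:x - d + win])
              for d in range(min(max_d, x) + 1)]
    return int(np.argmax(scores))
```

The winning disparity per pixel yields the depth image; depth then follows from the calibrated baseline and focal length.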

    Fig.3 The diagram of binocular coordinate system and RCS

    In Fig.3, α and θ are the rotation angles of the head; O′X′Y′Z′ denotes RCS, and O″X″Y″Z″ denotes the binocular coordinate system. When α and θ are equal to zero, O″X″Y″Z″ is shifted −6 cm in the X-axis direction relative to O′X′Y′Z′. According to this information, the homogeneous transformation matrix T between O′X′Y′Z′ and O″X″Y″Z″ can be obtained as:

    Then, the coordinates of the color mark in RCS can be calculated as:

    where [x′ y′ z′]^T is the location of the color mark in RCS; [x″ y″ z″]^T is the location of the color mark in the binocular coordinate system.
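
Equation (8) can be sketched as below. The axis assignment of α and θ in the rotation is an assumption (only the −6 cm X shift at α = θ = 0 is stated in the text), so the matrix is illustrative rather than the paper's exact T:

```python
import numpy as np

def head_transform(alpha, theta, shift_x=-6.0):
    """Homogeneous transform from the binocular frame O''X''Y''Z'' to
    RCS O'X'Y'Z'. Axis assignment (alpha about X, theta about Z) is an
    assumption; only the -6 cm X shift at alpha = theta = 0 is stated."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    rot_x = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    rot_z = np.array([[ct, -st, 0], [st, ct, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = rot_z @ rot_x
    T[:3, 3] = [shift_x, 0.0, 0.0]
    return T

def mark_in_rcs(T, p_binocular):
    """Apply Eq. (8): [x' y' z' 1]^T = T [x'' y'' z'' 1]^T."""
    p = np.append(np.asarray(p_binocular, dtype=float), 1.0)
    return (T @ p)[:3]
```

With α = θ = 0, the mark coordinates [1.4 0.2 118.9]^T of Section 2.3 map to an X component of 1.4 − 6 = −4.6, matching the x′ value reported there.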

    In addition to obtaining the 3D coordinates of the color mark, the angle of the object relative to the robot should also be calculated by extracting and filtering the edge lines of the object. The detailed process is as follows. First, the region of the object is processed using the expansion algorithm and the Canny operator. Then, the extracted edge lines are filtered using a threshold filter to obtain the available edge lines which can denote the angle of the object. Next, the selected edge lines are filtered using a mean filter, which yields the angle of the object. Finally, the angle of the object relative to the robot, φ_object, can be reckoned as follows.

    where φ denotes the detected angle of the object in the image and θ is the rotation angle of the neck joint.
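
The two-stage angle filtering described above can be sketched as follows (the outlier threshold and the use of the median as the reference are assumptions, since the filter parameters are not given):

```python
def object_angle(line_angles, spread=10.0):
    """Threshold-filter the detected edge-line angles (drop outliers far
    from the median), then mean-filter the survivors into a single
    object angle, in degrees."""
    angles = sorted(line_angles)
    median = angles[len(angles) // 2]
    kept = [a for a in angles if abs(a - median) <= spread]
    return sum(kept) / len(kept)
```

A spurious edge at a very different orientation is discarded by the threshold stage, so it does not bias the averaged angle.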

    After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. The robot periodically localizes the position of the color mark using the method above and adjusts its pose until precise positioning relative to the color mark is achieved. The location-based visual servo control system is shown in Fig.4.
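
One iteration of a location-based visual servo of this kind might be sketched as a proportional controller. The gains, the pose/target representation, and the function name are illustrative assumptions, not the paper's controller:

```python
import math

def servo_step(pose, target, k_v=0.5, k_w=1.0):
    """One control iteration: proportional forward velocity and heading
    rate computed from the measured offset of the color mark."""
    x, y, heading = pose
    dx, dy = target[0] - x, target[1] - y
    distance = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    v = k_v * distance                  # slow down as the mark nears
    w = k_w * (bearing - heading)       # steer toward the mark
    return v, w
```

Repeating this step with fresh mark positions, as the text describes, drives both the distance and the heading error toward zero.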

    Fig.4 The chart of the location-based visual servo control system

    2 Experiments and Result Analysis

    2.1 Experiment setup

    The experimental platform is a restaurant robot, as shown in Fig.5, which is equipped with one binocular camera. The binocular camera is a Bumblebee2 made by Point Grey Research. A global camera mounted on the ceiling is used to obtain the whole environmental image. The global images, with 768×576 pixels, are sent to the mobile robot through the wireless Ethernet. The binocular camera mounted on the robot head sends stereo images with 512×384 pixels to the robot through the IEEE 1394 bus. The binocular camera was calibrated before the experiment.

    The complete movement of the service robot can be divided into two parts: coarse localization and precise localization. In the following parts, we explain the experimental results of the complete movement from these two aspects.

    Fig.5 The photo of the restaurant service robot

    2.2 Global vision-based coarse localization

    In this part, the global vision localization method is explained through the following example. The mobile robot receives the image of the restaurant from the global camera through the wireless Ethernet. To acquire the available values from the image, we should first rectify the raw image. Figure 6 is the rectified global image obtained from the camera. To localize the robot in the image, we should find the ROI and detect the color mark on the robot head using the color segmentation method. The result of the image processing is shown in Fig.7.

    After the location of the robot is detected in the image, the location of the robot relative to the pinhole camera coordinate system should be calculated. We know the size of the picture in advance, and the height of the camera relative to the floor and the height of the robot are invariant and also known in advance. Because the position and the parameters of the global camera are invariant, the scale factors between one pixel and one centimeter in the X-axis and Y-axis directions can be acquired. The parameters used to calculate the location of the robot are all shown in Table 1.

    Table 1 The used parameters in this experiment

    In Fig.7, the origin of the picture coordinate system is in the top left-hand corner of the picture, and the coordinates of the color mark in the picture coordinate system have been calculated. According to Fig.1, we obtain the transformed coordinates, (X′_HC, Y′_HC), in the pinhole coordinate system; the result is (−68.50, −311.33). After (X′_HC, Y′_HC) is obtained, the robot can be positioned in the image as (−39.14, −177.90).

    Next, the real coordinates of the robot in GCS should be reckoned. The origin of GCS is in the top right-hand corner of the carpet array shown in Fig.7, and the coordinates of that corner are (216, 237) in the pinhole coordinate system. So, the coordinates of the robot in GCS are calculated as

    Many experimental results show that the accuracy of global vision positioning is within ±7 cm. In this study, the coordinates of the robot are updated every 100 ms to make sure that the robot can obtain its real-time coordinates.

    2.3 Stereo vision-based precise localization

    When the robot moves into the area where the object is placed under the guidance of the global vision, the stereo vision-based localization method is adopted to provide the high positioning accuracy. The images acquired from the binocular camera are first rectified using the image correction algorithm to prepare the available images for the next process. The rectified image from the stereo vision during the robot movement is shown in Fig.8.

    Fig.8 The rectified stereo vision image

    In this study, the image taken by the right camera is used as the reference image from which the ROI is extracted to realize the object feature matching and obtain the disparity map of the object. The object region is detected and extracted using the color segmentation and shape-based matching methods. The procedure of color segmentation is as follows. First, the HSV color space is obtained through image color space conversion. Then, appropriate thresholds are set on the Hue and Saturation components. Finally, the plate region is extracted from the reference image. The plate region can also be found in the image using the shape-based matching method. The center coordinates of each extracted region of interest are first obtained; then, the expected region can be determined according to Eq.(6). The plate region in Fig.9 is the expected region obtained using the methods above.

    Fig.9 The expected region of interest

    Next, we should find the characteristic sign in the expected region and match the images using the NCC algorithm. In the process, the consistency test between the left and right images is also used to find accurate matches. The obtained depth image is shown in Fig.10.

    Fig.10 The obtained depth image using NCC

    From Fig.10, we know that the depth value of the characteristic sign is 40.07. According to the binocular vision system, the coordinates of the characteristic mark can be calculated as [1.4 0.2 118.9]^T. In addition, using the encoders, we know that α = 45° and θ = 0°. Then, using Eq.(7), the homogeneous transformation matrix T between RCS and the binocular vision coordinate system is denoted as follows:

    Finally, according to Eq.(8), the result can be reckoned as [x′ y′ z′] = [−4.6 88.5 83.9]. In addition, the angle of the object relative to the robot should be calculated to determine the position of the color mark. In the experiment, the edge lines of the object are first extracted using the Canny operator. Then, the extracted edge lines are filtered using the threshold filter to find the available lines. Finally, the selected lines are filtered using the mean filter to calculate the angle of the object. The lines obtained using this method are shown in Fig.11. From Fig.11, we know that the angle of the object is −3.865°.

    Fig.11 The extracted edge lines of the object

    After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. During the movement, the robot constantly repeats the positioning process above and adjusts its pose until the precise positioning is finally realized.

    We performed the whole movement sixty times at a velocity of 0.15 m/s in our laboratory, and the initial position of the robot was random and unknown each time. The service robot could always realize self-localization with positioning accuracy within ±2 cm and successfully grab the object for the customers within 3 min. The statistical data are shown in Table 2.

    Table 2 The application and positioning accuracy of the hierarchical localization method

    3 Conclusions and Future Work

    In this paper, a mobile robot localization method based on visual feedback is proposed and applied to a restaurant service robot. The whole positioning process is divided into two stages: coarse positioning and precise positioning. The hierarchical positioning method provides different positioning accuracies in the different stages. In the coarse positioning stage, the robot periodically obtains its current coordinates using the global vision-based localization method. In the precise positioning stage, the robot obtains higher localization precision using the binocular vision-based localization method. Finally, the robot can successfully grasp the plate on the table. Many experiments verify that the proposed algorithm has good localization performance and achieves ±2 cm positioning accuracy.

    Currently, research on improving the positioning accuracy and speed of the algorithm is under way. Our future work will also focus on improving robustness against interference in more complex environments.

    [1]Siegwart R,Nourbakhsh I.Introduction to Autonomous Mobile Robots[M].Cambridge:The MIT Press,2004:121-156.

    [2]Choi B S,Lee J J.Mobile Robot Localization Scheme Based on RFID and Sonar Fusion System[C].IEEE International Symposium on Industrial Electronics,Seoul,Korea,2009:1035-1040.

    [3]Choi B S,Lee J J.Localization of a Mobile Robot Based on an Ultrasonic Sensor Using Dynamic Obstacles[J].Artificial Life and Robotics,2008,12(1/2):280-283.

    [4]Bonnifait P,Garcia G.Design and Experimental Validation of an Odometric and Goniometric Localization System for Outdoor Robot Vehicles[J].IEEE Transactions on Robotics and Automation,1998,14(4):541-548.

    [5]Balaguer B,Carpin S,Balakirsky S. Towards Quantitative Comparisons of Robot Algorithms:Experiences with SLAM in Simulation and Real World Systems[C].IEEE/RSJ International Conference on Intelligent Robots and Systems,California,USA,2007:1-7.

    [6]Lin H H,Tsai C C.Laser Pose Estimation and Tracking Using Fuzzy Extended Information Filtering for an Autonomous Mobile Robot[J].Journal of Intelligent and Robotic Systems,2008,53(2):119-143.

    [7]Munguia R,Grau A.Monocular SLAM for Visual Odometry[C].IEEE International Symposium on Intelligent Signal Processing,Madrid,Spain,2007:1-6.

    [8]Lin H Y,Lin J H,Wang M L.A Visual Positioning System for Vehicle or Mobile Robot Navigation[C].Proceedings of the 8th International IEEE Conference on Intelligent Transportation Systems,Vienna,Austria,2005:73-78.

    [9]Blanco J L,Gonzalez J,Fernandez-Madrigal J A.Consistent Observation Grouping for Generating Metric-Topological Maps that Improves Robot Localization[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:818-823.

    [10]Guo Y,Xu X H.Color Landmark Design for Mobile Robot Localization[C].IMACS Multiconference on Computational Engineering in Systems Applications,Beijing,China,2006:1868-1874.

    [11]Kwon T B,Yang J H,Song J B,et al.Efficiency Improvement in Monte Carlo Localization through Topological Information[C].Proceeding of 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems,Beijing,China,2006:424-429.

    [12]Tapus A,Siegwart R.A Cognitive Modeling of Space Using Fingerprints of Places for Mobile Robot Navigation[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:1188-1193.

    [13]Borenstein J,Everett B,Feng L.Navigating Mobile Robots:Systems and Techniques[M].Wellesley,USA:A.K.Peters Ltd.,1996:103-150.

    [14]Miro J V,Zhou W Z,Dissanayake G.Towards Vision Based Navigation in Large Indoor Environments[C]. IEEE International Conference on Intelligent Robots and Systems,Beijing,China,2006:2096-2102.

    [15]Grzyb B,Chinellato E,Morales A,et al.A 3D Grasping System Based on Multimodal Visual and Tactile Processing[J].Industrial Robot:An International Journal,2009,36(4):365-369.

    [16]Greggio N,Bernardino A,Laschi C,et al.Real-Time 3D Stereo Tracking and Localizing of Spherical Objects with the iCub Robotic Platform [J].Journal of Intelligent and Robotic Systems,2011,63(3/4):417-446.

    [17]Chinellato E,Grzyb B J,del Pobil A P.Brain Mechanisms for Robotic Object Pose Estimation[C].International Joint Conference on Neural Networks,Hong Kong,China,2008:3268-3275.

    [18]Lang H X,Wang Y,de Silva C W.Mobile Robot Localization and Object Pose Estimation Using Optical Encoder,Vision and Laser Sensors[C].Proceedings of the IEEE International Conference on Automation and Logistics,Qingdao,China,2008:617-622.

    [19]Yu Q X,Yuan C,Fu Z,et al.Research of the Localization of Restaurant Service Robot[J].International Journal of Advanced Robotic Systems,2010,7(3):227-238.

    [20]El Munim H A,Farag A A.Shape Representation and Registration Using Vector Distance Functions[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,Minneapolis,USA,2007:1-8.

    [21]Sun Z X,Wu Q.TS201 Based Fast Algorithm of Normalized Cross-Correlation[J].Modern Electronics Technique,2010,33(10):125-127.(in Chinese)
