
    Service Robot Localization Based on Global Vision and Stereo Vision

2012-02-07 07:46:48

YU Qing-xiao (于清曉), YAN Wei-xin (閆維新), FU Zhuang (付 莊), ZHAO Yan-zheng (趙言正)*

    1 State Key Laboratory of Mechanical System and Vibration,Shanghai Jiaotong University,Shanghai 200240,China

    2 State Key Laboratory of Robotics and System,Harbin Institute of Technology,Harbin 150001,China

    Introduction

Mobile robot localization and object pose estimation in a working environment have been central research activities in mobile robotics. A solution to mobile robot localization must address two main problems[1]: the robot must have a representation of the environment, and the robot must have a representation of its belief regarding its pose in this environment. Sensors are the basis for addressing both problems. Many of the sensors available to mobile robots are introduced in Ref. [1], together with their basic principles and performance limitations. Among them, ultrasonic sensors[2,3], goniometers[4], laser range finders[5,6], and charge coupled device (CCD) cameras[7,8] are commonly applied in mobile robot localization projects to gather high-precision information for the robot's pose estimation.

Most mobile robot localization approaches can be broadly classified into three major types: metric, topological, and hybrid. Metric approaches[9,10] are useful when the robot needs to know its position accurately in terms of metric coordinates. The state of the robot can also be represented in a more qualitative manner, using a topological map[11,12]. However, these methods face many difficulties when the environment changes. A vision-based localization and mapping algorithm using scale-invariant feature transform (SIFT) features was applied to mobile robot localization and map building in Ref. [13]. Miro et al.[14] used a binocular vision system to generate the disparity map by feature matching and realized self-localization and self-navigation. Grzyb et al.[15] adopted stereo images and biologically inspired algorithms to accurately estimate the pose, size, and shape of the target object. Greggio et al.[16] applied stereo cameras and a real-time tracking algorithm to realize pattern recognition and evaluate the 3D position of a generic spatial object. Chinellato et al.[17] proposed a visual analysis, based on a computational model of distance and orientation estimation inspired by human visual mechanisms, to extract the features and pose of the object.

Considering positioning cost and computational efficiency, a hierarchical localization method based on vision is proposed in this paper to provide absolute coordinates. The proposed algorithm estimates the real-time coordinates well and achieves a positioning accuracy of up to ±2 cm, ensuring that the robot can successfully grab the object.

    1 Hierarchical Positioning Method

    1.1 Global vision-based coarse localization

The estimation of the mobile robot's pose is a fundamental problem, which can be roughly divided into two classes[13]: methods for keeping track of the robot's pose and methods for global pose estimation. Most of the present research has concentrated on the first class, which assumes that the initial pose of the robot is known[18]. In this study, both methods are used to position the mobile robot in the initial stage. The global vision-based localization method is used to calculate the initial position of the robot and periodically update its current coordinates, and the dead-reckoning method is also used to localize the robot as a supplementary method. In the following part, the global vision-based localization method is explained in detail.

One monocular camera mounted on the ceiling is used to acquire and transmit the global image to the robot through the wireless network. The image is first processed to remove lens distortion. Then, a color segmentation method is used to locate the robot in the image and distinguish the color mark on its head in the Hue-Saturation-Value (HSV) color space. After the color mark is distinguished and extracted, its coordinates can be used to calculate the coordinates of the robot, according to the global vision model shown in Fig. 1.
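As a minimal illustration of this segmentation step, the sketch below thresholds pixels in HSV space using only the standard library; the threshold values are placeholders, not the ones used on the real robot.

```python
import colorsys

def mark_mask(pixels, hue_range, s_min, v_min):
    """Flag pixels whose hue lies in hue_range (0..1) and whose
    saturation and value clear the given minimums -- a minimal
    stand-in for the HSV color segmentation of the color mark."""
    h_lo, h_hi = hue_range
    mask = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        mask.append(h_lo <= h <= h_hi and s >= s_min and v >= v_min)
    return mask

# A saturated red mark pixel is kept; a gray floor pixel is rejected.
print(mark_mask([(220, 30, 30), (128, 128, 128)], (0.0, 0.1), 0.5, 0.3))
```

In practice the mask would be computed over the whole rectified image and the mark's center taken as the centroid of the flagged pixels.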

In Fig. 1, (0, 0) denotes the origin of the camera coordinate system; HC denotes the height of the camera relative to the floor; HR denotes the height of the robot; (XRC, YRC) is the center coordinate of the robot; and (XHC, YHC) is the center coordinate of the color mark.

According to the geometric relationship (similar triangles formed by the camera center, the color mark, and the floor), the following equation can be obtained:

XHC/XRC = YHC/YRC = HC/(HC − HR) (1)

Then, the following results can be calculated as:

XRC = (HC − HR)XHC/HC, YRC = (HC − HR)YHC/HC (2)

    Fig.1 The global vision camera model

From Eq. (2), we know that (XRC, YRC) depends on the values of HC, HR, and (XHC, YHC). Since HC and HR are invariant in this study, (XRC, YRC) depends only on (XHC, YHC).
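This relationship amounts to scaling the mark's coordinates by (HC − HR)/HC. A small sketch, with the camera and robot heights as assumed illustrative values (the actual values used in the experiments are listed in Table 1):

```python
def robot_center(x_hc, y_hc, h_c, h_r):
    """Similar triangles: the color mark sits h_r above the floor, so its
    projected coordinates must be scaled by (h_c - h_r)/h_c to recover
    the robot's floor position."""
    scale = (h_c - h_r) / h_c
    return x_hc * scale, y_hc * scale

# Assumed heights in cm; the mark coordinates are those of Section 2.2.
x, y = robot_center(-68.50, -311.33, 350.0, 150.0)
# x is about -39.14 and y about -177.90
```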

Four coordinate systems are under consideration: the global coordinate system (GCS), the robot coordinate system (RCS), the sensor coordinate system (SCS), and the virtual global coordinate system (VGCS)[19]. The appropriate transformation between VGCS and SCS should therefore be acquired to calculate the absolute coordinates of the robot. According to the geometric relationship in Fig. 1, the homogeneous transformation matrix T between OXYZ and O′X′Y′Z′ can be obtained as:

    Then,the coordinates of the robot in VGCS can be calculated as:

In this study, VGCS coincides with GCS except for a difference in scale. So, the absolute coordinates of the robot can be obtained as:

where Kheight and Kwidth are the scales relative to the VGCS, and m and n are the shift distances of VGCS relative to SCS.

In this way, the robot can determine its initial position wherever it is placed and periodically update its current coordinates. Furthermore, real-time positioning of the robot can be realized in an indoor environment with the help of the dead-reckoning method.
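The paper does not detail its dead-reckoning model; a generic differential-drive update from wheel displacements would look like the following sketch (wheel_base and the encoder readings are hypothetical):

```python
import math

def dead_reckon(x, y, theta, d_left, d_right, wheel_base):
    """One odometry update: advance the pose (x, y, theta) by the wheel
    displacements d_left and d_right measured since the last update."""
    d = (d_left + d_right) / 2.0               # distance moved by the center
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    x += d * math.cos(theta + d_theta / 2.0)   # midpoint heading approximation
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

Between two global-vision fixes the robot would apply this update at each encoder tick; the periodic fix then bounds the accumulated drift.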

    1.2 Stereo vision-based precise localization

When the service robot moves into the area where the object is placed, under the guidance of the global vision, the binocular vision-based localization method is adopted to provide the high positioning accuracy needed to successfully grab the object for the customer. In this study, color segmentation and shape-based matching[20] are separately applied to detect the object, and the extracted results are fused to locate the object accurately. The color segmentation method has been discussed above; below, we describe the shape-based matching method.

The advantage of shape-based matching is its great robustness and flexibility. Instead of using gray values, features along contours are extracted and used for model generation and matching. The method is invariant to changes in illumination and variations in object gray values, and it also allows the object to be rotated and scaled. The process of shape-based matching is divided into two distinct phases: the training phase and the recognition phase. In the first phase, the template model is created and trained. In the second phase, the model is used to find and localize the object in the image. The basic concept of shape-based image matching is shown in Fig. 2.

    Fig.2 The basic concept of shape-based image matching
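Shape-based matching of this kind typically scores a candidate pose by comparing gradient directions along the model contour with those found in the image; because only directions are used, a global illumination change leaves the score untouched. A minimal sketch of such a score (the paper does not give its exact formula):

```python
import math

def shape_match_score(model_dirs, image_dirs):
    """Mean cosine of the angle between corresponding model and image
    gradient directions along the contour: 1.0 for a perfect match,
    near 0.0 for unrelated edges."""
    pairs = list(zip(model_dirs, image_dirs))
    return sum(math.cos(m - i) for m, i in pairs) / len(pairs)

dirs = [0.0, math.pi / 4, math.pi / 2]
print(shape_match_score(dirs, dirs))  # identical contours -> 1.0
```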

Using the methods mentioned above, regions of interest (ROI) are separately obtained. Then, the following information fusion method is adopted to find the expected region of the object. Suppose that image regions A are obtained using the color segmentation method and image regions B are obtained using the shape-based matching method. The center points of image regions A and B are all extracted and compared one by one. If the difference between the center coordinates of a region in A and the center coordinates of a region in B is less than the default constant ε, the image area belongs to the expected regions of interest. Otherwise, the image area is an erroneously detected region. Then, the expected regions of interest can be obtained as:

where (xi, yi) are the center coordinates of the i-th sub-region of image regions A; (xj, yj) are the center coordinates of the j-th sub-region of image regions B.
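The fusion rule can be sketched as follows; comparing the coordinate differences per axis and averaging the matched centers are assumptions, since Eq. (6) itself is not reproduced here:

```python
def fuse_regions(centers_a, centers_b, eps):
    """Keep only regions where a color-segmentation center (A) and a
    shape-based-matching center (B) agree to within eps; the fused
    center is taken as their average."""
    fused = []
    for xa, ya in centers_a:
        for xb, yb in centers_b:
            if abs(xa - xb) < eps and abs(ya - yb) < eps:
                fused.append(((xa + xb) / 2.0, (ya + yb) / 2.0))
    return fused

# Only the first A region is confirmed by a B region.
print(fuse_regions([(10, 10), (50, 50)], [(11, 9)], 3.0))  # [(10.5, 9.5)]
```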

According to Eq. (6), the expected regions can be obtained to find and match the object. After the plate region is obtained, the normalized cross-correlation (NCC) matching method[21] is adopted to obtain the depth image. Then, the 3D coordinates of the color mark, which is on the object, can be calculated using the binocular vision system. To localize the mobile robot, the homogeneous transformation matrix between the binocular coordinate system and RCS should first be obtained. Figure 3 shows the relationship between the binocular coordinate system and RCS.
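The NCC score itself is standard; a self-contained sketch over two flattened pixel windows:

```python
def ncc(window_a, window_b):
    """Normalized cross-correlation of two equal-size pixel windows:
    +1 when the windows differ only by gain and offset, which makes the
    score robust to brightness differences between the two cameras."""
    n = len(window_a)
    mean_a = sum(window_a) / n
    mean_b = sum(window_b) / n
    da = [a - mean_a for a in window_a]
    db = [b - mean_b for b in window_b]
    num = sum(p * q for p, q in zip(da, db))
    den = (sum(p * p for p in da) * sum(q * q for q in db)) ** 0.5
    return num / den if den else 0.0

print(ncc([10, 20, 30, 40], [110, 120, 130, 140]))  # offset only -> 1.0
```

The disparity at a pixel is the horizontal shift of the right-image window that maximizes this score; depth then follows from the calibrated baseline and focal length.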

    Fig.3 The diagram of binocular coordinate system and RCS

In Fig. 3, α and θ are the rotation angles of the head; O′X′Y′Z′ denotes RCS, and O″X″Y″Z″ denotes the binocular coordinate system. When α and θ are equal to zero, O″X″Y″Z″ is shifted by −6 cm along the X-axis relative to O′X′Y′Z′. According to this information, the homogeneous transformation matrix T between O′X′Y′Z′ and O″X″Y″Z″ can be obtained as:

    Then,the coordinates of the color mark in RCS can be calculated as :

where [x′ y′ z′]T is the location of the color mark in RCS; [x″ y″ z″]T is the location of the color mark in the binocular coordinate system.
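Since the matrix T of Eq. (7) is not reproduced here, the sketch below only illustrates the structure of such a transform: rotate by the head angles, then apply the 6 cm X-axis offset from Fig. 3. The rotation axes and their order are assumptions.

```python
import math

def mark_in_rcs(p_bin, alpha, theta, x_offset=-6.0):
    """Map a point from the binocular frame into RCS: pitch by alpha,
    yaw by theta, then shift along X (offsets in cm, angles in radians).
    The axis assignment is illustrative, not the paper's Eq. (7)."""
    x, y, z = p_bin
    y, z = (y * math.cos(alpha) - z * math.sin(alpha),
            y * math.sin(alpha) + z * math.cos(alpha))   # pitch about X
    x, y = (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))   # yaw about Z
    return x + x_offset, y, z

# With the head at rest (alpha = theta = 0) only the offset remains.
print(mark_in_rcs((1.4, 0.2, 118.9), 0.0, 0.0))
```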

In addition to obtaining the 3D coordinates of the color mark, the angle of the object relative to the robot should also be calculated by extracting and filtering the edge lines of the object. The detailed process is as follows. First, the region of the object is processed using the expansion (dilation) algorithm and the Canny operator. Then, the extracted edge lines are filtered using a threshold filter to obtain the available edge lines, which denote the angle of the object. Next, the selected edge lines are filtered using a mean filter, from which the angle of the object can be calculated. Finally, the angle of the object relative to the robot, φobject, is reckoned as follows:

    where φ denotes the detected angle of the object in the image and θ is the rotation angle of the neck joint.
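The angle extraction can be sketched as a least-squares fit over the filtered edge points; the additive combination of φ and θ in object_angle is an assumption, since the equation above is not reproduced here.

```python
import math

def line_angle(points):
    """Angle (degrees) of the dominant direction of a set of edge
    points, via a least-squares line fit through their centroid."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points)
    sxx = sum((x - mx) ** 2 for x, _ in points)
    return math.degrees(math.atan2(sxy, sxx))

def object_angle(phi, theta):
    """Combine the detected image angle phi with the neck rotation
    theta (assumed additive here)."""
    return phi + theta

print(line_angle([(0, 0), (1, 1), (2, 2)]))  # a 45-degree edge
```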

After determining the position of the color mark, the robot can adjust its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. The position of the color mark is periodically localized using the method above, and the pose of the robot is adjusted until precise positioning relative to the color mark is achieved. The location-based visual servo control system is shown in Fig. 4.

    Fig.4 The chart of the location-based visual servo control system
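One cycle of such a location-based servo loop can be sketched as a simple proportional controller on the mark's position in the robot frame; the gains are illustrative, and only the 0.15 m/s speed cap comes from the experiments in Section 2.

```python
import math

def servo_step(dx, dy, k_v=0.5, k_w=1.0, v_max=0.15):
    """Proportional command toward the target at (dx, dy) in the robot
    frame: forward speed capped at v_max (m/s), turn rate proportional
    to the heading error. Gains k_v and k_w are illustrative."""
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx)
    v = min(k_v * distance, v_max)
    w = k_w * heading_error
    return v, w

# Target 1 m straight ahead: drive at the speed cap with no turning.
print(servo_step(1.0, 0.0))  # (0.15, 0.0)
```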

    2 Experiments and Result Analysis

    2.1 Experiment setup

The experimental platform is a restaurant robot, shown in Fig. 5, which is equipped with one binocular camera. The binocular camera is a Bumblebee2 made by Point Grey Research. A global camera mounted on the ceiling is used to obtain the image of the whole environment. The global images, with 768×576 pixels, are sent to the mobile robot through the wireless Ethernet. The binocular camera mounted on the robot head sends stereo images with 512×384 pixels to the robot through the IEEE 1394 bus. The binocular camera was calibrated before the experiment.

The complete movement of the service robot can be divided into two parts: coarse localization and precise localization. In the following parts, we present the experimental results of the complete movement from these two aspects.

    Fig.5 The photo of the restaurant service robot

    2.2 Global vision-based coarse localization

In this part, the global vision localization method is explained through the following example. The mobile robot receives the image of the restaurant from the global camera through the wireless Ethernet. To acquire usable values from the image, we first rectify the raw image. Figure 6 shows the rectified global image obtained from the camera. To localize the robot in the image, we find the ROI and detect the color mark on the robot head using the color segmentation method. The result of the image processing is shown in Fig. 7.

After the location of the robot is detected in the image, its location relative to the pinhole camera coordinate system should be calculated. The size of the picture is known in advance, and the height of the camera relative to the floor and the height of the robot are invariant and also known in advance. Because the position and parameters of the global camera are invariant, the scales of a pixel relative to one centimeter in the X-axis and Y-axis directions can be acquired. The parameters used to calculate the location of the robot are all shown in Table 1.

    Table 1 The used parameters in this experiment

In Fig. 7, the origin of the picture coordinate system is in the top left-hand corner of the picture, and the coordinates of the color mark in the picture coordinate system have been calculated. According to Fig. 1, we obtain the transformed coordinates, (X′HC, Y′HC), in the pinhole coordinate system; the result is calculated as (−68.50, −311.33). After (X′HC, Y′HC) is obtained, the robot can be positioned in the image as (−39.14, −177.90).

Next, the real coordinates of the robot in GCS should be reckoned. The origin of GCS is at the top right-hand corner of the carpet array shown in Fig. 7, and the coordinates of that corner are (216, 237) in the pinhole coordinate system. So, the coordinates of the robot in GCS are calculated as

Many experimental results show that the accuracy of global vision positioning is within ±7 cm. In this study, the coordinates of the robot are updated every 100 ms to ensure that the robot obtains its real-time coordinates.

    2.3 Stereo vision-based precise localization

When the robot moves into the area where the object is placed, under the guidance of the global vision, the stereo vision-based localization method is adopted to provide the high positioning accuracy. The images acquired from the binocular camera are first rectified using an image correction algorithm to prepare usable images for the next step. The rectified image from the stereo vision during the robot movement is shown in Fig. 8.

    Fig.8 The rectified stereo vision image

In this study, the image taken by the right camera is used as the reference image, from which the ROI is extracted to realize object feature matching and obtain the disparity map of the object. The object region is detected and extracted using the color segmentation and shape-based matching methods. The procedure of color segmentation is as follows. First, the HSV color space is obtained through image color space conversion. Then, appropriate thresholds are set on the Hue and Saturation components. Finally, the plate region is extracted from the reference image. The plate region can also be found in the image using the shape-based matching method. The center coordinates of each extracted region of interest are first extracted; then, the expected region is obtained according to Eq. (6). The plate region in Fig. 9 is the expected region obtained using the methods above.

    Fig.9 The expected region of interest

Next, we find the characteristic sign in the expected region and match the images using the NCC algorithm. In the process, a consistency test between the left and right images is also used to verify the accuracy of the matching. The obtained depth image is shown in Fig. 10.

    Fig.10 The obtained depth image using NCC

From Fig. 10, we know that the depth value of the characteristic sign is 40.07. According to the binocular vision system, the coordinates of the characteristic mark can be calculated as [1.4 0.2 118.9]T. In addition, from the encoders we know that α = 45° and θ = 0°. Then, using Eq. (7), the homogeneous transformation matrix T between RCS and the binocular vision coordinate system is denoted as follows:

Finally, according to Eq. (8), the result is reckoned as [x′ y′ z′] = [−4.6 88.5 83.9]. In addition, the angle of the object relative to the robot should be calculated to determine the position of the color mark. In the experiment, the edge lines of the object are first extracted using the Canny operator. Then, the extracted edge lines are filtered using the threshold filter to find the available lines. Finally, the selected lines are filtered using the mean filter to calculate the angle of the object. The lines obtained using the method above are shown in Fig. 11, from which we know that the angle of the object is −3.865°.

    Fig.11 The extracted edge lines of the object

After determining the position of the color mark, the robot adjusts its velocity and heading angle to move towards the object under the guidance of the location-based visual servo control system. During the movement, the robot constantly repeats the positioning process above and adjusts its pose until precise positioning is finally realized.

We performed the whole movement at a velocity of 0.15 m/s sixty times in our library, with the initial position of the robot random and unknown each time. The service robot could always realize self-localization with a positioning accuracy of up to ±2 cm and successfully grab the object for the customers within 3 min. The statistical data are shown in Table 2.

    Table 2 The application and positioning accuracy of the hierarchical localization method

    3 Conclusions and Future Work

In this paper, a mobile robot localization method based on visual feedback is proposed and applied to a restaurant service robot. The whole positioning process is divided into two stages: coarse positioning and precise positioning. A hierarchical positioning method is adopted to provide different positioning accuracies in the different stages. In the coarse positioning stage, the robot periodically obtains its current coordinates using the global vision-based localization method. In the precise positioning stage, the robot obtains higher localization precision using the binocular vision-based localization method. Finally, the robot can successfully grasp the plate on the table. Many experiments verify that the proposed algorithm localizes well and obtains up to ±2 cm positioning accuracy.

Research on improving the positioning accuracy and speed of the algorithm is currently under way. Our future work will also focus on improving robustness against interference in more complex environments.

    [1]Siegwart R,Nourbakhsh I.Introduction to Autonomous Mobile Robots[M].Cambridge:The MIT Press,2004:121-156.

[2]Choi B S,Lee J J.Mobile Robot Localization Scheme Based on RFID and Sonar Fusion System[C].IEEE International Symposium on Industrial Electronics,Seoul,Korea,2009:1035-1040.

    [3]Choi B S,Lee J J.Localization of a Mobile Robot Based on an Ultrasonic Sensor Using Dynamic Obstacles[J].Artificial Life and Robotics,2008,12(1/2):280-283.

[4]Bonnifait P,Garcia G.Design and Experimental Validation of an Odometric and Goniometric Localization System for Outdoor Robot Vehicles[J].IEEE Transactions on Robotics and Automation,1998,14(4):541-548.

    [5]Balaguer B,Carpin S,Balakirsky S. Towards Quantitative Comparisons of Robot Algorithms:Experiences with SLAM in Simulation and Real World Systems[C].IEEE/RSJ International Conference on Intelligent Robots and Systems,California,USA,2007:1-7.

    [6]Lin H H,Tsai C C.Laser Pose Estimation and Tracking Using Fuzzy Extended Information Filtering for an Autonomous Mobile Robot[J].Journal of Intelligent and Robotic Systems,2008,53(2):119-143.

[7]Munguia R,Grau A.Monocular SLAM for Visual Odometry[C].IEEE International Symposium on Intelligent Signal Processing,Madrid,Spain,2007:1-6.

[8]Lin H Y,Lin J H,Wang M L.A Visual Positioning System for Vehicle or Mobile Robot Navigation[C].Proceedings of the 8th International IEEE Conference on Intelligent Transportation Systems,Vienna,Austria,2005:73-78.

[9]Blanco J L,Gonzalez J,Fernandez-Madrigal J A.Consistent Observation Grouping for Generating Metric-Topological Maps that Improves Robot Localization[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:818-823.

    [10]Guo Y,Xu X H.Color Landmark Design for Mobile Robot Localization[C].IMACS Multiconference on Computational Engineering in Systems Applications,Beijing,China,2006:1868-1874.

[11]Kwon T B,Yang J H,Song J B,et al.Efficiency Improvement in Monte Carlo Localization through Topological Information[C].Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems,Beijing,China,2006:424-429.

[12]Tapus A,Siegwart R.A Cognitive Modeling of Space Using Fingerprints of Places for Mobile Robot Navigation[C].Proceedings of the IEEE International Conference on Robotics and Automation,Orlando,USA,2006:1188-1193.

[13]Borenstein J,Everett B,Feng L.Navigating Mobile Robots:Systems and Techniques[M].Wellesley,USA:A.K.Peters Ltd.,1996:103-150.

    [14]Miro J V,Zhou W Z,Dissanayake G.Towards Vision Based Navigation in Large Indoor Environments[C]. IEEE International Conference on Intelligent Robots and Systems,Beijing,China,2006:2096-2102.

[15]Grzyb B,Chinellato E,Morales A,et al.A 3D Grasping System Based on Multimodal Visual and Tactile Processing[J].Industrial Robot:An International Journal,2009,36(4):365-369.

    [16]Greggio N,Bernardino A,Laschi C,et al.Real-Time 3D Stereo Tracking and Localizing of Spherical Objects with the iCub Robotic Platform [J].Journal of Intelligent and Robotic Systems,2011,63(3/4):417-446.

[17]Chinellato E,Grzyb B J,del Pobil A P.Brain Mechanisms for Robotic Object Pose Estimation[C].International Joint Conference on Neural Networks,Hong Kong,China,2008:3268-3275.

    [18]Lang H X,Wang Y,de Silva C W.Mobile Robot Localization and Object Pose Estimation Using Optical Encoder,Vision and Laser Sensors[C].Proceedings of the IEEE International Conference on Automation and Logistics,Qingdao,China,2008:617-622.

[19]Yu Q X,Yuan C,Fu Z,et al.Research of the Localization of Restaurant Service Robot[J].International Journal of Advanced Robotic Systems,2010,7(3):227-238.

[20]El Munim H A,Farag A A.Shape Representation and Registration Using Vector Distance Functions[C].Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,Minneapolis,USA,2007:1-8.

    [21]Sun Z X,Wu Q.TS201 Based Fast Algorithm of Normalized Cross-Correlation[J].Modern Electronics Technique,2010,33(10):125-127.(in Chinese)
