
    Online Observability-Constrained Motion Suggestion via Efficient Motion Primitive-Based Observability Analysis

2018-04-16 07:27:11

    Zheng Rong, Shun’an Zhong and Nathan Michael

(1. School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China; 2. Institute of Electronics, Chinese Academy of Sciences, Beijing 100190, China; 3. Robotics Institute, Carnegie Mellon University, Pittsburgh, PA 15213, USA)

A reliable state estimation is essential for an autonomous vision-based navigation system, such as a mobile robot, to enable accurate localization and environment perception in undefined environments[1]. Reliable perception systems depend on the availability of sufficiently informative sensor observations and environmental information to ensure continued accurate performance[2]. State estimation methodologies assume the existence and preservation of observability (the ability of the system states to be reconstructed from the system output)[3]. However, this observability condition may be violated in environments with limited or scarce information, resulting in increased estimation uncertainty and even divergence, especially for monocular vision-based autonomous systems.

In recent years, observability analysis techniques have received significant attention. Observability analysis provides a tool to evaluate the observability conditions with respect to the current system state and environment. Observability-constrained active control techniques leverage observability analysis to quantify the implications of sensor observations on state estimate accuracy toward informed trajectory selection[3], faster estimator convergence[4], and optimal sensor placement[5]. Observability-constrained vision navigation systems[6-8] detect the unobservable directions and reduce estimation inconsistency by explicitly prohibiting spurious information gain. Trajectories have been optimized to guarantee the sensing observability and control stability of the system[9]. Sampling trajectories have been selected to maximize observability in order to localize the robot and construct a map of the environment[10-11]. Path planning has also incorporated visibility constraints toward the goal of reduced exploration time in unknown environments[12]. However, these studies focus on the observability analysis of the current system states, without considering future observability or the impact of future motions on system observability. Local observability prediction can be of great value in problematic situations where the state is difficult to reconstruct and must be sensed actively, such as robot localization in an unknown environment[9]. By characterizing the impact of future motion on system observability, this technique enables informed selection of future motions to avoid potential observability degradation and, consequently, to improve state estimation performance.

In this work, an online methodology is proposed that seeks to predict local observability conditions and suggest observability-constrained motion directions, toward enabling robust state estimation and safe operation for an autonomous system in unknown environments. The formulation leverages efficient numerical observability analysis and a motion primitive technique to realize local observability prediction, and accordingly suggests future motion directions to avoid potential estimation degradation due to observability deficiency. Following prior work[13], the empirical observability Gramian (EOG)[14] is employed to enable real-time local observability evaluation. A motion primitive technique[15] is utilized to enable local path sampling and observability-constrained motion suggestion. We assess the implication of potential motions on system observability and the resulting state estimation performance by evaluating the observability of potential trajectories, and seek to preserve estimation consistency by explicit motion direction selection.

The proposed approach is specialized to a monocular visual-inertial state estimation framework to assess its viability in correctly predicting the observability of future motions and effectively making motion suggestions that avoid potential state estimation degeneracy of the perception system. Monocular visual-inertial state estimation is a representative vision-based perception strategy that enables autonomous operation with resource-constrained systems such as micro aerial vehicles and commodity devices. This choice of sensing modalities also represents a particularly challenging state estimation formulation due to the lack of direct metric depth observation[16]. It is thereby essential for such a system to ensure full-state observability to achieve state estimation accuracy.

    1 Methodology

This section briefly summarizes relevant concepts related to the monocular visual-inertial state estimator and EOG-based observability analysis from prior works[2,13,15,17-19], and introduces a method to predict the observability of future motions and propose motion direction suggestions.

    1.1 EOG-based observability evaluation for mono V-I estimator

The optimization-based monocular visual-inertial state estimation problem is formulated with respect to a sliding window that contains n visual keyframes and a local map containing m landmarks observed by the keyframes, and is solved via a recursive optimization strategy. The full state parameters to be estimated are formulated as a vector X ∈ ℝ^(10n+m+3),

    (1)

and the estimate is obtained by minimizing the sum of the Mahalanobis norms of all measurement residuals over the window,

min_X { Σ ‖r_imu(z_imu, X)‖²_{P_imu} + Σ ‖r_cam(z_cam, X)‖²_{P_cam} }  (2)

where z_imu and z_cam are the inertial measurement unit (IMU) and camera measurements, r_imu and r_cam are the residuals of the IMU and camera measurements, and P_imu and P_cam are the corresponding residual covariances. An IMU pre-integration technique[20] is employed to represent the IMU measurements. The resulting integrated system is treated as the sensor output in the optimization and in the EOG state-output simulation.

The system observability is analyzed, given the monocular vision-based sliding-window state estimation formulation, by the efficient EOG computation[13]. The EOG provides insight into the sensitivity of the system outputs with respect to perturbed system states, and captures the ability of the estimator to reconstruct the system states from the sensor outputs by numerically simulating the system state-output behavior. Crucially, computation of the EOG is independent of the state estimation itself, which makes it a viable observability analysis technique even in a degraded estimation scenario. Benefiting from its additive property, the EOG can be computed efficiently by partitioning the system output into sub-vectors and exploiting the underlying formulation sparsity of the EOG. The full EOG W is readily computed by perturbing the system initial conditions x ∈ ℝⁿ directly in positive and negative directions and simulating the state-output behavior 2n times.

W_ij = 1/(4ε²) ∫₀ᵀ [y^{+i}(t) − y^{−i}(t)]ᵀ [y^{+j}(t) − y^{−j}(t)] dt  (3)

where ε is the perturbation size and y^{±i} denotes the simulated system output with the i-th component of the initial state perturbed by ±ε.
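The 2n-simulation EOG computation can be sketched numerically for a toy discrete-time linear system. This is a minimal illustrative stand-in, not the paper's implementation: `empirical_obs_gramian`, `simulate_output`, and the chosen A and C matrices are all hypothetical names and values; the scalar observability measure is taken here as the smallest eigenvalue of W, following the "measures of unobservability" idea of Krener and Ide[21].

```python
import numpy as np

def empirical_obs_gramian(simulate_output, x0, eps=1e-3):
    """Approximate the EOG W by perturbing each initial state in +/- directions.

    simulate_output(x0) -> array of shape (T, p): stacked system outputs.
    One forward simulation per signed perturbation: 2n runs in total.
    """
    n = x0.size
    dY = []
    for i in range(n):
        e = np.zeros(n)
        e[i] = eps
        y_plus = simulate_output(x0 + e)
        y_minus = simulate_output(x0 - e)
        # Central difference approximates the output sensitivity dY/dx0_i.
        dY.append((y_plus - y_minus).ravel() / (2.0 * eps))
    dY = np.stack(dY, axis=1)          # column i ~ dY/dx0_i
    return dY.T @ dY                   # W = (dY/dx0)^T (dY/dx0)

# Toy example: x' = A x, with only the first state observed directly.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])

def simulate_output(x0, steps=10):
    x, ys = x0.copy(), []
    for _ in range(steps):
        ys.append(C @ x)
        x = A @ x
    return np.vstack(ys)

W = empirical_obs_gramian(simulate_output, np.array([1.0, 0.5]))
# Smallest eigenvalue of W serves as the scalar observability measure k.
k = np.linalg.eigvalsh(W).min()
assert k > 0.0  # both states are reconstructible through the coupled dynamics
```

For this linear system the central difference is exact, so W coincides with the Gramian O^T O of the classical observability matrix; for the paper's nonlinear visual-inertial system the same perturb-and-simulate loop applies without requiring any analytic Jacobian.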

    1.2 Generation and observability evaluation of motion primitive

    Motion primitives are incorporated with the EOG evaluation to enable a motion extrapolation and subsequent local observability prediction. The EOG is computed for motion primitive end points based on the current state to enable the observability prediction locally.

A motion primitive generation strategy[15] is employed to compute primitives based on a discretized set of initial and final conditions with different trajectory durations. While the approach extends readily to three dimensions, in this work, for simplicity of presentation, we compute motion primitives for a system in two dimensions. The computation is decoupled into translation and heading trajectory generation, with higher-order end point constraint bounds that correspond to the expected rates of motion exhibited during the simulation and experimental studies. After the motion primitive library is computed, a look-up table is generated to enable efficient online queries for the appropriate end point states for observability evaluation.
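A minimal sketch of one decoupled axis of such a primitive, assuming a quintic polynomial with end point constraints on position, velocity, and acceleration. This is a simplified stand-in for the closed-form primitive of Mueller et al.[15]; the function name and the boundary values are illustrative only.

```python
import numpy as np

def quintic_primitive(s0, v0, a0, sf, vf, af, T):
    """Solve quintic coefficients c0..c5 for one axis, s(t) = sum(c_i t^i),
    meeting position/velocity/acceleration constraints at t=0 and t=T."""
    M = np.array([
        [1, 0, 0,    0,       0,        0],        # s(0)  = s0
        [0, 1, 0,    0,       0,        0],        # s'(0) = v0
        [0, 0, 2,    0,       0,        0],        # s''(0)= a0
        [1, T, T**2, T**3,    T**4,     T**5],     # s(T)  = sf
        [0, 1, 2*T,  3*T**2,  4*T**3,   5*T**4],   # s'(T) = vf
        [0, 0, 2,    6*T,     12*T**2,  20*T**3],  # s''(T)= af
    ], dtype=float)
    b = np.array([s0, v0, a0, sf, vf, af], dtype=float)
    return np.linalg.solve(M, b)

# One 2 m forward primitive at constant-magnitude velocity over 2 s:
c = quintic_primitive(0.0, 1.0, 0.0, 2.0, 1.0, 0.0, 2.0)
pos_end = sum(ci * 2.0**i for i, ci in enumerate(c))
assert abs(pos_end - 2.0) < 1e-9
```

Sampling such coefficients over a discretized grid of final states and durations, and caching the results, yields the kind of look-up table described above.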

    1.3 Trajectory observability evaluation

To predict the local observability condition and suggest a next-step motion direction, a trajectory tree with a specified number of levels and branches is generated to realize the extrapolation. As shown in Fig.1, a 3-level extrapolation frame with 3-branch motion primitives is generated, creating 3³ = 27 potential trajectories in the local region based on the current state. The observability conditions of the potential trajectories, each composed of a series of motion primitive end points, are evaluated by synthesizing the observability measures of the involved end points. For simplicity of representation, and without loss of sensitivity thanks to the wide camera FOV, we compute each motion primitive with three spreading branches. The orientation of the system is assumed to be instantaneously aligned with the direction of motion.

    Fig.1 Example of a 3-level extrapolation frame with 3-branch motion primitives
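The tree of Fig.1 can be enumerated in a few lines. This is an illustrative sketch of the counting only; `BRANCH_OFFSETS` is a hypothetical encoding of the three heading branches, with the 45-degree separation used later in the implementation.

```python
from itertools import product

# Three heading offsets per level, relative to the parent node's heading.
BRANCH_OFFSETS = (-45.0, 0.0, 45.0)    # degrees (hypothetical encoding)
LEVELS = 3

# Each trajectory is one branch choice per level: 3**3 = 27 in total.
trajectories = list(product(range(3), repeat=LEVELS))
assert len(trajectories) == 27

# Each of the three first-level directions owns a sub-tree with
# 1 + 3 + 9 = 13 end points -- the count used in the direction cost below.
per_direction = sum(3**i for i in range(LEVELS))
assert per_direction == 13
```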

Using the trajectory tree with multi-level extrapolation, the observability evaluation enables local observability prediction with enough lookahead to avoid a sudden dead end from which the state estimator may not recover. The observability cost for each trajectory is computed using a weighted strategy, i.e. closer steps use larger weighting factors, based on the fact that the closest step has the most accurate observability evaluation and is more crucial in near-term motion execution than later steps. Thus, for a specified trajectory with N steps (end points), the observability measure K can be computed from the observability measures k_i of the end points and preset weighting factors w_i

K = ∑_{i=1}^{N} w_i k_i  (4)

As the observability prediction for the first step is more accurate than for later steps, a more conservative strategy can be applied by executing a one-step motion according to the multi-level evaluation result, without loss of foresight. Under this scenario, it is preferable to evaluate the observability measure for a motion direction instead of a specified trajectory. The observability cost for each motion direction can be computed using all the observability measures of the involved motion primitives.

K = ∑_{i=1}^{N} w_i ∑_{j=1}^{M_i} k(i,j)  (5)

in which N is the number of levels, M_i is the number of motion primitives in level i of one motion direction, k(i,j) is the observability measure of the j-th motion primitive in the i-th level, and the motion primitives in the same level use the identical weighting factor w_i. Considering the example in Fig. 1, the future motion for the current state S spreads out in three motion directions S1, S2, S3, and the observability cost of each direction can be evaluated using the involved 13 end points. For example, the observability cost of the second direction, K_{S2}, is computed as

K_{S2} = w_1 k(1,1) + w_2 ∑_{j=1}^{3} k(2,j) + w_3 ∑_{j=1}^{9} k(3,j)  (6)
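The weighted direction cost can be sketched directly from this description. The function below is illustrative: the weights and measures are made-up numbers, and the paper does not specify any normalization of the weights.

```python
def direction_cost(k, w):
    """Weighted observability cost of one motion direction (cf. Eq. (5)).

    k[i] lists the end point measures k(i, j) for level i of this direction;
    w[i] is that level's weighting factor (closer levels weighted more).
    """
    assert len(k) == len(w)
    return sum(w_i * sum(level) for w_i, level in zip(w, k))

# Example matching the 13 end points of one direction in Fig. 1:
# 1 end point in level 1, 3 in level 2, 9 in level 3 (values hypothetical).
k = [[0.8], [0.7, 0.6, 0.9], [0.5] * 9]
w = [0.5, 0.3, 0.2]
K = direction_cost(k, w)
```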

    Fig.2 Motion suggestions for a predefined simulation scenario and straight trajectory

Thus the observability-constrained motion direction can be suggested according to the resulting observability measures, and can then be incorporated with other motion constraints to yield informed motion planning for the next step. Note that in actual application of this strategy, the system does not have to wait for the completion of the one-step execution to start the next motion suggestion. Instead, a denser stream of motion suggestions can be generated and followed after the state estimation of each newly arriving frame, so that new sensor observations are added into the system and the environment information is updated, yielding more accurate local observability prediction and consequently better motion suggestions.

    1.4 Observability-constrained motion direction suggestion

After computing the observability cost for all motion directions, the strategy in Algorithm 1 can be used to propose the motion suggestion. An example of motion suggestion along a predefined trajectory in a simulated environment is shown in Fig.2. The corresponding online observability prediction results in three directions (left, forward, or right) are shown in Fig.3. Different motion directions are suggested due to the different landmark distributions in the three stages. A threshold is used to identify the “forbidden directions”. The motion planner prefers the motion direction with a higher observability measure.

    Fig.3 Observability prediction and resulting motion direction suggestion

Algorithm 1 Strategy of motion direction suggestion

Assume there are m motion directions to be evaluated.

Initial condition:

K_max = 0;

i_max = 1;

forbidden_directions.clear();

Loop i: from 1 to m

if K_i < threshold, then forbidden_directions.add(i);

end.

if K_i > K_max, then K_max = K_i, i_max = i;

end.

end loop.

    Results:

· Forbidden_directions indicates the directions in which the system is believed to experience an observability-deficient condition and to be prone to estimation degradation or failure;

· i_max indicates the most-suggested direction. Even in the worst case where K_max falls below the threshold and every direction is marked as forbidden, i_max still identifies the direction with the best predicted observability;

· Other directions are moderately suggested; in these directions the observability condition is believed to provide sufficient information to ensure reasonable state estimation performance.
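Algorithm 1 translates directly into code. The sketch below is a hypothetical rendering of the pseudocode (0-indexed rather than the pseudocode's 1-indexing); the costs and the 0.3 threshold in the usage line are illustrative, the threshold value borrowed from the simulation study.

```python
def suggest_directions(K, threshold):
    """Classify m direction costs K[0..m-1] per Algorithm 1.

    Returns (forbidden, i_max, moderate): indices below the threshold,
    the most-suggested direction, and the moderately suggested rest.
    """
    K_max, i_max = 0.0, 0
    forbidden = []
    for i, K_i in enumerate(K):
        if K_i < threshold:
            forbidden.append(i)        # observability-deficient direction
        if K_i > K_max:
            K_max, i_max = K_i, i      # track the best direction so far
    moderate = [i for i in range(len(K)) if i not in forbidden and i != i_max]
    return forbidden, i_max, moderate

# Left / forward / right costs with a hypothetical threshold of 0.3:
forbidden, best, moderate = suggest_directions([0.1, 0.8, 0.5], 0.3)
```

Even when all costs fall below the threshold, `best` still identifies the least risky direction, matching the worst-case behavior described above.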

A reasonable extrapolation range is essential to ensure that the system can make predictions with sufficient foresight and take action in advance to avoid an irreversible degraded situation. An extrapolation distance that is too short cannot provide sufficient predictive information, while one that is too large cannot ensure prediction accuracy, as the system cannot obtain sufficient environment information and consequently cannot predict the sensor observations accurately. For example, in a confined environment the camera observations may change significantly along the trajectory, and the system can hardly predict the feature observations for a far end point. The extrapolation distance is controlled by the velocity of the final state and the extrapolation time duration. In this work, we assume a constant linear velocity magnitude and choose different extrapolation durations according to the scenario type. In a confined scenario, a smaller duration is used, as the effective sensor observation prediction distance is limited by the environment. In an open scenario, a larger extrapolation duration can be used. The extrapolation width is controlled by the branch number and the heading separation between branches. As a camera with a wide FOV is employed in this study, the configuration of three branches with 45-degree separation is reasonable, preserving sensitivity at acceptable computational cost.

    2 Evaluation and Analysis

The proposed motion direction suggestion approach is evaluated, given the optimization-based monocular visual-inertial state estimation formulation, through simulation and real-world experiments. The perception system, comprising a monocular camera and IMU (simulated and real), is carried along trajectories in different environments. Through the simulation analysis we seek to demonstrate the expected ability and efficacy of the proposed approach in both local observability prediction and motion direction suggestion. The experimental study seeks to verify the effectiveness of the proposed methodology in real-world scenarios.

    2.1 Implementation detail

    Fig.5 Predicted observability, actual observability, and estimation covariance of the two trajectories

The simulation and experimental analyses leverage the same algorithm implementations, including the optimization-based sliding-window state estimator and the local observability prediction-based motion suggestion presented in Sect. 1.4. We employ a time-synchronized monocular camera and IMU in the experiment; the simulation model accurately reproduces the associated sensor characteristics and uncertainty. The Ceres solver[22] is utilized to solve the optimization problem, and analytic Jacobians are employed to ensure run-time performance. The full sliding window size of the state estimator is set to 30, and the update rate of optimization and motion suggestion is 10 Hz. The motion primitives are generated with three directions and four levels for a reasonable extrapolation distance and width, and consequently 120 end point observability measures are evaluated. The EOG-based observability evaluation uses a fixed perturbation size of 0.001, with the translational states perturbed. Note that we use different thresholds in simulation and experiment to detect observability deficiency, due to the different scenario models. From the actual experiments we find that for similar environment models (for example, outdoor open scenarios) a well-chosen threshold can be reused without further adjustment.

    2.2 Simulation results

In the simulation, two pre-defined trajectories are tested in the same scenario (Fig.4). The straight trajectory-1 experiences observability degradation due to feature deficiency, while trajectory-2 always remains within a feature-rich area. The observability measures of three motion directions are evaluated to make the prediction, and the actual current observability measure is used as a reference (Fig.5).

    Fig.4 Pre-defined simulation scenario and two trajectories

Firstly, the observability deficiency in trajectory-1 (left column) is correctly predicted (by 8.5 s) with a threshold of 0.3 applied to the predicted observability measure. The correctness of the prediction is verified by the actual observability measure and the consequent state estimation performance degradation indicated by the increasing covariance. Secondly, during the interval [60 s, 100 s], the left direction is strongly suggested, while the actual motion disobeys the suggestion and enters the observability-deficient area, which results in increased uncertainty and degraded state estimation. On the contrary, trajectory-2 (right column) executes a motion following the suggested direction to avoid the feature-deficient area, yielding a reasonable observability measure along the trajectory and consistent estimation performance. The estimation performance can be checked by both the estimation covariance (Fig.5) and the estimated path (Fig.6). The observability deficiency in trajectory-1 results in a large jump in the estimated path and end point, while trajectory-2 yields a result with limited uncertainty.

    Fig.6 Visualized estimation results, including estimated path and features

Note that in trajectory-1, the prediction of the observability degeneracy occurs 8 s earlier than the actual occurrence of the degeneracy, permitting a motion planner to take action before entering a pathological scenario. Although trajectory-2 is much longer than trajectory-1, the state estimation along trajectory-2 preserves better performance thanks to the well-selected motion directions. Ideally, the best trajectory would follow the motion direction that keeps the highest observability value, but in actual application the motion planning should be constrained by both observability and the user goal. Note that trajectory-2 is generated based on the direction suggestions in trajectory-1 and the actual feature distribution model in the simulated environment.

    2.3 Experimental results

In the experiment, the system moves in a hallway with different observability conditions representing degraded scenarios such as white walls, narrow spaces, sharp turns, and dark lighting. As shown in Fig.7, four representative cases and three prediction points are studied.

    Fig.7 Experiment scenario illustrated by estimated path and environment features

    Fig.8 Predicted observability in three directions, actual observability, and estimation covariance

    Fig.9 Camera images overlaid with online motion direction suggestion in four representative cases

The observability measures in three motion directions are predicted, and motion directions are suggested accordingly (Fig.8). The actual observability measure is given as a reference to verify the efficacy of the prediction, while the estimation covariance is used to evaluate the actual estimation performance. Four representative cases and three prediction points are marked. In case-1, forward motion is most strongly suggested according to the highest observability prediction, and the actual motion follows this direction, which preserves a good observability condition with little state estimation uncertainty. In case-2, the potential degeneracy in forward motion is successfully predicted. While the system keeps moving forward, as expected, it experiences an actual observability degeneracy, which results in an increased estimation covariance. Before the sharp turn in case-3, moving right is strongly recommended due to the highly confined environment. The suggestion is followed by the actual motion; as a result, the actual observability increases significantly and the estimation uncertainty decreases to a low level. In case-4, before the system enters the dark area, the degeneracy in future motion is successfully predicted, and the violation of the suggestion results in significantly increased state uncertainty.

The four representative scenarios include two true-positive predictions (1, 3) and two true-negative predictions (2, 4). In the true-positive cases, the system proposes a “should go” direction (green arrows overlaid on the camera images), and the actual motion follows the suggestion, consequently yielding good state estimation performance. In contrast, in the true-negative cases the system moves along the predicted “forbidden” direction (red arrows on the camera images), eventually entering the degenerate conditions and inducing significant state estimation degradation. The true-positive and true-negative cases demonstrate the correctness of the observability prediction in motion directions with informative and deficient sensor observations, respectively. The corresponding images captured by the camera (Fig.9) exhibit the visual environment conditions. Green arrows (G) indicate the most suggested directions, blue (B) the moderately suggested directions, and red (R) the forbidden directions. Note that the arrows indicating the motion suggestions are generated and overlaid on the camera images online.

Similar to the simulation test, the degenerate conditions are successfully predicted prior to the actual occurrence of the degeneracy. Three representative prediction points (A′, B′, C′) and the corresponding actual occurrence points (A, B, C) are examined: t_A′ = 12.35 s, t_A = 12.85 s, t_B′ = 78.75 s, t_B = 79.45 s, t_C′ = 86.95 s, t_C = 88.35 s. The predictions are 0.5 s, 0.7 s, and 1.4 s earlier than the degeneracy occurrences, respectively.

    3 Conclusion

An online observability-constrained motion suggestion methodology is proposed in this paper. The proposed approach seeks to make informed motion suggestions for a representative monocular visual-inertial state estimation system to preserve robust estimation performance. An efficient EOG-based observability evaluation technique and motion primitives are incorporated to enable local observability prediction and real-time motion suggestion, which makes it possible to evaluate the observability of 120 end points locally within 25 ms using one thread on a 2.9 GHz Intel Core i7 laptop. The approach is evaluated by both simulation and experiments. The results demonstrate the correctness of the local observability prediction and the efficacy of the motion suggestion.

The observability-constrained motion suggestion is an active strategy to ensure the state estimation performance of a vision-based autonomous system toward safe operation in an undefined environment. To further this goal, we will incorporate the proposed approach into the control system to enable observability-constrained planning in challenging operation environments.

    [1] Lupton T, Sukkarieh S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions[J]. IEEE Transactions on Robotics,2012, 28(1): 61-76.

    [2] Hinson B T. Observability-based guidance and sensor placement[D]. Washington: University of Washington, 2014.

    [3] Alaeddini A, Morgansen K A. Trajectory design for a nonlinear system to insure observability[C]∥2014 European Control Conference (ECC), Strasbourg, France, 2014.

    [4] Hinson B T, Binder M K, Morgansen K A. Path planning to optimize observability in a planar uniform flow field[C]∥2013 American Control Conference (ACC), Washington, DC, USA, 2013.

    [5] Qi J, Sun K, Kang W. Optimal PMU placement for power system dynamic state estimation by using empirical observability gramian[J]. IEEE Transactions on Power Systems,2015, 30(4): 2041-2054.

    [6] Hesch J A, Kottas D G, Bowman S L, et al. Observability-constrained vision-aided inertial navigation, 2012-001[R]. Minneapolis: Dept of Comp Sci & Eng, MARS Lab, University of Minnesota, 2012.

    [7] Kottas D G, Hesch J A, Bowman S L, et al. On the consistency of vision-aided inertial navigation[C]∥Experimental Robotics, Switzerland, 2013.

    [8] Hesch J A, Kottas D G, Bowman S L, et al. Towards consistent vision-aided inertial navigation[M]. Berlin: Springer, 2013: 559-574.

[9] Alaeddini A, Morgansen K A. Trajectory optimization of limited sensing dynamical systems[EB/OL]. [2017-02-15]. https://arxiv.org/abs/1611.08056.

    [10] Lorussi F, Marigo A, Bicchi A. Optimal exploratory paths for a mobile rover[C]∥IEEE International Conference on Robotics and Automation (ICRA), Seoul, Korea, 2001.

    [11] Devries L, Majumdar S J, Paley D A. Observability-based optimization of coordinated sampling trajectories for recursive estimation of a strong, spatially varying flowfield[J]. Journal of Intelligent & Robotic Systems,2013, 70(1-4): 527-544.

    [12] Richter C, Roy N. Learning to plan for visibility in navigation of unknown environments[C]∥International Symposium on Experimental Robotics (ISER), Tokyo, Japan, 2016.

    [13] Rong Z, Michael N. Detection and prediction of near-term state estimation degradation via online nonlinear observability analysis[C]∥2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), Lausanne, Switzerland, 2016.

    [14] Singh A K, Hahn J. On the use of empirical gramians for controllability and observability analysis[C]∥American Control Conference, Portland, Oregon, USA, 2005.

    [15] Mueller M W, Hehn M, D’andrea R. A computationally efficient motion primitive for quadrocopter trajectory generation[J]. IEEE Transactions on Robotics,2015, 31(6): 1294-1310.

    [16] Hansen P, Alismail H, Rander P, et al. Monocular visual odometry for robot localization in LNG pipes[C]∥2011 IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China, 2011.

[17] Lall S, Marsden J E, Glavaški S. Empirical model reduction of controlled nonlinear systems[C]∥IFAC World Congress, New York, USA, 1999.

    [18] Qi J, Sun K, Kang W. Adaptive optimal PMU placement based on empirical observability Gramian[C]∥10th IFAC Symposium on Nonlinear Control Systems NOLCOS, Monterey, California, USA, 2016.

[19] Shen S J, Michael N, Kumar V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs[C]∥IEEE International Conference on Robotics and Automation (ICRA), Washington, USA, 2015.

[20] Forster C, Carlone L, Dellaert F, et al. On-manifold preintegration theory for fast and accurate visual-inertial navigation[EB/OL]. [2017-02-12]. https://pdfs.semanticscholar.org/ed4e/9f89d1fbcf50bea8c65b947b6397a61b4945.pdf.

    [21] Krener A J, Ide K. Measures of unobservability[C]∥48th IEEE Conference on Decision and Control (CDC/CCC), Shanghai, China, 2009.

[22] Agarwal S, Mierle K. Ceres solver[EB/OL]. [2016-08-14]. http://ceres-solver.org/.
