
    Monocular Visual-Inertial and Robotic-Arm Calibration in a Unifying Framework

IEEE/CAA Journal of Automatica Sinica, 2022, Issue 1

Yinlong Zhang, Wei Liang, Senior Member, Mingze Yuan, Hongsheng He, Jindong Tan, and Zhibo Pang, Senior Member

Abstract—Reliable and accurate calibration of the camera, inertial measurement unit (IMU), and robot is a critical prerequisite for visual-inertial based robot pose estimation and surrounding environment perception. However, traditional calibrations suffer from inaccuracy and inconsistency. To address these problems, this paper proposes a monocular visual-inertial and robotic-arm calibration in a unifying framework. In our method, the spatial relationship between the sensing units and the robotic arm is geometrically correlated. The decoupled estimation of rotation and translation reduces coupled errors during the optimization. Additionally, the robotic calibration trajectory is designed in a spiral pattern that fully excites 6-DOF motions repeatably and consistently. The calibration has been evaluated on our developed platform. In the experiments, the calibration achieves rotation and translation RMSEs of less than 0.7° and 0.01 m, respectively. Comparisons with state-of-the-art results demonstrate the consistency, accuracy, and effectiveness of our calibration.

    I. INTRODUCTION

MONOCULAR visual-inertial and robotic-arm calibration aims at correlating the spatial relationship between the visual-inertial unit and the robot. Typically, a monocular camera and an inertial measurement unit (IMU) form a compact and minimal sensor suite. They can potentially be used for robot pose estimation and surrounding environment perception [1]–[5].

Camera and IMU have heterogeneous characteristics regarding quick response, tracking accuracy, and absolute pose observability [6]–[9]. The camera is an exteroceptive sensor: it captures body pose by tracking visual features (usually blob patterns, corners, or pixel intensities) and minimizing reprojection or photometric errors over the image sequence. Comparably, the IMU is a proprioceptive sensor that captures body inertial measurements, i.e., linear accelerations and angular velocities. From them, the absolute metric scale, gravity, and short-term yaw-pitch-roll angles can be derived [10], [11]. In this way, robot tracking works even in the presence of camera tracking failures such as illumination changes, motion blur, and absence of texture [12], [13]. Potential areas for visual-inertial based robot perception include visual-inertial odometry for unmanned aerial and ground vehicles, robotic-arm motion tracking, etc. [14]–[17].

Basically, the performance of monocular visual-inertial based robot pose estimation and surrounding environment perception relies considerably on the accuracy of the calibration among the triplet [18]–[22]. Incorrect calibration results in severe drifts over time, which is unacceptable for high-end industrial robot perception. The existing literature on visual-inertial calibration can be classified into two categories: offline and online [23]. Offline calibration, as the name suggests, performs the spatial and temporal calibration before use [7], [24], [25]. The calibration process usually requires users to gently and evenly move the sensor rig in front of a stationary target (typically a chessboard or AprilTags). The whole process repeats a few times until convergence. Generally, the calibration result has relatively higher precision and consistency, since it enables batch optimization over a large visual-inertial dataset. In contrast, for online calibration, the extrinsic parameters as well as the initial body values (e.g., velocity, pitch, and roll) are estimated online [26]. The angular velocity and acceleration biases can be estimated to compensate for temperature changes. However, the calibration performance is sometimes less stable than that of offline methods, since the calibration process only lasts a few seconds and lacks repeatability.

The general goal is to design a visual-inertial and robotic-arm calibration method (illustrated in Fig. 1). In order to solve the inherent calibration issues (for instance, inconsistent calibration movement, unexpected disturbances while holding the visual-inertial rig, and absence of adequate excitation of the 6-DOF movement), this work designs an accurate and consistent moving trajectory, which is performed by the 6-DOF robotic arm. The trajectory excites the yaw-pitch-roll rotations and x-y-z translations evenly and repeatably. Besides, we develop a unifying model that correlates the spatial relationships among the triplet in an elegant mathematical formulation. The contributions of this paper are summarized as follows.

Fig. 1. Illustration of our proposed method: visual-inertial and robotic-arm calibration. The coordinate frames include the camera frame, IMU frame, and robotic-arm frame. The goal is to estimate the visual and inertial intrinsics, as well as the spatial relationships among the visual-inertial and robotic-arm frames.

Fig. 2. Pipeline of our calibration of the camera, IMU, and robotic-arm system. It consists of three parts: IMU-manipulator calibration, camera-IMU calibration, and camera-manipulator calibration. Afterwards, the calibration parameters are fed into the convergence model for a sanity check.

    i) A monocular visual-inertial and robotic-arm calibration model is developed in a unifying framework;

ii) We design a spiral moving trajectory that excites the yaw-pitch-roll rotations and x-y-z translations evenly, uniformly, and repeatably. Unexpected moving jitters and disturbances can thus be alleviated.

iii) The proposed method has been evaluated on our developed platform. Repeatability tests, systematic analysis, and comparisons with state-of-the-art results have been extensively performed to prove the effectiveness of our method.

The rest of the paper is organized as follows. In Section II, the related works on visual-inertial-robot calibration are reviewed. In Section III, preliminaries on variable notation and the IMU model are briefly introduced. In Section IV, our proposed method is described in detail. In Section V, the developed platform is briefly introduced, and the calibrations are extensively analyzed and compared with state-of-the-art results. Section VI concludes the paper.

    II. RELATED WORK

    A. Camera-IMU Calibration

By and large, camera-IMU calibrations can be classified into two types, i.e., online and offline. Online calibration aims at correlating the spatial relationship between the visual-inertial sensors right before use, usually in a plug-and-play manner. In [27], Yang et al. analyzed the observability of the spatial and temporal calibration parameters of visual-inertial sensors and verified that the calibrations are observable for random motions but not for degraded motions (e.g., planar motion), which might result in calibration failures. Similarly, Kelly and Sukhatme showed in [19] that the observability of the visual-inertial transformation requires the platform to undergo both accelerations and rotations about more than two IMU axes. Moreover, [28] introduced a closed-form method to estimate the visual-inertial orientation, speeds, scales, and biases. In [29], Zheng et al. estimated the calibration parameters within a filtering framework for ground-vehicle VIO. In [26], [30], Yang et al. proposed to calculate the camera-IMU time offset, relative transformation, and inertial biases within the VINS framework. Basically, online calibration fits plug-and-play scenarios such as unmanned aerial vehicle odometry.

Offline calibration calibrates the camera and IMU sensor suite in advance. It needs a larger number of calibration datasets and achieves more accurate and reliable calibrations. In [31], Lobo and Dias estimated the rotation and translation between the camera and IMU in a separate manner. The rotation is first calculated by aligning the visual sensor orientation with the gravitational components (obtained from the inertial accelerations). The translation is then computed using the encoder and turntable. However, this requires precisely placing the IMU at the center and the camera at the rim of the turntable, which demands expertise and patience. In [32], the rotation is estimated in the form of quaternions. Afterwards, slerping (spherical linear interpolation) is applied to synchronize the visual-inertial measurements. The rotation matrix is computed by aligning the quaternion rotation axes, setting aside the angle component of the derived quaternion. In this way, the noise in the rotation-angle components of the quaternions can be suppressed. In [33], Furgale et al. proposed to estimate both the temporal offset and the spatial transformation in a joint framework, minimizing error terms that relate to tracked feature reprojections, inertial accelerations, velocities, and biases. Generally, offline calibration is more accurate than online calibration in that the moving trajectories can be repeated and estimation inconsistency can be alleviated. Additionally, the camera and IMU measurements live in a high-dimensional and sparse space [34], from which nonnegative latent information can be further extracted via the NLF model in the offline mode, as extensively analyzed by Luo et al. in [35], [36].

    B. Camera and Robotic-Arm Calibration

In the camera and robotic-arm calibration, the rotation and translation between the camera frame and the robotic-arm frame are computed by tracking calibration-board features from different viewpoints. Afterwards, reprojection minimization is implemented to align the adjoint attitudes [37]. Camera and robotic-arm calibrations are also categorized into two types, i.e., the eye-in-hand mode and the eye-to-hand mode.

    TABLE I NOMENCLATURE

In [38], [39], Li et al. introduced a comprehensive overview of calibration for industrial robots and dual arms. By incorporating measurement noise and external constraints into the robot kinematic model, the calibration accuracy can be boosted. In the eye-in-hand type, the camera is rigidly mounted to the robotic arm and the calibration board moves within the camera's field of view. In [40], the authors proposed a unified mathematical formulation using Procrustes analyzers. In [41], the authors introduced an eye-in-hand calibration for a scanner robotic system: by cyclically capturing images of a 2D calibration object, the kinematic model parameters are obtained. In the eye-to-hand type, the camera is fixed in a global-view fashion, while the robot holds the checkerboard and moves within the camera's field of view. In [42], Koide and Menegatti proposed directly incorporating images of calibration patterns into the pose-graph optimization model. In [43], the authors presented a calibration that minimizes two kinematic loop errors. By and large, the eye-in-hand type is preferable because of its flexibility and first-person view.

    III. PRELIMINARIES

    A. Variable Notations

The variables used in this paper are listed in Table I. The coordinate frames include the world frame {W}, camera frame {C}, IMU frame {I} (also known as body frame {B}), and robot frame {R}. The transformation from the camera frame {C} to the IMU frame {I} is described by the 4 × 4 matrix $T_{IC}$ in homogeneous form, which is given by

$$T_{IC}=\begin{bmatrix} R_{IC} & t_{IC}\\ \mathbf{0}^{\top} & 1 \end{bmatrix}$$

where $R_{IC}$ is the 3 × 3 rotation matrix and $t_{IC}$ is the 3 × 1 translation vector.

    B. IMU Model

The IMU consists of a 3-axis accelerometer and a 3-axis gyroscope [44]. The accelerometer measures both local gravitational and linear accelerations. Usually, the measurements are corrupted by sensor noise and non-linear bias variations with temperature. Unfortunately, these errors accumulate and lead to significant drift in the position, attitude, and velocity outputs. Luckily, the inertial errors can be considerably compensated by the complementary visual sensing unit.

In this work, the acceleration model is given by [45]

$$\hat{a}_{B}=R_{BW}\left(a_{W}-g_{W}\right)+b_{\alpha}+n_{\alpha}$$

where $a_{W}$ denotes the linear acceleration in the world frame; $\hat{a}_{B}$ denotes the sampled acceleration in the body frame, which is coupled with the gravitational components and noise; $g_{W}$ is the gravitational vector in the world frame; $R_{BW}$ is the rotation matrix from the world frame to the body frame; $b_{\alpha}$ denotes the bias (caused by low-frequency offsets that change slowly over time), which accounts for the majority of the error when double-integrating the accelerations. In this work, $b_{\alpha}$ is modeled as a random walk process [46]. $n_{\alpha}$ is the acceleration noise (usually taken as white Gaussian noise).

Similarly, the angular rate measurement model is

$$\hat{\omega}_{B}=\omega_{B}+b_{\omega}+n_{\omega}$$

where $\hat{\omega}_{B}$ is the sampled angular velocity in the body frame, $b_{\omega}$ is the gyroscopic bias, and $n_{\omega}$ is the gyroscopic white noise.
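To make these measurement models concrete, the short sketch below simulates accelerometer and gyroscope readings under the two equations above, with the bias evolving as a random walk; the noise magnitudes are taken from the Allan analysis reported later in Section V, while the function and variable names are our own illustrative choices, not the paper's.

```python
import numpy as np

def simulate_accel(a_w, R_bw, g_w, b_a, noise_std):
    """Accelerometer model: a_hat_B = R_BW (a_W - g_W) + b_a + n_a."""
    n_a = np.random.normal(0.0, noise_std, 3)   # white Gaussian noise n_a
    return R_bw @ (a_w - g_w) + b_a + n_a

def simulate_gyro(w_b, b_g, noise_std):
    """Gyroscope model: w_hat_B = w_B + b_w + n_w."""
    return w_b + b_g + np.random.normal(0.0, noise_std, 3)

def step_bias(b, walk_std, dt):
    """Random-walk evolution of a bias between consecutive samples."""
    return b + np.random.normal(0.0, walk_std * np.sqrt(dt), 3)

if __name__ == "__main__":
    g_w = np.array([0.0, 0.0, -9.81])           # gravity in the world frame
    b_a = np.zeros(3)
    for _ in range(200):                        # 1 s of data at 200 Hz
        b_a = step_bias(b_a, walk_std=4e-5, dt=1.0 / 200)
        a_hat = simulate_accel(np.zeros(3), np.eye(3), g_w, b_a, 1.4e-4)
```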

    IV. METHOD

In this section, our method is described in detail. As illustrated in Fig. 2, it consists of three modules: camera-IMU calibration, camera-robot calibration, and IMU-robot calibration. The camera-IMU calibration includes the computation of the IMU biases, coordinate rotations, and translations. The IMU biases are then fed into the IMU-robot calibration module to derive the corresponding IMU-robot frame transformations. Meanwhile, the camera and robot calibration is performed using hand-eye calibration. The calibration completes when the triplet satisfies the convergence condition. The details are described in the following subsections.

    A. Camera-IMU Calibration

In this work, the camera-IMU calibration is divided into two parts, i.e., IMU preintegration and camera-IMU intrinsic and extrinsic calibration.

1) Inertial Measurement Preintegration: In order to align the transformation between the camera and IMU, it is crucial to implement IMU preintegration [5]. Given time instants i and j between successive camera frames, the body orientation, velocity, and position can be preintegrated from the collected inertial measurements. They are given by

In this way, the IMU measurements can be preintegrated such that they relate only to the visual frames at times $t_i$ and $t_j$. In (5a), the noise term can be isolated and approximated by its first-order Jacobian, given as

    Similarly, the increment in velocity could also be expressed by its first-order approximation given as
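Although the preintegration equations themselves did not survive extraction, the standard on-manifold increments they refer to can be sketched as follows (scipy's rotation class supplies the exponential map; all names are illustrative):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def preintegrate(accels, gyros, dt, b_a, b_g):
    """Accumulate the rotation, velocity and position increments between
    two camera frames i and j, expressed in the body frame at time i."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for a_k, w_k in zip(accels, gyros):
        a = a_k - b_a                           # bias-corrected acceleration
        dp = dp + dv * dt + 0.5 * dR @ a * dt**2
        dv = dv + dR @ a * dt
        dR = dR @ Rotation.from_rotvec((w_k - b_g) * dt).as_matrix()
    return dR, dv, dp
```

Because the increments depend only on the raw measurements and the current bias estimates, they can be corrected through the first-order Jacobians above rather than re-integrated each time the bias estimate changes.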

2) Camera-IMU Extrinsic and Intrinsic Calibration: In this work, we assume that the gyroscopic and acceleration biases remain constant. The biases can be calculated by iteratively minimizing the difference between the IMU rotation and the camera rotation. During each iteration, the estimated rotation is used to update the angular velocity bias.

a) Angular velocity bias estimation

The rotation of the camera frame between times $t$ and $t+1$ is given by

By combining (4a), (6), and (11), the relationship between the camera rotation and the IMU preintegration over the time period $[t, t+1]$ is given by

Equation (13) can be solved using Levenberg-Marquardt nonlinear optimization. The preintegrated IMU rotations are updated afterwards.
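As a minimal sketch of this step, assuming the camera rotations have already been mapped into the IMU frame with the current extrinsic estimate, the bias can be recovered with scipy's Levenberg-Marquardt solver (names are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def gyro_bias_residuals(b_g, gyro_segments, cam_rotations, dt):
    """For each frame pair, the log-map of the mismatch between the
    bias-corrected gyro preintegration and the camera rotation."""
    res = []
    for gyro, R_cam in zip(gyro_segments, cam_rotations):
        dR = np.eye(3)
        for w_k in gyro:
            dR = dR @ Rotation.from_rotvec((w_k - b_g) * dt).as_matrix()
        res.append(Rotation.from_matrix(dR.T @ R_cam).as_rotvec())
    return np.concatenate(res)

# sol = least_squares(gyro_bias_residuals, np.zeros(3), method="lm",
#                     args=(gyro_segments, cam_rotations, 1.0 / 200))
```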

b) Camera-IMU rotation estimation

The rotation can be computed by aligning the camera rotation between frames $t$ and $t+1$ with the preintegrated IMU rotation. Assume the changes of attitude for the camera frame and the IMU frame are $q_C$ and $q_I$, respectively, over the time interval $[t, t+1]$. $q_C$ and $q_I$ represent the same rotation, though in different coordinate reference systems. The quaternion $q_{IC}$ that converts the camera reference to the IMU reference satisfies

$$q_{I}=q_{IC}\otimes q_{C}\otimes q_{IC}^{*}$$

where $(\cdot)^{*}$ denotes quaternion conjugation.

The optimal quaternion $q_{IC}$ is then obtained by maximizing the following equation:
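Assuming the constraint takes the equivalent standard form $q_I \otimes q_{IC} = q_{IC} \otimes q_C$, the optimum has a closed-form solution as the singular vector of a stacked linear system (equivalent to the usual eigenvalue maximization); a sketch in the Hamilton, w-first convention:

```python
import numpy as np

def quat_L(q):
    """Left-multiplication matrix: q_a (x) q_b = quat_L(q_a) @ q_b."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_R(q):
    """Right-multiplication matrix: q_a (x) q_b = quat_R(q_b) @ q_a."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def camera_imu_rotation(q_imu_list, q_cam_list):
    """Solve q_I (x) q = q (x) q_C over all frame pairs in the
    least-squares sense: stack (L(q_I) - R(q_C)) and take the right
    singular vector with the smallest singular value."""
    A = np.vstack([quat_L(qi) - quat_R(qc)
                   for qi, qc in zip(q_imu_list, q_cam_list)])
    _, _, Vt = np.linalg.svd(A)
    q = Vt[-1]
    return q / np.linalg.norm(q)
```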

c) Accelerometer bias estimation

By combining (4b), (4c), and (11), the relationship between the camera translation and the inertial preintegrated translation over the time interval $[i, j]$ is

The optimal accelerometer bias can be obtained by minimizing the following function:

d) Camera-IMU translation estimation

    B. Camera and Robotic-Arm Calibration

In this work, the camera and IMU are rigidly attached to the robotic wrist (the 6th joint of the robotic arm). The camera and robotic-arm calibration is expressed by a transformation matrix in its homogeneous form.
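The paper does not name a specific hand-eye solver, but this eye-in-hand estimation is the problem that OpenCV's calibrateHandEye addresses; a sketch under that assumption (the pose-list layout and the helper name are ours):

```python
import cv2
import numpy as np

def camera_wrist_extrinsics(wrist_poses, board_poses):
    """Eye-in-hand calibration. wrist_poses: 4x4 wrist-to-base transforms
    read from the robot controller; board_poses: 4x4 board-to-camera
    transforms (e.g., from solvePnP on the tag corners). Returns the
    4x4 camera-to-wrist transform."""
    R_g2b = [T[:3, :3] for T in wrist_poses]
    t_g2b = [T[:3, 3] for T in wrist_poses]
    R_t2c = [T[:3, :3] for T in board_poses]
    t_t2c = [T[:3, 3] for T in board_poses]
    R, t = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                method=cv2.CALIB_HAND_EYE_TSAI)
    T_cam2wrist = np.eye(4)
    T_cam2wrist[:3, :3], T_cam2wrist[:3, 3] = R, t.ravel()
    return T_cam2wrist
```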

    C. IMU and Robotic-Arm Calibration

The IMU and robotic-arm calibration can be formulated by the homogeneous transformation $X_{RI}$ [48]. The robotic wrist moves along a predefined trajectory $\{R_1 \rightarrow R_2 \rightarrow R_3 \cdots\}$. Meanwhile, the IMU follows a similar path ($\{I_1 \rightarrow I_2 \rightarrow I_3 \cdots\}$, also seen as the IMU pose sequence).

Similar to (19), the IMU and robotic wrist poses satisfy the following equation:

In a similar manner, the relative rotation $R_{RI}$ and translation $t_{RI}$ between the IMU frame and the robotic wrist frame can be derived by solving the following equations:
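One standard decoupled instantiation of such equations solves the rotation first by aligning the relative rotation axes (Kabsch alignment) and then recovers the translation from a linear system; the paper's exact formulation may differ, so the sketch below is illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def solve_ax_xb(A_list, B_list):
    """Decoupled AX = XB: A_i are relative wrist motions, B_i the
    corresponding relative IMU motions (4x4 homogeneous each)."""
    # Rotation: the rotation axes satisfy a_i = R b_i; align them.
    alpha = np.array([Rotation.from_matrix(A[:3, :3]).as_rotvec()
                      for A in A_list])
    beta = np.array([Rotation.from_matrix(B[:3, :3]).as_rotvec()
                     for B in B_list])
    H = beta.T @ alpha                         # sum of outer products b_i a_i^T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    # Translation: stack (R_Ai - I) t = R t_Bi - t_Ai, solve least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in A_list])
    d = np.concatenate([R @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(A_list, B_list)])
    t, *_ = np.linalg.lstsq(C, d, rcond=None)
    return R, t
```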

    D. Calibration Convergence Criteria

In our model, there exists an inherent geometric constraint between the robotic wrist frame {R}, the IMU frame {I}, and the camera frame {C}. Theoretically, the transformation matrices $T_{RC}$, $T_{RI}$, and $T_{IC}$ satisfy the following geometric relationship:

$$T_{RC}=T_{RI}\,T_{IC}.$$
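A sanity check of this constraint reduces to measuring how far the chained estimate deviates from the identity; a minimal sketch (the acceptance thresholds are left to the user):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def loop_closure_error(T_RC, T_RI, T_IC):
    """Residual of T_RC ~= T_RI @ T_IC, returned as a rotation angle
    (degrees) and a translation norm (meters)."""
    E = np.linalg.inv(T_RI @ T_IC) @ T_RC
    ang = np.degrees(np.linalg.norm(
        Rotation.from_matrix(E[:3, :3]).as_rotvec()))
    return ang, np.linalg.norm(E[:3, 3])
```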

    E. Robotic Arm Calibration

Because of the structural parameter errors of the robot and robotic dynamics effects, the actual trajectory of the robot may deviate somewhat from the programmed values. However, the errors caused by these issues can be compensated by calibrating the kinematic model together with the robot dynamics factors (e.g., centrifugal force, Coriolis force, dynamic coupling [38]). The robotic arm calibration model is briefly introduced in this section.

The robotic arm can be described as a multi-link mechanism connected by joints [49]. Its kinematics can be formulated by the D-H model, described as

$$T_{i-1}^{i}=\begin{bmatrix} c\theta_{i} & -s\theta_{i}c\alpha_{i} & s\theta_{i}s\alpha_{i} & a_{i}c\theta_{i}\\ s\theta_{i} & c\theta_{i}c\alpha_{i} & -c\theta_{i}s\alpha_{i} & a_{i}s\theta_{i}\\ 0 & s\alpha_{i} & c\alpha_{i} & d_{i}\\ 0 & 0 & 0 & 1 \end{bmatrix}\qquad(34)$$

where $T_{i-1}^{i}$ symbolizes the transformation matrix between the successive links $i-1$ and $i$; $c\theta$ and $s\theta$ denote the trigonometric functions $\cos\theta$ and $\sin\theta$, respectively. The parameters $\{\theta_{i},\alpha_{i},d_{i},a_{i}\}$ are the variables associated with joint $i$ and link $i$, i.e., the joint angle, link twist, link offset, and link length, respectively. The link between successive joints is fully described by the D-H model.
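For reference, (34) can be evaluated directly; the sketch below builds the per-link transform and chains it along the arm (function names are ours):

```python
import numpy as np

def dh_transform(theta, alpha, d, a):
    """Standard D-H link transform of (34): joint angle theta, link
    twist alpha, link offset d, link length a."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def forward_kinematics(dh_rows):
    """Chain the link transforms from the base to the wrist."""
    T = np.eye(4)
    for row in dh_rows:
        T = T @ dh_transform(*row)
    return T
```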

Additionally, the moving trajectory can also be affected by the robot dynamics, whose model can be described in the form of the Lagrangian equation as follows:

$$\tau=M(\theta)\ddot{\theta}+C(\theta,\dot{\theta})\dot{\theta}+g(\theta)$$

where $\tau$ symbolizes the torque and $\theta$ the joint angle; $M(\theta)$ is the inertia matrix; $C(\theta,\dot{\theta})\dot{\theta}$ stands for the Coriolis and centrifugal forces; $g(\theta)$ is the gravity term.

In this work, the least-squares method is used to calibrate the robot. It computes the inherent geometric parameters (i.e., link parameter errors and joint rotation errors) by minimizing the errors between the theoretical data and the measurements. In this paper, the 3-D schematic model of a position measuring system with a drawstring displacement configuration is adopted. The robot kinematic errors are then calculated by minimizing the loss function of the errors between the measured end-effector position and its prediction from the D-H model in (34). Please refer to [50] for more details of the calibration implementation.
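A schematic version of this least-squares kinematic calibration is sketched below, reusing the forward_kinematics helper from the previous sketch; the parameterization of the corrections is our simplification for illustration, not the exact formulation of [50]:

```python
import numpy as np
from scipy.optimize import least_squares

def kinematic_residuals(delta, nominal_dh, joint_samples, measured_pos):
    """Error between measured end-effector positions and the D-H forward
    kinematics with corrected parameters; delta is the flattened
    correction to all {theta, alpha, d, a} entries."""
    dh = nominal_dh + delta.reshape(nominal_dh.shape)
    res = []
    for q, p_meas in zip(joint_samples, measured_pos):
        rows = dh.copy()
        rows[:, 0] += q                  # add the commanded joint angles
        # forward_kinematics as defined in the D-H sketch above
        res.append(forward_kinematics(rows)[:3, 3] - p_meas)
    return np.concatenate(res)

# sol = least_squares(kinematic_residuals, np.zeros(nominal_dh.size),
#                     args=(nominal_dh, joint_samples, measured_pos))
```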

    V. EXPERIMENTAL RESULTS AND ANALYSIS

    A. Experimental Platform

In our developed system (shown in Fig. 3), the robotic arm is a Kinova Jaco 2 with 6 rotational joints. The total weight of the robot is 5.2 kg. Its maximum reach is 98 cm and its maximum linear arm speed is 20 cm/s. Its communication protocol is RS-485 over a USB 2.0 port, and commands are issued from a Linux Ubuntu & ROS system. Its maximum payload is 2.2 kg.

Fig. 3. The experimental platform consists of the camera, IMU, and robotic arm. The Intel D435i (integrating the monocular camera and IMU) is rigidly attached to the 6th joint of the robotic arm (Kinova Jaco 2). It is programmed to move in a predefined pattern to capture the calibration board and the body inertial measurements.

Fig. 4. Designed robotic-arm moving trajectory. It moves in a spiral pattern. The x-y-z translations and yaw-pitch-roll rotations are uniformly excited.

An Intel D435i, integrating the camera and a Bosch IMU, is fixed to the 6th joint. The monocular camera samples images at 20 fps with a resolution of 640×480. The camera intrinsic parameters are calibrated with a standard deviation of less than 0.1 pixel. The IMU samples inertial measurements at 200 Hz. The inertial noise terms are estimated using Allan variance analysis [45]. The initial acceleration noise and its bias noise are 0.00014 m/s² and 0.00004 m/s³, respectively; the gyroscopic noise and its bias noise are 0.00035 rad/s and 0.000055 rad/s², respectively. The image and inertial data, together with the robot poses, are synchronized by timestamps. The calibration parameters are calculated on a laptop with a 2.4 GHz Intel Core i7 and 8 GB RAM.
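The Allan variance analysis mentioned here can be reproduced in a few lines; the sketch below computes the overlapping Allan deviation of a static IMU log, from which the noise density is read off the −1/2-slope region of the curve and the bias instability from the flat region (names are ours):

```python
import numpy as np

def allan_deviation(x, fs, taus):
    """Overlapping Allan deviation of a rate signal x (e.g., one gyro
    axis at rest) sampled at fs Hz, for averaging times taus (seconds)."""
    cum = np.cumsum(np.insert(x, 0, 0.0))
    sigmas = []
    for tau in taus:
        m = int(tau * fs)                     # samples per cluster
        if m < 1 or 2 * m >= len(x):
            sigmas.append(np.nan)
            continue
        avg = (cum[m:] - cum[:-m]) / m        # overlapping cluster means
        d = avg[m:] - avg[:-m]
        sigmas.append(np.sqrt(0.5 * np.mean(d ** 2)))
    return np.array(sigmas)

# taus = np.logspace(-2, 2, 50); adev = allan_deviation(gyro_x, 200.0, taus)
```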

    B. Repeatability Test

Since the poses read from the robot arm are used to define the trajectory for calibration consistency, we first test the robot's moving repeatability. The robot is programmed to follow a predefined trajectory and is expected to move to and return from the same set of fixed points. The positions and orientations read from the robot are also expected to be precise. The robotic arm is tested by automatically moving its end-effector to a set of specified spots on a calibration table. A 10×10 grid of chessboard points is designed, each denoted by a 1×1 mm square. The robot traverses the chessboard points cyclically more than 100 times. It was found that the robot arm always reaches the grid points with a position error of less than 0.1 mm and a rotation error of less than 0.2°. Hence, the robot's moving repeatability satisfies our requirement.

    C. Designed Trajectory

In the test, we use AprilTag [51] as the calibration board. The robotic arm, with the joint-mounted monocular camera and IMU, moves in front of the calibration board to capture the AprilTag patterns in the image sequence. Notably, the moving trajectory proceeds in a spiral fashion, as shown in Fig. 4. The trajectory projections on the x-y, x-z, and y-z planes are shown in Figs. 5, 6, and 7, respectively. This type of trajectory is intentionally designed to fully and uniformly excite the yaw-pitch-roll rotations and x-y-z translations. Besides, to ensure that the calibration tag patterns are captured in the image sequence, the poses are designed so that the camera lens (optical axis, denoted by the blue axis in Fig. 4) points toward the calibration board.

    Fig. 5. Manipulator trajectory projected onto x-y plane.

    Fig. 6. Manipulator trajectory projected onto x-z plane.

    Fig. 7. Manipulator trajectory projected onto y-z plane.

    The robot spiral moving pattern is

where s_x = 0.3, s_y = −2, and s_z = 0.3 denote the moving scales along the axes; ω = 1 is the angular rate; θ = 1, ζ = 80, and φ = 1 are the initial phases.

The corresponding yaw-pitch-roll angles are parameterized by

where ξ_y = 0.001, ξ_p = 0.01, and ξ_r = 0.001 are the angular scales on yaw, pitch, and roll, respectively; α = 2, β = −1, and γ = 1 are the initial angular phases.
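The exact expressions of the spiral and its angle parameterization did not survive extraction. Purely for illustration, the sketch below generates a trajectory with the behavior described in this section, sinusoidal x/z and a dog-leg sweep on y, using the translation ranges reported below; the functional form is our guess, not the paper's parameterization:

```python
import numpy as np

# Hypothetical reconstruction: the form below is inferred from the
# surrounding description, not taken from the paper's equations.
t = np.linspace(0.0, 8.0 * np.pi, 80)         # 80 poses, as stated below
x = 0.1285 * np.sin(t)                        # sinusoidal, within +/-12.85 cm
z = 0.1285 * np.cos(t)
y = 0.1720 + 0.6280 * np.abs(np.sin(t / 4))   # dog-leg sweep over [17.2, 80] cm
waypoints = np.stack([x, y, z], axis=1)       # spiral-like sweep in meters
```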

In the robot's Cartesian mode, there are some configurations in which the robot loses one or more degrees of freedom (i.e., the robot is unable to move in one direction or another). In order to minimize the odds of singularity, we used the ActivateAutomaticSingularityAvoidance API function and set it to false. However, it should be noted that the designed moving trajectory might still be affected by singular positions inside the robot workspace when its inverse kinematics is calculated using ROS MoveIt [52]. In the presence of a singularity, the robotic arm will move arbitrarily or stay still (for instance, when the arm is at full reach, it cannot move any farther in the direction it is reaching), rather than proceed along the predefined path. To solve this issue, we tried several initial poses and let the robot arm execute the predefined spiral trajectory; the initial pose and corresponding trajectory free of singularities during the movement were then selected. Additionally, the joint angle series are computed in advance, which saves the computational time of deriving the inverse kinematics during movement. In total, there are 80 poses during the manipulator movement. The robot followed the spiral trajectory and the whole movement lasted about 2 minutes. This was repeated 20 times for the spiral trajectory. The computational time for each trajectory was approximately 10 minutes.

The translations of the designed trajectory are shown in Fig. 8. The translations on the x and z axes (scale: [−12.85 cm, 12.85 cm]) are designed to proceed in a sinusoidal form. The translation on the y axis proceeds in a dog-leg path (scale: [17.20 cm, 80.00 cm]), moving back and forth a few times.

The angular curves of the moving trajectory are shown in Fig. 9. To ensure that the camera y-axis points approximately toward the calibration board, we designed the roll (angle scale: [0°, 17.76°]) and yaw (angle scale: [28.71°, 46.47°]) to move within relatively small ranges. Comparably, the pitch (angle scale: [2.23°, 233.14°]) about the y-axis changes in a dog-leg style over a large range.

Fig. 8. Trajectory curves with respect to x, y, and z.

    Fig. 9. Trajectory curves with respect to yaw-pitch-roll.

    D. Results and Analysis

In our experiments, more than twenty groups of visual-inertial and robot pose data were collected. Each test lasted about 60 seconds. The acceleration biases are [0.105 m/s², −0.131 m/s², −0.065 m/s²]; the gyroscopic biases are [−0.0068 rad/s, −0.0008 rad/s, 0.0014 rad/s].

Our method has been compared with the state-of-the-art, i.e., ICSR [53] and Kalibr [25]. The designed trajectories for the traditional calibrations include the typical motions, e.g., circular motion, zig-zag motion, rosette motion, and irregular rotations and translations that excite the 6-DOF motions. The tests have been repeated more than 20 times.

In Fig. 10, it can be observed that ours achieves more stable and consistent outcomes (distinctly smaller standard deviations on the x-y-z translations and yaw-pitch-roll rotations) than the other two methods, partly because the geometric constraint among the camera, IMU, and robotic arm is added to the calibration model. Also, the robotic arm moves in a predefined spiral fashion, which improves moving consistency, accuracy, and stability.

Fig. 10. Comparisons of the calibration methods: ours, ICSR [53], and Kalibr [25]. It includes comparisons of the rotations (yaw, pitch, roll) and translations (x, y, z).

    Fig. 11. Comparisons on image reprojection error between (a) Ours, (b) ICSR [53], and (c) Kalibr [25].

    TABLE II COMPARISONS OF CALIBRATION METHODS BETWEEN OURS, ICSR, AND KALIBR

The corresponding average and root mean squared error (RMSE) of the translations and rotations are shown in Table II. The standard deviations (Std) of our method on yaw, pitch, and roll are 0.494°, 0.646°, and 0.621°; its translation Std on x, y, and z are 0.004 m, 0.009 m, and 0.012 m, respectively. From the results, we can see that ours achieves competitive results compared with the other two methods in terms of both average and RMSE.

The reprojection errors are also analyzed and compared. As shown in Fig. 11(a), most of our reprojection errors fall within a circle of less than 0.25 pixel. Comparably, the convergence areas of ICSR [53] (Fig. 11(b)) and Kalibr [25] (Fig. 11(c)) are relatively larger, approximately 0.5 pixel.

    Fig. 12. Comparisons on acceleration errors and angular velocity errors using ours, ICSR, and Kalibr. (a)–(c) Acceleration errors in 2d; (d)–(f) Gyroscopic errors in 2d; (g)–(i) Acceleration errors in 3d; (j)–(l) Gyroscopic errors in 3d.

We have also compared the acceleration errors and angular velocity errors in both 2d and 3d forms. As can be seen in Figs. 12(a)–12(f), our acceleration and angular velocity errors all stay within a small range. By contrast, there exist severe error jitters for both ICSR and Kalibr, which can be attributed to unexpected disturbances and manual jitters during the movement. In Figs. 12(g)–12(i), the acceleration errors are plotted in 3d form. The acceleration convergence sphere radius of our method is approximately 0.429 m/s², while the other sphere radii are roughly 1 m/s². In a similar manner, the gyroscopic errors are plotted in Figs. 12(j)–12(l): the convergence radius of our method is 0.08 rad/s, while the radii for ICSR and Kalibr are 0.107 rad/s and 0.114 rad/s, respectively. From the results, it can be observed that our IMU errors are comparably smaller than those of the state-of-the-art, thanks to the spiral moving trajectory design and the decoupled estimation of translation and rotation.

We have also compared the computational time of ours, ICSR [53], and Kalibr [25] to evaluate efficiency. As can be observed in Table III, the spiral moving trajectory was performed twice. The dataset for the first test has approximately 2400 images and its capturing sequence lasts about 120 seconds; the dataset for the second test has approximately 3600 images and lasts about 180 seconds. The symbol “*” represents IMU preintegration. From the results, ours with IMU preintegration takes 1232 seconds (the least amount of time), which can largely be attributed to the inertial preintegration strategy: IMU preintegration saves time in the quaternion iterations that feed the velocity and position models. Comparably, ours without preintegration consumes more time in deriving the calibration parameters. ICSR and ours show comparable computational time, but ours achieves higher calibration accuracy. Noticeably, Kalibr requires a large amount of time, especially on the larger dataset, due to the minimization over the large bundle of states being processed.

TABLE III COMPARISONS ON THE COMPUTATIONAL TIME BETWEEN OURS (WITH IMU PREINTEGRATION), OURS (WITHOUT IMU PREINTEGRATION), ICSR [53], AND KALIBR [25] USING THE SPIRAL MOVING TRAJECTORY. THE 1ST TEST SET HAS APPROXIMATELY 2400 IMAGES AND THE CAPTURING SEQUENCE LASTS ABOUT 120 SECONDS. THE 2ND TEST SET HAS APPROXIMATELY 3600 IMAGES AND THE CAPTURING SEQUENCE LASTS ABOUT 180 SECONDS. “*” MEANS IMU PREINTEGRATION

    VI. CONCLUSION AND FUTURE WORK

In this paper, we have developed a unifying monocular visual-inertial and robotic-arm calibration framework. It geometrically correlates the spatial relationship between the sensing units and the robotic arm. Besides, we have designed the calibration moving trajectory in a spiral pattern. Through this design, the excitations of the yaw-pitch-roll rotations and x-y-z translations can be performed uniformly and consistently. The performance of the calibration has been evaluated on our developed platform. In the experiments, the standard deviations of the rotations and translations are less than 0.7° and 0.012 m, respectively, which proves its advantages in visual-inertial-robot calibration.

One drawback of our current calibration method is the lack of systematic comparisons of typical trajectories, such as the zig-zag, circular, radial, and rosette trajectories. Thus, in the future, we plan to perform more tests and analysis on these trajectories. Besides, we will perform the robot calibration based on [38] to check the repeatability and accuracy of the manipulator qualitatively. Eventually, we plan to design a trajectory that avoids singularities in the robot's Cartesian space, since the robot experiences inevitable singularities in the moving process.
