Merge pull request #79 from rpng/feat_zupt
Development v2.2 - Zero Velocity Update
rpng-guest authored Jul 8, 2020
2 parents 1c8ff20 + 0606e7f commit fd6d0a9
Showing 53 changed files with 1,341,897 additions and 20,211 deletions.
2 changes: 1 addition & 1 deletion Doxyfile
Original file line number Diff line number Diff line change
@@ -811,7 +811,7 @@ RECURSIVE = YES
# Note that relative paths are relative to the directory from which doxygen is
# run.

EXCLUDE =
EXCLUDE = ov_core/src/utils/CLI11.hpp

# The EXCLUDE_SYMLINKS tag can be used to select whether or not files or
# directories that are symbolic links (a Unix file system feature) are excluded
4 changes: 1 addition & 3 deletions Doxyfile-mcss
@@ -32,12 +32,10 @@ ALIASES += \
"m_keyword{3}=@xmlonly<mcss:search xmlns:mcss=\"http://mcss.mosra.cz/doxygen/\" mcss:keyword=\"\1\" mcss:title=\"\2\" mcss:suffix-length=\"\3\" />@endxmlonly" \
"m_enum_values_as_keywords=@xmlonly<mcss:search xmlns:mcss=\"http://mcss.mosra.cz/doxygen/\" mcss:enum-values-as-keywords=\"true\" />@endxmlonly"


##! M_THEME_COLOR = #2f73a3
##! M_FAVICON = docs/img/favicon-light.png


##! M_LINKS_NAVBAR1 = pages namespaceov__core namespaceov__eval namespaceov__msckf annotated
##! M_LINKS_NAVBAR1 = pages namespaceov__core namespaceov__type namespaceov__msckf namespaceov__eval annotated
##! M_LINKS_NAVBAR2 = \
##! "<a href=\"https://github.com/rpng/open_vins/">GitHub</a>"

2 changes: 1 addition & 1 deletion ReadMe.md
@@ -74,7 +74,7 @@ This is a modification of the code originally developed by the HKUST aerial robo
Here we stress that this is a loosely coupled method, thus no information is returned to the estimator to improve the underlying OpenVINS odometry.
This codebase has been modified in a few key areas including: exposing more loop closure parameters, subscribing to camera intrinsics, simplifying configuration such that only topics need to be supplied, and some tweaks to the loop closure detection to improve frequency.

* **[ov_maplab](https://github.com/rpng/ov_maplab)** -
This codebase contains the interface wrapper for exporting visual-inertial runs from [OpenVINS](https://github.com/rpng/open_vins) into the ViMap structure taken by [maplab](https://github.com/ethz-asl/maplab).
The state estimates and raw images are appended to the ViMap as OpenVINS runs through a dataset.
After completion of the dataset, features are re-extracted and triangulated with maplab's feature system.
47 changes: 46 additions & 1 deletion docs/bib/extra.bib
@@ -170,4 +170,49 @@ @article{Qin2018TRO
year={2018},
publisher={IEEE},
url = {https://arxiv.org/pdf/1708.03852.pdf},
}

@article{Jeong2019IJRR,
title={Complex urban dataset with multi-level sensors from highly diverse urban environments},
author={Jeong, Jinyong and Cho, Younggun and Shin, Young-Sik and Roh, Hyunchul and Kim, Ayoung},
journal={The International Journal of Robotics Research},
volume={38},
number={6},
pages={642--657},
year={2019},
publisher={SAGE Publications Sage UK: London, England}
}





@inproceedings{Wagstaff2017IPIN,
title={Improving foot-mounted inertial navigation through real-time motion classification},
author={Wagstaff, Brandon and Peretroukhin, Valentin and Kelly, Jonathan},
booktitle={2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN)},
pages={1--8},
year={2017},
organization={IEEE},
url = {https://arxiv.org/pdf/1707.01152.pdf},
}

@article{Ramanandan2011TITS,
title={Inertial navigation aiding by stationary updates},
author={Ramanandan, Arvind and Chen, Anning and Farrell, Jay A},
journal={IEEE Transactions on Intelligent Transportation Systems},
volume={13},
number={1},
pages={235--248},
year={2011},
publisher={IEEE},
}

@inproceedings{Davidson2009ENC,
title={Improved vehicle positioning in urban environment through integration of GPS and low-cost inertial sensors},
author={Davidson, Pavel and Hautam{\"a}ki, Jani and Collin, Jussi and Takala, Jarmo},
booktitle={Proceedings of the European Navigation Conference (ENC), Naples, Italy},
pages={3--6},
year={2009},
url = {http://www.tkt.cs.tut.fi/research/nappo_files/1_C2.pdf},
}
6 changes: 3 additions & 3 deletions docs/css/custom.css
@@ -16,9 +16,9 @@ main h1 + p, h2 + p, h3 + p {
}

/* math equations by default have some extra space at the top */
div .m-math {
margin-top: -1.5rem;
}
/*div .m-math {*/
/* margin-top: -1.5rem;*/
/*}*/

/* handle youtube images on the readme page */
a .m-image {
22 changes: 21 additions & 1 deletion docs/css/m-udel+documentation.compiled.css
@@ -3,7 +3,7 @@
/*
This file is part of m.css.
Copyright © 2017, 2018, 2019 Vladimír Vondruš <[email protected]>
Copyright © 2017, 2018, 2019, 2020 Vladimír Vondruš <[email protected]>
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
@@ -422,6 +422,9 @@ mark {
background-color: #f0c63e;
color: #4c93d3;
}
.m-link-wrap {
word-break: break-all;
}
pre, code {
font-family: 'Source Code Pro', monospace, monospace, monospace;
font-size: 0.8em;
@@ -2556,6 +2559,23 @@ article:last-child, article section:last-child { margin-bottom: 0; }
.m-code .il { color: #0000cf; font-weight: bold }

.m-console .hll { background-color: #ffffcc }
.m-console .g-AnsiBackgroundBlack { background-color: #232627 }
.m-console .g-AnsiBackgroundBlue { background-color: #1d99f3 }
.m-console .g-AnsiBackgroundBrightBlack { background-color: #7f8c8d }
.m-console .g-AnsiBackgroundBrightBlue { background-color: #3daee9 }
.m-console .g-AnsiBackgroundBrightCyan { background-color: #16a085 }
.m-console .g-AnsiBackgroundBrightGreen { background-color: #1cdc9a }
.m-console .g-AnsiBackgroundBrightMagenta { background-color: #8e44ad }
.m-console .g-AnsiBackgroundBrightRed { background-color: #c0392b }
.m-console .g-AnsiBackgroundBrightWhite { background-color: #ffffff }
.m-console .g-AnsiBackgroundBrightYellow { background-color: #fdbc4b }
.m-console .g-AnsiBackgroundCyan { background-color: #1abc9c }
.m-console .g-AnsiBackgroundDefault { background-color: #fcfcfc }
.m-console .g-AnsiBackgroundGreen { background-color: #11d116 }
.m-console .g-AnsiBackgroundMagenta { background-color: #9b59b6 }
.m-console .g-AnsiBackgroundRed { background-color: #ed1515 }
.m-console .g-AnsiBackgroundWhite { background-color: #fcfcfc }
.m-console .g-AnsiBackgroundYellow { background-color: #f67400 }
.m-console .g-AnsiBlack { color: #232627 }
.m-console .g-AnsiBlue { color: #1d99f3 }
.m-console .g-AnsiBrightBlack { color: #7f8c8d; font-weight: bold }
75 changes: 69 additions & 6 deletions docs/gs-datasets.dox
@@ -75,18 +75,35 @@ Note that we focus on the room datasets as full 6 dof pose collection is availab
@section gs-data-rpng RPNG OpenVINS Dataset

In addition to the community-maintained datasets, we have also released a few datasets of our own.
Please cite the OpenVINS paper if you use any of these datasets in your works
Please cite the OpenVINS paper if you use any of these datasets in your works.
Here are the specifics of the sensors that each dataset uses:

- ArUco Datasets:
- Core visual-inertial sensor is the [VI-Sensor](https://furgalep.github.io/bib/nikolic_icra14.pdf)
- Stereo global shutter images at 20 Hz
- ADIS16448 IMU at 200 Hz
- Kalibr calibration file can be found [here](https://drive.google.com/file/d/1I0C-z3ZrTKne4bdbgBI6CtH1Rk4EQim0/view?usp=sharing)
- Ironsides Datasets:
- Core visual-inertial sensor is the [ironsides](https://arxiv.org/pdf/1710.00893v1.pdf)
- Has two [Reach RTK](https://docs.emlid.com/reach/) units, one subscribed to a base station for corrections
- Stereo global shutter fisheye images at 20 Hz
- InvenSense IMU at 200 Hz
- GPS fixes at 5 Hz (/reach01/tcpfix has corrections from [NYSNet](https://cors.dot.ny.gov/sbc))
- Kalibr calibration file can be found [here](https://drive.google.com/file/d/1bhn0GrIYNEeAabQAbWoP8l_514cJ0KrZ/view?usp=sharing)

Most of these datasets do not have perfect calibration parameters, and some are not time synchronised.
Thus, please ensure that you have enabled online calibration of these parameters.

Additionally, there is no groundtruth for these datasets, but some do include GPS messages if you wish to compare relative to something.

@m_div{m-text-center}
| Dataset Name | Length (m) | Dataset Link | Groundtruth Traj. | Example Launch |
|-------------:|--------|--------------|------------------|------------------|
| ArUco Room 01 | 27 | [rosbag](https://drive.google.com/file/d/1ytjo8V6pCroaVd8-QSop7R4DbsvvKyRQ/view?usp=sharing) | none | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Room 02 | 93 | [rosbag](https://drive.google.com/file/d/1l_hnPUW6ufqxPtrLqRRHHI4mfGRZB1ha/view?usp=sharing) | none | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Hallway 01 | 190 | [rosbag](https://drive.google.com/file/d/1FQBo3uHqRd0qm8GUb50Q-sj5gukcwaoU/view?usp=sharing) | none | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Hallway 02 | 105 | [rosbag](https://drive.google.com/file/d/1oAbnV3MPOeaUSjnSc3g8t-pWV1nVjbys/view?usp=sharing) | none | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Room 01 | 27 | [rosbag](https://drive.google.com/file/d/1ytjo8V6pCroaVd8-QSop7R4DbsvvKyRQ/view?usp=sharing) | none | [launch aruco](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Room 02 | 93 | [rosbag](https://drive.google.com/file/d/1l_hnPUW6ufqxPtrLqRRHHI4mfGRZB1ha/view?usp=sharing) | none | [launch aruco](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Hallway 01 | 190 | [rosbag](https://drive.google.com/file/d/1FQBo3uHqRd0qm8GUb50Q-sj5gukcwaoU/view?usp=sharing) | none | [launch aruco](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| ArUco Hallway 02 | 105 | [rosbag](https://drive.google.com/file/d/1oAbnV3MPOeaUSjnSc3g8t-pWV1nVjbys/view?usp=sharing) | none | [launch aruco](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_aruco.launch) |
| Neighborhood 01 | 2300 | [rosbag](https://drive.google.com/file/d/1N07SDbaLEkq9pVEvi6oiHpavaRuFs3j2/view?usp=sharing) | none | [launch ironsides](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_ironsides.launch) |
| Neighborhood 02 | 7400 | [rosbag](https://drive.google.com/file/d/1QEUi40sO8OkVXEGF5JojiiZMHMSiSqtg/view?usp=sharing) | none | [launch ironsides](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_serial_ironsides.launch) |
@m_enddiv


@@ -130,4 +147,50 @@ Please take a look at the [run_ros_uzhfpv.sh](https://github.com/rpng/open_vins/






@section gs-data-kaist KAIST Urban Dataset

The [KAIST urban dataset](https://irap.kaist.ac.kr/dataset/index.html) @cite Jeong2019IJRR is a dataset focused on autonomous driving and localization in challenging, complex urban environments.
The dataset was collected in Korea with a vehicle equipped with a stereo camera pair, 2D SICK LiDARs, a 3D Velodyne LiDAR, an Xsens IMU, a fiber optic gyro (FoG), wheel encoders, and RTK GPS.
The cameras run at 10 Hz, while the Xsens IMU has a 100 Hz sensing rate.
A groundtruth "baseline" trajectory is also provided, which is the output of fusing the FoG, RTK GPS, and wheel encoders.

@code{.shell-session}
git clone https://github.com/irapkaist/file_player.git
git clone https://github.com/irapkaist/irp_sen_msgs.git
catkin build
rosrun file_player file_player
@endcode

To use the dataset, its file player can be used to publish the sensor information onto ROS.
See the above commands for which packages you need to clone into your ROS workspace.
One can record a rosbag manually and then use the serial OpenVINS processing node, or use the live node and manually play back the datasets.
It is important to *disable* the "skip stop section" option to ensure that we have continuous sensor feeds.
Typically we process the datasets at 2x rate so that we get a 20 Hz image feed and the datasets can be processed in a more efficient manner.

@m_class{m-block m-warning}

@par Dynamic Environments
A challenging open research question is how to handle dynamic objects seen by the cameras.
By default we rely on the 8-point RANSAC in our tracking front-end to reject these dynamic objects.
In most of the KAIST datasets the majority of the scene can be taken up by other moving vehicles, thus performance can suffer.
Please be aware of this fact.


@m_div{m-text-center}
| Dataset Name | Length (km) | Dataset Link | Groundtruth Traj. | Example Launch |
|-------------:|--------|--------------|------------------|------------------|
| Urban 28 | 11.47 | [download](https://irap.kaist.ac.kr/dataset/download_2.html) | [link](https://github.com/rpng/open_vins/tree/master/ov_data/kaist) | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_ros_kaist.launch) |
| Urban 38 | 11.42 | [download](https://irap.kaist.ac.kr/dataset/download_2.html) | [link](https://github.com/rpng/open_vins/tree/master/ov_data/kaist) | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_ros_kaist.launch) |
| Urban 39 | 11.06 | [download](https://irap.kaist.ac.kr/dataset/download_2.html) | [link](https://github.com/rpng/open_vins/tree/master/ov_data/kaist) | [launch](https://github.com/rpng/open_vins/blob/master/ov_msckf/launch/pgeneva_ros_kaist.launch) |
@m_enddiv







*/
2 changes: 1 addition & 1 deletion docs/update-initfeat.dox → docs/update-featinit.dox
@@ -24,7 +24,7 @@ Thus we have the following mapping to a feature seen from the current frame:

\f{align*}{
{}^{C_i}\mathbf{p}_f &
= z_{f} {}^{C_i}\mathbf{b}_{f}
= {}^{C_i}z_{f} {}^{C_i}\mathbf{b}_{f}
= {}^{C_i}z_{f}
\begin{bmatrix}
u_n \\ v_n \\ 1
79 changes: 79 additions & 0 deletions docs/update-zerovelocity.dox
@@ -0,0 +1,79 @@
/**


@page update-zerovelocity Zero Velocity Update

The key idea of the zero velocity update (ZUPT) is to allow the system to reduce its uncertainty by leveraging motion knowledge (i.e., the fact that the system is stationary).
This is of particular importance in cases where we have a monocular system without any temporal SLAM features.
In this case, if we are stationary we will be unable to triangulate features and thus will be unable to update the system.
This can be avoided by either using a stereo system or temporal SLAM features.
One problem that neither of these solves is dynamic environmental objects.
In a typical autonomous car scenario the sensor system will become stationary at stop lights, at which point dynamic objects, such as other cars crossing the intersection, can quickly corrupt the system.
A zero velocity update and skipping feature tracking can address these issues if we are able to classify the cases where the sensor system is at rest.


@section update-zerovelocity-meas Constant Velocity Synthetic Measurement

To perform the update, we create a synthetic "measurement" which says that the current **true** acceleration and angular velocity are zero.
As compared to simply saying the velocity is zero, this lets us model the uncertainty of these measurements based on the readings from our inertial measurement unit.

\f{align*}{
\mathbf{a} &= \mathbf{0} \\
\boldsymbol{\omega} &= \mathbf{0}
\f}

It is important to realize this is not strictly enforcing zero velocity, but really constant velocity.
This means we can have a false detection during constant-velocity motion (zero acceleration), but this can be easily addressed by a velocity magnitude check.
We have the following measurement equations relating the above synthetic "measurement" to the currently recorded inertial readings:

\f{align*}{
\mathbf{a} &= \mathbf{a}_m - \mathbf{b}_a - {}^{I_k}_G\mathbf{R}{}^G\mathbf{g} - \mathbf{n}_a \\
\boldsymbol{\omega} &= \boldsymbol{\omega}_m - \mathbf{b}_g - \mathbf{n}_g
\f}

It is important to note that here our actual measurement is the true \f$\mathbf{a}\f$ and \f$\boldsymbol{\omega}\f$, and thus we have the following residual, where we subtract our measurement function from the synthetic "measurement":

\f{align*}{
\tilde{\mathbf{z}} &=
\begin{bmatrix}
\mathbf{a} - \Big(\mathbf{a}_m - \mathbf{b}_a - {}^{I_k}_G\mathbf{R}{}^G\mathbf{g} - \mathbf{n}_a \Big) \\
\boldsymbol{\omega} - \Big(\boldsymbol{\omega}_m - \mathbf{b}_g - \mathbf{n}_g \Big)
\end{bmatrix} &=
\begin{bmatrix}
- \Big(\mathbf{a}_m - \mathbf{b}_a - {}^{I_k}_G\mathbf{R}{}^G\mathbf{g} - \mathbf{n}_a \Big) \\
- \Big(\boldsymbol{\omega}_m - \mathbf{b}_g - \mathbf{n}_g \Big)
\end{bmatrix}
\f}

where we have the following Jacobians with respect to our state:

\f{align*}{
\frac{\partial \tilde{\mathbf{z}}}{\partial {}^{I_k}_{G}\mathbf{R}} &= - \left\lfloor {}^{I_k}_G\mathbf{R}{}^G\mathbf{g} \times\right\rfloor \\
\frac{\partial \tilde{\mathbf{z}}}{\partial \mathbf{b}_a} &= \frac{\partial \tilde{\mathbf{z}}}{\partial \mathbf{b}_g} = - \mathbf{I}_{3\times 3}
\f}
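
The residual and Jacobians above can be sketched in code. This is a minimal NumPy illustration of this page's equations, not the actual OpenVINS C++ implementation; the function names and the assumed error-state ordering \f$[\boldsymbol{\theta}, \mathbf{b}_a, \mathbf{b}_g]\f$ are ours, and the signs are transcribed directly from the Jacobians stated above.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def zupt_residual_and_jacobian(a_m, w_m, b_a, b_g, R_GtoI, g_global):
    """Residual and Jacobian of the synthetic zero-motion "measurement".

    The noise terms are dropped from the residual (they are modeled in R),
    and the error state here is assumed ordered [theta(3), b_a(3), b_g(3)].
    """
    Rg = R_GtoI @ g_global          # gravity rotated into the IMU frame
    res_a = -(a_m - b_a - Rg)       # accelerometer part of z-tilde
    res_w = -(w_m - b_g)            # gyroscope part of z-tilde
    res = np.concatenate([res_a, res_w])
    H = np.zeros((6, 9))
    H[0:3, 0:3] = -skew(Rg)         # d(z-tilde)/d(theta)
    H[0:3, 3:6] = -np.eye(3)        # d(z-tilde)/d(b_a)
    H[3:6, 6:9] = -np.eye(3)        # d(z-tilde)/d(b_g)
    return res, H
```

When the platform is truly static and the biases are well estimated, this residual is near zero, which is what keeps the chi-squared statistic of the next section small.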





@section update-zerovelocity-detect Zero Velocity Detection

Zero velocity detection is itself a challenging problem which many different works have tried to address @cite Wagstaff2017IPIN, @cite Ramanandan2011TITS, @cite Davidson2009ENC.
Most works boil down to simple thresholding, where the goal is to determine the optimal threshold which best classifies the zero-velocity portions of the trajectory.
Other works, @cite Wagstaff2017IPIN and @cite Ramanandan2011TITS, have looked at more complicated methods and tried to address the fact that this threshold can depend on the type of motion (such as running vs. walking) and on the characteristics of the platform the sensor is mounted on (e.g., we want to ignore vehicle engine vibrations and other non-essential vibrations).


We approach this detection problem by tuning a \f$\chi^2\f$ (chi-squared) threshold based on the measurement model above.
It is important to note that we also have a velocity magnitude check, which is aimed at rejecting constant-velocity cases with non-zero magnitude.
More specifically, we perform the following threshold check to see if we are currently at zero velocity:

\f{align*}{
\tilde{\mathbf{z}}^\top(\mathbf{H}\mathbf{P}\mathbf{H}^\top + \mathbf{R})^{-1}\tilde{\mathbf{z}} < \chi^2
\f}

We found that in real-world experiments the inertial measurement noise \f$\mathbf{R}\f$ typically needs to be inflated by 50-100 times to allow for proper detection.
This hints that we are either using overconfident inertial noise values, or that there are additional frequencies (such as the vibration of motors) which inject additional noise.
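
The chi-squared gate plus velocity check can then be sketched as follows. This is again an illustrative NumPy snippet rather than the OpenVINS API; the 0.95-quantile threshold for six degrees of freedom is precomputed, and the 75x inflation and 0.3 m/s velocity cap are assumed example values, with the inflation chosen from the 50-100x range quoted above.

```python
import numpy as np

# chi2.ppf(0.95, df=6) precomputed so we avoid a SciPy dependency
CHI2_THRESH_6DOF_95 = 12.592

def is_zero_velocity(res, H, P, R_imu, v_est,
                     noise_inflation=75.0, max_velocity=0.3):
    """Return True if the synthetic ZUPT measurement passes both checks.

    noise_inflation and max_velocity are illustrative defaults, not
    values taken from the OpenVINS configuration.
    """
    R = noise_inflation * R_imu     # inflate the inertial noise (50-100x)
    S = H @ P @ H.T + R             # innovation covariance H P H^T + R
    mahalanobis = float(res @ np.linalg.solve(S, res))
    # chi-squared gate, plus a velocity-magnitude check to reject
    # constant (non-zero) velocity cases
    return bool(mahalanobis < CHI2_THRESH_6DOF_95
                and np.linalg.norm(v_est) < max_velocity)
```

A tiny residual with a small estimated velocity passes the gate, while a large residual (the platform is moving) or a large velocity estimate (constant-velocity motion) fails it.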



*/
1 change: 1 addition & 0 deletions docs/update.dox
Expand Up @@ -277,6 +277,7 @@ These are essentially the Kalman filter (or linear MMSE) update equations.
- @subpage update-delay --- How to perform delayed initialization
- @subpage update-null --- MSCKF nullspace projection
- @subpage update-compress --- MSCKF measurement compression
- @subpage update-zerovelocity --- Zero velocity stationary update


