# FAR-AVIO: Fast and Robust Acoustic-Visual-Inertial Odometry

<h3 align="center">Hao Wei, Peiji Wang, Qianhao Wang, Tong Qin, Fei Gao and Yulin Si</h3>

<h4 align="center">Zhejiang University, Shanghai Jiao Tong University</h4>

<figure><img src="/files/vYsMRQZSUIveP3HlhS0D" alt="" width="375"><figcaption></figcaption></figure>

***

<h4 align="center">FAR-AVIO: Fast and Robust Schur-Complement Based Acoustic-Visual-Inertial Fusion Odometry with Sensor Calibration</h4>

> Underwater environments impose severe challenges on visual–inertial odometry systems, as strong light attenuation, marine snow and turbidity, together with weakly exciting motions, degrade inertial observability and cause frequent tracking failures over long-term operation. While tightly coupled acoustic–visual–inertial fusion, typically implemented through an acoustic Doppler Velocity Log (DVL) integrated with visual–inertial measurements, can provide accurate state estimation, the associated graph-based optimization is often computationally prohibitive for real-time deployment on resource-constrained platforms. Here we present FAR-AVIO, a Schur-complement-based, tightly coupled acoustic-visual-inertial odometry framework tailored for underwater robots. FAR-AVIO embeds a Schur complement formulation into an Extended Kalman Filter (EKF), enabling joint pose–landmark optimization for accuracy while maintaining constant-time updates by efficiently marginalizing landmark states. On top of this backbone, we introduce Adaptive Weight Adjustment and Reliability Evaluation (AWARE), an online sensor health module that continuously assesses the reliability of visual, inertial and DVL measurements and adaptively regulates their sigma weights. We also develop an efficient online calibration scheme that jointly estimates the DVL–IMU extrinsics without dedicated calibration manoeuvres. Numerical simulations and real-world underwater experiments consistently show that FAR-AVIO outperforms state-of-the-art underwater SLAM baselines in both localization accuracy and computational efficiency, enabling robust operation on low-power embedded platforms. Our implementation has been released as open-source software.
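To illustrate the Schur-complement marginalization idea described above, the toy linear system below eliminates landmark variables from a joint pose–landmark problem so the reduced solve depends only on the pose dimension. This is a minimal numerical sketch, not the FAR-AVIO implementation; all sizes and names are illustrative.

```python
import numpy as np

# Toy joint least-squares system over pose states x_p and landmark states x_l:
#   [H_pp  H_pl] [x_p]   [b_p]
#   [H_lp  H_ll] [x_l] = [b_l]
# Marginalizing the (many) landmarks via the Schur complement keeps the
# pose update cost independent of the landmark count.
rng = np.random.default_rng(0)
n_pose, n_lmk = 3, 6
J = rng.standard_normal((20, n_pose + n_lmk))
H = J.T @ J + 1e-6 * np.eye(n_pose + n_lmk)   # information matrix (SPD)
b = rng.standard_normal(n_pose + n_lmk)

H_pp, H_pl = H[:n_pose, :n_pose], H[:n_pose, n_pose:]
H_lp, H_ll = H[n_pose:, :n_pose], H[n_pose:, n_pose:]
b_p, b_l = b[:n_pose], b[n_pose:]

# Schur complement: reduced pose system after marginalizing the landmarks
H_red = H_pp - H_pl @ np.linalg.solve(H_ll, H_lp)
b_red = b_p - H_pl @ np.linalg.solve(H_ll, b_l)
x_p = np.linalg.solve(H_red, b_red)

# Landmarks can be recovered by back-substitution if needed
x_l = np.linalg.solve(H_ll, b_l - H_lp @ x_p)

# Sanity check: matches the full joint solve
x_full = np.linalg.solve(H, b)
assert np.allclose(x_p, x_full[:n_pose])
assert np.allclose(x_l, x_full[n_pose:])
```

Because only the small reduced system is solved at each update, the per-step cost stays roughly constant as landmarks come and go, which is the property the abstract attributes to the EKF backbone.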

{% hint style="success" %}
**Our code will be open-sourced immediately after the paper is accepted. Please follow this homepage of the FAR-AVIO project for updates.**
{% endhint %}

### 1. System Running Example

* Running Example with Log System on a real-world underwater sequence <mark style="color:orange;">**(Acoustic-Visual-Inertial Mode)**</mark>; the sequence contains a continuous period with no available visual features.
* Meanwhile, the latest version of the code integrates loop detection (in the video, the <mark style="color:red;">**red line**</mark> represents the AVIO output and the <mark style="color:green;">**green line**</mark> represents the loop-closed trajectory) and a bundle adjustment (BA) module, which can output high-precision sparse point clouds for visual reconstruction (e.g., 3DGS).

{% embed url="<https://www.youtube.com/watch?v=aEZgDCzhw0Y>" %}

* AVIO with SLAM and reconstruction result: using the pose output from AVIO and the sparse point cloud provided by the backend SLAM module, we perform offline mapping (pose-prior BA and 3DGS rendering) and obtain the following result.

<div><figure><img src="/files/UIKHdoQRK1IaEPzAElhl" alt="" width="375"><figcaption></figcaption></figure> <figure><img src="/files/d4By4NeEkHXmHQnNrE8i" alt="" width="360"><figcaption></figcaption></figure></div>

{% embed url="<https://www.youtube.com/watch?v=rJlV7V9QoaM>" %}

* Running Example with an ROV in a real-world environment

{% embed url="<https://youtu.be/TcJ2xT9MGaA?si=jzNXSzNZZhMHuVLl>" %}

* Running environment and reconstruction result with our odometry and SLAM module outputs

<div><figure><img src="/files/mXPrad8vBmhjP5VBqz67" alt="" width="375"><figcaption></figcaption></figure> <figure><img src="/files/PbPE2khwmmPOzbw4apY5" alt=""><figcaption></figcaption></figure></div>

* Running Example with Log System on the EuRoC V2\_03\_difficult sequence <mark style="color:blue;">**(Visual-Inertial Mode)**</mark>

{% embed url="<https://www.youtube.com/watch?v=4ALftH5l3v0>" %}

* Running Example With Dense Reconstruction

{% embed url="<https://youtu.be/GL269ccnRPU>" %}

* Running Example with only FAR-VIDO

{% embed url="<https://www.youtube.com/watch?v=GdZUOCkN2XM>" %}

{% embed url="<https://www.youtube.com/watch?v=MMbPYhaiHis>" %}

### 2. System Running Results

* System-estimated trajectory and dense reconstruction results

<figure><img src="/files/tyxzvnwrHCZLLRvFjDZX" alt=""><figcaption></figcaption></figure>

* System Architecture and DVL Measurement

<div><figure><img src="/files/jkcUud58cOrNFp5wW8Es" alt=""><figcaption></figcaption></figure> <figure><img src="/files/VBtEfLqDYe8wWJLJ5i5v" alt=""><figcaption></figcaption></figure></div>

### 3. Experiment Results

### 3.1 Accuracy and Computational Load

<figure><img src="/files/Kw6jTlqIn64I6B0oOo08" alt=""><figcaption></figcaption></figure>

<figure><img src="/files/E3nn5RrGDFfvr9Z53Lcj" alt="" width="563"><figcaption></figcaption></figure>

<div><figure><img src="/files/yXYdSRRepfKrCWOxi9qo" alt=""><figcaption></figcaption></figure> <figure><img src="/files/Vg6FxMPvtb3DPetWUWFF" alt=""><figcaption></figcaption></figure></div>

### 3.2 Ablation Study

<figure><img src="/files/XnLmZnwNSwMN1QPvIGnN" alt="" width="563"><figcaption></figcaption></figure>

<figure><img src="/files/c3LwBacMzcXUij46NlaV" alt="" width="563"><figcaption></figcaption></figure>

### 4. Multi-Camera/Omni Fisheye Camera Support

{% hint style="success" %}

#### **Our latest version already supports multi-camera and omnidirectional fisheye camera setups, and also supports optical-flow sensor observation updates. Please stay tuned!**

Running demo with omni fisheye camera.
{% endhint %}

{% embed url="<https://youtu.be/5XsiPX9wu7k>" %}

### 5. Acknowledgments

Thanks to the researchers who have provided valuable suggestions for this work!


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://far-vido.gitbook.io/far-vido-docs/far-avio-fast-and-robust-acoustic-visual-inertial-odometry.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
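For example, a question can be URL-encoded and appended as the `ask` parameter. The sketch below uses only Python's standard library to build such a request URL; the question text is illustrative.

```python
from urllib.parse import urlencode

base = ("https://far-vido.gitbook.io/far-vido-docs/"
        "far-avio-fast-and-robust-acoustic-visual-inertial-odometry.md")

# URL-encode the natural-language question so spaces and punctuation are safe
query = urlencode({"ask": "Which sensors does FAR-AVIO fuse?"})
url = f"{base}?{query}"
print(url)  # perform an HTTP GET on this URL to receive the answer
```

Any HTTP client (e.g. `curl` or `urllib.request`) can then issue the GET request against the constructed URL.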
