Bio-Inspired Underwater ROV

| Jun 21, 2025

Overview

This is an ongoing research project in collaboration with the SeNSE Lab at Northwestern University, aiming to embody the sensorimotor loop of a seal in order to understand the cognitive effects of the data perceived through its whiskers.

To that end, I am developing an underwater ROV fitted with a seal-inspired whisker array, to emulate the perceptual information seals observe in real life and to develop a decision-making model based on that information. The final objective is an ROV capable of autonomously pursuing a moving object underwater, based solely on whisker perception.

SealBot

Hardware Setup

The Rover
SealBot
The soft-robotic whisker array system mimicking real-life seal whiskers
Real

Real

Real

Robotic System

Perception Stack

Ground-Truth

There are two main parts to this subsection. The first is the ground truth extracted from the external AprilTag system mounted on the ROV, observed by an external RealSense RGBD camera. This gives the 6-dimensional trajectory of the Rover’s movement in the physical world, expressed relative to a static tag outside the pool.
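The relative-pose computation described above can be sketched with homogeneous transforms in numpy. This is an illustrative computation only, with hypothetical frame names (`T_cam_static`, `T_cam_rover`), not the project's actual code:

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_cam_static, T_cam_rover):
    """Pose of the rover tag expressed in the static tag's frame:
    T_static_rover = inv(T_cam_static) @ T_cam_rover."""
    return np.linalg.inv(T_cam_static) @ T_cam_rover

# Toy example: both tags face the camera, rover tag 1 m farther away.
T_cam_static = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 2.0]))
T_cam_rover = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 3.0]))
T_rel = relative_pose(T_cam_static, T_cam_rover)
print(T_rel[:3, 3])  # translation of the rover tag relative to the static tag
```

Because both detections come from the same camera frame, chaining one inverse transform with the other cancels the camera's own pose, which is why a static reference tag is enough to recover the Rover's world-frame motion.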

Real

Observed by the external RealSense D435i

Real

Plot

The plot above shows the 3D pose trajectory, with black lines indicating the tag’s orientation at discrete time intervals. A plot like this exists for each of the five on-board tags, so the Rover’s pose is known at all times. The orange trajectory belongs to the static tag outside the pool, used to compute the relative pose.
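A plot of this kind can be reproduced with matplotlib's 3D axes. The trajectory below is synthetic placeholder data, drawn only to show the technique (a trajectory line plus short quivers marking orientation at sampled instants):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, so the script runs without a display
import matplotlib.pyplot as plt

# Hypothetical trajectory: positions (N, 3) and the tag's z-axis direction (N, 3).
t = np.linspace(0, 2 * np.pi, 30)
positions = np.stack([np.cos(t), np.sin(t), 0.1 * t], axis=1)
z_axes = np.stack([np.zeros_like(t), np.zeros_like(t), np.ones_like(t)], axis=1)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot(*positions.T, color="orange", label="tag trajectory")
# Short black lines showing tag orientation at every 5th sample.
ax.quiver(*positions[::5].T, *z_axes[::5].T, length=0.2, color="black")
ax.set_xlabel("x [m]"); ax.set_ylabel("y [m]"); ax.set_zlabel("z [m]")
ax.legend()
fig.savefig("pose_trajectory.png")
```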

Whisker-based Measurements

The second part of the perception stack is the information extracted from the AprilTags attached to the whisker array system mounted on the seal head, as viewed by the ROV’s monocular camera. This setup is submerged in water and is prone to issues caused by water reflections, waves, varying lighting conditions, etc. Note that the plot shows a single array, while the image (left) is the complete observed frame.

As observed by ROV Camera

Observed via the BlueOS Cockpit interface of the onboard monocular camera

Real

Plot
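Reflections and uneven lighting can be partially mitigated before the tags ever reach the detector. As one illustrative pre-processing step (not part of the current pipeline), a plain-numpy histogram equalisation stretches a low-contrast underwater frame; the tag detection itself would still be handled by an AprilTag library:

```python
import numpy as np

def equalize(gray):
    """Histogram-equalise an 8-bit grayscale frame to flatten lighting variation."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())  # normalise to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)         # per-intensity lookup table
    return lut[gray]

# Toy low-contrast frame: intensities squeezed into [100, 130].
frame = np.random.default_rng(1).integers(100, 131, size=(120, 160)).astype(np.uint8)
out = equalize(frame)
print(out.min(), out.max())  # spread across roughly the full 0-255 range
```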

In the next phase of the project, I will work on modelling this data into a predictable output.

Motion Stack

SealBot

Mechanical Design

Seal Head

Version 1:
SealBot

V1 -> V2 Improvements:

  1. The tag slots are inclined towards the ROV camera, giving it a much better viewing angle.
  2. Increased length, so the camera captures focused images.
  3. Threaded inserts for a more reliable screw mechanism.
  4. Screws accessed from the top, since the lower section is inaccessible most of the time.
  5. Covered top layer, which improves support for the whole head and prevents external lighting and reflection issues.
  6. Removed the mirrors, since the tilted slots now do their job and eliminate the image-inversion step during detection.
Version 2:
SealBot SealBot

External April Tag Mount

Version 1:
SealBot SealBot SealBot
Version 2:
SealBot
Version 3:
SealBot SealBot SealBot

Next Quarter Plans:

  1. Develop perceptual models that ideally reflect the cognitive relation between the seal whiskers’ sensory perception and the physical environment.
    • Try learning-based methods (an autoencoder could be an option).
    • Try classical methods (such as Principal Component Analysis).
  2. Develop a method (or research whether one already exists) to create a latent/embedding space for the 6D pose data of all 20 AprilTags, to be used as input to the model.
  3. Understand or develop a quantitative metric to analyse the accuracy of the models.
  4. A stretch goal is real-time feedback from the model to control the SealBot’s movement so it follows a dynamically moving object in front of it. This does not affect the main goal of developing the model, but could serve as a metric for judging its accuracy. Note: when I tried real-time detection of the static tags, I faced topic-synchronisation issues and the frame stream lags at times (not sure how to fix this yet; the camera-info publisher may also need to be calibrated in water).
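As a starting point for plans 1 and 2, the classical route can be sketched in a few lines of numpy: stack each frame’s 20 tag poses (6 numbers each) into a 120-dimensional vector and project onto the top principal components. The data here is random placeholder data and the latent dimension `k` is an arbitrary choice, so this only shows the mechanics, not a result:

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder data: 500 frames, 20 tags, 6D pose (x, y, z, roll, pitch, yaw).
poses = rng.normal(size=(500, 20, 6))
X = poses.reshape(500, -1)          # flatten to (frames, 120) feature vectors

# PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 8                               # latent-space size (a tunable choice)
Z = Xc @ Vt[:k].T                   # (500, 8) latent embedding of whisker state

explained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(f"top-{k} components explain {explained:.1%} of variance")
```

The explained-variance ratio doubles as a first quantitative metric (plan 3): it says how much of the whisker-array motion the latent space retains.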

Acknowledgements

I would like to thank Prof. Matthew Elwin, Dr. Mitra Hartmann, and Kevin Kleczka for their advice and guidance throughout the project, and Ishani Narwankar and James Oubre for their previous work building the Rover.