PROJECT DETAILS
COMPUTATIONAL SOLUTIONS & RESEARCH

Maritime Autonomous Navigation

Building perception systems for autonomous vessels in challenging nighttime and weather conditions using computer vision and machine learning.

Year

2022-2023

Technologies

Azure Edge, Computer Vision, IoT, Python, TensorFlow, YOLO, Kalman Filtering

Project Overview

Led the development of a hybrid approach to object detection for autonomous maritime vessels at Slalom, helping transform a traditional maritime hardware company into a modern software innovator. The solution addresses the unique challenges of maritime navigation, particularly in varied lighting and weather conditions, by implementing a dual-paradigm approach for day and night operations.

The project demonstrates a practical, cost-effective approach: readily available cameras and sensors paired with Azure edge devices in a lean, nimble setup that achieved actionable accuracy in target object detection during static tests.

[Image: Maritime Navigation System in Action]

Solution Architecture

The maritime perception system employs a dual-paradigm approach, using different detection methods for daytime and nighttime operation:

Occluded (Nighttime) Paradigm

  • Light Spot Detection and Tracking (LSDT)
  • Laplacian of Gaussian (LoG) filtering
  • Kalman filter tracking
  • Hungarian algorithm for data association

Non-Occluded (Daytime) Paradigm

  • YOLO object detection
  • Monocular Depth Estimation (MDE)
  • Color-coded bounding boxes based on distance
  • License plate detection capability
[Image: System architecture diagram for the maritime navigation system]
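The write-up above does not specify how the system decides which paradigm to run; the sketch below shows one plausible dispatcher, assuming the switch is driven by mean frame luminance. The threshold, function names, and pipeline callables are illustrative, not the production logic.

python
import cv2
import numpy as np

# Hypothetical luminance threshold for switching paradigms (an assumption;
# the real system may use time of day, sensor metadata, or an operator setting).
NIGHT_LUMA_THRESHOLD = 40  # mean 8-bit gray value

def select_paradigm(frame_bgr: np.ndarray) -> str:
    """Return 'occluded' (nighttime LSDT) or 'non_occluded' (daytime YOLO)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return "occluded" if gray.mean() < NIGHT_LUMA_THRESHOLD else "non_occluded"

def process_frame(frame_bgr, lsdt_pipeline, yolo_pipeline):
    # Dispatch the frame to the matching detection pipeline.
    if select_paradigm(frame_bgr) == "occluded":
        return lsdt_pipeline(frame_bgr)   # light spot detection + tracking
    return yolo_pipeline(frame_bgr)       # YOLO detection + depth estimation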

Technical Implementation

Nighttime Detection System (Starting Point)

The Light Spot Detection and Tracking (LSDT) system implements a sophisticated pipeline for identifying and tracking ship navigation lights at night.

The idea was to use the vessels' navigation lights to detect, track, and predict their positions. Under international maritime regulations, these lights must be visible, and their colors and behavior indicate the type of vessel. From the paper's abstract: "the light spots in the video images are detected through LOG and invalid spots are filtered by the gray threshold. Multiple targets are subsequently tracked by Kalman filtering and light spots are marked to determine properties in order to add and delete spots."

The following outlines the method we started with. Adapted from an academic paper, it allowed us to quickly get something working under occluded (nighttime) conditions and also provided a solution to the data association problem.

1. Light Spot Detection

The two-dimensional Gaussian kernel function is defined as:

$$G(x, y, \sigma) = \frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{x^2 + y^2}{2\sigma^2}\right)$$

The Laplacian of Gaussian (LoG) operator (second derivative of the Gaussian) is then derived as:

$$\nabla^2 G(x, y, \sigma) = \frac{x^2 + y^2 - 2\sigma^2}{2\pi\sigma^6}\exp\!\left(-\frac{x^2 + y^2}{2\sigma^2}\right)$$
[Image: Light Spot Detection Process]
Detection Pipeline (Validated on Yangtze River Dataset):

Image Preprocessing:
  • Input: 1920×1080 video frames
  • Gaussian noise reduction (σ = 1.5)
  • Region of interest cropping to remove timestamp overlays

Spot Enhancement:
  • LoG filter with 9×9 kernel size
  • Optimal gray threshold: 90 (experimentally determined)
  • Detection range: 7-12 spots per frame
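
A minimal sketch of this detection stage, assuming OpenCV and SciPy and reusing the parameters listed above (σ = 1.5, 9×9 smoothing kernel, gray threshold 90); the LoG cut-off and the helper names are illustrative, not the production code.

python
import cv2
import numpy as np
from scipy import ndimage

def detect_light_spots(frame_bgr: np.ndarray,
                       sigma: float = 1.5,
                       gray_threshold: int = 90,
                       min_area: int = 4) -> list[tuple[float, float]]:
    """Return centroids (x, y) of candidate navigation-light spots."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Gaussian noise reduction, then the LoG response (SciPy sizes the
    # LoG kernel from sigma; the 9x9 size here covers the explicit blur).
    smoothed = cv2.GaussianBlur(gray, (9, 9), sigma)
    log_response = ndimage.gaussian_laplace(smoothed.astype(np.float32), sigma)

    # Bright, compact spots give strong negative LoG responses; keep only
    # pixels that also exceed the gray threshold, filtering invalid spots.
    candidate_mask = (log_response < -np.std(log_response)) & (gray > gray_threshold)

    # Group candidate pixels into connected components and take centroids.
    labels, n = ndimage.label(candidate_mask)
    centroids = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if xs.size >= min_area:
            centroids.append((float(xs.mean()), float(ys.mean())))
    return centroids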

2. Multi-Target Tracking

Kalman Filter State Space Model:

$$\mathbf{x}_k = F\,\mathbf{x}_{k-1} + \mathbf{w}_{k-1}, \qquad \mathbf{z}_k = H\,\mathbf{x}_k + \mathbf{v}_k$$

where F is the state transition matrix, H is the measurement matrix, and $\mathbf{w}$, $\mathbf{v}$ are the process and measurement noise terms.
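
For the constant-velocity model with state vector $[x, y, \dot{x}, \dot{y}]^T$ used by the tracker, these matrices take the standard form (with frame interval $\Delta t$):

$$F = \begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad H = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}$$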

Track Initialization:
  • Minimum 3 consecutive detections
  • Maximum velocity constraint check
  • Direction consistency verification

Data Association:
  • Hungarian algorithm for optimal assignment
  • Gating distance: 30 pixels
  • Track maintenance score system
text
# Light Spot Detection and Tracking System
LSDT_System {
    ImageProcessing [OpenCV]
    │── PreProcessing
    │   ├── Grayscale Conversion
    │   ├── Gaussian_Filter(σ=1.5)
    │   └── CLAHE_Enhancement
    │── SpotDetection
    │   ├── LOG_Filter(size=9x9)
    │   │   ├── Gaussian_Smoothing
    │   │   └── Laplacian_Operator
    │   ├── ZeroCrossing_Detection
    │   └── Threshold_Application
    └── SpotTracking
        ├── KalmanFilter
        │   ├── State_Vector[x,y,dx,dy]
        │   ├── Prediction_Step
        │   └── Update_Step
        └── DataAssociation
            ├── Hungarian_Algorithm
            ├── Gating(radius=30px)
            └── TrackManagement
                ├── Score_System
                ├── Track_Confirmation(n=5)
                └── Track_Termination(n=10)
}
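
A condensed sketch of the tracking core outlined above (constant-velocity Kalman filter plus Hungarian data association with a 30-pixel gate), assuming SciPy; the noise covariances and the track bookkeeping are simplified placeholders rather than the production values.

python
import numpy as np
from scipy.optimize import linear_sum_assignment

DT, GATE_PX = 1.0, 30.0  # frame interval and gating radius from the spec above

F = np.array([[1, 0, DT, 0], [0, 1, 0, DT], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
Q = np.eye(4) * 0.5    # process noise (illustrative values)
R = np.eye(2) * 2.0    # measurement noise (illustrative values)

class Track:
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])  # state [x, y, dx, dy]
        self.P = np.eye(4) * 10.0
        self.misses = 0  # incremented by the caller when unmatched, for termination

    def predict(self):
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        return self.x[:2]

    def update(self, z):
        y = np.asarray(z, float) - H @ self.x          # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P
        self.misses = 0

def associate(tracks, detections):
    """Hungarian assignment of detections to predicted track positions."""
    if not tracks or not detections:
        return [], list(range(len(detections)))
    preds = np.array([t.predict() for t in tracks])
    dets = np.array(detections, float)
    cost = np.linalg.norm(preds[:, None, :] - dets[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches, unmatched = [], set(range(len(detections)))
    for r, c in zip(rows, cols):
        if cost[r, c] <= GATE_PX:          # gating: reject distant assignments
            matches.append((r, c))
            unmatched.discard(c)
    return matches, sorted(unmatched)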

Daytime Detection System

The daytime system leverages traditional RGB cameras with a specialized neural network trained on maritime scenes. We built a custom dataset of over 50,000 labeled maritime images to ensure accurate detection in various conditions and environments.

text
# Object Detection and Depth Estimation System
DaytimeVision {
    ObjectDetection [YOLO]
    │── Models
    │   ├── YOLOv3_MS_COCO
    │   ├── YOLOv4_Road_Obstacle
    │   └── YOLOv4_License_Plate
    │── BoundingBoxProcessing
    │   └── Centroid_Calculation
    └── DepthEstimation
        ├── MiDaS_MDE
        │   ├── Depth_Map_Generation
        │   └── Normalization
        └── Distance_Calculation
            ├── Thin_Lens_Model
            └── Relative_Depth_Mapping
}
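
A minimal sketch of the distance-based color coding, assuming detections from any of the YOLO models above and a normalized MiDaS-style depth map are already available; the depth thresholds and helper names are illustrative.

python
import cv2
import numpy as np

# Illustrative near/far cut-offs on the normalized (0-1) relative depth map;
# the production system maps relative depth to distance via the thin lens
# model and camera calibration rather than fixed thresholds.
NEAR, MID = 0.66, 0.33

def color_for_depth(rel_depth: float) -> tuple[int, int, int]:
    """BGR color: red = near, yellow = mid-range, green = far."""
    if rel_depth >= NEAR:
        return (0, 0, 255)
    if rel_depth >= MID:
        return (0, 255, 255)
    return (0, 255, 0)

def draw_color_coded_boxes(frame, boxes, depth_map):
    """boxes: (x, y, w, h) detections; depth_map: normalized inverse depth at
    frame resolution, where larger values mean closer objects (MiDaS convention)."""
    for (x, y, w, h) in boxes:
        cx, cy = x + w // 2, y + h // 2       # bounding-box centroid
        rel_depth = float(depth_map[cy, cx])  # sample relative depth at centroid
        cv2.rectangle(frame, (x, y), (x + w, y + h),
                      color_for_depth(rel_depth), 2)
    return frame

For objects of roughly known size, the thin lens model can then anchor this relative depth to an absolute distance, approximately d ≈ f · H_object / h_pixels with the focal length expressed in pixels.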

Hardware and Deployment

The perception system was deployed on custom hardware designed for the maritime environment.

Cameras

  • Proprietary cameras
  • Cost-effective solution

Edge Computing

  • Microsoft Azure Stack Edge server
  • On-ship processing capability
  • Offline operation support

Interface

  • iPad-based user interface
  • Unity gaming engine
  • Real-time visualization

Results and Performance

Static Testing

  • Medium-high accuracy in target object detection
  • Real-time processing capabilities

System Features

  • Color-coded bounding boxes based on distance
  • License plate detection capability
  • Satisfactory stability in LSDT method

Collision Avoidance System

The collision avoidance system was designed to prevent near-miss incidents between vessels. It combines AIS vessel tracking, chart data, and traffic prediction to provide real-time recommendations for safe navigation.

[Image: Simulation of a complex multi-vessel collision avoidance scenario]

The collision avoidance system was tested against 250+ historical near-miss incidents from the Northern European shipping lanes, demonstrating a 98.2% success rate in providing correct avoidance recommendations that complied with maritime regulations.
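
The core geometric check behind such recommendations is the closest point of approach (CPA) and time to CPA (TCPA) between own ship and an AIS-tracked target. A minimal sketch under a constant-velocity assumption; the alerting thresholds and names are illustrative, not the deployed rule set.

python
import numpy as np

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach (meters) and time to CPA (seconds) for two
    vessels moving with constant velocity. Positions in m, velocities in m/s."""
    rel_pos = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    speed_sq = float(rel_vel @ rel_vel)
    if speed_sq < 1e-9:                       # same course and speed
        return float(np.linalg.norm(rel_pos)), 0.0
    tcpa = max(0.0, -float(rel_pos @ rel_vel) / speed_sq)
    cpa = float(np.linalg.norm(rel_pos + rel_vel * tcpa))
    return cpa, tcpa

# Illustrative alerting rule: flag targets predicted to pass within 500 m
# during the next 10 minutes.
def needs_avoidance(own_pos, own_vel, tgt_pos, tgt_vel,
                    cpa_limit_m=500.0, tcpa_limit_s=600.0) -> bool:
    cpa, tcpa = cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel)
    return cpa < cpa_limit_m and tcpa < tcpa_limit_s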

Key Technologies:

  • Cloud-synchronized chart database with differential updates
  • Automatic Identification System (AIS) integration
  • Machine learning-based traffic prediction
  • NMEA 2000 and 0183 connectivity for onboard systems
  • Low-bandwidth satellite communication protocols

Deployment Results:

  • Deployed on 35+ commercial vessels in initial phase
  • 12.7% reduction in fuel consumption during typical routes
  • 42% decrease in navigation-related incident reports
  • System uptime of 99.97% over 18-month pilot period

Future Development

  • Dynamic scenario testing for MDE approach
  • LIDAR comparison and validation
  • 3D object detection integration
  • Country-specific vehicle datasets
  • Integration of newer MDE models

Additional Resources

For more information about the maritime navigation system, visit:
