HOVERING AUTONOMOUS UNDERWATER VEHICLE SYSTEM DESIGN IMPROVEMENTS AND PERFORMANCE EVALUATION RESULTS


Jerome Vaganay, Leo Gurfinkel, Michael Elkins, Daniel Jankins, Kimberly Shurn
Bluefin Robotics Corporation
237 Putnam Avenue, Cambridge, MA 02139
(617) 715-7000
vaganay@bluefinrobotics.com

Abstract. This paper describes the modifications made to the Hovering Autonomous Underwater Vehicle (HAUV) system during PMS-EOD's EOD HULS program, and presents results obtained with the new system design in hull and harbor search tasks. The HAUV project started under, and is still supported by, the Office of Naval Research (ONR). The R&D work currently underway within ONR's Confined Area Search group for the development of short-term enhanced capabilities is also presented.

1. Introduction

The Hovering Autonomous Underwater Vehicle (HAUV) was initially developed jointly by Bluefin Robotics and the Massachusetts Institute of Technology for ship hull inspection under Office of Naval Research (ONR) funding. In 2008, under PMS-EOD's Explosive Ordnance Disposal Hull Unmanned Underwater Localization System (EOD HULS) program, Bluefin conducted a major vehicle redesign in order to improve the system's performance and bring it up to the EOD HULS specification. The EOD HULS HAUV system, consisting of two vehicles and topside / support equipment, was delivered in October 2008. It underwent the Engineering Evaluation phase of the Requirement Compliance Test and Evaluation (RCT&E) of the EOD HULS acquisition program by the Government in December 2008 in San Diego, California. It is currently undergoing User Operational Evaluation with the Navy's Mobile Diving and Salvage Unit 2 (MDSU-2) in Little Creek, Virginia.

In parallel with this effort and under ONR funding, Bluefin continues to pursue enhanced short-term capabilities with its partners within ONR's Confined Area Search group. Specific activity areas include high-speed acoustic communications with Florida Atlantic University-SeaTech; Automatic Target Recognition, real-time mosaicing, and control with SeeByte Ltd; integrated Feature-Based Navigation (FBN) and control with MIT; and video-based navigation and mosaicing with the University of Michigan. Preliminary capabilities have been demonstrated in these areas during AUVFest 07 and AUVFest 08, and during engineering trials in the Boston area. Further development and integration work is underway with the objective of conducting a fully autonomous inspection of an entire hull in 2011.

This paper describes the improvements made to the initial vehicle design (HAUV1) that transitioned it from an R&D vehicle to an operational system usable by mobile US Navy units (HAUV2 system). The HAUV principle of operation and the system's concept of operation are described, followed by a detailed presentation of the system's capabilities for the inspection of ship hulls and berthing areas. For ship hull inspection, results are presented both in the non-complex and complex areas of the hull, where different inspection techniques are used. Finally, a section is devoted to the status of the development work currently underway through ONR funding.

2. HAUV Overview

The HAUV typically operates by hull-relative navigation and control. The vehicle points an actuated Doppler Velocity Log (DVL) at the ship's hull in order to maintain a fixed standoff distance and determine its hull-relative position. As the vehicle moves relative to the hull, an actuated sonar (DIDSON) images the hull with 100% coverage.
Details on hull-relative navigation and control can be found in [1]. On the surface, the operator monitors the DIDSON data transmitted in real-time over a fiber optic tether.
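
This hull-relative behavior can be thought of as a simple control loop: the DVL ranges to the hull drive a standoff-distance controller, while the imbalance between opposing beams drives the vehicle's orientation toward the hull. The sketch below is purely illustrative; the class name, gains, beam geometry, and control structure are assumptions for illustration, not the HAUV's actual flight code (which is described in [1]).

```python
import numpy as np

# Illustrative hull-relative standoff controller (hypothetical names and gains).
# The four DVL beam ranges are assumed to be reported as forward, aft, port,
# and starboard slant ranges to the hull; the real HAUV control laws are not public.

class StandoffController:
    def __init__(self, standoff_m=1.5, kp_dist=0.8, kp_pitch=0.5, kp_yaw=0.5):
        self.standoff_m = standoff_m   # desired distance to the hull [m]
        self.kp_dist = kp_dist         # proportional gain on standoff error
        self.kp_pitch = kp_pitch       # proportional gain on fore/aft beam imbalance
        self.kp_yaw = kp_yaw           # proportional gain on port/stbd beam imbalance

    def update(self, beam_ranges):
        """beam_ranges: [fwd, aft, port, stbd] DVL slant ranges to the hull [m]."""
        fwd, aft, port, stbd = beam_ranges
        mean_range = float(np.mean(beam_ranges))
        # Thrust command toward/away from the hull to hold the standoff distance.
        standoff_cmd = self.kp_dist * (mean_range - self.standoff_m)
        # Rotate so that the DVL faces the hull squarely (equalize opposing beams).
        pitch_cmd = self.kp_pitch * (fwd - aft)
        yaw_cmd = self.kp_yaw * (port - stbd)
        return standoff_cmd, pitch_cmd, yaw_cmd

# Example: vehicle slightly too far from the hull and tilted nose-in.
ctrl = StandoffController()
print(ctrl.update([1.9, 1.7, 1.8, 1.8]))
```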

Although the vehicle operates autonomously most of the time, the operator has the ability to stop the vehicle near a contact detected in the real-time DIDSON data stream. Under manual control, some degrees of freedom, such as the standoff distance, are still automatically controlled by the vehicle. The operator can move the vehicle side to side and up and down, call the contact, and resume the autonomous search.

For complex areas such as a ship's running gear, the DVL is pointed at the seafloor. The DIDSON, pointed at the hull, can be used to image or profile the hull. In the latter case, the sonar profiles are combined with the vehicle's navigation data to provide a 3D representation of the hull. During dives under the running gear, the operator can also take manual control of the vehicle to take a closer look at suspicious features.

These hull inspection capabilities have been extended to the inspection of berthing areas (seafloor under the hull, and piers and pilings). The system can provide high-resolution images of the seafloor and detailed profiles of pilings or other underwater structures. During post-mission analysis, the DIDSON data can be reviewed and mosaiced with enhanced resolution using software developed by AcousticView. A full Post-Mission Analysis tool is currently under development at Bluefin under ONR support (see Section 6).

Figure 1: HAUV1A (HAUV1-class vehicle)

3. HAUV Design Improvements

The first HAUV, called HAUV1A and shown in Figure 1, was deployed for the first time in November 2004. To support developments by other partners within ONR's Complex Area Search Group, a second vehicle, called HAUV1B, was built and went to sea for the first time in May 2007 (Figure 2). HAUV1B is functionally identical to HAUV1A but has more hydrodynamic foam and a smaller chassis. These vehicles participated in several demonstrations including HULSFest 06, Harbor Protection Trials 06, AUVFest 07 and 08, and the EOD HULS demonstration. For further details on the results obtained during these demonstrations, the reader is referred to [1][2][3][4].

Figure 2: HAUV1B (HAUV1-class vehicle)

During these demonstrations, limitations with the HAUV1 design were revealed, with the main issue being related to the way the sonar is pointed at the hull. With the DIDSON actuated in pitch and looking forward, the sonar field of view is limited in length by the hull curvature. Additionally, protuberances such as bilge keels can completely obstruct the field of view. Further details are provided in [5]. To remedy these two issues, it was decided to mount the DIDSON in the vehicle so that it points along the length of the hull by looking to one side of the vehicle. This required a more complex mechanism to control the sonar's orientation relative to the vehicle, but provided a substantial improvement in coverage rate. Figure 3 shows one of the two HAUV2-class vehicles (called HULS1 and HULS2) delivered to PMS-EOD in October 2008.

It can be seen how the DIDSON now looks to the starboard side of the vehicle with two rotational degrees of freedom (one tied to the DVL pitch angle, and the other used to control the sonar's grazing angle). This more complex sonar rotary table now allows the vehicle to image sections of the hull that were physically impossible to image with the HAUV1 design.

For instance, Figure 4 shows an AcousticView mosaic of a World War II destroyer's hull between the water surface and the bilge keel located about 2 meters below the water surface. Further details on the DIDSON rotary table are provided in [5].

Figure 3: HULS1 (HAUV2-class vehicle)

Figure 4: Hull between water surface and bilge keel

The HAUV2 design includes further modifications made to the HAUV1 design:
- Reduction of the number of thrusters from eight to six
- Larger horizontal thrusters
- High-power thruster board
- Rotary actuator redesign for smaller size and weight reduction
- Drop-and-play Bluefin battery
- Integration of a removable data storage module
- Overall weight reduction

In addition to vehicle design changes, system-level improvements have been made to comply with the EOD HULS specification. The most stringent requirement that drove many of the improvements was that of being able to operate from a small rubber boat without a power source. This requirement had hardware and software implications that are presented in the next section.

4. HAUV2 System Concept of Operation

An HAUV2-class vehicle is made up of the following components: vehicle chassis, main electronics housing, junction box, battery, four 100 mm diameter thrusters, two 70 mm diameter thrusters, Doppler Velocity Log, DIDSON, two actuators, vehicle power switch, GPS/strobe/WiFi antennas (optional), relocation transponder, flotation foam, removable data storage module, and lead weights. Although the vehicle is fitted with a fiber optic tether, it operates primarily autonomously. Operator control is used to position the vehicle on the surface at the start of a dive or to take a closer look at a contact detected during autonomous search. Power is provided by an onboard Bluefin pressure-tolerant 1.5 kWh Lithium-polymer battery.

The vehicle can be launched and recovered by two people over the side of a rubber boat, as shown in Figure 5.

Figure 5: HAUV launched over the side of a small rubber boat

4.1. Principle of Operation

4.1.1 DVL Locked on a Hull

The HAUV was initially designed to inspect ship hulls by pointing the DVL at the hull at all times so as to control the standoff distance and the vehicle's bearing, and to measure the position relative to the hull (hull-relative dead reckoning).

This mode of operation is still used when surveying large hulls. The vehicle typically moves up and down the hull ("vertically"), starting near the bow and making its way along the hull towards the stern. While executing this lawnmower pattern, the sonar paints the hull with 100% coverage. Figure 6 shows the shape of a hull reconstructed in 3D from the navigation data collected during the survey. Each color represents the point of contact of one of the four DVL beams on the hull.

Figure 6: Hull reconstructed from vehicle navigation data

Figure 7 shows the vehicle's trajectory in 2D hull coordinates for the dive shown in Figure 6. The water surface is near the top of the figure and the inboard side is the curved bottom of the figure. The white area shows the estimated sonar coverage derived by analysis of the DIDSON field of view illumination. This display is generated in real-time and in post-processing when replaying the DIDSON file. Note that the small gaps in the white area are due to the fact that the field of view analysis tends to under-estimate the actual DIDSON field of view.

Figure 7: Sonar coverage and vehicle trajectory

Figure 6 and Figure 7 show how the vehicle goes further under the hull as the hull widens amidships. The vehicle autonomously determines when to stop a slice down the hull towards the inboard side by sensing the inboard side's hull curvature with the DVL. The horizontal extent of Figure 6 is about 145 m and the vertical extent is 30 m at the widest point of the hull.

4.1.2 DVL Lock on the Seafloor

Inspection of complex areas such as the ship's running gear does not allow the DVL to maintain a solid lock on the hull due to the many protuberances in that area. In these areas, the DVL is pointed straight down and the vehicle navigates relative to the seafloor. The DIDSON is pointed up 15 degrees in imaging mode or straight up in profiling mode. Switching from imaging to profiling is done in a few minutes by adding a concentrator lens in front of the DIDSON. This lens reduces the vertical aperture from 14 degrees to 1 degree, thereby creating a beam pattern consisting of 96 beams covering a 30 deg by 1 deg spread. When pointing the DIDSON straight at the surrounding environment, profiles or scans are obtained that can be stacked in space using the vehicle's navigation data in order to reconstruct the environment in 3D. The DVL is also locked on the seafloor when inspecting the bottom of a barge or the seafloor under a hull, or when searching pilings.

4.2. Operation from a Small Rubber Boat

The HAUV2 system was designed for operation by a limited crew from a small boat such as a Rigid Hull Inflatable Boat (RHIB) or a Combat Rubber Raiding Craft (CRRC). As shown in Figure 8, the crew consists of three people: the coxswain, the tether handler, and the operator.

Figure 8: Operations from a small rubber boat

Power to the fiber optic tether reel and the tablet PC, as well as Ethernet communications, are provided by the yellow deck box that can be seen in Figure 8. Coordination is needed between the three crew members to ensure that the vehicle has enough tether slack and to prevent the tether from being severed by the boat's propeller; this has not proved to be a difficulty in operations.

4.3. Operator Interface

The operator uses a dual-touch tablet PC (touch and stylus) running Bluefin's Dashboard software to perform every operational step: vehicle dry checkout, in-water checkout, mission planning, mission monitoring, operator control, sonar data viewing, contact calling, and post-dive checkout. Dashboard is the operator interface used by Bluefin vehicles. Dashboard's appearance is customized by selecting the vehicle class when starting the software. The HAUV Dashboard includes features that are specific to this vehicle, which is rather different from Bluefin's torpedo-shaped AUVs (BF21, BF12, and BF9).

As shown in Figure 9 (captured during an actual dive), the Dashboard display is divided into three sections. The left pane is used to monitor the vehicle during a dive. Position and attitude information, as well as vehicle status, are displayed there. The lower-right pane shows the real-time DIDSON data. The operator has control of selected sonar settings through controls located at the top of the image (field of view dimensions, gain, ping rate, etc.). The upper-right pane is a planar representation of the hull. The vehicle's position in hull coordinates is shown as a solid red line. The sonar coverage is shown by the white area, which builds up as the vehicle covers more and more of the hull.

Figure 9: HAUV's operator interface: Dashboard

Additional windows can be opened by clicking on buttons or shortcut icons in the menu bar. Depending on the mission type being executed and the phase of the dive, additional buttons also appear in the sonar pane. For instance, at the start of a dive, buttons allowing the operator to steer the vehicle on the surface towards the hull will appear, and then disappear as soon as control is released to the vehicle.

4.4. Mission Planning

Mission planning can be performed on-site using a simple planning wizard. There is no need to plan tracklines or to know the shape of the hull. Only basic information is needed, such as the ship's draft, the side of the hull the vehicle is going to survey, the local water depth, etc. After answering a few simple questions, the mission plan is generated and uploaded to the vehicle when starting the dive. Figure 10 shows an example of the questions asked of the operator prior to running a survey on a hull. After answering these few questions, the vehicle is capable of running a dive such as the one shown in Figure 6.

Figure 10: HAUV Mission Planning Wizard

4.5. Contact Calling

When calling contacts, the operator first clicks on the DIDSON image in Dashboard at the location of the contact. This creates a sonar snapshot that is saved after the operator is prompted and enters information relative to the contact using a graphical user interface (e.g. contact type, confidence level). Navigation information relative to the contact is automatically added to a text file when the contact is recorded. Contacts can be called on-the-fly while the vehicle is running autonomously. Alternatively, the vehicle can be put under manual control to allow the operator to take a longer and closer look at a contact.
The operator can move the vehicle side to side or up and down, change the sonar grazing angle, and zoom in by changing the sonar field-of-view dimensions. After calling the contact (or not), control is released back to the vehicle, which will go back to where it left off and then resume the search.
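
The contact-calling workflow described above (a saved snapshot plus operator-entered attributes and automatically appended navigation information) can be sketched as a small record logger. The field names, file format, and record layout below are hypothetical; the paper only states that a snapshot is saved and that navigation data are written to a text file.

```python
import json
import time
from dataclasses import dataclass, asdict

# Illustrative contact-record logger (hypothetical field names and file format).

@dataclass
class ContactRecord:
    contact_id: int
    contact_type: str      # entered by the operator (e.g. "test target", "zinc anode")
    confidence: str        # operator confidence level (e.g. "high", "medium", "low")
    hull_x_m: float        # hull-relative position along the hull at call time
    hull_y_m: float        # hull-relative position down the hull at call time
    depth_m: float
    snapshot_file: str     # DIDSON snapshot saved when the operator clicked

def append_contact(record: ContactRecord, log_path: str = "contacts.txt") -> None:
    """Append one contact call, with a UTC timestamp, to a plain-text log file."""
    line = json.dumps({"utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
                       **asdict(record)})
    with open(log_path, "a") as f:
        f.write(line + "\n")

append_contact(ContactRecord(1, "test target", "high", 42.3, 6.1, 4.8,
                             "didson_snapshot_0001.png"))
```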

4.6. Post-Mission Analysis

During post-mission analysis, the operator can review the DIDSON data and call contacts again (if needed) using the same interface. The DIDSON data collected during the dives can be post-processed using a mosaicing software tool developed by AcousticView. This software runs on a powerful Post-Mission Analysis computer also used as a data repository. Examples of mosaics created using the mosaicing software are shown throughout this paper.

Another task that can be performed during post-mission analysis is the 3D reconstruction of the environment from DIDSON profiling data and vehicle navigation data. Although 3D reconstruction can be done in degraded mode in real-time using a dot mode representation, a volume reconstruction requires a more powerful computer and is performed during post-processing. The 3D rendering software was developed by ScienceGL, Inc. in collaboration with Sound Metrics Corp. Examples of 3D reconstruction using ScienceGL's software are shown further down in this paper.

5. HAUV2 System Capabilities

5.1. Non-Complex Areas

The Non-Complex Area of a hull is defined as follows in the EOD HULS specification: The Non-Complex Hull Area makes up the majority of the ship's underwater surface area. This includes the bottom and side portions of the hull forward of the running gear and aft of the bow dome. For most Navy ships this includes a relatively flat bottom with a non-distinct keel, extending outward to a relatively sharp transition to the nearly vertical sides of the ship. Non-Complex Areas also include the seafloor under a hull, which the HAUV is also capable of inspecting.

5.1.1 Hull Search

During a hull search, a 5 meter DIDSON field of view is typically used, which allows a 4 meter track-line spacing while ensuring overlap for 100% coverage. The DIDSON provides high-quality images that allow test targets to be easily detected and classified (Figure 11). The display of sonar data at 5 frames per second makes images even clearer than the single-frame snapshot shown here.

Figure 11: Single-frame raw DIDSON image of two test targets on a large hull

A contact map was created from the contacts called in real-time during the hull search shown in Figure 7. This map is shown in Figure 12. Red circles and yellow squares are associated with two different types of targets that had been placed on the hull. Green stars represent regular hull features such as intakes, sea chests, zincs, etc.

Figure 12: Contact map created from contacts called in real-time

Coverage rates between 45 and 50 m²/min have been estimated by Bluefin on large hulls. The hull search applied to large hulls (Figure 6 and Figure 7) can also be applied to smaller hulls. For instance, Figure 13 and Figure 14 show the same types of plots obtained on a much smaller hull. During this search the vehicle was able to collect DIDSON data over the propeller area (Figure 15). The data was then mosaiced using the mosaicing software (Figure 16). Figure 17 shows the zinc anodes and the propeller on a large hull observed during a hull search with the DVL locked on the hull. Four zinc anodes are visible, as well as the bolts used to attach them to the hull.
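
The relationship between the field of view, the track-line spacing, and the coverage rate quoted above can be illustrated with a short calculation. The vehicle speed used below is an assumed value chosen only to show that a 4 m spacing is consistent with the reported 45-50 m²/min; it is not a figure from the paper.

```python
# Illustrative lawnmower coverage calculation (vehicle speed is an assumed value;
# the paper reports only the 5 m field of view, 4 m spacing, and 45-50 m2/min rate).
sonar_fov_m = 5.0          # DIDSON field-of-view length along the hull
track_spacing_m = 4.0      # spacing between adjacent vertical slices
overlap_m = sonar_fov_m - track_spacing_m        # 1 m of guaranteed overlap per slice

assumed_speed_mps = 0.2    # hull-relative speed along each slice (assumption)
coverage_rate_m2_per_min = track_spacing_m * assumed_speed_mps * 60.0

print(f"overlap per track line: {overlap_m:.1f} m")
print(f"coverage rate: {coverage_rate_m2_per_min:.0f} m^2/min")   # ~48 m^2/min
```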

Figure 13: Small hull search
Figure 14: Sonar coverage and vehicle trajectory during small hull search
Figure 15: Single-frame raw DIDSON image of propeller and rudder on a small hull
Figure 16: Mosaic of propeller and rudder on a small hull
Figure 17: Zinc anodes and propeller on a large hull

5.1.2 Seafloor Search

During a seafloor search, the DVL is pointed at the seafloor. The vehicle maintains a fixed altitude over the seafloor and points the sonar down at a 15 degree grazing angle. Figure 18 shows an example of a seafloor survey. The colored dots correspond to the intersection of the DVL beams with the seafloor and the blue solid line corresponds to the vehicle's trajectory. When the trajectory is straight, the vehicle was running autonomously. When it is irregular, the vehicle was under operator control for investigation of a seafloor feature detected in the real-time DIDSON stream. The vehicle was running at 1.5 m altitude and about 6 meters depth. The DIDSON provides acoustic images of excellent quality that can be further improved using the mosaicing software. Examples of seafloor mosaics with near-video quality are shown in Figure 19, Figure 20, and Figure 21.

Figure 18: Seafloor survey example

Figure 19: Seafloor mosaic showing a plank, a lobster trap, and other debris
Figure 20: Seafloor mosaic showing rocks that fell off a nearby seawall
Figure 21: Seafloor mosaic showing various debris and a net on the right side

5.2. Complex Areas

The Complex Areas of a hull are defined as follows in the EOD HULS specification: Complex Areas are those areas of the ship's hull which contain complex structures and appendages, voids, complex curvatures, rapid changes in geometry, and difficult-to-reach places. These areas include the ship's running gear, additional structures and appendages such as bilge keels, sonar heads, narrow keel extensions, bulbous bows, small-radius curved areas, hull openings and voids, the inboard side of the ship, and adjacent piers and pilings.

5.2.1 Running Gear

During running gear surveys, the DVL is locked on the seafloor and the sonar is pointed straight up at the running gear in profiling mode, or 15 degrees above horizontal in imaging mode. The vehicle runs a lawnmower pattern under the hull.

Imaging Mode

Figure 22 shows a raw DIDSON image collected during a running gear survey under a hull. On a different hull, Figure 23 shows a mosaic of the two propeller shafts and propellers, the keel, and the rudder. Figure 24 shows a mosaic created from a pass under one of the propellers.

Figure 22: Single-frame raw DIDSON image of a propeller
Figure 23: Running gear mosaic
Figure 24: Mosaic of a propeller and propeller shaft

Profiling Mode

Profiling under the running gear of a ship can be used instead of, or in addition to, imaging. DIDSON profiles such as the one shown in Figure 25 can be stacked in 3D using the vehicle's position and attitude at the time the profiles were captured to reconstruct the shape of the structure being scanned. This approach provides 3D models of the environment, as shown in Figure 26 and Figure 27. A degraded mode (dot mode) can be used for real-time visualization of the DIDSON profiles. This capability provides the operator with a situational awareness that is otherwise difficult to achieve when looking at sonar profiles.

For a more accurate representation (volume mode), the data is post-processed with ScienceGL's 3D rendering software. Surveys under a ship's running gear in profiling mode can also be conducted under the bow. Figure 28 shows the bow of a World War II submarine reconstructed in 3D from DIDSON profiles.

Figure 25: Single-frame raw DIDSON profile snapshot showing the hull and a propeller shaft (bio-fouled)
Figure 26: Running gear 3D reconstruction (landing ship)
Figure 27: Running gear 3D reconstruction (missile corvette)
Figure 28: 3D rendering of a WWII submarine's bow

5.2.2 Pilings

During piling surveys, the DVL is locked on the seafloor and the DIDSON (fitted with the concentrator lens) is pointed horizontally at the pilings. The vehicle moves up and down (vertically) and progresses parallel to the piling (horizontally) at the top and bottom of the vertical segments (Figure 29).

Figure 29: Seafloor and vehicle trajectory during a piling survey

The DIDSON pings at a high rate and acquires profiles of the environment such as the one shown in Figure 30. These profiles are processed by the 3D rendering software, together with the vehicle's navigation information embedded in the sonar frame header, to provide a 3D reconstruction of the environment (Figure 31). It is possible to manipulate the 3D model within the software application (rotate, zoom, hide, change processing parameters, etc.). Figure 31 shows a section of pilings that is 25 meters long and 4 meters high, with three rows of piles captured in a single dive.
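
The stacking of profiles into a 3D model described in this section can be sketched as a frame transformation: each range/bearing return in a profile is expressed in the sonar plane and then rotated and translated into a world frame using the vehicle pose recorded for that ping. The sketch below is illustrative only; the frame conventions, function names, and the thin-planar-profile simplification are assumptions, not the ScienceGL / Sound Metrics processing chain.

```python
import numpy as np

# Illustrative stacking of profiling-sonar returns into a world-frame point cloud.
# Assumes the profiling lens confines returns to the sonar's vertical plane, and that
# vehicle pose (x, y, z, heading) is available for every ping; both are simplifications.

def profile_to_points(ranges_m, bearings_rad):
    """One profile: ranges and in-plane bearings -> 3xN points in the sonar frame."""
    x = ranges_m * np.cos(bearings_rad)          # along the sonar's look direction
    z = ranges_m * np.sin(bearings_rad)          # vertical spread within the fan
    return np.vstack([x, np.zeros_like(x), z])   # y = 0 in the thin profiling plane

def stack_profiles(profiles, poses):
    """profiles: list of (ranges, bearings); poses: list of (x, y, z, heading_rad)."""
    cloud = []
    for (ranges, bearings), (px, py, pz, yaw) in zip(profiles, poses):
        pts = profile_to_points(np.asarray(ranges), np.asarray(bearings))
        c, s = np.cos(yaw), np.sin(yaw)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])   # heading only
        world = R @ pts + np.array([[px], [py], [pz]])
        cloud.append(world.T)
    return np.vstack(cloud)      # N x 3 array, ready for meshing or visualization

# Example: two synthetic pings taken 0.5 m apart along a straight pass.
profiles = [(np.array([2.0, 2.1]), np.array([-0.1, 0.1]))] * 2
poses = [(0.0, 0.0, -3.0, 0.0), (0.0, 0.5, -3.0, 0.0)]
print(stack_profiles(profiles, poses).shape)     # (4, 3)
```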

Figure 30: Single-frame raw DIDSON profile of four piles
Figure 31: 3D rendering of piles under a pier from DIDSON profiling and vehicle navigation data

A large cylindrical structure, about 15 meters in diameter and shown in Figure 32, was also profiled using the DIDSON. The HAUV went up and down the structure and turned around it at the top and bottom of the vertical passes, while maintaining a constant standoff distance (under operator control). Figure 33 shows the vehicle's trajectory as it moved around the structure and the seafloor sensed by the DVL. The data collected during the dive was then post-processed using the 3D rendering software, leading to the results shown in Figure 34.

Figure 32: Cylindrical structure profiled by the DIDSON
Figure 33: Vehicle trajectory around the cylindrical structure
Figure 34: 3D rendering of a cylindrical structure from DIDSON profiling and vehicle navigation data

The vehicle was then run with the DVL locked on the structure, as if it were a hull. The vehicle went up and down the structure, automatically moving around it at a fixed standoff distance of 1 meter at the top and bottom of the vertical slices. Due to the structure's curvature, the DIDSON field of view (looking to the starboard side of the vehicle) was limited to about 2 meters in length. Passes down the structure were then mosaiced to produce the images shown in Figure 35.

Figure 35: Mosaics of two passes down the cylindrical structure

6. Research and Development Work under ONR

The Office of Naval Research, under the direction of Dr. Thomas F. Swean, is funding partners in the Complex Area Search (CAS) group for the development of capabilities aimed at performing a fully autonomous, un-tethered search of an entire hull in 2011. The CAS group includes the following partners:
- Florida Atlantic University (FAU) SeaTech Campus for high-speed acoustic communications,
- Massachusetts Institute of Technology (MIT) for Feature-Based Navigation (FBN) and control,
- SeeByte Ltd for Automatic Target Recognition, mosaicing, 3D environment reconstruction, and control,
- University of Michigan for video-based feature-based navigation and mosaicing,
- Bluefin Robotics Corp. for integration of capabilities developed within the CAS group in the HAUV, hardware / software integration and testing, vehicle software (framework, autonomy, drivers, etc.), post-mission analysis tool development, and operational support for engineering trials and demonstrations.

6.1. Results

Engineering trials in the Boston area and demonstrations such as AUVFest 07 and 08 have been used to test capabilities under development. The FAU high-speed acoustic modem has been integrated in HAUV1A and a camera / light system provided by the University of Michigan has been integrated in HAUV1B (Figure 36).

Figure 36: Hardware integration in HAUV1 in support of CAS research and development

Many preliminary results were obtained by the CAS partners on the HAUV during AUVFest 08:
- HAUV1A was fitted with the HERMES acoustic modem source designed by FAU and EdgeTech. Two receiver units were used: the HERMES topside unit and an ORE High-Speed Acoustic Gateway buoy. Compressed DIDSON images were transmitted at a rate of 4 images per second with a data rate of 45,000 bits per second. A total of 3,910 messages were received (35.6 million data bits) in one hour with a low bit error rate [6].
- MIT implemented a feature extractor designed for a specific type of target placed on the USS Saratoga's hull. The algorithm was able to very reliably detect new targets with a low false alarm rate and to establish correspondence with previous detections. The real-time feature-based navigation algorithm created the target map while simultaneously estimating the vehicle's trajectory [7].
- SeeByte demonstrated their ability to build real-time mosaics of the DIDSON data received over the fiber optic tether (hull data and seafloor data). They also ran their Automatic Target Recognition (ATR) algorithms on the real-time DIDSON data and were able to automatically detect test targets that had been placed on the hull [8].
- Over 1,300 underwater images of the USS Saratoga's hull were collected by the HAUV fitted with the video camera and light system designed by the University of Michigan. The dive consisted of seven 30 m legs of a lawnmower-pattern survey, each spaced 0.5 m apart in depth. Camera constraints were fused with navigation data in an extended information filter framework to provide bounded-error precision navigation along the hull. The camera-based Feature-Based Navigation algorithm was able to "close the loop" and register itself to earlier imagery from the first leg of the survey, thereby resetting any DVL navigation error incurred during the data dropout period [9].
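
As a rough consistency check of the acoustic-link figures quoted in the first bullet above (simple arithmetic on the reported numbers, not additional data from [6]), the per-image budget and the average delivered throughput work out as follows:

```python
# Back-of-the-envelope check of the AUVFest 08 acoustic-link numbers quoted above.
link_rate_bps = 45_000      # HERMES data rate reported for the trial
images_per_s = 4            # compressed DIDSON images transmitted per second
bits_per_image = link_rate_bps / images_per_s    # ~11,250 bits (~1.4 kB) per image

total_bits = 35.6e6         # data bits received over one hour
messages = 3910
print(f"{bits_per_image:.0f} bits per compressed image")
print(f"{total_bits / messages:.0f} bits per message on average")
print(f"{total_bits / 3600:.0f} bit/s average delivered throughput over the hour")
```
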
In May 2009, engineering trials focused on ATR, acoustic communications, and mosaicing were conducted by FAU, SeeByte, and Bluefin in Boston Harbor. During these trials, the HAUV surveyed a flat-bottom hull on which targets had been mounted and transmitted the DIDSON data in real-time to the topside over the fiber optic tether. The sonar data were processed in real-time by SeeByte's ATR algorithm. ATR calls were sent back down to the vehicle over the fiber optic tether for acoustic transmission back to the surface. The full-resolution single-frame DIDSON images sent by the acoustic modem were then displayed on FAU's laptop. This setup was an intermediate configuration used to test the ATR and acoustic modem before running the ATR onboard the vehicle directly. During these same trials, video data acquisition dives were carried out with HAUV1B fitted with the University of Michigan's new camera and light system.

6.2. On-going Work

MIT's work under ONR will use the DIDSON as both an imaging and bathymetry sonar to improve HAUV autonomous navigation. In imaging mode, each newly collected image is matched with previous images to continuously build an estimate of the vehicle's pose history. In bathymetry mode, sections of range data are compared with an a priori sonar scan of the ship hull to localize the vehicle. On non-complex sections of the hull (which can be approximated locally as flat surfaces), the imaging mode is expected to yield good performance, but in surveying complex structures such as the running gear, bathymetry sonar should allow improved localization and motion planning. Specific goals in motion planning are to achieve 100% sensor coverage of all structures and to concurrently optimize dynamic stability of the vehicle.

FAU's role will consist of acoustically transmitting ATR and mosaic images from the HAUV equipped with an on-board SeeByte processing unit. The HERMES acoustic modem will use both high bit-rate uplink and command-and-control downlink, so that the data transmission can be performed independently from the fiber optic tether. In addition, FAU will transmit compressed or uncompressed images acquired in complex sections by the HAUV.

SeeByte will be working on control and sensor analysis modules for the new profiling sonars available for the hull inspection problem (DIDSON sonar with profiling lens and BlueView MB2250). Example control tasks that will be developed include a propeller shaft tracking algorithm and a module which will allow the HAUV to inspect harbor pier pilings while maintaining a constant offset. Within the sensor analysis area, SeeByte will be developing real-time 3D ATR and reconstruction models.

University of Michigan's work during AUVFest 08 demonstrated visual FBN over relatively flat, benign sections of the hull. Work is currently underway to extend that result to real-time large-area FBN in complex geometry regions (such as screws and rudders). Additionally, advanced FBN capabilities in the area of perception-driven control (PDC) are being developed. PDC will enable a vehicle to respond to the environment by autonomously selecting alternative search patterns based upon perceived feature distributions in the environment. By coupling FBN into the trajectory planning and control, the HAUV can more intelligently adapt its survey and map-building strategy so as to only return to feature-rich areas of the hull when it accrues too much pose uncertainty. The goal of this work is to intelligently couple the vehicle trajectory into the FBN pose estimation process so that a specified upper bound on pose error is never exceeded at any given point in time.

Bluefin is currently developing a Post-Mission Analysis software tool. This tool will incorporate developments from all of our partners. It will provide automatic vehicle data archiving, generate mission reports, display the real-time data and results (e.g. contact calls), allow the user to replay selected dives for detailed investigation of the sonar data, provide an interactive means to navigate through the vehicle and sonar data, generate mosaics and super-resolution sonar video, run Automatic Target Recognition (ATR) algorithms, perform 3D reconstruction on sonar profiling data, and re-navigate sonar files using feature-based navigation.
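
The bathymetry-mode localization described at the start of this subsection (comparing sections of range data against an a priori scan of the hull) can be illustrated with a toy one-dimensional search over candidate positions along the hull. The hull model, cost function, and search strategy below are assumptions chosen for illustration, not MIT's actual algorithm.

```python
import numpy as np

# Toy sketch of localization against an a priori hull scan: candidate along-hull
# offsets are scored by how well the measured profile ranges match ranges predicted
# from the prior model; the best-scoring offset is returned.

def predicted_ranges(prior_hull, offset_idx, n):
    """Slice n consecutive prior-model ranges starting at a candidate offset."""
    return prior_hull[offset_idx:offset_idx + n]

def localize(prior_hull, measured, candidates):
    """Return the candidate offset whose predicted ranges best match the measurement."""
    costs = [np.mean((predicted_ranges(prior_hull, c, len(measured)) - measured) ** 2)
             for c in candidates]
    return candidates[int(np.argmin(costs))]

# Synthetic prior model of standoff ranges along 100 hull stations, and a noisy
# measurement taken somewhere along the hull.
prior = 2.0 + 0.3 * np.sin(np.linspace(0, 6 * np.pi, 100))
truth = 37
measured = prior[truth:truth + 10] + np.random.normal(0, 0.01, 10)
print(localize(prior, measured, candidates=range(0, 90)))   # expected: near 37
```
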
Additional work to be done at Bluefin includes the development of new behaviors for inspection of the hull's complex areas, integration and testing of other sonars, core vehicle software upgrades and additions, integration of CAS capabilities in the vehicle, and operational support for testing and demonstrations.

7. Conclusion

The HAUV started as an R&D project funded by ONR. The system has been improved over the years and is now being evaluated by the US Navy for potential procurement. The system has shown its ability to provide excellent imagery thanks to the platform's stability and the quality of its sonar (DIDSON). Research and development work is underway under ONR funding to further improve the system and integrate short-term capabilities such as Automatic Target Recognition, real-time mosaicing, high-speed acoustic communications, and feature-based navigation. The objective is to inspect an entire hull fully autonomously in FY 2011.

Acknowledgments

The authors would like to express their gratitude to Dr. Thomas Swean at ONR and Mr. Robert Simmons at PMS-EOD for their support of the HAUV project over the years.

References

[1] "Ship Hull Inspection by Hull-Relative Navigation and Control," J. Vaganay, M.L. Elkins, F.S. Hover, R.S. Damus, S. Desset, J.P. Morash, V.C. Polidoro, IEEE/MTS Oceans '05, Washington, D.C., USA, Sept. 19-23, 2005.

[2] "Performance of the HAUV Ship Hull Inspection System Prototype during HULSFest 2006," J. Vaganay, F. Hover, M. Elkins, and D. Esposito, 7th International Symposium on Technology and the Mine Problem, Naval Postgraduate School, Monterey, CA, May 2-5, 2006.

[3] "Ship Hull Inspection with the HAUV: US Navy and NATO Demonstrations Results," J. Vaganay, M. Elkins, D. Esposito, W. O'Halloran, F. Hover, M. Kokko, IEEE/MTS Oceans '06, Boston, MA, USA, Sept. 18-21, 2006.

[4] "Hovering Autonomous Underwater Vehicle for Ship Hull Inspection: Demonstration Results," J. Vaganay, S. Willcox, M. Elkins, D. Esposito, W. O'Halloran, F. Hover, M. Kokko, Undersea Defense Technology Pacific (UDT Pacific 06), San Diego, CA, USA, Dec. 6-8, 2006.

[5] "HAUV System Performance Enhancement for Use by EOD Units," J. Vaganay, L. Gurfinkel, D. Jankins, K. Robinson, T. Stefanov-Wagner, S. Summit, 8th International Symposium on Technology and the Mine Problem, Naval Postgraduate School, Monterey, CA, May 6-8, 2008.

[6] "HERMES - A High Bit-Rate Underwater Acoustic Modem Operating at High Frequencies for Ports and Shallow Water Applications," P.P.J. Beaujean, E.A. Carlson, Marine Technology Society Journal, Vol. 43, No. 2, 2009, pp. 21-32.

[7] "Stability and Robustness Analysis Tools for Marine Robot Localization and Mapping Applications," B. Englot, Master's Thesis, Massachusetts Institute of Technology, 2009.

[8] "Automatic Ship Hull Inspection - The Detection of Mine-Like Targets in Sonar Data Using Multi-CAD Fusion and Tracking Technologies," P.Y. Mignotte, S. Reed, A. Cormack, and Y. Petillot, in Proceedings of the Institute of Acoustics International Conference on the Detection and Classification of Underwater Targets, Sept. 2007, Heriot-Watt University, Scotland.

[9] "Pose-graph Visual SLAM with Geometric Model Selection for Autonomous Underwater Ship Hull Inspection," A. Kim and R. Eustice, IROS 2009, St. Louis, MO, USA, Oct. 11-15, 2009.