Abstract: This journal paper describes the design and operation of the Floating-Point III Autonomous Surface Vehicle developed by students from the Robotics Association at Embry-Riddle Aeronautical University (RAER). This vessel was designed specifically for entry into the 8th International RoboBoat Competition. The Embry-Riddle team has developed new and innovative features to successfully complete the autonomous challenges in the competition. These features include a hull and deck designed for integration of an autonomous unmanned aerial vehicle, improved hardware, updated algorithms, and improved software modules.

I. INTRODUCTION

The Robotics Association at Embry-Riddle Aeronautical University (RAER) is proud to present the Floating-Point III Autonomous Surface Vehicle (ASV) system as a competitive solution to the 2015 RoboBoat challenge. The goals of the Floating-Point team are fulfillment of all mandatory tasks, such as launch and recovery of an autonomous Unmanned Aerial Vehicle (UAV) and navigation of the buoy channel, and completion of all additional tasks posed by the challenge stations. The Floating-Point ASV system was developed with innovative solutions for meeting these goals. The final design of the system includes a stable, maneuverable, trihull boat platform, a waterproof UAV sub-vehicle, a powerful on-board processor, and an array of mission-critical sensors. The Floating-Point sensor suite includes a Global Positioning System (GPS), HD cameras, ultrasonic hydrophones, and a scanning laser rangefinder.

Figure 1. ERAU Floating-Point III ASV (2015)

II. HULL DESIGN

The Floating-Point III ASV (Figure 1) is a block-upgraded, retrofitted version of the Floating-Point II ASV (Figure 2). The new upgrades include modifications to carry an autonomous UAV, a more accessible and organized payload space, an internal cooling system, and a new conning tower for the sensor suite. The team has retained the trihull configuration, in which the center hull bears most of the load and the two exterior hulls provide stability. The two outside hulls draft one inch less water than the center hull to reduce drag while providing counter forces that resist pitch and roll. The leading edge of the center hull is angled relative to the water for improved hydrodynamic performance.

Figure 2. ERAU Floating-Point II ASV (2014)

The new conning tower has been designed and built on the front deck to hold the laser rangefinder (labeled 1 in Fig. 3) at an optimal position, avoiding any obstruction of its 30-degree vertical view. The tower also holds the GPS (labeled 2 in Fig. 3) above the other sensors to avoid the interference experienced in previous systems. The aft portion of the deck has an integrated landing pad with four tapered shallow quadrants (labeled 3 in Fig. 3) to support and guide the UAV sub-vehicle. The four shallow quadrants provide a nesting structure that helps hold the sub-vehicle on the deck and corrects landing errors, making landing more stable and allowing for a safe recovery of the UAV.

Figure 3: Floating-Point Deck

Propulsion of the ASV is provided by a pair of SeaBotix BTD150 thrusters. The SeaBotix thrusters deliver a continuous bollard thrust of 28 N each at 4.25 A. The thrusters are configured to provide differential steering. Floating-Point III has a top speed of 1.3 m/s, with turn rates of approximately 1.1 rad/s.
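The control board's mapping from motion commands to the two thrusters is not detailed in the paper. The following is a minimal sketch of conventional differential-steering thrust mixing, assuming normalized surge and yaw commands in [-1, 1]; the function name and interface are hypothetical, not Floating-Point's actual control-board logic.

```python
def mix_differential(surge: float, yaw: float) -> tuple:
    """Map normalized surge (+forward) and yaw (+turn) commands to
    (left, right) thruster outputs in [-1.0, 1.0].

    Sketch of conventional differential steering, not the actual
    Floating-Point control-board logic.
    """
    left = surge + yaw
    right = surge - yaw
    # Rescale rather than clip, so the turn-to-forward ratio of the
    # command is preserved when a thruster would saturate.
    m = max(1.0, abs(left), abs(right))
    return left / m, right / m

# Example: full forward with a gentle turn.
print(mix_differential(1.0, 0.2))  # -> (1.0, 0.666...)
```

Rescaling instead of hard clipping preserves the commanded turn rate even when the forward command alone would saturate a thruster.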

III. SYSTEM INTEGRATION

The Floating-Point system is composed of commercial off-the-shelf (COTS) parts integrated through a custom printed circuit board. These parts include an emergency stop system, a collection of sensors used to locate and complete challenge stations, and communication equipment for the judges' network and ground station systems.

On-board Processing

The sensor suite sends data to the on-board Intel Core i7 mobile computer. After sensor data has been processed for the current task, messages are sent to the control board, which generates the output control signals required for maneuvering. These signals are sent to the SeaBotix thrusters for precise movement. The on-board computer system also sends data through a wireless data link for testing and debugging.

Communications

On-board systems communication, control, and health monitoring are implemented using an Ethernet-based communication backbone. A 5.8 GHz Ubiquiti Rocket radio is integrated into the Ethernet backbone as a bridge to external communications. This system has undergone extensive reliability testing by the ERAU team and is now used on multiple platforms. It is capable of transmitting high-bandwidth data from over a mile away and in radio-frequency-hostile environments. Although not usable for direct control by the team during a competition run, this data link enables the team to perform real-time diagnostics and monitoring while testing from either a chase boat or a nearby shore location. It also provides the RoboBoat judges with full access to and control of the platform.

An independent, direct command-and-control link is also maintained with the vehicle at all times using a 2.4 GHz commercially available, frequency-hopping, spread-spectrum radio control (RC) system. This system implements emergency stop functionality as well as switching from manual control to either a paused state or autonomous control. Functionality has been tested at ranges up to half a mile, which is sufficient for the most extreme distances encountered at the current RoboBoat venue. In the event of loss of RC signal, hardware circuits disable power to the motors and stop the vehicle. A 3DR 433 MHz radio is used as a communication link between Floating-Point and its sub-vehicle. This allows communication with and control of the UAV at all times.

Emergency Systems

Recognizing the need for safe operation, the team opted for redundancy while developing the safety systems. To ensure safety, all control signals, manual or autonomous, are routed through the control board rather than through the on-board autonomy software. A wireless emergency stop solution was also developed to allow remote deactivation by the team's safety operator through the control board. Upon pressing the button, the control board terminates power to the on-board thrusters. This system is accessible through the RC controller used for manually controlling the vehicle in the water. The vehicle also includes an on-board emergency stop button.

Navigation and Location Sensors

The sensor suite used on the Floating-Point III system includes LIDAR for range measurement, cameras for object identification, and hydrophones for locating the underwater pinger. Localization is provided by combining data from a Hemisphere A325 GPS and a high-precision Sparton GEDC-6E Inertial Measurement Unit (IMU). External sensing and perception of obstacles are achieved by combining this array of sensors. Due to the low reflectivity of water in the infrared (IR) range, a Velodyne VLP-16 Light Detection and Ranging (LIDAR) scanner was chosen as the primary sensing modality. The LIDAR also allows for accurate determination of range for docking and trajectory planning, which is further enhanced by placing the sensor above the platform in such a way that its field of view is maximized, as can be seen from the simulation in Figure 4.

Figure 4: Floating-Point Deck

A single camera is used to identify objects by size, shape, type, and color. The camera and LIDAR data are then segmented into discrete objects whose positions are tracked and updated in the global coordinate frame using individualized Kalman filters for each object. For detection and ranging of submerged sonar pings, an array of Teledyne Reson hydrophones is placed below the vessel. Ranging to the underwater source is achieved using a combination of time-of-flight calculations and beamforming.
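The paper describes the hydrophone processing only at a high level (time-of-flight plus beamforming). As an illustration, the bearing to a pinger can be estimated from a single hydrophone pair by cross-correlating the two channels to recover the time difference of arrival (TDOA). The sketch below assumes synchronized, band-filtered sample streams; the names, interface, and sound-speed constant are assumptions, not the team's implementation, and the real system adds beamforming over the full array.

```python
import numpy as np

SPEED_OF_SOUND = 1482.0  # m/s, approximate value for warm fresh water

def tdoa_bearing(sig_a, sig_b, fs, spacing):
    """Estimate the bearing to an acoustic pinger from one hydrophone pair.

    sig_a, sig_b -- synchronized, band-filtered sample arrays
    fs           -- sample rate in Hz
    spacing      -- hydrophone separation in meters

    Returns the bearing in radians from array broadside, positive toward
    the hydrophone that heard the ping first (sig_b's side when lag > 0).
    """
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)  # delay of a relative to b
    dt = lag / fs
    # Far-field assumption: path-length difference = spacing * sin(bearing).
    s = np.clip(SPEED_OF_SOUND * dt / spacing, -1.0, 1.0)
    return float(np.arcsin(s))
```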

Power Distribution

Energy storage is provided by six-cell, 22.2 V lithium polymer batteries providing up to 20 Ah. Power regulation is provided through the control board at 19 V, 12 V, and 5 V to power the on-board computer, sensors, and other on-board electronics. Added robustness and safety have been designed into the platform power distribution system, including fusing of all electrical lines and reverse-polarity protection.

IV. CHALLENGE STATIONS

The RoboBoat competition is broken into multiple challenges of increasing complexity, designed to evaluate specific areas of the autonomous platform's capability. These challenges include channel navigation, acquisition and localization of a submerged acoustic beacon, launching a sub-vehicle, docking, and interaction with the judges' network.

1. The first task is for Floating-Point to navigate autonomously through a pair of buoy gates that are separated by an unknown distance. The vessel must navigate a linear course between the starting and ending pairs of gates. This task requires the team to demonstrate the degree of navigation, control, and repeatability inherent in their platform.

2. In the detection and avoidance of obstacles task, Floating-Point must autonomously navigate to the pre-designated entry gate (1, 2, or 3) and travel autonomously through a field of floating, stationary obstacle buoys varying in size and color. Completion of this task requires successful traversal of the obstacle field and exit through the designated exit gate (X, Y, or Z) without contacting any of the buoys.

Figure 5: RoboBoat Field General Arrangement

3. The docking and target identification task requires Floating-Point to successfully identify the red cruciform and the blue triangle, which mark two of the three docking bays. Once the docking bays have been located, the vessel must maneuver to enter the dock with the red cruciform first and come to a stop. The vessel must then repeat this process in the dock marked by the blue triangle before moving on to the next task.

4. The interoperability challenge requires the launch and recovery of an autonomous UAV from Floating-Point's deck. Floating-Point must be within 20 ft of the buoy marking the launch zone to launch the UAV (a simple distance test, sketched after this list). This task requires the use of the UAV sub-vehicle and its on-board sensors. The UAV must fly to a specific GPS waypoint marking the recognition zone, recognize a letter or number displayed on the ground in the zone, send a picture of the display, and report what letter or number is displayed.

5. In the underwater search task, the platform must successfully identify and locate a specific underwater device that is emitting an acoustic signal. Once it is located, the platform must circle the buoy that the acoustic beacon is beneath and relay the color of the buoy to the competition judges.
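The 20 ft launch-zone requirement in task 4 reduces to a distance test between GPS fixes. A minimal sketch follows, assuming (latitude, longitude) tuples in degrees; the equirectangular approximation is accurate to well under an inch over these distances, and the names are hypothetical rather than the team's code.

```python
import math

EARTH_RADIUS_M = 6371000.0
LAUNCH_RADIUS_M = 20.0 * 0.3048  # the 20 ft rule, expressed in meters

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular ground distance; ample accuracy over tens of meters."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    y = math.radians(lat2 - lat1)
    return EARTH_RADIUS_M * math.hypot(x, y)

def in_launch_zone(boat_fix, buoy_fix):
    """True when the ASV is close enough to the launch-zone buoy to deploy."""
    return distance_m(*boat_fix, *buoy_fix) <= LAUNCH_RADIUS_M

# Example: two hypothetical fixes about 15 m apart.
print(in_launch_zone((28.58720, -81.19000), (28.58733, -81.19000)))  # -> False
```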

V. GUIDANCE, NAVIGATION, & CONTROL

Embry-Riddle's approach to autonomy for the Floating-Point III ASV is based on a set of behaviors. The overall concept offers a high degree of robustness and repeatability in a marine environment. Each sensor processes its data and sends it to the Mapper and Objective Tracker. The Path Planner receives objectives from the Objective Tracker and a list of known obstacles from the Mapper. This data is used to select which of the four motion behaviors (waypoint navigation, loiter, search pattern, or position hold) is most appropriate for the situation. These behaviors are subsumed by one another, and the resultant emergent behavior allows Floating-Point to avoid obstacles and perform the various tasks required.

Trajectory planning is performed using a Dubins path algorithm. A Dubins path assumes a vehicle moving at unit velocity with a fixed minimum turn radius. Using this information, as well as a start pose and an end pose, the algorithm calculates the minimum-length path between the two poses. Each behavior modifies the parameters used by the algorithm (minimum radii, starting and ending conditions) in order to generate an ideal path that achieves the end pose. This ideal path is then modified by the obstacle avoidance algorithm, producing a trajectory as close as possible to the obstacle-free ideal. The final path is achievable by Floating-Point and routes the platform around obstacles.
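For concreteness, the sketch below computes the lengths of the two simplest Dubins words, LSL and RSR, and returns the shorter. A complete planner would also evaluate the RSL, LSR, RLR, and LRL words and recover the waypoints along the winning path; this is an illustrative sketch, not the team's planner.

```python
import math

def _mod2pi(a):
    return a % (2.0 * math.pi)

def dubins_lsl_rsr(start, goal, r):
    """Length of the shorter of the LSL and RSR Dubins words.

    start, goal -- (x, y, heading) poses, heading in radians
    r           -- minimum turn radius

    Sketch only: the remaining Dubins words (RSL, LSR, RLR, LRL) must
    also be evaluated to guarantee the true minimum-length path.
    """
    x0, y0, t0 = start
    x1, y1, t1 = goal
    best = math.inf
    for turn in (+1, -1):  # +1: both arcs left (LSL); -1: both right (RSR)
        # Centers of the start and goal turning circles.
        c0x, c0y = x0 - turn * r * math.sin(t0), y0 + turn * r * math.cos(t0)
        c1x, c1y = x1 - turn * r * math.sin(t1), y1 + turn * r * math.cos(t1)
        dx, dy = c1x - c0x, c1y - c0y
        straight = math.hypot(dx, dy)      # external-tangent segment length
        phi = math.atan2(dy, dx)           # heading along that segment
        arc0 = _mod2pi(turn * (phi - t0))  # arc to rotate onto the tangent
        arc1 = _mod2pi(turn * (t1 - phi))  # arc to rotate off the tangent
        best = min(best, r * (arc0 + arc1) + straight)
    return best

# Example: reach a pose 20 m ahead, 10 m left, facing 90 degrees, r = 5 m.
print(dubins_lsl_rsr((0.0, 0.0, 0.0), (20.0, 10.0, math.pi / 2), 5.0))  # ~23.7
```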

VI. BUOY IDENTIFICATION & NAVIGATION

Buoy Detection

The primary difficulty associated with vision-based buoy identification is that the system must work reliably regardless of lighting conditions, specular reflections from the water, or background noise. To this end, Floating-Point uses vision only to detect buoy color, while relying on LIDAR for buoy identification and classification.

Buoy Detection and Classification

The primary sensor for above-water sensing is the Velodyne VLP-16 LIDAR, producing approximately 350,000 data points every second out to a maximum range of 328 feet (100 meters). The LIDAR returns are processed in the on-board computer, which places the returns into an occupancy grid, as shown in Figure 6.

Figure 6: Occupancy Grid (The blue circle marks the identification range, the white dots mark objects, and the boat is in the center.)

The occupancy grid is then treated as a binary image, which is searched for independent objects. Each object is then classified, based on its distance from the Velodyne, object height, and number of laser returns, as a Tall Taylor-Made, Polyform A-0, or Polyform A-2 buoy, or as an unknown object. This classification system estimates the probability distribution of these features using a multivariate Gaussian model. The end result is that, based on the observed features, the system can determine the probability that the object is each of the possible buoy types used in the competition. The object is declared unknown if all of the probabilities are below a threshold, and the most likely buoy if any of the probabilities is above this threshold. Empirical results have shown this classification system to be 90% accurate at classifying the competition buoys.

The object classifier node feeds into a mapping node, which uses the time of detection and GPS, along with the characteristics determined by the object classifier, to build up object location, type, and confidence of existence. These objects are then mapped into a global reference frame and tracked over time. From this global map, a region of interest can be determined in the camera view where the buoys are expected.
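The thresholded Gaussian scheme described above can be sketched as follows: per-class feature statistics (range to the sensor, object height, laser-return count) are fit offline from labeled detections, each new detection is scored under every class density, and the detection is declared unknown when no likelihood clears the threshold. The class statistics and threshold below are illustrative placeholders, not the team's fitted values.

```python
import math
import numpy as np

# Per-class feature statistics (range to sensor [m], height [m], laser-return
# count), fit offline from labeled detections. These numbers are placeholders.
CLASSES = {
    "tall_taylor_made": (np.array([12.0, 1.0, 220.0]), np.diag([9.0, 0.02, 900.0])),
    "polyform_a0":      (np.array([12.0, 0.3,  60.0]), np.diag([9.0, 0.01, 400.0])),
    "polyform_a2":      (np.array([12.0, 0.5, 110.0]), np.diag([9.0, 0.01, 600.0])),
}
THRESHOLD = 1e-4  # placeholder; tuned empirically in practice

def mvn_pdf(x, mean, cov):
    """Multivariate Gaussian density, evaluated directly with numpy."""
    d = x - mean
    norm = math.sqrt((2.0 * math.pi) ** len(mean) * np.linalg.det(cov))
    return float(np.exp(-0.5 * d @ np.linalg.inv(cov) @ d)) / norm

def classify(features):
    """Most likely buoy class, or 'unknown' if no likelihood clears the bar."""
    scores = {name: mvn_pdf(features, m, c) for name, (m, c) in CLASSES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= THRESHOLD else "unknown"

print(classify(np.array([11.0, 0.95, 200.0])))  # -> tall_taylor_made
```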

Vision Algorithm

Vision processing is performed using the LabVIEW software environment. After object classification, the center point of each buoy has been placed on a map in the vehicle's frame of reference. This information is also tagged with the type of buoy that has been detected. Based on the known size of each buoy type, a region of interest is extracted from the camera view around the located buoy. From the region of interest, the color of the buoy is determined by removing the background pixels (typically water and the shore) and then comparing the resulting image histogram against a color look-up table to determine the most likely color.

The use of the global map and knowledge of buoy color also enables the ASV to navigate between buoys of interest, such as the Taylor-Made red and green buoys. This is done by generating a vector pointing toward the center of the buoy pair. This center-oriented vector can then be used by the navigation and control algorithms as a drive point. If only one buoy, either green or red, is found, the boat produces a vector pointing toward the expected direction of the missing buoy, e.g., to the right of the red buoy if the green buoy is not found.

VII. THE INTEROPERABILITY CHALLENGE

Aerial Platform

The Mariner frame (Figure 8) was chosen as the ERAU UAV sub-vehicle, named Exponent, to complete the Interoperability Challenge. Building on previous experience with the Aerotestra quadrotor (Figure 7) from the 2013 Catch the Ball task, the Aerotestra was the initial platform used to attempt this year's challenge. Later, the Aerotestra's hardware and software components were transitioned and integrated into the Mariner frame. The biggest advantage of the Mariner frame is that it is fully waterproof and smaller than the Aerotestra.

Figure 7: Aerotestra Quadrotor Frame

Figure 8: Mariner Quadrotor Frame

Software Integration

The Interoperability Challenge requires the UAV to launch from the boat, fly over a recognition zone to a given waypoint, report what is displayed in the recognition zone, and successfully land back on the boat. To meet these requirements, the UAV is integrated with a sensor and processing suite that includes an ArduPilot autopilot, a Logitech webcam, and a quad-core embedded Linux ODROID computer (Figure 9). To determine what is displayed in the recognition zone, the UAV uses the webcam while running Ubuntu and OpenCV. The UAV is launched when Floating-Point is within 20 ft of the buoy marking the launch zone and a start message is sent to the UAV's on-board computer. Exponent then performs the required reconnaissance task and returns to the boat using GPS navigation and visual identification of Floating-Point's landing zone.

Figure 9: Exponent's ODROID-U3 computer

Figure 10: Picture of the Recognition Zone

Object Identification

The recognition zone contains a gray/silver tarp upon which a number of pipe segments are laid out, forming a seven-segment display for the values 0-9 and A-F. In addition, a small pipe is placed in the corner of the tarp to mark the orientation (Figure 10). The UAV must take a picture and send both the picture and the correct number or letter to the judges' network. OpenCV libraries in Python provide access to a wide range of image processing functions. Using these functions, orange objects are found and noise is removed from the resulting binary image, leaving only the orange pipes (Figure 11). Once the pipes are identified (Figure 12), the UAV can assume the smallest pipe is the orientation indicator. Then, using the orientation, the number of pipes, and where the pipes are placed relative to each other, the UAV can report which number or letter is displayed. After significant development and testing, the algorithm can now account for the angle of the camera relative to the display, different lighting and weather conditions, all possible digits, different orientations, and interfering objects of similar color.
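The orange-pipe extraction step maps naturally onto standard OpenCV calls: threshold in HSV space to isolate orange, clean the binary mask with a morphological opening, and extract contours. The HSV bounds, kernel size, and area cutoff below are illustrative guesses rather than the team's tuned values.

```python
import cv2
import numpy as np

def find_orange_pipes(bgr_frame):
    """Return contours of candidate orange pipe segments in a camera frame.

    Sketch of the pipeline described above (OpenCV 4 API); the HSV range,
    kernel size, and area cutoff are illustrative, not the tuned values.
    """
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    # Orange hues on OpenCV's 0-179 hue scale, reasonably saturated/bright.
    mask = cv2.inRange(hsv, np.array([5, 100, 100]), np.array([20, 255, 255]))
    # Morphological opening removes speckle noise (water glint, tarp texture).
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Drop tiny blobs that survived the opening.
    return [c for c in contours if cv2.contourArea(c) > 200]
```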

Using adaptive algorithms and on-board sensors, the UAV can adjust for many scenarios. This results in the detection of all of the pipes 91% of the time. When the pipe segments are correctly detected, the algorithm successfully reports the correct letter or number 95% of the time, regardless of the weather conditions, the angle of the UAV, the display in the recognition zone, the lighting conditions, or the orientation of the UAV camera relative to the target. Performance during autonomous tests gives the team a high level of confidence that the system will successfully perform the Interoperability Challenge.

Figure 11: Recognition Zone After Image Processing Functions Are Applied

Figure 12: Picture of All the Found Pipes

Interoperability Results

One of the main objectives of the project was to integrate a sub-vehicle UAV that can work collaboratively with Floating-Point's system. The UAV was designed to operate in any weather condition. New software has been developed for Floating-Point to allow it to deploy the sub-vehicle and communicate with it during its mission.

VIII. CONCLUSION

The Floating-Point III ASV has improved technology and enhanced capabilities compared to the previous year's entry. We have shown through simulation, lab testing, and in-water test runs that the Floating-Point III ASV system is capable of attempting and successfully completing all of the tasks in this year's RoboBoat competition.

REFERENCES

2015 RoboBoat Competition Final Rules: https://s3.amazonaws.com/com.felixpageau.roboboat/roboboat_2015_final_rules_20150527.pdf

2014 Floating-Point Journal Paper:

IX. ACKNOWLEDGEMENT

The Floating-Point III ASV team would like to thank Dr. Charles Reinholtz, Dr. Patrick Currier, Dr. Eric Coyle, Dr. Brian Butka, Christopher Hockley, Tim Zuercher, and Hitesh Patel for their support on this project.