Monitoring fish using passive acoustics

dc.contributor.author: Mouy, Xavier
dc.contributor.supervisor: Dosso, Stanley Edward
dc.contributor.supervisor: Juanes, Francis
dc.date.accessioned: 2022-01-31T17:59:11Z
dc.date.available: 2022-01-31T17:59:11Z
dc.date.copyright: 2022
dc.date.issued: 2022-01-31
dc.degree.department: School of Earth and Ocean Sciences
dc.degree.level: Doctor of Philosophy Ph.D.
dc.description.abstract: Some fish produce sounds for a variety of reasons, such as to find mates, defend their territory, or maintain cohesion within their group. These sounds could be used to non-intrusively detect the presence of fish and potentially to estimate their number (or density) over large areas and long time periods. However, many fish sounds have not yet been associated with specific species, which limits the usefulness of this approach. While recording fish sounds in tanks is reasonably straightforward, it presents several problems: many fish do not produce sounds in captivity or their behavior and sound production are altered significantly, and the complex acoustic propagation conditions in tanks often lead to distorted measurements. The work presented in this thesis aims to address these issues by providing methodologies to record, detect, and identify species-specific fish sounds in the wild. A set of hardware and software solutions is developed to simultaneously record fish sounds, acoustically localize the fish in three dimensions, and record video to identify the fish and observe their behavior. Three platforms have been developed and tested in the field. The first platform, referred to as the large array, is composed of six hydrophones connected to an AMAR acoustic recorder and two open-source autonomous video cameras (FishCams) that were developed during this thesis. These instruments are secured to a PVC frame measuring 2 m x 2 m x 3 m that can be transported and assembled in the field. The hydrophone configuration for this array was defined using a simulated annealing optimization approach that minimized localization uncertainties. This array provides the largest field of view and most accurate acoustic localization, and is well suited to long-term deployments (weeks).
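The array-design step described above can be illustrated with a minimal simulated-annealing sketch. The cost function here (negative mean inter-hydrophone spacing, so the optimizer spreads the hydrophones across the frame) is a hypothetical stand-in; the thesis minimizes actual acoustic localization uncertainty, which is a far more involved objective. All values below are illustrative, not from the thesis.

```python
import math
import random

random.seed(0)  # deterministic for the example

def cost(positions):
    """Stand-in objective: negative mean pairwise spacing (lower is better)."""
    pairs = [(a, b) for i, a in enumerate(positions) for b in positions[i + 1:]]
    return -sum(math.dist(a, b) for a, b in pairs) / len(pairs)

def anneal(n_hydrophones=4, steps=2000, t0=1.0, cooling=0.995):
    """Simulated annealing over hydrophone positions on a 2 m x 2 m frame."""
    pos = [(random.uniform(0, 2), random.uniform(0, 2)) for _ in range(n_hydrophones)]
    c = cost(pos)
    best, best_c, t = list(pos), c, t0
    for _ in range(steps):
        # Perturb one randomly chosen hydrophone, clamped to the frame.
        i = random.randrange(n_hydrophones)
        cand = list(pos)
        x, y = cand[i]
        cand[i] = (min(2.0, max(0.0, x + random.gauss(0, 0.1))),
                   min(2.0, max(0.0, y + random.gauss(0, 0.1))))
        cc = cost(cand)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if cc < c or random.random() < math.exp((c - cc) / t):
            pos, c = cand, cc
            if c < best_c:
                best, best_c = list(pos), c
        t *= cooling  # geometric cooling schedule
    return best, best_c

best_positions, best_cost = anneal()
```

The same accept/cool loop applies unchanged when the placeholder cost is replaced by a localization-uncertainty criterion.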
The second platform, referred to as the mini array, uses a single FishCam and four hydrophones connected to a SoundTrap acoustic recorder on a one cubic meter PVC frame; this array can be deployed more easily in constrained locations or on rough/uneven seabeds. The third platform, referred to as the mobile array, consists of four hydrophones connected to a SoundTrap recorder and mounted on a tethered Trident underwater drone with built-in video, allowing remote control and real-time positioning in response to observed fish presence, rather than long-term deployments as for the large and mini arrays. For each array, acoustic localization is performed by measuring time differences of arrival between hydrophones and estimating the sound-source location using linearized (for the large array) or non-linear (for the mini and mobile arrays) inversion. Fish sounds are automatically detected and localized in three dimensions, and sounds localized within the field of view of the camera(s) are assigned to a fish species by manually reviewing the video recordings. The three platforms were deployed at four locations off the east coast of Vancouver Island, British Columbia, Canada, and allowed the identification of sounds from quillback rockfish (Sebastes maliger), copper rockfish (Sebastes caurinus), and lingcod (Ophiodon elongatus), species that had not previously been documented to produce sounds. While each platform developed during this thesis has its own set of advantages and limitations, using them in coordination helps identify fish sounds over different habitats and with various budget and logistical constraints. In an effort to make passive acoustics a more viable way to monitor fish in the wild, this thesis also investigates the use of automatic detection and classification algorithms to efficiently find fish sounds in large passive acoustic datasets.
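The localization step above amounts to a least-squares fit of predicted to measured time differences of arrival (TDOA). The toy sketch below works in 2-D with a brute-force grid search rather than the linearized and non-linear inversions used in the thesis; the hydrophone layout, sound speed, and source position are illustrative assumptions.

```python
import math

# Hypothetical 2-D example: four hydrophones on a 2 m square, seawater sound speed.
HYDROPHONES = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
SOUND_SPEED = 1485.0  # m/s, nominal

def tdoas(source, hydrophones=HYDROPHONES, c=SOUND_SPEED):
    """Time differences of arrival relative to the first hydrophone (s)."""
    t = [math.dist(source, h) / c for h in hydrophones]
    return [ti - t[0] for ti in t[1:]]

def localize(measured, step=0.05, extent=3.0):
    """Grid search for the source position minimizing the squared TDOA misfit.
    (Brute force for illustration only; an inversion scheme converges far faster.)"""
    best, best_cost = None, float("inf")
    n = int(extent / step)
    for i in range(n + 1):
        for j in range(n + 1):
            cand = (i * step, j * step)
            cost = sum((p - m) ** 2 for p, m in zip(tdoas(cand), measured))
            if cost < best_cost:
                best, best_cost = cand, cost
    return best

true_source = (1.3, 0.7)
estimate = localize(tdoas(true_source))  # recovers the source to within a grid cell
```

The objective minimized here is the same squared-residual misfit that a linearized or non-linear inversion would descend; only the search strategy differs.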
The proposed approach detects acoustic transients using a measure of spectrogram variance and classifies them as "noise" or "fish sounds" using a binary classifier. Five different classification algorithms were trained and evaluated on a dataset of more than 96,000 manually annotated examples of fish sounds and noise from five locations off Vancouver Island. The classification algorithm that performed best (random forest) has an F-score of 0.84 (Precision = 0.82, Recall = 0.86) on the test dataset. The analysis of 2.5 months of acoustic data collected in a rockfish conservation area off Vancouver Island shows that the proposed detector can be used to efficiently explore large datasets, formulate hypotheses, and help answer practical conservation questions.
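As a quick consistency check of the reported metrics: the F-score is the harmonic mean of precision and recall, and the stated precision and recall do round to the stated F-score.

```python
def f_score(precision, recall):
    """Harmonic mean of precision and recall (F1)."""
    return 2 * precision * recall / (precision + recall)

# Reported values: Precision = 0.82, Recall = 0.86.
print(round(f_score(0.82, 0.86), 2))  # 0.84, matching the reported F-score
```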
dc.description.scholarlevel: Graduate
dc.identifier.bibliographicCitation: Mouy, X., Rountree, R., Juanes, F., and Dosso, S. E. (2018). Cataloging fish sounds in the wild using combined acoustic and video recordings. The Journal of the Acoustical Society of America, 143(5), EL333–EL339.
dc.identifier.bibliographicCitation: Mouy, X., Black, M., Cox, K., Qualley, J., Mireault, C., Dosso, S. E., and Juanes, F. (2020). FishCam: A low-cost open source autonomous camera for aquatic research. HardwareX, 8, e00110.
dc.identifier.uri: http://hdl.handle.net/1828/13731
dc.language: English
dc.language.iso: en
dc.rights: Available to the World Wide Web
dc.subject: passive acoustics
dc.subject: fish sounds
dc.subject: acoustic localisation
dc.subject: video camera
dc.subject: automatic detection and classification
dc.subject: lingcod
dc.subject: quillback rockfish
dc.subject: copper rockfish
dc.title: Monitoring fish using passive acoustics
dc.type: Thesis

Files

Original bundle
Name: Mouy_Xavier_PhD_2022.pdf
Size: 5.87 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 2 KB
Format: Item-specific license agreed upon to submission