Atmosphere (Mar 2020)

Wave-Tracking in the Surf Zone Using Coastal Video Imagery with Deep Neural Networks

  • Jinah Kim,
  • Jaeil Kim,
  • Taekyung Kim,
  • Dong Huh,
  • Sofia Caires

DOI
https://doi.org/10.3390/atmos11030304
Journal volume & issue
Vol. 11, no. 3
p. 304

Abstract


In this paper, we propose a series of procedures for coastal wave-tracking using coastal video imagery with deep neural networks. The procedure consists of three stages: video enhancement, hydrodynamic scene separation, and wave-tracking. First, a generative adversarial network, trained on paired raindrop and clean videos, is applied to remove image distortions caused by raindrops and to restore background information of coastal waves. Next, a hydrodynamic scene of propagated wave information is separated from the surrounding environmental information in the enhanced coastal video imagery using a deep autoencoder network. Finally, propagating waves are tracked by registering consecutive images of the quality-enhanced, scene-separated coastal video imagery using a spatial transformer network. The instantaneous wave speed of each individual wave crest and breaker in the video domain is successfully estimated by learning the behavior of transformed and propagated waves in the surf zone with deep neural networks. Since the framework enables the acquisition of spatio-temporal information on the surf zone through the characterization of wave breakers, including wave run-up, we expect that the proposed framework with deep neural networks will lead to an improved understanding of nearshore wave dynamics.
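The final stage, estimating wave speed from the displacement between registered consecutive frames, can be illustrated with a minimal sketch. The snippet below uses plain cross-correlation of intensity profiles as a simple stand-in for the paper's learned spatial transformer network; the function name, the synthetic Gaussian crest, and the pixel-to-metre scale are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def estimate_wave_speed(frame_a, frame_b, dt, dx):
    """Estimate cross-shore wave speed by registering two consecutive
    intensity profiles. Cross-correlation is a simple stand-in here for
    the learned spatial transformer registration described in the paper."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.correlate(b, a, mode="full")
    shift = corr.argmax() - (len(a) - 1)  # pixel displacement between frames
    return shift * dx / dt                # convert to metres per second

# Synthetic example: a Gaussian wave crest advancing 3 pixels between frames
x = np.arange(200)
crest = lambda c: np.exp(-0.5 * ((x - c) / 4.0) ** 2)
speed = estimate_wave_speed(crest(80.0), crest(83.0), dt=0.5, dx=1.0)
# 3 pixels in 0.5 s at 1 m/pixel -> 6.0 m/s
```

In the full pipeline this per-frame-pair registration would be applied across the whole video, yielding instantaneous speeds for each tracked crest and breaker rather than a single scalar.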

Keywords