2. EchoMap: Plotting echograms and maps from echo data in the Pacific Hake survey#
2.1. Introduction#
2.1.1. Goals#
Illustrate how to connect work between echopype and echoshader.
Illustrate a common workflow for plotting echograms and echomaps using echoshader.
2.1.2. Description#
This notebook uses EK60 echosounder data collected during the 2017 Joint U.S.-Canada Integrated Ecosystem and Pacific Hake Acoustic Trawl Survey (‘Pacific Hake Survey’) to illustrate a common workflow for data conversion, calibration and visualization using echopype and echoshader.
Ten days of cloud-hosted .raw data files are accessed by echopype directly from an Amazon Web Services (AWS) S3 “bucket” maintained by the NOAA NCEI Water-Column Sonar Data Archive. The total data used are 365 .raw files at approximately 25 MB each (1 Hz pinging rate from first light to dusk), corresponding to approximately 40 GB. With echopype, each file is converted to a standardized representation and saved to Google Drive.
2.1.3. Outline#
Establish an AWS S3 file system connection and process S3-hosted raw files with echopype.
Combine MVBS with coordinate information.
Plot echograms, echomaps, and control widgets with echoshader.
2.1.4. Warning#
When the Image dimension ping_time is not evenly sampled to a relative tolerance of 0.001, a warning will be raised. Use echogram.erase_error() to get rid of these warnings.
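To see ahead of time whether a ping_time coordinate would trigger this warning, the evenness check can be sketched with NumPy. This is an illustrative helper (is_evenly_sampled is not part of echoshader or holoviews), mirroring the tolerance test only conceptually:

```python
import numpy as np

def is_evenly_sampled(times, rtol=1e-3):
    """Return True if consecutive time steps agree to within rtol,
    loosely mirroring the holoviews Image evenness check."""
    diffs = np.diff(times).astype(float)
    return bool(np.allclose(diffs, diffs.mean(), rtol=rtol))

# Hypothetical example: a regular half-hourly grid vs. one shifted ping
regular = np.arange(np.datetime64("2017-07-24T00:00"),
                    np.datetime64("2017-07-24T02:30"),
                    np.timedelta64(30, "m"))
irregular = regular.copy()
irregular[2] += np.timedelta64(5, "m")
print(is_evenly_sampled(regular))    # True
print(is_evenly_sampled(irregular))  # False
```

If the check fails, the options from the warning message apply: switch to a QuadMesh element or raise hv.config.image_rtol.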
2.1.5. Note#
We encourage importing echopype as ep for consistency.
from pathlib import Path
import itertools as it
import datetime as dt
from dateutil import parser as dtparser
import fsspec
from urllib import request
import echopype as ep
import xarray as xr
import panel
import holoviews as hv
from echoshader.echomap import EchoMap
hv.extension("bokeh")
2.2. Establish an AWS S3 file system connection and process S3-hosted raw files with echopype.#
base_dpath = Path('./exports')
base_dpath.mkdir(exist_ok=True)
calibrated_dpath = (base_dpath / 'HakeSurvey_10_Days')
calibrated_dpath.mkdir(exist_ok=True)
lon_dpath = (base_dpath / 'Lon_10_Days')
lon_dpath.mkdir(exist_ok=True)
lat_dpath = (base_dpath / 'Lat_10_Days')
lat_dpath.mkdir(exist_ok=True)
fs = fsspec.filesystem('s3', anon=True)
bucket = "ncei-wcsd-archive"
rawdirpath = "data/raw/Bell_M._Shimada/SH1707/EK60"
s3rawfiles = fs.glob(f"{bucket}/{rawdirpath}/*.raw")
start_datetime = dt.datetime(2017, 7, 24, 0, 0)
end_datetime = dt.datetime(2017, 8, 2, 23, 59)
date_list = []
for x in range(0, (end_datetime - start_datetime).days + 1):
    date = start_datetime + dt.timedelta(days=x)
    date_list.append(str(date.month).zfill(2) + str(date.day).zfill(2))
s3rawfiles = [
    s3path for s3path in s3rawfiles
    if any(f"D2017{datestr}" in s3path for datestr in date_list)
]
print(f"There are {len(s3rawfiles)} target raw files available")
for s3rawfpath in s3rawfiles:
    raw_fpath = Path(s3rawfpath)
    try:
        # Access the file directly from S3 to create a converted EchoData object in memory
        ed = ep.open_raw(
            f"s3://{s3rawfpath}",
            sonar_model='EK60',
            storage_options={'anon': True}
        )
        # Use the EchoData object "ed" to generate calibrated and
        # computed MVBS files that will be saved to netCDF
        ds_Sv = ep.calibrate.compute_Sv(ed)
        ds_MVBS = ep.preprocess.compute_MVBS(
            ds_Sv,
            range_meter_bin=5,     # in meters
            ping_time_bin='1800s'  # in seconds
        )
        ds_MVBS.to_netcdf(calibrated_dpath / f"MVBS_{raw_fpath.stem}.nc")
        ds_lon = ed['Platform'].longitude
        ds_lon.to_netcdf(lon_dpath / f"MVBS_{raw_fpath.stem}.nc")
        ds_lat = ed['Platform'].latitude
        ds_lat.to_netcdf(lat_dpath / f"MVBS_{raw_fpath.stem}.nc")
    except Exception as e:
        print(f"Failed to process raw file {raw_fpath.name}: {e}")
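The range_meter_bin and ping_time_bin arguments above define the averaging cells for MVBS. Because Sv is in decibels, the averaging is done in the linear domain before converting back to dB. A conceptual sketch of that step (mean_sv_db is an illustrative helper, not echopype's actual implementation):

```python
import numpy as np

def mean_sv_db(sv_db):
    """Average Sv samples in the linear domain, then return the mean in dB."""
    linear = 10 ** (np.asarray(sv_db, dtype=float) / 10)
    return 10 * np.log10(linear.mean())

# Averaging the dB values directly would give -65; the linear-domain mean
# is higher because the stronger scatterer dominates.
print(round(mean_sv_db([-60.0, -70.0]), 2))  # -62.6
```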
2.3. Combine MVBS with coordinate information.#
MVBS_ds = xr.open_mfdataset(
str(calibrated_dpath / 'MVBS_*.nc'),
data_vars='minimal', coords='minimal',
combine='by_coords'
)
longitude = xr.open_mfdataset(
str(lon_dpath / '*.nc'),
data_vars='minimal', coords='minimal',
combine='by_coords'
)
latitude = xr.open_mfdataset(
str(lat_dpath / '*.nc'),
data_vars='minimal', coords='minimal',
combine='by_coords'
)
lon = longitude["longitude"]
lat = latitude["latitude"]
lon = lon.interp(time1=MVBS_ds["ping_time"])
lat = lat.interp(time1=MVBS_ds["ping_time"])
MVBS_ds["longitude"] = lon
MVBS_ds["latitude"] = lat
history = (
    f"{dt.datetime.utcnow()} +00:00. "
    "Interpolated from Platform latitude/longitude."
)
MVBS_ds["latitude"] = MVBS_ds["latitude"].assign_attrs({"history": history})
MVBS_ds["longitude"] = MVBS_ds["longitude"].assign_attrs({"history": history})
MVBS_ds
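The interp call above aligns the sparsely sampled GPS track (on the time1 coordinate) with the MVBS ping_time grid by linear interpolation. The same idea with plain NumPy on toy data (the times and longitudes here are made up for illustration):

```python
import numpy as np

# GPS fixes every hour (seconds since start) and their longitudes
track_t = np.array([0.0, 3600.0, 7200.0])
track_lon = np.array([-124.0, -124.5, -125.0])

# Ping times every 30 minutes, interpolated onto the track
ping_t = np.array([0.0, 1800.0, 3600.0, 5400.0, 7200.0])
lon_on_pings = np.interp(ping_t, track_t, track_lon)
print(lon_on_pings)  # [-124.   -124.25 -124.5  -124.75 -125.  ]
```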
2.4. Plot echograms, echomaps, and control widgets with echoshader.#
# Calibrated data is stored on Google Drive
url = 'https://drive.google.com/uc?export=download&id=1E5mdgALsPApD2-vFoetIZXKyo_y3NUb9'
def urllib_download():
request.urlretrieve(url, 'test_EchoMap.nc')
urllib_download()
MVBS_ds = xr.open_mfdataset(
paths = 'test_EchoMap.nc',
data_vars = 'minimal', coords='minimal',
combine = 'by_coords'
)
MVBS_ds
<xarray.Dataset>
Dimensions:            (channel: 4, ping_time: 875, echo_range: 150)
Coordinates:
  * channel            (channel) object 'GPT 18 kHz 009072058c8d 1-1 ES18-11...
  * ping_time          (ping_time) datetime64[ns] 2017-07-24T19:30:00 ... 201...
  * echo_range         (echo_range) float64 0.0 5.0 10.0 ... 735.0 740.0 745.0
    time1              (ping_time) datetime64[ns] dask.array<chunksize=(875,), meta=np.ndarray>
Data variables:
    Sv                 (channel, ping_time, echo_range) float64 dask.array<chunksize=(4, 875, 150), meta=np.ndarray>
    frequency_nominal  (channel) float64 dask.array<chunksize=(4,), meta=np.ndarray>
    longitude          (ping_time) float64 dask.array<chunksize=(875,), meta=np.ndarray>
    latitude           (ping_time) float64 dask.array<chunksize=(875,), meta=np.ndarray>
Attributes:
    processing_software_name:     echopype
    processing_software_version:  0.6.0
    processing_time:              2022-08-20T07:11:30Z
    processing_function:          preprocess.compute_MVBS
Here we use EchoMap, which is a subclass of Echogram.
echomap = EchoMap(MVBS_ds)
Below shows how to get the map corresponding to the echogram's select box.
%%capture --no-display
panel.Row(echomap.widgets , panel.Column(echomap.view_gram, echomap.view_box_map))
WARNING:param.Image01534: Image dimension ping_time is not evenly sampled to relative tolerance of 0.001. Please use the QuadMesh element for irregularly sampled data or set a higher tolerance on hv.config.image_rtol or the rtol parameter in the Image constructor.
Below shows how to get both the map for the select box and a map responding to the full-screen echogram.
panel.Row(panel.Column(echomap.view_all_map))
Below shows how to get the curtain for the select box.
panel.Row(echomap.view_curtain)
Numba: Attempted to fork from a non-main thread, the TBB library may be in an invalid state in the child process.