added glorys download functions with new API (#180)
* added glorys download functions with new API

* black

* black

* bugfix

* bugfix

* bugfix

* update notebook

* black

---------

Co-authored-by: Ashley Barnes <[email protected]>
3 people authored Aug 23, 2024
1 parent 4a863ad commit 071b54b
Showing 2 changed files with 136 additions and 17 deletions.
42 changes: 26 additions & 16 deletions demos/reanalysis-forced.ipynb
@@ -117,22 +117,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 2: Prepare ocean forcing data\n",
"\n",
"We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. Naming convention is `\"east_unprocessed\"` for segments and `\"ic_unprocessed\"` for the initial condition.\n",
"\n",
"Data can be downloaded directly from the [Copernicus Marine data store](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download) via their GUI (once logged in).\n",
"\n",
"1. Initial condition: Using the GUI, select an area matching your `longitude_extent` and `latitude_extent` that corresponds to the first day in your date range. Download the initial condition and save it with filename `ic_unprocessed.nc` inside the `glorys_path` directory.\n",
"2. Boundary forcing: Using the GUI, select the Eastern boundary of your domain (if you have one that contains ocean). Allow for a buffer of ~0.5 degrees in all directions, and download for the prescribed `date_range`. Download and name `east_unprocessed.nc`.\n",
"3. Repeat step 2 for the remaining sections of the domain."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Make experiment object\n",
"## Step 2: Make experiment object\n",
"The `regional_mom6.experiment` contains the regional domain basics, and also generates the horizontal and vertical grids, `hgrid` and `vgrid` respectively, and sets up the directory structures. "
]
},
@@ -185,6 +170,31 @@
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Prepare ocean forcing data\n",
"\n",
"We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. Naming convention is `\"east_unprocessed\"` for segments and `\"ic_unprocessed\"` for the initial condition.\n",
"\n",
"In this notebook, we force with the Copernicus Marine \"GLORYS\" reanalysis dataset. The `regional-mom6` package provides a function that generates a bash script to download the correct boundary forcing files for your experiment. You will first need to create a Copernicus Marine account; you'll be prompted for your username and password when you run the bash script.\n",
"\n",
"The function is called `get_glorys_rectangular` because the fully automated setup is only supported for domains with boundaries parallel to lines of longitude and latitude. To download data for more complex domain shapes, you can call `rmom6.get_glorys_data` directly."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"expt.get_glorys_rectangular(\n",
" raw_boundaries_path=glorys_path,\n",
" boundaries=[\"north\", \"south\", \"east\", \"west\"],\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
111 changes: 110 additions & 1 deletion regional_mom6/regional_mom6.py
@@ -13,7 +13,7 @@
import shutil
import os
import importlib.resources

import datetime
from .utils import quadrilateral_areas


@@ -145,6 +145,53 @@ def longitude_slicer(data, longitude_extent, longitude_coords):
return data


from pathlib import Path


def get_glorys_data(
longitude_extent,
latitude_extent,
timerange,
segment_name,
download_path,
modify_existing=True,
):
    """
    Generates a bash script that downloads all of the required ocean forcing data.

    Args:
        longitude_extent (tuple of floats): Western and eastern extents of the segment.
        latitude_extent (tuple of floats): Southern and northern extents of the segment.
        timerange (tuple of str): Start and end of the segment, each formatted as ``%Y-%m-%d %H:%M:%S``.
        segment_name (str): Name of the segment without the ``.nc`` extension, e.g. ``east_unprocessed``.
        download_path (str): Directory in which the generated script is saved.
        modify_existing (bool): Whether to append to an existing script or start a new one.
    """
buffer = 0.24 # Pads downloads to ensure that interpolation onto desired domain doesn't fail. Default of 0.24 is twice Glorys cell width (12th degree)

    path = Path(download_path)
    script = path / "get_glorysdata.sh"

    if modify_existing:
        with open(script, "r") as file:
            lines = file.readlines()
    else:
        lines = ["#!/bin/bash\ncopernicusmarine login"]

    lines.append(
        f"""
copernicusmarine subset --dataset-id cmems_mod_glo_phy_my_0.083deg_P1D-m --variable so --variable thetao --variable uo --variable vo --variable zos --start-datetime {str(timerange[0]).replace(" ","T")} --end-datetime {str(timerange[1]).replace(" ","T")} --minimum-longitude {longitude_extent[0] - buffer} --maximum-longitude {longitude_extent[1] + buffer} --minimum-latitude {latitude_extent[0] - buffer} --maximum-latitude {latitude_extent[1] + buffer} --minimum-depth 0 --maximum-depth 6000 -o {str(path)} -f {segment_name}.nc --force-download
"""
    )

    with open(script, "w") as file:
        file.writelines(lines)
return
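
For reference, the CLI line that the script assembles can be sketched as a standalone helper. This is a sketch only: the helper name `glorys_subset_command` and the example extents are hypothetical and not part of the package; only the flag layout mirrors the function above.

```python
# Sketch of the `copernicusmarine subset` line appended to the download script,
# padding the requested extents by `buffer` degrees on every side.
def glorys_subset_command(longitude_extent, latitude_extent, timerange,
                          segment_name, download_path, buffer=0.24):
    """Build the CLI line for one segment (hypothetical helper)."""
    start = str(timerange[0]).replace(" ", "T")
    end = str(timerange[1]).replace(" ", "T")
    return (
        "copernicusmarine subset --dataset-id cmems_mod_glo_phy_my_0.083deg_P1D-m "
        "--variable so --variable thetao --variable uo --variable vo --variable zos "
        f"--start-datetime {start} --end-datetime {end} "
        f"--minimum-longitude {longitude_extent[0] - buffer} "
        f"--maximum-longitude {longitude_extent[1] + buffer} "
        f"--minimum-latitude {latitude_extent[0] - buffer} "
        f"--maximum-latitude {latitude_extent[1] + buffer} "
        "--minimum-depth 0 --maximum-depth 6000 "
        f"-o {download_path} -f {segment_name}.nc --force-download"
    )

# Illustrative values for an eastern boundary segment (zero longitude width).
cmd = glorys_subset_command((150.0, 150.0), (-40.0, -30.0),
                            ("2003-01-01 00:00:00", "2003-01-05 00:00:00"),
                            "east_unprocessed", "glorys_data")
```

The `--force-download` flag skips the interactive confirmation prompt, so the script only pauses for credentials.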


def hyperbolictan_thickness_profile(nlayers, ratio, total_depth):
"""Generate a hyperbolic tangent thickness profile with ``nlayers`` vertical
layers and total depth of ``total_depth`` whose bottom layer is (about) ``ratio``
@@ -899,6 +946,68 @@ def initial_condition(

return

def get_glorys_rectangular(
self, raw_boundaries_path, boundaries=["south", "north", "west", "east"]
):
        """
        Wrapper for `get_glorys_data`, called once for the initial condition and once for
        each rectangular boundary segment. For more complex boundary shapes, call
        `get_glorys_data` directly for each boundary that is not parallel to a line of
        constant latitude or longitude.

        Args:
            raw_boundaries_path (str): Path to the directory containing the raw boundary forcing files.
            boundaries (List[str]): List of cardinal directions for which to create boundary forcing files.
                Default is ``["south", "north", "west", "east"]``.
        """

# Initial Condition
get_glorys_data(
self.longitude_extent,
self.latitude_extent,
[
self.date_range[0],
self.date_range[0] + datetime.timedelta(days=1),
],
"ic_unprocessed",
raw_boundaries_path,
modify_existing=False,
)
if "east" in boundaries:
get_glorys_data(
[self.longitude_extent[1], self.longitude_extent[1]],
[self.latitude_extent[0], self.latitude_extent[1]],
self.date_range,
"east_unprocessed",
raw_boundaries_path,
)
if "west" in boundaries:
get_glorys_data(
[self.longitude_extent[0], self.longitude_extent[0]],
[self.latitude_extent[0], self.latitude_extent[1]],
self.date_range,
"west_unprocessed",
raw_boundaries_path,
)
if "north" in boundaries:
get_glorys_data(
[self.longitude_extent[0], self.longitude_extent[1]],
[self.latitude_extent[1], self.latitude_extent[1]],
self.date_range,
"north_unprocessed",
raw_boundaries_path,
)
if "south" in boundaries:
get_glorys_data(
[self.longitude_extent[0], self.longitude_extent[1]],
[self.latitude_extent[0], self.latitude_extent[0]],
self.date_range,
"south_unprocessed",
raw_boundaries_path,
)

        print(
            f"Script `get_glorysdata.sh` has been created at {raw_boundaries_path}.\n"
            "Run it with bash from a terminal with internet access to download the data.\n"
            "You will need to enter your Copernicus Marine username and password.\n"
            "If you don't have an account, make one here:\n"
            "https://data.marine.copernicus.eu/register"
        )
return
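
The four boundary calls above differ only in how they collapse the domain extents: meridional boundaries (east/west) have zero longitude width, zonal boundaries (north/south) zero latitude width. A minimal sketch of that pattern (the helper `boundary_extents` is illustrative, not part of the package):

```python
# Hypothetical helper showing the extents get_glorys_rectangular passes to
# get_glorys_data for each rectangular boundary segment.
def boundary_extents(longitude_extent, latitude_extent, side):
    """Return (longitude_extent, latitude_extent) for one boundary segment."""
    lon0, lon1 = longitude_extent
    lat0, lat1 = latitude_extent
    return {
        "east":  ([lon1, lon1], [lat0, lat1]),  # fixed longitude, full latitude span
        "west":  ([lon0, lon0], [lat0, lat1]),
        "north": ([lon0, lon1], [lat1, lat1]),  # full longitude span, fixed latitude
        "south": ([lon0, lon1], [lat0, lat0]),
    }[side]
```

The 0.24-degree buffer is added later, inside `get_glorys_data`, so each degenerate segment still downloads a thin strip of ocean wide enough for interpolation.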

def rectangular_boundaries(
self,
raw_boundaries_path,
