
Mostly ERA5-GLORYS demo notebook tweaks #160

Merged · 11 commits · Apr 23, 2024
2 changes: 1 addition & 1 deletion README.md
@@ -92,7 +92,7 @@ The above installs the version of `regional-mom6` (plus any required dependencies

#### "*I want to live on the edge! I want the latest developments*"

To install `regional-mom6` directly via GitHub using `pip`, first install `esmpy` as described above. Then:
To install `regional-mom6` directly from the [GitHub repository](https://github.com/COSIMA/regional-mom6/) using `pip`, first install `esmpy` as described above. Then:

```bash
pip install git+https://github.com/COSIMA/regional-mom6.git
98 changes: 57 additions & 41 deletions demos/reanalysis-forced.ipynb
@@ -4,26 +4,41 @@
"cell_type": "markdown",
Review comment from @navidcy (Contributor), Apr 23, 2024, via ReviewNB:

> Line #14.    `## Directory where fre tools are stored, e.g. on NCI Gadi /home/157/ahg157/repos/mom5/src/tools/`
>
> we want to avoid referencing NCI which means nothing to most people

"metadata": {},
"source": [
"# Regional Tasmania forced by GLORYS and ERA5 reanalysis datasets\n",
"# Regional Tasmanian domain forced by GLORYS and ERA5 reanalysis datasets\n",
"\n",
"**Note: FRE-NC tools are required to be set up, as outlined in the `regional-mom6` package [documentation](https://regional-mom6.readthedocs.io/en/latest/).**\n",
"**Note**: FRE-NC tools are required to be set up, as outlined in the [documentation](https://regional-mom6.readthedocs.io/en/latest/) of the `regional-mom6` package.\n",
"\n",
"For this example we need a copy of the [GEBCO bathymetry](https://www.gebco.net/data_and_products/gridded_bathymetry_data/), access to the [GLORYs ocean reanalysis data](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description), and [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5). \n",
"For this example we need:\n",
"\n",
"This example reads in the entire global extent of ERA5 and GEBCO; we don't need to worry about cutting it down to size. "
"- [GEBCO bathymetry](https://www.gebco.net/data_and_products/gridded_bathymetry_data/)\n",
"- [GLORYS ocean reanalysis data](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description), and\n",
"- [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5)\n",
"\n",
"This example reads in the entire global extent of ERA5 and GEBCO; we don't need to worry about cutting it down to size."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What does the `regional_mom6` package do?\n",
"\n",
"Setting up a regional model in MOM6 can be a pain. The goal of this package is that users should spend their debugging time fixing a model that's running and doing weird things, rather than puzzling over a model that won't even start.\n",
"\n",
"In running this notebook, you'll hopefully have a running MOM6 regional model. There will still be a lot of fiddling to do with the `MOM_input` file to make sure that the parameters are set up right for your domain, and you might want to manually edit some of the input files. *But*, this package should help you bypass most of the woes of regridding, encoding and understanding the arcane arts of the MOM6 boundary segment files. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What does this notebook do?\n",
"This notebook is designed to set you up with a working MOM6 regional configuration. First, try and get it running with our default Tasmania case, then you can clone the notebook and modify for your region of interest. \n",
"This notebook is designed to set you up with a working MOM6 regional configuration. First, try to get it running with our default Tasmania case, then you can clone the notebook and modify for your region of interest. \n",
"\n",
"Input Type | Source | Subsets required\n",
"---|---|---\n",
"Surface | [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5) | Data from 2003; whole globe or subset around our domain\n",
"Ocean | [GLORYs reanalysis product](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description) | Boundary segments & initial condition; see section 2 for details. \n",
"Ocean | [GLORYS reanalysis product](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description) | Boundary segments & initial condition; see section 2 for details. \n",
"Bathymetry | [GEBCO](https://www.gebco.net/data_and_products/gridded_bathymetry_data/) | whole globe or subset around domain"
]
},
@@ -63,12 +78,14 @@
"source": [
"## Step 1: Choose our domain, define workspace paths\n",
"\n",
"You can use the <a href=\"https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download\" >graphical user interface (GUI) by Copernicus</a> to find the latitude-longitude extent of the domain of your interest."
"To make sure that things are working I'd recommend starting with the default example defined below. If this runs ok, then change to a domain of your choice and hopefully it runs ok too! If not, check the [README](https://github.com/COSIMA/regional-mom6/blob/main/README.md) and [documentation](https://regional-mom6.readthedocs.io/) for troubleshooting tips.\n",
"\n",
"You can log in and use [this GUI](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download) to find the latitude/longitude extent of your domain and copy-paste it below."
]
},
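The code cell that follows is collapsed in this diff; for orientation, a minimal sketch of the kind of domain definition it contains. The variable names (`expt_name`, `longitude_extent`, `latitude_extent`, `date_range`, `input_dir`, `run_dir`) appear elsewhere in the notebook, but the specific values below are illustrative assumptions, not the notebook's exact defaults:

```python
from pathlib import Path

# Illustrative values only -- the Tasmania numbers below are assumptions.
expt_name = "tasmania-forced-example"

latitude_extent = (-48.0, -38.95)   # degrees north
longitude_extent = (143.0, 150.0)   # degrees east

# Date range covered by the forcing data (start, end)
date_range = ("2003-01-01 00:00:00", "2003-01-05 00:00:00")

# Workspace paths, one per experiment
input_dir = Path(f"mom6_input_directories/{expt_name}/")
run_dir = Path(f"mom6_run_directories/{expt_name}/")
```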
{
"cell_type": "code",
"execution_count": 4,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
@@ -85,16 +102,15 @@
"## Directory where you'll run the experiment from\n",
"run_dir = Path(f\"mom6_run_directories/{expt_name}/\")\n",
"\n",
"## Directory where fre tools are stored \n",
"toolpath_dir = Path(\"PATH_TO_FRE_TOOLS\") ## Compiled tools needed for construction of mask tables\n",
"## Directory where compiled FRE tools are located (needed for construction of mask tables)\n",
"toolpath_dir = Path(\"PATH_TO_FRE_TOOLS\")\n",
"\n",
"## Path to where your raw ocean forcing files are stored\n",
"glorys_path = Path(\"PATH_TO_GLORYS_DATA\" )\n",
"\n",
"## if directories don't exist, create them\n",
"for path in (run_dir, glorys_path, input_dir):\n",
" if not os.path.exists(path):\n",
" os.makedirs(path)"
" os.makedirs(str(path), exist_ok=True)"
]
},
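The directory-creation loop above can also be written with `pathlib` alone; a sketch with the same semantics (placeholder paths, no `os` module needed):

```python
from pathlib import Path

# Placeholder paths, as in the notebook cell above.
run_dir = Path("mom6_run_directories/example/")
glorys_path = Path("PATH_TO_GLORYS_DATA")
input_dir = Path("mom6_input_directories/example/")

# mkdir with parents=True and exist_ok=True is a no-op if the
# directory already exists, so no existence check is needed.
for path in (run_dir, glorys_path, input_dir):
    path.mkdir(parents=True, exist_ok=True)
```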
{
@@ -103,23 +119,21 @@
"source": [
"## Step 2: Prepare ocean forcing data\n",
"\n",
"We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. Naming convention is \"east_unprocessed\" for segments and \"ic_unprocessed\" for the initial condition.\n",
"We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. Naming convention is `\"east_unprocessed\"` for segments and `\"ic_unprocessed\"` for the initial condition.\n",
"\n",
"Data can be downloaded directly from the [Copernicus Marine data store](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download) via their GUI (once logged in).\n",
"\n",
"1. Initial condition: Using the GUI, select an area matching your `longitude_extent` and `latitude_extent` that corresponds to the first day in your date range. Download the initial condition and save it with filename `ic_unprocessed.nc` inside the `glorys_path` directory.\n",
"2. Boundary forcing: Using the GUI, select the Eastern boundary of your domain (if you have one that contains ocean). Allow for a buffer of ~0.5 degrees in all directions, download for the prescribed `date_range`, and save it as `east_unprocessed.nc`.\n",
"3. Repeat step 2 for the rest sections of the domain."
"3. Repeat step 2 for the remaining sections of the domain."
]
},
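The ~0.5-degree buffer in step 2 can be computed rather than eyeballed in the GUI; a small hypothetical helper (not part of `regional-mom6`) that returns the latitude/longitude box to request for each boundary segment:

```python
def boundary_box(longitude_extent, latitude_extent, side, buffer=0.5):
    """Return (lon_min, lon_max, lat_min, lat_max) to request for one
    boundary segment, padded by `buffer` degrees in all directions."""
    lon_min, lon_max = longitude_extent
    lat_min, lat_max = latitude_extent
    if side == "east":
        box = (lon_max, lon_max, lat_min, lat_max)
    elif side == "west":
        box = (lon_min, lon_min, lat_min, lat_max)
    elif side == "north":
        box = (lon_min, lon_max, lat_max, lat_max)
    elif side == "south":
        box = (lon_min, lon_max, lat_min, lat_min)
    else:
        raise ValueError(f"unknown side: {side}")
    lo, hi, la, lb = box
    return (lo - buffer, hi + buffer, la - buffer, lb + buffer)

# e.g. the eastern boundary of a Tasmanian domain:
print(boundary_box((143.0, 150.0), (-48.0, -39.0), "east"))
```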
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Make experiment object\n",
"The `regional_mom6.experiment` contains the regional domain basics, as well as it generates the horizontal and vertical grids, `hgrid` and `vgrid` respectively, and sets up the directory structures. \n",
"\n",
"\n"
"The `regional_mom6.experiment` contains the regional domain basics, and also generates the horizontal and vertical grids, `hgrid` and `vgrid` respectively, and sets up the directory structures. "
]
},
{
@@ -148,31 +162,26 @@
"source": [
"We can now access the horizontal and vertical grid of the regional configuration via `expt.hgrid` and `expt.vgrid` respectively.\n",
"\n",
"By plotting `vgrid` with `marker = '.'` option allows us to inspect the vertical spacing. Alternatively,\n",
"\n",
"Plotting the vertical grid with `marker = '.'` lets you see the spacing. You can use `numpy.diff` to compute the vertical spacings, e.g.,\n",
"```python\n",
"import numpy as np\n",
"np.diff(expt.vgrid.zl).plot(marker = '.')\n",
"```\n",
"shows you the vertical spacing profile."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"shows you the vertical spacing profile.\n",
"\n",
"### Modular workflow!\n",
"\n",
"After constructing our `experiment` object, if for some reason we are not happy with the grids, then we can simply modify them and then save them back into the `experiment` object. However, in doing so, we also need to save them to disk again. For example:\n",
"After constructing your `expt` object, if you don't like the default `hgrid` and `vgrid` you can simply modify and then save them back into the `expt` object. However, you'll then also need to save them to disk again. For example:\n",
"\n",
"```python\n",
"new_hgrid = xr.open_dataset(inputdir / \"hgrid.nc\")\n",
"new_hgrid = xr.open_dataset(input_dir / \"hgrid.nc\")\n",
"```\n",
"Modify `new_hgrid`, ensuring that _all metadata_ is retained to keep MOM6 happy. Then, save any changes\n",
"Modify `new_hgrid`, ensuring that _all metadata_ is retained to keep MOM6 happy. Then, save your changes\n",
"\n",
"```python\n",
"expt.hgrid = new_hgrid\n",
"\n",
"expt.hgrid.to_netcdf(inputdir / \"hgrid.nc\")\n",
"expt.hgrid.to_netcdf(input_dir / \"hgrid.nc\")\n",
"```"
]
},
@@ -182,9 +191,9 @@
"source": [
"## Step 4: Set up bathymetry\n",
"\n",
"Similarly to ocean forcing, we point the experiment's `bathymetry` method at the location of the file of choice and also pass a dictionary mapping variable names. We don't don't need to preprocess the bathymatry since it is simply a two-dimensional field and is easier to deal with. Afterwards the bathymetry is stored at `expt.bathymetry`.\n",
"Similarly to ocean forcing, we point the experiment's `setup_bathymetry` method at the location of the file of choice and also provide the variable names. We don't need to preprocess the bathymetry since it is simply a two-dimensional field and is easier to deal with. Afterwards you can inspect `expt.bathymetry` to have a look at the regional domain.\n",
"\n",
"After running this cell, your input directory will contain other bathymetry-related things like the ocean mosaic and mask table too. This defaults to a 10x10 layout which can be updated later."
"After running this cell, your input directory will contain other bathymetry-related things like the ocean mosaic and mask table too. The mask table defaults to a 10x10 layout and can be modified later."
]
},
{
@@ -252,7 +261,7 @@
"\n",
"This cuts out and interpolates the initial condition as well as all boundaries (unless you don't pass it boundaries).\n",
"\n",
"The dictionary maps the MOM6 variable names to what they're called in your ocean input file. Notice how for GLORYs, the horizontal dimensions are `x` and `y`, vs `xh`, `yh`, `xq`, `yq` for ACCESS OM2-01. This is because for an 'A' grid type tracers share the grid with velocities so there's no difference.\n",
"The dictionary maps the MOM6 variable names to what they're called in your ocean input file. Notice how for GLORYS, the horizontal dimensions are `latitude` and `longitude`, vs `xh`, `yh`, `xq`, `yq` for MOM6. This is because for an 'A' grid type tracers share the grid with velocities so there's no difference.\n",
"\n",
"If one of your segments is land, you can delete its string from the 'boundaries' list. You'll need to update MOM_input to reflect this though so it knows how many segments to look for, and their orientations."
]
@@ -265,8 +274,8 @@
"source": [
"# Define a mapping from the GLORYS variables and dimensions to the MOM6 ones\n",
"ocean_varnames = {\"time\": \"time\",\n",
" \"y\": \"latitude\",\n",
" \"x\": \"longitude\",\n",
" \"yh\": \"latitude\",\n",
" \"xh\": \"longitude\",\n",
" \"zl\": \"depth\",\n",
" \"eta\": \"zos\",\n",
" \"u\": \"uo\",\n",
@@ -316,19 +325,19 @@
"source": [
"## Step 7: Set up ERA5 forcing:\n",
"\n",
"Here we assume the ERA5 dataset is stored somewhere on the system we are working. \n",
"Here we assume the ERA5 dataset is stored somewhere on the system we are working on. \n",
"\n",
"Below is a table showing ERA5 characteristics and what needs to be done to sort it out\n",
"Below is a table showing ERA5 characteristics and what needs to be done to sort it out.\n",
"\n",
"**Required ERA data**:\n",
"**Required ERA5 data**:\n",
"\n",
"Name | ERA filename | era variable name | Units\n",
"Name | ERA5 filename | ERA5 variable name | Units\n",
"---|---|---|---\n",
"Surface Pressure | sp | sp | Pa \n",
"Surface Temperature | 2t | t2m | K \n",
"Meridional Wind | 10v | v10 | m/s \n",
"Zonal Wind | 10u | u10 | m/s \n",
"Specific Humidity | na | na | kg/kg, calculated from dewpoint temperature\n",
"Specific Humidity | - | - | kg/kg, calculated from dewpoint temperature\n",
"Dewpoint Temperature | 2d | d2m | K\n",
"\n",
"\n",
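The dewpoint-to-specific-humidity conversion the table mentions can be sketched with a standard meteorological approximation (Bolton's formula for saturation vapor pressure); this is illustrative and not necessarily the exact method the package uses:

```python
import math

def specific_humidity(dewpoint_K, surface_pressure_Pa):
    """Approximate specific humidity (kg/kg) from 2 m dewpoint (K) and
    surface pressure (Pa), using Bolton's saturation vapor pressure."""
    Td = dewpoint_K - 273.15                          # dewpoint in deg C
    e = 611.2 * math.exp(17.67 * Td / (Td + 243.5))   # vapor pressure (Pa)
    return 0.622 * e / (surface_pressure_Pa - 0.378 * e)

# e.g. a mild, humid surface air mass (values on the order of 0.01 kg/kg):
q = specific_humidity(dewpoint_K=288.15, surface_pressure_Pa=101325.0)
print(f"{q:.4f} kg/kg")
```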
@@ -386,6 +395,13 @@
"\n",
"Another thing that can go wrong is little bays that create non-advective cells at your boundaries. Keep an eye out for tiny bays where one side is taken up by a boundary segment. You can either fill them in manually, or move your boundary slightly to avoid them."
]
},
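Filling such bays "manually" amounts to editing the depth field in the bathymetry file. A hypothetical numpy helper, not part of the package, that turns near-isolated ocean cells into land; real bathymetry files may encode land with a fill value or 0 rather than NaN, so adapt accordingly:

```python
import numpy as np

def fill_nonadvective_cells(depth, min_ocean_neighbors=2):
    """Turn ocean cells with fewer than `min_ocean_neighbors` ocean
    neighbors (N/S/E/W) into land (NaN). Returns a modified copy;
    land is assumed to be NaN in `depth`."""
    depth = depth.copy()
    ocean = ~np.isnan(depth)
    # Count N/S/E/W ocean neighbors; the domain edge counts as land.
    neighbors = np.zeros(depth.shape, dtype=int)
    neighbors[1:, :] += ocean[:-1, :]
    neighbors[:-1, :] += ocean[1:, :]
    neighbors[:, 1:] += ocean[:, :-1]
    neighbors[:, :-1] += ocean[:, 1:]
    depth[ocean & (neighbors < min_ocean_neighbors)] = np.nan
    return depth
```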
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@@ -404,7 +420,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
"version": "3.10.13"
}
},
"nbformat": 4,
3 changes: 1 addition & 2 deletions docs/index.rst
@@ -1,8 +1,7 @@
Regional MOM6 Documentation
===========================

Python package for automatic generation of regional configurations for the `Modular Ocean Model version 6`_ (MOM6).

`regional-mom6 <https://github.com/COSIMA/regional-mom6/>`_ is a Python package for automatic generation of regional configurations for the `Modular Ocean Model version 6`_ (MOM6).

In brief...
-----------
2 changes: 1 addition & 1 deletion docs/installation.md
@@ -39,7 +39,7 @@ The above installs the version of `regional-mom6` (plus any required dependencies

## "*I want to live on the edge! I want the latest developments*"

To install `regional-mom6` directly via GitHub using `pip`, first install `esmpy` as described above. Then:
To install `regional-mom6` directly from the [GitHub repository](https://github.com/COSIMA/regional-mom6/) using `pip`, first install `esmpy` as described above. Then:

```bash
pip install git+https://github.com/COSIMA/regional-mom6.git
4 changes: 1 addition & 3 deletions regional_mom6/regional_mom6.py
@@ -1168,7 +1168,7 @@ def tidy_bathymetry(
{"depth": (["ny", "nx"], bathymetry["elevation"].values)}
)
bathymetry.attrs["depth"] = "meters"
bathymetry.attrs["standard_name"] = "bathymetryraphic depth at T-cell centers"
bathymetry.attrs["standard_name"] = "bathymetric depth at T-cell centers"
bathymetry.attrs["coordinates"] = "zi"

bathymetry.expand_dims("tiles", 0)
Expand Down Expand Up @@ -1400,8 +1400,6 @@ def setup_run_directory(
existing files in the 'rundir' directory for the experiment.

Args:
regional_mom6_path (str): Path to the regional MOM6 source code that was cloned
from GitHub. Default is current path, ``'.'``.
surface_forcing (Optional[str]): Specify the choice of surface forcing, one
of: ``'jra'`` or ``'era5'``. If not prescribed then constant fluxes are used.
using_payu (Optional[bool]): Whether or not to use payu (https://github.com/payu-org/payu)