Fixes for demos (#142)
* new title for GLORYS+ERA5 example

* minor rephrase

* some rephrasing

* better phrasing

* better title+reqs

* fix rendering

* fix rendering

* fix rendering
navidcy authored Apr 16, 2024
1 parent bb742c4 commit 52c3d5d
Showing 2 changed files with 28 additions and 27 deletions.
6 changes: 4 additions & 2 deletions demos/access_om2-forced.ipynb
@@ -4,9 +4,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"# Example for NCI users: Regional Tasmania JRA-55 and ACCESS-OM2\n",
+"# Regional Tasmania forced by JRA-55 reanalysis and ACCESS-OM2 model output\n",
"\n",
-"**Before you begin, make sure you have access to the relevant projects to access the data listed below**\n",
+"### Note: This example requires access to [NCI's Gadi HPC system](https://nci.org.au/our-systems/hpc-systems)\n",
+"\n",
+"**Ensure you have access to the relevant NCI projects that host the data listed below**\n",
"\n",
"## What does this notebook do?\n",
"This notebook is designed to set you up with a working MOM6 regional configuration. First, try and get it running with our default Tasmania case, then you can clone the notebook and modify for your region of interest. \n",
49 changes: 24 additions & 25 deletions demos/reanalysis-forced.ipynb
@@ -4,13 +4,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"# Example: Regional Tasmania forced by Reanalysis dataset and ERA5\n",
+"# Regional Tasmania forced by GLORYS and ERA5 reanalysis datasets\n",
"\n",
-"**Before you begin, make sure you've downloaded and installed the package, and have set up your FRE-NC tools as outlined in the package README**\n",
+"**Note: FRE-NC tools must be set up, as outlined in the `regional-mom6` package [documentation](https://regional-mom6.readthedocs.io/en/latest/).**\n",
"\n",
-"In addition, for this example you'll need a copy of the [GEBCO bathymetry](https://www.gebco.net/data_and_products/gridded_bathymetry_data/), access to the [GLORYs ocean reanalysis data](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description), and [ERA5 surface forcing for 2003](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5). \n",
+"For this example, we need a copy of the [GEBCO bathymetry](https://www.gebco.net/data_and_products/gridded_bathymetry_data/), access to the [GLORYS ocean reanalysis data](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description), and [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5). \n",
"\n",
-"This script is designed to read in the entire global extent of ERA5 and GEBCO, so you don't need to worry about cutting it down to size. "
+"This example reads in the entire global extent of ERA5 and GEBCO; we don't need to worry about cutting it down to size. "
]
},
{
@@ -22,8 +22,8 @@
"\n",
"Input Type | Source | Subsets required\n",
"---|---|---\n",
-"Surface | [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5) | Data from 2003, whole globe or subset around domain\n",
-"Ocean | [GLORYs reanalysis product](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description) | Boundary segments & initial condition. See section 2 for details. \n",
+"Surface | [ERA5 surface forcing](https://www.ecmwf.int/en/forecasts/dataset/ecmwf-reanalysis-v5) | Data from 2003; whole globe or subset around our domain\n",
+"Ocean | [GLORYS reanalysis product](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/description) | Boundary segments & initial condition; see section 2 for details. \n",
"Bathymetry | [GEBCO](https://www.gebco.net/data_and_products/gridded_bathymetry_data/) | whole globe or subset around domain"
]
},
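Before any processing, it's worth checking that the three inputs in the table above actually exist on disk. A minimal sketch in Python — the directory layout and filenames below are assumptions for illustration, not paths the package prescribes:

```python
from pathlib import Path

# Hypothetical locations for the three raw inputs (illustrative only)
required = {
    "bathymetry": Path("inputs/GEBCO_bathymetry.nc"),
    "initial condition": Path("inputs/glorys/ic_unprocessed.nc"),
    "surface forcing": Path("inputs/era5/ERA5_2003.nc"),
}

# Report anything that's missing before starting the workflow
missing = [name for name, path in required.items() if not path.exists()]
print("missing inputs:", missing)
```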
@@ -46,9 +46,7 @@
"source": [
"## Step 1: Choose our domain, define workspace paths\n",
"\n",
-"To make sure that things are working I'd recommend starting with the default example defined below. If this runs ok, then change to a domain of your choice and hopefully it runs ok too! There's some troubleshooting you can do if not (check readme / readthedocs)\n",
-"\n",
-"To find the lat/lon of the domain you want to test you can use <a href=\"https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download\" > this GUI </a> and copy paste below"
+"You can use the <a href=\"https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download\" >graphical user interface (GUI) by Copernicus</a> to find the latitude-longitude extent of the domain of interest."
]
},
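The extent read off the GUI can be recorded as two plain tuples, matching the `longitude_extent` and `latitude_extent` names used later in this notebook. The values below are placeholders for a Tasmania-like box, not the demo's exact numbers:

```python
# Placeholder domain extent in degrees; copy your own values from the GUI
longitude_extent = (143.0, 150.0)
latitude_extent = (-48.0, -38.5)

# Basic sanity checks: west < east and south < north
assert longitude_extent[0] < longitude_extent[1]
assert latitude_extent[0] < latitude_extent[1]
```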
{
Expand Down Expand Up @@ -89,19 +87,19 @@
"\n",
"We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. Naming convention is \"east_unprocessed\" for segments and \"ic_unprocessed\" for the initial condition.\n",
"\n",
-"Data can be downloaded directly from the [Copernicus Marine data store](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download) via their GUI (once you're logged in). Unfortunately their old client `motuclient` is no longer working and they're currently in the process of replacing it. Until this is restored, and this notebook is updated with their new client, users will need to download each segment manually\n",
+"Data can be downloaded directly from the [Copernicus Marine data store](https://data.marine.copernicus.eu/product/GLOBAL_MULTIYEAR_PHY_001_030/download) via their GUI (once logged in).\n",
"\n",
-"1. Using the GUI, select an area matching your xextent and yextent for the first day in your daterange. Download and label `ic_unprocessed`, then store it in your `glorys_path` folder.\n",
-"2. Using the GUI Select the Eastern boundary of your domain (if you have one that contains ocean). Give a buffer of ~0.5 degrees in all directions, and download for your full daterange. Download and label `east_unprocessed`\n",
-"3. Repeat for your other sections"
+"1. Initial condition: Using the GUI, select an area matching your `longitude_extent` and `latitude_extent` that corresponds to the first day in your date range. Download the initial condition and save it with filename `ic_unprocessed.nc` inside the `glorys_path` directory.\n",
+"2. Boundary forcing: Using the GUI, select the eastern boundary of your domain (if you have one that contains ocean). Allow for a buffer of ~0.5 degrees in all directions, download for the prescribed `date_range`, and save as `east_unprocessed.nc`.\n",
+"3. Repeat step 2 for the remaining ocean boundaries of the domain."
]
},
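The ~0.5-degree buffer in step 2 can be computed from the domain extent rather than eyeballed in the GUI. A sketch, using placeholder extent values (assumed for illustration):

```python
# Placeholder domain extent in degrees, plus the buffer suggested in step 2
longitude_extent = (143.0, 150.0)
latitude_extent = (-48.0, -38.5)
buffer = 0.5

# Download box for the eastern boundary segment: a thin strip straddling
# the eastern edge of the domain, padded by the buffer in every direction
east_box = {
    "lon_min": longitude_extent[1] - buffer,
    "lon_max": longitude_extent[1] + buffer,
    "lat_min": latitude_extent[0] - buffer,
    "lat_max": latitude_extent[1] + buffer,
}
print(east_box)
```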
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Step 3: Make experiment object\n",
-"This object keeps track of your domain basics, as well as generating the hgrid, vgrid and setting up the folder structures. \n",
+"The `regional_mom6.experiment` object contains the regional domain basics; it also generates the horizontal and vertical grids (`hgrid` and `vgrid`, respectively) and sets up the directory structure. \n",
"\n",
"\n"
]
@@ -130,14 +128,14 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"After running you can have a look at your grids by calling `expt.hgrid` and `expt.vgrid`\n",
+"We can now access the horizontal and vertical grids of the regional configuration via `expt.hgrid` and `expt.vgrid`, respectively.\n",
"\n",
-"Plotting vgrid with `marker = '.'` option let's us see the spacing, or plotting\n",
+"Plotting `vgrid` with the `marker = '.'` option allows us to inspect the vertical spacing. Alternatively,\n",
"\n",
"```python\n",
"np.diff(expt.vgrid.zl).plot(marker = '.')\n",
"```\n",
-" shows you the vertical spacing profile."
+"shows you the vertical spacing profile."
]
},
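Without a built experiment at hand, the same spacing check can be mimicked with plain NumPy on a made-up stretched coordinate (the real `expt.vgrid.zl` is an xarray coordinate; the profile below is purely illustrative):

```python
import numpy as np

# A made-up stretched vertical coordinate: layer thickness grows from
# 2 m near the surface to 100 m at depth, mimicking a typical stretched vgrid
thickness = np.linspace(2.0, 100.0, 50)
zl = np.cumsum(thickness)  # layer depths

# The quantity plotted by np.diff(expt.vgrid.zl) in the cell above
dz = np.diff(zl)
print(dz.min(), dz.max())
```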
{
@@ -146,12 +144,12 @@
"source": [
"### Modular workflow!\n",
"\n",
-"After constructing your expt object, if you don't like the default hgrid and vgrids you can simply modify and overwrite them. However, you'll then also need to save them to disk again. For example:\n",
+"After constructing our `experiment` object, if we are not happy with the default grids we can simply modify them and assign them back to the `experiment` object. In doing so, however, we also need to save them to disk again. For example:\n",
"\n",
"```python\n",
"new_hgrid = xr.open_dataset(inputdir / \"hgrid.nc\")\n",
"```\n",
-"Modify `new_hgrid`, ensuring that metadata is retained to keep MOM6 happy. Then, save your changes\n",
+"Modify `new_hgrid`, ensuring that _all metadata_ is retained to keep MOM6 happy. Then, save any changes\n",
"\n",
"```python\n",
"expt.hgrid = new_hgrid\n",
@@ -166,7 +164,7 @@
"source": [
"## Step 4: Set up bathymetry\n",
"\n",
-"Similarly to ocean forcing, we point our 'bathymetry' method at the location of the file of choice, and pass it a dictionary mapping variable names. This time we don't need to preprocess the topography since it's just a 2D field and easier to deal with. Afterwards you can run `expt.topog` and have a look at your domain. After running this cell, your input directory will contain other topography - adjacent things like the ocean mosaic and mask table too. This defaults to a 10x10 layout which can be updated later."
+"Similarly to the ocean forcing, we point the experiment's `bathymetry` method at the file of choice and pass a dictionary mapping variable names. We don't need to preprocess the bathymetry since it is simply a two-dimensional field and thus easier to deal with. Afterwards, we can run `expt.topog` to have a look at our domain. After running this cell, the input directory will also contain other topography-adjacent files, such as the ocean mosaic and the mask table. The mask table defaults to a 10x10 layout, which can be updated later."
]
},
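The "dictionary mapping variable names" boils down to renaming the source file's variables to the names the package expects. A toy illustration with plain dictionaries — the GEBCO-side names (`elevation`, `lat`, `lon`) and the package-side keys here are assumptions for the sketch, not the package's required spelling:

```python
# Package-side name -> name in the (hypothetical) source file
bathy_varnames = {"depth": "elevation", "y": "lat", "x": "lon"}

# A stand-in for the opened dataset: three variables with toy values
raw = {
    "elevation": [-5000.0, -200.0, 10.0],
    "lat": [-48.0, -43.0, -38.5],
    "lon": [143.0, 146.5, 150.0],
}

# Apply the mapping: the data is untouched, only the names change
remapped = {ours: raw[theirs] for ours, theirs in bathy_varnames.items()}
print(sorted(remapped))
```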
{
Expand Down Expand Up @@ -225,14 +223,14 @@
"outputs": [],
"source": [
"# Define a mapping from the GLORYS variables and dimensions to the MOM6 ones\n",
-"ocean_varnames = {\"time\":\"time\",\n",
+"ocean_varnames = {\"time\": \"time\",\n",
" \"y\": \"latitude\",\n",
" \"x\": \"longitude\",\n",
" \"zl\": \"depth\",\n",
" \"eta\": \"zos\",\n",
" \"u\": \"uo\",\n",
" \"v\": \"vo\",\n",
-" \"tracers\":{\"salt\": \"so\", \"temp\": \"thetao\"}\n",
+" \"tracers\": {\"salt\": \"so\", \"temp\": \"thetao\"}\n",
" }\n",
"\n",
"# Set up the initial condition\n",
@@ -276,12 +274,13 @@
"metadata": {},
"source": [
"## Step 7: Set up ERA5 forcing:\n",
-"Here we assume you've already got ERA5 data stored somewhere on your system. \n",
"\n",
-"For this example, we are forcing for the entire year of 2003 so we just generate a single forcing file with 2003's data.\n",
+"Here we assume the ERA5 dataset is stored somewhere on the system we are working on. \n",
"\n",
-"Below is a table showing ERA5 characteristics and what needs to be done to sort it out\n",
-"### Required ERA data:\n",
+"\n",
+"**Required ERA data**:\n",
"\n",
"Name | ERA filename | era variable name | Units\n",
"---|---|---|---\n",
"Surface Pressure | sp | sp | Pa \n",
