120 m and 250 m meshes running WITH ADCIRCPY #56
ADCIRC-only configuration working folder:
|
I've built a system to store / read configurations to JSON files |
@JaimeCalzadaNOAA @zacharyburnettNOAA Hi Zach and Jaime, as the next step let's run these cases:
|
I will be out Thursday and Friday, but https://github.com/noaa-ocs-modeling/CoupledModelDriver/blob/main/README.md#usage has information on how to use the new JSON configuration system if you'd like to try it out |
@zacharyburnettNOAA @JaimeCalzadaNOAA Make sure to include: Thanks, |
updated to
both are currently running coldstart (have completed the mesh decomposition step) |
MESH DECOMPOSITION
NEMS HSOFS 120m mesh run: mesh decomposition seems to have completed successfully (there are 598
NEMS HSOFS 250m mesh run: mesh decomposition completed successfully (there are 598
ADCIRC-only HSOFS 250m mesh run: mesh decomposition failed (no
I will look into rebuilding |
|
Ok, I'll make sure the partition argument is treated as optional.
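A minimal sketch of the fix described above, with a hypothetical `SlurmConfigSketch` class standing in for adcircpy's `SlurmConfig` (the class name, attributes, and method here are illustrative, not the actual adcircpy implementation):

```python
class SlurmConfigSketch:
    """Illustrative stand-in for a Slurm job configuration with an optional partition."""

    def __init__(self, account, ntasks, partition=None):
        self.account = account
        self.ntasks = ntasks
        # `partition` defaults to None so callers may omit it entirely
        self.partition = partition

    def sbatch_directives(self):
        directives = [
            f'#SBATCH -A {self.account}',
            f'#SBATCH -n {self.ntasks}',
        ]
        # only emit the partition directive when one was actually given
        if self.partition is not None:
            directives.append(f'#SBATCH --partition={self.partition}')
        return directives


config = SlurmConfigSketch(account='coastal', ntasks=1000)
print(config.sbatch_directives())  # no --partition directive emitted
```

With this pattern, omitting `partition` no longer raises a `TypeError`; the directive is simply left out of the generated job script.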
…On Thu, Apr 8, 2021 at 2:09 PM Saeed Moghimi ***@***.***> wrote:
(coupledmodeldriver) ***@***.***:/scratch2/COASTAL/coastal/save/shared/saeed/adcircpy$
python example_3_250m.py
Traceback (most recent call last):
  File "/scratch2/COASTAL/coastal/save/shared/saeed/adcircpy/example_3_250m.py", line 85, in <module>
    main()
  File "/scratch2/COASTAL/coastal/save/shared/saeed/adcircpy/example_3_250m.py", line 48, in main
    slurm = SlurmConfig(
TypeError: __init__() missing 1 required positional argument: 'partition'
—
You are receiving this because you were mentioned.
Reply to this email directly, view it on GitHub
<#56 (comment)>,
or unsubscribe
<https://github.com/notifications/unsubscribe-auth/AQ3T4UPWTEY5H5YUHDDCXG3THXWNRANCNFSM4Z7URQQQ>
.
--
Jaime R. Calzada
UCAR
Marine Modeling and Analysis Branch
National Ocean Service / Office of Coast Survey Development Laboratory
National Oceanic and Atmospheric Administration
1315 East West Highway, N/CS13
Silver Spring, MD 20910-3282
phone: (301) 713-2809 x 119
mobile: (787) 484-6944
|
@JaimeCalzadaNOAA Please check: ADCIRCPY - adcprep error on 250m case see here:
|
You need to replace "netcdf/4.7.2-parallel" with the actual module name
loaded for your compiled ADCIRC binary.
…On Thu, Apr 8, 2021 at 4:25 PM Saeed Moghimi ***@***.***> wrote:
Prep error on 250m case see here:
/scratch2/COASTAL/coastal/save/shared/saeed/adcircpy/outputs/example_3_250m/coldstart/
Lmod has detected the following error: The following module(s) are unknown:
"netcdf/4.7.2-parallel"
Please check the spelling or version number. Also try "module spider ..."
It is also possible your cache file is out-of-date; it may help to try:
$ module --ignore-cache load "netcdf/4.7.2-parallel"
Also make sure that all modulefiles written in TCL start with the string
#%Module
------------------------------
Start Epilog v20.08.28 on node h2c48 for job 17762725 :: Thu Apr 8
17:59:48 UTC 2021
Job 17762725 (not serial) finished for user Saeed.Moghimi in partition
hera with exit code 1:0
------------------------------
End Epilogue v20.08.28 Thu Apr 8 17:59:48 UTC 2021
INFO: Processing --np
INFO: Processing --partmesh
File fort.14
WAS FOUND! Opening & Processing file
from alloc_main1:
memory currently allocated = 1347642720 bytes
memory high water mark = -1649704876 bytes
memory currently allocated = 1441352512 bytes
memory high water mark = 1535062304 bytes
Global Grid file read successfully.
INFO: This mesh has 7698 weir node pairs.
INFO: Maximum number of duals for any weir node is 2.
maximum co-nodes for any node = 21
edge count = 5385262
Grid Partition Data
METIS 4.0 will require approximately 604899540 bytes
Total Edges Cut = 1708037
INFO: Writing mesh partition to partmesh.txt.
INFO: METIS has partitioned nodes successfully.
memory currently allocated = 1441352512 bytes
memory high water mark = 1535062304 bytes
INFO: Processing --np
INFO: Processing --prepall
File fort.14
WAS FOUND! Opening & Processing file
File fort.15
WAS FOUND! Opening & Processing file
Elevation Station Locations contained in fort.15
Velocity Station Locations Contained in fort.15
forrtl: severe (64): input conversion error, unit 15, file
/scratch2/COASTAL/coastal/save/shared/saeed/adcircpy/outputs/example_3_250m/coldstart/fort.15
Image PC Routine Line Source
adcprep 00000000005EFCBE Unknown Unknown Unknown
adcprep 0000000000625130 Unknown Unknown Unknown
adcprep 00000000004070E0 presizes_mp_sizeu 1263 presizes.F
adcprep 0000000000428ECF prepinput_ 436 adcprep.F
adcprep 000000000042798B MAIN__ 239 adcprep.F
adcprep 000000000040355E Unknown Unknown Unknown
libc-2.17.so 00002B4EA7A1A555 __libc_start_main Unknown Unknown
adcprep 0000000000403469 Unknown Unknown Unknown
------------------------------
Start Epilog v20.08.28 on node h16c53 for job 17766649 :: Thu Apr 8
20:20:08 UTC 2021
Job 17766649 (not serial) finished for user Saeed.Moghimi in partition
hera with exit code 64:0
------------------------------
End Epilogue v20.08.28 Thu Apr 8 20:20:08 UTC 2021
|
I am using consistent adcprep and padcirc now. I am not sure if this is related to netcdf parallel. @JaimeCalzadaNOAA
#!/bin/bash --login
#SBATCH -D .
#SBATCH -J example_3_250m.py
#SBATCH -A coastal
#SBATCH --mail-type=all
#SBATCH [email protected]
#SBATCH --output=example_3_250m.log
#SBATCH -n 1000
#SBATCH --time=08:00:00
# #SBATCH --partition=
ulimit -s unlimited
set -e
source /scratch2/COASTAL/coastal/save/shared/saeed/ADC-WW3-NWM-NEMS/modulefiles/envmodules_intel.hera
PATH=/scratch2/COASTAL/coastal/save/shared/saeed/ADC-WW3-NWM-NEMS/ADCIRC/work/:$PATH
main() {
  SECONDS=0
  run_coldstart_phase
  if grep -Rq "ERROR: Elevation.gt.ErrorElev, ADCIRC stopping." example_3_250m.log; then
    duration=$SECONDS
    echo "ERROR: Elevation.gt.ErrorElev, ADCIRC stopping."
    echo "Wallclock time: $(($duration / 60)) minutes and $(($duration % 60)) seconds."
    exit -1
  else
    run_hotstart_phase
    duration=$SECONDS
    if grep -Rq "ERROR: Elevation.gt.ErrorElev, ADCIRC stopping." example_3_250m.log; then
      echo "ERROR: Elevation.gt.ErrorElev, ADCIRC stopping."
      echo "Wallclock time: $(($duration / 60)) minutes and $(($duration % 60)) seconds."
      exit -1
    fi
  fi
  echo "Wallclock time: $(($duration / 60)) minutes and $(($duration % 60)) seconds."
}

run_coldstart_phase() {
  rm -rf coldstart
  mkdir coldstart
  cd coldstart
  ln -sf ../fort.14
  ln -sf ../fort.13
  ln -sf ../fort.15.coldstart ./fort.15
  adcprep --np $SLURM_NTASKS --partmesh
  adcprep --np $SLURM_NTASKS --prepall
  srun padcirc
  clean_directory
  cd ..
}

run_hotstart_phase() {
  rm -rf hotstart
  mkdir hotstart
  cd hotstart
  ln -sf ../fort.14
  ln -sf ../fort.13
  ln -sf ../fort.15.hotstart ./fort.15
  ln -sf ../coldstart/fort.67.nc
  adcprep --np $SLURM_NTASKS --partmesh
  adcprep --np $SLURM_NTASKS --prepall
  srun padcirc
  clean_directory
  cd ..
}

clean_directory() {
  rm -rf PE*
  rm -rf partmesh.txt
  rm -rf metis_graph.txt
  rm -rf fort.13
  rm -rf fort.14
  rm -rf fort.15
  rm -rf fort.16
  rm -rf fort.80
  rm -rf fort.68.nc
}

main
#! /usr/bin/env python
"""
This example recreates the Shinnecock Inlet test case with some added
improvements in order to demonstrate some of the capabilities of AdcircPy.
In contrast to example_1, this example generates input files that are separated
by a coldstart and hotstart phase.
The behaviour of this program is similar to that of example_1.
"""
from datetime import datetime, timedelta
import pathlib
import tarfile
import tempfile
import urllib.request
from adcircpy import AdcircMesh, AdcircRun, Tides
from adcircpy.server import SlurmConfig
from adcircpy.forcing.winds import BestTrackForcing
PARENT = pathlib.Path(__file__).parent.absolute()
FORT14 = "/scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0/fort.14"
FORT13 = "/scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0/fort.13"
def main():
    # open mesh file
    mesh = AdcircMesh.open(FORT14, crs=4326)
    # init tidal forcing and setup requests
    tidal_forcing = Tides(tidal_source='TPXO', resource='/scratch2/COASTAL/coastal/save/shared/models/forcings/tides/h_tpxo9.v1.nc')
    tidal_forcing.use_all()
    mesh.add_forcing(tidal_forcing)
    # Add wind forcing to model
    # wind_forcing = BestTrackForcing('Sandy2012')
    # mesh.add_forcing(wind_forcing)
    # import fort.13
    mesh.import_nodal_attributes(FORT13)
    # activate fort.13
    for name in mesh.get_nodal_attribute_names():
        mesh.set_nodal_attribute_state(name, True, True)
    # instantiate AdcircRun object.
    slurm = SlurmConfig(
        account='coastal',
        ntasks=1000,
        run_name='example_3_250m.py',
        partition='',
        walltime=timedelta(hours=8),
        mail_type='all',
        mail_user='[email protected]',
        log_filename='example_3_250m.log',
        modules=['intel/2020', 'impi/2020', 'netcdf/4.7.2-parallel'],
        path_prefix='/scratch2/COASTAL/coastal/save/shared/repositories/adcirc-cg/work/',
    )
    now = datetime.utcnow()
    driver = AdcircRun(
        mesh,
        start_date=now,
        end_date=now + timedelta(days=10),
        spinup_time=timedelta(days=15),
        server_config=slurm,
    )
    # Tweak V55 default
    # William suggested: TODO check from f15 again
    # tweak parameters to get IM = 511112 and A0,B0,C0 = 0.0,1.0,0.0 for the explicit scheme
    # driver.gwce_solution_scheme = 'explicit'
    # legacy (sergey)
    # tweak parameters to get IM = 511111 and A0,B0,C0 = 0.35,0.3,0.35 for the legacy semi-implicit scheme
    driver.gwce_solution_scheme = 'semi-implicit-legacy'
    # Write driver state to file.
    driver.write("outputs/example_3_250m_v2", overwrite=True)

if __name__ == '__main__':
    main()
|
@saeed-moghimi-noaa cc. @zacharyburnettNOAA There are three problems with your setup:
To verify please resolve these in your client file and submit your job file through the debug queue with |
@JaimeCalzadaNOAA @saeed-moghimi-noaa you can use the following commands:
ADCIRC-only configuration with coupledmodeldriver:
initialize_adcirc \
--output-directory run_adcirconly_hsofs_250m \
--mesh-directory /scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0 \
--modeled-start-time 20121022T060000 \
--modeled-duration 04:05:00:00 \
--modeled-timestep 00:00:02 \
--tidal-spinup-duration 12:06:00:00 \
--platform HERA \
--adcirc-processors 600 \
--adcirc-executable /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/ADCIRC/work/padcirc \
--adcprep-executable /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/ADCIRC/work/adcprep \
--modulefile /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/modulefiles/envmodules_intel.hera \
--job-duration 06:00:00 \
--forcings tidal \
--tidal-source TPXO \
--tidal-path /scratch2/COASTAL/coastal/save/shared/models/forcings/tides/h_tpxo9.v1.nc
generate_adcirc run_adcirconly_hsofs_250m
NEMS + ADCIRC configuration with coupledmodeldriver:
initialize_adcirc \
--output-directory run_nemsadcirc_hsofs_250m \
--mesh-directory /scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0 \
--modeled-start-time 20121022T060000 \
--modeled-duration 04:05:00:00 \
--modeled-timestep 00:00:02 \
--tidal-spinup-duration 12:06:00:00 \
--platform HERA \
--adcirc-processors 598 \
--adcirc-executable /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/NEMS/exe/NEMS.x \
--adcprep-executable /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/ADCIRC/work/adcprep \
--modulefile /scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/modulefiles/envmodules_intel.hera \
--job-duration 06:00:00 \
--forcings tidal,atmesh,ww3data \
--tidal-source TPXO \
--tidal-path /scratch2/COASTAL/coastal/save/shared/models/forcings/tides/h_tpxo9.v1.nc \
--atmesh-path /scratch2/COASTAL/coastal/save/shared/models/forcings/hsofs/sandy/Wind_HWRF_SANDY_Nov2018_ExtendedSmoothT.nc \
--ww3data-path /scratch2/COASTAL/coastal/save/shared/models/forcings/hsofs/sandy/ww3.HWRF.NOV2018.2012_sxy.nc \
--nems-interval 01:00:00
generate_adcirc run_nemsadcirc_hsofs_250m |
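The duration arguments above use a `DD:HH:MM:SS` layout (e.g. `--modeled-duration 04:05:00:00` for 4 days 5 hours) while `--job-duration` uses `HH:MM:SS`. A hypothetical helper for parsing both forms might look like this (illustrative only, not coupledmodeldriver's own parser):

```python
from datetime import timedelta


def parse_duration(text: str) -> timedelta:
    """Parse 'DD:HH:MM:SS' or 'HH:MM:SS' duration strings into a timedelta."""
    fields = [int(part) for part in text.split(':')]
    if len(fields) == 4:
        days, hours, minutes, seconds = fields
    elif len(fields) == 3:
        days = 0
        hours, minutes, seconds = fields
    else:
        raise ValueError(f'unrecognized duration string: {text!r}')
    return timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)


print(parse_duration('04:05:00:00'))  # 4 days, 5:00:00
```

Parsed this way, the `--modeled-duration 04:05:00:00` above matches the `datetime.timedelta(days=4, hours=5)` mentioned in a later comment.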
I've updated the modeled start and end times to align with the ATMESH and WW3DATA forcings:
datetime.timedelta(days=4, hours=5)
datetime.timedelta(days=10, hours=17)
here are the most recent model runs:
/scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_120m
/scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_250m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_120m
/scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_250m |
currently running jobs: ➜ squeue -u Zachary.Burnett -o "%.8i %3C %4D %97Z %15j" --sort i
JOBID CPU NODE WORK_DIR NAME
17908675 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_250m/runs/run_1 ADC_HOT_RUN
17908681 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_250m/runs/run_1 ADC_HOT_RUN
17909270 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_120m/coldstart ADC_COLD_RUN
17909271 1 1 /scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_120m/runs/run_1 ADC_MESH_DECOMP
17909272 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/adcirc/run_20210412_hsofs_120m/runs/run_1 ADC_HOT_RUN
17909274 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_120m/coldstart ADC_COLD_RUN
17909275 1 1 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_120m/runs/run_1 ADC_MESH_DECOMP
17909276 600 15 /scratch2/COASTAL/coastal/save/shared/working/zach/nems_adcirc/run_20210412_hsofs_120m/runs/run_1 ADC_HOT_RUN |
@zacharyburnettNOAA |
Fixed, I gave |
ADCIRC only
looks like the hotstart runs of both the ADCIRC-only 120m and 250m runs exceeded the 6 hour job time limit and were cancelled, as shown in
I will run them again with 12 hour time limits. Also, there are a bunch of empty
NEMS + ADCIRC
however, the NEMS + ADCIRC coldstart run failed after a while with the following error in
|
Please compare with the cases here:
Perhaps use identical information to this run so we can compare:
Take the station list from here:
/scratch2/COASTAL/coastal/noscrub/shared/Saeed.Moghimi/shared_with_zach/flo/a10_FLO_OCN_SPINUP_v1.0/rt_20210104_h19_m07_s33r618/scr/fort.15.template.tide_spinup
spin up
|
added besttrack support in 6cf7f3b |
@saeed-moghimi-noaa in the directory
here are the sizes of the output NetCDFs:
|
running the command
(CoupledModelDriver) C:\Repositories\CoupledModelDriver > plot_fort61 run_20210416_hsofs_250m_v1.0_besttrack\coldstart\fort.61.nc MSL
returns the following error:
going to
https://api.tidesandcurrents.noaa.gov/api/prod/datagetter?station=8459479&begin_date=20121010+01%3A00&end_date=20121016+22%3A00&product=water_level&datum=MSL&units=metric&time_zone=gmt&format=json&application=noaa%2Fnos%2Fcsdl%2Fadcircpy
returns the error JSON:
{"error": {"message":" Wrong Datum: No valid datum value for MSL ***station=8459479"}}
which datum should I use for these stations? The choices in adcircpy are
EDIT: none of the choices worked, I'll take a look at what's wrong on Monday |
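For reference, the failing request can be rebuilt programmatically to try other datums. This helper is illustrative: it mirrors the query parameters visible in the URL above, and is not a documented CO-OPS client API.

```python
from urllib.parse import urlencode


def coops_water_level_url(station, begin_date, end_date, datum='MSL'):
    """Build a CO-OPS datagetter water-level query like the one in the error above."""
    base = 'https://api.tidesandcurrents.noaa.gov/api/prod/datagetter'
    params = {
        'station': station,
        'begin_date': begin_date,
        'end_date': end_date,
        'product': 'water_level',
        'datum': datum,
        'units': 'metric',
        'time_zone': 'gmt',
        'format': 'json',
        'application': 'noaa/nos/csdl/adcircpy',
    }
    return f'{base}?{urlencode(params)}'


# try a different datum for the same station and time window
print(coops_water_level_url('8459479', '20121010 01:00', '20121016 22:00', datum='NAVD'))
```

Looping this builder over candidate datums and checking each response for the `"error"` key would show whether any datum is supported for a given station.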
Some stations might simply not have data. Call the CLI using the |
250m mesh
in the hotstart run (
|
@zacharyburnettNOAA That error is related to the hot-start file. I think this usually occurs when there is no data in fort.67.nc or fort.68.nc. It means either no data was written to the hot-start file during the coldstart run, or it was deleted and a new one was made by adcprep before the hotstart.
good catch, it looks like the
print({variable_name: variable.shape for variable_name, variable in fort67_dataset.variables.items()})
{
'time': (0,),
'x': (1813443,),
'y': (1813443,),
'element': (3564104, 3),
'adcirc_mesh': (1,),
'neta': (),
'nvdll': (1,),
'max_nvdll': (),
'ibtypee': (1,),
'nbdv': (186,),
'nvel': (),
'nvell': (186,),
'max_nvell': (),
'ibtype': (186,),
'nbvv': (55274,),
'depth': (1813443,),
'zeta1': (0, 1813443),
'zeta2': (0, 1813443),
'zetad': (0, 1813443),
'u-vel': (0, 1813443),
'v-vel': (0, 1813443),
'nodecode': (0, 1813443),
'noff': (0, 3564104),
'imhs': (),
'iths': (),
'iestp': (),
'nscoue': (),
'ivstp': (),
'nscouv': (),
'ipstp': (),
'iwstp': (),
'nscoum': (),
'igep': (),
'nscouge': (),
'igvp': (),
'nscougv': (),
'igpp': (),
'igwp': (),
'nscougw': (),
}
I will look into the start and end times for output again. |
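The printout above shows every record variable with a leading dimension of 0, i.e. zero time steps were written to the hot-start file. A quick illustrative check for that symptom, operating on a shape dictionary like the one printed above (subset of shapes copied from the printout):

```python
# subset of the variable shapes printed above for fort.67.nc
shapes = {
    'time': (0,),
    'zeta1': (0, 1813443),
    'zeta2': (0, 1813443),
    'u-vel': (0, 1813443),
    'v-vel': (0, 1813443),
}


def hotstart_is_empty(variable_shapes):
    """True if the unlimited 'time' dimension has zero records, i.e. no hot-start data."""
    return variable_shapes.get('time', (0,))[0] == 0


print(hotstart_is_empty(shapes))  # True
```

Running a check like this right after the coldstart finishes would catch the empty fort.67.nc before the hotstart job is even submitted.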
here are the last few lines of
here are the
|
yes the NHSINC = 5929200 needs to be correct. Does that line up with the timestep and when you want to output the hot start? Hotstart output interval in seconds = NHSINC*TIMESTEP |
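Checking that relation against the fort.15 values quoted later in this thread (NHSINC = 529200, DTDP = 2.0 s):

```python
# hot-start output interval in seconds = NHSINC * TIMESTEP
nhsinc = 529200     # NHSINC from the quoted fort.15
timestep = 2.0      # DTDP in seconds from the quoted fort.15
interval_seconds = nhsinc * timestep
print(interval_seconds / 86400)  # 12.25 days
```

That works out to 12.25 days, exactly the RNDAY of the spinup in the quoted fort.15, so the hot-start record is written once, at the very end of the coldstart run.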
Hi William @WPringle, the 250 m case blows up in the coldstart part. ERR
fort.15
EDIT: shortened constituent lines to make it easier to scroll through |
oh right, yeah that's just a typical blowup. I highly recommend using the Smagorinsky model by changing ESL from 10 to -0.2. You need to use IM = 511112 instead of 111112 when you do this. Also, ICS=22 is better than ICS=2: technically ICS=2 is simply incorrect, but it doesn't make much difference for this domain. |
Thanks William. -Saeed
…______________________________________________________
Saeed Moghimi, PhD
UCAR/NOAA - NOS Storm Surge Modeling
<https://nauticalcharts.noaa.gov/learn/storm-surge-modeling.html> Team Lead
Coastal Marine Modeling Branch, Coast Survey Development Laboratory, Office
of Coast Survey at NOAA National Ocean Service.
Address: 1315 East West Hwy, Room 6607, Silver Spring, Maryland 20910
Phone: (240) 847-8230
The contents of this message are mine personally and do not necessarily
reflect any position of NOAA.
On Mon, Apr 19, 2021 at 3:44 PM William Pringle ***@***.***> wrote:
oh right. yeah that's just a typical blowup.
I highly recommend using smagorinsky model by changing ESL from 10 to
-0.2. You need to use IM = 511112, instead of 111112 when you do this. Also
ICS=22 is better than ICS=2: technically ICS=2 is simply incorrect, but not
that much difference for this domain.
I've been running the 250m mesh with 4.6514 sec time step using these
settings and never seen a blow up.
|
@WPringle thanks, those suggestions really helped. The configuration I made from your suggestions put out a
For reference, I used the following JSON configuration:
{
"adcirc_executable_path": "/scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/ADCIRC/work/padcirc",
"adcprep_executable_path": "/scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/ADCIRC/work/adcprep",
"modeled_start_time": "2012-10-22 06:00:00",
"modeled_end_time": "2012-10-26 11:00:00",
"modeled_timestep": 2.0,
"fort_13_path": "/scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0/fort.13",
"fort_14_path": "/scratch2/COASTAL/coastal/save/shared/models/meshes/hsofs/250m/v1.0/fort.14",
"tidal_spinup_duration": 1058400.0,
"tidal_spinup_timestep": 2.0,
"gwce_solution_scheme": "explicit",
"use_smagorinsky": true,
"horizontal_eddy_viscosity": null,
"source_filename": "/scratch2/COASTAL/coastal/save/shared/repositories/ADC-WW3-NWM-NEMS/modulefiles/envmodules_intel.hera",
"use_original_mesh": false,
"output_surface": true,
"surface_output_interval": 3600.0,
"output_stations": false,
"stations_file_path": null,
"stations_output_interval": 360.0,
"output_spinup": true,
"output_elevations": true,
"output_velocities": true,
"output_concentrations": false,
"output_meteorological_factors": false,
"processors": 600,
"nems_parameters": {}
}
to produce the following fort.15:
created on 2021-04-20 15:58 ! RUNDES - 32 CHARACTER ALPHANUMERIC RUN DESCRIPTION
NOMAD mesh v1e MSL ! RUNID - 24 CHARACTER ALPANUMERIC RUN IDENTIFICATION
1 ! NFOVER - NONFATAL ERROR OVERRIDE OPTION
1 ! NABOUT - ABREVIATED OUTPUT OPTION PARAMETER
100 ! NSCREEN - UNIT 6 OUTPUT OPTION PARAMETER
0 ! IHOT - HOT START PARAMETER
2 ! ICS - COORDINATE SYSTEM SELECTION PARAMETER
511112 ! IM - MODEL SELECTION PARAMETER
1 ! NOLIBF - BOTTOM FRICTION TERM SELECTION PARAM; before NWP==1, '2' was used
2 ! NOLIFA - FINITE AMPLITUDE TERM SELECTION PARAMETER
1 ! NOLICA - SPATIAL DERIVATIVE CONVECTIVE SELECTION PARAMETER
1 ! NOLICAT - TIME DERIVATIVE CONVECTIVE TERM SELECTION PARAMETER
5 ! NWP - VARIABLE BOTTOM FRICTION AND LATERAL VISCOSITY OPTION PARAMETER; default 0
mannings_n_at_sea_floor
primitive_weighting_in_continuity_equation
surface_canopy_coefficient
surface_directional_effective_roughness_length
surface_submergence_state
1 ! NCOR - VARIABLE CORIOLIS IN SPACE OPTION PARAMETER
1 ! NTIP - TIDAL POTENTIAL OPTION PARAMETER
0 ! NWS - WIND STRESS AND BAROMETRIC PRESSURE OPTION PARAMETER
1 ! NRAMP - RAMP FUNCTION OPTION
9.81 ! G - ACCELERATION DUE TO GRAVITY - DETERMINES UNITS
-3 ! TAU0 - WEIGHTING FACTOR IN GWCE; original, 0.005
2.000000 ! DTDP - TIME STEP (IN SECONDS)
0 ! STATIM - STARTING TIME (IN DAYS)
0 ! REFTIM - REFERENCE TIME (IN DAYS)
12.25 ! RNDAY - TOTAL LENGTH OF SIMULATION (IN DAYS)
12.25 ! DRAMP - DURATION OF RAMP FUNCTION (IN DAYS)
0 1 0 ! A00 B00 C00 - TIME WEIGHTING FACTORS FOR THE GWCE EQUATION
0.01 0 0 0.01 ! H0 NODEDRYMIN NODEWETRMP VELMIN
-80.9048 30.2847 ! SLAM0 SFEA0 - CENTER OF CPP PROJECTION (NOT USED IF ICS=1, NTIP=0, NCOR=0)
0.0025 ! FFACTOR
-0.2 ! smagorinsky coefficient - LATERAL EDDY VISCOSITY COEFFICIENT; IGNORED IF NWP =1
0 ! CORI - CORIOLIS PARAMETER - IGNORED IF NCOR = 1
8 ! NTIF - NUMBER OF TIDAL POTENTIAL CONSTITUENTS BEING FORCED starting 2008082300
...
...
...
110 ! ANGINN - INNER ANGLE THRESHOLD
0 0 0 0 ! NOUTE TOUTSE TOUTFE NSPOOLE - ELEV STATION OUTPUT INFO (UNIT 61)
0 ! NSTAE - TOTAL NUMBER OF ELEVATION RECORDING STATIONS
0 0 0 0 ! NOUTV TOUTSV TOUTFV NSPOOLV - VELOCITY STATION OUTPUT INFO (UNIT 62)
0 ! NSTAV - TOTAL NUMBER OF VELOCITY RECORDING STATIONS
-5 0.000000 12.250000 1800 ! NOUTGE TOUTSGE TOUTFGE NSPOOLGE - GLOBAL ELEVATION OUTPUT INFO (UNIT 63)
-5 0.000000 0.000000 1800 ! NOUTGV TOUTSGV TOUTFGV NSPOOLGV - GLOBAL VELOCITY OUTPUT INFO (UNIT 64)
0 ! NFREQ
0 0 0 0 ! THAS THAF NHAINC FMV - HARMONIC ANALYSIS PARAMETERS
0 0 0 0 ! NHASE NHASV NHAGE NHAGV - CONTROL HARMONIC ANALYSIS AND OUTPUT TO UNITS 51,52,53,54
5 529200 ! NHSTAR NHSINC - HOT START FILE GENERATION PARAMETERS
1 0 1E-08 25 ! ITITER ISLDIA CONVCR ITMAX - ALGEBRAIC SOLUTION PARAMETERS
! NCPROJ - PROJECT TITLE
! NCINST - PROJECT INSTITUTION
! NCSOUR - PROJECT SOURCE
! NCHIST - PROJECT HISTORY
! NCREF - PROJECT REFERENCES
! NCCOM - PROJECT COMMENTS
! NCHOST - PROJECT HOST
! NCONV - CONVENTIONS
! NCCONT - CONTACT INFORMATION
2012-10-10 00:00 ! NCDATE - forcing start date
The |
basic configurations ran successfully with 250m and 120m in ADCIRC-only; focusing now on using coupled NEMS #86
@JaimeCalzadaNOAA @zacharyburnettNOAA
I suggest getting both the 120 m and 250 m meshes running with ADCIRCPY, perhaps with best track, and then switching back to NEMS. For this purpose, and to make sure we isolate things, I suggest using the original f14 and f13.
Any refactoring of the mesh object can then happen after the basic run goes through.