
Precipitation rates for NoahMP are likely being calculated in the wrong place #319

Closed

grantfirl opened this issue Sep 13, 2019 · 17 comments

@grantfirl (Collaborator)

NoahMP expects precipitation rates as input, but these are currently computed from Diag% variables immediately before the scheme is called. This is problematic because the Diag% variables are periodically zeroed out before the physics is called, so the precipitation rates NoahMP sees will periodically be zero. For the initial CCPP transition, this calculation was done in a sfc_noahmp_pre interstitial routine to achieve bit-for-bit agreement with the non-CCPP version. These rates should instead be calculated during the previous time step in GFS_MP_generic_post (as is done for the RUC LSM) and stored in a persistent variable rather than in the Diag% DDT.

@climbfuji (Collaborator)

@junwang-noaa @DusanJovic-NOAA @HelinWei-NOAA - are you aware of this? Can you double check? Making the proposed changes will change the answer. Thanks.

@HelinWei-NOAA commented Nov 18, 2019 via email:

> Hi Dom, I am not aware of that issue. Can you give me more detail? Thanks. Helin

@HelinWei-NOAA commented Nov 18, 2019 via email

@climbfuji (Collaborator)

@grantfirl reported this; I defer to him for further details. But I know that for the RUC LSM we added additional variables to preserve the precipitation rates, since the RUC LSM also needs this information.

@grantfirl (Collaborator, Author) commented Nov 18, 2019

Hi Helin,

I came across this potential bug while transitioning NoahMP into the CCPP. Here is my understanding of the issue. In GFS_physics_driver.F90, search for the rainn_mp variable, for example. When NoahMP is used, this variable (and the corresponding variables for the other precipitation species) is calculated from variables in the "Diag" GFS derived data type (DDT). The variables in this DDT accumulate diagnostics for periodic time-averaging and are zeroed out after the time-averaged values are computed, before the physics is called. I have not actually run any tests to show that this happens, but I suspect that it causes NoahMP to be given zero precipitation rates immediately after the diagnostics have been written out and zeroed.

To get around this, rather than calculating precipitation rates for NoahMP from variables in the Diag DDT just before the call to NoahMP, I think it would work to follow what was done for the RUC LSM, as Dom mentioned. For that scheme, the surface precipitation values are saved immediately after microphysics is called into one of the DDTs that does not get zeroed or reinitialized every time step (they chose the Tbd DDT, but the decision is somewhat arbitrary). In the CCPP, the file GFS_MP_generic.F90 contains the interstitial scheme that gets called after any microphysics scheme, and there is a code stanza around lines 184-192 where this is done. It saves liquid-water-equivalent thicknesses, whereas NoahMP needs rates, but one could execute those same lines when lsm==lsm_noahmp and then, in the sfc_noahmp_pre_run scheme, convert the thicknesses into rates for use in NoahMP.

-Grant
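To illustrate the proposal, here is a minimal sketch of the two pieces. The previous-timestep variable names (raincprv, rainncprv, etc.) follow the RUC LSM convention quoted later in this thread; the exact argument lists, units, and CCPP metadata would need to match the actual schemes, so treat this as a sketch rather than the implementation:

```fortran
! Sketch 1: at the end of GFS_MP_generic_post, save this step's
! liquid-water-equivalent (LWE) thicknesses [m] into persistent
! variables that survive the periodic zeroing of the Diag DDT.
if (lsm == lsm_noahmp) then
  raincprv  (:) = rainc(:)            ! convective precipitation
  rainncprv (:) = rain(:) - rainc(:)  ! explicit (grid-scale) precipitation
  iceprv    (:) = ice(:)
  snowprv   (:) = snow(:)
  graupelprv(:) = graupel(:)
end if

! Sketch 2: in sfc_noahmp_pre_run on the following time step, convert
! thickness [m] accumulated over one physics time step dtp [s] into
! rates [mm s-1] for NoahMP (hence the assumed factor of 1000).
rainn_mp  (:) = 1000.0 * rainncprv (:) / dtp
rainc_mp  (:) = 1000.0 * raincprv  (:) / dtp
ice_mp    (:) = 1000.0 * iceprv    (:) / dtp
snow_mp   (:) = 1000.0 * snowprv   (:) / dtp
graupel_mp(:) = 1000.0 * graupelprv(:) / dtp
```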

@HelinWei-NOAA commented Nov 18, 2019 via email:

> Hi Grant, thanks for finding this potential bug. If this is the case, it will also be a problem for Noah, because we also use some of those variables in the "Diag" GFS derived data type (DDT) to calculate srflag to pass into the Noah LSM. I am going to take a further look and run some tests. Helin

@HelinWei-NOAA commented Nov 19, 2019 via email:

> Hi Grant, I prefer to add these four variables to the Sfcprop DDT. That way both NoahMP and RUC can share them. What do you think? There should be no impact on Noah, because srflag is calculated after MP in the same subroutine, unlike the NoahMP driver, which is called before MP. Helin

@climbfuji (Collaborator)

Hi Helin, I think this is just fine (from my side; I don't want to speak for Grant). Should we make those changes in our dtc/develop branches and bring them to the master code with our next merge? Good to see that NoahMP and RUC can share those variables. Thanks for investigating the issue!

@HelinWei-NOAA commented Nov 19, 2019 via email

@grantfirl (Collaborator, Author)

Hi Helin,

I think that this is fine too. Just to confirm: are you planning to make these code changes and test them? If not, let Dom and me know, and we'll make the changes as described and submit a pull request for you to approve.

If NoahMP was in fact periodically seeing zero precipitation rates, the modified code will definitely not pass the existing NoahMP regression test and will require a new baseline.

-Grant

@HelinWei-NOAA commented Nov 19, 2019 via email

@climbfuji (Collaborator)

Helin, it doesn't matter where the variables are stored; both Sfcprop and Tbd can be written to restart files. Currently we have this for RUC in GFS_restart.F90:

    if (Model%lsm == Model%lsm_ruc) then
      num = num + 1
      Restart%name2d(num) = 'ruc_2d_raincprv'
      do nb = 1,nblks
        Restart%data(nb,num)%var2p => Tbd(nb)%raincprv(:)
      enddo
      num = num + 1
      Restart%name2d(num) = 'ruc_2d_rainncprv'
      do nb = 1,nblks
        Restart%data(nb,num)%var2p => Tbd(nb)%rainncprv(:)
      enddo
      num = num + 1
      Restart%name2d(num) = 'ruc_2d_iceprv'
      do nb = 1,nblks
        Restart%data(nb,num)%var2p => Tbd(nb)%iceprv(:)
      enddo
      num = num + 1
      Restart%name2d(num) = 'ruc_2d_snowprv'
      do nb = 1,nblks
        Restart%data(nb,num)%var2p => Tbd(nb)%snowprv(:)
      enddo
      num = num + 1
      Restart%name2d(num) = 'ruc_2d_graupelprv'
      do nb = 1,nblks
        Restart%data(nb,num)%var2p => Tbd(nb)%graupelprv(:)
      enddo
    endif

Regarding your question about coldstarting them: for RUC, I believe the first time step of a coldstart run will just contain zeros for all those fields. I would like to bring @tanyasmirnova into the discussion, since Tanya is the expert on RUC and RUC in FV3.
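If the NoahMP variables end up in Sfcprop as proposed, the restart registration could presumably mirror the RUC block above. This is a hypothetical sketch (the 'noahmp_2d_' name prefix and the Sfcprop member names are assumptions, not existing code):

```fortran
! Hypothetical analogue of the RUC restart registration, assuming the
! previous-timestep precipitation variables live in Sfcprop for NoahMP.
if (Model%lsm == Model%lsm_noahmp) then
  num = num + 1
  Restart%name2d(num) = 'noahmp_2d_raincprv'
  do nb = 1,nblks
    Restart%data(nb,num)%var2p => Sfcprop(nb)%raincprv(:)
  enddo
  ! ...repeat for rainncprv, iceprv, snowprv, and graupelprv
endif
```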

@climbfuji (Collaborator)

Where are we at with this? Who is supposed to take action and make the changes? Seems to be a rather important fix to me. Thanks.

@HelinWei-NOAA commented Nov 20, 2019 via email

@grantfirl (Collaborator, Author)

@HelinWei-NOAA
The following pull requests address this issue:
NCAR/fv3atm#14
#365

The precipitation rates needed by NoahMP are now calculated after the microphysics call in preparation for the following timestep of NoahMP. They have also been put in the Sfcprop DDT.

I do not have any tests that show the difference, but from code inspection, when the precipitation rates were calculated prior to the NoahMP call using Diag%rain, rainc, ice, snow, and graupel, it definitely looks to me like they would end up at zero after the diagnostics have been written out. In GFS_driver.F90, there are periodic calls to Diag%phys_zero that are executed before the call to physics. This subroutine is defined in GFS_typedefs.F90 as diag_phys_zero, and all five of these variables are set to zero inside it. The same functionality is mimicked in the CCPP via the "time_vary" interstitial schemes.
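Schematically, the reset described above amounts to the following (a paraphrase of the behavior of diag_phys_zero, not its exact interface or full contents):

```fortran
! Schematic paraphrase: when the diagnostics averaging window closes,
! the accumulated precipitation fields are reset, so any scheme reading
! them immediately afterwards sees zeros.
Diag%rain    = 0.0
Diag%rainc   = 0.0
Diag%ice     = 0.0
Diag%snow    = 0.0
Diag%graupel = 0.0
```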

Please have a look at the pull requests when you get a chance to see if you agree with the changes. Even if there is no or minimal difference, I don't think that these changes "do any harm" from a coding point of view, and would at least provide consistency with how the RUC LSM operates.

@HelinWei-NOAA commented Dec 6, 2019 via email

@climbfuji (Collaborator)

Fixed in #365 and PRs referenced there. 10-day forecasts with the old and new version will be made to check the differences.

DusanJovic-NOAA pushed a commit to NOAA-EMC/fv3atm that referenced this issue Jan 10, 2020
* add new suite definition files for GFSv15p2, GFSv16beta
* update existing suite definition files because of the interstitial code rearrangement (NCAR/ccpp-physics#372)
* clean up CCPP compiler flags, add -Wall for GNU in DEBUG mode
* add satmedmfvifq, shalcnv, sascnvn, Ferrier-Aligo microphysics and dependencies to CCPP prebuild config
* bug fixes for ugwp and noahmp
* move previous-timestep precipitation variables from Tbd to Sfcprop (as recommended by @HelinWei-NOAA, see NCAR/ccpp-physics#319)
* fix compiler warnings about non-existent include directories
* cleanup of old comments in GFS_physics_driver.F90 (see NCAR#4, NCAR#16, and NCAR/ccpp-physics#319)
* CCPP annotations in GFS_driver.F90, GFS_radiation_driver.F90, GFS_physics_driver.F90 (comments that describe where code blocks ended up in the CCPP interstitial code)
* capability to coldstart FV3 with GSD physics from RAP/HRRR initial conditions (required for FV3 SAR)
* new suite definition file for coupled model
hannahcbarnes pushed a commit to hannahcbarnes/ccpp-physics that referenced this issue Aug 3, 2022
…al scaling to RRTMGP flux adjustment" (NCAR#319), bug fix in several suite definition files (NCAR#331)

Adds a set of dedicated 3d arrays for extended diagnostic output from Thompson MP, and a logical flag that controls allocating, outputting, and resetting these arrays. This work is based on @ericaligo-NOAA's code changes.

The flag to reset the extended diagnostics from Thompson MP is - for the UFS - currently tied to avg_max_length, which is used to reset maximum hourly fields (the corresponding reset variable is renamed to make its purpose clear).

Include the changes in NCAR#319, "Add optional scaling to RRTMGP flux adjustment".

Important: Fixed bugs in several suite definition files that were missing the call to GFS_radiation_surface:

* suite_FV3_GFS_v15_thompson_mynn_RRTMGP.xml
* suite_FV3_GFS_v16_coupled_noahmp.xml
* suite_FV3_GFS_v16_coupled_nsstNoahmp.xml
* suite_FV3_RRFS_v1alpha.xml

Co-authored-by: Dustin Swales <[email protected]>
Co-authored-by: Eric Aligo <[email protected]>
Co-authored-by: Dom Heinzeller <[email protected]>