Implement support for moving mountain gravity wave scheme in MPAS dycore #1297

Open
wants to merge 6 commits into base: cam_development
11 changes: 5 additions & 6 deletions bld/build-namelist
Original file line number Diff line number Diff line change
@@ -3852,10 +3852,10 @@ if (!$simple_phys) {

if ($phys =~ /cam7/) {
#
-# moving mountains only supported by SE dycore
+# moving mountains only supported by MPAS and SE dycore
# (since vorticity needs to be passed to physics in dp_coupling)
#
-if ( $dyn =~ /se/ ) {
+if ( $dyn =~ /mpas|se/ ) {
add_default($nl, 'use_gw_movmtn_pbl', 'val'=>'.true.');
} else {
add_default($nl, 'use_gw_movmtn_pbl', 'val'=>'.false.');
@@ -3864,10 +3864,9 @@ if (!$simple_phys) {

my $use_gw_movmtn_pbl = $nl->get_value('use_gw_movmtn_pbl');
if ($use_gw_movmtn_pbl =~ /$TRUE/io) {
-if ( ! ($dyn =~ /se/) ) {
-die "$ProgName - ERROR: use_gw_movmtn_pbl is only available with the SE dycore \n";
-
-}
+if ( ! ($dyn =~ /mpas|se/) ) {
+die "$ProgName - ERROR: use_gw_movmtn_pbl is only available with MPAS and SE dycore\n";
+}
}

add_default($nl, 'use_gw_rdg_gamma' , 'val'=>'.false.');
106 changes: 106 additions & 0 deletions doc/ChangeLog
@@ -1,5 +1,111 @@
===============================================================

Tag name: TBD
Originator(s): kuanchihwang
Date: TBD
One-line Summary: Implement support for moving mountain gravity wave scheme in MPAS dycore
Github PR URL: https://github.com/ESCOMP/CAM/pull/1297

Purpose of changes (include the issue number and title text for each relevant GitHub issue):

This PR implements support for the moving mountain gravity wave scheme in the MPAS dycore.

The `use_gw_movmtn_pbl` namelist option now defaults to `.true.` when the MPAS dycore and CAM7
physics are both selected.

The moving mountain gravity wave scheme needs relative vorticities at cell points as input.
However, because MPAS uses a staggered C-grid for spatial discretization, where wind vectors are
located at edge points, it calculates relative vorticities at vertex points instead. As a result,
this PR introduces new functionality in the MPAS subdriver to regrid vertex values to cell values.
The regridding functionality is also generalized so that it works with any variable located at
vertex points.
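The regridding is an area-weighted average: each surrounding vertex contributes its value weighted by the area of the kite connecting that vertex to the cell, normalized by the total cell area. A minimal Python sketch of the same formula (array names mirror the MPAS mesh fields `verticesOnCell`, `kiteForCell`, `kiteAreasOnVertex`, and `areaCell`, but this is an illustration with made-up data, not the CAM implementation):

```python
import numpy as np

def vertex_to_cell(cell_index, cell_level, n_vertices_on_cell, vertices_on_cell,
                   kite_for_cell, kite_areas_on_vertex, area_cell, vertex_value):
    """Area-weighted average of vertex values onto one cell point.

    Each vertex surrounding the cell contributes its kite area (the
    overlap between the cell and that vertex's dual cell), and the sum
    is normalized by the total cell area.
    """
    total = 0.0
    for i in range(n_vertices_on_cell[cell_index]):
        j = kite_for_cell[i, cell_index]       # which kite of the vertex touches this cell
        v = vertices_on_cell[i, cell_index]    # index of the surrounding vertex
        total += kite_areas_on_vertex[j, v] * vertex_value[cell_level, v]
    return total / area_cell[cell_index]
```

With a constant field and kite areas that sum to the cell area, the average returns the constant, which is a quick sanity check on the weighting.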

Subsequently, relative vorticities are passed to the physics buffer during dynamics-physics
coupling, after which the moving mountain gravity wave scheme can query and use them as input.
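Schematically, the coupling copy is a two-stage reindexing: the unchunked dycore array indexed (level, cell) is first gathered into a chunked physics array indexed (column, level, chunk), and each chunk slice is then copied into the physics buffer field. A Python sketch of the first stage (the chunk layout here is an illustrative assumption about shapes, not the actual CAM API):

```python
import numpy as np

def chunk_vorticity(vort4gw, columns_of_chunk, pcols, pver):
    """Gather an unchunked (pver, ncols) dycore array into the chunked
    (pcols, pver, nchunks) physics layout.

    MPAS orders vertical levels bottom-up while CAM physics counts them
    top-down, so the level index is flipped during the copy.
    """
    nchunks = len(columns_of_chunk)
    vort4gw_phys = np.zeros((pcols, pver, nchunks))
    for lchnk, cols in enumerate(columns_of_chunk):
        for icol_p, i in enumerate(cols):  # physics column -> dycore cell index
            for k in range(pver):
                kk = pver - 1 - k          # flip vertical orientation
                vort4gw_phys[icol_p, k, lchnk] = vort4gw[kk, i]
    return vort4gw_phys
```

The second stage is then a plain per-chunk loop over columns and levels into the buffer field obtained from the physics buffer.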

Closes #1253 (Support for new moving mountains gravity wave trigger with MPAS dycore)
Closes #1277 (MPAS dycore support for moving mountains parameterization)

Describe any changes made to build system:

None

Describe any changes made to the namelist:

The `use_gw_movmtn_pbl` namelist option now defaults to `.true.` when the MPAS dycore and CAM7
physics are both selected.
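The default can still be overridden explicitly. For example, to turn the scheme off under CAM7 with the MPAS dycore, a user could place this in `user_nl_cam` (a hypothetical user-side override, not a change made by this PR):

```
use_gw_movmtn_pbl = .false.
```

The build-namelist check continues to reject `.true.` for any dycore other than MPAS or SE.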

List any changes to the defaults for the boundary datasets:

None

Describe any substantial timing or memory changes:

None

Code reviewed by:

TBD

List all files eliminated:

None

List all files added and what they do:

None

List all existing files that have been modified, and describe the changes:

M bld/build-namelist
* Allow the moving mountain gravity wave scheme to be selected with MPAS dycore
M src/dynamics/mpas/dp_coupling.F90
* Pass relative vorticities to physics buffer during dynamics-physics coupling
M src/dynamics/mpas/driver/cam_mpas_subdriver.F90
* Implement the computation of relative vorticities at cell points
M src/dynamics/mpas/dyn_comp.F90
* Pass relative vorticities to physics buffer during dynamics-physics coupling

If there were any failures reported from running test_driver.sh on any test
platform, and checkin with these failures has been OK'd by the gatekeeper,
then copy the lines from the td.*.status files for the failed tests to the
appropriate machine below. All failed tests must be justified.

derecho/intel/aux_cam:

ERP_Ln9.f09_f09_mg17.FCSD_HCO.derecho_intel.cam-outfrq9s (Overall: FAIL) details:
FAIL ERP_Ln9.f09_f09_mg17.FCSD_HCO.derecho_intel.cam-outfrq9s COMPARE_base_rest
FAIL ERP_Ln9.f09_f09_mg17.FCSD_HCO.derecho_intel.cam-outfrq9s BASELINE
SMS_Lh12.f09_f09_mg17.FCSD_HCO.derecho_intel.cam-outfrq3h (Overall: DIFF) details:
FAIL SMS_Lh12.f09_f09_mg17.FCSD_HCO.derecho_intel.cam-outfrq3h BASELINE
* Pre-existing failures due to HEMCO not having reproducible results (issues #1018 and #856)
SMS_D_Ln9_P1280x1.ne0CONUSne30x8_ne0CONUSne30x8_mt12.FCHIST.derecho_intel.cam-outfrq9s (Overall: FAIL) details:
FAIL SMS_D_Ln9_P1280x1.ne0CONUSne30x8_ne0CONUSne30x8_mt12.FCHIST.derecho_intel.cam-outfrq9s SETUP
* Pre-existing failures due to build-namelist error requiring CLM/CTSM external update

derecho/nvhpc/aux_cam:

All pass

izumi/nag/aux_cam:

All pass

izumi/gnu/aux_cam:

All pass

CAM tag used for the baseline comparison tests if different than previous
tag:

cam6_4_088

Summarize any changes to answers:

None

===============================================================

Tag name: cam6_4_088
Originator(s): jimmielin
Date: 22 April 2025
Expand Down
53 changes: 47 additions & 6 deletions src/dynamics/mpas/dp_coupling.F90
@@ -38,15 +38,15 @@ module dp_coupling
!=========================================================================================

subroutine d_p_coupling(phys_state, phys_tend, pbuf2d, dyn_out)
-use cam_mpas_subdriver, only: cam_mpas_update_halo
+use cam_mpas_subdriver, only: cam_mpas_update_halo, cam_mpas_vertex_to_cell_relative_vorticities

! Convert the dynamics output state into the physics input state.
! Note that all pressures and tracer mixing ratios coming from the dycore are based on
! dry air mass.
use cam_history, only: hist_fld_active
-use dyn_comp, only: frontgf_idx, frontga_idx
+use dyn_comp, only: frontgf_idx, frontga_idx, vort4gw_idx
use mpas_constants, only: Rv_over_Rd => rvord
-use phys_control, only: use_gw_front, use_gw_front_igw
+use phys_control, only: use_gw_front, use_gw_front_igw, use_gw_movmtn_pbl
use cam_budget, only : thermo_budget_history

! arguments
@@ -101,6 +101,12 @@ subroutine d_p_coupling(phys_state, phys_tend, pbuf2d, dyn_out)
real(r8), allocatable :: frontgf_phys(:,:,:)
real(r8), allocatable :: frontga_phys(:,:,:)

! Temporary arrays to hold vorticity for the "moving mountain" gravity wave scheme.
real(r8), allocatable :: vort4gw(:, :) ! Data are unchunked.
real(r8), allocatable :: vort4gw_phys(:, :, :) ! Data are chunked.
! Pointer to vorticity in physics buffer for the "moving mountain" gravity wave scheme.
real(r8), pointer :: pbuf_vort4gw(:, :)

type(physics_buffer_desc), pointer :: pbuf_chnk(:)

integer :: lchnk, icol, icol_p, k, kk ! indices over chunks, columns, physics columns and layers
@@ -116,6 +122,11 @@ subroutine d_p_coupling(phys_state, phys_tend, pbuf2d, dyn_out)
character(len=*), parameter :: subname = 'd_p_coupling'
!----------------------------------------------------------------------------

nullify(pbuf_chnk)
nullify(pbuf_frontgf)
nullify(pbuf_frontga)
nullify(pbuf_vort4gw)

compute_energy_diags=thermo_budget_history

nCellsSolve = dyn_out % nCellsSolve
@@ -155,9 +166,6 @@

if (use_gw_front .or. use_gw_front_igw) then
call cam_mpas_update_halo('scalars', endrun) ! scalars is the name of tracers in the MPAS state pool
-nullify(pbuf_chnk)
-nullify(pbuf_frontgf)
-nullify(pbuf_frontga)
!
! compute frontogenesis function and angle for gravity wave scheme
!
@@ -195,6 +203,16 @@

end if

if (use_gw_movmtn_pbl) then
call cam_mpas_vertex_to_cell_relative_vorticities(vort4gw)

allocate(vort4gw_phys(pcols, pver, begchunk:endchunk), stat=ierr)

if (ierr /= 0) then
call endrun(subname // ': Failed to allocate vort4gw_phys')
end if
end if

call t_startf('dpcopy')

ncols = columns_on_task
@@ -225,6 +243,10 @@
frontgf_phys(icol_p, k, lchnk) = frontogenesisFunction(kk, i)
frontga_phys(icol_p, k, lchnk) = frontogenesisAngle(kk, i)
end if

if (use_gw_movmtn_pbl) then
vort4gw_phys(icol_p, k, lchnk) = vort4gw(kk, i)
end if
end do

do k = 1, pverp
@@ -261,6 +283,25 @@
deallocate(frontogenesisAngle)
end if

if (use_gw_movmtn_pbl) then
!$omp parallel do private (lchnk, ncols, icol, k, pbuf_chnk, pbuf_vort4gw)
do lchnk = begchunk, endchunk
ncols = get_ncols_p(lchnk)
pbuf_chnk => pbuf_get_chunk(pbuf2d, lchnk)

call pbuf_get_field(pbuf_chnk, vort4gw_idx, pbuf_vort4gw)

do k = 1, pver
do icol = 1, ncols
pbuf_vort4gw(icol, k) = vort4gw_phys(icol, k, lchnk)
end do
end do
end do

deallocate(vort4gw)
deallocate(vort4gw_phys)
end if

call t_stopf('dpcopy')

call t_startf('derived_phys')
111 changes: 111 additions & 0 deletions src/dynamics/mpas/driver/cam_mpas_subdriver.F90
@@ -31,6 +31,7 @@ module cam_mpas_subdriver
cam_mpas_compute_unit_vectors, &
cam_mpas_update_halo, &
cam_mpas_cell_to_edge_winds, &
cam_mpas_vertex_to_cell_relative_vorticities, &
cam_mpas_run, &
cam_mpas_finalize, &
cam_mpas_debug_stream, &
@@ -2213,6 +2214,116 @@ subroutine cam_mpas_cell_to_edge_winds(nEdges, uZonal, uMerid, east, north, edgesOnCell)

end subroutine cam_mpas_cell_to_edge_winds

!-------------------------------------------------------------------------------
! subroutine cam_mpas_vertex_to_cell_relative_vorticities
!
!> \brief Compute the relative vorticities at cell points.
!> \author Kuan-Chih Wang
!> \date 2025-04-12
!> \details
!> MPAS uses a staggered C-grid for spatial discretization, where relative
!> vorticities are located at vertex points because wind vectors are located at
!> edge points. However, physics schemes that use relative vorticities as input
!> usually want them at cell points instead.
!> This subroutine computes the relative vorticity at each cell point from its
!> surrounding vertex points and returns the results.
!
!-------------------------------------------------------------------------------
subroutine cam_mpas_vertex_to_cell_relative_vorticities(cell_relative_vorticity)
use mpas_derived_types, only: mpas_pool_type
use mpas_kind_types, only: rkind
use mpas_pool_routines, only: mpas_pool_get_subpool, mpas_pool_get_dimension, mpas_pool_get_array

real(rkind), allocatable, intent(out) :: cell_relative_vorticity(:, :)

character(*), parameter :: subname = 'cam_mpas_subdriver::cam_mpas_vertex_to_cell_relative_vorticities'
integer :: i, k
integer :: ierr
integer, pointer :: ncellssolve, nvertlevels
integer, pointer :: kiteforcell(:, :), nedgesoncell(:), verticesoncell(:, :)
real(rkind), pointer :: areacell(:), kiteareasonvertex(:, :), vorticity(:, :)
type(mpas_pool_type), pointer :: mpas_pool_diag, mpas_pool_mesh

nullify(ncellssolve, nvertlevels)
nullify(kiteforcell, nedgesoncell, verticesoncell)
nullify(areacell, kiteareasonvertex, vorticity)
nullify(mpas_pool_diag, mpas_pool_mesh)

call mpas_pool_get_subpool(domain_ptr % blocklist % allstructs, 'diag', mpas_pool_diag)
call mpas_pool_get_subpool(domain_ptr % blocklist % allstructs, 'mesh', mpas_pool_mesh)

! Input.
call mpas_pool_get_dimension(mpas_pool_mesh, 'nCellsSolve', ncellssolve)
call mpas_pool_get_dimension(mpas_pool_mesh, 'nVertLevels', nvertlevels)

call mpas_pool_get_array(mpas_pool_mesh, 'kiteForCell', kiteforcell)
call mpas_pool_get_array(mpas_pool_mesh, 'nEdgesOnCell', nedgesoncell)
call mpas_pool_get_array(mpas_pool_mesh, 'verticesOnCell', verticesoncell)

call mpas_pool_get_array(mpas_pool_mesh, 'areaCell', areacell)
call mpas_pool_get_array(mpas_pool_mesh, 'kiteAreasOnVertex', kiteareasonvertex)
call mpas_pool_get_array(mpas_pool_diag, 'vorticity', vorticity)

! Output.
allocate(cell_relative_vorticity(nvertlevels, ncellssolve), stat=ierr)

if (ierr /= 0) then
call endrun(subname // ': Failed to allocate cell_relative_vorticity')
end if

do i = 1, ncellssolve
do k = 1, nvertlevels
cell_relative_vorticity(k, i) = regrid_from_vertex_to_cell(i, k, &
nedgesoncell, verticesoncell, kiteforcell, kiteareasonvertex, areacell, &
vorticity)
end do
end do

nullify(ncellssolve, nvertlevels)
nullify(kiteforcell, nedgesoncell, verticesoncell)
nullify(areacell, kiteareasonvertex, vorticity)
nullify(mpas_pool_diag, mpas_pool_mesh)
end subroutine cam_mpas_vertex_to_cell_relative_vorticities

!-------------------------------------------------------------------------------
! function regrid_from_vertex_to_cell
!
!> \brief Regrid values from vertex points to the specified cell point.
!> \author Kuan-Chih Wang
!> \date 2025-04-12
!> \details
!> This function computes the area-weighted average (i.e., `cell_value`) at the
!> specified cell point (i.e., `cell_index` and `cell_level`) from the values
!> at its surrounding vertex points (i.e., `vertex_value`).
!> The formulation used here is adapted and generalized from the
!> `atm_compute_solve_diagnostics` subroutine in MPAS.
!
!-------------------------------------------------------------------------------
pure function regrid_from_vertex_to_cell(cell_index, cell_level, &
nverticesoncell, verticesoncell, kiteforcell, kiteareasonvertex, areacell, &
vertex_value) result(cell_value)
use mpas_kind_types, only: rkind

integer, intent(in) :: cell_index, cell_level
integer, intent(in) :: nverticesoncell(:), verticesoncell(:, :), kiteforcell(:, :)
real(rkind), intent(in) :: kiteareasonvertex(:, :), areacell(:)
real(rkind), intent(in) :: vertex_value(:, :)
real(rkind) :: cell_value

integer :: i, j, vertex_index

cell_value = 0.0_rkind

do i = 1, nverticesoncell(cell_index)
j = kiteforcell(i, cell_index)
vertex_index = verticesoncell(i, cell_index)

cell_value = cell_value + &
kiteareasonvertex(j, vertex_index) * vertex_value(cell_level, vertex_index)
end do

cell_value = cell_value / areacell(cell_index)
end function regrid_from_vertex_to_cell

!-----------------------------------------------------------------------
! routine cam_mpas_run