
ufs_utils's People

Contributors

aerorahul, barlage, bensonr, boivuong-noaa, claradraper-noaa, davidhuber-noaa, deniseworthen, dmwright526, edwardhartnett, edwardsafford-noaa, georgegayno-noaa, gsketefian, jeffbeck-noaa, jilidong-noaa, johnschattel, jswhit2, junwang-noaa, katefriedman-noaa, kgerheiser, larissareames-noaa, lgannoaa, mark-a-potts, pjpegion, russtreadon-noaa, samueltrahannoaa, smoorthi-emc, walterkolczynski-noaa, weihuang-jedi, xuli-noaa, yangfanglin


ufs_utils's Issues

Add lake maker to orography data

Add the lake maker from Shan Sun and Ning Wang: generate lake fraction and depth on the FV3 grid, add them to the oro_data files, and adjust land_frac and slmsk so that they are consistent with lake_frac.
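A minimal sketch of one possible consistency adjustment (the exact rules in Shan Sun and Ning Wang's lake maker may differ; the names and the 0.5 threshold here are illustrative):

  ! Ensure land and lake fractions are mutually consistent, and make the
  ! slmsk land/water flag reflect the dominant surface type.
  do j = 1, ny
  do i = 1, nx
    land_frac(i,j) = min(land_frac(i,j), 1.0 - lake_frac(i,j))
    if (land_frac(i,j) >= 0.5) then
      slmsk(i,j) = 1.0          ! predominantly land
    else
      slmsk(i,j) = 0.0          ! predominantly water (ocean or lake)
    endif
  enddo
  enddo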

CHGRES error to process GFS data

The CHGRES bundled with NCEPLIBS (github.com:NOAA-EMC/UFS_UTILS.git, @8b8db58, Nov 11 19:08:52 2019) triggers an error when it tries to process GFS data.

 - CALL FieldScatter FOR INPUT GRID TEMPERATURE.
 - FATAL ERROR: READING TEMPERATURE RECORD.
 - IOSTAT IS:          -31
MPT ERROR: Rank 0(g:0) received signal SIGSEGV(11).
        Process ID: 43767, Host: r12i0n32, Program: /glade/scratch/turuncu/ufs-mrweather-app-workflow/bld/chgres_cube.exe
        MPT Version: HPE MPT 2.19  02/23/19 05:30:09

and the error trace is

MPT:     rc=<error reading variable: Cannot access memory at address 0x0>,
MPT:     .tmp.STRING.len_V$7=14336928)
MPT:     at /glade/work/turuncu/UFS/NCEP_LIBS_ALL/UFS_UTILS/sorc/chgres_cube.fd/utils.f90:11
MPT: #8  0x00000000004590fa in input_data::read_input_atm_gaussian_file (localpet=0)
MPT:     at /glade/work/turuncu/UFS/NCEP_LIBS_ALL/UFS_UTILS/sorc/chgres_cube.fd/input_data.F90:1254
MPT: #9  0x0000000000435a5f in input_data::read_input_atm_data (localpet=0)
MPT:     at /glade/work/turuncu/UFS/NCEP_LIBS_ALL/UFS_UTILS/sorc/chgres_cube.fd/input_data.F90:147
MPT: #10 0x000000000041092a in atmosphere::atmosphere_driver (localpet=0)
MPT:     at /glade/work/turuncu/UFS/NCEP_LIBS_ALL/UFS_UTILS/sorc/chgres_cube.fd/atmosphere.F90:145
MPT: #11 0x0000000000435839 in chgres ()
MPT:     at /glade/work/turuncu/UFS/NCEP_LIBS_ALL/UFS_UTILS/sorc/chgres_cube.fd/chgres.F90:78

The same data and namelist can be processed with the external CHGRES installation used in the prototype system. That version's last commit is from George Gayno on Fri Sep 6 10:29:16 2019 (hash @947145c).

I also tried installing the NCEP LIBS with Intel 18.0.5 and the system-provided NetCDF (4.7.1), but that did not help.

release/public-v1: Fix legacy Fortran extension

/home/dusan/simple-ufs/src/preproc/sorc/chgres_cube.fd/write_data.F90:1167:31:

 1167 |    WRITE(OUTFILE, '(A, I1, A)'), 'out.atm.tile', tile, '.nc'
      |                               1
Warning: Legacy Extension: Comma before i/o item list at (1)
/home/dusan/simple-ufs/src/preproc/sorc/chgres_cube.fd/write_data.F90:1637:33:

 1637 |      WRITE(OUTFILE, '(A, I1, A)'), 'out.sfc.tile', tile, '.nc'
      |                                 1
Warning: Legacy Extension: Comma before i/o item list at (1)
[100%] Linking Fortran executable chgres_cube.exe
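The warning flags a nonstandard comma between the format and the I/O item list; removing the comma makes the statements standard-conforming, e.g.:

  WRITE(OUTFILE, '(A, I1, A)') 'out.atm.tile', tile, '.nc'
  WRITE(OUTFILE, '(A, I1, A)') 'out.sfc.tile', tile, '.nc'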

Create regression tests for CHGRES_CUBE

Create regression tests for chgres_cube. At a minimum, the tests should...

  • Run on Jet, Theia, Cray and Dell.
  • Test several grid configurations (i.e., global uniform, regional, etc).
  • Test all possible input data sources (gfs sigio, gfs nemsio, fv3 nemsio, fv3 tiled history and fv3 tiled restart).
  • Test different task/thread counts.

This project was started under Vlab issues 65761 and 65495.

CHGRES_CUBE - sea ice range check

Dusan used chgres_cube to initialize a C96 global run using FV3GFS gaussian nemsio data as input (2019111600 cycle). The model failed on the first time step during a computation of surface stress. In his words:

I recompiled the model with debug flags and found that the model crashes in 
the very first physics step, in moninedmf.f (subroutine hedmf_run), at this line:

 411       do i=1,im
 412          ustar(i) = sqrt(stress(i))
 413       enddo
 
At one point stress(i) is a negative number. It is computed in GFS_surface_composites.F90 here:

472    stress(i) = cice(i)*stress_ice(i) + (one-cice(i))*stress_ocn(i)    !<--- cice(i) must not be > 1.0

At that point, the sea ice fraction was slightly more than one (1.0000004). When he reset that point to exactly one, the model ran with no issues.

Determine why this is happening in chgres and add checks to prevent it.
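A minimal sketch of such a check, applied to the interpolated ice concentration before it is written (the array name cice is illustrative):

  ! Clamp the interpolated sea ice fraction to its physical range [0,1] so
  ! downstream computations such as the stress composite stay non-negative.
  where (cice > 1.0) cice = 1.0
  where (cice < 0.0) cice = 0.0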

Update chgres_cube to process Thompson ice- and water-friendly aerosols

Update chgres_cube to read in the monthly aerosol climatology, interpolate it in time to the run time, and then interpolate it horizontally and vertically to the model grid.

The climo file is QNWFA_QNIFA_SIGMA_MONTHLY.dat.nc and located on Hera:
/scratch1/BMC/gsd-fv3-dev/Judy.K.Henderson/test/gw_ccpp_v16b/sorc/aeroconv.fd/INPUT
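A sketch of the time-interpolation step, assuming each monthly record is valid at mid-month and that the two records bracketing the run time have already been read from the climatology file (all names are illustrative):

  ! Linear interpolation in time between the two bounding monthly records.
  ! wgt is the fractional position of the run time between the two mid-month times.
  wgt = (run_time - time_m1) / (time_m2 - time_m1)
  qnwfa_climo = (1.0 - wgt) * qnwfa_m1 + wgt * qnwfa_m2
  qnifa_climo = (1.0 - wgt) * qnifa_m1 + wgt * qnifa_m2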

release/public-v1: syntax errors when compiled using debug flags (Intel)

When I compile chgres_cube using debug compile flags (on Hera using Intel compiler), I get the following syntax errors:

/scratch2/NCEPDEV/fv3-cam/Dusan.Jovic/sufs/simple-ufs-aero/src/preproc/sorc/chgres_cube.fd/program_setup.f90(280): error #6631: A non-optional actual argument must be present when invoking a procedure with an explicit interface.   [RC]
     call error_handler("FOR GRIB2 DATA, PLEASE PROVIDE GRIB2_FILE_INPUT_GRID")  
----------^                                                                      
compilation aborted for /scratch2/NCEPDEV/fv3-cam/Dusan.Jovic/sufs/simple-ufs-aero/src/preproc/sorc/chgres_cube.fd/program_setup.f90 (code 1)

and

/scratch2/NCEPDEV/fv3-cam/Dusan.Jovic/sufs/simple-ufs-aero/src/preproc/sorc/chgres_cube.fd/input_data.F90(5532): error #6631: A non-optional actual argument must be present when invoking a procedure with an explicit interface.   [RC]
    call error_handler("ERROR USING MISSING_VAR_METHOD. PLEASE SET VALUES IN" // &
---------^
/scratch2/NCEPDEV/fv3-cam/Dusan.Jovic/sufs/simple-ufs-aero/src/preproc/sorc/chgres_cube.fd/input_data.F90(5564): error #6631: A non-optional actual argument must be present when invoking a procedure with an explicit interface.   [RC]
    call error_handler("reading soil levels. File must have 4 soil levels.")
---------^
compilation aborted for /scratch2/NCEPDEV/fv3-cam/Dusan.Jovic/sufs/simple-ufs-aero/src/preproc/sorc/chgres_cube.fd/input_data.F90 (code 1)
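The compiler is objecting that these calls omit error_handler's non-optional RC argument. Assuming the interface is error_handler(message, rc), one fix is to pass an explicit status code (the value shown is illustrative); alternatively, rc could be made optional in the interface:

  ! Pass an explicit (illustrative) status code as the second argument.
  call error_handler("FOR GRIB2 DATA, PLEASE PROVIDE GRIB2_FILE_INPUT_GRID", 1)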

Bad snow values from CHGRES_CUBE

Julie Schramm reported a problem with the snow field for a regional grid. The odd snow values were occurring at open water points. I was able to reproduce the problem using the head of 'develop' (4643e2c). I also noticed bad ice depth values at land points.

I ran a quick test that zeroes out these surface fields after the call to FieldCreate. That appears to have fixed it. To be safe, I will initialize all surface fields after FieldCreate to a proper value.
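A minimal sketch of that initialization for one such field (the ESMF calls are standard; the field and grid names are illustrative):

  real(esmf_kind_r8), pointer :: ptr(:,:)

  snow_depth_target_grid = ESMF_FieldCreate(target_grid, &
                             typekind=ESMF_TYPEKIND_R8, &
                             staggerloc=ESMF_STAGGERLOC_CENTER, rc=rc)

  ! Initialize to a safe value immediately after FieldCreate so points not
  ! touched by the regridding do not contain garbage.
  call ESMF_FieldGet(snow_depth_target_grid, farrayPtr=ptr, rc=rc)
  ptr = 0.0_esmf_kind_r8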

Create regression tests

Regression tests currently exist for chgres_cube. Tests are needed for other repository components. In particular: global_cycle, the grid generation programs, and the emcsfc programs.

Import Jim Purser's regional grid capability

This issue/task is to import the regional_grid capability into the UFS_UTILS repository.

The regional_grid capability is Jim Purser's method to create a much more uniform (in terms of cell size) regional grid for the FV3 standalone regional (SAR) configuration.

This work was started under Vlab issues 66330 and 65495.

release/public-v1: chgres_cube.fd: remove duplicate target_include_directories

 ${NETCDF_INCLUDES}                                                             
 ${MPI_Fortran_INCLUDE_PATH}                                                    
 ${NETCDF_INCLUDES_F90}

are specified twice as include directories:

Here: https://github.com/NOAA-EMC/UFS_UTILS/blob/8934efa6474d95ff0dbf5764f26d3389d85e7bf5/sorc/chgres_cube.fd/CMakeLists.txt#L20 and here
https://github.com/NOAA-EMC/UFS_UTILS/blob/8934efa6474d95ff0dbf5764f26d3389d85e7bf5/sorc/chgres_cube.fd/CMakeLists.txt#L33

I think these two target_include_directories calls can be merged.

release/public-v1: setting CPU specific flags is not portable

In sorc/chgres_cube.fd/CMakeLists.txt an AVX2-specific flag is set.

https://github.com/NOAA-EMC/UFS_UTILS/blob/512fdd483742e75b6da1430484278e9e0b8ba2d0/sorc/chgres_cube.fd/CMakeLists.txt#L8

This is not portable. For example on Hera I see a lot of warnings like:

../preproc/sorc/chgres_cube.fd/model_grid.F90(141): remark #15009: model_grid_mp_define_input_grid_gaussian_ has been targeted for automatic cpu dispatch
../preproc/sorc/chgres_cube.fd/model_grid.F90(582): remark #15009: model_grid_mp_define_input_grid_gfs_grib2_ has been targeted for automatic cpu dispatch
../preproc/sorc/chgres_cube.fd/model_grid.F90(370): remark #15009: model_grid_mp_define_input_grid_mosaic_ has been targeted for automatic cpu dispatch

We should not use CPU-specific flags unless we are cross-compiling, for example on WCOSS Cray or Jet, and in those specific cases the flags should be added conditionally based on a user-specified option.

GEFS V12 Support

Support GEFS V12.

Per a request from the GEFS group, create a 'release' branch from 1b76994 of 'develop'.

CHGRES_CUBE and vertical velocity

Russ had problems running the model when using tiled, warm restart data as input to chgres. He noticed large vertical velocities in the log file. He wrote:

 The C384L127 forecast from the chgres'd C768L127 files failed.   
 I noticed the following in the forecast job log file:

 W max =    3842.734      min =   -2243.542
 Before adi: W max =    3842.734      min =   -2243.542

Because chgres works with both the old spectral model (which output omega) and FV3 (which outputs 'w'), logic was added to the model to convert from omega to 'w' depending on the source of the chgres'd data. The source is identified via a global attribute. The conversion is done in routine external_ic.F90. The problem is that this logic thinks data from warm restart files is 'omega', not 'w'. After discussing with Fanglin and Jun, it was decided to have chgres zero out 'w' for tiled input data instead of updating external_ic.F90. According to Fanglin:

Changing resolution from high-res tiles to low_res tiles may produce 
strong w that the low-res model can’t sustain. I think it is better to zero it out. 
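A sketch of the agreed fix on the chgres side (assuming the target-grid vertical velocity is held in an array such as w_target_grid and that the input type is known; all names are illustrative):

  ! For tiled (warm restart) input, write zero vertical velocity rather than
  ! interpolating it, so the model spins up its own 'w'.
  if (trim(input_type) == "restart") then
    w_target_grid = 0.0
  endif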

release/public-v1: chgres_cube.fd: make install should not install .mod files

When I run make install I get the following files in the install directory:

my_install_dir/
├── bin
│   └── chgres_cube.exe
└── include_4
    └── include_4
        ├── atmosphere.mod
        ├── grib2_util.mod
        ├── input_data.mod
        ├── model_grid.mod
        ├── program_setup.mod
        ├── search_util.mod
        ├── static_data.mod
        └── surface.mod

3 directories, 9 files

There is no need for an include_4 directory nested under another include_4.

In fact, none of those .mod files are needed in the install tree. They are internal to chgres_cube, and chgres_cube is not a library.

I suggest that the following lines be removed from sorc/chgres_cube.fd/CMakeLists.txt:

Inconsistency in branch names

UFS_UTILS is the only submodule of the NCEPLIBS umbrella build that uses the branch name release/ufs_release_1.0 for the UFS public release branch. All others just use ufs_release_1.0. Should this be changed, i.e., should a branch ufs_release_1.0 be created from release/ufs_release_1.0 and the latter then deleted? Note that in this case the .gitmodules file in the NCEPLIBS umbrella repository would need to be updated (because it contains the branch name).

See also NOAA-EMC/NCEPLIBS#12.

Update UFS-UTILS documentation to include all utilities needed for SRW app release

Each of these utilities is envisioned to be part of UFS_UTILS prior to the SRW App release (the list below still needs to be firmed up). Given that, each utility should have a chapter in the UFS-UTILS documentation hosted on Read the Docs.

make_hgrid
orog
regional_grid
global_equiv_resol
shave
filter_topo
chgres_cube: documentation already exists; simply update it as necessary for the SAR

chgres metadata in output

chgres should include metadata in its output to identify the source of its input; this metadata should be sufficient to reproduce the contents of the file.
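For the netCDF output this could be done with global attributes written while the file is still in define mode; a sketch using the netcdf-fortran API (the attribute names, and variables such as git_hash, are illustrative):

  use netcdf
  integer :: error, ncid

  ! ncid refers to the gfs_data / sfc_data file just created, still in define mode.
  error = nf90_put_att(ncid, NF90_GLOBAL, 'source', 'chgres_cube')
  error = nf90_put_att(ncid, NF90_GLOBAL, 'input_type', trim(input_type))
  error = nf90_put_att(ncid, NF90_GLOBAL, 'input_file', trim(atm_files_input_grid(1)))
  error = nf90_put_att(ncid, NF90_GLOBAL, 'source_commit', trim(git_hash))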

A few sea ice related issues in the NSST analysis

In the current NSST foundation temperature (Tf) analysis, a few sea ice related issues need to be resolved:

  1. Tf analysis for the water-ice mixed grids (ice concentration > 15%)

  2. The Tf evolution in prediction mode due to two SST climatology updates: (1) Seasonal tendency; (2) Relaxation to climatology

In this ticket,

  1. The Tf analysis for grids with sea ice will be done in the stand-alone global_cycle step through a salinity-dependent formula (see the reference formula below).
  2. The assigned value (currently the freezing point, 271.2 K) will be revisited, including the use of the SST climatology to help.
  3. Apply the two SST climatology updates to Tf for grids with sea ice.
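For reference, one widely used salinity-dependent freezing-point relation (Millero, 1978, at atmospheric pressure; whether this exact form will be adopted is an assumption here):

  ! Freezing point of sea water (deg C) as a function of salinity s (psu).
  tf_freeze = -0.0575 * s + 1.710523e-3 * s**1.5 - 2.154996e-4 * s * s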

Updating FNMSKH to high-res land-sea mask for GFS.v16

Set FNMSKH=${FNMSKH:-${FIXam}/global_slmask.t1534.3072.1536.grb}
instead of
FNMSKH=${FNMSKH:-${FIXam}/seaice_newland.grb}
to use the high-resolution land-sea mask in global_chgres.sh and global_cycle.sh for the interpolation of climatological SST, etc.

This is needed to reduce lake temperature biases for GFS.v16 development.

Possible memory leak in filter_topo program

I am seeing odd behavior when running in regional mode. My comment from a previous issue (#91) follows.

Discovered when running the grid_gen regression test from develop (270f9dc). It happens with the 'regional' regression test. If I fix the rank of phis in routine FV3_zs_filter (it should be 3, not 4) and print out the value of phis(1,1), I get this difference in file C96_oro_data.tile7.halo4.nc when compared to the regression test baseline:

Variable  Group Count     Sum  AbsSum      Min     Max   Range     Mean  StdDev
orog_filt /       396 103.033 318.697 -25.4385 26.2191 51.6576 0.260184 2.97099

If I add another print statement for phis(1,2), I get this difference:

Variable  Group Count       Sum AbsSum      Min     Max   Range        Mean   StdDev
orog_filt /       370 -0.375389 68.313 -2.69452 3.75537 6.44989 -0.00101457 0.484862

So there is likely some kind of memory leak going on with the 'regional' option.

Originally posted by @GeorgeGayno-NOAA in #91 (comment)

Update "nst_tf_chg" program for GFS v15 and v16 data

The nst_tf_chg program does not work with GFS v15 nemsio files. When I try running it with the 'fv3gfs_chgres.sh' script, I get errors:

ERROR: the size of recname is not equal to the total number of the fields in the file!
readrecv for gfilei for zm  Status =          -42
writerecv for gfileo for zm  Status =          -73
readrecv for gfilei for ifd  Status =          -42
writerecv for gfileo for ifd  Status =          -73

The program will also need to work with GFS v16 gaussian netcdf files.

nemsio utilities belong in nemsio library

The NEMSIO utilities such as nemsio_read, nemsio_get, nemsio_chgdate, and mkgfsnemsioctl belong with NCEPLIBS-nemsio, since a change to the library API will almost certainly require an update to these utilities.

Associated ctests should be added to ensure these utilities and the library are in agreement.

HREFv3 support

Matt Pyle requested a release branch to support HREFv3.

HREF uses chgres_cube.

Remove nemsio_cvt utility from repository

The nemsio_cvt utility may be used to query the file 'endianness' of a nemsio file, and to change the 'endianness' of a nemsio file. The latter function is not working. After talking with Jun, we decided it is not worth fixing if no one is using it.

The global-workflow, regional-workflow, and UPP groups have verified that they do not use this utility. I can do a search in our operational log files. If it is not used, then it should be removed from the repository.

CHGRES_CUBE problem at poles

In August, Jeff B. found that chgres_cube can fail at model points near the poles when using RAP data as input. He fixed it by changing the pole method from ESMF_POLEMETHOD_NONE to ESMF_POLEMETHOD_ALLAVG in the call to ESMF_FieldRegridStore (in routine surface.F90). Jim A. is having the same problem when using chgres_cube to initialize a C768 nested run using gaussian nemsio input data.

This problem likely also exists in the sfc_climo_gen program.
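For reference, a sketch of the change in the ESMF_FieldRegridStore call (the surrounding argument names are illustrative):

  call ESMF_FieldRegridStore(input_field, target_field, &
                             regridmethod=ESMF_REGRIDMETHOD_BILINEAR, &
                             polemethod=ESMF_POLEMETHOD_ALLAVG, &  ! was ESMF_POLEMETHOD_NONE
                             routehandle=regrid_handle, rc=rc)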

Add CMake build to UFS_UTILS

v1 of the public release used CMake to build chgres_cube. Use CMake to build all programs in the repository. Remove the old build system.

Global parallel initialization using FV3 data

Users are asking how to cold start a full global parallel (80 EnKF members and the high-res member) using FV3GFS GDAS data. The only GDAS data available on HPSS are tiled restart files. Therefore, the fv3gfs_chgres.sh script, which runs the old serial chgres, will not work. We need to come up with a replacement for that script that uses chgres_cube. The script could also be designed for users who only want to run a free forecast.

NDATE not defined in gdas init utility scripts

During a recent update to 'develop' (3ad7d83), the load of 'prod_util' was inadvertently removed. As a result, NDATE is undefined in the GDAS initialization utility scripts. NDATE is used when processing GFS v15 data.

Port UFS_UTILS to Orion

Port the repo to the new Orion machine.

Also, remove all references to WCOSS-Phase 1/2, which is being turned off by the end of April.

Missing values in the coordinate lon lat variables in the regional orog file

There are some strange missing values (see the 0.9969E+37 values below) in the two netcdf coordinate variables (lon and lat) in the orography files for the regional FV3SAR and HAFS configurations, e.g., C768_oro_data.tile7.halo0.nc. The geolon and geolat variables, however, look normal.

I am not sure whether these missing coordinate lon/lat values matter. But it might be better to fill them with reasonable values or just use npx, npy as coordinate variables (see the sketch after the file dump below).

By the way, the lon and lat values in the sfc_data.tile7.nc and gfs_data.tile7.nc files (generated by chgres) all look normal.

Variable: f
Type: file
filename: oro_data
path: oro_data.nc
file global attributes:
dimensions:
   lon = 2560
   lat = 2160
variables:
   float lon ( lon )
      cartesian_axis : X
   float lat ( lat )
      cartesian_axis : Y
   float geolon ( lat, lon )
   float geolat ( lat, lon )
   float slmsk ( lat, lon )
   float land_frac ( lat, lon )
   float orog_raw ( lat, lon )
   float orog_filt ( lat, lon )
   float stddev ( lat, lon )
   float convexity ( lat, lon )
   float oa1 ( lat, lon )
   float oa2 ( lat, lon )
   float oa3 ( lat, lon )
   float oa4 ( lat, lon )
   float ol1 ( lat, lon )
   float ol2 ( lat, lon )
   float ol3 ( lat, lon )
   float ol4 ( lat, lon )
   float theta ( lat, lon )
   float gamma ( lat, lon )
   float sigma ( lat, lon )
   float elvmax ( lat, lon )

Coordinates:

 lon:

337.1 337.1 337.0 337.0 337.0 336.9
336.9 336.9 336.9 336.8 336.8 336.8
336.8 336.7 336.7 336.7 336.7 336.6
336.6 336.6 336.6 336.5 336.5 336.5
336.4 336.4 336.4 336.4 336.3 336.3
336.3 336.3 336.2 336.2 336.2 336.2
336.1 336.1 336.1 336.0 336.0 336.0
...
246.3 246.3 246.3 246.2 246.2 246.2
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
...
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37

 lat:

44.88 44.86 44.84 44.82 44.79 44.77
44.75 44.73 44.70 44.68 44.66 44.64
44.61 44.59 44.57 44.54 44.52 44.50
44.48 44.45 44.43 44.41 44.39 44.36
...
-6.162 -6.186 -6.209 -6.233 -6.256 -6.280
-6.303 -6.327 -6.350 -6.374 -6.398 -6.421
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37 0.9969E+37
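If the decision is to avoid the missing values entirely, one of the options mentioned above is to write simple index values into the 1-d coordinate variables, since on a curvilinear regional grid they cannot carry true longitudes/latitudes anyway. A sketch (array and dimension names are illustrative):

  ! Fill the 1-d 'lon' and 'lat' coordinate variables with index values.
  do i = 1, nlon
    lon(i) = real(i)
  enddo
  do j = 1, nlat
    lat(j) = real(j)
  enddo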

Documentation of chgres_cube output files

We have received a request for documentation of chgres_cube output files (gfs_data.nc, sfc_data.nc, and gfs_bndy*.nc files), including the fields contained within. The fields themselves (at least in the SAR) are a function of the external model and the suite definition file used by CCPP.
