

# EXPS-RUNNING

This repository gathers all the scripts and jobs used to perform yearly simulations on Datarmor based on NEMO release 4.2.
It relies on DCM (the DRAKKAR Configuration Manager) to create and compile a configuration, as well as to submit a yearly simulation.

1 - DCM installation
2 - Build a new configuration environment using DCM
3 - Launch a numerical experiment


## 1 - DCM installation

The following notes rely on the NEMO 4.2.0 official release. This step needs to be done only once.

DCM stands for Drakkar Configuration Manager. It allows you to 1) build a configuration, 2) compile it and 3) submit an experiment.
A complete documentation is available here: https://github.com/meom-group/DCM

Somewhere in your Datarmor HOME, create a new directory; let's call the full path of this directory ZZMYDIR in the following.

Go into this new directory and clone the DCM repository:

cd ZZMYDIR
git clone https://github.com/meom-group/DCM.git DCM_4.0 

The DCM_4.0 sub-folder structure looks like this:

DCM_4.0/
├── DCMTOOLS
│   ├── bin
│   ├── DRAKKAR
│   │   └── NEMO4
│   ├── NEMOREF
│   │   └── nemo_4.2.0
│   └── templates
├── DOC
├── License
└── RUNTOOLS

The NEMO 4.2.0 official release source code (nemo_4.2.0) is located under the NEMOREF sub-folder, as shown in the structure above.

Still at the root of your home directory, create a directory called modules (if you do not already have one) with a DCM sub-folder inside it.
Then copy into it the file called 4.2.0 from /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75:

mkdir -p modules/DCM
cp /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/4.2.0  modules/DCM/.

Open it, and replace the string ZZMYDIR by the full path of the directory you created above.
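
If the placeholder appears literally as ZZMYDIR in the module file, a sed one-liner can do the substitution. This is a sketch assuming that placeholder string; MYPATH below is an illustrative value, to be replaced by your own directory:

```shell
# Substitute the ZZMYDIR placeholder with the actual installation path.
# MYPATH is an illustrative value: use the full path of the directory you created.
MYPATH="$HOME/DCM-INSTALL"
if [ -f "$HOME/modules/DCM/4.2.0" ]; then
    sed -i "s|ZZMYDIR|$MYPATH|g" "$HOME/modules/DCM/4.2.0"
fi
```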

At the top of your .bashrc file, add the 2 lines below, changing the word yourlogin to your own login:

source /usr/share/Modules/3.2.10/init/bash
export MODULEPATH="$MODULEPATH:/home1/datahome/yourlogin/modules:.:"

And elsewhere in your .bashrc file, add the following lines:

# NEMO v4.2.0
module load DCM/4.2.0
alias mkconfdir=$HOMEDCM/bin/dcm_mkconfdir_local
module load NETCDF-test/4.3.3.1-mpt217-intel2018

export UDIR=$HOME/CONFIGS
export PDIR=$HOME/RUNS
export CDIR=$DATAWORK
export SDIR=$SCRATCH
export WORKDIR=$SCRATCH
  • UDIR is the folder where the configuration is built and from which the compilation process is launched
  • PDIR corresponds to the area from which the simulation is handled

Then, as the last step of this installation, source your .bashrc to take the changes above into account:

source .bashrc

Now type the command mkconfdir to check that it works; you should get a result similar to this:

(base) ctalandi@datarmor2 /home1/scratch/ctalandi $ mkconfdir
USAGE : mkconfdir [-h] [-v]  CONFIG CASE
       or
       mkconfdir [-h] [-v] CONFIG-CASE
        [-h ] : print this help message
        [-v ] : print a much more extensive explanation message

PURPOSE : This script is used to create the skeleton of a new NEMO config
          It will create many directories and sub-directories in many places
          according to the environment variables UDIR, CDIR, PDIR and SDIR
          that should be set previously (in your .profile of whatever file)
          The -v option  gives you much more details

## 2 - Build a new configuration environment using DCM

This requires installing 2 dedicated sub-directories under the master CONFIGS & RUNS directories.
For instance, to build a new experiment called NEMO420 (in this example) that relies on the CREG025.L75 configuration, use the following command:

mkconfdir CREG025.L75 NEMO420

CAUTION: make sure that the module DCM/4.2.0 is loaded before launching this command (type module list to check).
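
The check above can also be scripted. A minimal sketch, relying on the LOADEDMODULES variable maintained by the environment-modules system (a colon-separated list of loaded modules):

```shell
# Check that DCM/4.2.0 appears in the list of loaded modules.
# $LOADEDMODULES is the colon-separated list maintained by environment-modules.
case ":${LOADEDMODULES:-}:" in
    *:DCM/4.2.0:*) echo "DCM/4.2.0 is loaded" ;;
    *)             echo "DCM/4.2.0 NOT loaded: run 'module load DCM/4.2.0'" ;;
esac
```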

This command creates 2 dedicated folders:

  • one dedicated to the NEMO source code, from which the compilation is handled, under the master CONFIGS folder
  • one where the user can launch a numerical experiment, under the master RUNS folder

These 2 folders look like this:

  • Under the PDIR directory:
RUNS/RUN_CREG025.L75/CREG025.L75-NEMO420
├── CTL
└── EXE
  • Under the UDIR directory (previously set in your .bashrc file):
CONFIGS/CONFIG_CREG025.L75/CREG025.L75-NEMO420
├── arch
├── cfgs
├── ext
└── src
    ├── ICE
    ├── MY_SRC
    ├── NST
    ├── OCE
    ├── OFF
    ├── SAO
    ├── SAS
    └── TOP

Now, the user can fill these directories as explained below.

The most important sub-directories are arch & src/MY_SRC:
- arch: put the arch-X64_DATARMORMPI.fcm file there. It contains the compilation options specific to Datarmor.

cp /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/arch-X64_DATARMORMPI.fcm $UDIR/CONFIG_CREG025.L75/CREG025.L75-NEMO420/arch/.
- src/MY_SRC: put there the specific modules that are modified with respect to the NEMO reference code

cd $UDIR/CONFIG_CREG025.L75/CREG025.L75-NEMO420/src/MY_SRC
cp /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/NEMO420_MY_SRC_20201110.tar .
tar xvf NEMO420_MY_SRC_20201110.tar
rm -f NEMO420_MY_SRC_20201110.tar

Check that the following FORTRAN modules are present:

ls *90
dtatsd.F90     istate.F90                lbc_lnk_pt2pt_generic.h90  sbcblk.F90   sbcssr.F90   zdftke.F90
iceistate.F90  lbc_lnk_call_generic.h90  mpp_nfd_generic.h90        sbcmod.F90   shapiro.F90
iceupdate.F90  lbclnk.F90                nemogcm.F90                sbc_oce.F90  tradmp.F90
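
The presence check can also be scripted. A minimal sketch, run from the MY_SRC directory, with the file list taken from the listing above:

```shell
# Verify that every expected MY_SRC source file is present
# (list taken from the `ls *90` output above); report any missing one.
missing=0
for f in dtatsd.F90 iceistate.F90 iceupdate.F90 istate.F90 \
         lbc_lnk_call_generic.h90 lbc_lnk_pt2pt_generic.h90 lbclnk.F90 \
         mpp_nfd_generic.h90 nemogcm.F90 sbc_oce.F90 sbcblk.F90 sbcmod.F90 \
         sbcssr.F90 shapiro.F90 tradmp.F90 zdftke.F90; do
    if [ ! -f "$f" ]; then
        echo "MISSING: $f"
        missing=1
    fi
done
if [ "$missing" -eq 0 ]; then
    echo "all MY_SRC files present"
fi
```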

Then, as the last step before compilation, copy the makefile & CPP.keys files:

cp /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/makefile  $UDIR/CONFIG_CREG025.L75/CREG025.L75-NEMO420/.
cp /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/CPP.keys  $UDIR/CONFIG_CREG025.L75/CREG025.L75-NEMO420/.

The makefile sets all the information needed for the compilation, while the CPP.keys file sets the CPP keys specific to this experiment.

Now install the configuration, i.e. build the WORK folder, which gathers all the symbolic links to the NEMO code, including those pointing to MY_SRC:

make install

The result should be similar to this:

CONFIGS/CONFIG_CREG025.L75/CREG025.L75-NEMO420
├── *.*0 -> ext/src/IOIPSL/*.*0
├── arch
├── arch-X64_DATARMORMPI.fcm -> arch/arch-X64_DATARMORMPI.fcm
├── B4_compilation.bash
├── cfgs
├── CPP.keys
├── DCM_4.0
├── dtatsd.F90 -> src/MY_SRC/dtatsd.F90
├── ext
├── iceistate.F90 -> src/MY_SRC/iceistate.F90
├── iceupdate.F90 -> src/MY_SRC/iceupdate.F90
├── install_history
├── istate.F90 -> src/MY_SRC/istate.F90
├── lbc_lnk_call_generic.h90 -> src/MY_SRC/lbc_lnk_call_generic.h90
├── lbclnk.F90 -> src/MY_SRC/lbclnk.F90
├── lbc_lnk_pt2pt_generic.h90 -> src/MY_SRC/lbc_lnk_pt2pt_generic.h90
├── makefile
├── mpp_nfd_generic.h90 -> src/MY_SRC/mpp_nfd_generic.h90
├── nemogcm.F90 -> src/MY_SRC/nemogcm.F90
├── sbcblk.F90 -> src/MY_SRC/sbcblk.F90
├── sbcmod.F90 -> src/MY_SRC/sbcmod.F90
├── sbc_oce.F90 -> src/MY_SRC/sbc_oce.F90
├── sbcssr.F90 -> src/MY_SRC/sbcssr.F90
├── shapiro.F90 -> src/MY_SRC/shapiro.F90
├── src
├── tradmp.F90 -> src/MY_SRC/tradmp.F90
├── WORK -> /home1/datawork/ctalandi/WCREG025.L75-NEMO420/cfgs/CREG025.L75-NEMO420/WORK
└── zdftke.F90 -> src/MY_SRC/zdftke.F90

Then launch the compilation itself:

make

CAUTION: make sure that the following 3 modules are loaded before launching the compilation:

  • NETCDF-test/4.3.3.1-mpt217-intel2018
  • intel-fc-18/18.0.1.163
  • mpt/2.17
    if not, type module load NETCDF-test/4.3.3.1-mpt217-intel2018, or better add it once and for all in your .bashrc file

At the end of the compilation, the NEMO executable is stored in the EXE sub-directory, as detailed below.

Under the PDIR directory:

RUNS/RUN_CREG025.L75/CREG025.L75-NEMO420
├── CTL
└── EXE
  • EXE: location of the nemo4.exe binary resulting from the compilation process
  • CTL: location where the user is going to launch numerical experiments

To handle numerical experiments, a few files and scripts have to be installed, as detailed below:

cd $PDIR/RUN_CREG025.L75/CREG025.L75-NEMO420/CTL
cp -R /home1/datahome/ctalandi/2SHARE/4EMMA/RUNNING-CREG025.L75/RUNS/* .

The following files/folders should be copied:

CREG025.L75-NEMO420_datarmor.sh  includefile.sh                namelist_ice.CREG025.L75-NEMO420  run_nemo.sh
CREG025.L75-NEMO420.db           namelist.CREG025.L75-NEMO420  namelist_ref                      XML
  • XML: contains the XML files to manage outputs
  • namelist.CREG025.L75-NEMO420 & namelist_ref are the ocean model namelists, while namelist_ice.CREG025.L75-NEMO420 is the sea-ice model namelist

Other files are detailed in the next section.


## 3 - Launch a numerical experiment

No details are given hereafter about how to set the physics/numerics in the NEMO namelists, only how to perform a simulation.

Move into the simulation manager folder:

cd $PDIR/RUN_CREG025.L75/CREG025.L75-NEMO420/CTL

Only 2 files should be modified to perform a simulation:

  • CREG025.L75-NEMO420.db
    The starting date of the simulation is set through the nn_date0 parameter in the ocean namelist, here 1979 January the 1st.
    The duration of one simulation stream is given by the total number of model iterations set in the CREG025.L75-NEMO420.db file.
    For instance:
cat CREG025.L75-NEMO420.db

gives the following result:

1 1 3720

3 columns with the following values from left to right:

  • 1: the current stream number; 1 means the beginning of the simulation
  • 1: the first model iteration of stream #1
  • 3720: the total number of iterations to perform; with the model time step rn_Dt set to 720 s in the ocean namelist, this corresponds to 31 days (3720 x 720 s / 86400)

At the end of the simulation, it should look like this:

1 1    3720  19790131
2 3721 7440

The final date of the stream is appended to the first line, here 1979 January 31st as expected.
A second line is also added; it corresponds to the next stream (stream #2), assuming the same length, i.e. 31 days.
This length can be changed for each new stream.
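
The stream-length arithmetic above can be cross-checked directly in the shell:

```shell
# Cross-check the stream length: 3720 iterations at rn_Dt = 720 s,
# with 86400 seconds per day.
niter=3720
rn_Dt=720
echo "$(( niter * rn_Dt / 86400 )) days"
```

This prints "31 days", consistent with the 19790131 end date written to the db file.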

  • includefile.sh
    File in which are set:
    - the paths where to find input files such as the initial state, surface forcing, runoffs, etc.
    - most "classical" input files referenced in the namelists are copied automatically into the running directory; new input files, not expected by this NEMO 4.2.0 release, must be copied by hand
    - the effective running directory is set with the TMPDIR variable (change it if needed)
    - check also that the CTL_DIR variable points to your CTL directory, where your includefile.sh is located
    - adapt the MAILTO variable
    - the location of the XIOS binary is set with the P_XIOS_DIR variable; it is currently set to the version I have compiled
    - the MAXSUB variable at the end of the file sets the maximum number of automatic job re-submissions; this is useful when every stream has the same length, for instance a one-year-long simulation of 365 days. It can be left at 1, in which case no automatic re-submission occurs and each new simulation has to be launched by hand
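
As a sketch, the variables listed above look like this in includefile.sh; the variable names are those cited in the text, but all the values below are placeholders to adapt to your own setup, not the actual settings:

```shell
# Illustrative excerpt of includefile.sh: variable names are those cited above,
# values are placeholders to adapt to your own setup.
CTL_DIR=$PDIR/RUN_CREG025.L75/CREG025.L75-NEMO420/CTL  # your CTL directory
TMPDIR=$SCRATCH/CREG025.L75-NEMO420                    # effective running directory
MAILTO=yourlogin@your.domain                           # notification address (placeholder)
P_XIOS_DIR=/path/to/your/xios                          # location of the XIOS binary (placeholder)
MAXSUB=1                                               # 1 = no automatic job re-submission
```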

The job used to perform one simulation is CREG025.L75-NEMO420_datarmor.sh; it contains a Datarmor batch header, including the total number of MPI cores to be used. Nothing has to be changed in it.

Launch a simulation:

./run_nemo.sh

To control the job status:

qstat -u yourlogin

A one-year-long simulation requires ~4h45 of elapsed time with a 5-day mean output frequency.
