Changeset f20342a in flex_extract.git for For_developers/Sphinx/source/Documentation


Timestamp:
May 27, 2020, 8:01:54 PM
Author:
Petra Seibert <petra.seibert [at] univie.ac.at>
Branches:
master, ctbto, dev
Children:
550435b
Parents:
a14839a
Message:

Language corrections for the Sections Developers, Support, Changelog, and the home directory (index.html)

further improvement of documentation, close to final

Location:
For_developers/Sphinx/source/Documentation
Files:
21 edited

Legend:

Lines marked with "-" were removed, lines marked with "+" were added; unmarked lines are unchanged context, and "…" marks omitted unchanged lines.
  • For_developers/Sphinx/source/Documentation/Api/api_fortran.rst

    rba99230 → rf20342a

      **************************************
    - Fortran's Auto Generated Documentation
    + Auto-generated documentation for the Fortran programme
      **************************************

    …
          :local:

    -
    -
    - Link to other documentation!
    +
    + `Fortran API <Fortran/index.html>`_

    -
    -
    -
    - .... f:autoprogram:: preconvert
    -
    -
    -

      .. toctree::
          :hidden:
          :maxdepth: 2
    -
    -
    -
    -
  • For_developers/Sphinx/source/Documentation/Api/api_python.rst

    rba99230 → rf20342a

      *************************************
    - Python's Auto Generated Documentation
    + Auto-generated documentation for the Python scripts
      *************************************

  • For_developers/Sphinx/source/Documentation/Input/changes.rst

    rba99230 → rf20342a

          - comments available with ``#``
          - only parameters which are needed to override the default values are necessary
    -     - number of type/step/time elements do not have to be 24 any more. Just select the interval you need.
    -     - the ``dtime`` parameter needs to be consistent with ``type/step/time``. For example ``dtime`` can be coarser as ``time`` intervals are available, but not finer.
    +     - number of type/step/time elements does not have to be 24 anymore. Just provide what you need.
    +     - the ``dtime`` parameter needs to be consistent with ``type/step/time``, for example, ``dtime`` can be coarser than the ``time`` intervals available, but not finer.

  • For_developers/Sphinx/source/Documentation/Input/compilejob.rst

    rb1674ed → rf20342a

      ********************************************
    - The Compilation Jobscript ``compilejob.ksh``
    + The compilation job script ``compilejob.ksh``
      ********************************************

    - The compilejob is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.
    + The compile job is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.

    - ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
    + ``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
      the Korn-shell script from the template files by substituting the individual parameters.
      These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.

    - The jobscript has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
    + The job script has a number of settings for the batch system which are fixed, and it differentiates between the *ecgate* and the *cca/ccb*
      server system to load the necessary modules for the environment when submitted to the batch queue.

    …
      ------------------------------------

    -  #. It sets necessary batch system parameters
    +  #. It sets the necessary batch-system parameters
       #. It prepares the job environment at the ECMWF servers by loading the necessary library modules
    -  #. It sets some environment variabels for the single session
    +  #. It sets some environment variables for the single session
       #. It creates the ``flex_extract`` root directory in the ``$HOME`` path of the user
    -  #. It untars the tar-ball into the root directory.
    -  #. It compiles the Fortran programs's ``Makefile``.
    -  #. At the end it checks if the script returned an error or not and send the log file via email to the user.
    +  #. It untars the tarball into the root directory.
    +  #. It compiles the Fortran program using ``Makefile``.
    +  #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.

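The doubled ``$`` convention mentioned in this hunk can be illustrated with a small, self-contained sketch. This is an editorial illustration, not code from ``flex_extract``: it uses Python's standard ``string.Template`` (which happens to use the same ``$``/``$$`` convention) instead of genshi, and the template text and parameter name are made up.

    from string import Template

    # Hypothetical fragment of a Korn-shell job-script template: "$$SCRATCH" must
    # survive substitution as a literal shell variable "$SCRATCH", while the
    # placeholder "$username" is filled in at generation time.
    template_text = "#SBATCH --job-name=compile_$username\ncd $$SCRATCH\n"

    # "$$" is treated as an escaped "$", mirroring the doubled-dollar convention
    # described for the genshi templates used by flex_extract.
    print(Template(template_text).substitute(username="uid123"))
    # #SBATCH --job-name=compile_uid123
    # cd $SCRATCH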
  • For_developers/Sphinx/source/Documentation/Input/control.rst

    rb1674ed → rf20342a

      This file is an input file for :literal:`flex_extract's` main script :literal:`submit.py`.
    - It contains the controlling parameters :literal:`flex_extract` needs to decide on dataset specifications,
    - handling of the retrieved data and general bahaviour. The naming convention is usually (but not necessary):
    + It contains the controlling parameters which :literal:`flex_extract` needs to decide on data set specifications,
    + handling of the data retrieved, and general behaviour. The naming convention is usually (but not necessarily):

         :literal:`CONTROL_<Dataset>[.optionalIndications]`

    - The tested datasets are the operational dataset and the re-analysis datasets CERA-20C, ERA5 and ERA-Interim.
    - The optional extra indications for the re-analysis datasets mark the files for *public users*
    - and *global* domain. For the operational datasets (*OD*) the file names contain also information of
    - the stream, the field type for forecasts, the method for extracting the vertical coordinate and other things like time or horizontal resolution.
    + There are a number of data sets for which the procedures have been tested: the operational data and the re-analysis data sets CERA-20C, ERA5, and ERA-Interim.
    + The optional indications for the re-analysis data sets mark the files for *public users*
    + and *global* domain. For the operational data sets (*OD*), the file names also contain information on
    + the stream, the field type for forecasts, the method for extracting the vertical wind, and other information such as temporal or horizontal resolution.

    …
      ----------------------------------
      The first string of each line is the parameter name, the following string(s) (separated by spaces) is (are) the parameter values.
    - The parameters can be sorted in any order with one parameter per line.
    + The parameters can be listed in any order with one parameter per line.
      Comments are started with a '#' - sign. Some of these parameters can be overruled by the command line
      parameters given to the :literal:`submit.py` script.
    - All parameters have default values. Only those parameters which have to be changed
    - must be listed in the :literal:`CONTROL` files.
    + All parameters have default values; only those parameters which deviate from the default
    + have to be listed in the :literal:`CONTROL` files.

    …
      A number of example files can be found in the directory :literal:`flex_extract_vX.X/Run/Control/`.
    - They can be used as a template for adaptations and understand what's possible to
    - retrieve from ECMWF's archive.
    - For each main dataset there is an example and additionally some variances in resolution, type of field or type of retrieving the vertical coordinate.
    + They can be used as a template for adaptation, and to understand what can be
    + retrieved from ECMWF's archives.
    + There is an example for each main data set and, in addition, some variants with respect to resolution, type of field, or way of retrieving the vertical wind.

    …
      ------------
      The file :literal:`CONTROL.documentation` documents the available parameters
    - in grouped sections with their default values. In :doc:`control_params` you can find a more
    - detailed description with additional hints, possible values and some useful information about
    + in grouped sections together with their default values.
    + In :doc:`control_params`, you can find a more
    + detailed description with additional hints, possible values, and further information about
      the setting of these parameters.

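As an editorial illustration of the ``CONTROL``-file format described in this hunk (parameter name first, value(s) after it, ``#`` starting a comment, defaults used for anything not listed), here is a minimal sketch. It is not the parser used by ``flex_extract``, and the parameter names and default values are hypothetical.

    # Editorial sketch, not flex_extract's actual parser: read a CONTROL file in
    # the format described above. The default values below are hypothetical.
    DEFAULTS = {"DTIME": "3", "TYPE": ["AN"], "GRID": "1.0"}

    def read_control(path, defaults=DEFAULTS):
        params = dict(defaults)
        with open(path) as f:
            for line in f:
                line = line.split("#", 1)[0].strip()   # '#' starts a comment
                if not line:
                    continue
                name, *values = line.split()           # first token = parameter name
                # keep a single value as a string, several values as a list
                params[name.upper()] = values[0] if len(values) == 1 else values
        return params

    # Usage (assuming such a file exists in the run directory):
    # params = read_control("Run/Control/CONTROL_EA5")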
  • For_developers/Sphinx/source/Documentation/Input/control_params.rst

    rb1674ed → rf20342a

      ************

    - .. exceltable:: User parameter in CONTROL file
    + .. exceltable:: User parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: UserSection
    …
      ***************

    - .. exceltable:: General parameter in CONTROL file
    + .. exceltable:: General parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: GeneralSection
    …
      ************

    - .. exceltable:: Time parameter in CONTROL file
    + .. exceltable:: Time parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: TimeSection
    …
      ************

    - .. exceltable:: Data parameter in CONTROL file
    + .. exceltable:: Data parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: DataSection
    …
      ******************

    - .. exceltable:: Data field parameter in CONTROL file
    + .. exceltable:: Data field parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: DatafieldsSection
    …
      *****************

    - .. exceltable:: Flux data parameter in CONTROL file
    + .. exceltable:: Flux data parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: FluxDataSection
    …
      **************

    - .. exceltable:: Domain parameter in CONTROL file
    + .. exceltable:: Domain parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: DomainSection
    …
      ***********************

    - .. exceltable:: Additional data parameter in CONTROL file
    + .. exceltable:: Additional data parameters in CONTROL file
         :file: ../../_files/CONTROLparameter.xls
         :sheet: AddDataSection
  • For_developers/Sphinx/source/Documentation/Input/ecmwf_env.rst

    rb1674ed → rf20342a

      ****************************************
    - ECMWF User Credential file ``ECMWF_ENV``
    + ECMWF user credential file ``ECMWF_ENV``
      ****************************************

    …
      ------------------------

    - The following shows an example of the content of an ``ECMWF_ENV`` file:
    + An example of the content of an ``ECMWF_ENV`` file is shown below:

      .. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/examples.rst

    rb1674ed → rf20342a

      **********************

    - ``Flex_extract`` has a couple of example ``CONTROL`` files for a number of different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.
    + ``Flex_extract`` comes with a number of example ``CONTROL`` files for various data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.

    - Here is a list of the example files and a description of the data set:
    + Here is a list of the example files:

      CONTROL.documentation
    -    This file is not intended to be used with ``flex_extract``. It has a list of all possible parameters and their default values for a quick overview.
    +    This file is not intended to be used with ``flex_extract``. It just contains a list of all possible parameters and their default values for a quick overview.

      .. code-block:: bash
    …
              CONTROL_OD.OPER.FC.gauss.highres
              CONTROL_OD.OPER.FC.operational
    -         CONTROL_OD.OPER.FC.twiceaday.1hourly
    -         CONTROL_OD.OPER.FC.twiceaday.3hourly
    +         CONTROL_OD.OPER.FC.twicedaily.1hourly
    +         CONTROL_OD.OPER.FC.twicedaily.3hourly

    -
    +    #PS some information to be added.
    +

      .. toctree::
          :hidden:
  • For_developers/Sphinx/source/Documentation/Input/fortran_makefile.rst

    rb1674ed → rf20342a

      **************************************
    - The Fortran Makefile - ``calc_etadot``
    + The Fortran makefile for ``calc_etadot``
      **************************************

      .. _ref-convert:

    - ``Flex_extract``'s Fortran program will be compiled during
    - the installation process to get the executable named ``calc_etadot``.
    + The Fortran program ``calc_etadot`` will be compiled during
    + the installation process to produce the executable called ``calc_etadot``.

    - ``Flex_extract`` has a couple of ``makefiles`` prepared which can be found in the directory
    - ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number.
    - A list of these ``makefiles`` are shown below:
    + ``Flex_extract`` includes several ``makefiles`` which can be found in the directory
    + ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted by the current flex_extract version number.
    + A list of these ``makefiles`` is shown below:

    …
      | Files to be used as they are!

    -     | **makefile_ecgate**
    -     | For the use on ECMWF's server **ecgate**.
    -
    -     | **makefile_cray**
    -     | For the use on ECMWF's server **cca/ccb**.
    +     | **makefile_ecgate**: For use on ECMWF's server **ecgate**.
    +     | **makefile_cray**:   For use on ECMWF's server **cca/ccb**.

      | **Local mode**
    - | It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB**
    + | It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB** if they don't correspond to the standard paths pre-set in the makefiles.

    -     | **makefile_fast**
    -     | For the use with gfortran compiler and optimization mode.
    +     | **makefile_fast**:  For use with the gfortran compiler and optimisation mode.
    +     | **makefile_debug**: For use with the gfortran compiler and debugging mode. Primarily for developers.

    -     | **makefile_debug**
    -     | For the use with gfortran compiler in debugging mode.
    + If you want to use a compiler other than gfortran locally, you can still use ``makefile_fast``,
    + and adapt everything that is compiler-specific in this file.

    -
    - For instructions on how to adapt the ``makefiles`` for the local application mode
    + For instructions on how to adapt the ``makefile`` (local application mode only),
      please see :ref:`ref-install-local`.
    -

  • For_developers/Sphinx/source/Documentation/Input/jobscript.rst

    rb1674ed → rf20342a

      *************************
    - The Jobscript ``job.ksh``
    + The job script ``job.ksh``
      *************************

    - The jobscript is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.
    + The job script is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.

    - It is based on the ``job.temp`` template file which is stored in the ``Templates`` directory.
    - This template is by itself generated in the installation process from a ``job.template`` template file.
    + It is based on the ``job.temp`` template file stored in the ``Templates`` directory.
    + This template is generated in the installation process from a ``job.template`` template file.

    - ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
    + ``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
      the Korn-shell script from the template files by substituting the individual parameters.
    - These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.
    + These individual parameters are marked by ``$$`` in ``job.temp``.

    - The jobscript has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
    + The job script has a number of settings for the batch system which are fixed, and differentiates between the *ecgate* and the *cca/ccb*
      server system to load the necessary modules for the environment when submitted to the batch queue.

    …

    - What does the jobscript do?
    + What does the job script do?
      ---------------------------

    -  #. It sets necessary batch system parameters
    -  #. It prepares the job environment at the ECMWF servers by loading the necessary library modules
    -  #. It sets some environment variabels for the single session
    -  #. It creates the directory structure in the users ``$SCRATCH`` file system
    -  #. It creates a CONTROL file on the ECMWF servers whith the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are given to the jobscript with its default or the user defined values. It also sets the ``CONTROL`` as an environment variable.
    -  #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new pathes for input and output directory and the recently generated ``CONTROL`` file.
    -  #. At the end it checks if the script returned an error or not and send the log file via email to the user.
    +  #. It sets necessary batch system parameters.
    +  #. It prepares the job environment at the ECMWF servers by loading the necessary library modules.
    +  #. It sets some environment variables for the single session.
    +  #. It creates the directory structure in the user's ``$SCRATCH`` file system.
    +  #. It creates a CONTROL file on the ECMWF servers with the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are passed to the job script with their default or the user-defined values. It also sets ``CONTROL`` as an environment variable.
    +  #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new paths for input and output directories and the recently generated ``CONTROL`` file.
    +  #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.

  • For_developers/Sphinx/source/Documentation/Input/run.rst

    rb1674ed → rf20342a

      **********************************
    - The executable Script - ``run.sh``
    + The executable script - ``run.sh``
      **********************************

    - The execution of ``flex_extract`` is done by the ``run.sh`` Shell script, which is a wrapping script for the top-level Python script ``submit.py``.
    + The execution of ``flex_extract`` is done by the ``run.sh`` shell script, which is a wrapper script for the top-level Python script ``submit.py``.
      The Python script constitutes the entry point to ECMWF data retrievals with ``flex_extract`` and controls the program flow.

    - ``submit.py`` has two (three) sources for input parameters with information about program flow and ECMWF data selection, the so-called ``CONTROL`` file,
    - the command line parameters and the so-called ``ECMWF_ENV`` file. Whereby, the command line parameters will override the ``CONTROL`` file parameters.
    + ``submit.py`` has two (or three) sources for input parameters with information about program flow and ECMWF data selection: the so-called ``CONTROL`` file,
    + the command line parameters, and the so-called ``ECMWF_ENV`` file. Command line parameters will override parameters specified in the ``CONTROL`` file.

    - Based on these input information ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a Web API on a local maschine or submit a jobscript to ECMWF servers and retrieve the data there with sending the files to the local system eventually.
    + Based on this input information, ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a web API on a local machine, or submit a job script to an ECMWF server and retrieve the data there, finally sending the files to the local system.

    - Submission Parameter
    + Submission parameters
      --------------------

    - .. exceltable:: Parameter for Submission
    + .. exceltable:: Parameters for submission
          :file:  ../../_files/SubmitParameters.xls
          :header: 1
    …
      ---------------------------------

    - It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapping Shell script ``run.sh``.  This top-level script is located in
    - ``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
    - command line parameter.
    + It is also possible to start ``flex_extract`` directly from the command line by using the ``submit.py`` script instead of the wrapper shell script ``run.sh``. This top-level script is located in
    + ``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter,
    + we see again all possible command line parameters.

      .. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/setup.rst

    rb1674ed → rf20342a

      **************************************
    - The Installation Script - ``setup.sh``
    + The installation script - ``setup.sh``
      **************************************

    -
    - The installation of ``flex_extract`` is done by the Shell script ``setup.sh`` which is located in the root directory of ``flex_extract``.
    - It calls the top-level Python script ``install.py`` which does all necessary operations to prepare the selected application environment. This includes:
    -
    - - preparing the file ``ECMWF_ENV`` with the user credentials for member state access to ECMWF servers (in **remote** and **gateway** mode)
    + The installation of ``flex_extract`` is done by the shell script ``setup.sh`` located in the root directory of ``flex_extract``.
    + It calls the top-level Python script ``install.py`` which does all the necessary operations to prepare the application environment selected. This includes:
    +
    + - preparing the file ``ECMWF_ENV`` with the user credentials for member-state access to ECMWF servers (in **remote** and **gateway** mode)
      - preparation of a compilation Korn-shell script (in **remote** and **gateway** mode)
      - preparation of a job template with user credentials (in **remote** and **gateway** mode)
    - - create a tar-ball of all necessary files
    - - copying tar-ball to target location (depending on application mode and installation path)
    - - submit compilation script to batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar tar-ball at target location (**local mode**)
    - - compilation of the FORTRAN90 program ``calc_etadot``
    -
    -
    - The Python installation script ``install.py`` has a couple of command line arguments which are defined in ``setup.sh`` in the section labelled with "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters for his personal use. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee necessary parameters were set.
    + - create a tarball of all necessary files
    + - copying the tarball to the target location (depending on application mode and installation path)
    + - submit the compilation script to the batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar the tarball at the target location (**local mode**)
    + - compilation of the Fortran program ``calc_etadot``
    +
    +
    + The Python installation script ``install.py`` has several command line arguments defined in ``setup.sh``, in the section labelled "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters according to his/her personal needs. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee that the necessary parameters were set.

      After the installation process, some tests can be conducted. They are described in section :ref:`ref-testinstallfe`.

    - The following diagram sketches the involved files and scripts in the installation process:
    + The following diagram sketches the files and scripts involved in the installation process:

      .. _ref-install-blockdiag:
    …

      .. blockdiag::
    -    :caption: Diagram of data flow during the installation process. The trapezoids are input files with the light blue area being the template files. The edge-rounded, orange boxes are the executable files which start the installation process and reads the input files. The rectangular, green boxes are the output files. The light green files are files which are only needed in the remota and gateway mode.
    +    :caption: Diagram of data flow during the installation process. Trapezoids are input files with the light blue area being the template files. Round-edge orange boxes are executable files which start the installation process and read the input files. Rectangular green boxes are output files. Light green files are needed only in the remote and gateway mode.

         blockdiag {
    …
      .. _ref-instparams:

    - Installation Parameter
    - ----------------------
    + Installation parameters
    + -----------------------

      .. exceltable:: Parameter for Installation
    …
      ----------------------------------

    - It is also possible to start the installation process of ``flex_extract`` directly from command line by using the ``install.py`` script instead of the wrapping Shell script ``setup.sh``.  This top-level script is located in
    - ``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
    - command line parameter.
    + It is also possible to start the installation process of ``flex_extract`` directly from the command line by using the ``install.py`` script instead of the wrapper shell script ``setup.sh``. This top-level script is located in
    + ``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter,
    + we see again all possible command line parameters.

      .. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/templates.rst

    rb1674ed → rf20342a

      *********

    - In ``flex_extract`` we use the Python package `genshi <https://genshi.edgewall.org/>`_ to create specific files from templates. It is the most efficient way to be able to quickly adapt e.g. the job scripts send to the ECMWF batch queue system or the namelist file für the Fortran program without the need to change the program code.
    + In ``flex_extract``, the Python package `genshi <https://genshi.edgewall.org/>`_ is used to create specific files from templates. It is the most efficient way to be able to quickly adapt, e.g., the job scripts sent to the ECMWF batch queue system, or the namelist file for the Fortran program, without the need to change the program code.

      .. note::
    -    Usually it is not recommended to change anything in these files without being able to understand the effects.
    +    Do not change anything in these files unless you understand the effects!

    - Each template file has its content framework and keeps so-called placeholder variables in the positions where the values needs to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used the ``$`` sign needs to be doubled to `escape` and keep a single ``$`` sign as it is.
    + Each template file has its content framework and keeps so-called placeholder variables in the positions where the values need to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used, the ``$`` sign needs to be doubled for `escaping`.

    - The following templates are used and can be found in directory ``flex_extract_vX.X/Templates``:
    + The following templates are used; they can be found in the directory ``flex_extract_vX.X/Templates``:

      convert.nl
      ----------

    -     This is the template for a Fortran namelist file called ``fort.4`` which will be read by ``calc_etadot``.
    +     This is the template for a Fortran namelist file called ``fort.4`` read by ``calc_etadot``.
          It contains all the parameters ``calc_etadot`` needs.

    …
          This template is used to create the job script file called ``compilejob.ksh`` during the installation process for the application modes **remote** and **gateway**.

    -     At the beginning some directives for the batch system are set.
    -     On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    -     For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    -
    -     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
    -
    -     Afterwards the installation steps as such are done. Including the generation of the root directory, putting files in place, compiling the Fortran program and sending a log file via email.
    +     At the beginning, some directives for the batch system are set.
    +     On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    +     For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    +
    +     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.
    +
    +     Afterwards, the installation steps as such are done. They include the generation of the root directory, putting files in place, compiling the Fortran program, and sending a log file by email.

          .. code-block:: ksh
    …
          This template is used to create the actual job script file called ``job.ksh`` for the execution of ``flex_extract`` in the application modes **remote** and **gateway**.

    -     At the beginning some directives for the batch system are set.
    -     On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    -     For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    -
    -     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
    -
    -     Afterwards the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end a log file is send via email.
    +     At the beginning, some directives for the batch system are set.
    +     On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    +     For the high performance computers **cca** and **ccb**, the ``PBS`` comments are necessary;
    +     for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    +
    +     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.
    +
    +     Afterwards, the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end, a log file is sent by email.

          .. code-block:: ksh
    …
      ------------

    -     This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. A couple of parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.
    +     This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. Several parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.

          .. code-block:: ksh
    …

    - (15 blank lines)

  • For_developers/Sphinx/source/Documentation/Overview/app_modes.rst

    rb1674ed → rf20342a

      *****************
    - Application Modes
    + Application modes
      *****************

    …
      .. _ref-app-modes:

    - Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has 4 different :underline:`user application modes`:
    + Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has four different :underline:`user application modes`:

      .. _ref-remote-desc:

        1. Remote (member)
    -       In the **Remote mode** the user works directly on ECMWF Linux member state server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the additional third-party libraries mentioned in :ref:`ref-requirements` as ECMWF provides everything with environment modules. The module selection will be done automatically in ``flex_extract``.
    +       In the **Remote mode** the user works directly on an ECMWF member-state Linux server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the third-party libraries mentioned in :ref:`ref-requirements`, as ECMWF provides everything with environment modules. The module selection will be done automatically by ``flex_extract``.

      .. _ref-gateway-desc:

        2. Gateway (member)
    -       The **Gateway mode** can be used if a local member state gateway server is in place. Then the job scripts can be submitted to the ECMWF Linux member state server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed at the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and some extra setup is done in the local ``flex_extract`` directory at the local gateway server. For more information about establishing a gateway server please see `ECMWF's instructions on gateway server`_. For the **Gateway mode** the necessary environment has to be established which is described in :ref:`ref-prep-gateway`.
    +       The **Gateway mode** can be used if a local member-state gateway server is in place. Then, the job scripts can be submitted to the ECMWF member-state Linux server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed on the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and that some extra setup is done in the ``flex_extract`` directory on the local gateway server. For more information about establishing a gateway server, please refer to `ECMWF's instructions on gateway server`_. For the **Gateway mode**, the necessary environment has to be established as described in :ref:`ref-prep-gateway`.

      .. _ref-local-desc:

        3. Local member
    -       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`.
    +       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`.

        4. Local public
    -       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`. In this case a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.
    +       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. In this case, a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.

  • For_developers/Sphinx/source/Documentation/Overview/prog_flow.rst

    rb1674ed → rf20342a

      ************
    - Program Flow
    + Program flow
      ************

    …
      .. figure:: ../../_files/submit.png

    -     Overview of the call of python's ``submit.py`` script and raw sequence of working steps done in ``flex_extract``.
    +     Overview of the call of the ``submit.py`` Python script and raw sequence of work steps in ``flex_extract``.


    - The ``submit.py`` Python program is called by the Shell script ``run.sh`` or ``run_local.sh`` and accomplish the following steps:
    + The ``submit.py`` Python script is called by the shell script ``run.sh`` or ``run_local.sh`` and accomplishes the following steps:

    -     1. Setup the control data:
    -         It gets all command-line and ``CONTROL`` file parameters as well as optionally the ECMWF user credentials. Depending the :doc:`app_modes`, it might also prepare a job script which is then send to the ECMWF queue.
    -     2. Retrieves data from MARS:
    -         It creates and sends MARS-requests either on the local machine or on ECMWF server, that receives the data and stores them in a specific format in GRIB files. If the parameter ``REQUEST`` was set ``1`` the data are not received but a file ``mars_requests.csv`` is created with a list of MARS requests and their settings. If it is set to ``2`` the file is created in addition to retrieving the data. The requests are created in an optimised way by splitting in time, jobs  and parameters.
    -     3. Post-process data to create final ``FLEXPART`` input files:
    -         After all data is retrieved, the disaggregation of flux fields (`see here <../disagg.html>`_ ) is done as well as the calculation of vertical velocity (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Eventually, the GRIB fields are merged together such that a single grib file per time step is available with all fields for ``FLEXPART``. Since model level fields are typically in *GRIB2* format whereas surface level fields are still in *GRIB1* format, they can be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading pure *GRIB2* files since some parameter IDs change in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files can be communicated to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally the progress of the script can be followed with the usual Linux tools.
    +     1. Setup of control data:
    +         Command-line and ``CONTROL``-file parameters are read, as well as (optionally) the ECMWF user credentials. Depending on the :doc:`app_modes`, a job script might be prepared which is then sent to the ECMWF queue.
    +     2. Retrieval of data from MARS:
    +         MARS requests are created either on the local machine or on the ECMWF server and then submitted, which retrieves the data and stores them in GRIB files. If the parameter ``REQUEST`` was set to ``1``, the data are not retrieved and instead a file ``mars_requests.csv`` is created, which contains a list of the MARS requests and their settings. If ``REQUEST`` is set to ``2``, the csv file is created in addition to retrieving the data. The requests are created in an optimised way by splitting with respect to time, jobs, and parameters.
    +     3. Post-processing of data to create final ``FLEXPART`` input files:
    +         After all data have been retrieved, flux fields are disaggregated (`see here <../disagg.html>`_ ) and the vertical velocity is calculated (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Finally, the GRIB fields are merged into a single grib file per time step containing all the fields for ``FLEXPART``. Since model-level fields are typically in *GRIB2* format, whereas surface-level fields are still in *GRIB1* format, they will be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading these *GRIB2* files since some parameter IDs have been changed in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files will be sent to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY`` and ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally, the progress of the script can be followed with the usual Linux tools.

    …
      ========================================

    - More details on how different the program flow is for the different :doc:`app_modes` is sketched in the following diagrams
    + The following diagrams show how different the program flow is for the different :doc:`app_modes`:

      +-------------------------------------------------+------------------------------------------------+
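As an aside on the ``REQUEST`` behaviour described in step 2 above, the following minimal sketch shows the kind of branching involved. This is an editorial illustration, not flex_extract's actual code: the request fields, the helper function, and the assumed meaning of ``REQUEST=0`` (plain retrieval without a CSV file) are made up.

    import csv

    def process_requests(requests, request_flag, csv_path="mars_requests.csv"):
        """Editorial sketch: REQUEST=1 writes only the CSV, REQUEST=2 writes the
        CSV and retrieves the data; 0 is assumed to mean retrieval without CSV."""
        if request_flag in (1, 2):
            with open(csv_path, "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=sorted(requests[0]))
                writer.writeheader()
                writer.writerows(requests)
        if request_flag != 1:
            for req in requests:
                submit_to_mars(req)        # stand-in for the real MARS retrieval

    def submit_to_mars(req):
        print("would retrieve:", req)      # hypothetical placeholder

    # Example with a single, illustrative request dictionary:
    process_requests([{"date": "20200527", "type": "AN", "param": "130"}], request_flag=1)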
  • For_developers/Sphinx/source/Documentation/api.rst

    rba99230 → rf20342a

      ****************************
    - Auto Generated Documentation
    + Auto-generated documentation
      ****************************

  • For_developers/Sphinx/source/Documentation/disagg.rst

    rd9abaac rf20342a  
    11***************************
    2 Disaggregation of Flux Data
     2Disaggregation of flux data
    33***************************
    44   
    5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational particles in time and space. This method requires point values in the discrete input fields. However, flux data (as listed in table :ref:`ref-table-fluxpar`) from the ECMWF represent cell averages or integrals and are accumulated over a specific time interval, depending on the dataset. Hence, to conserve the integral quantity with ``FLEXPART``'s linear interpolation a pre-processing scheme has to be applied.
     5``FLEXPART`` interpolates meteorological input data linearly to the position of computational
     6particles in time and space. This method requires point values in the discrete input fields.
     7However, flux data (as listed in table :ref:`ref-table-fluxpar` below) from the ECMWF represent cell
     8averages or integrals and are accumulated over a specific time interval, depending on the data
     9set. Hence, to conserve the integral quantity with the linear interpolation used in ``FLEXPART``,
     10pre-processing has to be applied.
    611
    712.. _ref-table-fluxpar:
    813
    9 .. csv-table:: flux fields
     14.. csv-table:: Flux fields
    1015    :header: "Short Name", "Name", "Units", "Interpolation Type"
    1116    :align: center
     
    2025   
    2126
    22 The first step is to *de-accumulate* the fields in time so that each value represents an integral in x, y, t space.
    23 Afterwards, a *disaggregation* scheme is applied which means to break down the integral value into point values.
    24 In order to be able to carry out the disaggregation procedure proposed by Paul James, additional flux data is retrieved automatically for one day at the beginning and one day at the end of the period specified. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not communicated to the final ``FLEXPART`` input files.
     27The first step is to *de-accumulate* the fields in time so that each value represents non-overlapping integrals in x-, y-, and t-space.
     28Afterwards, a *disaggregation* scheme is applied which means to convert the integral value to corresponding point values to be used late for the interpolation.
     29The disaggregation procedure as proposed by Paul James (currently, the standard) requires additional flux data for one day at the beginning and one day at the end of the period specified.
     30They are retrieved automatically. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not contained in the final ``FLEXPART`` input files.
    2531
    26 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note, that the first two and last two flux files do not contain any data.
     32The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note that the first two and last two flux files do not contain any data.
    2733
    2834.. note::
    2935
    30     Note also that for operational retrievals (``BASETIME`` set to 00 or 12) forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on demand scripts.       
     36    Note also that for operational retrievals (``BASETIME`` set to 00 or 12), forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on-demand scripts.       
    3137       
    3238
     
    3440--------------------------------------------------
    3541
    36 In ``flex_extract`` up to version 5 the disaggregation was done with a Fortran program (FLXACC2). In version 6 this part was converted to Python.
     42In ``flex_extract`` up to version 5, the disaggregation was done with a Fortran program (FLXACC2). In version 6, this part was recoded in Python.
    3743
    38 
    39 In the old versions (below 7.1) a relatively simple method processes the precipitation fields in a way that is consistent with the scheme applied in ``FLEXPART`` for all variables: linear interpolation between times where input fields are available.
    40 At first the accumulated values are divided by the number of hours (i.e., 3 or 6).
     44In the old versions (below 7.1), a relatively simple method processes the precipitation fields in a way that is consistent with the linear interpolation between times where input fields are available that is applied in ``FLEXPART`` for all variables.
     45This scheme (from Paul James) at first divides the accumulated values by the number of hours (i.e., 3 or 6). ???
    4146The best option for disaggregation, which was realised, is conservation within the interval under consideration plus the two adjacent ones.
    4247Unfortunately, this leads to undesired temporal smoothing of the precipitation time series – maxima are damped and minima are raised.
     
    5358    :figclass: align-center
    5459
    55     Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by a half-time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).
     60    Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by half a time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).
    5661
    5762
    5863
    59 Disaggregation is done for 4 adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the 4 adjacent timespans.
     64Disaggregation is done for four adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the four adjacent timespans.
    6065
    6166.. math::
     
    6974
    7075
    71 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the 4 original timespans has a value below 0 it is set to 0 prior to the calculation.
     76This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the four original timespans has a value below 0, it is set to 0 prior to the calculation.
    7277   
    7378.. math::
     
    7883
    7984
    80 
    81 
    8285Disaggregation for precipitation in version 7.1
    8386-----------------------------------------------
    8487
    85 Due to the problems with generating precipitation in originally dry (or lower) intervals and the temporal smoothing a new algorithm was developed. The approach is based on a one dimensional piecewise linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements by preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional monotonicity filter helps to gain monotonicity.
    86 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivates the linear formulation.
    87 These requirements on the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e. the reconstructed values vanish throughout this whole time interval, too.
     88Due to the problems mentioned above, a new algorithm was developed. The approach is based on a one-dimensional, piecewise-linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements of preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional filter improves monotonicity.
     89The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivates the use of a linear formulation.
     90These requirements for the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i. e., the reconstructed values vanish throughout this whole time interval, too.
    8891In the simplest scenario of an isolated precipitation event, where in the time interval before and after the data values are zero, the reconstruction algorithm therefore has to vanish at the boundaries of the interval, too.
    8992The additional conditions of continuity and conservation of the precipitation amount then require us to introduce sub-grid points if we want to keep a linear interpolation (Fig. 2).
     
    142145
    143146
    144 In the case of the new disaggregation method for precipitation, the two new sub grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval and 1 or 2 for the two new sub grid points respectively. The filenames do not change.   
     147In the case of the new disaggregation method for precipitation, the two new sub-grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval, and 1 or 2, respectively, for the two new sub-grid points. The filenames do not change.   
    145148
    146149   
    147150.. note::
    148151
    149     The new method for disaggregation was published in the Geoscientific Model Development Journal in 2018:
     152    The new method for disaggregation was published in the journal Geoscientific Model Development in 2018:
    150153   
    151154    Hittmeir, S., Philipp, A., and Seibert, P.: A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART, Geosci. Model Dev., 11, 2503-2523, https://doi.org/10.5194/gmd-11-2503-2018, 2018.
    152155
    153      
    154    
    155 
    156  
    157156
    158157
    159 Disaggregation for the rest of the flux fields
     158
     159Disaggregation for the other flux fields
    160160----------------------------------------------
    161161     
    162162The accumulated values for the other variables are first divided by the number of hours and
    163 then interpolated to the exact times X using a bicubic interpolation which conserves the integrals of the fluxes within each timespan.
    164 Disaggregation is done for 4 adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which generates a new, disaggregated value which is output at the central point of the 4 adjacent timespans.
     163then interpolated to the exact times using a bicubic interpolation which conserves the integrals of the fluxes within each timespan.
     164Disaggregation is done for four adjacent timespans (:math:`p_a, p_b, p_c, p_d`), which produces a new, disaggregated value that is output at the central point of the four adjacent timespans.
    165165
    166166.. math::
  • For_developers/Sphinx/source/Documentation/input.rst

    rb1674ed rf20342a  
    11********************
    2 Control & Input Data
     2Control & input data
    33********************
    44
    5 Input Data
     5Input data
    66    - :doc:`Input/control`
    7           ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual dataset to be retrieved. They are initialized by ``flex_extract`` with their default values and can be overwritten with definitions set in the so called :doc:`Input/control`.
     7          ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialised by ``flex_extract`` with certain default values which can be overwritten with definitions set in the so-called :doc:`Input/control`.
    88
    9           To be able to successfully retrieve data from the ECMWF Mars archive it is necessary to understand these parameters and set them to proper and consistent values. They are described in :doc:`Input/control_params` section.
     9          For a successful retrieval of data from the ECMWF MARS archive, it is necessary to understand these parameters and to set them to proper and consistent values. They are described in the :doc:`Input/control_params` section.
    1010
    11           We also have some :doc:`Input/examples` and description of :doc:`Input/changes` changes to previous versions and downward compatibilities.
     11          Furthermore, some :doc:`Input/examples` are provided, and :doc:`Input/changes` describes the changes with respect to previous versions as well as downward compatibility.
    1212       
    1313    - :doc:`Input/ecmwf_env`
    14          For ``flex_extract`` it is necessary to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
     14         ``flex_extract`` needs to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
    1515
    1616    - :doc:`Input/templates`
    17          A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt for example the jobscripts regarding its settings for the batch jobs.         
    18 
    19 
     17         A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt, for example, the job scripts with regard to the settings for the batch jobs.         
    2018
    2119
     
    2927
    3028Controlling
    31     The main tasks and behaviour of ``flex_extract`` are controlled by its Python scripts. There are two top-level scripts, one for installation called install_ and one for execution called submit_.
    32     They can interpret a number of command line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type:
     29    The main tasks and the behaviour of ``flex_extract`` are controlled by the Python scripts. There are two top-level scripts, one for installation called install_, and one for execution called submit_.
     30    They interpret a number of command-line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type:
    3331
    3432    .. code-block:: bash
     
    3836       python3 Source/Python/submit.py --help
    3937   
    40     In this new version we provide also the wrapping Shell scripts setup_ and run_, which sets the command line parameters, do some checks and execute the corresponing Python scripts ``install.py`` and ``submit.py`` respectivley.
    41      
    42     It might be faster and easier for beginners. See :doc:`../quick_start` for information on how to use them.
     38    With version 7.1, we also provide the wrapper shell scripts setup_ and run_, which set the command-line parameters, perform some checks, and execute the corresponding Python scripts ``install.py`` and ``submit.py``, respectively.
     39    Using them might be faster and easier, especially for beginners. See :doc:`../quick_start` for information on how to use them.
    4340
    44     Additionally, ``flex_extract`` creates the Korn Shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be send to the ECMWF serves in the **remote mode** and the **gateway mode** for starting batch jobs.
     41    ``flex_extract`` also creates the Korn shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be sent to the ECMWF servers in the **remote mode** and the **gateway mode** for starting batch jobs.
    4542
    46     The Fortran program will be compiled during the installation process by the :doc:`Input/fortran_makefile`.
     43    The Fortran program is compiled during the installation process using the :doc:`Input/fortran_makefile`.
    4744   
    48     To sum up, the following scripts controls ``flex_extract``:
     45    To sum up, the following scripts control ``flex_extract``:
    4946
    5047    Installation
  • For_developers/Sphinx/source/Documentation/output.rst

    rb1674ed rf20342a  
    11***********
    2 Output Data
     2Output data
    33***********
    44
    5 The output data of ``flex_extract`` are separated mainly into temporary files and the final ``FLEXPART`` input files:
     5The output data of ``flex_extract`` can be divided into the final ``FLEXPART`` input files and temporary files:
    66
    77+-----------------------------------------------+----------------------------------------------+   
    88|   ``FLEXPART`` input files                    |  Temporary files (saved in debug mode)       |
    99+-----------------------------------------------+----------------------------------------------+
    10 | - Standard output filenames                   | - MARS request file (opt)                    |
     10| - Standard output file names                  | - MARS request file (optional)               |
    1111| - Output for pure forecast                    | - flux files                                 |
    1212| - Output for ensemble members                 | - VERTICAL.EC                                |
     
    2121========================
    2222
    23 The final output files of ``flex_extract`` are also the meteorological ``FLEXPART`` input files.
    24 The naming of these files depend on the kind of data extracted by ``flex_extract``.
     23The final output files of ``flex_extract`` are the meteorological input files for ``FLEXPART``.
     24The naming convention for these files depends on the kind of data extracted by ``flex_extract``.
    2525
    2626Standard output files
    2727---------------------
    2828 
    29 In general, there is a file for each time step with the filename format:
     29In general, there is one file for each point in time, named:
    3030
    3131.. code-block:: bash
     
    3333    <prefix>YYMMDDHH
    3434   
    35 The ``prefix`` is by default defined as ``EN`` and can be re-defined in the ``CONTROL`` file.
    36 Each file contains all meteorological fields needed by ``FLEXPART`` for all selected model levels for a specific time step.
    37 
    38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` where we extracted only the lowest model level for demonstration reasons:
     35where YY are the last two digits of the year, MM the month, DD the day, and HH the hour (UTC). ``<prefix>`` is ``EN`` by default and can be re-defined in the ``CONTROL`` file.
     36Each file contains all meteorological fields at all levels as needed by ``FLEXPART``, valid for the time indicated in the file name.
     37
     38Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` (where we extracted only the lowest model level for demonstration purposes):
    3939
    4040.. code-block:: bash
     
    8484------------------------------
    8585
    86 ``Flex_extract`` can retrieve forecasts which can be longer than 23 hours. To avoid collisions of time steps for forecasts of more than one day a new scheme for filenames in pure forecast mode is introduced:
     86``Flex_extract`` is able to retrieve forecasts with a lead time of more than 23 hours. In order to avoid name collisions between time steps, a new scheme for file names in pure forecast mode is introduced:
    8787
    8888.. code-block:: bash
     
    9090    <prefix>YYMMDD.HH.<FORECAST_STEP>
    9191
    92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3 digit number which represents the forecast step in hours.
     92The ``<prefix>`` is, as in the standard output, ``EN`` by default and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date, and ``HH`` the hour at which the forecast starts. ``FORECAST_STEP`` is a 3-digit number representing the forecast step (lead time) in hours.
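
As an illustration (using the default prefix and an arbitrarily chosen date), a field at a lead time of 36 hours from a forecast started on 8 September 2019 at 00 UTC would be named:

.. code-block:: bash

    EN190908.00.036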
    9393   
    9494
     
    9696-------------------------------------
    9797
    98 Ensembles can be retrieved and are addressed by the grib message parameter ``number``. The ensembles are saved per file and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3 digit format.
     98``Flex_extract`` is able to retrieve ensemble data; the members are labelled by the GRIB message parameter ``number``. Each ensemble member is saved in a separate file, and the standard file names are supplemented by the letter ``N`` and the ensemble member number in a 3-digit format.
    9999
    100100.. code-block:: bash
     
    106106-------------------------------------------------------
    107107
    108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type. They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`.
    109 The two additional fields are marked with the ``step`` parameter in the Grib messages and are set to "1" and "2" for sub-grid point 1 and 2 respectively.
    110 The output filenames do not change in this case. 
    111 Below is an example list of precipitation fields in an output file generated with the new disaggregation method:
     108The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type (large-scale and convective). They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`.
     109The two additional fields are addressed using the ``step`` parameter in the GRIB messages, which
     110is set to "1" or "2" for sub-grid points 1 and 2, respectively.
     111The output file names are not altered. 
     112An example of the list of precipitation fields in an output file generated with the new disaggregation method is found below:
    112113
    113114.. code-block:: bash
     
    129130===============
    130131
    131 ``Flex_extract`` works with a number of temporary data files which are usually deleted after a successful data extraction. They are only stored if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).
     132``Flex_extract`` creates a number of temporary data files which are usually deleted at the end of a successful run. They are preserved only if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).
    132133
    133134MARS grib files
     
    135136
    136137``Flex_extract`` retrieves all meteorological fields from MARS and stores them in files ending with ``.grb``.
    137 Since the request times and data transfer of MARS access are limited and ECMWF asks for efficiency in requesting data from MARS, ``flex_extract`` splits the overall data request in several smaller requests. Each request is stored in an extra ``.grb`` file and the file names are put together by several pieces of information:
     138Since there are limits implemented by ECMWF for the time per request and data transfer from MARS,
     139and as ECMWF asks for efficient MARS retrievals, ``flex_extract`` splits the overall data request
     140into several smaller requests. Each request is stored in its own ``.grb`` file, and the file
     141names are composed of several pieces of information:
    138142
    139143    .. code-block:: bash
     
    144148       
    145149Field type:
    146     ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4d variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast
     150    ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4D variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast
    147151Grid type:
    148    ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat/lon), ``_OROLSM`` - Orography parameter
     152   ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat / lon), ``_OROLSM`` - Orography parameter
    149153Temporal property:
    150154    ``__`` - instantaneous fields, ``_acc`` - accumulated fields
    151155Level type:
    152     ``ML`` - Model Level, ``SL`` - Surface Level
     156    ``ML`` - model level, ``SL`` - surface level
    153157ppid:
    154     The process number of the parent process of submitted script.
     158    The process number of the parent process of the script submitted.
    155159pid:
    156     The process number of the submitted script.
    157 
    158 The process ids should avoid mixing of fields if several ``flex_extract`` jobs are performed in parallel (which is, however, not recommended). The date format is YYYYMMDDHH.
    159 
    160 Example ``.grb`` files for a day of CERA-20C data:
     160    The process number of the script submitted.
     161
     162
     163Example ``.grb`` files for one day of CERA-20C data:
    161164
    162165    .. code-block:: bash
     
    172175-----------------
    173176
    174 This file is a ``csv`` file called ``mars_requests.csv`` with a list of the actual settings of MARS request parameters (one request per line) in a flex_extract job. It is used for documenting the data which were retrieved and for testing reasons.
    175 
    176 Each request consist of the following parameters, whose meaning mainly can be taken from :doc:`Input/control_params` or :doc:`Input/run`:
     177This file is a ``csv`` file called ``mars_requests.csv`` listing the actual settings of the MARS
     178request (one request per line) in a flex_extract job.
     179It is used for documenting which data were retrieved, and for testing.
     180
     181Each request consists of the following parameters, whose meaning can mostly be inferred from :doc:`Input/control_params` or :doc:`Input/run`:
    177182request_number, accuracy, area, dataset, date, expver, gaussian, grid, levelist, levtype, marsclass, number, param, repres, resol, step, stream, target, time, type
    178183 
    179 Example output of a one day retrieval of CERA-20c data:
     184Example output of a one-day retrieval of CERA-20C data:
    180185
    181186.. code-block:: bash
     
    192197-----------
    193198
    194 The vertical discretization of model levels. This file contains the ``A`` and ``B`` parameters to calculate the model level height in meters.
     199This file contains information describing the vertical discretisation (model levels)
     200in the form of the ``A`` and ``B`` parameters, which allow the calculation of the actual pressure of a model level from the surface pressure.
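
For orientation: with the usual ECMWF hybrid-coordinate definition, the pressure at half level :math:`k+1/2` follows from the surface pressure :math:`p_s` as

.. math::

    p_{k+1/2} = A_{k+1/2} + B_{k+1/2}\, p_s ,

where :math:`A_{k+1/2}` is given in Pa and :math:`B_{k+1/2}` is dimensionless.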
    195201
    196202
     
    198204----------
    199205
    200 This file is usually called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords.
    201 
    202 
    203 flux files
     206This file is called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords.
     207#PS NEEDS MORE DESCRIPTION
     208
     209
     210Flux files
    204211----------
    205212
    206 The flux files contain the de-accumulated and dis-aggregated flux fields of large scale and convective precipitation, eastward turbulent surface stress, northward turbulent surface stress, surface sensible heat flux and the surface net solar radiation.
     213The flux files contain the de-accumulated and dis-aggregated flux fields of large-scale and convective precipitation, east- and northward turbulent surface stresses, the surface sensible heat flux, and the surface net solar radiation.
    207214
    208215.. code-block:: bash
     
    210217    flux<date>[.N<xxx>][.<xxx>]
    211218
    212 The date format is YYYYMMDDHH. The optional block ``[.N<xxx>]`` marks the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step.
     219The date format is YYYYMMDDHH as explained before. The optional block ``[.N<xxx>]`` is used for ensemble forecasts, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast, with ``<xxx>`` being the forecast step.
    213220
    214221.. note::
    215222
    216     In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval and ``1`` or ``2`` for the two new intervals respectively.
     223    In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added within each time interval. They are identified by the forecast step parameter, which is ``0`` for the original time interval, and ``1`` or ``2``, respectively, for the two new sub-intervals.
    217224
    218225   
     
    226233    fort.xx
    227234   
    228 where ``xx`` is the number which defines the meteorological fields stored in these files.
    229 They are generated by the Python part of ``flex_extract`` by just splitting the meteorological fields for a unique time stamp from the ``*.grb`` files into the ``fort`` files.
    230 The following table defines the numbers with their corresponding content.   
     235where ``xx`` is a number which defines the meteorological fields stored in these files.
     236They are generated by the Python code of ``flex_extract``, which splits the meteorological fields valid at a single time stamp out of the ``*.grb`` files and stores them in the individual ``fort.xx`` files.
     237The following table defines the numbers and the corresponding content:   
    231238
    232239.. csv-table:: Content of fort - files
     
    240247    "16", "surface fields"
    241248    "17", "specific humidity"
    242     "18", "surface specific humidity (reduced gaussian)"
    243     "19", "vertical velocity (pressure) (optional)"
     249    "18", "surface specific humidity (reduced Gaussian grid)"
     250    "19", "omega (vertical velocity in pressure coordinates) (optional)"
    244251    "21", "eta-coordinate vertical velocity (optional)"
    245     "22", "total cloud water content (optional)"
    246 
    247 Some of the fields are solely retrieved with specific settings, e.g. the eta-coordinate vertical velocity is not available in ERA-Interim datasets and the total cloud water content is an optional field for ``FLEXPART v10`` and newer.
     252    "22", "total cloud-water content (optional)"
     253
     254Some of the fields are retrieved only with specific settings; e. g., the eta-coordinate vertical velocity is not available in ERA-Interim data sets, and the total cloud-water content is an optional field which is useful for ``FLEXPART v10`` and newer.
    248255
    249256The ``calc_etadot`` program saves its results in file ``fort.15`` which typically contains:
     
    259266.. note::
    260267 
    261     The ``fort.4`` file is the namelist file to drive the Fortran program ``calc_etadot``. It is therefore also an input file.
     268    The ``fort.4`` file is the namelist file to control the Fortran program ``calc_etadot``. It is therefore also an input file.
    262269   
    263270    Example of a namelist:
  • For_developers/Sphinx/source/Documentation/overview.rst

    rb1674ed rf20342a  
    33========
    44
    5 ``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) Mars archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` Atmospheric Transport Modelling system.
    6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who wants to use meteorological data from ECMWF to drive the ``FLEXPART`` model.
    7 The software retrieves the minimal number of parameters ``FLEXPART`` needs to work and provides the data in the explicity format ``FLEXPART`` understands.
     5``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) MARS archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` atmospheric transport modelling system.
     6``Flex_extract`` was created explicitly for ``FLEXPART`` users who want to use meteorological data from ECMWF to drive the ``FLEXPART`` model.
     7The software retrieves the minimum set of parameters needed by ``FLEXPART`` to work, and provides the data in the specific format required by ``FLEXPART``.
    88
    9 ``Flex_extract`` consists of 2 main parts:
    10     1. a Python part, where the reading of parameter settings, retrieving data from MARS and preparing the data for ``FLEXPART`` is done and
    11     2. a Fortran part, where the calculation of the vertical velocity is done and if necessary the conversion from spectral to regular latitude/longitude grids.
     9``Flex_extract`` consists of two main parts:
     10    1. a Python part which reads the parameter settings, retrieves the data from MARS, and prepares them for ``FLEXPART``, and
     11    2. a Fortran part which calculates the vertical velocity and, if necessary, converts variables from the spectral representation to regular latitude/longitude grids.
    1212
    13 Additionally, it has some Korn shell scripts which are used to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information of application modes.   
     13In addition, there are some Korn shell scripts to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* modes. See :doc:`Overview/app_modes` for information on the application modes.
    1414
    1515A number of Shell scripts are wrapped around the software package for easy installation and fast job submission.
    1616
    17 The software depends on a number of third-party libraries which can be found in :ref:`ref-requirements`.
     17The software depends on some third-party libraries as listed in :ref:`ref-requirements`.
    1818
    19 Some details on the tasks and program worksteps are described in :doc:`Overview/prog_flow`.
     19Details of the tasks and program work steps are described in :doc:`Overview/prog_flow`.
    2020
    2121
  • For_developers/Sphinx/source/Documentation/vertco.rst

    rb1674ed rf20342a  
    11*******************
    2 Vertical Coordinate
     2Vertical wind
    33*******************
    44       
    5 Calculation of vertical velocity and preparation of Output-files
     5Calculation of vertical velocity and preparation of output files
    66================================================================
    77
    8 ``flex_extract`` has two ways to calculate the vertical velocity for ``FLEXTRA``/``FLEXPART``:
     8Two methods are provided in ``flex_extract`` for the calculation of the vertical velocity for ``FLEXTRA``/``FLEXPART``:
    99    (i) from the horizontal wind field,
    10     (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C**.
     10    (ii) from the MARS parameter 77 (deta/dt), which contains the vertical velocity directly in the eta coordinate system of the ECMWF model, and which is available for operational forecasts and analyses since September 2008 as well as for the reanalysis data sets **ERA5** and **CERA-20C**.
    1111
    1212Especially for high resolution data, use of the ``MARS`` parameter 77 is recommended,
     
    2020   
    2121   
    22 Calculation of vertical velocity from horizontal wind using the continuity equation
     22Calculation of the vertical velocity from the horizontal wind using the continuity equation
    2323================================================================================================
    2424
    25 The vertical velocity is computed by the FORTRAN90 program ``calc_etadot`` in the ECMWF
    26 vertical coordinate system by applying the equation of continuity and thereby ensuring mass consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the
     25The vertical velocity in the eta vertical coordinate system of the ECMWF model is computed by the Fortran program ``calc_etadot``, using the continuity equation and thereby ensuring mass-consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the
    2726documents v20_update_protocol.pdf, V30_update_protocol.pdf and
    2827V40_update_protocol.pdf. The computational demand and accuracy of ``calc_etadot`` are highly
     
    3029following guidance can be given for choosing the right parameters (a combined example is given after the list):
    3130
    32     * For very fine output grids (0.25 degree or finer) the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of vertical velocity on the Gaussian grid (``GAUSS=1``) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279  needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application.
     31    * For very fine output grids (0.25 degree or finer), the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of the vertical velocity on the Gaussian grid, ``GAUSS=1``) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read the documents v20_update_protocol.pdf to v60_update_protocol.pdf to see whether the errors incurred are acceptable for the planned application.
    3332    * For lower resolution (often global) output grids, calculation of vertical velocities with lower than operational spectral resolution is recommended. For global grids the following settings appear optimal:
    3433        - For 1.0 degree grids: ``GAUSS=1``, ``RESOL=255``, ``SMOOTH=179``
    3534        - For 0.5 degree grids: ``GAUSS=1``, ``RESOL=399``, ``SMOOTH=359``
    3635        - Calculation on the lat/lon grid is not recommended for less than the operational (T1279) resolution.   
    37         - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279, 2047 in future models). This choice is restricted because a reduced Gaussian grid is defined in then ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.
    38         - For ``SMOOTH`` any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if vertical velocity is calculated on lat\/lon grid (``GAUSS=0``).
    39     * The on demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however.
     36        - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159, 255, 319, 399, 511, 799 (on the HPC also 1279; 2047 in future model versions). This choice is restricted because a reduced Gaussian grid is defined in the ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.
     37        - For ``SMOOTH``, any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if the vertical velocity is calculated on a lat\/lon grid (``GAUSS=0``).
     38    * The on-demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however.
    4039    * Regional grids are not cyclic in zonal directions, but global grids are. The software assumes a cyclic grid if ``RIGHT``-``LEFT`` is equal to ``GRID`` or is equal to ``GRID``-360.
    41     * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` for application in ATM modelling. If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and the surface pressure tendency. ``OMEGADIFF`` should be zero except for debugging, since it triggers expensive calculations on the Gaussian grid.
     40    * Finally, the model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` (the standard ``flex_extract`` output files). If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, files ``OMEGAyymmddhh`` are also created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and from the surface pressure tendency. ``OMEGADIFF`` should be set to zero except for debugging, since it triggers expensive calculations on the Gaussian grid.
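
Putting the recommendations for a global 1.0 degree output grid together, the relevant ``CONTROL`` file entries might look as follows. This is a hypothetical excerpt only, assuming the key/value syntax of the ``CONTROL`` files; see :doc:`Input/control_params` for the full parameter descriptions.

.. code-block:: bash

    # hypothetical CONTROL file excerpt: global 1.0 degree output grid,
    # vertical velocity computed from the horizontal wind on the Gaussian grid
    GAUSS 1
    RESOL 255
    SMOOTH 179
    # ETA 0: do not use the pre-calculated parameter 77 (see the next section for ETA 1)
    ETA 0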
    4241   
    4342   
    44 Calculation of vertical velocity from pre-calculated MARS parameter 77
     43Calculation of the vertical velocity from the pre-calculated MARS parameter 77
    4544====================================================================================
    4645
    47 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that this parameter is available if the ``CONTROL`` parameter ``ETA`` is set to 1.
     46Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that parameter 77 is available if the ``CONTROL`` parameter ``ETA`` is set to 1.
    4847
    4948It is recommended to use the pre-calculated parameter 77 by setting ``ETA`` to 1 whenever possible.
    5049
    51 Setting parameter ``ETA`` to 1 normally disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time.
     50Setting the parameter ``ETA`` to 1 disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time.
    5251
    5352.. note::
    54    However, the calculation on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you are really need it for debugging since this is a very expensive option. In this case ``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity.
     53   However, the calculations on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you really need it for debugging, since this is a very expensive option. In this case, ``ETAyymmddhh`` files are produced that contain the vertical velocity derived from the horizontal winds and its difference from the pre-calculated vertical velocity.
    5554
    5655The parameters ``RESOL``, ``GRID``, ``UPPER``, ``LOWER``, ``LEFT``, ``RIGHT`` still apply. As for calculations on the Gaussian grid, the spectral resolution parameter ``RESOL`` should be compatible with the grid resolution (see previous subsection).