Changeset f20342a in flex_extract.git


Timestamp:
May 27, 2020, 8:01:54 PM (4 years ago)
Author:
Petra Seibert <petra.seibert [at) univie.ac.at>
Branches:
master, ctbto, dev
Children:
550435b
Parents:
a14839a
Message:

Language corrections for the Sections Developers, Support, Changelog, and the home directory (index.html)

further improvment of documentation, close to final

Files:
36 edited

  • Documentation/html/Documentation/Api/api_fortran.html

    rb1674ed rf20342a  
    184184  <div class="section" id="fortran-s-auto-generated-documentation">
    185185<h1>Fortran’s Auto Generated Documentation<a class="headerlink" href="#fortran-s-auto-generated-documentation" title="Permalink to this headline">¶</a></h1>
    186 <p>Link to other documentation!</p>
    187 <p>…. f:autoprogram:: preconvert</p>
     186<p><a class="reference external" href="Fortran/index.html">Fortran API</a></p>
    188187<div class="toctree-wrapper compound">
    189188</div>
  • For_developers/Sphinx/source/Documentation/Api/api_fortran.rst

    rba99230 rf20342a  
    11**************************************
    2 Fortran's Auto Generated Documentation
     2Auto-generated documentation for the Fortran programme
    33**************************************
    44
     
    66    :local:
    77   
    8    
    9    
    10 Link to other documentation!
     8   
     9`Fortran API <Fortran/index.html>`_
    1110
    12 
    13 
    14 
    15 .... f:autoprogram:: preconvert   
    16    
    17    
    18    
    1911   
    2012.. toctree::
    2113    :hidden:
    2214    :maxdepth: 2
    23    
    24    
    25 
    26    
  • For_developers/Sphinx/source/Documentation/Api/api_python.rst

    rba99230 rf20342a  
    11*************************************
    2 Python's Auto Generated Documentation
     2Auto-generated documentation for the Python scripts
    33*************************************
    44
  • For_developers/Sphinx/source/Documentation/Input/changes.rst

    rba99230 rf20342a  
    88    - comments available with ``#``
    99    - only parameters which are needed to override the default values are necessary
    10     - number of type/step/time elements do not have to be 24 any more. Just select the interval you need.
    11     - the ``dtime`` parameter needs to be consistent with ``type/step/time``. For example ``dtime`` can be coarser as ``time`` intervals are available, but not finer.
     10    - number of type/step/time elements does not have to be 24 anymore. Just provide what you need.
     11    - the ``dtime`` parameter needs to be consistent with ``type/step/time``, for example, ``dtime`` can be coarser than the ``time`` intervals available, but not finer.
    1212
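The consistency rule between ``dtime`` and the ``time`` intervals described above (coarser is allowed, finer is not) can be sketched as a small check. This is a hypothetical helper for illustration only, not part of flex_extract:

```python
def check_dtime(dtime: int, times: list[int]) -> bool:
    """Return True if dtime is consistent with the available time values:
    it may be coarser than their spacing, but not finer."""
    hours = sorted(times)
    spacing = min(b - a for a, b in zip(hours, hours[1:]))
    # dtime must not be finer than the spacing, and must align with it
    return dtime >= spacing and dtime % spacing == 0

# e.g. fields available every 3 hours
print(check_dtime(6, [0, 3, 6, 9, 12]))  # coarser multiple -> True
print(check_dtime(1, [0, 3, 6, 9, 12]))  # finer -> False
```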
    1313 
  • For_developers/Sphinx/source/Documentation/Input/compilejob.rst

    rb1674ed rf20342a  
    11********************************************
    2 The Compilation Jobscript ``compilejob.ksh``
     2The compilation job script ``compilejob.ksh``
    33********************************************
    44
    5 The compilejob is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.
     5The compile job is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.
    66
    7 ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
     7``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
    88the Korn-shell script from the template files by substituting the individual parameters.
    99These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.
    1010
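The substitution mechanism described above can be illustrated with Python's stdlib ``string.Template``, which uses the same convention of ``$name`` placeholders and ``$$`` escaping to a literal ``$``; flex_extract itself uses genshi, and the placeholder names below are invented for the sketch:

```python
from string import Template

# A ksh fragment in the spirit of the job templates (hypothetical names):
# $queue is a placeholder to be filled in, while $$HOME escapes to a
# literal $HOME so the shell variable survives the substitution.
tmpl = Template("#SBATCH --qos=$queue\ncd $$HOME/flex_extract\n")

script = tmpl.substitute(queue="normal")
print(script)
# #SBATCH --qos=normal
# cd $HOME/flex_extract
```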
    11 The jobscript has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
     11The job script has a number of settings for the batch system which are fixed, and it differentiates between the *ecgate* and the *cca/ccb*
    1212server system to load the necessary modules for the environment when submitted to the batch queue.
    1313
     
    1919------------------------------------
    2020
    21  #. It sets necessary batch system parameters
     21 #. It sets the necessary batch-system parameters
    2222 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules
    23  #. It sets some environment variabels for the single session
     23 #. It sets some environment variables for the single session
    2424 #. It creates the ``flex_extract`` root directory in the ``$HOME`` path of the user
    25  #. It untars the tar-ball into the root directory.
    26  #. It compiles the Fortran programs's ``Makefile``.
    27  #. At the end it checks if the script returned an error or not and send the log file via email to the user.
     25 #. It untars the tarball into the root directory.
     26 #. It compiles the Fortran program using ``Makefile``.
     27 #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.
    2828
    2929
  • For_developers/Sphinx/source/Documentation/Input/control.rst

    rb1674ed rf20342a  
    1010 
    1111This file is an input file for :literal:`flex_extract's` main script :literal:`submit.py`.
    12 It contains the controlling parameters :literal:`flex_extract` needs to decide on dataset specifications,
    13 handling of the retrieved data and general bahaviour. The naming convention is usually (but not necessary):
     12It contains the controlling parameters which :literal:`flex_extract` needs to decide on data set specifications,
     13handling of the  data retrieved, and general behaviour. The naming convention is usually (but not necessarily):
    1414
    1515   :literal:`CONTROL_<Dataset>[.optionalIndications]`
    1616
    17 The tested datasets are the operational dataset and the re-analysis datasets CERA-20C, ERA5 and ERA-Interim.
    18 The optional extra indications for the re-analysis datasets mark the files for *public users*
    19 and *global* domain. For the operational datasets (*OD*) the file names contain also information of
    20 the stream, the field type for forecasts, the method for extracting the vertical coordinate and other things like time or horizontal resolution.
 17There are a number of data sets for which the procedures have been tested: the operational data and the re-analysis data sets CERA-20C, ERA5, and ERA-Interim.
     18The optional indications for the re-analysis data sets mark the files for *public users*
 19and *global* domain. For the operational data sets (*OD*), the file names also contain information on
     20the stream, the field type for forecasts, the method for extracting the vertical wind, and other information such as temporal or horizontal resolution.
    2121
    2222
     
    2424----------------------------------
    2525The first string of each line is the parameter name, the following string(s) (separated by spaces) is (are) the parameter values.
    26 The parameters can be sorted in any order with one parameter per line.
     26The parameters can be listed in any order with one parameter per line.
    2727Comments are started with a '#' - sign. Some of these parameters can be overruled by the command line
    2828parameters given to the :literal:`submit.py` script.
    29 All parameters have default values. Only those parameters which have to be changed
    30 must be listed in the :literal:`CONTROL` files.
 29All parameters have default values; only those parameters which deviate from the default
 30have to be listed in the :literal:`CONTROL` files.
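The format described above (one parameter per line, whitespace-separated values, comments starting with ``#``) can be sketched with a minimal reader. This is a hypothetical illustration, not flex_extract's actual parser:

```python
def read_control(text: str) -> dict:
    """Parse CONTROL-style text: the first token on a line is the
    parameter name, the remaining tokens its value(s); '#' starts a comment."""
    params = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments
        if not line:
            continue
        name, *values = line.split()
        params[name] = values if len(values) > 1 else values[0]
    return params

example = """
START_DATE 20200527   # retrieval date
TYPE AN FC
"""
print(read_control(example))
# {'START_DATE': '20200527', 'TYPE': ['AN', 'FC']}
```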
    3131
    3232
     
    3535
    3636A number of example files can be found in the directory :literal:`flex_extract_vX.X/Run/Control/`.
    37 They can be used as a template for adaptations and understand what's possible to
    38 retrieve from ECMWF's archive.
    39 For each main dataset there is an example and additionally some variances in resolution, type of field or type of retrieving the vertical coordinate.
 37They can be used as a template for adaptation, and to understand what can be
 38retrieved from ECMWF's archives.
 39There is an example for each main data set, and in addition, some variants with respect to resolution, type of field, or way of retrieving the vertical wind.
    4040
    4141
     
    4545------------
    4646The file :literal:`CONTROL.documentation` documents the available parameters
    47 in grouped sections with their default values. In :doc:`control_params` you can find a more
    48 detailed description with additional hints, possible values and some useful information about
     47in grouped sections together with their default values.
     48In :doc:`control_params`, you can find a more
     49detailed description with additional hints, possible values, and further information about
    4950the setting of these parameters.
    5051
  • For_developers/Sphinx/source/Documentation/Input/control_params.rst

    rb1674ed rf20342a  
    1111************
    1212   
    13 .. exceltable:: User parameter in CONTROL file
     13.. exceltable:: User parameters in CONTROL file
    1414   :file: ../../_files/CONTROLparameter.xls
    1515   :sheet: UserSection
     
    2121***************
    2222
    23 .. exceltable:: General parameter in CONTROL file
     23.. exceltable:: General parameters in CONTROL file
    2424   :file: ../../_files/CONTROLparameter.xls
    2525   :sheet: GeneralSection
     
    3131************
    3232   
    33 .. exceltable:: Time parameter in CONTROL file
     33.. exceltable:: Time parameters in CONTROL file
    3434   :file: ../../_files/CONTROLparameter.xls
    3535   :sheet: TimeSection 
     
    4242************
    4343   
    44 .. exceltable:: Data parameter in CONTROL file
     44.. exceltable:: Data parameters in CONTROL file
    4545   :file: ../../_files/CONTROLparameter.xls
    4646   :sheet: DataSection
     
    5353******************
    5454   
    55 .. exceltable:: Data field parameter in CONTROL file
     55.. exceltable:: Data field parameters in CONTROL file
    5656   :file: ../../_files/CONTROLparameter.xls
    5757   :sheet: DatafieldsSection
     
    6464*****************
    6565
    66 .. exceltable:: Flux data parameter in CONTROL file
     66.. exceltable:: Flux data parameters in CONTROL file
    6767   :file: ../../_files/CONTROLparameter.xls
    6868   :sheet: FluxDataSection
     
    7575**************
    7676   
    77 .. exceltable:: Domain parameter in CONTROL file
     77.. exceltable:: Domain parameters in CONTROL file
    7878   :file: ../../_files/CONTROLparameter.xls
    7979   :sheet: DomainSection
     
    9999***********************
    100100   
    101 .. exceltable:: Additional data parameter in CONTROL file
     101.. exceltable:: Additional data parameters in CONTROL file
    102102   :file: ../../_files/CONTROLparameter.xls
    103103   :sheet: AddDataSection
  • For_developers/Sphinx/source/Documentation/Input/ecmwf_env.rst

    rb1674ed rf20342a  
    11****************************************
    2 ECMWF User Credential file ``ECMWF_ENV``
     2ECMWF user credential file ``ECMWF_ENV``
    33****************************************
    44
     
    1616------------------------
    1717
    18 The following shows an example of the content of an ``ECMWF_ENV`` file:
     18An example of the content of an ``ECMWF_ENV`` file is shown below:
    1919 
    2020.. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/examples.rst

    rb1674ed rf20342a  
    33**********************
    44
    5 ``Flex_extract`` has a couple of example ``CONTROL`` files for a number of different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.
 5``Flex_extract`` comes with a number of example ``CONTROL`` files for different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.
    66
    7 Here is a list of the example files and a description of the data set:
     7Here is a list of the example files:
    88
    99CONTROL.documentation
    10    This file is not intended to be used with ``flex_extract``. It has a list of all possible parameters and their default values for a quick overview.
     10   This file is not intended to be used with ``flex_extract``. It just contains a list of all possible parameters and their default values for a quick overview.
    1111   
    1212.. code-block:: bash
     
    3333        CONTROL_OD.OPER.FC.gauss.highres
    3434        CONTROL_OD.OPER.FC.operational
    35         CONTROL_OD.OPER.FC.twiceaday.1hourly
    36         CONTROL_OD.OPER.FC.twiceaday.3hourly
     35        CONTROL_OD.OPER.FC.twicedaily.1hourly
     36        CONTROL_OD.OPER.FC.twicedaily.3hourly
    3737
    38    
     38   #PS some information to be added.
     39 
    3940.. toctree::
    4041    :hidden:
  • For_developers/Sphinx/source/Documentation/Input/fortran_makefile.rst

    rb1674ed rf20342a  
    11**************************************
    2 The Fortran Makefile - ``calc_etadot``
     2The Fortran makefile for ``calc_etadot``
    33**************************************
    44
    55.. _ref-convert:
    66
    7 ``Flex_extract``'s Fortran program will be compiled during
    8 the installation process to get the executable named ``calc_etadot``.
 7The Fortran program will be compiled during
 8the installation process to produce the executable called ``calc_etadot``.
    99
    10 ``Flex_extract`` has a couple of ``makefiles`` prepared which can be found in the directory
    11 ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number.
    12 A list of these ``makefiles`` are shown below:
     10``Flex_extract`` includes several ``makefiles`` which can be found in the directory
     11``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted by the current flex_extract version number.
     12A list of these ``makefiles`` is shown below:
    1313
    1414
     
    1616| Files to be used as they are!
    1717   
    18     | **makefile_ecgate**
    19     | For the use on ECMWF's server **ecgate**.
    20 
    21     | **makefile_cray**
    22     | For the use on ECMWF's server **cca/ccb**.
    18    | **makefile_ecgate**: For use on ECMWF's server **ecgate**.
    19    | **makefile_cray**:   For use on ECMWF's server **cca/ccb**.
    2320   
    2421| **Local mode**
    25 | It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB**
     22| It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB** if they don't correspond to the standard paths pre-set in the makefiles.
    2623 
    27     | **makefile_fast**
    28     | For the use with gfortran compiler and optimization mode.
     24    | **makefile_fast**:  For use with the gfortran compiler and optimisation mode.
     25    | **makefile_debug**: For use with the gfortran compiler and debugging mode. Primarily for developers.
    2926
    30     | **makefile_debug**
    31     | For the use with gfortran compiler in debugging mode.
     27If you want to use another compiler than gfortran locally, you can still take ``makefile_fast``,
     28and adapt everything that is compiler-specific in this file.
    3229
    33 
    34 For instructions on how to adapt the ``makefiles`` for the local application mode
     30For instructions on how to adapt the ``makefile`` (local application mode only),
    3531please see :ref:`ref-install-local`.
    36 
    3732
    3833   
  • For_developers/Sphinx/source/Documentation/Input/jobscript.rst

    rb1674ed rf20342a  
    11*************************
    2 The Jobscript ``job.ksh``
     2The job script ``job.ksh``
    33*************************
    44
    5 The jobscript is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.
     5The job script is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.
    66
    7 It is based on the ``job.temp`` template file which is stored in the ``Templates`` directory.
    8 This template is by itself generated in the installation process from a ``job.template`` template file.
     7It is based on the ``job.temp`` template file stored in the ``Templates`` directory.
     8This template is generated in the installation process from a ``job.template`` template file.
    99
    10 ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
     10``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
    1111the Korn-shell script from the template files by substituting the individual parameters.
    12 These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.
     12These individual parameters are marked by ``$$`` in ``job.temp``.
    1313
    14 The jobscript has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
     14The job script has a number of settings for the batch system which are fixed, and differentiates between the *ecgate* and the *cca/ccb*
    1515server system to load the necessary modules for the environment when submitted to the batch queue.
    1616
     
    1919
    2020
    21 What does the jobscript do?
     21What does the job script do?
    2222---------------------------
    2323
    24  #. It sets necessary batch system parameters
    25  #. It prepares the job environment at the ECMWF servers by loading the necessary library modules
    26  #. It sets some environment variabels for the single session
    27  #. It creates the directory structure in the users ``$SCRATCH`` file system
    28  #. It creates a CONTROL file on the ECMWF servers whith the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are given to the jobscript with its default or the user defined values. It also sets the ``CONTROL`` as an environment variable.
    29  #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new pathes for input and output directory and the recently generated ``CONTROL`` file.
    30  #. At the end it checks if the script returned an error or not and send the log file via email to the user.
     24 #. It sets necessary batch system parameters.
     25 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules.
     26 #. It sets some environment variables for the single session.
     27 #. It creates the directory structure in the user's ``$SCRATCH`` file system.
 28 #. It creates a CONTROL file on the ECMWF servers with the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are passed to the job script with their default or the user-defined values. It also sets ``CONTROL`` as an environment variable.
     29 #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new paths for input and output directories and the recently generated ``CONTROL`` file.
     30 #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.
    3131
    3232
  • For_developers/Sphinx/source/Documentation/Input/run.rst

    rb1674ed rf20342a  
    11**********************************
    2 The executable Script - ``run.sh``
     2The executable script - ``run.sh``
    33**********************************
    44
    5 The execution of ``flex_extract`` is done by the ``run.sh`` Shell script, which is a wrapping script for the top-level Python script ``submit.py``.
     5The execution of ``flex_extract`` is done by the ``run.sh`` shell script, which is a wrapper script for the top-level Python script ``submit.py``.
    66The Python script constitutes the entry point to ECMWF data retrievals with ``flex_extract`` and controls the program flow.
    77
    8 ``submit.py`` has two (three) sources for input parameters with information about program flow and ECMWF data selection, the so-called ``CONTROL`` file, 
    9 the command line parameters and the so-called ``ECMWF_ENV`` file. Whereby, the command line parameters will override the ``CONTROL`` file parameters.
 8``submit.py`` has two (or three) sources for input parameters with information about program flow and ECMWF data selection: the so-called ``CONTROL`` file, 
     9the command line parameters, and the so-called ``ECMWF_ENV`` file. Command line parameters will override parameters specified in the ``CONTROL`` file.
    1010
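The precedence just described (default values, overridden by ``CONTROL`` file entries, overridden in turn by command-line parameters) can be sketched as a dictionary merge; the parameter names below are illustrative only:

```python
defaults = {"GRID": "1.0", "LEVELIST": "1/to/137", "DEBUG": "0"}
control_file = {"GRID": "0.5"}   # read from the CONTROL file
command_line = {"DEBUG": "1"}    # given on the command line

# later sources win: defaults < CONTROL file < command line
settings = {**defaults, **control_file, **command_line}
print(settings)
# {'GRID': '0.5', 'LEVELIST': '1/to/137', 'DEBUG': '1'}
```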
    11 Based on these input information ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a Web API on a local maschine or submit a jobscript to ECMWF servers and retrieve the data there with sending the files to the local system eventually.
 11Based on this input information, ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a web API on a local machine, or submit a job script to an ECMWF server, retrieve the data there, and finally send the files to the local system.
    1212
    1313
    1414
    1515
    16 Submission Parameter
     16Submission parameters
    1717--------------------
    1818
    1919
    20 .. exceltable:: Parameter for Submission
     20.. exceltable:: Parameters for submission
    2121    :file:  ../../_files/SubmitParameters.xls
    2222    :header: 1 
     
    3939---------------------------------
    4040
    41 It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapping Shell script ``run.sh``.  This top-level script is located in
    42 ``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
    43 command line parameter.
     41It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapper shell script ``run.sh``.  This top-level script is located in
     42``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter
     43we see again all possible command line parameters.
    4444
    4545.. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/setup.rst

    rb1674ed rf20342a  
    11**************************************
    2 The Installation Script - ``setup.sh``
     2The installation script - ``setup.sh``
    33**************************************
    44
    5 
    6 The installation of ``flex_extract`` is done by the Shell script ``setup.sh`` which is located in the root directory of ``flex_extract``.
    7 It calls the top-level Python script ``install.py`` which does all necessary operations to prepare the selected application environment. This includes:
    8 
    9 - preparing the file ``ECMWF_ENV`` with the user credentials for member state access to ECMWF servers (in **remote** and **gateway** mode)
     5The installation of ``flex_extract`` is done by the shell script ``setup.sh`` located in the root directory of ``flex_extract``.
 6It calls the top-level Python script ``install.py`` which does all the necessary operations to prepare the application environment selected. This includes:
     7
     8- preparing the file ``ECMWF_ENV`` with the user credentials for member-state access to ECMWF servers (in **remote** and **gateway** mode)
    109- preparation of a compilation Korn-shell script (in **remote** and **gateway** mode)
    1110- preparation of a job template with user credentials (in **remote** and **gateway** mode)
    12 - create a tar-ball of all necessary files
    13 - copying tar-ball to target location (depending on application mode and installation path)
    14 - submit compilation script to batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar tar-ball at target location (**local mode**)
    15 - compilation of the FORTRAN90 program ``calc_etadot``
    16 
    17 
    18 The Python installation script ``install.py`` has a couple of command line arguments which are defined in ``setup.sh`` in the section labelled with "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters for his personal use. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee necessary parameters were set.
 11- creating a tarball of all necessary files
 12- copying the tarball to the target location (depending on application mode and installation path)
 13- submitting the compilation script to the batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untarring the tarball at the target location (**local mode**)
 14- compiling the Fortran program ``calc_etadot``
     15
     16
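The tarball steps in the list above can be illustrated with Python's stdlib ``tarfile`` module. The file names and paths here are hypothetical; the real script assembles many more files:

```python
import os
import tarfile
import tempfile

# create a tarball of the needed files (sketch; real paths differ)
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "submit.py")
with open(src, "w") as f:
    f.write("# flex_extract entry point\n")

tarball = os.path.join(workdir, "flex_extract.tar")
with tarfile.open(tarball, "w") as tar:
    tar.add(src, arcname="submit.py")

# ... copy the tarball to the target location, then untar it there
target = os.path.join(workdir, "target")
os.makedirs(target)
with tarfile.open(tarball) as tar:
    tar.extractall(target)

print(os.listdir(target))  # ['submit.py']
```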
     17The Python installation script ``install.py`` has several command line arguments defined in ``setup.sh``, in the section labelled "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters according to his/her personal needs. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee that the necessary parameters were set.
    1918   
    2019After the installation process, some tests can be conducted. They are described in section :ref:`ref-testinstallfe`.
    2120
    22 The following diagram sketches the involved files and scripts in the installation process:
     21The following diagram sketches the files and scripts involved in the installation process:
    2322
    2423.. _ref-install-blockdiag:
     
    115114
    116115.. blockdiag::
    117    :caption: Diagram of data flow during the installation process. The trapezoids are input files with the light blue area being the template files. The edge-rounded, orange boxes are the executable files which start the installation process and reads the input files. The rectangular, green boxes are the output files. The light green files are files which are only needed in the remota and gateway mode.
 116   :caption: Diagram of data flow during the installation process. Trapezoids are input files with the light blue area being the template files. Round-edge orange boxes are executable files which start the installation process and read the input files. Rectangular green boxes are output files. Light green files are needed only in the remote and gateway mode.
    118117
    119118   blockdiag {
     
    133132.. _ref-instparams:
    134133
    135 Installation Parameter
    136 ----------------------
     134Installation parameters
     135-----------------------
    137136   
    138137.. exceltable:: Parameter for Installation
     
    155154----------------------------------
    156155
    157 It is also possible to start the installation process of ``flex_extract`` directly from command line by using the ``install.py`` script instead of the wrapping Shell script ``setup.sh``.  This top-level script is located in
    158 ``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
    159 command line parameter.
     156It is also possible to start the installation process of ``flex_extract`` directly from the command line by using the ``install.py`` script instead of the wrapper shell script ``setup.sh``.  This top-level script is located in
     157``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter,
     158we see again all possible command line parameters.
    160159
    161160.. code-block:: bash
  • For_developers/Sphinx/source/Documentation/Input/templates.rst

    rb1674ed rf20342a  
    33*********
    44
    5 In ``flex_extract`` we use the Python package `genshi <https://genshi.edgewall.org/>`_ to create specific files from templates. It is the most efficient way to be able to quickly adapt e.g. the job scripts send to the ECMWF batch queue system or the namelist file für the Fortran program without the need to change the program code.
 5In ``flex_extract``, the Python package `genshi <https://genshi.edgewall.org/>`_ is used to create specific files from templates. It is the most efficient way to be able to quickly adapt, e.g., the job scripts sent to the ECMWF batch queue system, or the namelist file for the Fortran program, without the need to change the program code.
    66
    77.. note::
    8    Usually it is not recommended to change anything in these files without being able to understand the effects.
     8   Do not change anything in these files unless you understand the effects!
    99   
    10 Each template file has its content framework and keeps so-called placeholder variables in the positions where the values needs to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used the ``$`` sign needs to be doubled to `escape` and keep a single ``$`` sign as it is.
 10Each template file has its content framework and keeps so-called placeholder variables in the positions where the values need to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In the case of the Korn-shell job scripts, where (environment) variables are used, the ``$`` sign needs to be doubled for `escaping`.
    1111   
    12 The following templates are used and can be found in directory ``flex_extract_vX.X/Templates``:
     12The following templates are used; they can be found in the directory ``flex_extract_vX.X/Templates``:
    1313
    1414convert.nl
    1515----------
    1616
    17     This is the template for a Fortran namelist file called ``fort.4`` which will be read by ``calc_etadot``.
     17    This is the template for a Fortran namelist file called ``fort.4`` read by ``calc_etadot``.
    1818    It contains all the parameters ``calc_etadot`` needs.
    1919   
     
    5757    This template is used to create the job script file called ``compilejob.ksh`` during the installation process for the application modes **remote** and **gateway**.
    5858
    59     At the beginning some directives for the batch system are set.
    60     On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    61     For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    62 
    63     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
    64    
    65     Afterwards the installation steps as such are done. Including the generation of the root directory, putting files in place, compiling the Fortran program and sending a log file via email.
     59    At the beginning, some directives for the batch system are set.
     60    On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
 61    For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
     62
     63    The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.
     64   
 65    Afterwards, the installation steps as such are done. They include the generation of the root directory, putting files in place, compiling the Fortran program, and sending a log file by email.
    6666
    6767    .. code-block:: ksh
     
    145145    This template is used to create the actual job script file called ``job.ksh`` for the execution of ``flex_extract`` in the application modes **remote** and **gateway**.
    146146
    147     At the beginning some directives for the batch system are set.
    148     On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
    149     For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
    150 
    151     The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
    152    
    153     Afterwards the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end a log file is send via email.
     147    At the beginning, some directives for the batch system are set.
 148    On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the individual lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
 149    For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary;
     150    for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
     151
     152    The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.
     153   
 154    Afterwards, the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end, a log file is sent by email.
    154155   
    155156    .. code-block:: ksh
     
    239240------------
    240241
    241     This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. A couple of parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.
     242    This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. Several parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.
    242243       
    243244    .. code-block:: ksh
     
    325326
    326327
    327 
    328 
    329 
    330 
    331  
    332 
    333 
    334 
    335 
    336 
    337    
    338    
    339 
    340  
    341    
    342328   
    343329
  • For_developers/Sphinx/source/Documentation/Overview/app_modes.rst

    rb1674ed rf20342a  
    11*****************
    2 Application Modes
     2Application modes
    33*****************
    44
     
    1313.. _ref-app-modes:
    1414
    15 Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has 4 different :underline:`user application modes`:
     15Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has four different :underline:`user application modes`:
    1616
    1717.. _ref-remote-desc:
    1818
    1919  1. Remote (member)
    20       In the **Remote mode** the user works directly on ECMWF Linux member state server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the additional third-party libraries mentioned in :ref:`ref-requirements` as ECMWF provides everything with environment modules. The module selection will be done automatically in ``flex_extract``.
 20      In the **Remote mode**, the user works directly on an ECMWF member-state Linux server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the third-party libraries mentioned in :ref:`ref-requirements`, as ECMWF provides everything with environment modules. The module selection will be done automatically by ``flex_extract``.
    2121     
    2222.. _ref-gateway-desc:
    2323     
    2424  2. Gateway (member)
    25       The **Gateway mode** can be used if a local member state gateway server is in place. Then the job scripts can be submitted to the ECMWF Linux member state server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed at the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and some extra setup is done in the local ``flex_extract`` directory at the local gateway server. For more information about establishing a gateway server please see `ECMWF's instructions on gateway server`_. For the **Gateway mode** the necessary environment has to be established which is described in :ref:`ref-prep-gateway`.
 25      The **Gateway mode** can be used if a local member-state gateway server is in place. Then, the job scripts can be submitted to the ECMWF member-state Linux server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed on the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and that some extra setup is done in the ``flex_extract`` directory on the local gateway server. For more information about establishing a gateway server, please refer to `ECMWF's instructions on gateway server`_. For the **Gateway mode**, the necessary environment has to be established, as described in :ref:`ref-prep-gateway`.
    2626
    2727.. _ref-local-desc:
    2828     
    2929  3. Local member
    30       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`.
 30      Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`.
    3131     
    3232  4. Local public
    33       Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`. In this case a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.
 33      Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. In this case, a direct registration at ECMWF is necessary, and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.
    3434     
    3535     
  • For_developers/Sphinx/source/Documentation/Overview/prog_flow.rst

    rb1674ed rf20342a  
    11************
    2 Program Flow
     2Program flow
    33************
    44
     
    1616.. figure:: ../../_files/submit.png   
    1717   
    18     Overview of the call of python's ``submit.py`` script and raw sequence of working steps done in ``flex_extract``.
 18    Overview of the call of the ``submit.py`` Python script and the raw sequence of work steps in ``flex_extract``.
    1919
    2020   
    21 The ``submit.py`` Python program is called by the Shell script ``run.sh`` or ``run_local.sh`` and accomplish the following steps:
     21The ``submit.py`` Python script is called by the shell script ``run.sh`` or ``run_local.sh`` and accomplishes the following steps:
    2222
    23     1. Setup the control data:
    24         It gets all command-line and ``CONTROL`` file parameters as well as optionally the ECMWF user credentials. Depending the :doc:`app_modes`, it might also prepare a job script which is then send to the ECMWF queue.
    25     2. Retrieves data from MARS:
    26         It creates and sends MARS-requests either on the local machine or on ECMWF server, that receives the data and stores them in a specific format in GRIB files. If the parameter ``REQUEST`` was set ``1`` the data are not received but a file ``mars_requests.csv`` is created with a list of MARS requests and their settings. If it is set to ``2`` the file is created in addition to retrieving the data. The requests are created in an optimised way by splitting in time, jobs  and parameters.   
    27     3. Post-process data to create final ``FLEXPART`` input files:
    28         After all data is retrieved, the disaggregation of flux fields (`see here <../disagg.html>`_ ) is done as well as the calculation of vertical velocity (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Eventually, the GRIB fields are merged together such that a single grib file per time step is available with all fields for ``FLEXPART``. Since model level fields are typically in *GRIB2* format whereas surface level fields are still in *GRIB1* format, they can be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading pure *GRIB2* files since some parameter IDs change in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files can be communicated to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally the progress of the script can be followed with the usual Linux tools.
     23    1. Setup of control data:
 24        Command-line and ``CONTROL``-file parameters are read, as well as (optionally) the ECMWF user credentials. Depending on the :doc:`app_modes`, a job script might be prepared which is then sent to the ECMWF queue.
     25    2. Retrieval of data from MARS:
 26        MARS requests are created either on the local machine or on the ECMWF server and then submitted; the retrieved data are stored in GRIB files. If the parameter ``REQUEST`` is set to ``1``, the data are not retrieved and instead a file ``mars_requests.csv`` is created, which contains a list of the MARS requests and their settings. If ``REQUEST`` is set to ``2``, the csv file is created in addition to retrieving the data. The requests are created in an optimised way by splitting with respect to time, jobs, and parameters.
     27    3. Post-processing of data to create final ``FLEXPART`` input files:
 28        After all data have been retrieved, flux fields are disaggregated (`see here <../disagg.html>`_ ) and the vertical velocity is calculated (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Finally, the GRIB fields are merged into a single grib file per time step containing all the fields for ``FLEXPART``. Since model-level fields are typically in *GRIB2* format, whereas surface-level fields are still in *GRIB1* format, they will be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading these *GRIB2* files since some parameter IDs have changed in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files will be sent to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally, the progress of the script can be followed with the usual Linux tools.
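The effect of the ``REQUEST`` parameter can be illustrated with a short sketch. Note that the column names used here are illustrative assumptions only; the actual layout of ``mars_requests.csv`` is defined by ``flex_extract`` itself.

```python
import csv
import io

# Hypothetical request list: flex_extract writes one line per MARS request
# with its keyword settings when REQUEST is set to 1 or 2.
# The column names below are assumptions for illustration only.
requests = [
    {"request_number": "1", "date": "20200527", "type": "AN", "step": "00"},
    {"request_number": "2", "date": "20200527", "type": "FC", "step": "03"},
]

# Write the requests to a CSV buffer (stands in for mars_requests.csv).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(requests[0]))
writer.writeheader()
writer.writerows(requests)

# With REQUEST=1, only this file would be produced; inspecting it shows
# which MARS requests would have been submitted.
buf.seek(0)
rows = list(csv.DictReader(buf))
print(len(rows))        # 2
print(rows[0]["type"])  # AN
```

Checking such a file before a large retrieval is a cheap way to verify the request splitting without spending MARS resources.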
    2929
    3030
     
    3333========================================
    3434
    35 More details on how different the program flow is for the different :doc:`app_modes` is sketched in the following diagrams
 35The following diagrams sketch how the program flow differs between the :doc:`app_modes`:
    3636
    3737+-------------------------------------------------+------------------------------------------------+
  • For_developers/Sphinx/source/Documentation/api.rst

    rba99230 rf20342a  
    11****************************
    2 Auto Generated Documentation
     2Auto-generated documentation
    33****************************
    44   
  • For_developers/Sphinx/source/Documentation/disagg.rst

    rd9abaac rf20342a  
    11***************************
    2 Disaggregation of Flux Data
     2Disaggregation of flux data
    33***************************
    44   
    5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational particles in time and space. This method requires point values in the discrete input fields. However, flux data (as listed in table :ref:`ref-table-fluxpar`) from the ECMWF represent cell averages or integrals and are accumulated over a specific time interval, depending on the dataset. Hence, to conserve the integral quantity with ``FLEXPART``'s linear interpolation a pre-processing scheme has to be applied.
     5``FLEXPART`` interpolates meteorological input data linearly to the position of computational
     6particles in time and space. This method requires point values in the discrete input fields.
     7However, flux data (as listed in table :ref:`ref-table-fluxpar` below) from the ECMWF represent cell
     8averages or integrals and are accumulated over a specific time interval, depending on the data
     9set. Hence, to conserve the integral quantity with the linear interpolation used in ``FLEXPART``,
     10pre-processing has to be applied.
    611
    712.. _ref-table-fluxpar:
    813
    9 .. csv-table:: flux fields
     14.. csv-table:: Flux fields
    1015    :header: "Short Name", "Name", "Units", "Interpolation Type"
    1116    :align: center
     
    2025   
    2126
    22 The first step is to *de-accumulate* the fields in time so that each value represents an integral in x, y, t space.
    23 Afterwards, a *disaggregation* scheme is applied which means to break down the integral value into point values.
    24 In order to be able to carry out the disaggregation procedure proposed by Paul James, additional flux data is retrieved automatically for one day at the beginning and one day at the end of the period specified. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not communicated to the final ``FLEXPART`` input files.
     27The first step is to *de-accumulate* the fields in time so that each value represents non-overlapping integrals in x-, y-, and t-space.
     28Afterwards, a *disaggregation* scheme is applied which means to convert the integral value to corresponding point values to be used late for the interpolation.
     29The disaggregation procedure as proposed by Paul James (currently, the standard) requires additional flux data for one day at the beginning and one day at the end of the period specified.
     30They are retrieved automatically. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not contained in the final ``FLEXPART`` input files.
    2531
    26 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note, that the first two and last two flux files do not contain any data.
     32The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note that the first two and last two flux files do not contain any data.
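The de-accumulation step and the flux file naming can be sketched as follows. The accumulated values are invented numbers; only the differencing and the ``fluxYYYYMMDDHH`` naming scheme follow the text above.

```python
from datetime import datetime, timedelta

# Hypothetical accumulated flux values at 3-hourly steps, as retrieved
# from MARS: each value is the accumulation since the forecast start.
accumulated = [0, 12, 18, 41, 41]

# De-accumulate: difference consecutive values so that each entry
# represents the integral over a single 3-hour interval only.
per_interval = [b - a for a, b in zip(accumulated, accumulated[1:])]
print(per_interval)  # [12, 6, 23, 0]

# The disaggregated output is written to files named fluxYYYYMMDDHH:
start = datetime(2020, 5, 27, 0)
names = ["flux" + (start + timedelta(hours=3 * i)).strftime("%Y%m%d%H")
         for i in range(len(per_interval))]
print(names[0])  # flux2020052700
```

Note how an interval with no additional accumulation (the last pair above) correctly yields a zero flux after differencing.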
    2733
    2834.. note::
    2935
    30     Note also that for operational retrievals (``BASETIME`` set to 00 or 12) forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on demand scripts.       
     36    Note also that for operational retrievals (``BASETIME`` set to 00 or 12), forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on-demand scripts.       
    3137       
    3238
     
    3440--------------------------------------------------
    3541
    36 In ``flex_extract`` up to version 5 the disaggregation was done with a Fortran program (FLXACC2). In version 6 this part was converted to Python.
     42In ``flex_extract`` up to version 5, the disaggregation was done with a Fortran program (FLXACC2). In version 6, this part was recoded in Python.
    3743
    38 
    39 In the old versions (below 7.1) a relatively simple method processes the precipitation fields in a way that is consistent with the scheme applied in ``FLEXPART`` for all variables: linear interpolation between times where input fields are available.
    40 At first the accumulated values are divided by the number of hours (i.e., 3 or 6).
 44In the old versions (below 7.1), a relatively simple method processes the precipitation fields in a way that is consistent with the scheme that ``FLEXPART`` applies to all variables: linear interpolation between the times at which input fields are available.
 45This scheme (from Paul James) first divides the accumulated values by the number of hours (i.e., 3 or 6).
    4146The best option for disaggregation, which was realised, is conservation within the interval under consideration plus the two adjacent ones.
    4247Unfortunately, this leads to undesired temporal smoothing of the precipitation time series – maxima are damped and minima are raised.
     
    5358    :figclass: align-center
    5459
    55     Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by a half-time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).
     60    Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by half a time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).
    5661
    5762
    5863
    59 Disaggregation is done for 4 adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the 4 adjacent timespans.
 64Disaggregation is done for four adjacent timespans (:math:`a_0, a_1, a_2, a_3`), which generates a new, disaggregated value that is output at the central point of the four adjacent timespans.
    6065
    6166.. math::
     
    6974
    7075
    71 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the 4 original timespans has a value below 0 it is set to 0 prior to the calculation.
     76This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the four original timespans has a value below 0, it is set to 0 prior to the calculation.
    7277   
    7378.. math::
     
    7883
    7984
    80 
    81 
    8285Disaggregation for precipitation in version 7.1
    8386-----------------------------------------------
    8487
    85 Due to the problems with generating precipitation in originally dry (or lower) intervals and the temporal smoothing a new algorithm was developed. The approach is based on a one dimensional piecewise linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements by preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional monotonicity filter helps to gain monotonicity.
    86 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivates the linear formulation.
    87 These requirements on the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e. the reconstructed values vanish throughout this whole time interval, too.
     88Due to the problems mentioned above, a new algorithm was developed. The approach is based on a one-dimensional, piecewise-linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements of preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional filter improves monotonicity.
 89The more natural requirements of symmetry, reality, computational efficiency, and easy implementation motivate the use of a linear formulation.
 90These requirements for the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e., the reconstructed values vanish throughout this whole time interval, too.
    8891In the simplest scenario of an isolated precipitation event, where in the time interval before and after the data values are zero, the reconstruction algorithm therefore has to vanish at the boundaries of the interval, too.
    8992The additional conditions of continuity and conservation of the precipitation amount then require us to introduce sub-grid points if we want to keep a linear interpolation (Fig. 2).
     
    142145
    143146
    144 In the case of the new disaggregation method for precipitation, the two new sub grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval and 1 or 2 for the two new sub grid points respectively. The filenames do not change.   
     147In the case of the new disaggregation method for precipitation, the two new sub-grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval, and 1 or 2, respectively, for the two new sub-grid points. The filenames do not change.   
    145148
    146149   
    147150.. note::
    148151
    149     The new method for disaggregation was published in the Geoscientific Model Development Journal in 2018:
     152    The new method for disaggregation was published in the journal Geoscientific Model Development in 2018:
    150153   
    151154    Hittmeir, S., Philipp, A., and Seibert, P.: A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART, Geosci. Model Dev., 11, 2503-2523, https://doi.org/10.5194/gmd-11-2503-2018, 2018.
    152155
    153      
    154    
    155 
    156  
    157156
    158157
    159 Disaggregation for the rest of the flux fields
     158
     159Disaggregation for the other flux fields
    160160----------------------------------------------
    161161     
    162162The accumulated values for the other variables are first divided by the number of hours and
    163 then interpolated to the exact times X using a bicubic interpolation which conserves the integrals of the fluxes within each timespan.
    164 Disaggregation is done for 4 adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which generates a new, disaggregated value which is output at the central point of the 4 adjacent timespans.
     163then interpolated to the exact times using a bicubic interpolation which conserves the integrals of the fluxes within each timespan.
 164Disaggregation is done for four adjacent timespans (:math:`p_a, p_b, p_c, p_d`), which produces a new, disaggregated value that is output at the central point of the four adjacent timespans.
    165165
    166166.. math::
  • For_developers/Sphinx/source/Documentation/input.rst

    rb1674ed rf20342a  
    11********************
    2 Control & Input Data
     2Control & input data
    33********************
    44
    5 Input Data
     5Input data
    66    - :doc:`Input/control`
    7           ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual dataset to be retrieved. They are initialized by ``flex_extract`` with their default values and can be overwritten with definitions set in the so called :doc:`Input/control`.
     7          ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialised by ``flex_extract`` with certain default values which can be overwritten with definitions set in the so-called :doc:`Input/control`.
    88
    9           To be able to successfully retrieve data from the ECMWF Mars archive it is necessary to understand these parameters and set them to proper and consistent values. They are described in :doc:`Input/control_params` section.
 9          For a successful retrieval of data from the ECMWF MARS archive, it is necessary to understand these parameters and to set them to proper and consistent values. They are described in the :doc:`Input/control_params` section.
    1010
    11           We also have some :doc:`Input/examples` and description of :doc:`Input/changes` changes to previous versions and downward compatibilities.
     11          Furthermore, some :doc:`Input/examples` are provided, and in :doc:`Input/changes` changes to previous versions and downward compatibilities are described.
    1212       
    1313    - :doc:`Input/ecmwf_env`
    14          For ``flex_extract`` it is necessary to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
     14         ``flex_extract`` needs to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
    1515
    1616    - :doc:`Input/templates`
    17          A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt for example the jobscripts regarding its settings for the batch jobs.         
    18 
    19 
     17         A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt, for example, the job scripts with regard to the settings for the batch jobs.         
    2018
    2119
     
    2927
    3028Controlling
    31     The main tasks and behaviour of ``flex_extract`` are controlled by its Python scripts. There are two top-level scripts, one for installation called install_ and one for execution called submit_.
    32     They can interpret a number of command line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type:
     29    The main tasks and the behaviour of ``flex_extract`` are controlled by the Python scripts. There are two top-level scripts, one for installation called install_, and one for execution called submit_.
     30    They interpret a number of command-line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type:
    3331
    3432    .. code-block:: bash
     
    3836       python3 Source/Python/submit.py --help
    3937   
    40     In this new version we provide also the wrapping Shell scripts setup_ and run_, which sets the command line parameters, do some checks and execute the corresponing Python scripts ``install.py`` and ``submit.py`` respectivley.
    41      
    42     It might be faster and easier for beginners. See :doc:`../quick_start` for information on how to use them.
 38    With version 7.1, we also provide wrapper shell scripts setup_ and run_ which set the command-line parameters, do some checks, and execute the corresponding Python scripts ``install.py`` and ``submit.py``, respectively.
     39     It might be faster and easier for beginners if they are used. See :doc:`../quick_start` for information on how to use them.
    4340
    44     Additionally, ``flex_extract`` creates the Korn Shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be send to the ECMWF serves in the **remote mode** and the **gateway mode** for starting batch jobs.
     41    ``flex_extract`` also creates the Korn shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be sent to the ECMWF servers in the **remote mode** and the **gateway mode** for starting batch jobs.
    4542
    46     The Fortran program will be compiled during the installation process by the :doc:`Input/fortran_makefile`.
     43    The Fortran program is compiled during the installation process using the :doc:`Input/fortran_makefile`.
    4744   
    48     To sum up, the following scripts controls ``flex_extract``:
     45    To sum up, the following scripts control ``flex_extract``:
    4946
    5047    Installation
  • For_developers/Sphinx/source/Documentation/output.rst

    rb1674ed rf20342a  
    11***********
    2 Output Data
     2Output data
    33***********
    44
    5 The output data of ``flex_extract`` are separated mainly into temporary files and the final ``FLEXPART`` input files:
      5The output data of ``flex_extract`` can be divided into the final ``FLEXPART`` input files and temporary files:
    66
    77+-----------------------------------------------+----------------------------------------------+   
    88|   ``FLEXPART`` input files                    |  Temporary files (saved in debug mode)       |
    99+-----------------------------------------------+----------------------------------------------+
    10 | - Standard output filenames                   | - MARS request file (opt)                    |
     10| - Standard output file names                  | - MARS request file (optional)               |
    1111| - Output for pure forecast                    | - flux files                                 |
    1212| - Output for ensemble members                 | - VERTICAL.EC                                |
     
    2121========================
    2222
    23 The final output files of ``flex_extract`` are also the meteorological ``FLEXPART`` input files.
    24 The naming of these files depend on the kind of data extracted by ``flex_extract``.
     23The final output files of ``flex_extract`` are the meteorological input files for ``FLEXPART``.
     24The naming convention for these files depends on the kind of data extracted by ``flex_extract``.
    2525
    2626Standard output files
    2727---------------------
    2828 
    29 In general, there is a file for each time step with the filename format:
      29In general, there is one file for each point in time, named:
    3030
    3131.. code-block:: bash
     
    3333    <prefix>YYMMDDHH
    3434   
    35 The ``prefix`` is by default defined as ``EN`` and can be re-defined in the ``CONTROL`` file.
    36 Each file contains all meteorological fields needed by ``FLEXPART`` for all selected model levels for a specific time step.
    37 
    38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` where we extracted only the lowest model level for demonstration reasons:
      35where YY are the last two digits of the year, MM is the month, DD the day, and HH the hour (UTC). ``<prefix>`` is by default defined as ``EN``, and can be re-defined in the ``CONTROL`` file.
     36Each file contains all meteorological fields at all levels as needed by ``FLEXPART``, valid for the time indicated in the file name.
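A minimal sketch of how such a file name can be composed (the prefix and validity time below are illustrative, not taken from a real run):

```python
from datetime import datetime

def output_name(prefix, when):
    """Compose the standard <prefix>YYMMDDHH output file name."""
    return prefix + when.strftime("%y%m%d%H")

# With the default prefix "EN" and a validity time of 2000-01-08 00 UTC:
print(output_name("EN", datetime(2000, 1, 8, 0)))  # -> EN00010800
```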
     37
     38Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` (where we extracted only the lowest model level for demonstration purposes):
    3939
    4040.. code-block:: bash
     
    8484------------------------------
    8585
    86 ``Flex_extract`` can retrieve forecasts which can be longer than 23 hours. To avoid collisions of time steps for forecasts of more than one day a new scheme for filenames in pure forecast mode is introduced:
      86``Flex_extract`` is able to retrieve forecasts with a lead time of more than 23 hours. In order to avoid collisions of time-step names, a new scheme for file names in pure forecast mode is introduced:
    8787
    8888.. code-block:: bash
     
    9090    <prefix>YYMMDD.HH.<FORECAST_STEP>
    9191
    92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3 digit number which represents the forecast step in hours.
     92The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3-digit number which represents the forecast step in hours.
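As a sketch (the example file name is hypothetical), the pure-forecast naming scheme can be split into its components like this:

```python
import re

# <prefix>YYMMDD.HH.<FORECAST_STEP>: HH is the forecast start time,
# the step is a 3-digit number of hours (example name is hypothetical).
FC_NAME = re.compile(
    r"^(?P<prefix>[A-Z]+)(?P<date>\d{6})\.(?P<hour>\d{2})\.(?P<step>\d{3})$")

def parse_forecast_name(name):
    """Split a pure-forecast file name into prefix, date, start hour, and step."""
    m = FC_NAME.match(name)
    if m is None:
        raise ValueError("not a pure-forecast file name: " + name)
    return m.group("prefix"), m.group("date"), m.group("hour"), int(m.group("step"))

print(parse_forecast_name("EN190101.00.036"))  # -> ('EN', '190101', '00', 36)
```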
    9393   
    9494
     
    9696-------------------------------------
    9797
    98 Ensembles can be retrieved and are addressed by the grib message parameter ``number``. The ensembles are saved per file and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3 digit format.
      98``Flex_extract`` is able to retrieve ensemble data; the members are labelled by the GRIB message parameter ``number``. Each ensemble member is saved in a separate file, and standard file names are supplemented by the letter ``N`` and the ensemble member number in a 3-digit format.
    9999
    100100.. code-block:: bash
     
    106106-------------------------------------------------------
    107107
    108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type. They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`.
    109 The two additional fields are marked with the ``step`` parameter in the Grib messages and are set to "1" and "2" for sub-grid point 1 and 2 respectively.
    110 The output filenames do not change in this case. 
    111 Below is an example list of precipitation fields in an output file generated with the new disaggregation method:
     108The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type (large-scale and convective). They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`.
      109The two additional fields are addressed using the ``step`` parameter in the GRIB messages, which
      110is set to "1" or "2" for sub-grid points 1 and 2, respectively.
     111The output file names are not altered. 
     112An example of the list of precipitation fields in an output file generated with the new disaggregation method is found below:
    112113
    113114.. code-block:: bash
     
    129130===============
    130131
    131 ``Flex_extract`` works with a number of temporary data files which are usually deleted after a successful data extraction. They are only stored if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).
     132``Flex_extract`` creates a number of temporary data files which are usually deleted at the end of a successful run. They are preserved only if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).
    132133
    133134MARS grib files
     
    135136
    136137``Flex_extract`` retrieves all meteorological fields from MARS and stores them in files ending with ``.grb``.
    137 Since the request times and data transfer of MARS access are limited and ECMWF asks for efficiency in requesting data from MARS, ``flex_extract`` splits the overall data request in several smaller requests. Each request is stored in an extra ``.grb`` file and the file names are put together by several pieces of information:
     138Since there are limits implemented by ECMWF for the time per request and data transfer from MARS,
     139and as ECMWF asks for efficient MARS retrievals, ``flex_extract`` splits the overall data request
     140into several smaller requests. Each request is stored in its own ``.grb`` file, and the file
     141names are composed of several pieces of information:
    138142
    139143    .. code-block:: bash
     
    144148       
    145149Field type:
    146     ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4d variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast
     150    ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4D variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast
    147151Grid type:
    148    ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat/lon), ``_OROLSM`` - Orography parameter
      152   ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat/lon), ``_OROLSM`` - Orography parameter
    149153Temporal property:
    150154    ``__`` - instantaneous fields, ``_acc`` - accumulated fields
    151155Level type:
    152     ``ML`` - Model Level, ``SL`` - Surface Level
     156    ``ML`` - model level, ``SL`` - surface level
    153157ppid:
    154     The process number of the parent process of submitted script.
     158    The process number of the parent process of the script submitted.
    155159pid:
    156     The process number of the submitted script.
    157 
    158 The process ids should avoid mixing of fields if several ``flex_extract`` jobs are performed in parallel (which is, however, not recommended). The date format is YYYYMMDDHH.
    159 
    160 Example ``.grb`` files for a day of CERA-20C data:
     160    The process number of the script submitted.
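As an illustration of the code tables above, the short codes can be translated into readable descriptions (the mapping below is hypothetical helper code, not part of ``flex_extract``):

```python
# Decode the component codes used in the MARS .grb file names.
# This mapping is illustrative only; the actual name format is shown above.
FIELD_TYPES = {"AN": "analysis", "FC": "forecast", "4V": "4D variational analysis",
               "CV": "validation forecast", "CF": "control forecast",
               "PF": "perturbed forecast"}
GRID_TYPES = {"SH": "spherical harmonics", "GG": "Gaussian grid",
              "OG": "output grid (typically lat/lon)"}
LEVEL_TYPES = {"ML": "model levels", "SL": "surface level"}

def describe(field, grid, level):
    """Translate the short codes of one .grb file name into descriptions."""
    return (FIELD_TYPES[field], GRID_TYPES[grid], LEVEL_TYPES[level])

print(describe("AN", "OG", "ML"))
```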
     161
     162
     163Example ``.grb`` files for one day of CERA-20C data:
    161164
    162165    .. code-block:: bash
     
    172175-----------------
    173176
    174 This file is a ``csv`` file called ``mars_requests.csv`` with a list of the actual settings of MARS request parameters (one request per line) in a flex_extract job. It is used for documenting the data which were retrieved and for testing reasons.
    175 
    176 Each request consist of the following parameters, whose meaning mainly can be taken from :doc:`Input/control_params` or :doc:`Input/run`:
     177This file is a ``csv`` file called ``mars_requests.csv`` listing the actual settings of the MARS
     178request (one request per line) in a flex_extract job.
     179It is used for documenting which data were retrieved, and for testing.
     180
     181Each request consists of the following parameters, whose meaning mostly can be taken from :doc:`Input/control_params` or :doc:`Input/run`:
    177182request_number, accuracy, area, dataset, date, expver, gaussian, grid, levelist, levtype, marsclass, number, param, repres, resol, step, stream, target, time, type
    178183 
    179 Example output of a one day retrieval of CERA-20c data:
     184Example output of a one-day retrieval of CERA-20C data:
    180185
    181186.. code-block:: bash
     
    192197-----------
    193198
    194 The vertical discretization of model levels. This file contains the ``A`` and ``B`` parameters to calculate the model level height in meters.
      199This file contains information describing the vertical discretisation (model levels)
      200in the form of the ``A`` and ``B`` parameters, which allow one to calculate the actual pressure of a model level from the surface pressure.
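These coefficients enter via the usual hybrid-coordinate relation p = A + B · p\ :sub:`s` at each half level. A minimal sketch with illustrative coefficients (not the actual ECMWF values):

```python
def half_level_pressure(a_coeffs, b_coeffs, surface_pressure):
    """Pressure at model half levels: p = A + B * p_s (all in Pa).
    The coefficient values used in the example are illustrative only."""
    return [a + b * surface_pressure for a, b in zip(a_coeffs, b_coeffs)]

# From the model top (B = 0) down to the surface (A = 0, B = 1):
a = [0.0, 2000.0, 0.0]
b = [0.0, 0.5, 1.0]
print(half_level_pressure(a, b, 101325.0))  # -> [0.0, 52662.5, 101325.0]
```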
    195201
    196202
     
    198204----------
    199205
    200 This file is usually called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords.
    201 
    202 
    203 flux files
     206This file is called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords.
     207#PS NEEDS MORE DESCRIPTION
     208
     209
     210Flux files
    204211----------
    205212
    206 The flux files contain the de-accumulated and dis-aggregated flux fields of large scale and convective precipitation, eastward turbulent surface stress, northward turbulent surface stress, surface sensible heat flux and the surface net solar radiation.
     213The flux files contain the de-accumulated and dis-aggregated flux fields of large-scale and convective precipitation, east- and northward turbulent surface stresses, the surface sensible heat flux, and the surface net solar radiation.
    207214
    208215.. code-block:: bash
     
    210217    flux<date>[.N<xxx>][.<xxx>]
    211218
    212 The date format is YYYYMMDDHH. The optional block ``[.N<xxx>]`` marks the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step.
      219The date format is YYYYMMDDHH as explained before. The optional block ``[.N<xxx>]`` is used for ensemble forecasts, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step.
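A sketch of how the two optional blocks can be parsed (the example names are hypothetical):

```python
import re

# flux<date>[.N<xxx>][.<xxx>]: date is YYYYMMDDHH; the optional parts give
# the ensemble member and the forecast step (example names are hypothetical).
FLUX_NAME = re.compile(
    r"^flux(?P<date>\d{10})(?:\.N(?P<member>\d{3}))?(?:\.(?P<step>\d{3}))?$")

def parse_flux_name(name):
    """Return (date, ensemble member, forecast step); missing parts are None."""
    m = FLUX_NAME.match(name)
    if m is None:
        raise ValueError("not a flux file name: " + name)
    return m.group("date"), m.group("member"), m.group("step")

print(parse_flux_name("flux2000010812"))       # plain analysis-type file
print(parse_flux_name("flux2000010812.N003"))  # ensemble member 3
```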
    213220
    214221.. note::
    215222
    216     In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval and ``1`` or ``2`` for the two new intervals respectively.
     223    In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval, and ``1`` or ``2``,  respectively, for the two new intervals.
    217224
    218225   
     
    226233    fort.xx
    227234   
    228 where ``xx`` is the number which defines the meteorological fields stored in these files.
    229 They are generated by the Python part of ``flex_extract`` by just splitting the meteorological fields for a unique time stamp from the ``*.grb`` files into the ``fort`` files.
    230 The following table defines the numbers with their corresponding content.   
     235where ``xx`` is a number which defines the meteorological fields stored in these files.
      236They are generated by the Python code of ``flex_extract``, which splits the meteorological fields for a unique time stamp from the ``*.grb`` files and stores them in the ``fort.xx`` files.
     237The following table defines the numbers and the corresponding content:   
    231238
    232239.. csv-table:: Content of fort - files
     
    240247    "16", "surface fields"
    241248    "17", "specific humidity"
    242     "18", "surface specific humidity (reduced gaussian)"
    243     "19", "vertical velocity (pressure) (optional)"
     249    "18", "surface specific humidity (reduced Gaussian grid)"
     250    "19", "omega (vertical velocity in pressure coordinates) (optional)"
    244251    "21", "eta-coordinate vertical velocity (optional)"
    245     "22", "total cloud water content (optional)"
    246 
    247 Some of the fields are solely retrieved with specific settings, e.g. the eta-coordinate vertical velocity is not available in ERA-Interim datasets and the total cloud water content is an optional field for ``FLEXPART v10`` and newer.
     252    "22", "total cloud-water content (optional)"
     253
      254Some of the fields are only retrieved with specific settings, e.g., the eta-coordinate vertical velocity is not available in ERA-Interim datasets, and the total cloud-water content is an optional field which is useful for ``FLEXPART v10`` and newer.
    248255
    249256The ``calc_etadot`` program saves its results in file ``fort.15`` which typically contains:
     
    259266.. note::
    260267 
    261     The ``fort.4`` file is the namelist file to drive the Fortran program ``calc_etadot``. It is therefore also an input file.
     268    The ``fort.4`` file is the namelist file to control the Fortran program ``calc_etadot``. It is therefore also an input file.
    262269   
    263270    Example of a namelist:
  • For_developers/Sphinx/source/Documentation/overview.rst

    rb1674ed rf20342a  
    33========
    44
    5 ``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) Mars archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` Atmospheric Transport Modelling system.
    6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who wants to use meteorological data from ECMWF to drive the ``FLEXPART`` model.
    7 The software retrieves the minimal number of parameters ``FLEXPART`` needs to work and provides the data in the explicity format ``FLEXPART`` understands.
     5``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) MARS archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` atmospheric transport modelling system.
     6``Flex_extract`` was created explicitly for ``FLEXPART`` users who want to use meteorological data from ECMWF to drive the ``FLEXPART`` model.
     7The software retrieves the minimum set of parameters needed by ``FLEXPART`` to work, and provides the data in the specific format required by ``FLEXPART``.
    88
    9 ``Flex_extract`` consists of 2 main parts:
    10     1. a Python part, where the reading of parameter settings, retrieving data from MARS and preparing the data for ``FLEXPART`` is done and
    11     2. a Fortran part, where the calculation of the vertical velocity is done and if necessary the conversion from spectral to regular latitude/longitude grids.
     9``Flex_extract`` consists of two main parts:
     10    1. a Python part which reads the parameter settings, retrieves the data from MARS, and prepares them for ``FLEXPART``, and
     11    2. a Fortran part which calculates the vertical velocity and, if necessary, converts variables from the spectral representation to regular latitude/longitude grids.
    1212
    13 Additionally, it has some Korn shell scripts which are used to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information of application modes.   
      13In addition, there are some Korn shell scripts to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information on the application modes.   
    1414
    1515A number of Shell scripts are wrapped around the software package for easy installation and fast job submission.
    1616
    17 The software depends on a number of third-party libraries which can be found in :ref:`ref-requirements`.
     17The software depends on some third-party libraries as listed in :ref:`ref-requirements`.
    1818
    19 Some details on the tasks and program worksteps are described in :doc:`Overview/prog_flow`.
     19Details of the tasks and program work steps are described in :doc:`Overview/prog_flow`.
    2020
    2121
  • For_developers/Sphinx/source/Documentation/vertco.rst

    rb1674ed rf20342a  
    11*******************
    2 Vertical Coordinate
     2Vertical wind
    33*******************
    44       
    5 Calculation of vertical velocity and preparation of Output-files
     5Calculation of vertical velocity and preparation of output files
    66================================================================
    77
    8 ``flex_extract`` has two ways to calculate the vertical velocity for ``FLEXTRA``/``FLEXPART``:
     8Two methods are provided in ``flex_extract`` for the calculation of the vertical velocity for ``FLEXTRA``/``FLEXPART``:
    99    (i) from the horizontal wind field,
    10     (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C**.
      10    (ii) from the MARS parameter 77 (the vertical velocity directly in the eta coordinate system of the ECMWF model), which is available for operational forecasts and analyses since September 2008, and for the reanalysis datasets **ERA5** and **CERA-20C**.
    1111
    1212Especially for high resolution data, use of the ``MARS`` parameter 77 is recommended,
     
    2020   
    2121   
    22 Calculation of vertical velocity from horizontal wind using the continuity equation
     22Calculation of the vertical velocity from the horizontal wind using the continuity equation
    2323===================================================================================
    2424
    25 The vertical velocity is computed by the FORTRAN90 program ``calc_etadot`` in the ECMWF
    26 vertical coordinate system by applying the equation of continuity and thereby ensuring mass consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the
      25The vertical velocity in ECMWF's eta vertical coordinate system is computed by the Fortran program ``calc_etadot``, using the continuity equation and thereby ensuring mass-consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the
    2726documents v20_update_protocol.pdf, V30_update_protocol.pdf and
    2827V40_update_protocol.pdf. The computational demand and accuracy of ``calc_etadot`` is highly
     
    3029following guidance can be given for choosing the right parameters:
    3130
    32     * For very fine output grids (0.25 degree or finer) the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of vertical velocity on the Gaussian grid (``GAUSS=1``) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279  needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application.
      31    * For very fine output grids (0.25 degree or finer), the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). Calculating the vertical velocity on the Gaussian grid (``GAUSS=1``) at the highest available resolution is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read the documents v20_update_protocol.pdf to v60_update_protocol.pdf to see whether the errors incurred are acceptable for the planned application.
    3332    * For lower resolution (often global) output grids, calculation of vertical velocities with lower than operational spectral resolution is recommended. For global grids the following settings appear optimal:
    3433        - For 1.0 degree grids: ``GAUSS=1``, ``RESOL=255``, ``SMOOTH=179``
    3534        - For 0.5 degree grids: ``GAUSS=1``, ``RESOL=399``, ``SMOOTH=359``
    3635        - Calculation on the lat/lon grid is not recommended for less than the operational (T1279) resolution.   
    37         - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279, 2047 in future models). This choice is restricted because a reduced Gaussian grid is defined in then ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.
    38         - For ``SMOOTH`` any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if vertical velocity is calculated on lat\/lon grid (``GAUSS=0``).
    39     * The on demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however.
      36        - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159, 255, 319, 399, 511, 799 (on the HPC also 1279; 2047 in future model versions). This choice is restricted because a reduced Gaussian grid is defined in the ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.
     37        - For ``SMOOTH``, any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if the vertical velocity is calculated on a lat\/lon grid (``GAUSS=0``).
     38    * The on-demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however.
    4039    * Regional grids are not cyclic in zonal directions, but global grids are. The software assumes a cyclic grid if ``RIGHT``-``LEFT`` is equal to ``GRID`` or is equal to ``GRID``-360.
    41     * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` for application in ATM modelling. If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and the surface pressure tendency. ``OMEGADIFF`` should be zero except for debugging, since it triggers expensive calculations on the Gaussian grid.
      40    * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` (the standard ``flex_extract`` output files). If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, files ``OMEGAyymmddhh`` are created in addition, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and from the surface pressure tendency. ``OMEGADIFF`` should be set to zero except for debugging, since it triggers expensive calculations on the Gaussian grid.
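The constraints listed above can be summarised as a small consistency check. This is hypothetical helper code, not part of ``flex_extract``; the parameter names follow the ``CONTROL`` file, and the rules are those stated in the bullets:

```python
# Illustrative sanity checks for the GAUSS/RESOL/SMOOTH/GRID settings.
# EMOSLIB defines reduced Gaussian grids only for these spectral resolutions:
GAUSS_RESOLUTIONS = {159, 255, 319, 399, 511, 799, 1279, 2047}

def check_settings(gauss, resol, smooth, grid):
    """Return a list of problems with the given parameter choice."""
    problems = []
    if gauss == 1 and resol not in GAUSS_RESOLUTIONS:
        problems.append("no reduced Gaussian grid for RESOL=%d" % resol)
    effective = smooth if smooth else resol  # SMOOTH, if set, limits the resolution
    if effective > 360.0 / grid / 2.0:
        problems.append("output grid cannot resolve the highest wave numbers")
    if smooth and smooth >= resol:
        problems.append("SMOOTH must be lower than RESOL")
    return problems

# The recommended settings for a global 1-degree grid pass the checks:
print(check_settings(gauss=1, resol=255, smooth=179, grid=1.0))  # -> []
```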
    4241   
    4342   
    44 Calculation of vertical velocity from pre-calculated MARS parameter 77
     43Calculation of the vertical velocity from the pre-calculated MARS parameter 77
    4544======================================================================
    4645
    47 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that this parameter is available if the ``CONTROL`` parameter ``ETA`` is set to 1.
     46Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that parameter 77 is available if the ``CONTROL`` parameter ``ETA`` is set to 1.
    4847
    4948It is recommended to use the pre-calculated parameter 77 by setting ``ETA`` to 1 whenever possible.
    5049
    51 Setting parameter ``ETA`` to 1 normally disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time.
     50Setting the parameter ``ETA`` to 1 disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time.
    5251
    5352.. note::
    54    However, the calculation on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you are really need it for debugging since this is a very expensive option. In this case ``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity.
      53   However, the calculations on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you really need it for debugging, since this is a very expensive option. In this case, ``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity.
    5554
    5655The parameters ``RESOL``, ``GRID``, ``UPPER``, ``LOWER``, ``LEFT``, ``RIGHT`` still apply. As for calculations on the Gaussian grid, the spectral resolution parameter ``RESOL`` should be compatible with the grid resolution (see previous subsection).
  • For_developers/Sphinx/source/Ecmwf/access.rst

    rba99230 rf20342a  
    11************
    2 Access Modes
     2Access modes
    33************
    44
     
    88.. _CDS API: https://cds.climate.copernicus.eu/api-how-to
    99
    10 Access to the ECMWF Mars archive is divided into two groups: **member state** users and **public** users.
     10Access to the ECMWF MARS archive is divided into two groups: **member state** users and **public** users.
    1111
    12 **Member state user**:
    13     This access mode allows the user to work directly on the ECMWF Linux Member State Servers or via a Web Access Toolkit ``ecaccess`` through a local Member State Gateway Server. This enables the user to have direct and full access to the Mars archive. There might be some limitations in user rights such as the declined access to the latest forecasts. This has to be discussed with the `Computing Representative`_. This user group is also able to work from their local facilities without a gateway server in the same way a **public** user would. The only difference is the connection with the Web API. However, this is automatically selected by ``flex_extract``.
     12**Member-state user**:
      13    This access mode allows the user to work directly on an ECMWF member-state Linux server or via the ``ecaccess`` Web-Access Toolkit through a local member-state Gateway server. This enables the user to have direct and full access to the MARS archive. There might be some limitations in user rights, such as no access to the latest forecasts. In case such data are needed, this has to be agreed upon with the national `Computing Representative`_. This user group is also able to work from their local facilities without a gateway server in the same way a **public** user would. The only difference is the connection with the Web API, which, however, is automatically selected by ``flex_extract``.
    1414   
    1515
    1616**Public user**:
    17     This access mode allows every user to access the ECMWF `public datasets`_ from their local facilities. ``Flex_extract`` is able (tested for the use with ``FLEXPART``) to extract the re-analysis datasets such as ERA-Interim and CERA-20C. The main difference to the **member state user** is the method of access with the Web API and the availability of data. For example, in ERA-Interim there is only a 6-hourly temporal resolution instead of 3 hours. The access method is selected by providing the command line argument "public=1" and providing the MARS keyword "dataset" in the ``CONTROL`` file. Also, the user has to explicitly accept the license of the dataset to be retrieved. This can be done as described in the installation process at section :ref:`ref-licence`.   
      17    This access mode allows every user to access the ECMWF `public datasets`_ from their local facilities. ``Flex_extract`` is able to extract re-analysis datasets such as ERA-Interim and CERA-20C for use with ``FLEXPART`` (tested). The main difference to the **member-state user** is the method of access with the Web API and the availability of data. For example, in ERA-Interim, only a 6-hourly temporal resolution is available instead of 3 hours. The access method is selected by providing the command-line argument "public=1" and the MARS keyword "dataset" in the ``CONTROL`` file. Also, the user has to explicitly accept the license of the data set to be retrieved. This can be done as described in the installation process at section :ref:`ref-licence`.   
    1818     
    1919.. note::
    2020   
    21    The availability of the public dataset *ERA5* with the ECMWF Web API was cancelled in March 2019. The oportunity of local retrieval of this dataset was moved to the `Climate Data Store`_ which uses another Web API named `CDS API`_. This Data Store stores the data on explicit webservers for faster and easier access. Unfortunately, for *ERA5* there are only surface level and pressure level data available for *public users*. In the case of a *member user* it is possible to bypass the request to the MARS archive from ECMWF to retrieve the data. ``Flex_extract`` is already modified to use this API so *member user* can already retrieve *ERA5* data while *public users* have to wait until model level are available.
      21   The availability of the public dataset *ERA5* with the ECMWF Web API was cancelled by ECMWF in March 2019. Local retrieval of this dataset now has to use the `Climate Data Store`_ (CDS) with a different Web API called `CDS API`_. CDS stores the data on dedicated web servers for faster and easier access. Unfortunately, for *ERA5*, only surface-level and pressure-level data are available for *public users*, which is not enough to run FLEXPART. For a *member user*, it is possible to pass the request to the MARS archive to retrieve the data. ``Flex_extract`` is already modified to use this API, so a *member user* can already retrieve *ERA5* data for FLEXPART, while *public users* have to wait until model levels are made available.
    2222       
    2323For information on how to register see :ref:`ref-registration`.
  • For_developers/Sphinx/source/Ecmwf/ec-links.rst

    rba99230 rf20342a  
    11################################
    2 Link Collection for Quick finder
     2Link collection
    33################################
    44
    55
    6 ECMWF - General Overview   
     6ECMWF - General overview   
    77    `ECMWF Home <https://www.ecmwf.int/>`_
    88   
    9     `ECMWF Training <https://www.ecmwf.int/en/learning>`_
     9    `ECMWF training <https://www.ecmwf.int/en/learning>`_
    1010   
    11     `General User Documentation <https://software.ecmwf.int/wiki/display/UDOC/User+Documentation>`_
     11    `General user documentation <https://software.ecmwf.int/wiki/display/UDOC/User+Documentation>`_
    1212   
    13     `Software Support <https://confluence.ecmwf.int/display/SUP>`_
     13    `Software support <https://confluence.ecmwf.int/display/SUP>`_
    1414
    1515MARS
    1616    `MARS user documentation <https://confluence.ecmwf.int//display/UDOC/MARS+user+documentation>`_
    1717   
    18     `MARS Keywords <https://software.ecmwf.int/wiki/display/UDOC/MARS+keywords>`_
     18    `MARS keywords <https://software.ecmwf.int/wiki/display/UDOC/MARS+keywords>`_
    1919   
    20     `MARS Content <https://confluence.ecmwf.int/display/UDOC/MARS+content>`_
     20    `MARS content <https://confluence.ecmwf.int/display/UDOC/MARS+content>`_
    2121   
    22     `MARS Actions <https://confluence.ecmwf.int/display/UDOC/MARS+actions>`_
     22    `MARS actions <https://confluence.ecmwf.int/display/UDOC/MARS+actions>`_
    2323   
    24     `Parameter Database <https://apps.ecmwf.int/codes/grib/param-db>`_
     24    `Parameter database <https://apps.ecmwf.int/codes/grib/param-db>`_
    2525 
    2626Registration
    27     `Contact of Computing Representative's <https://www.ecmwf.int/en/about/contact-us/computing-representatives>`_
     27    `Contacts of Computing Representatives <https://www.ecmwf.int/en/about/contact-us/computing-representatives>`_
    2828
    2929    `Public registration for ECMWF Web API <https://software.ecmwf.int/wiki/display/WEBAPI/Access+MARS>`_
    3030       
    31     `CDS Registration <https://cds.climate.copernicus.eu/user/register>`_
     31    `CDS registration <https://cds.climate.copernicus.eu/user/register>`_
    3232
    33 Available Member State Datasets
    34     `Web Interface for accessing member state datasets <http://apps.ecmwf.int/archive-catalogue/>`_
     33Member-State data sets available
     34    `Web interface for accessing member-state data sets <http://apps.ecmwf.int/archive-catalogue/>`_
    3535   
    36     `Available datasets for member state users <https://www.ecmwf.int/en/forecasts/datasets>`_
     36    `Data sets available for member state users <https://www.ecmwf.int/en/forecasts/datasets>`_
    3737   
    38 Available Public Datasets
    39     `Web Interface for accessing public datasets <http://apps.ecmwf.int/datasets/>`_
     38Public data sets available
     39    `Web interface for accessing public data sets <http://apps.ecmwf.int/datasets/>`_
    4040   
    41     `ECMWF's public datasets <https://confluence.ecmwf.int/display/WEBAPI/Available+ECMWF+Public+Datasets>`_
     41    `ECMWF's public data sets <https://confluence.ecmwf.int/display/WEBAPI/Available+ECMWF+Public+Datasets>`_
    4242   
    43     `Public dataset Licences <https://software.ecmwf.int/wiki/display/WEBAPI/Available+ECMWF+Public+Datasets>`_
     43    `Public data set licences <https://software.ecmwf.int/wiki/display/WEBAPI/Available+ECMWF+Public+Datasets>`_
    4444   
    45     `ERA5 public dataset Licence <https://cds.climate.copernicus.eu/cdsapp#!/search?type=dataset>`_
     45    `ERA5 public dataset licence <https://cds.climate.copernicus.eu/cdsapp#!/search?type=dataset>`_
    4646
    4747
    4848Datasets
    4949    Overview
    50         `Complete list of datasets <https://www.ecmwf.int/en/forecasts/datasets>`_
     50        `Complete list of data sets <https://www.ecmwf.int/en/forecasts/datasets>`_
    5151               
    5252        `What is climate reanalysis <https://www.ecmwf.int/en/research/climate-reanalysis>`_
     
    5757        `List of real_time datasets <https://www.ecmwf.int/en/forecasts/datasets/catalogue-ecmwf-real-time-products>`_
    5858
    59         `Atmospheric model - HRES (our typical operational dataset) <https://www.ecmwf.int/en/forecasts/datasets/set-i>`_
     59        `Atmospheric model - HRES (typical operational dataset) <https://www.ecmwf.int/en/forecasts/datasets/set-i>`_
    6060       
    6161        `Atmospheric model - ENS (15-day ensemble forecast) <https://www.ecmwf.int/en/forecasts/datasets/set-iii>`_
     
    6666        `ERA-Interim documentation <https://www.ecmwf.int/en/elibrary/8174-era-interim-archive-version-20>`_
    6767   
    68         `ERA-Interim dataset <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/era-interim>`_
     68        `ERA-Interim data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/era-interim>`_
    6969   
    7070    CERA-20C
    7171        `What is CERA-20C <https://software.ecmwf.int/wiki/display/CKB/What+is+CERA-20C>`_
    7272       
    73         `CERA-20C dataset <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/cera-20c>`_
     73        `CERA-20C data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/cera-20c>`_
    7474           
    7575    ERA5
     
    8484        `ERA5 Documentation <https://software.ecmwf.int/wiki/display/CKB/ERA5+data+documentation>`_       
    8585
    86 Third Party Libraries
    87     `ECMWF Web API Home <https://software.ecmwf.int/wiki/display/WEBAPI/ECMWF+Web+API+Home>`_
     86Third-party libraries
     87    `ECMWF Web API home <https://software.ecmwf.int/wiki/display/WEBAPI/ECMWF+Web+API+Home>`_
    8888
    8989    `Building ECMWF software with gfortran <https://software.ecmwf.int/wiki/display/SUP/2015/05/11/Building+ECMWF+software+with+gfortran>`_
     
    103103   
    104104
    105 Scientific Information
    106     `Octahedral reduced Gaussian Grid <https://confluence.ecmwf.int/display/FCST/Introducing+the+octahedral+reduced+Gaussian+grid>`_
     105Scientific information
     106    `Octahedral reduced Gaussian grid <https://confluence.ecmwf.int/display/FCST/Introducing+the+octahedral+reduced+Gaussian+grid>`_
    107107   
    108108    `Precipitation <https://www.ecmwf.int/en/newsletter/147/meteorology/use-high-density-observations-precipitation-verification>`_
    109109
    110110
    111 Technical Information of ECMWF serves
     111Technical information for ECMWF servers
    112112
    113     `Introduction presentation to SLURM  <https://confluence.ecmwf.int/download/attachments/73008494/intro-slurm-2017.pdf?version=1&modificationDate=1488574096323&api=v2>`_
     113    `Introductary presentation of SLURM  <https://confluence.ecmwf.int/download/attachments/73008494/intro-slurm-2017.pdf?version=1&modificationDate=1488574096323&api=v2>`_
    114114
    115 Troubleshooting
    116     `ECMWF Web API Troubleshooting <https://confluence.ecmwf.int/display/WEBAPI/Web-API+Troubleshooting>`_
     115Trouble-shooting
     116    `ECMWF Web API trouble-shooting <https://confluence.ecmwf.int/display/WEBAPI/Web-API+Troubleshooting>`_
    117117
    118118
  • For_developers/Sphinx/source/Ecmwf/hintsecmwf.rst

    rba99230 rf20342a  
    11##################################
    2 Hints to specify dataset retrieval
     2Hints for data set selection
    33##################################
    44
     
    88
    99
    10 How can I find out what data is available?
     10How can I find out what data are available?
    1111==========================================
    1212
    13 Go to the `Web Interface for accessing member state datasets <http://apps.ecmwf.int/archive-catalogue/>`_
     13Go to the `Web Interface for accessing member-state data sets <http://apps.ecmwf.int/archive-catalogue/>`_
    1414and click yourself through the steps to define your set of data and see what is available to you.
    1515
    16 For public users there is  the `Web Interface for accessing public datasets <http://apps.ecmwf.int/datasets/>`_.
      16For public users, there is the `Web Interface for accessing public data sets <http://apps.ecmwf.int/datasets/>`_.
    1717
    1818
  • For_developers/Sphinx/source/Ecmwf/msdata.rst

    rba99230 rf20342a  
    1 #########################################
    2 Available Datasets for Member State users
    3 #########################################
     1##########################################
     2Available data sets for member-state users
     3##########################################
    44
    55
    66
    7 Model level data
     7Model-level data
    88================
    99
     
    1111
    1212
    13 Surface level data
     13Surface data
    1414==================
    1515
  • For_developers/Sphinx/source/Ecmwf/pubdata.rst

    rba99230 rf20342a  
    1 Available Datasets for Public users
    2 ***********************************
     1Available data sets for public users
     2************************************
    33
    4   IN PREPARATION
     4  UNDER PREPARATION
    55
    66
  • For_developers/Sphinx/source/Evaluation/staticcode.rst

    rba99230 rf20342a  
    11********************
    2 Static Code Analysis
     2Static code analysis
    33********************
    44
  • For_developers/Sphinx/source/Evaluation/testcases.rst

    rb1674ed rf20342a  
    11********************
    2 Testcases
     2Test cases
    33********************
    44
     
    1111
    1212
    13 Comparison of grib files
     13Comparison of GRIB files
    1414========================
    1515
  • For_developers/Sphinx/source/Installation/local.rst

    rb1674ed rf20342a  
    5151The installation is the same for the access modes **member** and **public**.
    5252
    53 The environment on your local system has to provide these software packages
     53The environment on your local system has to provide the following software packages
    5454and libraries, since the preparation of the extraction and the post-processing is done on the local machine:
    5555
    56 +------------------------------------------------+-----------------+
    57 |  Python part                                   | Fortran part    |
    58 +------------------------------------------------+-----------------+
    59 | * `Python3`_                                   | * `gfortran`_   |
    60 | * `numpy`_                                     | * `fftw3`_      |
    61 | * `genshi`_                                    | * `eccodes`_    |
    62 | * `eccodes for python`_                        | * `emoslib`_    |
    63 | * `ecmwf-api-client`_ (everything except ERA5) |                 |
    64 | * `cdsapi`_ (just for ERA5 and member user)    |                 |
    65 +------------------------------------------------+-----------------+
     56+-------------------------------------------------+-----------------+
     57|  Python part                                    | Fortran part    |
     58+-------------------------------------------------+-----------------+
     59| 1. `Python3`_                                   | 1. `gfortran`_  |
     60| 2. `numpy`_                                     | 2. `fftw3`_     |
     61| 3. `genshi`_                                    | 3. `eccodes`_   |
     62| 4. `eccodes for python`_                        | 4. `emoslib`_   |
     63| 5. `ecmwf-api-client`_ (everything except ERA5) |                 |
     64| 6. `cdsapi`_ (just for ERA5 and member user)    |                 |
     65+-------------------------------------------------+-----------------+
    6666
    6767
    6868.. _ref-prep-local:
    6969
    70 Prepare local environment
    71 =========================
    72 
    73 The easiest way to install all required packages is to use the package management system of your Linux distribution  which requires admin rights.
     70Preparing the local environment
     71===============================
     72
     73The easiest way to install all required packages is to use the package management system of your Linux distribution which requires admin rights.
    7474The installation was tested on a *Debian GNU/Linux buster* and an *Ubuntu 18.04 Bionic Beaver* system.
    7575
    7676.. code-block:: sh
    7777
    78   # On a Debian or Debian-derived sytem (e. g. Ubuntu) system you may use the following commands (or equivalent commands of your preferred package manager):
    79   # (if not already available):
     78  # On a Debian or Debian-derived (e. g. Ubuntu) system,
     79  # you may use the following commands (or equivalent commands of your preferred package manager):
     80  # (if respective packages are not already available):
     8081   apt-get install python3  # usually already available on GNU/Linux systems
    8182   apt-get install python3-eccodes
     
    8687   apt-get install libeccodes-dev
    8788   apt-get install libemos-dev
    88   # Some of these packages will pull in further packages as dependencies. This is fine, and some are even needed by ``flex_extract''.
    89  
    90 
    91   # As currently the CDS and ECMWF API packages are not available as Debian packages, they need to be installed outside of the Debian (Ubuntu etc.) package management system. The recommended way is:
     89  # Some of these packages will pull in further packages as dependencies.
     90  # This is fine, and some are even needed by ``flex_extract''.
     91
     92  # As currently the CDS and ECMWF API packages are not available as Debian packages,
     93  # they need to be installed outside of the Debian (Ubuntu etc.) package management system.
     94  # The recommended way is:
    9295   apt-get install pip
    9396   pip install cdsapi
     
    9699.. note::
    97100
    98     In case you would like to use Anaconda Python we recommend you follow the installation instructions of
    99     `Anaconda Python Installation for Linux <https://docs.anaconda.com/anaconda/install/linux/>`_ and then install the
    100     ``eccodes`` package from ``conda`` with:
     101    If you are using Anaconda Python, we recommend to follow the installation instructions of
     102    `Anaconda Python Installation for Linux <https://docs.anaconda.com/anaconda/install/linux/>`_
     103    and then install the ``eccodes`` package from ``conda`` with:
    101104
    102105    .. code-block:: bash
     
    104107       conda install conda-forge::python-eccodes   
    105108   
    106 The CDS API (cdsapi) is required for ERA5 data and the ECMWF Web API (ecmwf-api-client) for all other public datasets.   
      109The CDS API (``cdsapi``) is required for ERA5 data, and the ECMWF Web API (``ecmwf-api-client``) for all other public datasets.
    107110   
    108111.. note::
    109112
    110     Since **public users** currently don't have access to the full *ERA5* dataset they can skip the installation of the ``CDS API``.
    111 
    112 Both user groups have to provide keys with their credentials for the Web API's in their home directory. Therefore, follow these instructions:
     113    Since **public users** currently don't have access to the full *ERA5* dataset, they can skip the installation of the CDS API.
     114
     115Both user groups have to provide keys with their credentials for the Web APIs in their home directory, following these instructions:
    113116       
    114117ECMWF Web API:
    115    Go to `MARS access`_ website and log in with your credentials. Afterwards, on this site in section "Install ECMWF KEY" the key for the ECMWF Web API should be listed. Please follow the instructions in this section under 1 (save the key in a file `.ecmwfapirc` in your home directory).
     118   Go to the `MARS access`_ website and log in with your credentials. Afterwards, go to the section "Install ECMWF KEY", where the key for the ECMWF Web API should be listed. Please follow the instructions in this section under 1 (save the key in a file ``.ecmwfapirc`` in your home directory).
    116119     
    117120CDS API:
    118    Go to `CDS API registration`_ and register there too. Log in at the `cdsapi`_ website and follow the instructions at section "Install the CDS API key" to save your credentials in a `.cdsapirc` file.
     121   Go to `CDS API registration`_ and register there, too. Log in on the `cdsapi`_ website and follow the instructions in the section "Install the CDS API key" to save your credentials in file ``.cdsapirc``.
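   The ``.cdsapirc`` file consists of just two lines. As a sketch (with a placeholder UID and key), it looks like this:

   .. code-block:: none

      url: https://cds.climate.copernicus.eu/api/v2
      key: 12345:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx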
    119122
    120123   
    121124.. _ref-test-local:
    122125   
    123 Test local environment
    124 ======================
    125 
    126 Check the availability of the python packages by typing ``python3`` in a terminal window and run the ``import`` commands in the python shell. If there are no error messages, you succeeded in setting up the environment.
    127 
     126Testing the local environment
     127=============================
     128
      129Check the availability of the Python packages by typing ``python3`` in a terminal window, and run the ``import`` commands in the Python shell:

    128130.. code-block:: python
    129131   
     
    135137   import ecmwfapi
    136138   
    137 
    138 
    139 Test the Web API's
    140 ------------------
     139If there are no error messages, you succeeded in setting up the environment.
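As an alternative to typing the imports one by one, the following standalone sketch (not part of ``flex_extract``) checks all required modules in one go and reports those that are missing:

```python
import importlib.util

# Required Python modules as listed above; ecmwfapi and cdsapi
# may legitimately be absent, depending on the chosen access mode.
required = ["numpy", "genshi", "eccodes", "ecmwfapi", "cdsapi"]
missing = [m for m in required if importlib.util.find_spec(m) is None]
print("missing modules:", ", ".join(missing) if missing else "none")
```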
     140
     141
     142Testing the Web APIs
     143--------------------
    141144
    142145You can start very simple test retrievals for both Web APIs to be sure that everything works. This is recommended to minimise the range of possible errors using ``flex_extract`` later on.
     
    148151
    149152+----------------------------------------------------------+----------------------------------------------------------+
    150 |Please use this piece of Python code for **Member user**: |Please use this piece of Python code for **Public user**: |
     153|Please use this Python code snippet as a **Member user**: |Please use this Python code snippet as a **Public user**: |
    151154+----------------------------------------------------------+----------------------------------------------------------+
    152155|.. code-block:: python                                    |.. code-block:: python                                    |
     
    178181Extraction of ERA5 data via CDS API might take time as currently there is a high demand for ERA5 data. Therefore, as a simple test for the API just retrieve pressure-level data (even if that is NOT what we need for FLEXPART), as they are stored on disk and don't need to be retrieved from MARS (which is the time-consuming action):
    179182
    180 Please use this piece of Python code to retrieve a small sample of *ERA5* pressure levels:
     183Please use the following Python code snippet to retrieve a small sample of *ERA5* pressure level data:
    181184
    182185.. code-block:: python
     
    204207.. **Member-state user**
    205208
    206 Please use this piece of Python code to retrieve a small *ERA5* data sample as a **member-state user**! The **Public user** do not have access to the full *ERA5* dataset!
      209Please use the following Python code snippet to retrieve a small *ERA5* data sample as a **member-state user**! **Public users** do not have access to the full *ERA5* dataset!
    207210
    208211.. code-block:: python
     
    268271==================
    269272
    270 First prepare the Fortran ``makefile`` for your environment and set it in the ``setup.sh`` script. (See section :ref:`Fortran Makefile <ref-convert>` for more information.)
    271 ``flex_extract`` comes with two ``makefiles`` prepared for the ``gfortran`` compiler. One for the normal use ``makefile_fast`` and one for debugging ``makefile_debug`` which is usually only resonable for developers.
    272  
    273 They assume that ``eccodes`` and ``emoslib`` are installed as distribution packages and can be found at ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number.
      273First, adapt the Fortran ``makefile`` for your environment (if necessary) and enter its name in the ``setup.sh`` script (see :ref:`Fortran Makefile <ref-convert>` for more information).
      274The makefiles can be found at ``flex_extract_vX.X/Source/Fortran/``, where ``vX.X`` should be substituted by the current flex_extract version number.
    274275
    275276.. caution::   
     
    277278   ``makefiles`` if other than standard paths are used.
    278279
    279 So starting from the root directory of ``flex_extract``,
    280 go to the ``Fortran`` source directory and open the ``makefile`` of your
    281 choice to modify with an editor of your choice. We use the ``nedit`` in this case.
      280Thus, go to the ``Fortran`` source directory and open the ``makefile`` of your
      281choice, and check / modify it with an editor:
    282282
    283283.. code-block:: bash
     
    286286   nedit makefile_fast
    287287 
    288 Edit the paths to the ``eccodes`` library on your local machine.
    289 
     288Set the paths to the ``eccodes`` library on your local machine, if necessary.
    290289
    291290.. caution::
     
    302301   to find out the path to the ``eccodes`` library.
    303302   
    304 Substitute these paths in the ``makefile`` for parameters **ECCODES_INCLUDE_DIR**
    305 and **ECCODES_LIB** and save it.
     303Assign these paths to the parameters **ECCODES_INCLUDE_DIR**
     304and **ECCODES_LIB** in the makefile, and save it.
    306305
    307306.. code-block:: bash
    308307
    309    # these are the paths on a current Debian 10 Testing system (May 2019)
     308   # these are the paths on Debian Buster:
    310309   ECCODES_INCLUDE_DIR=/usr/lib/x86_64-linux-gnu/fortran/gfortran-mod-15/
    311310   ECCODES_LIB= -L/usr/lib -leccodes_f90 -leccodes -lm 
     
    313312   
    314313The Fortran program called ``calc_etadot`` will be compiled during the
    315 installation process.Therefore the name of the ``makefile`` to be used needs to be given in  ``setup.sh``.
      314installation process. Therefore, the name of the ``makefile`` to be used needs to be given in ``setup.sh``.
    316315
    317316In the root directory of ``flex_extract``, open the ``setup.sh`` script
    318 and adapt the installation parameters in the section labelled with
    319 "AVAILABLE COMMANDLINE ARGUMENTS TO SET" like shown below.
     317with an editor and adapt the installation parameters in the section labelled with
     318"AVAILABLE COMMANDLINE ARGUMENTS TO SET" as shown below:
    320319
    321320
  • For_developers/Sphinx/source/dev_guide.rst

    rb1674ed rf20342a  
    66.. note::
    77
    8   This section still needs to be done.
     8  This section still needs to be written.
    99   
    10 .. repository (how /who manages the code, where to get)
     10.. repository (how / who manages the code, where to get)
    1111   
    1212   
  • For_developers/Sphinx/source/documentation.rst

    rb1674ed rf20342a  
    33*************
    44       
    5     Overview (Under construction)
     5    Overview (under construction)
    66     
    7     Control & Input Data
     7    Control & input data
    88   
    9     Output Data (Under construction)
     9    Output data (under construction)
    1010   
    11     Disaggregation of Flux Data (Under construction)
     11    Disaggregation of flux data (under construction)
    1212   
    13     Vertical Coordinate (Under construction)
    14       - Methods (GAUSS, ETA, OMEGA)
     13    Vertical coordinate (under construction)
     14      - methods (GAUSS, ETA, OMEGA)
    1515      - calc_etadot
    1616   
    17     Auto Generated Documentation
     17    Auto-generated documentation
    1818      - Python
    19       - Fortran (Under construction)
     19      - Fortran (under construction)
    2020
    2121   
  • For_developers/Sphinx/source/ecmwf_data.rst

    rba99230 rf20342a  
    77
    88
    9 The European Centre for Medium-Range Weather Forecasts (`ECMWF`_), based in Reading, UK, is an independent intergovernmental organisation supported by 34 states. It is both a research institute and a full time operational service. It produces global numerical weather predictions and some other data which is fully available to the national meteorological services in the `Member States`_, Co-operating States and the broader community. Especially, the published re-analysis datasets are made available to the public with some limits in specific datasets.
      9The European Centre for Medium-Range Weather Forecasts (`ECMWF`_), based in Reading, UK, is an independent intergovernmental organisation supported by 34 states. It is both a research institute and a 24 h / 7 d operational service. It produces global numerical weather predictions and some other data which are fully available to the national meteorological services in the `Member States`_, Co-operating States, and to some extent to the broader community. Specifically, re-analysis data sets are made available to the public, albeit with some limitations for specific data sets.
    1010
    11 The amount and structure of the available data from ECMWF is very complex. The operational data changes regularly in time and spatial resolution, physics and parameter. This has to be taken into account carefully and each user has to investigate his dataset of interest carefully before selecting and retrieving it with ``flex_extract``.
    12 The re-analysis datasets are consistent in all the above mentioned topics over their whole period but they have each their own specialities which makes treatment with ``flex_extract`` special in some way. For example, they have different starting times for their forecasts or different parameter availability. They also have differences in time and spatial resolution and most importantly for ``flex_extract`` they are different in the way of providing the vertical coordinate.
      11There is a vast amount of data with a complex structure available from ECMWF. The operational data undergo changes with respect to temporal and spatial resolution, model physics, and the parameters available. This has to be taken into account carefully, and every user should have a clear idea of the data set intended to be used before retrieving it with ``flex_extract``.
      12Each re-analysis data set is homogeneous with respect to resolution etc., but the different re-analyses all have specific properties which require a corresponding treatment with ``flex_extract``. For example, the starting times of the forecasts may be different, or the availability of parameters (model output variables) may vary. They also differ in their temporal and spatial resolution, and - most importantly for ``flex_extract`` - in the way in which the vertical wind component is provided.
    1313
    14 There is much to learn from ECMWF and their datasets and data handling and this might be confusing at first. We therefore collected the most important information for ``flex_extract`` users. In the following sections the user can use them to get to know enough to understand how ``flex_extract`` is best used and to select the parameters of the ``CONTROL`` files.
      14As there is much to learn about ECMWF and its data sets and data handling, it might be confusing at first. Therefore, we have collected here the information most important for ``flex_extract`` users. Study the following sections to learn how ``flex_extract`` is best used, and to select the right parameters in the ``CONTROL`` files.
    1515
    1616
    1717:doc:`Ecmwf/access`
    18     Description of available access methods to the ECMWF data.
      18    Description of the available methods to access the ECMWF data.
    1919
    2020:doc:`Ecmwf/msdata`
    21     Information about available data and parameters for member state users which can be retrieved with ``flex_extract``
     21    Information about available data and parameters for member-state users which can be retrieved with ``flex_extract``
    2222
    2323:doc:`Ecmwf/pubdata`
    24     Information about available data and parameters for the public datasets which can be retrieved with ``flex_extract``
     24    Information about available data and parameters for the public data sets which can be retrieved with ``flex_extract``
    2525
    2626:doc:`Ecmwf/hintsecmwf`
    27     Collection of hints to best find information to define the dataset for retrievement and
    28     to define the ``CONTROL`` files.
     27    Collection of hints to best find information to define the data set for retrieval, and
     28    to define the content of the ``CONTROL`` files.
    2929
    3030:doc:`Ecmwf/ec-links`
    31     Link collection for additional and useful information as well as references to specific dataset publications.
     31    Link collection for additional and useful information as well as references to publications on specific data sets.
    3232
    3333
  • For_developers/Sphinx/source/evaluation.rst

    rb1674ed rf20342a  
    66.. note::
    77
    8   This section in the online documentation still needs to be done.
    9   Currently, evaluation methods and information can be found in the `flex_extract discussion paper <https://www.geosci-model-dev-discuss.net/gmd-2019-358/>`_ of the Geoscientific Model Development journal.
     8  This section still needs to be written.
     9  Currently, evaluation methods can be found in the `flex_extract discussion paper <https://www.geosci-model-dev-discuss.net/gmd-2019-358/>`_ of the journal Geoscientific Model Development.
    1010 
    1111 
  • For_developers/Sphinx/source/quick_start.rst

    rb936fd3 rf20342a  
    263263    CONTROL_OD.OPER.FC.gauss.highres 
    264264    CONTROL_OD.OPER.FC.operational           
    265     CONTROL_OD.OPER.FC.twiceaday.1hourly
    266     CONTROL_OD.OPER.FC.twiceaday.3hourly
     265    CONTROL_OD.OPER.FC.twicedaily.1hourly
     266    CONTROL_OD.OPER.FC.twicedaily.3hourly
    267267   
    268268   
     
    277277
    278278                   
    279 A common problem for beginners in retrieving ECMWF datasets is the mismatch in the definition of these parameters. For example, if you would like to retrieve operational data before ``June 25th 2013`` and set the maximum level to ``137`` you will get an error because this number of levels was first introduced at this effective day. So, be cautious in the combination of space and time resolution as well as the field types which are not available all the time.
     279A common problem for beginners in retrieving ECMWF datasets is a mismatch in the choice of values for these parameters. For example, if you try to retrieve operational data for 24 June 2013 or earlier and set the maximum level to 137, you will get an error because this number of levels was introduced only on 25 June 2013. Thus, be careful in the combination of space and time resolution as well as the field types.
    280280
    281281
    282282.. note::
    283283
    284     Sometimes it might not be clear how specific parameters in the control file must be set in terms of format. Please see the description of the parameters in section `CONTROL parameters <Documentation/Input/control_params.html>`_ or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_
    285 
    286 
    287 In the following we shortly discuss the main retrieval opportunities of the different datasets and  categoize the ``CONTROL`` files.   
     284    Sometimes it might not be clear how specific parameters in the control file must be set in terms of format. Please consult the description of the parameters in section `CONTROL parameters <Documentation/Input/control_params.html>`_ or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_
     285
     286In the following, we briefly discuss the typical retrievals for the different datasets and point to the respective ``CONTROL`` files.   
    288287                   
    289288     
     
    291290---------------         
    292291
    293 The main difference in the definition of a ``CONRTOL`` file for a public dataset is the setting of the parameter ``DATASET``. This specification enables the selection of a public dataset in MARS. Otherwise the request would not find the dataset.
     292The main characteristic in the definition of a ``CONTROL`` file for a public dataset is the parameter ``DATASET``. Its specification enables the selection of a public dataset in MARS. Without this parameter, the request would not find the dataset.
    294293For the two public datasets *CERA-20C* and *ERA-Interim* an example file with the ending ``.public`` is provided and can be used straightaway.
    295294
     
    299298    CONTROL_EI.public     
    300299
    301 For *CERA-20C* it seems that there are no differences in the dataset against the full dataset, while the *public ERA-Interim* has only analysis fields every 6 hour without filling forecasts in between for model levels. Therefore it is only possible to retrieve 6-hourly data for *public ERA-Interim*.
      300For *CERA-20C*, there seem to be no differences compared to the full dataset, whereas the *public ERA-Interim* has only 6-hourly analysis fields on model levels, without forecasts to fill the gaps in between. Therefore, it is only possible to retrieve 6-hourly data for *public ERA-Interim*.
    302301
    303302.. note::
    304303
    305     In general, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. As soon as this is possible it will be announced at the community website and per newsletter.
     304    In principle, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. As soon as this is possible it will be announced at the community website and on the FLEXPART user email list.
    306305                     
    307306
     
    309308----
    310309
    311 For this dataset it is important to keep in mind that the dataset is available for the period 09/1901 until 12/2010 and the temporal resolution is limited to 3-hourly fields.
    312 It is also a pure ensemble data assimilation dataset and is stored under the ``enda`` stream. It has ``10`` ensemble members. The example ``CONTROL`` files will only select the first member (``number=0``). You may change this to another number or a list of numbers (e.g. ``NUMBER 0/to/10``).
    313 Another important difference to all other datasets is the forecast starting time which is 18 UTC. Which means that the forecast in *CERA-20C* for flux fields is  12 hours long. Since the forecast extends over a single day we need to extract one day in advance and one day subsequently. This is automatically done in ``flex_extract``.
     310For this dataset, it is important to keep in mind that it is available for the period 09/1901 until 12/2010, and that the temporal resolution is limited to 3 h.
     311It is also a pure ensemble data assimilation dataset and is stored under the ``enda`` stream. There are 10 ensemble members. The example ``CONTROL`` files will only select the first member (``number=0``). You may change this to another number or a list of numbers (e.g. ``NUMBER 0/to/10``).
      312Another important difference to all other datasets is that the forecast starting time is 18 UTC. This means that the forecasts for the flux fields cover 12 hours each. Since these forecast periods cross the date boundary, one day before and one day after the target period have to be extracted as well; this is done automatically by ``flex_extract``.
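As a sketch only, the settings discussed above could look like this in a ``CONTROL`` file excerpt (illustrative values, not one of the shipped files; the comment lines are for the reader and would not appear in an actual ``CONTROL`` file. Please compare with the shipped ``CONTROL_CERA`` examples before use):

.. code-block:: bash

    # CERA-20C: 3-hourly fields, control run plus all 10 ensemble members,
    # flux forecasts starting at 18 UTC with a 12 h maximum forecast step
    DTIME 3
    NUMBER 0/to/10
    ACCTYPE FC
    ACCTIME 18
    ACCMAXSTEP 12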
    314314
    315315
     
    317317-----
    318318
    319 This is the newest re-analysis dataset and has a temporal resolution of 1-hourly analysis fields. Up to date it is available until April 2019 with regular release of new months.
    320 The original horizontal resolution is ``0.28125°`` which needs some caution in the definition of the domain, since the length of the domain in longitude or latitude direction  must be an exact multiple of the resolution. It might be easier for users to use ``0.25`` for the resolution which MARS will automatically interpolate.
    321 The forecast starting time is ``06/18 UTC`` which is important for the flux data. This should be set in the ``CONTROL`` file via the ``ACCTIME 06/18`` parameter in correspondence with ``ACCMAXSTEP 12`` and ``ACCTYPE FC``.
      319This is the latest re-analysis dataset; it has a temporal resolution of 1 h (analysis fields). At the time of writing, it is available until April 2019, with new months being released regularly.
     320The original horizontal resolution is 0.28125° which needs some caution in the definition of the domain, since the length of the domain in longitude or latitude direction  must be an integer multiple of the resolution. It is also possible to use ``0.25`` for the resolution; MARS will then automatically interpolate to this resolution which is still close enough to be acceptable.
     321The forecast starting time is ``06/18 UTC`` which is important for the flux data. Correspondingly, one should set in the ``CONTROL`` file ``ACCTIME 06/18``, ``ACCMAXSTEP 12``, and ``ACCTYPE FC``.
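These recommendations can be summarised in a ``CONTROL`` file excerpt (an illustrative sketch following the text above; the comment lines are explanatory only and would not appear in an actual ``CONTROL`` file):

.. code-block:: bash

    # ERA5: 1-hourly fields, 0.25 deg grid (interpolated by MARS),
    # flux forecasts starting at 06 and 18 UTC, 12 h maximum step
    DTIME 1
    GRID 0.25
    ACCTYPE FC
    ACCTIME 06/18
    ACCMAXSTEP 12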
    322322
    323323.. note::
    324324
    325     We know that *ERA5* also has an ensemble data assimilation system but this is not yet retrievable with ``flex_extract`` since the deaccumulation of the flux fields works differently in this stream. Ensemble retrieval for *ERA5* is a future ToDo.
     325    *ERA5* also includes an ensemble data assimilation system but related fields are not yet retrievable with ``flex_extract`` since the deaccumulation of the flux fields works differently in this stream. Ensemble field retrieval for *ERA5* is a *to-do* for the future.
    326326
    327327
     
    330330-----------
    331331
    332 This re-analysis dataset will exceed its end of production at 31st August 2019!
    333 It is then available from 1st January 1979 to 31st August 2019. The ``etadot`` is not available in this dataset. Therefore ``flex_extract`` must select the ``GAUSS`` parameter to retrieve the divergence field in addition. The vertical velocity is the calculated with the continuity equation in the Fortran program ``calc_etadot``. Since the analysis fields are only available for every 6th hour, the dataset can be made 3 hourly by adding forecast fields in between. No ensemble members are available.
    334 
      332The production of this re-analysis dataset was stopped on 31 August 2019!
     333It is available for the period from 1 January 1979 to 31 August 2019. The ``etadot`` parameter is not available in this dataset. Therefore, one must use the ``GAUSS`` parameter, which retrieves the divergence field in addition and calculates the vertical velocity from the continuity equation in the Fortran program ``calc_etadot``. While the analysis fields are only available for every 6th hour, the dataset can be made 3-hourly by adding forecast fields in between. No ensemble members are available.
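In a ``CONTROL`` file, this translates into settings like the following (a sketch based on the text above; the comment lines are for the reader only. Please compare with the shipped ``CONTROL_EI`` examples):

.. code-block:: bash

    # ERA-Interim: etadot is not available, so retrieve spectral fields
    # and compute the vertical velocity via the continuity equation
    GAUSS 1
    ETA 0
    # 3-hourly resolution, forecasts filling the gaps between analyses
    DTIME 3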
    335334
    336335   
     
    338337----------------
    339338
    340 This is the real time atmospheric model in high resolution with a 10-day forecast. This means it underwent regular adaptations and improvements over the years. Hence, retrieving data from this dataset needs extra attention in selecting correct settings of parameter. See :ref:`ref-tab-dataset-cmp` for the most important parameters.
    341 Nowadays, it is available 1 hourly by filling the gaps of the 6 hourly analysis fields with 1 hourly forecast fields. Since 4th June 2008 the eta coordinate is directly available so that ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°`` and in combination with ``137`` vertical levels can lead to troubles in retrieving this high resolution dataset in terms of job duration and quota exceedence.
    342 It is recommended to submit such high resolution cases for single day retrievals (see ``JOB_CHUNK`` parameter in ``run.sh`` script) to avoid job failures due to exceeding limits.   
    343 
    344 ``CONTROL`` files for normal daily retrievals with a mix of analysis and forecast fields are listed below:
     339This data set provides the output of the real-time atmospheric model runs in high resolution, including 10-day forecasts. The model undergoes frequent adaptations and improvements. Thus, retrieving data from this dataset requires extra attention in selecting correct settings of the parameters. See :ref:`ref-tab-dataset-cmp` for the most important parameters.
     340Currently, fields can be retrieved at 1 h temporal resolution by filling the gaps between analysis fields with 1-hourly forecast fields. Since 4 June 2008, the eta coordinate vertical velocity is directly available from MARS, therefore ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°`` and in combination with ``137`` vertical levels can lead to problems in terms of job duration and disk space quota.
     341It is recommended to submit such high resolution cases as single day retrievals (see ``JOB_CHUNK`` parameter in ``run.sh`` script) to avoid job failures due to exceeding limits.   
     342
     343``CONTROL`` files for standard retrievals with a mix of analysis and forecast fields are listed below:
    345344
    346345.. code-block:: bash
     
    351350    CONTROL_OD.OPER.FC.gauss.highres 
    352351   
    353 These files defines the minimum number of parameters necessary to retrieve a daily subset. The setup of field types is optimal and should only be changed if the user understands what he does. The grid, domain and temporal resolution can be changed according to availability.     
    354    
    355 
      352These files define the minimum number of parameters necessary to retrieve a daily subset. The given settings for the ``TYPE`` parameter are already optimised, and should only be changed if you know what you are doing. Grid, domain, and temporal resolution may be changed according to availability.
     353   
    356354
    357355.. note::
    358356
    359      Please see `Information about MARS retrievement <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ to get to know hints about retrieval efficiency and troubleshooting.
    360  
     357     Please see `Information about MARS retrievement <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ for hints about retrieval efficiency and troubleshooting.
    361358   
    362359
    363360Pure forecast
    364     It is possible to retrieve pure forecasts exceeding a day. The forecast period available depends on the date and forecast field type. Please use MARS catalogue to check the availability. Below are some examples for 36 hour forecast of *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)*.
    365     The *CV* field type was only available 3-hourly from 2006 up to 2016. It is recommended to use the *CF* type since this is available from 1992 (3-hourly) on up to today in 1-hourly temporal resolution. *CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)* which contain 50 ensemble members.
    366     Please be aware that in this case it is necessary to set the specific type for flux fields explicitly, otherwise it could select a default value which might be different from what you expect!
      361    It is possible to retrieve pure forecasts exceeding one day. The forecast period available depends on the date and forecast field type. Please use the MARS catalogue to check the availability. Below are some examples for 36-hour forecasts of *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)*.
      362    The *CV* field type was only available 3-hourly from 2006 up to 2016. It is recommended to use the *CF* type since this is available from 1992 (3-hourly) up to the present (1-hourly). *CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)* which currently works with 50 ensemble members.
     363    Please be aware that in this case it is necessary to set the type for flux fields explicitly, otherwise a default value might be selected, different from what you expect!
    367364   
    368365    .. code-block:: bash
     
    373370
    374371
    375 
    376372Half-day retrievals
    377     If a forecast for just half a day is wanted it can be done by substituting the analysis fields also by forecast fields as shown in files with ``twiceaday`` in it. They produce a full day retrieval with pure 12 hour forecasts twice a day. It is also possible to use the operational version which would get the time information from ECMWF's environmental variables and therefore get the newest forecast per day. This version uses a ``BASETIME`` parameter which tells MARS to extract the exact 12 hours upfront to the selected date. If the ``CONTROL`` file with ``basetime`` in the filename is used this can be done for any other date too.
      373    If a forecast is wanted for half a day only, this can be done by replacing the analysis fields with forecast fields, as shown in the files with ``twicedaily`` in their name. These produce a full-day retrieval with pure 12-hour forecasts, twice a day. It is also possible to use the operational version which obtains the time information from ECMWF's environment variables and therefore uses the newest forecast for each day. This version uses a ``BASETIME`` parameter which tells MARS to extract the exact 12 hours up to the selected date. If the ``CONTROL`` file with ``basetime`` in the filename is used, this can be done for any other date, too.
    378374   
    379375    .. code-block:: bash
     
    381377        CONTROL_OD.OPER.FC.eta.basetime
    382378        CONTROL_OD.OPER.FC.operational           
    383         CONTROL_OD.OPER.FC.twiceaday.1hourly
    384         CONTROL_OD.OPER.FC.twiceaday.3hourly
    385 
    386 
     379        CONTROL_OD.OPER.FC.twicedaily.1hourly
     380        CONTROL_OD.OPER.FC.twicedaily.3hourly
    387381
    388382
    389383Ensemble members
    390     The retrieval of ensemble members were already mentioned in the pure forecast section and for *CERA-20C* data.
    391     In this ``flex_extract`` version there is an additional possibility to retrieve the *Ensemble Long window Data Assimilation (ELDA)* stream from the real-time dataset. This model version has (up to May 2019) 25 ensemble members and a control run (``number 0``). Starting from June 2019 it has 50 ensemble members. Therefore we created the possibility to double up the 25 ensemble members (before June 2019) to 50 members by taking the original 25 members from MARS and subtracting 2 times the difference between the member value and the control value. This is done by selecting the parameter ``DOUBLEELDA`` and set it to ``1``.
    392      
     384    The retrieval of ensemble members was already mentioned in the pure forecast section and for *CERA-20C* data.
      385    This ``flex_extract`` version makes it possible to retrieve the *Ensemble Long window Data Assimilation (ELDA)* stream from the operational dataset. Until May 2019, there were 25 ensemble members and a control run (``number 0``). Starting with June 2019, the number of ensemble members has been increased to 50. Therefore, we added the option to generate 25 additional "pseudo-ensemble members" for periods before June 2019: the original 25 members from MARS are taken, and the difference between the member value and the control value is subtracted twice. This is done if the parameter ``DOUBLEELDA`` is included and set to ``1``.
    393386   
    394387    .. code-block:: bash
     
    396389        CONTROL_OD.ELDA.FC.eta.ens.double   
    397390        CONTROL_OD.ENFO.PF.ens
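    The construction of the pseudo-members can be sketched as follows (a Python illustration of the rule stated above, not code from ``flex_extract``; each pseudo-member mirrors an original member about the control run):

    .. code-block:: python

        # Pseudo-member rule: subtract the member-control difference twice,
        # i.e. pseudo = member - 2*(member - control) = 2*control - member.
        def double_elda(control, members):
            """Mirror each ensemble member value about the control run."""
            return [2.0 * control - m for m in members]

        control = 10.0               # control-run value at one grid point
        members = [12.0, 9.5]        # two of the 25 original members
        print(double_elda(control, members))  # [8.0, 10.5]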
    398 
    399 
    400391   
    401392   
     
    404395
    405396rrint
    406     Decides if the precipitation flux data uses the old (``0``) or new (``1``) disaggregation scheme. See :doc:`Documentation/disagg` for explanaition.
     397    Selects the disaggregation scheme for precipitation flux: old (``0``) or new (``1``). See :doc:`Documentation/disagg` for explanation.
    407398cwc
    408     Decides if the total cloud water content will be retrieved (set to ``1``) in addition. This is the sum of cloud liquid and cloud ice water content.
     399    If present and set to ``1``, the total cloud water content will be retrieved in addition. This is the sum of cloud liquid and cloud ice water content.
    409400addpar
    410     With this parameter an additional list of 2-dimensional, non-flux parameters can be retrieved. Use format ``param1/param2/.../paramx`` to list the parameters. Please be consistent in using either the parameter IDs or the short names.
      401    With this parameter, an additional list of 2-dimensional, non-flux parameters can be retrieved. Use the format ``param1/param2/.../paramx`` to list the parameters. Please be consistent in using either the parameter IDs or the short names as defined by MARS.
    411402doubleelda
    412     Use this to double the ensemble member number by adding further disturbance to each member.
     403    Use this to double the ensemble member number by adding further disturbance to each member (to be used with 25 members).
    413404debug
    414     If set to ``1`` all temporary files were kept at the end. Otherwise everything except the final output files will be deleted.
     405    If set to ``1``, all temporary files are preserved. Otherwise, everything except the final output files will be deleted.
    415406request
    416407    This produces an extra *csv* file ``mars_requests.csv`` where the content of each mars request of the job is stored. Useful for debugging and documentation.
    417408mailfail
    418     At default the mail is send to the mail connected with the user account. Add additional email addresses if you want. But as soon as you enter a new mail, the default will be overwritten. If you would like to keep the mail from your user account, please add ``${USER}`` to the list ( comma seperated ) or mail addresses.
    419      
    420        
    421        
    422 Hints for definition of some parameter combinations
    423 ---------------------------------------------------
    424 
    425 Field types and times
    426     This combination is very important. It defines the temporal resolution and which field type is extracted per time step.
    427     The time declaration for analysis (AN) fields uses the times of the specific analysis and (forecast time) steps have to be ``0``. The forecast field types (e.g. FC, CF, CV, PF) need to declare a combination of (forescast start) times and the (forecast) steps. Both of them together defines the actual time step. It is important to know the forecast starting times for the dataset to be retrieved, since they are different. In general it is enough to give information for the exact time steps, but it is also possible to have more time step combinations of ``TYPE``, ``TIME`` and ``STEP`` because the temporal (hourly) resolution with the ``DTIME`` parameter will select the correct combinations.
    428 
    429     .. code-block:: bash
    430        :caption: Example of a setting for the field types and temporal resolution.
     409    As a default, e-mails are sent to the mail address connected with the user account. It is possible to overwrite this by specifying one or more e-mail addresses (comma-separated list). In order to include the e-mail associated with the user account, add ``${USER}`` to the list.
     410       
     411       
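Put together, these optional parameters might appear in a ``CONTROL`` file as follows (an illustrative sketch; the ``ADDPAR`` IDs and the second e-mail address are placeholders, and the comment lines are for the reader only):

.. code-block:: bash

    RRINT 1                # use the new precipitation disaggregation scheme
    CWC 1                  # also retrieve the total cloud water content
    ADDPAR 167/168         # placeholder list of extra 2-D parameters
    DEBUG 1                # keep all temporary files
    REQUEST 1              # write mars_requests.csv
    MAILFAIL ${USER},colleague@example.com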
     412Hints for proper definition of certain parameter combinations
     413-------------------------------------------------------------
     414
     415Field type and time
     416    This combination is very important. It defines the temporal resolution and which field type is extracted on each time step.
     417    The time declaration for analysis (AN) fields uses the times of the specific analysis while the (forecast time) step has to be ``0``.
      418    The forecast field types (e.g., FC, CF, CV, PF) require a combination of (forecast start) time and (forecast) step; together, they define the actual time. It is important to know the forecast starting times of the dataset to be retrieved, since they differ between datasets. In general, it is sufficient to specify exactly the time steps needed, but it is also possible to list additional combinations of ``TYPE``, ``TIME``, and ``STEP``; the temporal (hourly) resolution given by the ``DTIME`` parameter then selects the correct combinations.
     420
     421    .. code-block:: bash
     422       :caption: Example of a setting for the field types and temporal resolution. It will retrieve 3-hourly fields, with analyses at 00 and 12 UTC and the corresponding forecasts inbetween.
    431423
    432424        DTIME 3
     
    437429 
    438430Vertical velocity           
    439     The vertical velocity for ``FLEXPART`` is not directly available from MARS. Therefore it has to be calculated. There are a couple of different options. The following parameters are responsible for the selection. See :doc:`Documentation/vertco` for a detailed explanation. The ``ETADIFF``, ``OMEGA`` and ``OMEGADIFF`` versions are only recommended for debugging and testing reasons. Usually it is a decision between ``GAUSS`` and ``ETA``, where for ``GAUSS`` spectral fields of the horizontal wind fields and the divergence are to be retrieved and used with the continuity equation to calculate the vertical velocity. For ``ETA`` the latitude/longitude fields of horizontal wind fields and eta-coordinate are to be retrieved. It is recommended to use ``ETA`` where possible due to a reduced computation time. 
    440 
    441     .. code-block:: bash
    442         :caption: Example setting for the vertical coordinate retrieval.
     431    The vertical velocity for ``FLEXPART`` is not directly available from MARS and has to be calculated.
      432    There are several options for this, and the following parameters control the selection; see :doc:`Documentation/vertco` for a detailed explanation. Using ``ETADIFF 1``, ``OMEGA 1``, and ``OMEGADIFF 1`` is recommended for debugging and testing only.
      433    Usually, one has to decide between ``GAUSS 1`` and ``ETA 1``. ``GAUSS 1`` means that spectral fields of the horizontal winds and the divergence are retrieved, and that the vertical velocity is calculated using the continuity equation. ``ETA 1`` means that the horizontal wind fields and etadot are retrieved on a regular lat-lon grid. It is recommended to use ``ETA 1`` where possible, as solving the continuity equation incurs a substantial computational overhead.
     434
     435    .. code-block:: bash
     436        :caption: Example setting for the vertical coordinate retrieval (recommended if etadot fields are available).
    443437       
    444438        GAUSS 0
     
    451445
    452446Grid resolution and domain
    453     The grid and domain selection depends on each other. The grid can be defined in the format of normal degrees (e.g. ``1.``) or as in older versions by 1/1000. degrees (e.g. ``1000`` for ``1°``).
    454     After selecting the grid, the domain has to be defined in a way that the length of the domain in longitude or latitude direction  must be an exact multiple of the grid.
    455     The horizontal resolution for spectral fields will be set by the parameter ``RESOL``. For information about how to select an appropriate value you can read the explanation of the MARS keyword `here <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and in `this table  <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_.
    456    
    457     .. code-block:: bash
    458         :caption: Example setting for a northern hemisphere domain with a grid of ``0.25°``.
      447    The grid and domain parameters depend on each other. ``GRID`` refers to the grid resolution. It can be given as a decimal value (e.g., ``1.`` meaning 1.0°) or, as in previous versions of flex_extract, as an integer value referring to 1/1000 degrees (e.g., ``1000`` also means 1°). The code applies common sense to determine which format is to be assumed.
      448    After selecting the grid, the domain has to be defined by its boundaries in degrees latitude and longitude. The extension in longitude or latitude direction must be an integer multiple of ``GRID``.
     450    The horizontal resolution for spectral fields is set by the parameter ``RESOL``. For information about how to select an appropriate value please read the explanation of the MARS keyword RESOL as found `in this entry of the ECMWF on-line documentation <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and  `this table (also ECMWF documentation) <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_.
     451   
     452    .. code-block:: bash
     453        :caption: Example setting for a domain covering the northern hemisphere domain with a grid resolution of ``0.25°``.
    459454   
    460455        GRID 0.25
     
    468463
    469464Flux data
    470     The flux fields are accumulated forecast fields all the time. Since some re-analysis dataset nowadays have complete set of analysis fields in their temporal resolution it was important to define a new parameter set to define the flux fields since the information could not be taken from ``TYPE``, ``TIME`` and ``STEP`` any longer. Select a forecast field type ``ACCTYPE``, the forecast starting time ``ACCTIME`` and the maximum forecast step ``ACCMAXSTEP``. The ``DTIME`` parameter defines the temporal resolution for the whole period.
      465    Flux fields are always forecast fields and contain the values of the fluxes accumulated since the start of the respective forecast. As certain re-analysis datasets cover all time steps with analysis fields, it was necessary to define a separate parameter set for the flux fields. The following parameters are used specifically for flux fields, if provided: ``ACCTYPE`` is the field type (must be a forecast type), ``ACCTIME`` the forecast starting time, ``ACCMAXSTEP`` the maximum forecast step, and ``DTIME`` the temporal resolution. ``ACCTYPE`` is assumed to be the same during the whole period given by ``ACCTIME`` and ``ACCMAXSTEP``.
    471466   
    472467    .. code-block:: bash
    473468       :caption: Example setting for the definition of flux fields.
    474470   
    475471        DTIME 3
     
    478474        ACCMAXSTEP 36
    479475
    480 
    481476   
    482477.. toctree::