Changeset f20342a in flex_extract.git
- Timestamp:
- May 27, 2020, 8:01:54 PM (4 years ago)
- Branches:
- master, ctbto, dev
- Children:
- 550435b
- Parents:
- a14839a
- Files:
-
- 36 edited
Documentation/html/Documentation/Api/api_fortran.html
rb1674ed rf20342a 184 184 <div class="section" id="fortran-s-auto-generated-documentation"> 185 185 <h1>Fortran’s Auto Generated Documentation<a class="headerlink" href="#fortran-s-auto-generated-documentation" title="Permalink to this headline">¶</a></h1> 186 <p>Link to other documentation!</p> 187 <p>…. f:autoprogram:: preconvert</p> 186 <p><a class="reference external" href="Fortran/index.html">Fortran API</a></p> 188 187 <div class="toctree-wrapper compound"> 189 188 </div> -
For_developers/Sphinx/source/Documentation/Api/api_fortran.rst
rba99230 rf20342a 1 1 ************************************** 2 Fortran's Auto Generated Documentation 2 Auto-generated documentation for the Fortran programme 3 3 ************************************** 4 4 … … 6 6 :local: 7 7 8 9 10 Link to other documentation! 8 9 `Fortran API <Fortran/index.html>`_ 11 10 12 13 14 15 .... f:autoprogram:: preconvert16 17 18 19 11 20 12 .. toctree:: 21 13 :hidden: 22 14 :maxdepth: 2 23 24 25 26 -
For_developers/Sphinx/source/Documentation/Api/api_python.rst
rba99230 rf20342a 1 1 ************************************* 2 Python's Auto Generated Documentation 2 Auto-generated documentation for the Python scripts 3 3 ************************************* 4 4 -
For_developers/Sphinx/source/Documentation/Input/changes.rst
rba99230 rf20342a 8 8 - comments available with ``#`` 9 9 - only parameters which are needed to override the default values are necessary 10 - number of type/step/time elements do not have to be 24 any more. Just select the intervalyou need.11 - the ``dtime`` parameter needs to be consistent with ``type/step/time`` . For example ``dtime`` can be coarser as ``time`` intervals areavailable, but not finer.10 - number of type/step/time elements does not have to be 24 anymore. Just provide what you need. 11 - the ``dtime`` parameter needs to be consistent with ``type/step/time``, for example, ``dtime`` can be coarser than the ``time`` intervals available, but not finer. 12 12 13 13 -
For_developers/Sphinx/source/Documentation/Input/compilejob.rst
rb1674ed rf20342a 1 1 ******************************************** 2 The Compilation Jobscript ``compilejob.ksh``2 The compilation job script ``compilejob.ksh`` 3 3 ******************************************** 4 4 5 The compile job is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.5 The compile job is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory. 6 6 7 ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate7 ``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate 8 8 the Korn-shell script from the template files by substituting the individual parameters. 9 9 These individual parameters are marked by a doubled ``$`` sign in ``job.temp``. 10 10 11 The job script has a number of settings for the batch system which are fixed anddifferentiates between the *ecgate* and the *cca/ccb*11 The job script has a number of settings for the batch system which are fixed, and it differentiates between the *ecgate* and the *cca/ccb* 12 12 server system to load the necessary modules for the environment when submitted to the batch queue. 13 13 … … 19 19 ------------------------------------ 20 20 21 #. It sets necessary batchsystem parameters21 #. It sets the necessary batch-system parameters 22 22 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules 23 #. It sets some environment variab els for the single session23 #. It sets some environment variables for the single session 24 24 #. It creates the ``flex_extract`` root directory in the ``$HOME`` path of the user 25 #. It untars the tar -ball into the root directory.26 #. 
It compiles the Fortran program s's``Makefile``.27 #. At the end it checks if the script returned an error or not and send the log file via emailto the user.25 #. It untars the tarball into the root directory. 26 #. It compiles the Fortran program using ``Makefile``. 27 #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user. 28 28 29 29 -
For_developers/Sphinx/source/Documentation/Input/control.rst
rb1674ed rf20342a 10 10 11 11 This file is an input file for :literal:`flex_extract's` main script :literal:`submit.py`. 12 It contains the controlling parameters :literal:`flex_extract` needs to decide on dataset specifications,13 handling of the retrieved data and general bahaviour. The naming convention is usually (but not necessary):12 It contains the controlling parameters which :literal:`flex_extract` needs to decide on data set specifications, 13 handling of the data retrieved, and general behaviour. The naming convention is usually (but not necessarily): 14 14 15 15 :literal:`CONTROL_<Dataset>[.optionalIndications]` 16 16 17 The tested datasets are the operational dataset and the re-analysis datasets CERA-20C, ERA5and ERA-Interim.18 The optional extra indications for the re-analysis datasets mark the files for *public users*19 and *global* domain. For the operational data sets (*OD*)the file names contain also information of20 the stream, the field type for forecasts, the method for extracting the vertical coordinate and other things like timeor horizontal resolution.17 There are a number of data sets for which the procedures have been tested, the operational data and the re-analysis datasets CERA-20C, ERA5, and ERA-Interim. 18 The optional indications for the re-analysis data sets mark the files for *public users* 19 and *global* domain. For the operational data sets (*OD*), the file names contain also information of 20 the stream, the field type for forecasts, the method for extracting the vertical wind, and other information such as temporal or horizontal resolution. 21 21 22 22 … … 24 24 ---------------------------------- 25 25 The first string of each line is the parameter name, the following string(s) (separated by spaces) is (are) the parameter values. 26 The parameters can be sorted in any order with one parameter per line.26 The parameters can be listed in any order with one parameter per line. 27 27 Comments are started with a '#' - sign. 
Some of these parameters can be overruled by the command line 28 28 parameters given to the :literal:`submit.py` script. 29 All parameters have default values . Only those parameters which have to be changed30 mustbe listed in the :literal:`CONTROL` files.29 All parameters have default values; only those parameters which deviate from default 30 have to be listed in the :literal:`CONTROL` files. 31 31 32 32 … … 35 35 36 36 A number of example files can be found in the directory :literal:`flex_extract_vX.X/Run/Control/`. 37 They can be used as a template for adaptation s and understand what's possible to38 retrieve from ECMWF's archive.39 For each main dataset there is an example and additionally some variances in resolution, type of field or type of retrieving the vertical coordinate.37 They can be used as a template for adaptation, and to understand what can be 38 retrieved from ECMWF's archives. 39 There is an example for each main data set, and in addition, some variants with respect to resolution, type of field, or way of retrieving the vertical wind. 40 40 41 41 … … 45 45 ------------ 46 46 The file :literal:`CONTROL.documentation` documents the available parameters 47 in grouped sections with their default values. In :doc:`control_params` you can find a more 48 detailed description with additional hints, possible values and some useful information about 47 in grouped sections together with their default values. 48 In :doc:`control_params`, you can find a more 49 detailed description with additional hints, possible values, and further information about 49 50 the setting of these parameters. 50 51 -
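The CONTROL file format described in this hunk (first word is the parameter name, the following space-separated words are its values, `#` starts a comment, one parameter per line) can be illustrated with a minimal parser sketch. This is not flex_extract's actual code, and the sample parameter names are only illustrative:

```python
def parse_control(text):
    """Parse CONTROL-style content: one parameter per line, the first
    word is the parameter name, the remaining words are its value(s),
    and everything after '#' is a comment."""
    params = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # strip comments and blanks
        if not line:
            continue
        name, *values = line.split()
        # a single value stays a string, several values become a list
        params[name.upper()] = values[0] if len(values) == 1 else values
    return params

example = """\
# dates and times (illustrative parameter names)
START_DATE 20200101
TYPE AN FC AN FC
DTIME 3
"""
```

With this sketch, `parse_control(example)` yields a dictionary in which only the parameters that deviate from the defaults need to appear, matching the override behaviour described above.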
For_developers/Sphinx/source/Documentation/Input/control_params.rst
rb1674ed rf20342a 11 11 ************ 12 12 13 .. exceltable:: User parameter in CONTROL file13 .. exceltable:: User parameters in CONTROL file 14 14 :file: ../../_files/CONTROLparameter.xls 15 15 :sheet: UserSection … … 21 21 *************** 22 22 23 .. exceltable:: General parameter in CONTROL file23 .. exceltable:: General parameters in CONTROL file 24 24 :file: ../../_files/CONTROLparameter.xls 25 25 :sheet: GeneralSection … … 31 31 ************ 32 32 33 .. exceltable:: Time parameter in CONTROL file33 .. exceltable:: Time parameters in CONTROL file 34 34 :file: ../../_files/CONTROLparameter.xls 35 35 :sheet: TimeSection … … 42 42 ************ 43 43 44 .. exceltable:: Data parameter in CONTROL file44 .. exceltable:: Data parameters in CONTROL file 45 45 :file: ../../_files/CONTROLparameter.xls 46 46 :sheet: DataSection … … 53 53 ****************** 54 54 55 .. exceltable:: Data field parameter in CONTROL file55 .. exceltable:: Data field parameters in CONTROL file 56 56 :file: ../../_files/CONTROLparameter.xls 57 57 :sheet: DatafieldsSection … … 64 64 ***************** 65 65 66 .. exceltable:: Flux data parameter in CONTROL file66 .. exceltable:: Flux data parameters in CONTROL file 67 67 :file: ../../_files/CONTROLparameter.xls 68 68 :sheet: FluxDataSection … … 75 75 ************** 76 76 77 .. exceltable:: Domain parameter in CONTROL file77 .. exceltable:: Domain parameters in CONTROL file 78 78 :file: ../../_files/CONTROLparameter.xls 79 79 :sheet: DomainSection … … 99 99 *********************** 100 100 101 .. exceltable:: Additional data parameter in CONTROL file101 .. exceltable:: Additional data parameters in CONTROL file 102 102 :file: ../../_files/CONTROLparameter.xls 103 103 :sheet: AddDataSection -
For_developers/Sphinx/source/Documentation/Input/ecmwf_env.rst
rb1674ed rf20342a 1 1 **************************************** 2 ECMWF User Credential file ``ECMWF_ENV``2 ECMWF user credential file ``ECMWF_ENV`` 3 3 **************************************** 4 4 … … 16 16 ------------------------ 17 17 18 The following shows an example of the content of an ``ECMWF_ENV`` file:18 An example of the content of an ``ECMWF_ENV`` file is shown below: 19 19 20 20 .. code-block:: bash -
For_developers/Sphinx/source/Documentation/Input/examples.rst
rb1674ed rf20342a 3 3 ********************** 4 4 5 ``Flex_extract`` has a coupleof example ``CONTROL`` files for a number of different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.5 ``Flex_extract`` comes with a number of example ``CONTROL`` files for a number of different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``. 6 6 7 Here is a list of the example files and a description of the data set:7 Here is a list of the example files: 8 8 9 9 CONTROL.documentation 10 This file is not intended to be used with ``flex_extract``. It has a list of all possible parameters and their default values for a quick overview.10 This file is not intended to be used with ``flex_extract``. It just contains a list of all possible parameters and their default values for a quick overview. 11 11 12 12 .. code-block:: bash … … 33 33 CONTROL_OD.OPER.FC.gauss.highres 34 34 CONTROL_OD.OPER.FC.operational 35 CONTROL_OD.OPER.FC.twice aday.1hourly36 CONTROL_OD.OPER.FC.twice aday.3hourly35 CONTROL_OD.OPER.FC.twicedaily.1hourly 36 CONTROL_OD.OPER.FC.twicedaily.3hourly 37 37 38 38 #PS some information to be added. 39 39 40 .. toctree:: 40 41 :hidden: -
For_developers/Sphinx/source/Documentation/Input/fortran_makefile.rst
rb1674ed rf20342a 1 1 ************************************** 2 The Fortran Makefile -``calc_etadot``2 The Fortran makefile for ``calc_etadot`` 3 3 ************************************** 4 4 5 5 .. _ref-convert: 6 6 7 ``Flex_extract``'s Fortran programwill be compiled during8 the installation process to get the executable named ``calc_etadot``.7 The Fortran program ``calc_etadot`` will be compiled during 8 the installation process to produce the executable called ``calc_etadot``. 9 9 10 ``Flex_extract`` has a couple of ``makefiles`` preparedwhich can be found in the directory11 ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number.12 A list of these ``makefiles`` areshown below:10 ``Flex_extract`` includes several ``makefiles`` which can be found in the directory 11 ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted by the current flex_extract version number. 12 A list of these ``makefiles`` is shown below: 13 13 14 14 … … 16 16 | Files to be used as they are! 17 17 18 | **makefile_ecgate** 19 | For the use on ECMWF's server **ecgate**. 20 21 | **makefile_cray** 22 | For the use on ECMWF's server **cca/ccb**. 18 | **makefile_ecgate**: For use on ECMWF's server **ecgate**. 19 | **makefile_cray**: For use on ECMWF's server **cca/ccb**. 23 20 24 21 | **Local mode** 25 | It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB** 22 | It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB** if they don't correspond to the standard paths pre-set in the makefiles. 26 23 27 | **makefile_fast** 28 | For the use with gfortran compiler and optimization mode.24 | **makefile_fast**: For use with the gfortran compiler and optimisation mode. 25 | **makefile_debug**: For use with the gfortran compiler and debugging mode. Primarily for developers. 
29 26 30 | **makefile_debug** 31 | For the use with gfortran compiler in debugging mode.27 If you want to use a compiler other than gfortran locally, you can still take ``makefile_fast``, 28 and adapt everything that is compiler-specific in this file. 32 29 33 34 For instructions on how to adapt the ``makefiles`` for the local application mode 30 For instructions on how to adapt the ``makefile`` (local application mode only), 35 31 please see :ref:`ref-install-local`. 36 37 32 38 33 -
For_developers/Sphinx/source/Documentation/Input/jobscript.rst
rb1674ed rf20342a 1 1 ************************* 2 The Jobscript ``job.ksh``2 The job script ``job.ksh`` 3 3 ************************* 4 4 5 The job script is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.5 The job script is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**. 6 6 7 It is based on the ``job.temp`` template file which isstored in the ``Templates`` directory.8 This template is by itselfgenerated in the installation process from a ``job.template`` template file.7 It is based on the ``job.temp`` template file stored in the ``Templates`` directory. 8 This template is generated in the installation process from a ``job.template`` template file. 9 9 10 ``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate10 ``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate 11 11 the Korn-shell script from the template files by substituting the individual parameters. 12 These individual parameters are marked by a doubled ``$`` signin ``job.temp``.12 These individual parameters are marked by ``$$`` in ``job.temp``. 13 13 14 The job script has a number of settings for the batch system which are fixedand differentiates between the *ecgate* and the *cca/ccb*14 The job script has a number of settings for the batch system which are fixed, and differentiates between the *ecgate* and the *cca/ccb* 15 15 server system to load the necessary modules for the environment when submitted to the batch queue. 16 16 … … 19 19 20 20 21 What does the job script do?21 What does the job script do? 22 22 --------------------------- 23 23 24 #. It sets necessary batch system parameters 25 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules 26 #. 
It sets some environment variab els for the single session27 #. It creates the directory structure in the user s ``$SCRATCH`` file system28 #. It creates a CONTROL file on the ECMWF servers whith the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are given to the jobscript with its default or the user defined values. It also sets the``CONTROL`` as an environment variable.29 #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new path es for input and output directoryand the recently generated ``CONTROL`` file.30 #. At the end it checks if the script returned an error or not and send the log file via emailto the user.24 #. It sets necessary batch system parameters. 25 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules. 26 #. It sets some environment variables for the single session. 27 #. It creates the directory structure in the user's ``$SCRATCH`` file system. 28 #. It creates a CONTROL file on the ECMWF servers with the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are passed to the job script with their default or the user-defined values. It also sets ``CONTROL`` as an environment variable. 29 #. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new paths for input and output directories and the recently generated ``CONTROL`` file. 30 #. At the end, it checks whether the script has returned an error or not, and emails the log file to the user. 31 31 32 32 -
For_developers/Sphinx/source/Documentation/Input/run.rst
rb1674ed rf20342a 1 1 ********************************** 2 The executable Script - ``run.sh``2 The executable script - ``run.sh`` 3 3 ********************************** 4 4 5 The execution of ``flex_extract`` is done by the ``run.sh`` Shell script, which is a wrappingscript for the top-level Python script ``submit.py``.5 The execution of ``flex_extract`` is done by the ``run.sh`` shell script, which is a wrapper script for the top-level Python script ``submit.py``. 6 6 The Python script constitutes the entry point to ECMWF data retrievals with ``flex_extract`` and controls the program flow. 7 7 8 ``submit.py`` has two ( three) sources for input parameters with information about program flow and ECMWF data selection, the so-called ``CONTROL`` file,9 the command line parameters and the so-called ``ECMWF_ENV`` file. Whereby, the command line parameters will override the ``CONTROL`` file parameters.8 ``submit.py`` has two (or three) sources for input parameters with information about program flow and ECMWF data selection, the so-called ``CONTROL`` file, 9 the command line parameters, and the so-called ``ECMWF_ENV`` file. Command line parameters will override parameters specified in the ``CONTROL`` file. 10 10 11 Based on th ese input information ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a Web API on a local maschine or submit a jobscript to ECMWF servers and retrieve the data there with sending the files to the local system eventually.11 Based on this input information, ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a web API on a local machine, or submit a job script to an ECMWF server and retrieve the data there, and at the end sends the files to the local system. 12 12 13 13 14 14 15 15 16 Submission Parameter16 Submission parameters 17 17 -------------------- 18 18 19 19 20 .. exceltable:: Parameter for Submission20 .. 
exceltable:: Parameters for submission 21 21 :file: ../../_files/SubmitParameters.xls 22 22 :header: 1 … … 39 39 --------------------------------- 40 40 41 It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapp ing Shell script ``run.sh``. This top-level script is located in42 ``flex_extract_vX.X/Source/Python`` and is executable. With the `` help`` parameter we see again all possible43 command line parameter.41 It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapper shell script ``run.sh``. This top-level script is located in 42 ``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter 43 we see again all possible command line parameters. 44 44 45 45 .. code-block:: bash -
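The precedence rule stated in this hunk (command line parameters override parameters specified in the ``CONTROL`` file) amounts to a guarded dictionary merge. The sketch below is purely illustrative and not flex_extract's actual implementation; it assumes an argparse-style convention in which options the user did not pass are `None`:

```python
def merge_parameters(control_params, cli_args):
    """Combine the two parameter sources: start from the CONTROL-file
    values and let every command line argument that was actually
    given (i.e. is not None) take precedence."""
    merged = dict(control_params)
    merged.update({k: v for k, v in cli_args.items() if v is not None})
    return merged
```

For example, a start date given on the command line replaces the one from the ``CONTROL`` file, while parameters left unset on the command line keep their ``CONTROL`` file values.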
For_developers/Sphinx/source/Documentation/Input/setup.rst
rb1674ed rf20342a 1 1 ************************************** 2 The Installation Script - ``setup.sh``2 The installation script - ``setup.sh`` 3 3 ************************************** 4 4 5 6 The installation of ``flex_extract`` is done by the Shell script ``setup.sh`` which is located in the root directory of ``flex_extract``. 7 It calls the top-level Python script ``install.py`` which does all necessary operations to prepare the selected application environment. This includes: 8 9 - preparing the file ``ECMWF_ENV`` with the user credentials for member state access to ECMWF servers (in **remote** and **gateway** mode) 5 The installation of ``flex_extract`` is done by the shell script ``setup.sh`` located in the root directory of ``flex_extract``. 6 It calls the top-level Python script ``install.py`` which does all the necessary operations to prepare the application environment selected. This includes: 7 8 - preparing the file ``ECMWF_ENV`` with the user credentials for member-state access to ECMWF servers (in **remote** and **gateway** mode) 10 9 - preparation of a compilation Korn-shell script (in **remote** and **gateway** mode) 11 10 - preparation of a job template with user credentials (in **remote** and **gateway** mode) 12 - create a tar -ball of all necessary files13 - copying t ar-ball totarget location (depending on application mode and installation path)14 - submit compilation script to batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar tar-ball at target location (**local mode**)15 - compilation of the F ORTRAN90program ``calc_etadot``16 17 18 The Python installation script ``install.py`` has a couple of command line arguments which are defined in ``setup.sh`` in the section labelled with "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters for his personal use. The parameters are listed and described in :ref:`ref-instparams`. 
The script also does some checks to guarantee necessary parameters were set.11 - create a tarball of all necessary files 12 - copying the tarball to the target location (depending on application mode and installation path) 13 - submit the compilation script to the batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar the tarball at target location (**local mode**) 14 - compilation of the Fortran program ``calc_etadot`` 15 16 17 The Python installation script ``install.py`` has several command line arguments defined in ``setup.sh``, in the section labelled "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters according to his/her personal needs. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee that the necessary parameters were set. 19 18 20 19 After the installation process, some tests can be conducted. They are described in section :ref:`ref-testinstallfe`. 21 20 22 The following diagram sketches the involved files and scriptsin the installation process:21 The following diagram sketches the files and scripts involved in the installation process: 23 22 24 23 .. _ref-install-blockdiag: … … 115 114 116 115 .. blockdiag:: 117 :caption: Diagram of data flow during the installation process. T he trapezoids are input files with the light blue area being the template files. The edge-rounded, orange boxes are the executable files which start the installation process and reads the input files. The rectangular, green boxes are the output files. The light green files are files which are only neededin the remota and gateway mode.116 :caption: Diagram of data flow during the installation process. Trapezoids are input files with the light blue area being the template files. Round-edge orange boxes are executable files which start the installation process and read the input files. Rectangular green boxes are output files. 
Light green files are needed only in the remota and gateway mode. 118 117 119 118 blockdiag { … … 133 132 .. _ref-instparams: 134 133 135 Installation Parameter136 ---------------------- 134 Installation parameters 135 ----------------------- 137 136 138 137 .. exceltable:: Parameter for Installation … … 155 154 ---------------------------------- 156 155 157 It is also possible to start the installation process of ``flex_extract`` directly from command line by using the ``install.py`` script instead of the wrapping Shell script ``setup.sh``. This top-level script is located in158 ``flex_extract_vX.X/Source/Python`` and is executable. With the `` help`` parameter we see again all possible159 command line parameter.156 It is also possible to start the installation process of ``flex_extract`` directly from the command line by using the ``install.py`` script instead of the wrapper shell script ``setup.sh``. This top-level script is located in 157 ``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter, 158 we see again all possible command line parameters. 160 159 161 160 .. code-block:: bash -
For_developers/Sphinx/source/Documentation/Input/templates.rst
rb1674ed rf20342a 3 3 ********* 4 4 5 In ``flex_extract`` we use the Python package `genshi <https://genshi.edgewall.org/>`_ to create specific files from templates. It is the most efficient way to be able to quickly adapt e.g. the job scripts send to the ECMWF batch queue system or the namelist file für the Fortran programwithout the need to change the program code.5 In ``flex_extract``, the Python package `genshi <https://genshi.edgewall.org/>`_ is used to create specific files from templates. It is the most efficient way to be able to quickly adapt, e. g., the job scripts sent to the ECMWF batch queue system, or the namelist file for the Fortran program, without the need to change the program code. 6 6 7 7 .. note:: 8 Usually it is not recommended to change anything in these files without being able to understand the effects.8 Do not change anything in these files unless you understand the effects! 9 9 10 Each template file has its content framework and keeps so-called placeholder variables in the positions where the values need s to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used the ``$`` sign needs to be doubled to `escape` and keep a single ``$`` sign as it is.10 Each template file has its content framework and keeps so-called placeholder variables in the positions where the values need to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used, the ``$`` sign needs to be doubled for `escaping`. 
11 11 12 The following templates are used and can be found indirectory ``flex_extract_vX.X/Templates``:12 The following templates are used; they can be found in the directory ``flex_extract_vX.X/Templates``: 13 13 14 14 convert.nl 15 15 ---------- 16 16 17 This is the template for a Fortran namelist file called ``fort.4`` which will beread by ``calc_etadot``.17 This is the template for a Fortran namelist file called ``fort.4`` read by ``calc_etadot``. 18 18 It contains all the parameters ``calc_etadot`` needs. 19 19 … … 57 57 This template is used to create the job script file called ``compilejob.ksh`` during the installation process for the application modes **remote** and **gateway**. 58 58 59 At the beginning some directives for the batch system are set.60 On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.61 For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at`PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.62 63 The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.64 65 Afterwards the installation steps as such are done. Including the generation of the root directory, putting files in place, compiling the Fortran program and sending a log file viaemail.59 At the beginning, some directives for the batch system are set. 60 On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_. 
61 For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_. 62 63 The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing. 64 65 Afterwards, the installation steps as such are done. They include the generation of the root directory, putting files in place, compiling the Fortran program, and sending a log file by email. 66 66 67 67 .. code-block:: ksh … … 145 145 This template is used to create the actual job script file called ``job.ksh`` for the execution of ``flex_extract`` in the application modes **remote** and **gateway**. 146 146 147 At the beginning some directives for the batch system are set. 148 On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_. 149 For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_. 150 151 The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing. 152 153 Afterwards the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end a log file is send via email. 147 At the beginning, some directives for the batch system are set. 148 On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_. 
149 For the high performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; 150 for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_. 151 152 The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing. 153 154 Afterwards, the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end, a log file is sent by email. 154 155 155 156 .. code-block:: ksh … … 239 240 ------------ 240 241 241 This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. A couple ofparameters are set in this process, such as the user credentials and the ``flex_extract`` version number.242 This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. Several parameters are set in this process, such as the user credentials and the ``flex_extract`` version number. 242 243 243 244 .. code-block:: ksh … … 325 326 326 327 327 328 329 330 331 332 333 334 335 336 337 338 339 340 341 342 328 343 329 -
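The `$` placeholder / doubled-`$$` escape convention described for the genshi templates above also matches the behaviour of Python's stdlib `string.Template`, which can stand in for a quick, self-contained illustration. The fragment below is a made-up example in the style of a job template, not the real `job.temp`:

```python
from string import Template

# $username is a placeholder filled at generation time, while $$SCRATCH
# is an escaped dollar sign that survives as the shell variable $SCRATCH.
fragment = Template(
    "#SBATCH --mail-user $username\n"
    "cd $$SCRATCH/flex_extract_run\n"
)
rendered = fragment.substitute(username="ab1234")
```

After substitution, `rendered` contains the literal shell variable reference `$SCRATCH`, exactly as the Korn-shell script needs it; flex_extract itself performs the equivalent substitution with genshi.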
For_developers/Sphinx/source/Documentation/Overview/app_modes.rst
rb1674ed rf20342a 1 1 ***************** 2 Application Modes2 Application modes 3 3 ***************** 4 4 … … 13 13 .. _ref-app-modes: 14 14 15 Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has 4different :underline:`user application modes`:15 Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has four different :underline:`user application modes`: 16 16 17 17 .. _ref-remote-desc: 18 18 19 19 1. Remote (member) 20 In the **Remote mode** the user works directly on ECMWF Linux member state server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the additional third-party libraries mentioned in :ref:`ref-requirements` as ECMWF provides everything with environment modules. The module selection will be done automatically in``flex_extract``.20 In the **Remote mode** the user works directly on an ECMWF member-state Linux server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the third-party libraries mentioned in :ref:`ref-requirements`, as ECMWF provides everything with environment modules. The module selection will be done automatically by ``flex_extract``. 21 21 22 22 .. _ref-gateway-desc: 23 23 24 24 2. Gateway (member) 25 The **Gateway mode** can be used if a local member state gateway server is in place. Then the job scripts can be submitted to the ECMWF Linux member state server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed at the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and some extra setup is done in the local ``flex_extract`` directory at the local gateway server. For more information about establishing a gateway server please see`ECMWF's instructions on gateway server`_.
For the **Gateway mode** the necessary environment has to be established which is described in :ref:`ref-prep-gateway`.25 The **Gateway mode** can be used if a local member-state gateway server is in place. Then, the job scripts can be submitted to the ECMWF member-state Linux server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed on the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and that some extra setup is done in the ``flex_extract`` directory on the local gateway server. For more information about establishing a gateway server, please refer to `ECMWF's instructions on gateway server`_. For the **Gateway mode**, the necessary environment has to be established, as described in :ref:`ref-prep-gateway`. 26 26 27 27 .. _ref-local-desc: 28 28 29 29 3. Local member 30 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`.30 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. 31 31 32 32 4. Local public 33 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server.
The complete installation process is described in :ref:`ref-local-mode`. In this casea direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.33 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access the ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. In this case, a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve. 34 34 35 35 -
For_developers/Sphinx/source/Documentation/Overview/prog_flow.rst
rb1674ed rf20342a 1 1 ************ 2 Program Flow2 Program flow 3 3 ************ 4 4 … … 16 16 .. figure:: ../../_files/submit.png 17 17 18 Overview of the call of python's ``submit.py`` script and raw sequence of working steps donein ``flex_extract``.18 Overview of the call of the ``submit.py`` Python script and raw sequence of work steps in ``flex_extract``. 19 19 20 20 21 The ``submit.py`` Python program is called by the Shell script ``run.sh`` or ``run_local.sh`` and accomplishthe following steps:21 The ``submit.py`` Python script is called by the shell script ``run.sh`` or ``run_local.sh`` and accomplishes the following steps: 22 22 23 1. Setup thecontrol data:24 It gets all command-line and ``CONTROL`` file parameters as well as optionally the ECMWF user credentials. Depending the :doc:`app_modes`, it might also prepare a job script which is then sendto the ECMWF queue.25 2. Retriev esdata from MARS:26 It creates and sends MARS-requests either on the local machine or on ECMWF server, that receives the data and stores them in a specific format in GRIB files. If the parameter ``REQUEST`` was set ``1`` the data are not received but a file ``mars_requests.csv`` is created with a list of MARS requests and their settings. If it is set to ``2`` the file is created in addition to retrieving the data. The requests are created in an optimised way by splitting intime, jobs and parameters.27 3. Post-process data to create final ``FLEXPART`` input files:28 After all data is retrieved, the disaggregation of flux fields (`see here <../disagg.html>`_ ) is done as well as the calculation of vertical velocity (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Eventually, the GRIB fields are merged together such that a single grib file per time step is available with all fields for ``FLEXPART``.
Since model level fields are typically in *GRIB2* format whereas surface level fields are still in *GRIB1* format, they can be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading pure *GRIB2* files since some parameter IDs change in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files can be communicated to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locallythe progress of the script can be followed with the usual Linux tools.23 1. Setup of control data: 24 Command-line and ``CONTROL``-file parameters are read, as well as (optionally) the ECMWF user credentials. Depending on the :doc:`app_modes`, a job script might be prepared which is then sent to the ECMWF queue. 25 2. Retrieval of data from MARS: 26 MARS requests are created either on the local machine or on the ECMWF server and then submitted; these retrieve the data and store them in GRIB files. If the parameter ``REQUEST`` was set to ``1``, the data are not retrieved and instead a file ``mars_requests.csv`` is created, which contains a list of the MARS requests and their settings. If ``REQUEST`` is set to ``2``, the CSV file is created in addition to retrieving the data. The requests are created in an optimised way by splitting with respect to time, jobs and parameters. 27 3. Post-processing of data to create final ``FLEXPART`` input files: 28 After all data have been retrieved, flux fields are disaggregated (`see here <../disagg.html>`_ ) and the vertical velocity is calculated (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``.
Finally, the GRIB fields are merged into a single GRIB file per time step containing all the fields for ``FLEXPART``. Since model-level fields are typically in *GRIB2* format, whereas surface-level fields are still in *GRIB1* format, they will be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading these *GRIB2* files since some parameter IDs have been changed in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files will be sent to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally, the progress of the script can be followed with the usual Linux tools. 29 29 30 30 … … 33 33 ======================================== 34 34 35 More details on how different the program flow is for the different :doc:`app_modes` is sketched in the following diagrams: 35 The following diagrams show how different the program flow is for the different :doc:`app_modes`: 36 36 37 37 +-------------------------------------------------+------------------------------------------------+ -
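The effect of the ``REQUEST`` parameter described in step 2 above can be expressed as a small sketch (illustrative Python only, not the actual ``flex_extract`` code; treating the default value 0 as plain retrieval is an assumption):

```python
# Hedged sketch: how the REQUEST parameter maps onto the two actions
# "write mars_requests.csv" and "retrieve data from MARS".
# This is illustrative only, not the actual flex_extract implementation.
def request_actions(request_flag):
    """Return (write_csv, retrieve_data) for a given REQUEST value."""
    actions = {
        0: (False, True),   # assumed default: just retrieve the data
        1: (True, False),   # only write the list of MARS requests
        2: (True, True),    # write the list and retrieve the data
    }
    try:
        return actions[request_flag]
    except KeyError:
        raise ValueError("REQUEST must be 0, 1 or 2")
```

With ``REQUEST`` set to ``1``, a job can thus be dry-run to inspect the generated MARS requests before any data are transferred.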
For_developers/Sphinx/source/Documentation/api.rst
rba99230 rf20342a 1 1 **************************** 2 Auto Generated Documentation2 Auto-generated documentation 3 3 **************************** 4 4 -
For_developers/Sphinx/source/Documentation/disagg.rst
rd9abaac rf20342a 1 1 *************************** 2 Disaggregation of Flux Data2 Disaggregation of flux data 3 3 *************************** 4 4 5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational particles in time and space. This method requires point values in the discrete input fields. However, flux data (as listed in table :ref:`ref-table-fluxpar`) from the ECMWF represent cell averages or integrals and are accumulated over a specific time interval, depending on the dataset. Hence, to conserve the integral quantity with ``FLEXPART``'s linear interpolation a pre-processing scheme has to be applied. 5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational 6 particles in time and space. This method requires point values in the discrete input fields. 7 However, flux data (as listed in table :ref:`ref-table-fluxpar` below) from the ECMWF represent cell 8 averages or integrals and are accumulated over a specific time interval, depending on the data 9 set. Hence, to conserve the integral quantity with the linear interpolation used in ``FLEXPART``, 10 pre-processing has to be applied. 6 11 7 12 .. _ref-table-fluxpar: 8 13 9 .. csv-table:: flux fields14 .. csv-table:: Flux fields 10 15 :header: "Short Name", "Name", "Units", "Interpolation Type" 11 16 :align: center … … 20 25 21 26 22 The first step is to *de-accumulate* the fields in time so that each value represents an integral in x, y, t space. 23 Afterwards, a *disaggregation* scheme is applied which means to break down the integral value into point values. 24 In order to be able to carry out the disaggregation procedure proposed by Paul James, additional flux data is retrieved automatically for one day at the beginning and one day at the end of the period specified. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. 
Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not communicated to the final ``FLEXPART`` input files. 27 The first step is to *de-accumulate* the fields in time so that each value represents non-overlapping integrals in x-, y-, and t-space. 28 Afterwards, a *disaggregation* scheme is applied which means to convert the integral value to corresponding point values to be used later for the interpolation. 29 The disaggregation procedure as proposed by Paul James (currently, the standard) requires additional flux data for one day at the beginning and one day at the end of the period specified. 30 They are retrieved automatically. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not contained in the final ``FLEXPART`` input files. 25 31 26 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note ,that the first two and last two flux files do not contain any data. 32 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note that the first two and last two flux files do not contain any data. 27 33 28 34 .. note:: 29 35 30 Note also that for operational retrievals (``BASETIME`` set to 00 or 12) forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for ondemand scripts. 36 Note also that for operational retrievals (``BASETIME`` set to 00 or 12), forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on-demand scripts.
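The extended flux retrieval period (START_DATE-1 to END_DATE+1) and the ``fluxYYYYMMDDHH`` naming described above can be sketched as follows (hypothetical helper names, dates handled as ``YYYYMMDD`` strings; not the actual ``flex_extract`` code):

```python
from datetime import datetime, timedelta

def flux_retrieval_period(start_date, end_date):
    """Extend the period by one day on each side, as done internally
    for the flux disaggregation (START_DATE-1 to END_DATE+1)."""
    fmt = "%Y%m%d"
    start = datetime.strptime(start_date, fmt) - timedelta(days=1)
    end = datetime.strptime(end_date, fmt) + timedelta(days=1)
    return start.strftime(fmt), end.strftime(fmt)

def flux_filename(date, hour):
    """Flux files are named fluxYYYYMMDDHH."""
    return "flux{}{:02d}".format(date, hour)
```

The two extra days exist only inside ``flex_extract``; they never appear in the final ``FLEXPART`` input files.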
31 37 32 38 … … 34 40 -------------------------------------------------- 35 41 36 In ``flex_extract`` up to version 5 the disaggregation was done with a Fortran program (FLXACC2). In version 6 this part was converted toPython.42 In ``flex_extract`` up to version 5, the disaggregation was done with a Fortran program (FLXACC2). In version 6, this part was recoded in Python. 37 43 38 39 In the old versions (below 7.1) a relatively simple method processes the precipitation fields in a way that is consistent with the scheme applied in ``FLEXPART`` for all variables: linear interpolation between times where input fields are available. 40 At first the accumulated values are divided by the number of hours (i.e., 3 or 6). 44 In the old versions (below 7.1), a relatively simple method processes the precipitation fields in a way that is consistent with the linear interpolation between times where input fields are available that is applied in ``FLEXPART`` for all variables. 45 This scheme (from Paul James) at first divides the accumulated values by the number of hours (i.e., 3 or 6). ??? 41 46 The best option for disaggregation, which was realised, is conservation within the interval under consideration plus the two adjacent ones. 42 47 Unfortunately, this leads to undesired temporal smoothing of the precipitation time series – maxima are damped and minima are raised. … … 53 58 :figclass: align-center 54 59 55 Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. 
Note that supporting points for the interpolation are shifted by a half-time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).60 Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by half a time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018). 56 61 57 62 58 63 59 Disaggregation is done for 4 adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the 4adjacent timespans.64 Disaggregation is done for four adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the four adjacent timespans. 60 65 61 66 .. math:: … … 69 74 70 75 71 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the 4 original timespans has a value below 0it is set to 0 prior to the calculation.76 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the four original timespans has a value below 0, it is set to 0 prior to the calculation. 72 77 73 78 .. math:: … … 78 83 79 84 80 81 82 85 Disaggregation for precipitation in version 7.1 83 86 ----------------------------------------------- 84 87 85 Due to the problems with generating precipitation in originally dry (or lower) intervals and the temporal smoothing a new algorithm was developed. 
The approach is based on a one dimensional piecewise linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements by preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional monotonicity filter helps to gainmonotonicity.86 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivates the linear formulation.87 These requirements on the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e.the reconstructed values vanish throughout this whole time interval, too.88 Due to the problems mentioned above, a new algorithm was developed. The approach is based on a one-dimensional, piecewise-linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements of preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional filter improves monotonicity. 89 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivate the use of a linear formulation. 90 These requirements for the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e., the reconstructed values vanish throughout this whole time interval, too. 88 91 In the simplest scenario of an isolated precipitation event, where in the time interval before and after the data values are zero, the reconstruction algorithm therefore has to vanish at the boundaries of the interval, too. 89 92 The additional conditions of continuity and conservation of the precipitation amount then require us to introduce sub-grid points if we want to keep a linear interpolation (Fig. 2).
… … 142 145 143 146 144 In the case of the new disaggregation method for precipitation, the two new sub grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval and 1 or 2 for the two new sub grid points respectively. The filenames do not change.147 In the case of the new disaggregation method for precipitation, the two new sub-grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval, and 1 or 2, respectively, for the two new sub-grid points. The filenames do not change. 145 148 146 149 147 150 .. note:: 148 151 149 The new method for disaggregation was published in the Geoscientific Model Development Journalin 2018:152 The new method for disaggregation was published in the journal Geoscientific Model Development in 2018: 150 153 151 154 Hittmeir, S., Philipp, A., and Seibert, P.: A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART, Geosci. Model Dev., 11, 2503-2523, https://doi.org/10.5194/gmd-11-2503-2018, 2018. 152 155 153 154 155 156 157 156 158 157 159 Disaggregation for the rest of the flux fields 158 159 Disaggregation for the other flux fields 160 160 ---------------------------------------------- 161 161 162 162 The accumulated values for the other variables are first divided by the number of hours and 163 then interpolated to the exact times Xusing a bicubic interpolation which conserves the integrals of the fluxes within each timespan.164 Disaggregation is done for 4 adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which generates a new, disaggregated value which is output at the central point of the 4adjacent timespans.163 then interpolated to the exact times using a bicubic interpolation which conserves the integrals of the fluxes within each timespan. 
164 Disaggregation is done for four adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which produces a new, disaggregated value that is the output at the central point of the four adjacent timespans. 165 165 166 166 .. math:: -
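The de-accumulation step named at the beginning of this section amounts to differencing consecutive accumulated values; a minimal, illustrative sketch (not the actual ``flex_extract`` code):

```python
def deaccumulate(accumulated):
    """Turn a series of values accumulated since the start of a forecast
    into per-interval amounts by differencing consecutive values."""
    return [after - before for before, after in zip(accumulated, accumulated[1:])]
```

The disaggregation formulas given above then operate on these per-interval amounts.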
For_developers/Sphinx/source/Documentation/input.rst
rb1674ed rf20342a 1 1 ******************** 2 Control & Input Data2 Control & input data 3 3 ******************** 4 4 5 Input Data5 Input data 6 6 - :doc:`Input/control` 7 ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialized by ``flex_extract`` with their default values and can be overwritten with definitions set in the socalled :doc:`Input/control`.7 ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialised by ``flex_extract`` with certain default values which can be overwritten with definitions set in the so-called :doc:`Input/control`. 8 8 9 To be able to successfully retrieve data from the ECMWF Mars archive it is necessary to understand these parameters andset them to proper and consistent values. They are described in :doc:`Input/control_params` section. 9 For a successful retrieval of data from the ECMWF MARS archive it is necessary to understand these parameters and to set them to proper and consistent values. They are described in the :doc:`Input/control_params` section. 10 10 11 We also have some :doc:`Input/examples` and description of :doc:`Input/changes` changes to previous versions and downward compatibilities. 11 Furthermore, some :doc:`Input/examples` are provided, and in :doc:`Input/changes` changes to previous versions and downward compatibilities are described. 12 12 13 13 - :doc:`Input/ecmwf_env` 14 For ``flex_extract`` it is necessaryto be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.14 ``flex_extract`` needs to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
15 15 16 16 - :doc:`Input/templates` 17 A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt for example the jobscripts regarding its settings for the batch jobs. 18 19 17 A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt, for example, the job scripts with regard to the settings for the batch jobs. 20 18 21 19 … … 29 27 30 28 Controlling 31 The main tasks and behaviour of ``flex_extract`` are controlled by its Python scripts. There are two top-level scripts, one for installation called install_and one for execution called submit_. 32 29 The main tasks and the behaviour of ``flex_extract`` are controlled by the Python scripts. There are two top-level scripts, one for installation called install_, and one for execution called submit_. 30 They interpret a number of command-line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type: 33 31 34 32 .. code-block:: bash … … 38 36 python3 Source/Python/submit.py --help 39 37 40 In this new version we provide also the wrapping Shell scripts setup_ and run_, which sets the command line parameters, do some checks and execute the corresponing Python scripts ``install.py`` and ``submit.py`` respectivley. 41 42 It might be faster and easier for beginners. See :doc:`../quick_start` for information on how to use them. 38 With version 7.1, we also provide wrapper shell scripts setup_ and run_ which set the command-line parameters, do some checks, and execute the corresponding Python scripts ``install.py`` and ``submit.py``, respectively. 39 It might be faster and easier for beginners if they are used. See :doc:`../quick_start` for information on how to use them.
43 40 44 Additionally, ``flex_extract`` creates the Korn Shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be send to the ECMWF serves in the **remote mode** and the **gateway mode** for starting batch jobs.41 ``flex_extract`` also creates the Korn shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be sent to the ECMWF servers in the **remote mode** and the **gateway mode** for starting batch jobs. 45 42 46 The Fortran program will be compiled during the installation process bythe :doc:`Input/fortran_makefile`.43 The Fortran program is compiled during the installation process using the :doc:`Input/fortran_makefile`. 47 44 48 To sum up, the following scripts control s``flex_extract``:45 To sum up, the following scripts control ``flex_extract``: 49 46 50 47 Installation -
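The behaviour described for the ``CONTROL`` file (defaults that are overridden only by the parameters actually set, with ``#`` starting a comment) can be sketched as follows; the ``NAME VALUE`` line format and the helper name are assumptions for illustration:

```python
def read_control(lines, defaults):
    """Start from the defaults and override only what the CONTROL file sets.
    Assumed line format: 'NAME VALUE'; '#' starts a comment."""
    params = dict(defaults)
    for raw in lines:
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        key, _, value = line.partition(" ")
        params[key.upper()] = value.strip()
    return params
```

Only parameters that deviate from the defaults need to appear in the file; everything else keeps its initial value.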
For_developers/Sphinx/source/Documentation/output.rst
rb1674ed rf20342a 1 1 *********** 2 Output Data2 Output data 3 3 *********** 4 4 5 The output data of ``flex_extract`` are separated mainly into temporary files and the final ``FLEXPART`` inputfiles:5 The output data of ``flex_extract`` can be divided into the final ``FLEXPART`` input files and temporary files: 6 6 7 7 +-----------------------------------------------+----------------------------------------------+ 8 8 | ``FLEXPART`` input files | Temporary files (saved in debug mode) | 9 9 +-----------------------------------------------+----------------------------------------------+ 10 | - Standard output file names | - MARS request file (opt)|10 | - Standard output file names | - MARS request file (optional) | 11 11 | - Output for pure forecast | - flux files | 12 12 | - Output for ensemble members | - VERTICAL.EC | … 21 21 22 22 ======================== 23 The final output files of ``flex_extract`` are also the meteorological ``FLEXPART`` input files. 24 The naming of these files dependon the kind of data extracted by ``flex_extract``. 23 The final output files of ``flex_extract`` are the meteorological input files for ``FLEXPART``. 24 The naming convention for these files depends on the kind of data extracted by ``flex_extract``. 25 25 26 26 Standard output files 27 27 --------------------- 28 28 29 In general, there is a file for each time step with the filename format: 29 In general, there is one file for each time step, named: 30 30 31 31 ..
code-block:: bash … … 33 33 <prefix>YYMMDDHH 34 34 35 The ``prefix`` is by default defined as ``EN``and can be re-defined in the ``CONTROL`` file.36 Each file contains all meteorological fields needed by ``FLEXPART`` for all selected model levels for a specific time step.37 38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` where we extracted only the lowest model level for demonstration reasons:35 where YY are the last two digits of the year, MM is the month, DD the day, and HH the hour (UTC). <prefix> is by default defined as EN, and can be re-defined in the ``CONTROL`` file. 36 Each file contains all meteorological fields at all levels as needed by ``FLEXPART``, valid for the time indicated in the file name. 37 38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` (where we extracted only the lowest model level for demonstration purposes): 39 39 40 40 .. code-block:: bash … … 84 84 ------------------------------ 85 85 86 ``Flex_extract`` can retrieve forecasts which can be longer than 23 hours. To avoid collisions of time steps for forecasts of more than one daya new scheme for filenames in pure forecast mode is introduced:86 ``Flex_extract`` is able to retrieve forecasts with a lead time of more than 23 hours. In order to avoid collisions of time step names, a new scheme for filenames in pure forecast mode is introduced: 87 87 88 88 .. code-block:: bash … 90 90 <prefix>YYMMDD.HH.<FORECAST_STEP> 91 91 92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3 92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file.
``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3-digit number which represents the forecast step in hours. 93 93 94 94 … … 96 96 ------------------------------------- 97 97 98 Ensembles can be retrieved and are addressed by the grib message parameter ``number``. The ensembles are saved per file and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3digit format. 98 ``Flex_extract`` is able to retrieve ensemble data; they are labelled by the grib message parameter ``number``. Each ensemble member is saved in a separate file, and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3-digit format. 99 99 100 100 .. code-block:: bash … … 106 106 ------------------------------------------------------- 107 107 108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type. They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`. 109 The two additional fields are marked with the ``step`` parameter in the Grib messages and are set to "1" and "2" for sub-grid point 1 and 2 respectively. 110 The output filenames do not change in this case. 111 Below is an example list of precipitation fields in an output file generated with the new disaggregation method: 108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type (large-scale and convective). They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`. 109 The two additional fields are addressed using the ``step`` parameter in the GRIB messages, which 110 is set to "1" or "2", for sub-grid points 1 and 2, respectively. 111 The output file names are not altered.
112 An example of the list of precipitation fields in an output file generated with the new disaggregation method is found below: 112 113 113 114 .. code-block:: bash … … 129 130 =============== 130 131 131 ``Flex_extract`` works with a number of temporary data files which are usually deleted after a successful data extraction. They are only storedif the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).132 ``Flex_extract`` creates a number of temporary data files which are usually deleted at the end of a successful run. They are preserved only if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`). 132 133 133 134 MARS grib files … … 135 136 136 137 ``Flex_extract`` retrieves all meteorological fields from MARS and stores them in files ending with ``.grb``. 137 Since the request times and data transfer of MARS access are limited and ECMWF asks for efficiency in requesting data from MARS, ``flex_extract`` splits the overall data request in several smaller requests. Each request is stored in an extra ``.grb`` file and the file names are put together by several pieces of information: 138 Since there are limits implemented by ECMWF for the time per request and data transfer from MARS, 139 and as ECMWF asks for efficient MARS retrievals, ``flex_extract`` splits the overall data request 140 into several smaller requests. Each request is stored in its own ``.grb`` file, and the file 141 names are composed of several pieces of information: 138 142 139 143 .. 
code-block:: bash … … 144 148 145 149 Field type: 146 ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4 dvariational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast150 ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4D variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast 147 151 Grid type: 148 ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat /lon), ``_OROLSM`` - Orography parameter152 ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat / lon), ``_OROLSM`` - Orography parameter 149 153 Temporal property: 150 154 ``__`` - instantaneous fields, ``_acc`` - accumulated fields 151 155 Level type: 152 ``ML`` - Model Level, ``SL`` - Surface Level156 ``ML`` - model level, ``SL`` - surface level 153 157 ppid: 154 The process number of the parent process of submitted script.158 The process number of the parent process of the script submitted. 155 159 pid: 156 The process number of the submitted script. 157 158 The process ids should avoid mixing of fields if several ``flex_extract`` jobs are performed in parallel (which is, however, not recommended). The date format is YYYYMMDDHH. 159 160 Example ``.grb`` files for a day of CERA-20C data: 160 The process number of the script submitted. 161 162 163 Example ``.grb`` files for one day of CERA-20C data: 161 164 162 165 .. code-block:: bash … … 172 175 ----------------- 173 176 174 This file is a ``csv`` file called ``mars_requests.csv`` with a list of the actual settings of MARS request parameters (one request per line) in a flex_extract job. It is used for documenting the data which were retrieved and for testing reasons. 
175 176 Each request consist of the following parameters, whose meaning mainly can be taken from :doc:`Input/control_params` or :doc:`Input/run`: 177 This file is a ``csv`` file called ``mars_requests.csv`` listing the actual settings of the MARS 178 request (one request per line) in a flex_extract job. 179 It is used for documenting which data were retrieved, and for testing. 180 181 Each request consists of the following parameters, whose meaning mostly can be taken from :doc:`Input/control_params` or :doc:`Input/run`: 177 182 request_number, accuracy, area, dataset, date, expver, gaussian, grid, levelist, levtype, marsclass, number, param, repres, resol, step, stream, target, time, type 178 183 179 Example output of a one day retrieval of CERA-20cdata:184 Example output of a one-day retrieval of CERA-20C data: 180 185 181 186 .. code-block:: bash … … 192 197 ----------- 193 198 194 The vertical discretization of model levels. This file contains the ``A`` and ``B`` parameters to calculate the model level height in meters. 199 This file contains information describing the vertical discretisation (model levels) 200 in form of the ``A`` and ``B`` parameters which allow to calculate the actual pressure of a model level from the surface pressure. 195 201 196 202 … … 198 204 ---------- 199 205 200 This file is usually called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords. 201 202 203 flux files 206 This file is called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords. 
207 #PS NEEDS MORE DESCRIPTION 208 209 210 Flux files 204 211 ---------- 205 212 206 The flux files contain the de-accumulated and dis-aggregated flux fields of large scale and convective precipitation, eastward turbulent surface stress, northward turbulent surface stress, surface sensible heat fluxand the surface net solar radiation.213 The flux files contain the de-accumulated and dis-aggregated flux fields of large-scale and convective precipitation, east- and northward turbulent surface stresses, the surface sensible heat flux, and the surface net solar radiation. 207 214 208 215 .. code-block:: bash … … 210 217 flux<date>[.N<xxx>][.<xxx>] 211 218 212 The date format is YYYYMMDDHH . The optional block ``[.N<xxx>]`` marks the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step.219 The date format is YYYYMMDDHH as explained before. The optional block ``[.N<xxx>]`` is used for the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step. 213 220 214 221 .. note:: 215 222 216 In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval and ``1`` or ``2`` for the two new intervals respectively.223 In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval, and ``1`` or ``2``, respectively, for the two new intervals. 
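As an illustration of the flux file naming scheme ``flux<date>[.N<xxx>][.<xxx>]`` described above, the name can be split into its components with a few lines of Python. This is a sketch only: the helper below is not part of ``flex_extract``, and it assumes that both the ensemble member number and the forecast step are written with exactly three digits, as stated in the text.

```python
import re

# Hypothetical helper (not part of flex_extract) for the naming scheme
# flux<date>[.N<xxx>][.<xxx>] with <date> = YYYYMMDDHH.
# Assumption: member number and forecast step are both 3-digit blocks.
FLUX_NAME = re.compile(
    r"^flux(?P<date>\d{10})"      # date as YYYYMMDDHH
    r"(?:\.N(?P<member>\d{3}))?"  # optional ensemble member block .N<xxx>
    r"(?:\.(?P<step>\d{3}))?$"    # optional forecast step block .<xxx>
)

def parse_flux_name(name):
    """Split a flux file name into its date, member, and step components."""
    match = FLUX_NAME.match(name)
    if match is None:
        raise ValueError("not a flux file name: " + name)
    return match.groupdict()
```

For example, ``parse_flux_name("flux2001010800.N001.012")`` yields the date ``2001010800``, ensemble member ``001``, and forecast step ``012``; the two optional groups are ``None`` when the corresponding block is absent.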
217 224 218 225 … 226 233 fort.xx 227 234 228 where ``xx`` is thenumber which defines the meteorological fields stored in these files.229 They are generated by the Python part of ``flex_extract`` by just splitting the meteorological fields for a unique time stamp from the ``*.grb`` files into the ``fort`` files.230 The following table defines the numbers with their corresponding content.235 where ``xx`` is a number which defines the meteorological fields stored in these files. 236 They are generated by the Python code in ``flex_extract`` by splitting the meteorological fields for a unique time stamp from the ``*.grb`` files, storing them under the names ``fort.<XX>``. 237 The following table defines the numbers and the corresponding content: 231 238 232 239 .. csv-table:: Content of fort - files … … 240 247 "16", "surface fields" 241 248 "17", "specific humidity" 242 "18", "surface specific humidity (reduced gaussian)"243 "19", " vertical velocity (pressure) (optional)"249 "18", "surface specific humidity (reduced Gaussian grid)" 250 "19", "omega (vertical velocity in pressure coordinates) (optional)" 244 251 "21", "eta-coordinate vertical velocity (optional)" 245 "22", "total cloud 246 247 Some of the fields are solely retrieved with specific settings, e. g. the eta-coordinate vertical velocity is not available in ERA-Interim datasets and the total cloud water content is an optional fieldfor ``FLEXPART v10`` and newer.252 "22", "total cloud-water content (optional)" 253 254 Some of the fields are solely retrieved with specific settings, e. g., the eta-coordinate vertical velocity is not available in ERA-Interim datasets, and the total cloud-water content is an optional field which is useful for ``FLEXPART v10`` and newer. 248 255 249 256 The ``calc_etadot`` program saves its results in file ``fort.15`` which typically contains: … 259 266 .. 
note:: 260 267 261 The ``fort.4`` file is the namelist file to drivethe Fortran program ``calc_etadot``. It is therefore also an input file.268 The ``fort.4`` file is the namelist file to control the Fortran program ``calc_etadot``. It is therefore also an input file. 262 269 263 270 Example of a namelist: -
For_developers/Sphinx/source/Documentation/overview.rst
rb1674ed rf20342a 3 3 ======== 4 4 5 ``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) M ars archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` Atmospheric Transport Modelling system.6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who want sto use meteorological data from ECMWF to drive the ``FLEXPART`` model.7 The software retrieves the minim al number of parameters ``FLEXPART`` needs to work and provides the data in the explicity format ``FLEXPART`` understands.5 ``Flex_extract`` is an open-source software package to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) MARS archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` atmospheric transport modelling system. 6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who want to use meteorological data from ECMWF to drive the ``FLEXPART`` model. 7 The software retrieves the minimum set of parameters needed by ``FLEXPART`` to work, and provides the data in the specific format required by ``FLEXPART``. 8 8 9 ``Flex_extract`` consists of 2main parts:10 1. a Python part , where the reading of parameter settings, retrieving data from MARS and preparing the data for ``FLEXPART`` is doneand11 2. a Fortran part , where the calculation of the vertical velocity is done and if necessary the conversion from spectralto regular latitude/longitude grids.9 ``Flex_extract`` consists of two main parts: 10 1. a Python part which reads the parameter settings, retrieves the data from MARS, and prepares them for ``FLEXPART``, and 11 2. a Fortran part which calculates the vertical velocity and, if necessary, converts variables from the spectral representation to regular latitude/longitude grids. 
12 12 13 Additionally, it has some Korn shell scripts which are usedto set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information of application modes.13 In addition, there are some Korn shell scripts to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information on the application modes. 14 14 15 15 A number of shell scripts are wrapped around the software package for easy installation and fast job submission. 16 16 17 The software depends on a number of third-party libraries which can be found in :ref:`ref-requirements`.17 The software depends on some third-party libraries as listed in :ref:`ref-requirements`. 18 18 19 Some details on the tasks and program worksteps are described in :doc:`Overview/prog_flow`.19 Details of the tasks and program work steps are described in :doc:`Overview/prog_flow`. 20 20 21 21 -
For_developers/Sphinx/source/Documentation/vertco.rst
rb1674ed rf20342a 1 1 ******************* 2 Vertical Coordinate2 Vertical wind 3 3 ******************* 4 4 5 Calculation of vertical velocity and preparation of Output-files5 Calculation of vertical velocity and preparation of output files 6 6 ================================================================ 7 7 8 ``flex_extract`` has two ways to calculatethe vertical velocity for ``FLEXTRA``/``FLEXPART``:8 Two methods are provided in ``flex_extract`` for the calculation of the vertical velocity for ``FLEXTRA``/``FLEXPART``: 9 9 (i) from the horizontal wind field, 10 (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C** .10 (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C**; this parameter contains the vertical velocity directly in the eta coordinate system of the ECMWF model. 11 11 12 12 Especially for high resolution data, use of the ``MARS`` parameter 77 is recommended, … 20 20 21 21 22 Calculation of vertical velocity fromhorizontal wind using the continuity equation22 Calculation of the vertical velocity from the horizontal wind using the continuity equation 23 23 =================================================================================== 24 24 25 The vertical velocity is computed by the FORTRAN90 program ``calc_etadot`` in the ECMWF 26 vertical coordinate system by applying the equation of continuity and thereby ensuring mass consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the 25 The vertical velocity in the eta vertical coordinate system of the ECMWF model is computed by the Fortran program ``calc_etadot``, using the continuity equation and thereby ensuring mass-consistent 3D wind fields. 
A detailed description of ``calc_etadot`` can be found in the 27 26 documents v20_update_protocol.pdf, V30_update_protocol.pdf and 28 27 V40_update_protocol.pdf. The computational demand and accuracy of ``calc_etadot`` is highly … 30 29 following guidance can be given for choosing the right parameters: 31 30 32 * For very fine output grids (0.25 degree or finer) the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of vertical velocity on the Gaussian grid (``GAUSS=1``) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application.31 * For very fine output grids (0.25 degree or finer), the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of vertical velocity on the Gaussian grid (``GAUSS=1``)) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application. 33 32 * For lower resolution (often global) output grids, calculation of vertical velocities with lower than operational spectral resolution is recommended. 
For global grids the following settings appear optimal: 34 33 - For 1.0 degree grids: ``GAUSS=1``, ``RESOL=255``, ``SMOOTH=179`` 35 34 - For 0.5 degree grids: ``GAUSS=1``, ``RESOL=399``, ``SMOOTH=359`` 36 35 - Calculation on the lat/lon grid is not recommended for less than the operational (T1279) resolution. 37 - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279 , 2047 in future models). This choice is restricted because a reduced Gaussian grid is defined in thenECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.38 - For ``SMOOTH`` any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if vertical velocity is calculated onlat\/lon grid (``GAUSS=0``).39 * The on 36 - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279; 2047 in future model versions). This choice is restricted because a reduced Gaussian grid is defined in the ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution. 37 - For ``SMOOTH``, any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if the vertical velocity is calculated on a lat\/lon grid (``GAUSS=0``). 38 * The on-demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however. 40 39 * Regional grids are not cyclic in zonal directions, but global grids are. The software assumes a cyclic grid if ``RIGHT``-``LEFT`` is equal to ``GRID`` or is equal to ``GRID``-360. 
41 * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` for application in ATM modelling. If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and the surface pressure tendency. ``OMEGADIFF`` should bezero except for debugging, since it triggers expensive calculations on the Gaussian grid.40 * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` (the standard ``flex_extract`` output files). If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and from the surface pressure tendency. ``OMEGADIFF`` should be set to zero except for debugging, since it triggers expensive calculations on the Gaussian grid. 42 41 43 42 44 Calculation of vertical velocity frompre-calculated MARS parameter 7743 Calculation of the vertical velocity from the pre-calculated MARS parameter 77 45 44 ================================================================================ 46 45 47 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that this parameteris available if the ``CONTROL`` parameter ``ETA`` is set to 1.46 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that parameter 77 is available if the ``CONTROL`` parameter ``ETA`` is set to 1. 
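As background for the ``dp/deta`` factor mentioned above: the half-level pressures follow from the standard ECMWF hybrid-coordinate relation p = A + B * p_surface, with the ``A`` and ``B`` coefficients as stored, for instance, in the ``VERTICAL.EC`` file described earlier. A minimal sketch, with coefficient values invented purely for illustration (the helper is not part of ``flex_extract``):

```python
# Sketch of the ECMWF hybrid-coordinate relation for half-level pressures:
#   p_{k+1/2} = A_{k+1/2} + B_{k+1/2} * p_surface
# The A/B values below are invented for illustration; real values come
# from the VERTICAL.EC file (or the GRIB "pv" array).

def half_level_pressures(a_coeffs, b_coeffs, p_surf):
    """Return the half-level pressures (Pa) from hybrid A/B coefficients."""
    if len(a_coeffs) != len(b_coeffs):
        raise ValueError("A and B coefficient lists must have equal length")
    return [a + b * p_surf for a, b in zip(a_coeffs, b_coeffs)]

# Invented 5-half-level example: pure pressure levels at the top (B = 0),
# terrain-following at the bottom (A = 0, B = 1).
A = [0.0, 5000.0, 10000.0, 5000.0, 0.0]
B = [0.0, 0.0, 0.2, 0.7, 1.0]
p_half = half_level_pressures(A, B, p_surf=101325.0)
```

By construction, the top half level is at constant pressure and the lowest half level coincides with the surface pressure; differences of consecutive half-level pressures give the layer thicknesses entering ``dp/deta``.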
48 47 49 48 It is recommended to use the pre-calculated parameter 77 by setting ``ETA`` to 1 whenever possible. 50 49 51 Setting parameter ``ETA`` to 1 normallydisables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time.50 Setting the parameter ``ETA`` to 1 disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time. 52 51 53 52 .. note:: 54 However, the calculation on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you are really need it for debugging since this is a very expensive option. In this case``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity.53 However, the calculations on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you really need it for debugging, since this is a very expensive option. In this case, ``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity. 55 54 56 55 The parameters ``RESOL``, ``GRID``, ``UPPER``, ``LOWER``, ``LEFT``, ``RIGHT`` still apply. As for calculations on the Gaussian grid, the spectral resolution parameter ``RESOL`` should be compatible with the grid resolution (see previous subsection). -
For_developers/Sphinx/source/Ecmwf/access.rst
rba99230 rf20342a 1 1 ************ 2 Access Modes2 Access modes 3 3 ************ 4 4 … 8 8 .. _CDS API: https://cds.climate.copernicus.eu/api-how-to 9 9 10 Access to the ECMWF M arsarchive is divided into two groups: **member state** users and **public** users.10 Access to the ECMWF MARS archive is divided into two groups: **member state** users and **public** users. 11 11 12 **Member 13 This access mode allows the user to work directly on the ECMWF Linux Member State Servers or via a Web Access Toolkit ``ecaccess`` through a local Member State Gateway Server. This enables the user to have direct and full access to the Mars archive. There might be some limitations in user rights such as the declined access to the latest forecasts. This has to be discussed with the `Computing Representative`_. This user group is also able to work from their local facilities without a gateway server in the same way a **public** user would. The only difference is the connection with the Web API. However, thisis automatically selected by ``flex_extract``.12 **Member-state user**: 13 This access mode allows the user to work directly on an ECMWF member-state Linux server or via the ``ecaccess`` Web-Access Toolkit through a local member-state Gateway server. This enables the user to have direct and full access to the MARS archive. There might be some limitations in user rights, such as no access to the latest forecasts. In case such data are needed, this has to be agreed upon with the national `Computing Representative`_. This user group is also able to work from their local facilities without a gateway server in the same way a **public** user would. The only difference is the connection with the Web API, which, however, is automatically selected by ``flex_extract``. 14 14 15 15 16 16 **Public user**: 17 This access mode allows every user to access the ECMWF `public datasets`_ from their local facilities. 
``Flex_extract`` is able (tested for the use with ``FLEXPART``) to extract the re-analysis datasets such as ERA-Interim and CERA-20C. The main difference to the **member state user** is the method of access with the Web API and the availability of data. For example, in ERA-Interim there is only a 6-hourly temporal resolution instead of 3 hours. The access method is selected by providing the command line argument "public=1" and providing the MARS keyword "dataset" in the ``CONTROL`` file. Also, the user has to explicitly accept the license of the dataset to be retrieved. This can be done as described in the installation process at section :ref:`ref-licence`.17 This access mode allows every user to access the ECMWF `public datasets`_ from their local facilities. ``Flex_extract`` is able to extract the re-analysis datasets such as ERA-Interim and CERA-20C for use with ``FLEXPART`` (tested). The main difference to the **member-state user** is the method of access with the Web API and the availability of data. For example, in ERA-Interim, only a 6-hourly temporal resolution is available instead of 3 hours. The access method is selected by providing the command line argument "public=1" and providing the MARS keyword "dataset" in the ``CONTROL`` file. Also, the user has to explicitly accept the license of the data set to be retrieved. This can be done as described in the installation process at section :ref:`ref-licence`. 18 18 19 19 .. note:: 20 20 21 The availability of the public dataset *ERA5* with the ECMWF Web API was cancelled in March 2019. The oportunity of local retrieval of this dataset was moved to the `Climate Data Store`_ which uses another Web API named `CDS API`_. This Data Store stores the data on explicit webservers for faster and easier access. Unfortunately, for *ERA5* there are only surface level and pressure level data available for *public users*. 
In the case of a *member user* it is possible to bypass the request to the MARS archive from ECMWF to retrieve the data. ``Flex_extract`` is already modified to use this API so *member user* can already retrieve *ERA5* data while *public users* have to wait until model level are available.21 The availability of the public dataset *ERA5* with the ECMWF Web API was cancelled by ECMWF in March 2019. Local retrieval of this dataset now has to use the `Climate Data Store`_ (CDS) with a different Web API called `CDS API`_. CDS stores the data on dedicated web servers for faster and easier access. Unfortunately, for *ERA5*, only surface level and pressure level data are available for *public users*, which is not enough to run FLEXPART. For a *member user*, it is possible to pass the request to the MARS archive to retrieve the data. ``Flex_extract`` is already modified to use this API so a *member user* can already retrieve *ERA5* data for FLEXPART while *public users* have to wait until model levels are made available. 22 22 23 23 For information on how to register see :ref:`ref-registration`. -
For_developers/Sphinx/source/Ecmwf/ec-links.rst
rba99230 rf20342a 1 1 ################################ 2 Link Collection for Quick finder2 Link collection 3 3 ################################ 4 4 5 5 6 ECMWF - General Overview6 ECMWF - General overview 7 7 `ECMWF Home <https://www.ecmwf.int/>`_ 8 8 9 `ECMWF Training <https://www.ecmwf.int/en/learning>`_9 `ECMWF training <https://www.ecmwf.int/en/learning>`_ 10 10 11 `General User Documentation <https://software.ecmwf.int/wiki/display/UDOC/User+Documentation>`_11 `General user documentation <https://software.ecmwf.int/wiki/display/UDOC/User+Documentation>`_ 12 12 13 `Software Support <https://confluence.ecmwf.int/display/SUP>`_13 `Software support <https://confluence.ecmwf.int/display/SUP>`_ 14 14 15 15 MARS 16 16 `MARS user documentation <https://confluence.ecmwf.int//display/UDOC/MARS+user+documentation>`_ 17 17 18 `MARS Keywords <https://software.ecmwf.int/wiki/display/UDOC/MARS+keywords>`_18 `MARS keywords <https://software.ecmwf.int/wiki/display/UDOC/MARS+keywords>`_ 19 19 20 `MARS Content <https://confluence.ecmwf.int/display/UDOC/MARS+content>`_20 `MARS content <https://confluence.ecmwf.int/display/UDOC/MARS+content>`_ 21 21 22 `MARS Actions <https://confluence.ecmwf.int/display/UDOC/MARS+actions>`_22 `MARS actions <https://confluence.ecmwf.int/display/UDOC/MARS+actions>`_ 23 23 24 `Parameter Database <https://apps.ecmwf.int/codes/grib/param-db>`_24 `Parameter database <https://apps.ecmwf.int/codes/grib/param-db>`_ 25 25 26 26 Registration 27 `Contact of Computing Representative's <https://www.ecmwf.int/en/about/contact-us/computing-representatives>`_27 `Contacts of Computing Representatives <https://www.ecmwf.int/en/about/contact-us/computing-representatives>`_ 28 28 29 29 `Public registration for ECMWF Web API <https://software.ecmwf.int/wiki/display/WEBAPI/Access+MARS>`_ 30 30 31 `CDS Registration <https://cds.climate.copernicus.eu/user/register>`_31 `CDS registration <https://cds.climate.copernicus.eu/user/register>`_ 32 32 33 Available Member State 
Datasets 34 `Web Interface for accessing member state datasets <http://apps.ecmwf.int/archive-catalogue/>`_33 Member-State data sets available 34 `Web interface for accessing member-state data sets <http://apps.ecmwf.int/archive-catalogue/>`_ 35 35 36 ` Available datasetsfor member state users <https://www.ecmwf.int/en/forecasts/datasets>`_36 `Data sets available for member state users <https://www.ecmwf.int/en/forecasts/datasets>`_ 37 37 38 Available Public Datasets 39 `Web Interface for accessing public datasets <http://apps.ecmwf.int/datasets/>`_38 Public data sets available 39 `Web interface for accessing public data sets <http://apps.ecmwf.int/datasets/>`_ 40 40 41 `ECMWF's public data sets <https://confluence.ecmwf.int/display/WEBAPI/Available+ECMWF+Public+Datasets>`_41 `ECMWF's public data sets <https://confluence.ecmwf.int/display/WEBAPI/Available+ECMWF+Public+Datasets>`_ 42 42 43 `Public data set Licences <https://software.ecmwf.int/wiki/display/WEBAPI/Available+ECMWF+Public+Datasets>`_43 `Public data set licences <https://software.ecmwf.int/wiki/display/WEBAPI/Available+ECMWF+Public+Datasets>`_ 44 44 45 `ERA5 public dataset Licence <https://cds.climate.copernicus.eu/cdsapp#!/search?type=dataset>`_45 `ERA5 public dataset licence <https://cds.climate.copernicus.eu/cdsapp#!/search?type=dataset>`_ 46 46 47 47 48 48 Datasets 49 49 Overview 50 `Complete list of data sets <https://www.ecmwf.int/en/forecasts/datasets>`_50 `Complete list of data sets <https://www.ecmwf.int/en/forecasts/datasets>`_ 51 51 52 52 `What is climate reanalysis <https://www.ecmwf.int/en/research/climate-reanalysis>`_ … … 57 57 `List of real_time datasets <https://www.ecmwf.int/en/forecasts/datasets/catalogue-ecmwf-real-time-products>`_ 58 58 59 `Atmospheric model - HRES ( ourtypical operational dataset) <https://www.ecmwf.int/en/forecasts/datasets/set-i>`_59 `Atmospheric model - HRES (typical operational dataset) <https://www.ecmwf.int/en/forecasts/datasets/set-i>`_ 60 60 61 61 
`Atmospheric model - ENS (15-day ensemble forecast) <https://www.ecmwf.int/en/forecasts/datasets/set-iii>`_ … … 66 66 `ERA-Interim documentation <https://www.ecmwf.int/en/elibrary/8174-era-interim-archive-version-20>`_ 67 67 68 `ERA-Interim data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/era-interim>`_68 `ERA-Interim data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/era-interim>`_ 69 69 70 70 CERA-20C 71 71 `What is CERA-20C <https://software.ecmwf.int/wiki/display/CKB/What+is+CERA-20C>`_ 72 72 73 `CERA-20C data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/cera-20c>`_73 `CERA-20C data set <https://www.ecmwf.int/en/forecasts/datasets/archive-datasets/reanalysis-datasets/cera-20c>`_ 74 74 75 75 ERA5 … … 84 84 `ERA5 Documentation <https://software.ecmwf.int/wiki/display/CKB/ERA5+data+documentation>`_ 85 85 86 Third Party Libraries87 `ECMWF Web API Home <https://software.ecmwf.int/wiki/display/WEBAPI/ECMWF+Web+API+Home>`_86 Third-party libraries 87 `ECMWF Web API home <https://software.ecmwf.int/wiki/display/WEBAPI/ECMWF+Web+API+Home>`_ 88 88 89 89 `Building ECMWF software with gfortran <https://software.ecmwf.int/wiki/display/SUP/2015/05/11/Building+ECMWF+software+with+gfortran>`_ … … 103 103 104 104 105 Scientific Information106 `Octahedral reduced Gaussian Grid <https://confluence.ecmwf.int/display/FCST/Introducing+the+octahedral+reduced+Gaussian+grid>`_105 Scientific information 106 `Octahedral reduced Gaussian grid <https://confluence.ecmwf.int/display/FCST/Introducing+the+octahedral+reduced+Gaussian+grid>`_ 107 107 108 108 `Precipitation <https://www.ecmwf.int/en/newsletter/147/meteorology/use-high-density-observations-precipitation-verification>`_ 109 109 110 110 111 Technical Information of ECMWF serves111 Technical information for ECMWF servers 112 112 113 `Introduct ion presentation toSLURM 
<https://confluence.ecmwf.int/download/attachments/73008494/intro-slurm-2017.pdf?version=1&modificationDate=1488574096323&api=v2>`_113 `Introductory presentation of SLURM <https://confluence.ecmwf.int/download/attachments/73008494/intro-slurm-2017.pdf?version=1&modificationDate=1488574096323&api=v2>`_ 114 114 115 Trouble shooting116 `ECMWF Web API Troubleshooting <https://confluence.ecmwf.int/display/WEBAPI/Web-API+Troubleshooting>`_115 Trouble-shooting 116 `ECMWF Web API trouble-shooting <https://confluence.ecmwf.int/display/WEBAPI/Web-API+Troubleshooting>`_ 117 117 118 118 -
For_developers/Sphinx/source/Ecmwf/hintsecmwf.rst
rba99230 rf20342a 1 1 ################################## 2 Hints to specify dataset retrieval2 Hints for data set selection 3 3 ################################## 4 4 … … 8 8 9 9 10 How can I find out what data isavailable?10 How can I find out what data are available? 11 11 ========================================== 12 12 13 Go to the `Web Interface for accessing member state datasets <http://apps.ecmwf.int/archive-catalogue/>`_13 Go to the `Web Interface for accessing member-state data sets <http://apps.ecmwf.int/archive-catalogue/>`_ 14 14 and click yourself through the steps to define your set of data and see what is available to you. 15 15 16 For public users there is the `Web Interface for accessing public data sets <http://apps.ecmwf.int/datasets/>`_.16 For public users there is the `Web Interface for accessing public data sets <http://apps.ecmwf.int/datasets/>`_. 17 17 18 18 -
For_developers/Sphinx/source/Ecmwf/msdata.rst
rba99230 rf20342a 1 ######################################### 2 Available Datasets for Member State users3 ######################################### 1 ########################################## 2 Available data sets for member-state users 3 ########################################## 4 4 5 5 6 6 7 Model 7 Model-level data 8 8 ================ 9 9 … … 11 11 12 12 13 Surface leveldata13 Surface data 14 14 ================== 15 15 -
For_developers/Sphinx/source/Ecmwf/pubdata.rst
rba99230 rf20342a 1 Available Datasets for Public users2 *********************************** 1 Available data sets for public users 2 ************************************ 3 3 4 INPREPARATION4 UNDER PREPARATION 5 5 6 6 -
For_developers/Sphinx/source/Evaluation/staticcode.rst
rba99230 rf20342a 1 1 ******************** 2 Static Code Analysis2 Static code analysis 3 3 ******************** 4 4 -
For_developers/Sphinx/source/Evaluation/testcases.rst
rb1674ed rf20342a 1 1 ******************** 2 Test cases2 Test cases 3 3 ******************** 4 4 … … 11 11 12 12 13 Comparison of gribfiles13 Comparison of GRIB files 14 14 ======================== 15 15 -
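The test cases for comparing GRIB files mentioned above essentially reduce to decoding the field values of two files and comparing them message by message. A minimal sketch of such a comparison, assuming the python-eccodes bindings are installed (see the local installation section); function and file names are illustrative and are not the actual flex_extract test scripts:

```python
def max_abs_diff(a, b):
    """Largest absolute difference between two sequences of field values."""
    return max(abs(x - y) for x, y in zip(a, b))

def compare_grib(file_a, file_b, tol=1e-6):
    """True if both files hold the same number of messages and all decoded
    values agree within tol. Requires the python-eccodes bindings."""
    import eccodes  # lazy import: only needed when real GRIB files are compared
    with open(file_a, 'rb') as fa, open(file_b, 'rb') as fb:
        while True:
            ga = eccodes.codes_grib_new_from_file(fa)
            gb = eccodes.codes_grib_new_from_file(fb)
            if ga is None or gb is None:
                # both files must end at the same message count
                return ga is None and gb is None
            diff = max_abs_diff(eccodes.codes_get_values(ga),
                                eccodes.codes_get_values(gb))
            eccodes.codes_release(ga)
            eccodes.codes_release(gb)
            if diff > tol:
                return False
```

A tolerance is used instead of exact equality because GRIB packing may introduce rounding differences between otherwise identical fields.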
For_developers/Sphinx/source/Installation/local.rst
rb1674ed rf20342a 51 51 The installation is the same for the access modes **member** and **public**. 52 52 53 The environment on your local system has to provide the sesoftware packages53 The environment on your local system has to provide the following software packages 54 54 and libraries, since the preparation of the extraction and the post-processing is done on the local machine: 55 55 56 +------------------------------------------------ +-----------------+57 | Python part | Fortran part |58 +------------------------------------------------ +-----------------+59 | * `Python3`_ | * `gfortran`_|60 | * `numpy`_ | * `fftw3`_|61 | * `genshi`_ | * `eccodes`_|62 | * `eccodes for python`_ | * `emoslib`_|63 | *`ecmwf-api-client`_ (everything except ERA5) | |64 | *`cdsapi`_ (just for ERA5 and member user) | |65 +------------------------------------------------ +-----------------+56 +-------------------------------------------------+-----------------+ 57 | Python part | Fortran part | 58 +-------------------------------------------------+-----------------+ 59 | 1. `Python3`_ | 1. `gfortran`_ | 60 | 2. `numpy`_ | 2. `fftw3`_ | 61 | 3. `genshi`_ | 3. `eccodes`_ | 62 | 4. `eccodes for python`_ | 4. `emoslib`_ | 63 | 5. `ecmwf-api-client`_ (everything except ERA5) | | 64 | 6. `cdsapi`_ (just for ERA5 and member user) | | 65 +-------------------------------------------------+-----------------+ 66 66 67 67 68 68 .. _ref-prep-local: 69 69 70 Prepar e local environment71 ========================= 72 73 The easiest way to install all required packages is to use the package management system of your Linux distribution 70 Preparing the local environment 71 =============================== 72 73 The easiest way to install all required packages is to use the package management system of your Linux distribution which requires admin rights. 74 74 The installation was tested on a *Debian GNU/Linux buster* and an *Ubuntu 18.04 Bionic Beaver* system. 75 75 76 76 .. 
code-block:: sh 77 77 78 # On a Debian or Debian-derived sytem (e. g. Ubuntu) system you may use the following commands (or equivalent commands of your preferred package manager): 79 # (if not already available): 78 # On a Debian or Debian-derived (e. g. Ubuntu) system, 79 # you may use the following commands (or equivalent commands of your preferred package manager): 80 # (if respective packages are not already available): 80 81 apt-get install python3 (usually already available on GNU/Linux systems) 81 82 apt-get install python3-eccodes … … 86 87 apt-get install libeccodes-dev 87 88 apt-get install libemos-dev 88 # Some of these packages will pull in further packages as dependencies. This is fine, and some are even needed by ``flex_extract''. 89 90 91 # As currently the CDS and ECMWF API packages are not available as Debian packages, they need to be installed outside of the Debian (Ubuntu etc.) package management system. The recommended way is: 89 # Some of these packages will pull in further packages as dependencies. 90 # This is fine, and some are even needed by ``flex_extract''. 91 92 # As currently the CDS and ECMWF API packages are not available as Debian packages, 93 # they need to be installed outside of the Debian (Ubuntu etc.) package management system. 94 # The recommended way is: 92 95 apt-get install pip 93 96 pip install cdsapi … … 96 99 .. note:: 97 100 98 I n case you would like to use Anaconda Python we recommend youfollow the installation instructions of99 `Anaconda Python Installation for Linux <https://docs.anaconda.com/anaconda/install/linux/>`_ and then install the100 ``eccodes`` package from ``conda`` with:101 If you are using Anaconda Python, we recommend to follow the installation instructions of 102 `Anaconda Python Installation for Linux <https://docs.anaconda.com/anaconda/install/linux/>`_ 103 and then install the ``eccodes`` package from ``conda`` with: 101 104 102 105 .. 
code-block:: bash … … 104 107 conda install conda-forge::python-eccodes 105 108 106 The CDS API ( cdsapi) is required for ERA5 data and the ECMWF Web API (ecmwf-api-client) for all other public datasets.109 The CDS API (``cdsapi``) is required for ERA5 data and the ECMWF Web API (ecmwf-api-client) for all other public datasets. 107 110 108 111 .. note:: 109 112 110 Since **public users** currently don't have access to the full *ERA5* dataset they can skip the installation of the ``CDS API``.111 112 Both user groups have to provide keys with their credentials for the Web API 's in their home directory. Therefore, followthese instructions:113 Since **public users** currently don't have access to the full *ERA5* dataset, they can skip the installation of the CDS API. 114 115 Both user groups have to provide keys with their credentials for the Web APIs in their home directory, following these instructions: 113 116 114 117 ECMWF Web API: 115 Go to `MARS access`_ website and log in with your credentials. Afterwards, on this site in section "Install ECMWF KEY" the key for the ECMWF Web API should be listed. Please follow the instructions in this section under 1 (save the key in a file `.ecmwfapirc` in your home directory).118 Go to the `MARS access`_ website and log in with your credentials. Afterwards, go to the section "Install ECMWF KEY", where the key for the ECMWF Web API should be listed. Please follow the instructions in this section under 1 (save the key in a file ``.ecmwfapirc`` in your home directory). 116 119 117 120 CDS API: 118 Go to `CDS API registration`_ and register there too. Log in at the `cdsapi`_ website and follow the instructions at section "Install the CDS API key" to save your credentials in a `.cdsapirc` file.121 Go to `CDS API registration`_ and register there, too. Log in on the `cdsapi`_ website and follow the instructions in the section "Install the CDS API key" to save your credentials in file ``.cdsapirc``. 119 122 120 123 121 124 .. 
_ref-test-local: 122 125 123 Test local environment 124 ====================== 125 126 Check the availability of the python packages by typing ``python3`` in a terminal window and run the ``import`` commands in the python shell. If there are no error messages, you succeeded in setting up the environment. 127 126 Testing the local environment 127 ============================= 128 129 Check the availability of the python packages by typing ``python3`` in a terminal window and run the ``import`` commands in the python shell: 128 130 .. code-block:: python 129 131 … … 135 137 import ecmwfapi 136 138 137 138 139 Test the Web API's 140 ------------------ 139 If there are no error messages, you succeeded in setting up the environment. 140 141 142 Testing the Web APIs 143 -------------------- 141 144 142 145 You can start very simple test retrievals for both Web APIs to be sure that everything works. This is recommended to minimise the range of possible errors using ``flex_extract`` later on. … … 148 151 149 152 +----------------------------------------------------------+----------------------------------------------------------+ 150 |Please use this piece of Python code for **Member user**: |Please use this piece of Python code for**Public user**: |153 |Please use this Python code snippet as a **Member user**: |Please use this Python code snippet as a **Public user**: | 151 154 +----------------------------------------------------------+----------------------------------------------------------+ 152 155 |.. code-block:: python |.. code-block:: python | … … 178 181 Extraction of ERA5 data via CDS API might take time as currently there is a high demand for ERA5 data. 
Therefore, as a simple test for the API just retrieve pressure-level data (even if that is NOT what we need for FLEXPART), as they are stored on disk and don't need to be retrieved from MARS (which is the time-consuming action): 179 182 180 Please use th is piece of Python code to retrieve a small sample of *ERA5* pressure levels:183 Please use the following Python code snippet to retrieve a small sample of *ERA5* pressure level data: 181 184 182 185 .. code-block:: python … … 204 207 .. **Member-state user** 205 208 206 Please use th is piece of Python codeto retrieve a small *ERA5* data sample as a **member-state user**! The **Public user** do not have access to the full *ERA5* dataset!209 Please use the following Python code snippet to retrieve a small *ERA5* data sample as a **member-state user**! The **Public user** do not have access to the full *ERA5* dataset! 207 210 208 211 .. code-block:: python … … 268 271 ================== 269 272 270 First prepare the Fortran ``makefile`` for your environment and set it in the ``setup.sh`` script. (See section :ref:`Fortran Makefile <ref-convert>` for more information.) 271 ``flex_extract`` comes with two ``makefiles`` prepared for the ``gfortran`` compiler. One for the normal use ``makefile_fast`` and one for debugging ``makefile_debug`` which is usually only resonable for developers. 272 273 They assume that ``eccodes`` and ``emoslib`` are installed as distribution packages and can be found at ``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number. 273 First, adapt the Fortran ``makefile`` for your environment (if necessary) and insert it into ``setup.sh`` script (see :ref:`Fortran Makefile <ref-convert>` for more information). 274 They can be found at ``flex_extract_vX.X/Source/Fortran/``, where ``vX.X`` should be substituted by the current flex_extract version number. 274 275 275 276 .. caution:: … … 277 278 ``makefiles`` if other than standard paths are used. 
278 279 279 So starting from the root directory of ``flex_extract``, 280 go to the ``Fortran`` source directory and open the ``makefile`` of your 281 choice to modify with an editor of your choice. We use the ``nedit`` in this case. 280 Thus, go to the ``Fortran`` source directory and open the ``makefile`` of your 281 choice, and check / modify with an editor of your choice: 282 282 283 283 .. code-block:: bash … … 286 286 nedit makefile_fast 287 287 288 Edit the paths to the ``eccodes`` library on your local machine. 289 288 Set the paths to the ``eccodes`` library on your local machine, if necessary. 290 289 291 290 .. caution:: … … 302 301 to find out the path to the ``eccodes`` library. 303 302 304 Substitute these paths in the ``makefile`` forparameters **ECCODES_INCLUDE_DIR**305 and **ECCODES_LIB** and save it.303 Assign these paths to the parameters **ECCODES_INCLUDE_DIR** 304 and **ECCODES_LIB** in the makefile, and save it. 306 305 307 306 .. code-block:: bash 308 307 309 # these are the paths on a current Debian 10 Testing system (May 2019)308 # these are the paths on Debian Buster: 310 309 ECCODES_INCLUDE_DIR=/usr/lib/x86_64-linux-gnu/fortran/gfortran-mod-15/ 311 310 ECCODES_LIB= -L/usr/lib -leccodes_f90 -leccodes -lm … … 313 312 314 313 The Fortran program called ``calc_etadot`` will be compiled during the 315 installation process. Thereforethe name of the ``makefile`` to be used needs to be given in ``setup.sh``.314 installation process. Therefore, the name of the ``makefile`` to be used needs to be given in ``setup.sh``. 316 315 317 316 In the root directory of ``flex_extract``, open the ``setup.sh`` script 318 and adapt the installation parameters in the section labelled with319 "AVAILABLE COMMANDLINE ARGUMENTS TO SET" like shown below.317 with an editor and adapt the installation parameters in the section labelled with 318 "AVAILABLE COMMANDLINE ARGUMENTS TO SET" as shown below: 320 319 321 320 -
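The CDS API test retrieval recommended in the section above can be sketched as follows. The request values (variable, pressure level, date) are illustrative only; a valid ``~/.cdsapirc`` file is required before the actual download is run:

```python
# ERA5 pressure-level test request for the CDS API.
# All request values below are illustrative; adjust them as needed.
request = {
    'product_type': 'reanalysis',
    'format': 'grib',
    'variable': 'temperature',
    'pressure_level': '850',
    'year': '2019',
    'month': '01',
    'day': '01',
    'time': '12:00',
}

def retrieve_sample(target='era5_sample.grib'):
    """Download the sample; requires the cdsapi package and ~/.cdsapirc."""
    import cdsapi  # lazy import so the request can be inspected without cdsapi
    client = cdsapi.Client()
    client.retrieve('reanalysis-era5-pressure-levels', request, target)
```

Call ``retrieve_sample()`` to start the actual download; pressure-level data are stored on disk at the CDS, so this completes much faster than a model-level MARS request.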
For_developers/Sphinx/source/dev_guide.rst
rb1674ed rf20342a 6 6 .. note:: 7 7 8 This section still needs to be done.8 This section still needs to be written. 9 9 10 .. repository (how / who manages the code, where to get)10 .. repository (how / who manages the code, where to get) 11 11 12 12 -
For_developers/Sphinx/source/documentation.rst
rb1674ed rf20342a 3 3 ************* 4 4 5 Overview ( Under construction)5 Overview (under construction) 6 6 7 Control & Input Data7 Control & input data 8 8 9 Output Data (Under construction)9 Output data (under construction) 10 10 11 Disaggregation of Flux Data (Under construction)11 Disaggregation of flux data (under construction) 12 12 13 Vertical Coordinate (Under construction)14 - Methods (GAUSS, ETA, OMEGA)13 Vertical coordinate (under construction) 14 - methods (GAUSS, ETA, OMEGA) 15 15 - calc_etadot 16 16 17 Auto Generated Documentation17 Auto-generated documentation 18 18 - Python 19 - Fortran ( Under construction)19 - Fortran (under construction) 20 20 21 21 -
For_developers/Sphinx/source/ecmwf_data.rst
rba99230 rf20342a 7 7 8 8 9 The European Centre for Medium-Range Weather Forecasts (`ECMWF`_), based in Reading, UK, is an independent intergovernmental organisation supported by 34 states. It is both a research institute and a full time operational service. It produces global numerical weather predictions and some other data which is fully available to the national meteorological services in the `Member States`_, Co-operating States and the broader community. Especially, the published re-analysis datasets are made available to the public with some limits in specific datasets.9 The European Centre for Medium-Range Weather Forecasts (`ECMWF`_), based in Reading, UK, is an independent intergovernmental organisation supported by 34 states. It is both a research institute and a 24 h / 7 d operational service. It produces global numerical weather predictions and some other data which are fully available to the national meteorological services in the `Member States`_, Co-operating States, and to some extent to the broader community. Specifically, re-analysis data sets are made available to the public, however, with some limitations for specific data sets. 10 10 11 The amount and structure of the available data from ECMWF is very complex. The operational data changes regularly in time and spatial resolution, physics and parameter. This has to be taken into account carefully and each user has to investigate his dataset of interest carefully before selecting andretrieving it with ``flex_extract``.12 The re-analysis datasets are consistent in all the above mentioned topics over their whole period but they have each their own specialities which makes treatment with ``flex_extract`` special in some way. For example, they have different starting times for their forecasts or different parameter availability. 
They also have differences in time and spatial resolution and most importantly for ``flex_extract`` they are different in the way of providing the vertical coordinate.11 There is a vast amount of data with a complex structure available from ECMWF. The operational data undergo changes with respect to temporal and spatial resolution, model physics and parameters available. This has to be taken into account carefully and every user should have a clear idea of the data set intended to be used before retrieving it with ``flex_extract``. 12 Each re-analysis data set is homogeneous with respect to resolution etc., but the different re-analyses all have specific properties which require a corresponding treatment with ``flex_extract``. For example, the starting times of the forecasts may be different, or the availability of parameters (model output variables) may vary. They also differ in their temporal and spatial resolution, and - most importantly for ``flex_extract`` - there are differences in the way the vertical wind component may be accessed. 13 13 14 There is much to learn from ECMWF and their datasets and data handling and this might be confusing at first. We therefore collected the most important information for ``flex_extract`` users. In the following sections the user can use them to get to know enough to understand how ``flex_extract`` is best used and to select the parameters ofthe ``CONTROL`` files.14 As there is much to learn about ECMWF and its data sets and data handling, it might be confusing at first. Therefore, we have here collected the information which is most important for ``flex_extract`` users. Study the following sections to learn how ``flex_extract`` is best used, and to select the right parameters in the ``CONTROL`` files. 15 15 16 16 17 17 :doc:`Ecmwf/access` 18 Description of available access methods tothe ECMWF data.18 Description of available methods to access the ECMWF data. 
19 19 20 20 :doc:`Ecmwf/msdata` 21 Information about available data and parameters for member 21 Information about available data and parameters for member-state users which can be retrieved with ``flex_extract`` 22 22 23 23 :doc:`Ecmwf/pubdata` 24 Information about available data and parameters for the public data sets which can be retrieved with ``flex_extract``24 Information about available data and parameters for the public data sets which can be retrieved with ``flex_extract`` 25 25 26 26 :doc:`Ecmwf/hintsecmwf` 27 Collection of hints to best find information to define the data set for retrievementand28 to define the ``CONTROL`` files.27 Collection of hints to best find information to define the data set for retrieval, and 28 to define the content of the ``CONTROL`` files. 29 29 30 30 :doc:`Ecmwf/ec-links` 31 Link collection for additional and useful information as well as references to specific dataset publications.31 Link collection for additional and useful information as well as references to publications on specific data sets. 32 32 33 33 -
For_developers/Sphinx/source/evaluation.rst
rb1674ed rf20342a 6 6 .. note:: 7 7 8 This section in the online documentation still needs to be done.9 Currently, evaluation methods and information can be found in the `flex_extract discussion paper <https://www.geosci-model-dev-discuss.net/gmd-2019-358/>`_ of the Geoscientific Model Development journal.8 This section still needs to be written. 9 Currently, evaluation methods can be found in the `flex_extract discussion paper <https://www.geosci-model-dev-discuss.net/gmd-2019-358/>`_ of the journal Geoscientific Model Development. 10 10 11 11 -
For_developers/Sphinx/source/quick_start.rst
rb936fd3 rf20342a 263 263 CONTROL_OD.OPER.FC.gauss.highres 264 264 CONTROL_OD.OPER.FC.operational 265 CONTROL_OD.OPER.FC.twice aday.1hourly266 CONTROL_OD.OPER.FC.twice aday.3hourly265 CONTROL_OD.OPER.FC.twicedaily.1hourly 266 CONTROL_OD.OPER.FC.twicedaily.3hourly 267 267 268 268 … … 277 277 278 278 279 A common problem for beginners in retrieving ECMWF datasets is the mismatch in the definition of these parameters. For example, if you would like to retrieve operational data before ``June 25th 2013`` and set the maximum level to ``137`` you will get an error because this number of levels was first introduced at this effective day. So, be cautious in the combination of space and time resolution as well as the field types which are not available all the time.279 A common problem for beginners in retrieving ECMWF datasets is a mismatch in the choice of values for these parameters. For example, if you try to retrieve operational data for 24 June 2013 or earlier and set the maximum level to 137, you will get an error because this number of levels was introduced only on 25 June 2013. Thus, be careful in the combination of space and time resolution as well as the field types. 280 280 281 281 282 282 .. note:: 283 283 284 Sometimes it might not be clear how specific parameters in the control file must be set in terms of format. Please see the description of the parameters in section `CONTROL parameters <Documentation/Input/control_params.html>`_ or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_ 285 286 287 In the following we shortly discuss the main retrieval opportunities of the different datasets and categoize the ``CONTROL`` files. 284 Sometimes it might not be clear how specific parameters in the control file must be set in terms of format. 
Please consult the description of the parameters in section `CONTROL parameters <Documentation/Input/control_params.html>`_ or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_ 285 286 In the following, we briefly discuss the typical retrievals for the different datasets and point to the respective ``CONTROL`` files. 288 287 289 288 … 291 290 --------------- 292 291 293 The main difference in the definition of a ``CONRTOL`` file for a public dataset is the setting of the parameter ``DATASET``. This specification enables the selection of a public dataset in MARS. Otherwisethe request would not find the dataset.292 The main characteristic in the definition of a ``CONTROL`` file for a public dataset is the parameter ``DATASET``. Its specification enables the selection of a public dataset in MARS. Without this parameter, the request would not find the dataset. 294 293 For the two public datasets *CERA-20C* and *ERA-Interim* an example file with the ending ``.public`` is provided and can be used straightaway. 295 294 … 299 298 CONTROL_EI.public 300 299 301 For *CERA-20C* it seems that there are no differences in the dataset against the full dataset, while the *public ERA-Interim* has only analysis fields every 6 hour without filling forecasts in between for model levels. Thereforeit is only possible to retrieve 6-hourly data for *public ERA-Interim*.300 For *CERA-20C* it seems that there are no differences compared to the full dataset, whereas the *public ERA-Interim* has only 6-hourly analysis fields, without forecasts to fill in between, for model levels. Therefore, it is only possible to retrieve 6-hourly data for *public ERA-Interim*. 302 301 303 302 .. note:: 304 303 305 In general, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. 
As soon as this is possible it will be announced at the community website and per newsletter.304 In principle, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. As soon as this is possible it will be announced at the community website and on the FLEXPART user email list. 306 305 307 306 … … 309 308 ---- 310 309 311 For this dataset it is important to keep in mind that the dataset is available for the period 09/1901 until 12/2010 and the temporal resolution is limited to 3-hourly fields. 312 It is also a pure ensemble data assimilation dataset and is stored under the ``enda`` stream. It has ``10`` ensemble members. The example ``CONTROL`` files will only select the first member (``number=0``). You may change this to another number or a list of numbers (e.g. ``NUMBER 0/to/10``). 313 Another important difference to all other datasets is the forecast starting time which is 18 UTC. Which means that the forecast in *CERA-20C* for flux fields is 12 hours long. Since the forecast extends over a single day we need to extract one day in advance and one day subsequently. This is automatically done in ``flex_extract``. 310 For this dataset, it is important to keep in mind that it is available for the period 09/1901 until 12/2010, and that the temporal resolution is limited to 3 h. 311 It is also a pure ensemble data assimilation dataset and is stored under the ``enda`` stream. There are 10 ensemble members. The example ``CONTROL`` files will only select the first member (``number=0``). You may change this to another number or a list of numbers (e.g. ``NUMBER 0/to/10``). 312 Another important difference to all other datasets is that the forecast starting time is 18 UTC. This means that forecasts for flux fields cover 12 hours. Since the forecast extends over a single day we need to extract one day in advance and one day subsequently. 
This is automatically done in ``flex_extract``. 313 314 314 315 315 … 317 317 ----- 318 318 319 This is the newest re-analysis dataset and has a temporal resolution of 1-hourly analysis fields. Up to dateit is available until April 2019 with regular release of new months.320 The original horizontal resolution is ``0.28125°`` which needs some caution in the definition of the domain, since the length of the domain in longitude or latitude direction must be an exact multiple of the resolution. It might be easier for users to use ``0.25`` for the resolution which MARS will automatically interpolate.321 The forecast starting time is ``06/18 UTC`` which is important for the flux data. This should be set in the ``CONTROL`` file via the ``ACCTIME 06/18`` parameter in correspondence with ``ACCMAXSTEP 12``and ``ACCTYPE FC``.319 This is the latest re-analysis dataset, and has a temporal resolution of 1-h (analysis fields). At the time of writing, it is available until April 2019 with regular release of new months. 320 The original horizontal resolution is 0.28125° which needs some caution in the definition of the domain, since the length of the domain in longitude or latitude direction must be an integer multiple of the resolution. It is also possible to use ``0.25`` for the resolution; MARS will then automatically interpolate to this resolution which is still close enough to be acceptable. 321 The forecast starting time is ``06/18 UTC`` which is important for the flux data. Correspondingly, one should set in the ``CONTROL`` file ``ACCTIME 06/18``, ``ACCMAXSTEP 12``, and ``ACCTYPE FC``. 322 322 323 323 .. note:: 324 324 325 We know that *ERA5* also has an ensemble data assimilation system but this is not yet retrievable with ``flex_extract`` since the deaccumulation of the flux fields works differently in this stream. 
Ensemble retrieval for *ERA5* is a future ToDo.325 *ERA5* also includes an ensemble data assimilation system but related fields are not yet retrievable with ``flex_extract`` since the deaccumulation of the flux fields works differently in this stream. Ensemble field retrieval for *ERA5* is a *to-do* for the future. 326 326 327 327 … … 330 330 ----------- 331 331 332 This re-analysis dataset will exceed its end of production at 31st August 2019! 333 It is then available from 1st January 1979 to 31st August 2019. The ``etadot`` is not available in this dataset. Therefore ``flex_extract`` must select the ``GAUSS`` parameter to retrieve the divergence field in addition. The vertical velocity is the calculated with the continuity equation in the Fortran program ``calc_etadot``. Since the analysis fields are only available for every 6th hour, the dataset can be made 3 hourly by adding forecast fields in between. No ensemble members are available. 334 332 The production of this re-analysis dataset has stopped on 31 August 2019! 333 It is available for the period from 1 January 1979 to 31 August 2019. The ``etadot`` parameter is not available in this dataset. Therefore, one must use the ``GAUSS`` parameter, which retrieves the divergence field in addition and calculates the vertical velocity from the continuity equation in the Fortran program ``calc_etadot``. While the analysis fields are only available for every 6th hour, the dataset can be made 3-hourly by adding forecast fields in between. No ensemble members are available. 335 334 336 335 … … 338 337 ---------------- 339 338 340 This is the real time atmospheric model in high resolution with a 10-day forecast. This means it underwent regular adaptations and improvements over the years. Hence, retrieving data from this dataset needs extra attention in selecting correct settings of parameter. 
See :ref:`ref-tab-dataset-cmp` for the most important parameters.341 Nowadays, it is available 1 hourly by filling the gaps of the 6 hourly analysis fields with 1 hourly forecast fields. Since 4th June 2008 the eta coordinate is directly available so that ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°`` and in combination with ``137`` vertical levels can lead to troubles in retrieving this high resolution dataset in terms of job duration and quota exceedence. 342 It is recommended to submit such high resolution cases forsingle day retrievals (see ``JOB_CHUNK`` parameter in ``run.sh`` script) to avoid job failures due to exceeding limits.343 344 ``CONTROL`` files for normal dailyretrievals with a mix of analysis and forecast fields are listed below:339 This data set provides the output of the real-time atmospheric model runs in high resolution, including 10-day forecasts. The model undergoes frequent adaptations and improvements. Thus, retrieving data from this dataset requires extra attention in selecting correct settings of the parameters. See :ref:`ref-tab-dataset-cmp` for the most important parameters. 340 Currently, fields can be retrieved at 1 h temporal resolution by filling the gaps between analysis fields with 1-hourly forecast fields. Since 4 June 2008, the eta coordinate vertical velocity is directly available from MARS, therefore ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°`` and in combination with ``137`` vertical levels can lead to problems in terms of job duration and disk space quota. 341 It is recommended to submit such high resolution cases as single day retrievals (see ``JOB_CHUNK`` parameter in ``run.sh`` script) to avoid job failures due to exceeding limits. 342 343 ``CONTROL`` files for standard retrievals with a mix of analysis and forecast fields are listed below: 345 344 346 345 .. 
code-block:: bash … … 351 350 CONTROL_OD.OPER.FC.gauss.highres 352 351 353 These files defines the minimum number of parameters necessary to retrieve a daily subset. The setup of field types is optimal and should only be changed if the user understands what he does. The grid, domain and temporal resolution can be changed according to availability. 354 355 352 These files define the minimum number of parameters necessary to retrieve a daily subset. The given settings for the TYPE parameter are already optimised, and should only be changed if you know what you are doing. Grid, domain, and temporal resolution may be changed according to availability. 353 356 354 357 355 .. note:: 358 356 359 Please see `Information about MARS retrievement <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ to get to know hints about retrieval efficiency and troubleshooting. 360 357 Please see `Information about MARS retrievals <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ for hints about retrieval efficiency and troubleshooting. 361 358 362 359 363 360 Pure forecast 364 It is possible to retrieve pure forecasts exceeding a day. The forecast period available depends on the date and forecast field type. Please use MARS catalogue to check the availability. Below are some examples for 36 hour forecast of *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)*.365 The *CV* field type was only available 3-hourly from 2006 up to 2016. It is recommended to use the *CF* type since this is available from 1992 (3-hourly) on up to today in 1-hourly temporal resolution.
*CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)* which contain 50 ensemble members.366 Please be aware that in this case it is necessary to set the specific type for flux fields explicitly, otherwise it could select a default value which might be different from what you expect!361 It is possible to retrieve pure forecasts exceeding one day. The forecast period available depends on the date and forecast field type. Please use the MARS catalogue to check the availability. Below are some examples for 36 hour forecasts of *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)*. 362 The *CV* field type was only available 3-hourly from 2006 up to 2016. It is recommended to use the *CF* type since this is available from 1992 (3-hourly) on up to today (1-hourly). *CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)* which currently works with 50 ensemble members. 363 Please be aware that in this case it is necessary to set the type for flux fields explicitly, otherwise a default value might be selected, different from what you expect! 367 364 368 365 .. code-block:: bash … … 373 370 374 371 375 376 372 Half-day retrievals 377 If a forecast for just half a day is wanted it can be done by substituting the analysis fields also by forecast fields as shown in files with ``twiceaday`` in it. They produce a full day retrieval with pure 12 hour forecasts twice a day. It is also possible to use the operational version which would get the time information from ECMWF's environmental variables and therefore get the newest forecast per day. This version uses a ``BASETIME`` parameter which tells MARS to extract the exact 12 hours upfront to the selected date.
If the ``CONTROL`` file with ``basetime`` in the filename is used this can be done for any other date too.373 If a forecast is wanted for half a day only, this can be done by substituting the analysis fields by forecast fields as shown in files with ``twicedaily`` in their name. They produce a full-day retrieval with pure 12 hour forecasts, twice a day. It is also possible to use the operational version which would obtain the time information from ECMWF's environment variables and therefore use the newest forecast for each day. This version uses a ``BASETIME`` parameter which tells MARS to extract the exact 12 hours up to the selected date. If the ``CONTROL`` file with ``basetime`` in the filename is used, this can be done for any other date, too. 378 374 379 375 .. code-block:: bash … … 381 377 CONTROL_OD.OPER.FC.eta.basetime 382 378 CONTROL_OD.OPER.FC.operational 383 CONTROL_OD.OPER.FC.twiceaday.1hourly 384 CONTROL_OD.OPER.FC.twiceaday.3hourly 385 386 379 CONTROL_OD.OPER.FC.twicedaily.1hourly 380 CONTROL_OD.OPER.FC.twicedaily.3hourly 387 381 388 382 389 383 Ensemble members 390 The retrieval of ensemble members were already mentioned in the pure forecast section and for *CERA-20C* data. 391 In this ``flex_extract`` version there is an additional possibility to retrieve the *Ensemble Long window Data Assimilation (ELDA)* stream from the real-time dataset. This model version has (up to May 2019) 25 ensemble members and a control run (``number 0``). Starting from June 2019 it has 50 ensemble members. Therefore we created the possibility to double up the 25 ensemble members (before June 2019) to 50 members by taking the original 25 members from MARS and subtracting 2 times the difference between the member value and the control value. This is done by selecting the parameter ``DOUBLEELDA`` and set it to ``1``. 392 384 The retrieval of ensemble members was already mentioned in the pure forecast section and for *CERA-20C* data.
385 This ``flex_extract`` version allows retrieving the *Ensemble Long window Data Assimilation (ELDA)* stream from the operational dataset. Until May 2019, there were 25 ensemble members and a control run (``number 0``). Starting with June 2019, the number of ensemble members has been increased to 50. Therefore, we added the option to generate 25 additional "pseudo-ensemble members" for periods before June 2019. The original 25 members from MARS are taken, and the difference between the member value and the control value is subtracted twice. This is done if the parameter ``DOUBLEELDA`` is included and set to ``1``. 393 386 394 387 .. code-block:: bash … … 396 389 CONTROL_OD.ELDA.FC.eta.ens.double 397 390 CONTROL_OD.ENFO.PF.ens 398 399 400 391 401 392 … … 404 395 405 396 rrint 406 Decides if the precipitation flux data uses the old (``0``) or new (``1``) disaggregation scheme. See :doc:`Documentation/disagg` for explanaition.397 Selects the disaggregation scheme for precipitation flux: old (``0``) or new (``1``). See :doc:`Documentation/disagg` for explanation. 407 398 cwc 408 Decides if the total cloud water content will be retrieved (set to ``1``) in addition. This is the sum of cloud liquid and cloud ice water content.399 If present and set to ``1``, the total cloud water content will be retrieved in addition. This is the sum of cloud liquid and cloud ice water content. 409 400 addpar 410 With this parameter, an additional list of 2-dimensional, non-flux parameters can be retrieved. Use the format ``param1/param2/.../paramx`` to list the parameters. Please be consistent in using either the parameter IDs or the short names as defined by MARS.
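The doubling rule for ``DOUBLEELDA`` described above (the member–control difference is subtracted twice, i.e. each member is mirrored about the control run) can be sketched as follows. This is an illustrative sketch of the arithmetic only, not the actual ``flex_extract`` implementation; the function name and arrays are hypothetical.

```python
import numpy as np

def double_elda_members(control, members):
    """Double an ELDA ensemble by mirroring each member about the
    control run: pseudo = member - 2*(member - control)
                        = 2*control - member."""
    members = np.asarray(members, dtype=float)
    pseudo = 2.0 * np.asarray(control, dtype=float) - members
    # original members first, mirrored pseudo-members appended
    return np.concatenate([members, pseudo], axis=0)

# tiny illustration: 3 scalar "members" and a control value of 10.0
control = np.array([10.0])
members = np.array([[11.0], [9.5], [10.2]])
doubled = double_elda_members(control, members)
# perturbations are mirrored: 11.0 -> 9.0, 9.5 -> 10.5, 10.2 -> 9.8
```

Mirroring the perturbations doubles the sample size while keeping the mean of the combined ensemble exactly at the control value.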
411 402 doubleelda 412 Use this to double the ensemble member number by adding further disturbance to each member .403 Use this to double the ensemble member number by adding further disturbance to each member (to be used with 25 members). 413 404 debug 414 If set to ``1`` all temporary files were kept at the end. Otherwiseeverything except the final output files will be deleted.405 If set to ``1``, all temporary files are preserved. Otherwise, everything except the final output files will be deleted. 415 406 request 416 407 This produces an extra *csv* file ``mars_requests.csv`` where the content of each mars request of the job is stored. Useful for debugging and documentation. 417 408 mailfail 418 At default the mail is send to the mail connected with the user account. Add additional email addresses if you want. But as soon as you enter a new mail, the default will be overwritten. If you would like to keep the mail from your user account, please add ``${USER}`` to the list ( comma seperated ) or mail addresses. 419 420 421 422 Hints for definition of some parameter combinations 423 --------------------------------------------------- 424 425 Field types and times 426 This combination is very important. It defines the temporal resolution and which field type is extracted per time step. 427 The time declaration for analysis (AN) fields uses the times of the specific analysis and (forecast time) steps have to be ``0``. The forecast field types (e.g. FC, CF, CV, PF) need to declare a combination of (forescast start) times and the (forecast) steps. Both of them together defines the actual time step. It is important to know the forecast starting times for the dataset to be retrieved, since they are different. 
In general it is enough to give information for the exact time steps, but it is also possible to have more time step combinations of ``TYPE``, ``TIME`` and ``STEP`` because the temporal (hourly) resolution with the ``DTIME`` parameter will select the correct combinations. 428 429 .. code-block:: bash 430 :caption: Example of a setting for the field types and temporal resolution. 409 As a default, e-mails are sent to the mail address connected with the user account. It is possible to overwrite this by specifying one or more e-mail addresses (comma-separated list). In order to include the e-mail associated with the user account, add ``${USER}`` to the list. 410 411 412 Hints for proper definition of certain parameter combinations 413 ------------------------------------------------------------- 414 415 Field type and time 416 This combination is very important. It defines the temporal resolution and which field type is extracted on each time step. 417 The time declaration for analysis (AN) fields uses the times of the specific analysis while the (forecast time) step has to be ``0``. 418 The forecast field types (e.g. FC, CF, CV, PF) need to declare a combination of (forecast start) time and the (forecast) step. Together they define the actual time. It is important to know the forecast starting times for the dataset to be retrieved, since they are different. In general, it is sufficient to give information for the exact time steps, but it is also possible to have more time step combinations of ``TYPE``, ``TIME`` and ``STEP`` because the temporal (hourly) resolution with the ``DTIME`` parameter will select the correct combinations. 419 # needs to be rephrased 420 421 .. code-block:: bash 422 :caption: Example of a setting for the field types and temporal resolution. It will retrieve 3-hourly fields, with analyses at 00 and 12 UTC and the corresponding forecasts in between.
431 423 432 424 DTIME 3 … … 437 429 438 430 Vertical velocity 439 The vertical velocity for ``FLEXPART`` is not directly available from MARS. Therefore it has to be calculated. There are a couple of different options. The following parameters are responsible for the selection. See :doc:`Documentation/vertco` for a detailed explanation. The ``ETADIFF``, ``OMEGA`` and ``OMEGADIFF`` versions are only recommended for debugging and testing reasons. Usually it is a decision between ``GAUSS`` and ``ETA``, where for ``GAUSS`` spectral fields of the horizontal wind fields and the divergence are to be retrieved and used with the continuity equation to calculate the vertical velocity. For ``ETA`` the latitude/longitude fields of horizontal wind fields and eta-coordinate are to be retrieved. It is recommended to use ``ETA`` where possible due to a reduced computation time. 440 441 .. code-block:: bash 442 :caption: Example setting for the vertical coordinate retrieval. 431 The vertical velocity for ``FLEXPART`` is not directly available from MARS and has to be calculated. 432 There are several options for this, and the following parameters are responsible for the selection. See :doc:`Documentation/vertco` for a detailed explanation. Using ``ETADIFF 1``, ``OMEGA 1`` and ``OMEGADIFF 1`` is recommended for debugging and testing only. 433 Usually, one has to decide between ``GAUSS 1`` and ``ETA 1``. ``GAUSS 1`` means that spectral fields of the horizontal wind fields and the divergence are retrieved and that the vertical velocity is calculated using the continuity equation. ``ETA 1`` means that the horizontal wind fields and etadot are retrieved on a regular lat-lon grid. It is recommended to use ``ETA 1`` where possible, as there is a substantial computational overhead for solving the continuity equation. 434 435 .. code-block:: bash 436 :caption: Example setting for the vertical coordinate retrieval (recommended if etadot fields are available).
443 437 444 438 GAUSS 0 … … 451 445 452 446 Grid resolution and domain 453 The grid and domain selection depends on each other. The grid can be defined in the format of normal degrees (e.g. ``1.``) or as in older versions by 1/1000. degrees (e.g. ``1000`` for ``1°``). 454 After selecting the grid, the domain has to be defined in a way that the length of the domain in longitude or latitude direction must be an exact multiple of the grid. 455 The horizontal resolution for spectral fields will be set by the parameter ``RESOL``. For information about how to select an appropriate value you can read the explanation of the MARS keyword `here <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and in `this table <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_. 456 457 .. code-block:: bash 458 :caption: Example setting for a northern hemisphere domain with a grid of ``0.25°``. 447 The grid and domain parameters depend on each other. ``grid`` refers to the grid resolution. It can be given as decimal values (e.g., ``1.`` meaning 1.0°), or, as in previous versions of flex_extract, as integer values referring to 1/1000 degrees (e.g., ``1000`` also means 1°). The code applies common sense to determine what format is to be assumed. 448 After selecting ``grid``, the ``domain`` has to be defined. The extension in longitude or latitude direction must be an integer multiple of ``grid``. 449 #PS shouldn't we explain how to define a domain?? 450 The horizontal resolution for spectral fields is set by the parameter ``RESOL``.
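The "common sense" interpretation of the two ``GRID`` notations mentioned above can be illustrated with a small sketch. The heuristic and the function name below are hypothetical, for illustration only; consult the flex_extract source for the rule actually applied.

```python
def normalise_grid(grid):
    """Interpret a GRID value either as degrees (e.g. '0.25', '1.')
    or, as in older flex_extract versions, as 1/1000 degrees
    (e.g. '1000' meaning 1 degree).

    Heuristic (illustrative only): values written with a decimal
    point are taken as degrees; bare integers too large to be a
    sensible resolution in degrees are divided by 1000.
    """
    text = str(grid).strip()
    value = float(text)
    if '.' in text:
        return value          # decimal notation -> already degrees
    if value >= 100:          # e.g. 1000 -> 1.0, 250 -> 0.25
        return value / 1000.0
    return value              # small bare integer, taken as degrees

# examples of both notations
normalise_grid('0.25')   # -> 0.25
normalise_grid('1000')   # -> 1.0
```

A check that the domain extent is an integer multiple of the resulting grid spacing could then be built on top of this value.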
For information about how to select an appropriate value, please read the explanation of the MARS keyword RESOL as found `in this entry of the ECMWF on-line documentation <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and `this table (also ECMWF documentation) <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_. 451 452 .. code-block:: bash 453 :caption: Example setting for a domain covering the northern hemisphere with a grid resolution of ``0.25°``. 459 454 460 455 GRID 0.25 … … 468 463 469 464 Flux data 470 The flux fields are accumulated forecast fields all the time. Since some re-analysis dataset nowadays have complete set of analysis fields in their temporal resolution it was important to define a new parameter set to define the flux fields since the information could not be taken from ``TYPE``, ``TIME`` and ``STEP`` any longer. Select a forecast field type ``ACCTYPE``, the forecast starting time ``ACCTIME`` and the maximum forecast step ``ACCMAXSTEP``. The ``DTIME`` parameter defines the temporal resolution for the whole period.465 Flux fields are always forecast fields and contain values of the fluxes accumulated since the start of the respective forecast. As certain re-analysis datasets cover all time steps with analysis fields, it was necessary to define a new parameter set for the definition of the flux fields. The following parameters are used specifically for flux fields, if provided. ``ACCTYPE`` is the field type (must be a type of forecast), ``ACCTIME`` the forecast starting time, and ``ACCMAXSTEP`` the maximum forecast step; ``DTIME`` sets the temporal resolution. ``ACCTYPE`` is assumed to be the same during the whole period given by ``ACCTIME`` and ``ACCMAXSTEP``. 471 466 472 467 .. code-block:: bash 473 468 :caption: Example setting for the definition of flux fields. 469 #PS for which application would this be typical?
474 470 475 471 DTIME 3 … … 478 474 ACCMAXSTEP 36 479 475 480 481 476 482 477 .. toctree::