Changeset f20342a in flex_extract.git for For_developers/Sphinx/source/Documentation
- Timestamp:
- May 27, 2020, 8:01:54 PM
- Branches:
- master, ctbto, dev
- Children:
- 550435b
- Parents:
- a14839a
- Location:
- For_developers/Sphinx/source/Documentation
- Files:
- 21 edited
Legend:
- Unmodified
- Added
- Removed
For_developers/Sphinx/source/Documentation/Api/api_fortran.rst
rba99230 rf20342a

 **************************************
-Fortran's Auto Generated Documentation
+Auto-generated documentation for the Fortran programme
 **************************************

 .. contents::
    :local:

-Link to other documentation!
+`Fortran API <Fortran/index.html>`_

-.. f:autoprogram:: preconvert

 .. toctree::
    :hidden:
    :maxdepth: 2
For_developers/Sphinx/source/Documentation/Api/api_python.rst
rba99230 rf20342a

 *************************************
-Python's Auto Generated Documentation
+Auto-generated documentation for the Python scripts
 *************************************
For_developers/Sphinx/source/Documentation/Input/changes.rst
rba99230 rf20342a

 - comments available with ``#``
 - only parameters which are needed to override the default values are necessary
-- number of type/step/time elements do not have to be 24 any more. Just select the interval you need.
+- number of type/step/time elements does not have to be 24 anymore. Just provide what you need.
-- the ``dtime`` parameter needs to be consistent with ``type/step/time``. For example ``dtime`` can be coarser as ``time`` intervals are available, but not finer.
+- the ``dtime`` parameter needs to be consistent with ``type/step/time``; for example, ``dtime`` can be coarser than the ``time`` intervals available, but not finer.
For_developers/Sphinx/source/Documentation/Input/compilejob.rst
rb1674ed rf20342a

 ********************************************
-The Compilation Jobscript ``compilejob.ksh``
+The compilation job script ``compilejob.ksh``
 ********************************************

 The compile job is a Korn-shell script which will be created during the installation process for the application modes **remote** and **gateway** from a template called ``compilejob.template`` in the template directory.

-``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
+``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
 the Korn-shell script from the template files by substituting the individual parameters.
 These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.

-The job script has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
+The job script has a number of settings for the batch system which are fixed, and it differentiates between the *ecgate* and the *cca/ccb*
 server system to load the necessary modules for the environment when submitted to the batch queue.

 ------------------------------------

-#. It sets necessary batch system parameters
+#. It sets the necessary batch-system parameters
 #. It prepares the job environment at the ECMWF servers by loading the necessary library modules
-#. It sets some environment variabels for the single session
+#. It sets some environment variables for the single session
 #. It creates the ``flex_extract`` root directory in the ``$HOME`` path of the user
-#. It untars the tar-ball into the root directory.
+#. It untars the tarball into the root directory.
-#. It compiles the Fortran programs's ``Makefile``.
+#. It compiles the Fortran program using ``Makefile``.
-#. At the end it checks if the script returned an error or not and send the log file via email to the user.
+#. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.
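The "doubled ``$`` sign" convention mentioned above (``$$`` survives substitution as a literal ``$`` for the shell, while single-``$`` placeholders are filled in) can be illustrated with Python's stdlib ``string.Template``, which uses the same escaping rule. This is only an illustrative stand-in, not the genshi call flex_extract actually makes, and the version number is invented:

```python
from string import Template

# $$HOME stays a shell variable reference in the generated Korn-shell script,
# while $version is substituted when the script is generated.
template = Template("cd $$HOME/flex_extract_v$version/Source/Fortran && make")

script_line = template.substitute(version="7.1")
print(script_line)
# cd $HOME/flex_extract_v7.1/Source/Fortran && make
```

The same mechanism explains why template authors must double every ``$`` that is meant for the batch system rather than for the substitution step.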
For_developers/Sphinx/source/Documentation/Input/control.rst
rb1674ed rf20342a

 This file is an input file for :literal:`flex_extract's` main script :literal:`submit.py`.
-It contains the controlling parameters :literal:`flex_extract` needs to decide on dataset specifications,
-handling of the retrieved data and general bahaviour. The naming convention is usually (but not necessary):
+It contains the controlling parameters which :literal:`flex_extract` needs to decide on data set specifications,
+handling of the data retrieved, and general behaviour. The naming convention is usually (but not necessarily):

 :literal:`CONTROL_<Dataset>[.optionalIndications]`

-The tested datasets are the operational dataset and the re-analysis datasets CERA-20C, ERA5 and ERA-Interim.
-The optional extra indications for the re-analysis datasets mark the files for *public users*
-and *global* domain. For the operational data sets (*OD*) the file names contain also information of
-the stream, the field type for forecasts, the method for extracting the vertical coordinate and other things like time or horizontal resolution.
+There are a number of data sets for which the procedures have been tested: the operational data and the re-analysis data sets CERA-20C, ERA5, and ERA-Interim.
+The optional indications for the re-analysis data sets mark the files for *public users*
+and *global* domain. For the operational data sets (*OD*), the file names also contain information on
+the stream, the field type for forecasts, the method for extracting the vertical wind, and other information such as temporal or horizontal resolution.

 The first string of each line is the parameter name, the following string(s) (separated by spaces) is (are) the parameter values.
-The parameters can be sorted in any order with one parameter per line.
+The parameters can be listed in any order, with one parameter per line.
 Comments are started with a '#' - sign. Some of these parameters can be overruled by the command line
 parameters given to the :literal:`submit.py` script.
-All parameters have default values. Only those parameters which have to be changed
-must be listed in the :literal:`CONTROL` files.
+All parameters have default values; only those parameters which deviate from the default
+have to be listed in the :literal:`CONTROL` files.

 A number of example files can be found in the directory :literal:`flex_extract_vX.X/Run/Control/`.
-They can be used as a template for adaptations and understand what's possible to
-retrieve from ECMWF's archive.
-For each main dataset there is an example and additionally some variances in resolution, type of field or type of retrieving the vertical coordinate.
+They can be used as templates for adaptation, and to understand what can be
+retrieved from ECMWF's archives.
+There is an example for each main data set and, in addition, some variants with respect to resolution, type of field, or way of retrieving the vertical wind.

-The file :literal:`CONTROL.documentation` documents the available parameters
-in grouped sections with their default values. In :doc:`control_params` you can find a more
-detailed description with additional hints, possible values and some useful information about
-the setting of these parameters.
+The file :literal:`CONTROL.documentation` documents the available parameters
+in grouped sections together with their default values.
+In :doc:`control_params`, you can find a more
+detailed description with additional hints, possible values, and further information about
+the setting of these parameters.
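The CONTROL file format described above (one parameter per line, name first, space-separated values, ``#`` starting a comment) can be sketched with a small hypothetical parser. The parameter names in the sample are illustrative only, not the authoritative list from ``CONTROL.documentation``:

```python
def parse_control(text):
    """Parse CONTROL-style lines: NAME VALUE [VALUE ...]; '#' starts a comment."""
    params = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()  # drop comments and surrounding whitespace
        if not line:
            continue
        name, *values = line.split()
        # a single value is kept as a scalar, several values as a list
        params[name] = values[0] if len(values) == 1 else values
    return params

example = """
# sample CONTROL fragment (illustrative values)
DTIME 3
TYPE AN FC FC FC
"""
print(parse_control(example))
# {'DTIME': '3', 'TYPE': ['AN', 'FC', 'FC', 'FC']}
```

Because unknown names simply become dictionary keys, a parser like this naturally supports the "only list what deviates from the default" convention: defaults are merged in afterwards.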
For_developers/Sphinx/source/Documentation/Input/control_params.rst
rb1674ed rf20342a

-.. exceltable:: User parameter in CONTROL file
+.. exceltable:: User parameters in CONTROL file
-.. exceltable:: General parameter in CONTROL file
+.. exceltable:: General parameters in CONTROL file
-.. exceltable:: Time parameter in CONTROL file
+.. exceltable:: Time parameters in CONTROL file
-.. exceltable:: Data parameter in CONTROL file
+.. exceltable:: Data parameters in CONTROL file
-.. exceltable:: Data field parameter in CONTROL file
+.. exceltable:: Data field parameters in CONTROL file
-.. exceltable:: Flux data parameter in CONTROL file
+.. exceltable:: Flux data parameters in CONTROL file
-.. exceltable:: Domain parameter in CONTROL file
+.. exceltable:: Domain parameters in CONTROL file
-.. exceltable:: Additional data parameter in CONTROL file
+.. exceltable:: Additional data parameters in CONTROL file
For_developers/Sphinx/source/Documentation/Input/ecmwf_env.rst
rb1674ed rf20342a

 ****************************************
-ECMWF User Credential file ``ECMWF_ENV``
+ECMWF user credential file ``ECMWF_ENV``
 ****************************************

-The following shows an example of the content of an ``ECMWF_ENV`` file:
+An example of the content of an ``ECMWF_ENV`` file is shown below:
For_developers/Sphinx/source/Documentation/Input/examples.rst
rb1674ed rf20342a

-``Flex_extract`` has a couple of example ``CONTROL`` files for a number of different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.
+``Flex_extract`` comes with a number of example ``CONTROL`` files for different data set constellations in the directory path ``flex_extract_vX.X/Run/Control``.

-Here is a list of the example files and a description of the data set:
+Here is a list of the example files:

 CONTROL.documentation
-    This file is not intended to be used with ``flex_extract``. It has a list of all possible parameters and their default values for a quick overview.
+    This file is not intended to be used with ``flex_extract``. It just contains a list of all possible parameters and their default values for a quick overview.

     CONTROL_OD.OPER.FC.gauss.highres
     CONTROL_OD.OPER.FC.operational
-    CONTROL_OD.OPER.FC.twiceaday.1hourly
-    CONTROL_OD.OPER.FC.twiceaday.3hourly
+    CONTROL_OD.OPER.FC.twicedaily.1hourly
+    CONTROL_OD.OPER.FC.twicedaily.3hourly

 #PS some information to be added.

+.. toctree::
+   :hidden:
For_developers/Sphinx/source/Documentation/Input/fortran_makefile.rst
rb1674ed rf20342a

 **************************************
-The Fortran Makefile - ``calc_etadot``
+The Fortran makefile for ``calc_etadot``
 **************************************

-``Flex_extract``'s Fortran program will be compiled during
-the installation process to get the executable named ``calc_etadot``.
+The Fortran program ``calc_etadot`` will be compiled during
+the installation process to produce the executable called ``calc_etadot``.

-``Flex_extract`` has a couple of ``makefiles`` prepared which can be found in the directory
-``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted with the current version number.
-A list of these ``makefiles`` are shown below:
+``Flex_extract`` includes several ``makefiles`` which can be found in the directory
+``flex_extract_vX.X/Source/Fortran``, where ``vX.X`` should be substituted by the current flex_extract version number.
+A list of these ``makefiles`` is shown below:

 | Files to be used as they are!

-| **makefile_ecgate**
-| For the use on ECMWF's server **ecgate**.
-| **makefile_cray**
-| For the use on ECMWF's server **cca/ccb**.
+| **makefile_ecgate**: For use on ECMWF's server **ecgate**.
+| **makefile_cray**: For use on ECMWF's server **cca/ccb**.

 | **Local mode**
-| It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB**
+| It is necessary to adapt **ECCODES_INCLUDE_DIR** and **ECCODES_LIB** if they don't correspond to the standard paths pre-set in the makefiles.

-| **makefile_fast**
-| For the use with gfortran compiler and optimization mode.
-| **makefile_debug**
-| For the use with gfortran compiler in debugging mode.
+| **makefile_fast**: For use with the gfortran compiler and optimisation mode.
+| **makefile_debug**: For use with the gfortran compiler and debugging mode. Primarily for developers.

+If you want to use another compiler than gfortran locally, you can still take ``makefile_fast``,
+and adapt everything that is compiler-specific in this file.

-For instructions on how to adapt the ``makefiles`` for the local application mode
+For instructions on how to adapt the ``makefile`` (local application mode only),
 please see :ref:`ref-install-local`.
For_developers/Sphinx/source/Documentation/Input/jobscript.rst
rb1674ed rf20342a

 *************************
-The Jobscript ``job.ksh``
+The job script ``job.ksh``
 *************************

 The job script is a Korn-shell script which will be created at runtime for each ``flex_extract`` execution in the application modes **remote** and **gateway**.

-It is based on the ``job.temp`` template file which is stored in the ``Templates`` directory.
-This template is by itself generated in the installation process from a ``job.template`` template file.
+It is based on the ``job.temp`` template file stored in the ``Templates`` directory.
+This template is generated in the installation process from a ``job.template`` template file.

-``Flex_extract`` uses the python package `genshi <https://genshi.edgewall.org/>`_ to generate
+``Flex_extract`` uses the Python package `genshi <https://genshi.edgewall.org/>`_ to generate
 the Korn-shell script from the template files by substituting the individual parameters.
-These individual parameters are marked by a doubled ``$`` sign in ``job.temp``.
+These individual parameters are marked by ``$$`` in ``job.temp``.

-The job script has a number of settings for the batch system which are fixed and differentiates between the *ecgate* and the *cca/ccb*
+The job script has a number of settings for the batch system which are fixed, and differentiates between the *ecgate* and the *cca/ccb*
 server system to load the necessary modules for the environment when submitted to the batch queue.

 What does the job script do?
 ----------------------------

-#. It sets necessary batch system parameters
-#. It prepares the job environment at the ECMWF servers by loading the necessary library modules
-#. It sets some environment variabels for the single session
-#. It creates the directory structure in the users ``$SCRATCH`` file system
-#. It creates a CONTROL file on the ECMWF servers whith the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are given to the jobscript with its default or the user defined values. It also sets the ``CONTROL`` as an environment variable.
-#. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new pathes for input and output directory and the recently generated ``CONTROL`` file.
-#. At the end it checks if the script returned an error or not and send the log file via email to the user.
+#. It sets necessary batch system parameters.
+#. It prepares the job environment at the ECMWF servers by loading the necessary library modules.
+#. It sets some environment variables for the single session.
+#. It creates the directory structure in the user's ``$SCRATCH`` file system.
+#. It creates a CONTROL file on the ECMWF servers with the parameters set before creating the ``jobscript.ksh``. ``Flex_extract`` has a set of parameters which are passed to the job script with their default or the user-defined values. It also sets ``CONTROL`` as an environment variable.
+#. ``Flex_extract`` is started from within the ``work`` directory of the new directory structure by calling the ``submit.py`` script. It sets new paths for the input and output directories and the recently generated ``CONTROL`` file.
+#. At the end, it checks whether the script has returned an error or not, and emails the log file to the user.
For_developers/Sphinx/source/Documentation/Input/run.rst
rb1674ed rf20342a

 **********************************
-The executable Script - ``run.sh``
+The executable script - ``run.sh``
 **********************************

-The execution of ``flex_extract`` is done by the ``run.sh`` Shell script, which is a wrapping script for the top-level Python script ``submit.py``.
+The execution of ``flex_extract`` is done by the ``run.sh`` shell script, which is a wrapper script for the top-level Python script ``submit.py``.
 The Python script constitutes the entry point to ECMWF data retrievals with ``flex_extract`` and controls the program flow.

-``submit.py`` has two ( three) sources for input parameters with information about program flow and ECMWF data selection, the so-called ``CONTROL`` file,
-the command line parameters and the so-called ``ECMWF_ENV`` file. Whereby, the command line parameters will override the ``CONTROL`` file parameters.
+``submit.py`` has two (or three) sources for input parameters with information about program flow and ECMWF data selection: the so-called ``CONTROL`` file,
+the command line parameters, and the so-called ``ECMWF_ENV`` file. Command line parameters will override parameters specified in the ``CONTROL`` file.

-Based on these input information ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a Web API on a local maschine or submit a jobscript to ECMWF servers and retrieve the data there with sending the files to the local system eventually.
+Based on this input information, ``flex_extract`` applies one of the application modes to either retrieve the ECMWF data via a web API on a local machine, or submit a job script to an ECMWF server, retrieve the data there, and in the end send the files to the local system.

-Submission Parameter
+Submission parameters
 --------------------

-.. exceltable:: Parameter for Submission
+.. exceltable:: Parameters for submission
    :file: ../../_files/SubmitParameters.xls
    :header: 1

-It is also possible to start ``flex_extract`` directly from command line by using the ``submit.py`` script instead of the wrapping Shell script ``run.sh``. This top-level script is located in
-``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
-command line parameter.
+It is also possible to start ``flex_extract`` directly from the command line by using the ``submit.py`` script instead of the wrapper shell script ``run.sh``. This top-level script is located in
+``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter,
+we see again all possible command line parameters.
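The precedence described above (command-line parameters override CONTROL-file parameters) can be sketched as a simple merge in which only options actually given on the command line replace CONTROL values. The option names below are hypothetical, not the real ``submit.py`` interface:

```python
import argparse

def merge_params(control_params, cli_args):
    """Command-line values (when given) override CONTROL-file values."""
    merged = dict(control_params)
    for name, value in vars(cli_args).items():
        if value is not None:  # None means: option not given on the command line
            merged[name.upper()] = value
    return merged

parser = argparse.ArgumentParser()
parser.add_argument("--start_date", default=None)  # hypothetical option names
parser.add_argument("--dtime", default=None)

control = {"START_DATE": "20200101", "DTIME": "3"}
args = parser.parse_args(["--dtime", "1"])
print(merge_params(control, args))
# {'START_DATE': '20200101', 'DTIME': '1'}
```

The key design point is that command-line defaults are ``None`` rather than real values, so an untouched option can be distinguished from one the user set explicitly.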
For_developers/Sphinx/source/Documentation/Input/setup.rst
rb1674ed rf20342a

 **************************************
-The Installation Script - ``setup.sh``
+The installation script - ``setup.sh``
 **************************************

-The installation of ``flex_extract`` is done by the Shell script ``setup.sh`` which is located in the root directory of ``flex_extract``.
-It calls the top-level Python script ``install.py`` which does all necessary operations to prepare the selected application environment. This includes:
+The installation of ``flex_extract`` is done by the shell script ``setup.sh`` located in the root directory of ``flex_extract``.
+It calls the top-level Python script ``install.py`` which does all the necessary operations to prepare the application environment selected. This includes:

-- preparing the file ``ECMWF_ENV`` with the user credentials for member state access to ECMWF servers (in **remote** and **gateway** mode)
+- preparing the file ``ECMWF_ENV`` with the user credentials for member-state access to ECMWF servers (in **remote** and **gateway** mode)
 - preparation of a compilation Korn-shell script (in **remote** and **gateway** mode)
 - preparation of a job template with user credentials (in **remote** and **gateway** mode)
-- create a tar-ball of all necessary files
-- copying tar-ball to target location (depending on application mode and installation path)
-- submit compilation script to batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar tar-ball at target location (**local mode**)
-- compilation of the FORTRAN90 program ``calc_etadot``
+- create a tarball of all necessary files
+- copying the tarball to the target location (depending on application mode and installation path)
+- submit the compilation script to the batch queue at ECMWF servers (in **remote** and **gateway** mode) or just untar the tarball at the target location (**local mode**)
+- compilation of the Fortran program ``calc_etadot``

-The Python installation script ``install.py`` has a couple of command line arguments which are defined in ``setup.sh`` in the section labelled with "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters for his personal use. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee necessary parameters were set.
+The Python installation script ``install.py`` has several command line arguments defined in ``setup.sh``, in the section labelled "*AVAILABLE COMMANDLINE ARGUMENTS TO SET*". The user has to adapt these parameters according to his/her personal needs. The parameters are listed and described in :ref:`ref-instparams`. The script also does some checks to guarantee that the necessary parameters were set.

 After the installation process, some tests can be conducted. They are described in section :ref:`ref-testinstallfe`.

-The following diagram sketches the involved files and scripts in the installation process:
+The following diagram sketches the files and scripts involved in the installation process:

 .. blockdiag::
-   :caption: Diagram of data flow during the installation process. The trapezoids are input files with the light blue area being the template files. The edge-rounded, orange boxes are the executable files which start the installation process and reads the input files. The rectangular, green boxes are the output files. The light green files are files which are only needed in the remota and gateway mode.
+   :caption: Diagram of data flow during the installation process. Trapezoids are input files, with the light blue area being the template files. Round-edge orange boxes are executable files which start the installation process and read the input files. Rectangular green boxes are output files. Light green files are needed only in the remote and gateway mode.

-Installation Parameter
-----------------------
+Installation parameters
+-----------------------

 .. exceltable:: Parameter for Installation

-It is also possible to start the installation process of ``flex_extract`` directly from command line by using the ``install.py`` script instead of the wrapping Shell script ``setup.sh``. This top-level script is located in
-``flex_extract_vX.X/Source/Python`` and is executable. With the ``help`` parameter we see again all possible
-command line parameter.
+It is also possible to start the installation process of ``flex_extract`` directly from the command line by using the ``install.py`` script instead of the wrapper shell script ``setup.sh``. This top-level script is located in
+``flex_extract_vX.X/Source/Python`` and is executable. With the ``--help`` parameter,
+we see again all possible command line parameters.
For_developers/Sphinx/source/Documentation/Input/templates.rst
rb1674ed rf20342a

-In ``flex_extract`` we use the Python package `genshi <https://genshi.edgewall.org/>`_ to create specific files from templates. It is the most efficient way to be able to quickly adapt e.g. the job scripts send to the ECMWF batch queue system or the namelist file für the Fortran program without the need to change the program code.
+In ``flex_extract``, the Python package `genshi <https://genshi.edgewall.org/>`_ is used to create specific files from templates. It is the most efficient way to be able to quickly adapt, e.g., the job scripts sent to the ECMWF batch queue system, or the namelist file for the Fortran program, without the need to change the program code.

 .. note::
-    Usually it is not recommended to change anything in these files without being able to understand the effects.
+    Do not change anything in these files unless you understand the effects!

-Each template file has its content framework and keeps so-called placeholder variables in the positions where the values needs to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In case of the Kornshell job scripts, where (environment) variables are used the ``$`` sign needs to be doubled to `escape` and keep a single ``$`` sign as it is.
+Each template file has its content framework and keeps so-called placeholder variables in the positions where the values need to be substituted at run time. These placeholders are marked by a leading ``$`` sign. In the case of the Korn-shell job scripts, where (environment) variables are used, the ``$`` sign needs to be doubled for `escaping`.

-The following templates are used and can be found in directory ``flex_extract_vX.X/Templates``:
+The following templates are used; they can be found in the directory ``flex_extract_vX.X/Templates``:

 convert.nl
 ----------

-This is the template for a Fortran namelist file called ``fort.4`` which will be read by ``calc_etadot``.
+This is the template for a Fortran namelist file called ``fort.4`` read by ``calc_etadot``.
 It contains all the parameters ``calc_etadot`` needs.

 This template is used to create the job script file called ``compilejob.ksh`` during the installation process for the application modes **remote** and **gateway**.

-At the beginning some directives for the batch system are set.
-On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
-For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
+At the beginning, some directives for the batch system are set.
+On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
+For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.

-The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
+The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.

-Afterwards the installation steps as such are done. Including the generation of the root directory, putting files in place, compiling the Fortran program and sending a log file via email.
+Afterwards, the installation steps as such are done. They include the generation of the root directory, putting files in place, compiling the Fortran program, and sending a log file by email.

 This template is used to create the actual job script file called ``job.ksh`` for the execution of ``flex_extract`` in the application modes **remote** and **gateway**.

-At the beginning some directives for the batch system are set.
-On the **ecgate** server the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
-For the high performance computers **cca** and **ccb** the ``PBS`` comments are necessary and can be view at `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.
+At the beginning, some directives for the batch system are set.
+On the **ecgate** server, the ``SBATCH`` comments are the directives for the SLURM workload manager. A description of the single lines can be found at `SLURM directives <https://confluence.ecmwf.int/display/UDOC/Writing+SLURM+jobs>`_.
+For the high-performance computers **cca** and **ccb**, the ``PBS`` comments are necessary; for details see `PBS directives <https://confluence.ecmwf.int/display/UDOC/Batch+environment%3A++PBS>`_.

-The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending in the ``HOST``. It should not be changed without testing.
+The software environment requirements mentioned in :ref:`ref-requirements` are prepared by loading the corresponding modules depending on the ``HOST``. It should not be changed without testing.

-Afterwards the run directory and the ``CONTROL`` file are created and ``flex_extract`` is executed. In the end a log file is send via email.
+Afterwards, the run directory and the ``CONTROL`` file are created, and ``flex_extract`` is executed. In the end, a log file is sent by email.

-This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. A couple of parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.
+This template is used to create the template for the execution job script ``job.temp`` for ``flex_extract`` in the installation process. A description of the file can be found under ``job.temp``. Several parameters are set in this process, such as the user credentials and the ``flex_extract`` version number.
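The ``convert.nl`` template above is rendered into the namelist file ``fort.4`` by substituting placeholder values. A minimal sketch of that idea, using Python's stdlib ``string.Template`` instead of genshi, with invented namelist group and parameter names (the real keys are defined by ``calc_etadot`` and are not reproduced here):

```python
from string import Template

# Hypothetical namelist template: the $-placeholders are filled at run time.
NAMELIST_TEMPLATE = Template("""&NAMGEN
  maxl = $maxl,
  maxb = $maxb,
/
""")

def render_namelist(params):
    """Substitute the placeholder values into the namelist text."""
    return NAMELIST_TEMPLATE.substitute(params)

print(render_namelist({"maxl": 361, "maxb": 181}))
```

Since a Fortran namelist contains no shell variables, no ``$$`` escaping is needed here, unlike in the Korn-shell job script templates.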
For_developers/Sphinx/source/Documentation/Overview/app_modes.rst
rb1674ed rf20342a 1 1 ***************** 2 Application Modes2 Application modes 3 3 ***************** 4 4 … 13 13 .. _ref-app-modes: 14 14 15 Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has 4 different :underline:`user application modes`:15 Arising from the two user groups described in :doc:`../../Ecmwf/access`, ``flex_extract`` has four different :underline:`user application modes`: 16 16 17 17 .. _ref-remote-desc: 18 18 19 19 1. Remote (member) 20 In the **Remote mode** the user works directly on ECMWF Linux member state server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the additional third-party libraries mentioned in :ref:`ref-requirements` as ECMWF provides everything with environment modules. The module selection will be done automatically in ``flex_extract``.20 In the **Remote mode** the user works directly on an ECMWF member-state Linux server, such as ``ecgate`` or ``cca/ccb``. The software will be installed in the ``$HOME`` directory. The user does not need to install any of the third-party libraries mentioned in :ref:`ref-requirements`, as ECMWF provides everything with environment modules. The module selection will be done automatically by ``flex_extract``. 21 21 22 22 .. _ref-gateway-desc: 23 23 24 24 2. Gateway (member) 25 The **Gateway mode** can be used if a local member state gateway server is in place. Then the job scripts can be submitted to the ECMWF Linux member state server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed at the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and some extra setup is done in the local ``flex_extract`` directory at the local gateway server. For more information about establishing a gateway server please see `ECMWF's instructions on gateway server`_.
For the **Gateway mode** the necessary environment has to be established which is described in :ref:`ref-prep-gateway`.25 The **Gateway mode** can be used if a local member-state gateway server is in place. Then, the job scripts can be submitted to the ECMWF member-state Linux server via the ECMWF web access tool ``ecaccess``. The installation script of ``flex_extract`` must be executed on the local gateway server such that the software will be installed in the ``$HOME`` directory at the ECMWF server and that some extra setup is done in the ``flex_extract`` directory on the local gateway server. For more information about establishing a gateway server, please refer to `ECMWF's instructions on gateway server`_. For the **Gateway mode**, the necessary environment has to be established, which is described in :ref:`ref-prep-gateway`. 26 26 27 27 .. _ref-local-desc: 28 28 29 29 3. Local member 30 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server. The complete installation process is described in :ref:`ref-local-mode`.30 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. 31 31 32 32 4. Local public 33 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario a software environment similar to that at ECMWF is required. Additionally, Web API's have to be installed to access ECMWF server.
The complete installation process is described in :ref:`ref-local-mode`. In this case a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve.33 Scripts are installed and executed on a local machine, either in the current ``flex_extract`` directory or in a path given to the installation script. Under this scenario, a software environment similar to that at ECMWF is required. Additionally, web APIs have to be installed to access ECMWF servers. The complete installation process is described in :ref:`ref-local-mode`. In this case, a direct registration at ECMWF is necessary and the user has to accept a specific license agreement for each dataset he/she intends to retrieve. 34 34 35 35 -
For_developers/Sphinx/source/Documentation/Overview/prog_flow.rst
rb1674ed rf20342a 1 1 ************ 2 Program Flow2 Program flow 3 3 ************ 4 4 … 16 16 .. figure:: ../../_files/submit.png 17 17 18 Overview of the call of python's ``submit.py`` script and raw sequence of working steps done in ``flex_extract``.18 Overview of the call of the ``submit.py`` Python script and raw sequence of work steps in ``flex_extract``. 19 19 20 20 21 The ``submit.py`` Python program is called by the Shell script ``run.sh`` or ``run_local.sh`` and accomplish the following steps:21 The ``submit.py`` Python script is called by the shell script ``run.sh`` or ``run_local.sh`` and accomplishes the following steps: 22 22 23 1. Setup the control data: 24 It gets all command-line and ``CONTROL`` file parameters as well as optionally the ECMWF user credentials. Depending the :doc:`app_modes`, it might also prepare a job script which is then send to the ECMWF queue. 25 2. Retrieves data from MARS: 26 It creates and sends MARS-requests either on the local machine or on ECMWF server, that receives the data and stores them in a specific format in GRIB files. If the parameter ``REQUEST`` was set ``1`` the data are not received but a file ``mars_requests.csv`` is created with a list of MARS requests and their settings. If it is set to ``2`` the file is created in addition to retrieving the data. The requests are created in an optimised way by splitting in time, jobs and parameters. 27 3. Post-process data to create final ``FLEXPART`` input files: 28 After all data is retrieved, the disaggregation of flux fields (`see here <../disagg.html>`_ ) is done as well as the calculation of vertical velocity (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``. Eventually, the GRIB fields are merged together such that a single grib file per time step is available with all fields for ``FLEXPART``.
Since model level fields are typically in *GRIB2* format whereas surface level fields are still in *GRIB1* format, they can be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading pure *GRIB2* files since some parameter IDs change in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files can be communicated to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally the progress of the script can be followed with the usual Linux tools.23 1. Setup of control data: 24 Command-line and ``CONTROL``-file parameters are read, as well as (optionally) the ECMWF user credentials. Depending on the :doc:`app_modes`, a job script might be prepared which is then sent to the ECMWF queue. 25 2. Retrieval of data from MARS: 26 MARS requests are created either on the local machine or on the ECMWF server and then submitted; these retrieve the data and store them in GRIB files. If the parameter ``REQUEST`` was set to ``1``, the data are not retrieved and instead a file ``mars_requests.csv`` is created, which contains a list of the MARS requests and their settings. If ``REQUEST`` is set to ``2``, the csv file is created in addition to retrieving the data. The requests are created in an optimised way by splitting with respect to time, jobs and parameters. 27 3. Post-processing of data to create final ``FLEXPART`` input files: 28 After all data have been retrieved, flux fields are disaggregated (`see here <../disagg.html>`_ ) and the vertical velocity is calculated (`see here <../vertco.html>`_) by the Fortran program ``calc_etadot``.
Finally, the GRIB fields are merged into a single grib file per time step containing all the fields for ``FLEXPART``. Since model-level fields are typically in *GRIB2* format, whereas surface-level fields are still in *GRIB1* format, they will be converted into GRIB2 if parameter ``FORMAT`` is set to *GRIB2*. Please note, however, that older versions of FLEXPART may have difficulties reading these *GRIB2* files since some parameter IDs have been changed in *GRIB2*. If the retrieval is executed remotely at ECMWF, the resulting files will be sent to the local gateway server via the ``ECtrans`` utility if the parameter ``ECTRANS`` is set to ``1`` and the parameters ``GATEWAY``, ``DESTINATION`` have been set properly during installation. The status of the transfer can be checked with the command ``ecaccess-ectrans-list`` (on the local gateway server). If the script is executed locally, the progress of the script can be followed with the usual Linux tools. 29 29 30 30 … 33 33 ======================================== 34 34 35 More details on how different the program flow is for the different :doc:`app_modes` is sketched in the following diagrams:35 The following diagrams show how different the program flow is for the different :doc:`app_modes`: 36 36 37 37 +-------------------------------------------------+------------------------------------------------+ -
For_developers/Sphinx/source/Documentation/api.rst
rba99230 rf20342a 1 1 **************************** 2 Auto Generated Documentation2 Auto-generated documentation 3 3 **************************** 4 4 -
For_developers/Sphinx/source/Documentation/disagg.rst
rd9abaac rf20342a 1 1 *************************** 2 Disaggregation of Flux Data2 Disaggregation of flux data 3 3 *************************** 4 4 5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational particles in time and space. This method requires point values in the discrete input fields. However, flux data (as listed in table :ref:`ref-table-fluxpar`) from the ECMWF represent cell averages or integrals and are accumulated over a specific time interval, depending on the dataset. Hence, to conserve the integral quantity with ``FLEXPART``'s linear interpolation a pre-processing scheme has to be applied. 5 ``FLEXPART`` interpolates meteorological input data linearly to the position of computational 6 particles in time and space. This method requires point values in the discrete input fields. 7 However, flux data (as listed in table :ref:`ref-table-fluxpar` below) from the ECMWF represent cell 8 averages or integrals and are accumulated over a specific time interval, depending on the data 9 set. Hence, to conserve the integral quantity with the linear interpolation used in ``FLEXPART``, 10 pre-processing has to be applied. 6 11 7 12 .. _ref-table-fluxpar: 8 13 9 .. csv-table:: flux fields14 .. csv-table:: Flux fields 10 15 :header: "Short Name", "Name", "Units", "Interpolation Type" 11 16 :align: center … … 20 25 21 26 22 The first step is to *de-accumulate* the fields in time so that each value represents an integral in x, y, t space. 23 Afterwards, a *disaggregation* scheme is applied which means to break down the integral value into point values. 24 In order to be able to carry out the disaggregation procedure proposed by Paul James, additional flux data is retrieved automatically for one day at the beginning and one day at the end of the period specified. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. 
Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not communicated to the final ``FLEXPART`` input files. 27 The first step is to *de-accumulate* the fields in time so that each value represents non-overlapping integrals in x-, y-, and t-space. 28 Afterwards, a *disaggregation* scheme is applied which means converting the integral values to corresponding point values to be used later for the interpolation. 29 The disaggregation procedure as proposed by Paul James (currently, the standard) requires additional flux data for one day at the beginning and one day at the end of the period specified. 30 They are retrieved automatically. Thus, data for flux computation will be requested for the period START_DATE-1 to END_DATE+1. Note that these (additional) dates are used only for interpolation within ``flex_extract`` and are not contained in the final ``FLEXPART`` input files. 25 31 26 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note ,that the first two and last two flux files do not contain any data.32 The flux disaggregation produces files named ``fluxYYYYMMDDHH``, where ``YYYYMMDDHH`` is the date format. Note that the first two and last two flux files do not contain any data. 27 33 28 34 .. note:: 29 35 30 Note also that for operational retrievals (``BASETIME`` set to 00 or 12) forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for ondemand scripts.36 Note also that for operational retrievals (``BASETIME`` set to 00 or 12), forecast fluxes are only available until ``BASETIME``, so that no polynomial interpolation is possible in the last two time intervals. This is the reason why setting ``BASETIME`` is not recommended for on-demand scripts.
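The de-accumulation step described above can be sketched in a few lines (the values are purely illustrative):

```python
# Hypothetical precipitation totals [mm], accumulated since the start of
# the forecast, valid at successive time steps.
accumulated = [0.0, 2.0, 5.0, 5.0, 9.0]

# De-accumulation: the difference between successive accumulated values
# gives the integral over each single time interval.
per_interval = [b - a for a, b in zip(accumulated, accumulated[1:])]
# per_interval -> [2.0, 3.0, 0.0, 4.0]
```

The resulting per-interval integrals are then the input for the disaggregation schemes discussed below.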
31 37 32 38 … 34 40 -------------------------------------------------- 36 In ``flex_extract`` up to version 5 the disaggregation was done with a Fortran program (FLXACC2). In version 6 this part was converted to Python. 42 In ``flex_extract`` up to version 5, the disaggregation was done with a Fortran program (FLXACC2). In version 6, this part was recoded in Python. 37 43 38 39 In the old versions (below 7.1) a relatively simple method processes the precipitation fields in a way that is consistent with the scheme applied in ``FLEXPART`` for all variables: linear interpolation between times where input fields are available. 40 At first the accumulated values are divided by the number of hours (i.e., 3 or 6). 44 In the old versions (below 7.1), a relatively simple method processes the precipitation fields in a way that is consistent with the linear interpolation between times with available input fields, as applied in ``FLEXPART`` for all variables. 45 This scheme (from Paul James) at first divides the accumulated values by the number of hours (i.e., 3 or 6). 41 46 The best option for disaggregation, which was realised, is conservation within the interval under consideration plus the two adjacent ones. 42 47 Unfortunately, this leads to undesired temporal smoothing of the precipitation time series – maxima are damped and minima are raised. … 53 58 :figclass: align-center 54 59 55 Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area.
Note that supporting points for the interpolation are shifted by a half-time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018).60 Fig. 1: Example of disaggregation scheme as implemented in older versions for an isolated precipitation event lasting one time interval (thick blue line). The amount of original precipitation after de-accumulation is given by the blue-shaded area. The green circles represent the discrete grid points after disaggregation and linearly interpolate in between them as indicated by the green line and the green-shaded area. Note that supporting points for the interpolation are shifted by half a time interval compared to the times when other meteorological fields are available (Hittmeir et al. 2018). 56 61 57 62 58 63 59 Disaggregation is done for 4 adjacent timespans (:math:`a_0, a_1, a_2, a_3`) which generates a new, disaggregated value which is output at the central point of the 4 adjacent timespans.64 Disaggregation is done for four adjacent timespans (:math:`a_0, a_1, a_2, a_3`), generating a new, disaggregated value which is output at the central point of the four adjacent timespans. 60 65 61 66 .. math:: … 69 74 70 75 71 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the 4 original timespans has a value below 0 it is set to 0 prior to the calculation.76 This new point :math:`p` is used for linear interpolation of the complete timeseries afterwards. If one of the four original timespans has a value below 0, it is set to 0 prior to the calculation. 72 77 73 78 .. math:: … 78 83 79 84 80 81 82 85 Disaggregation for precipitation in version 7.1 83 86 ----------------------------------------------- 84 87 85 Due to the problems with generating precipitation in originally dry (or lower) intervals and the temporal smoothing a new algorithm was developed.
The approach is based on a one dimensional piecewise linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements by preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional monotonicity filter helps to gain monotonicity.86 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivates the linear formulation.87 These requirements on the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e. the reconstructed values vanish throughout this whole time interval, too.88 Due to the problems mentioned above, a new algorithm was developed. The approach is based on a one-dimensional, piecewise-linear function with two additional supporting grid points within each grid cell, dividing the interval into three pieces. It fulfils the desired requirements of preserving the integral precipitation in each time interval, guaranteeing continuity at interval boundaries, and maintaining non-negativity. An additional filter improves monotonicity. 89 The more natural requirements of symmetry, reality, computational efficiency and easy implementation motivate the use of a linear formulation. 90 These requirements for the reconstruction algorithm imply that time intervals with no precipitation remain unchanged, i.e., the reconstructed values vanish throughout this whole time interval, too. 88 91 In the simplest scenario of an isolated precipitation event, where in the time interval before and after the data values are zero, the reconstruction algorithm therefore has to vanish at the boundaries of the interval, too. 89 92 The additional conditions of continuity and conservation of the precipitation amount then require us to introduce sub-grid points if we want to keep a linear interpolation (Fig. 2).
… … 142 145 143 146 144 In the case of the new disaggregation method for precipitation, the two new sub grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval and 1 or 2 for the two new sub grid points respectively. The filenames do not change.147 In the case of the new disaggregation method for precipitation, the two new sub-grid points are added in the ``flux`` output files. They are identified by the forecast step parameter ``step`` which is 0 for the original time interval, and 1 or 2, respectively, for the two new sub-grid points. The filenames do not change. 145 148 146 149 147 150 .. note:: 148 151 149 The new method for disaggregation was published in the Geoscientific Model Development Journalin 2018:152 The new method for disaggregation was published in the journal Geoscientific Model Development in 2018: 150 153 151 154 Hittmeir, S., Philipp, A., and Seibert, P.: A conservative reconstruction scheme for the interpolation of extensive quantities in the Lagrangian particle dispersion model FLEXPART, Geosci. Model Dev., 11, 2503-2523, https://doi.org/10.5194/gmd-11-2503-2018, 2018. 152 155 153 154 155 156 157 156 158 157 159 Disaggregation for the rest of the flux fields 158 159 Disaggregation for the other flux fields 160 160 ---------------------------------------------- 161 161 162 162 The accumulated values for the other variables are first divided by the number of hours and 163 then interpolated to the exact times Xusing a bicubic interpolation which conserves the integrals of the fluxes within each timespan.164 Disaggregation is done for 4 adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which generates a new, disaggregated value which is output at the central point of the 4adjacent timespans.163 then interpolated to the exact times using a bicubic interpolation which conserves the integrals of the fluxes within each timespan. 
164 Disaggregation is done for four adjacent timespans (:math:`p_a, p_b, p_c, p_d`) which produces a new, disaggregated value that is the output at the central point of the four adjacent timespans. 165 165 166 166 .. math:: -
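Such a four-point conservative scheme can be sketched as follows. The weights -1/12, 7/12, 7/12, -1/12 used here are a classic integral-conserving choice taken purely for illustration; the exact coefficients applied by ``flex_extract`` are those of the formula given in this section:

```python
def disaggregate_flux(p_a, p_b, p_c, p_d):
    """Disaggregated point value at the centre of four adjacent timespans.

    The weights (-1/12, 7/12, 7/12, -1/12) are an illustrative
    integral-conserving choice, not necessarily the exact coefficients
    used in flex_extract.
    """
    return (-p_a + 7.0 * p_b + 7.0 * p_c - p_d) / 12.0

# A constant flux is reproduced exactly at the central point:
disaggregate_flux(2.0, 2.0, 2.0, 2.0)  # -> 2.0
```

Since the weights sum to one, a constant flux series passes through unchanged, while local extrema are slightly sharpened compared to plain linear interpolation.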
For_developers/Sphinx/source/Documentation/input.rst
rb1674ed rf20342a 1 1 ******************** 2 Control & Input Data2 Control & input data 3 3 ******************** 4 4 5 Input Data5 Input data 6 6 - :doc:`Input/control` 7 ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialized by ``flex_extract`` with their default values and can be overwritten with definitions set in the so called :doc:`Input/control`.7 ``Flex_extract`` needs a number of controlling parameters to decide on the behaviour and the actual data set to be retrieved. They are initialised by ``flex_extract`` with certain default values which can be overwritten with definitions set in the so-called :doc:`Input/control`. 8 8 9 To be able to successfully retrieve data from the ECMWF Mars archive it is necessary to understand these parameters and set them to proper and consistent values. They are described in :doc:`Input/control_params` section.9 For a successful retrieval of data from the ECMWF MARS archive it is necessary to understand these parameters and to set them to proper and consistent values. They are described in the :doc:`Input/control_params` section. 10 10 11 We also have some :doc:`Input/examples` and description of :doc:`Input/changes` changes to previous versions and downward compatibilities.11 Furthermore, some :doc:`Input/examples` are provided, and in :doc:`Input/changes` changes to previous versions and downward compatibilities are described. 12 12 13 13 - :doc:`Input/ecmwf_env` 14 For ``flex_extract`` it is necessary to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.14 ``flex_extract`` needs to be able to reach ECMWF servers in the **remote mode** and the **gateway mode**. Therefore a :doc:`Input/ecmwf_env` is created during the installation process.
15 15 16 16 - :doc:`Input/templates` 17 A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt for example the jobscripts regarding its settings for the batch jobs. 18 19 17 A number of files which are created by ``flex_extract`` are taken from templates. This makes it easy to adapt, for example, the job scripts with regard to the settings for the batch jobs. 20 18 21 19 … 29 27 30 28 Controlling 31 The main tasks and behaviour of ``flex_extract`` are controlled by its Python scripts. There are two top-level scripts, one for installation called install_ and one for execution called submit_.29 The main tasks and the behaviour of ``flex_extract`` are controlled by the Python scripts. There are two top-level scripts, one for installation called install_, and one for execution called submit_. 32 They can interpret a number of commandline arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type:30 They interpret a number of command-line arguments which can be seen by typing ``--help`` after the script call. Go to the root directory of ``flex_extract`` to type: 33 31 34 32 .. code-block:: bash … 38 36 python3 Source/Python/submit.py --help 39 37 40 In this new version we provide also the wrapping Shell scripts setup_ and run_, which sets the command line parameters, do some checks and execute the corresponing Python scripts ``install.py`` and ``submit.py`` respectivley. 41 42 It might be faster and easier for beginners. See :doc:`../quick_start` for information on how to use them. 38 With version 7.1, we also provide wrapper shell scripts setup_ and run_ which set the command-line parameters, do some checks, and execute the corresponding Python scripts ``install.py`` and ``submit.py``, respectively. 39 It might be faster and easier for beginners if they are used. See :doc:`../quick_start` for information on how to use them.
43 40 44 Additionally, ``flex_extract`` creates the Korn Shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be send to the ECMWF serves in the **remote mode** and the **gateway mode** for starting batch jobs.41 ``flex_extract`` also creates the Korn shell scripts :doc:`Input/compilejob` and :doc:`Input/jobscript` which will be sent to the ECMWF servers in the **remote mode** and the **gateway mode** for starting batch jobs. 45 42 46 The Fortran program will be compiled during the installation process by the :doc:`Input/fortran_makefile`.43 The Fortran program is compiled during the installation process using the :doc:`Input/fortran_makefile`. 47 44 48 To sum up, the following scripts controls ``flex_extract``:45 To sum up, the following scripts control ``flex_extract``: 49 46 50 47 Installation -
For_developers/Sphinx/source/Documentation/output.rst
rb1674ed rf20342a 1 1 *********** 2 Output Data2 Output data 3 3 *********** 4 4 5 The output data of ``flex_extract`` are separated mainly into temporary files and the final ``FLEXPART`` input files:5 The output data of ``flex_extract`` can be divided into the final ``FLEXPART`` input files and temporary files: 6 6 7 7 +-----------------------------------------------+----------------------------------------------+ 8 8 | ``FLEXPART`` input files | Temporary files (saved in debug mode) | 9 9 +-----------------------------------------------+----------------------------------------------+ 10 | - Standard output file names | - MARS request file (opt)|10 | - Standard output file names | - MARS request file (optional) | 11 11 | - Output for pure forecast | - flux files | 12 12 | - Output for ensemble members | - VERTICAL.EC | … 21 21 ======================== 22 22 23 The final output files of ``flex_extract`` are also the meteorological ``FLEXPART`` input files.24 The naming of these files depend on the kind of data extracted by ``flex_extract``.23 The final output files of ``flex_extract`` are the meteorological input files for ``FLEXPART``. 24 The naming convention for these files depends on the kind of data extracted by ``flex_extract``. 25 25 26 26 Standard output files 27 27 --------------------- 28 28 29 In general, there is a file for each time step with the filename format:29 In general, there is one file for each time step, named: 30 30 31 31 ..
code-block:: bash … 33 33 <prefix>YYMMDDHH 34 34 35 The ``prefix`` is by default defined as ``EN`` and can be re-defined in the ``CONTROL`` file.36 Each file contains all meteorological fields needed by ``FLEXPART`` for all selected model levels for a specific time step.37 38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` where we extracted only the lowest model level for demonstration reasons:35 where YY are the last two digits of the year, MM is the month, DD the day, and HH the hour (UTC). ``<prefix>`` is by default defined as ``EN``, and can be re-defined in the ``CONTROL`` file. 36 Each file contains all meteorological fields at all levels as needed by ``FLEXPART``, valid for the time indicated in the file name. 37 38 Here is an example output which lists the meteorological fields in a single file called ``CE00010800`` (where we extracted only the lowest model level for demonstration purposes): 39 39 40 40 .. code-block:: bash … 84 84 ------------------------------ 85 85 86 ``Flex_extract`` can retrieve forecasts which can be longer than 23 hours. To avoid collisions of time steps for forecasts of more than one day a new scheme for filenames in pure forecast mode is introduced:86 ``Flex_extract`` is able to retrieve forecasts with a lead time of more than 23 hours. In order to avoid collisions of time step names, a new scheme for filenames in pure forecast mode is introduced: 87 87 88 88 .. code-block:: bash 89 89 90 90 <prefix>YYMMDD.HH.<FORECAST_STEP> 91 91 92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file. ``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3 92 The ``<prefix>`` is, as in the standard output, by default ``EN`` and can be re-defined in the ``CONTROL`` file.
``YYMMDD`` is the date format and ``HH`` the forecast time which is the starting time for the forecasts. The ``FORECAST_STEP`` is a 3-digit number which represents the forecast step in hours. 93 93 94 94 … 96 96 ------------------------------------- 97 97 98 Ensembles can be retrieved and are addressed by the grib message parameter ``number``. The ensembles are saved per file and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3 digit format.98 ``Flex_extract`` is able to retrieve ensemble data; they are labelled by the grib message parameter ``number``. Each ensemble member is saved in a separate file, and standard filenames are supplemented by the letter ``N`` and the ensemble member number in a 3-digit format. 99 99 100 100 .. code-block:: bash … 106 106 ------------------------------------------------------- 107 107 108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type. They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`. 109 The two additional fields are marked with the ``step`` parameter in the Grib messages and are set to "1" and "2" for sub-grid point 1 and 2 respectively. 110 The output filenames do not change in this case. 111 Below is an example list of precipitation fields in an output file generated with the new disaggregation method: 108 The new disaggregation method for precipitation fields produces two additional precipitation fields for each time step and precipitation type (large-scale and convective). They serve as sub-grid points in the original time interval. For details of the method see :doc:`disagg`. 109 The two additional fields are addressed using the ``step`` parameter in the GRIB messages, which 110 is set to "1" or "2", for sub-grid points 1 and 2, respectively. 111 The output file names are not altered.
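The naming conventions described above can be summarised in a small sketch. The dates, steps, and member numbers are hypothetical, and for the ensemble case the exact placement of the ``N`` suffix is an assumption that should be checked against actual ``flex_extract`` output:

```python
from datetime import datetime

def standard_name(prefix, when):
    # <prefix>YYMMDDHH, e.g. EN00010800
    return prefix + when.strftime('%y%m%d%H')

def forecast_name(prefix, when, step):
    # <prefix>YYMMDD.HH.<FORECAST_STEP> with a 3-digit forecast step
    return '{}{}.{:03d}'.format(prefix, when.strftime('%y%m%d.%H'), step)

def ensemble_name(prefix, when, member):
    # standard name plus the letter 'N' and a 3-digit member number;
    # the separating dot is an assumption made for this sketch
    return '{}.N{:03d}'.format(standard_name(prefix, when), member)

when = datetime(2000, 1, 8, 0)
standard_name('EN', when)      # -> 'EN00010800'
forecast_name('EN', when, 36)  # -> 'EN000108.00.036'
```

A two-digit year keeps the standard name at ten characters, which matches the example file ``CE00010800`` shown earlier.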
112 An example of the list of precipitation fields in an output file generated with the new disaggregation method is found below: 112 113 113 114 .. code-block:: bash … … 129 130 =============== 130 131 131 ``Flex_extract`` works with a number of temporary data files which are usually deleted after a successful data extraction. They are only storedif the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`).132 ``Flex_extract`` creates a number of temporary data files which are usually deleted at the end of a successful run. They are preserved only if the ``DEBUG`` mode is switched on (see :doc:`Input/control_params`). 132 133 133 134 MARS grib files … … 135 136 136 137 ``Flex_extract`` retrieves all meteorological fields from MARS and stores them in files ending with ``.grb``. 137 Since the request times and data transfer of MARS access are limited and ECMWF asks for efficiency in requesting data from MARS, ``flex_extract`` splits the overall data request in several smaller requests. Each request is stored in an extra ``.grb`` file and the file names are put together by several pieces of information: 138 Since there are limits implemented by ECMWF for the time per request and data transfer from MARS, 139 and as ECMWF asks for efficient MARS retrievals, ``flex_extract`` splits the overall data request 140 into several smaller requests. Each request is stored in its own ``.grb`` file, and the file 141 names are composed of several pieces of information: 138 142 139 143 .. 
code-block:: bash … … 144 148 145 149 Field type: 146 ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4 dvariational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast150 ``AN`` - Analysis, ``FC`` - Forecast, ``4V`` - 4D variational analysis, ``CV`` - Validation forecast, ``CF`` - Control forecast, ``PF`` - Perturbed forecast 147 151 Grid type: 148 ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat /lon), ``_OROLSM`` - Orography parameter152 ``SH`` - Spherical Harmonics, ``GG`` - Gaussian Grid, ``OG`` - Output Grid (typically lat / lon), ``_OROLSM`` - Orography parameter 149 153 Temporal property: 150 154 ``__`` - instantaneous fields, ``_acc`` - accumulated fields 151 155 Level type: 152 ``ML`` - Model Level, ``SL`` - Surface Level156 ``ML`` - model level, ``SL`` - surface level 153 157 ppid: 154 The process number of the parent process of submitted script.158 The process number of the parent process of the script submitted. 155 159 pid: 156 The process number of the submitted script. 157 158 The process ids should avoid mixing of fields if several ``flex_extract`` jobs are performed in parallel (which is, however, not recommended). The date format is YYYYMMDDHH. 159 160 Example ``.grb`` files for a day of CERA-20C data: 160 The process number of the script submitted. 161 162 163 Example ``.grb`` files for one day of CERA-20C data: 161 164 162 165 .. code-block:: bash … … 172 175 ----------------- 173 176 174 This file is a ``csv`` file called ``mars_requests.csv`` with a list of the actual settings of MARS request parameters (one request per line) in a flex_extract job. It is used for documenting the data which were retrieved and for testing reasons. 
175 176 Each request consist of the following parameters, whose meaning mainly can be taken from :doc:`Input/control_params` or :doc:`Input/run`: 177 This file is a ``csv`` file called ``mars_requests.csv`` listing the actual settings of the MARS 178 request (one request per line) in a flex_extract job. 179 It is used for documenting which data were retrieved, and for testing. 180 181 Each request consists of the following parameters, whose meaning mostly can be taken from :doc:`Input/control_params` or :doc:`Input/run`: 177 182 request_number, accuracy, area, dataset, date, expver, gaussian, grid, levelist, levtype, marsclass, number, param, repres, resol, step, stream, target, time, type 178 183 179 Example output of a one day retrieval of CERA-20cdata:184 Example output of a one-day retrieval of CERA-20C data: 180 185 181 186 .. code-block:: bash … … 192 197 ----------- 193 198 194 The vertical discretization of model levels. This file contains the ``A`` and ``B`` parameters to calculate the model level height in meters. 199 This file contains information describing the vertical discretisation (model levels) 200 in form of the ``A`` and ``B`` parameters which allow to calculate the actual pressure of a model level from the surface pressure. 195 201 196 202 … … 198 204 ---------- 199 205 200 This file is usually called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords. 201 202 203 flux files 206 This file is called ``date_time_stepRange.idx``. It contains indices pointing to specific grib messages from one or more grib files. The messages are selected with a composition of grib message keywords. 
207 #PS NEEDS MORE DESCRIPTION 208 209 210 Flux files 204 211 ---------- 205 212 206 The flux files contain the de-accumulated and dis-aggregated flux fields of large scale and convective precipitation, eastward turbulent surface stress, northward turbulent surface stress, surface sensible heat fluxand the surface net solar radiation.213 The flux files contain the de-accumulated and dis-aggregated flux fields of large-scale and convective precipitation, east- and northward turbulent surface stresses, the surface sensible heat flux, and the surface net solar radiation. 207 214 208 215 .. code-block:: bash … … 210 217 flux<date>[.N<xxx>][.<xxx>] 211 218 212 The date format is YYYYMMDDHH . The optional block ``[.N<xxx>]`` marks the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step.219 The date format is YYYYMMDDHH as explained before. The optional block ``[.N<xxx>]`` marks the ensemble forecast number, where ``<xxx>`` is the ensemble member number. The optional block ``[.<xxx>]`` marks a pure forecast with ``<xxx>`` being the forecast step. 213 220 214 221 .. note:: … … 216 223 In the case of the new dis-aggregation method for precipitation, two new sub-intervals are added in between each time interval. They are identified by the forecast step parameter which is ``0`` for the original time interval and ``1`` or ``2`` for the two new intervals respectively.
217 224 218 225 … … 226 233 fort.xx 227 234 228 where ``xx`` is thenumber which defines the meteorological fields stored in these files.229 They are generated by the Python part of ``flex_extract`` by just splitting the meteorological fields for a unique time stamp from the ``*.grb`` files into the ``fort`` files.230 The following table defines the numbers with their corresponding content.235 where ``xx`` is a number which defines the meteorological fields stored in these files. 236 They are generated by the Python code in ``flex_extract`` by splitting the meteorological fields for a unique time stamp from the ``*.grb`` files, storing them under the names ``fort.<XX>`` where <XX> represents some number. 237 The following table defines the numbers and the corresponding content: 231 238 232 239 .. csv-table:: Content of fort - files … … 240 247 "16", "surface fields" 241 248 "17", "specific humidity" 242 "18", "surface specific humidity (reduced gaussian)"243 "19", " vertical velocity (pressure) (optional)"249 "18", "surface specific humidity (reduced Gaussian grid)" 250 "19", "omega (vertical velocity in pressure coordinates) (optional)" 244 251 "21", "eta-coordinate vertical velocity (optional)" 245 "22", "total cloud 246 247 Some of the fields are solely retrieved with specific settings, e. g. the eta-coordinate vertical velocity is not available in ERA-Interim datasets and the total cloud water content is an optional fieldfor ``FLEXPART v10`` and newer.252 "22", "total cloud-water content (optional)" 253 254 Some of the fields are retrieved only with specific settings, e. g., the eta-coordinate vertical velocity is not available in ERA-Interim datasets, and the total cloud-water content is an optional field which is useful for ``FLEXPART v10`` and newer. 248 255 249 256 The ``calc_etadot`` program saves its results in file ``fort.15`` which typically contains: … … 259 266 ..
note:: 260 267 261 The ``fort.4`` file is the namelist file to drivethe Fortran program ``calc_etadot``. It is therefore also an input file.268 The ``fort.4`` file is the namelist file to control the Fortran program ``calc_etadot``. It is therefore also an input file. 262 269 263 270 Example of a namelist: -
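For quick orientation, the fort-file numbering shown in the table above can be expressed as a small Python mapping. This is an illustrative sketch only (the names ``FORT_CONTENT`` and ``describe`` are not part of ``flex_extract``), and it covers only the table rows reproduced here:

```python
# Illustrative sketch: map fort.<xx> unit numbers to the contents
# listed in the table above. Rows not shown in the table are omitted.
FORT_CONTENT = {
    16: "surface fields",
    17: "specific humidity",
    18: "surface specific humidity (reduced Gaussian grid)",
    19: "omega (vertical velocity in pressure coordinates) (optional)",
    21: "eta-coordinate vertical velocity (optional)",
    22: "total cloud-water content (optional)",
}

def describe(unit: int) -> str:
    """Return the documented content of a fort.<xx> file, if known."""
    return FORT_CONTENT.get(unit, "not documented in this table")

print(describe(21))  # eta-coordinate vertical velocity (optional)
```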
For_developers/Sphinx/source/Documentation/overview.rst
rb1674ed rf20342a 3 3 ======== 4 4 5 ``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) M ars archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` Atmospheric Transport Modelling system.6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who want sto use meteorological data from ECMWF to drive the ``FLEXPART`` model.7 The software retrieves the minim al number of parameters ``FLEXPART`` needs to work and provides the data in the explicity format ``FLEXPART`` understands.5 ``Flex_extract`` is an open-source software to retrieve meteorological fields from the European Centre for Medium-Range Weather Forecasts (ECMWF) MARS archive to serve as input files for the ``FLEXTRA``/``FLEXPART`` atmospheric transport modelling system. 6 ``Flex_extract`` was created explicitly for ``FLEXPART`` users who want to use meteorological data from ECMWF to drive the ``FLEXPART`` model. 7 The software retrieves the minimum set of parameters needed by ``FLEXPART`` to work, and provides the data in the specific format required by ``FLEXPART``. 8 8 9 ``Flex_extract`` consists of 2main parts:10 1. a Python part , where the reading of parameter settings, retrieving data from MARS and preparing the data for ``FLEXPART`` is doneand11 2. a Fortran part , where the calculation of the vertical velocity is done and if necessary the conversion from spectralto regular latitude/longitude grids.9 ``Flex_extract`` consists of two main parts: 10 1. a Python part which reads the parameter settings, retrieves the data from MARS, and prepares them for ``FLEXPART``, and 11 2. a Fortran part which calculates the vertical velocity and, if necessary, converts variables from the spectral representation to regular latitude/longitude grids. 
12 12 13 Additionally, it has some Korn shell scripts which are usedto set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information of application modes.13 In addition, there are some Korn shell scripts to set the environment and batch job features on ECMWF servers for the *gateway* and *remote* mode. See :doc:`Overview/app_modes` for information on the application modes. 14 14 15 15 A number of Shell scripts are wrapped around the software package for easy installation and fast job submission. 16 16 17 The software depends on a number of third-party libraries which can be found in :ref:`ref-requirements`.17 The software depends on some third-party libraries as listed in :ref:`ref-requirements`. 18 18 19 Some details on the tasks and program worksteps are described in :doc:`Overview/prog_flow`.19 Details of the tasks and program work steps are described in :doc:`Overview/prog_flow`. 20 20 21 21 -
For_developers/Sphinx/source/Documentation/vertco.rst
rb1674ed rf20342a 1 1 ******************* 2 Vertical Coordinate2 Vertical wind 3 3 ******************* 4 4 5 Calculation of vertical velocity and preparation of Output-files5 Calculation of vertical velocity and preparation of output files 6 6 ================================================================ 7 7 8 ``flex_extract`` has two ways to calculatethe vertical velocity for ``FLEXTRA``/``FLEXPART``:8 Two methods are provided in ``flex_extract`` for the calculation of the vertical velocity for ``FLEXTRA``/``FLEXPART``: 9 9 (i) from the horizontal wind field, 10 (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C** .10 (ii) from the MARS parameter 77, which is available for operational forecasts and analyses since September 2008 and for reanalysis datasets **ERA5** and **CERA-20C**, which contains the vertical velocity directly in the eta coordinate system of the ECMWF model. 11 11 12 12 Especially for high resolution data, use of the ``MARS`` parameter 77 is recommended, … … 20 20 21 21 22 Calculation of vertical velocity fromhorizontal wind using the continuity equation22 Calculation of the vertical velocity from the horizontal wind using the continuity equation 23 23 =================================================================================== 24 24 25 The vertical velocity is computed by the FORTRAN90 program ``calc_etadot`` in the ECMWF 26 vertical coordinate system by applying the equation of continuity and thereby ensuring mass consistent 3D wind fields. A detailed description of ``calc_etadot`` can be found in the 25 The vertical velocity in the ECMWF's eta vertical coordinate system is computed by the Fortran program ``calc_etadot``, using the continuity equation and thereby ensuring mass-consistent 3D wind fields. 
A detailed description of ``calc_etadot`` can be found in the 27 26 documents v20_update_protocol.pdf, V30_update_protocol.pdf and 28 27 V40_update_protocol.pdf. The computational demand and accuracy of ``calc_etadot`` is highly … 30 29 following guidance can be given for choosing the right parameters: 31 30 32 * For very fine output grids (0.25 degree or finer) the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution (and the calculation of vertical velocity on the Gaussian grid (``GAUSS=1``) is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application.31 * For very fine output grids (0.25 degree or finer), the full resolution T799 or even T1279 of the operational model is required (``RESOL=799``, ``SMOOTH=0``). The highest available resolution, in particular combined with the calculation of the vertical velocity on the Gaussian grid (``GAUSS=1``), is, however, rather demanding and feasible only for resolutions up to T799. Higher resolutions are achievable on the HPC. If data retrieval at T1279 needs to be performed on *ecgate*, the computation of the vertical velocity is feasible only on the lat/lon grid (``GAUSS=0``), which also yields very good results. Please read document v20_update_protocol.pdf-v60_update_protocol.pdf to see if the errors incurred are acceptable for the planned application. 33 32 * For lower resolution (often global) output grids, calculation of vertical velocities with lower than operational spectral resolution is recommended.
For global grids the following settings appear optimal: 34 33 - For 1.0 degree grids: ``GAUSS=1``, ``RESOL=255``, ``SMOOTH=179`` 35 34 - For 0.5 degree grids: ``GAUSS=1``, ``RESOL=399``, ``SMOOTH=359`` 36 35 - Calculation on the lat/lon grid is not recommended for less than the operational (T1279) resolution. 37 - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279 , 2047 in future models). This choice is restricted because a reduced Gaussian grid is defined in thenECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution.38 - For ``SMOOTH`` any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if vertical velocity is calculated onlat\/lon grid (``GAUSS=0``).39 * The on 36 - If ``GAUSS`` is set to 1, only the following choices are possible for ``RESOL`` on *ecgate*: 159,255,319,399,511,799, (on the HPC also 1279; 2047 in future model versions). This choice is restricted because a reduced Gaussian grid is defined in the ECMWF EMOSLIB only for these spectral resolutions. For ``GAUSS=0``, ``RESOL`` can be any value below the operational resolution. 37 - For ``SMOOTH``, any resolution lower than ``RESOL`` is possible. If no smoothing is desired, ``SMOOTH=0`` should be chosen. ``SMOOTH`` has no effect if the vertical velocity is calculated on a lat\/lon grid (``GAUSS=0``). 38 * The on-demand scripts send an error message for settings where ``SMOOTH`` (if set) and ``RESOL`` are larger than 360./``GRID``/2, since in this case, the output grid cannot resolve the highest wave numbers. The scripts continue operations, however. 40 39 * Regional grids are not cyclic in zonal directions, but global grids are. The software assumes a cyclic grid if ``RIGHT``-``LEFT`` is equal to ``GRID`` or is equal to ``GRID``-360. 
41 * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` for application in ATM modelling. If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and the surface pressure tendency. ``OMEGADIFF`` should bezero except for debugging, since it triggers expensive calculations on the Gaussian grid.40 * Finally, model and flux data as well as the vertical velocity computed are written to files ``<prefix>yymmddhh`` (the standard ``flex_extract`` output files). If the parameters ``OMEGA`` or ``OMEGADIFF`` are set, also files ``OMEGAyymmddhh`` are created, containing the pressure vertical velocity (omega) and the difference between omega from ``MARS`` and from the surface pressure tendency. ``OMEGADIFF`` should be set to zero except for debugging, since it triggers expensive calculations on the Gaussian grid. 42 41 43 42 44 Calculation of vertical velocity frompre-calculated MARS parameter 7743 Calculation of the vertical velocity from the pre-calculated MARS parameter 77 45 44 ====================================================================== 46 45 47 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that this parameteris available if the ``CONTROL`` parameter ``ETA`` is set to 1.46 Since November 2008, the parameter 77 (deta/dt) is stored in ``MARS`` on full model levels. ``FLEXTRA``/``FLEXPART`` in its current version requires ``deta/dt`` on model half levels, multiplied by ``dp/deta``. In ``flex_extract``, the program ``calc_etadot`` assumes that parameter 77 is available if the ``CONTROL`` parameter ``ETA`` is set to 1.
48 47 49 48 It is recommended to use the pre-calculated parameter 77 by setting ``ETA`` to 1 whenever possible. 50 49 51 Setting parameter ``ETA`` to 1 normallydisables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time. 50 Setting the parameter ``ETA`` to 1 disables calculation of vertical velocity from the horizontal wind field, which saves a lot of computational time. 52 51 53 52 .. note:: 54 However, the calculation on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you are really need it for debugging since this is a very expensive option. In this case``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity.53 However, the calculations on the Gaussian grid are avoided only if both ``GAUSS`` and ``ETADIFF`` are set to 0. Please set ``ETADIFF`` to 1 only if you really need it for debugging since this is a very expensive option. In this case, ``ETAyymmddhh`` files are produced that contain the vertical velocity from horizontal winds and the difference to the pre-calculated vertical velocity. 55 54 56 55 The parameters ``RESOL``, ``GRID``, ``UPPER``, ``LOWER``, ``LEFT``, ``RIGHT`` still apply. As for calculations on the Gaussian grid, the spectral resolution parameter ``RESOL`` should be compatible with the grid resolution (see previous subsection).
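To make the compatibility rule above concrete: the on-demand scripts warn when the effective spectral truncation (``SMOOTH`` if set, otherwise ``RESOL``) exceeds 360/``GRID``/2. Below is a minimal Python sketch of one plausible reading of this check; the function is illustrative and not ``flex_extract``'s actual code:

```python
def spectral_settings_ok(grid: float, resol: int, smooth: int = 0) -> bool:
    """Documented rule of thumb: the effective spectral truncation
    (SMOOTH if non-zero, else RESOL) should not exceed 360/GRID/2,
    so that the output grid can resolve the highest wave numbers."""
    effective = smooth if smooth > 0 else resol
    return effective <= 360.0 / grid / 2.0

# Recommended settings for a 1.0 degree global grid: GAUSS=1, RESOL=255, SMOOTH=179
print(spectral_settings_ok(grid=1.0, resol=255, smooth=179))  # True  (179 <= 180)
print(spectral_settings_ok(grid=1.0, resol=255))              # False (255 > 180)
```

Note that, as stated above, the real scripts only emit an error message and continue; the boolean here merely mirrors the documented threshold.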