*****
Usage
*****

``flex_extract`` is a command-line tool. In its first versions, it was started via a Korn shell script; since version 6, the entry point has been a Python script. From version 7.1 on, a bash shell script is provided to call ``flex_extract`` with the command-line parameters.

To submit an extraction job, change the working directory to the subdirectory ``Run`` (directly under the ``flex_extract_vX.X`` root directory, where ``X.X`` is the version number):

.. code-block:: bash

    cd <path-to-flex_extract_vX.X>/Run

Within this directory you can find everything you need to modify and run ``flex_extract``. The following tree shows a shortened list of directories and important files. The ``*`` serves as a wildcard. The brackets ``[]`` indicate that the file is present only in certain modes of application.

.. code-block:: bash

    Run
    ├── Control
    │   ├── CONTROL_*
    ├── Jobscripts
    │   ├── compilejob.ksh
    │   ├── job.ksh
    │   ├── [joboper.ksh]
    ├── Workspace
    │   ├── CERA_example
    │   │   ├── CE000908*
    ├── [ECMWF_ENV]
    ├── run_local.sh
    └── run.sh

The ``Jobscripts`` directory is used to store the Korn shell job scripts generated by a ``flex_extract`` run in the **Remote** or **Gateway** mode. They are used to submit the setup information to the ECMWF server and to start the jobs in ECMWF's batch mode. The typical user does not need to modify these files. They are generated from template files which are stored in the ``Templates`` directory under ``flex_extract_vX.X``. Usually there will be a ``compilejob.ksh`` and a ``job.ksh`` script, which are explained in the section :doc:`Documentation/input`. In the rare case of operational data extraction, there will be a ``joboper.ksh`` which reads the time information from environment variables on the ECMWF servers.

The ``Control`` directory contains a number of sample ``CONTROL`` files. They cover the current range of possible kinds of extractions. Some parameters in the ``CONTROL`` files can be adapted, and some others should not be changed. In this :doc:`quick_start` guide we explain how an extraction with ``flex_extract`` can be started in the different :doc:`Documentation/Overview/app_modes` and point out some specifics of each dataset and ``CONTROL`` file.

Directly under ``Run`` you find the files ``run.sh`` and ``run_local.sh``; depending on the selected :doc:`Documentation/Overview/app_modes`, there may also be a file named ``ECMWF_ENV`` containing the user credentials for quick and automatic access to the ECMWF servers.

From version 7.1 on, the ``run.sh`` (or ``run_local.sh``) script is the main entry point to ``flex_extract``.

.. note::

    Note that for experienced users (or users of older versions), it is still possible to start ``flex_extract`` directly via the ``submit.py`` script in the directory ``flex_extract_vX.X/Source/Python``.

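A minimal sketch of such a direct call (it assumes that ``submit.py`` offers a ``--help`` option listing the available command-line options, which should correspond to the parameters set in ``run.sh``; check the output on your installation):

.. code-block:: bash

    # go to the Python source directory and inspect the available options
    cd <path-to-flex_extract_vX.X>/Source/Python
    ./submit.py --help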


Job preparation
===============

To actually start a job with ``flex_extract``, it is sufficient to run either ``run.sh`` or ``run_local.sh``. Datasets and access modes are selected in the ``CONTROL`` files and within the user section of the ``run`` scripts. One should select one of the sample ``CONTROL`` files. The following sections describe the differences between the application modes and where the results will be stored.


Remote and gateway modes
------------------------

For member-state users, it is recommended to use the *remote* or *gateway* mode, especially for more demanding tasks: the data are retrieved and converted on ECMWF machines, and only the final output files are transferred to the local host.

Remote mode
    The only difference between the two modes is the user's working location. In the *remote* mode, you have to log in to the ECMWF server and then go to the ``Run`` directory as shown above. On the ECMWF servers, ``flex_extract`` is installed in the ``$HOME`` directory. However, to be able to start the program, you first have to load the ``Python3`` environment with the module system.

    .. code-block:: bash

        # Remote mode
        ssh -X <ecuid>@ecaccess.ecmwf.int

    .. code-block:: bash

        # On ECMWF server
        [<ecuid>@ecgb11 ~]$ cd flex_extract_vX.X/Run

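    To load it, something like the following could be used (a sketch only; the module name ``python3`` is an assumption, check ``module avail`` on the server if it differs):

    .. code-block:: bash

        # On ECMWF server: make Python3 available via the module system
        module load python3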

Gateway mode
    For the gateway mode, you have to log in to the gateway server and go to the ``Run`` directory of ``flex_extract``:

    .. code-block:: bash

        # Gateway mode
        ssh <user>@<gatewayserver>
        cd <path-to-flex_extract_vX.X>/Run


From here on, the working process is the same for both modes.

For your first submission, you should use one of the example ``CONTROL`` files stored in the ``Control`` directory. We recommend extracting *CERA-20C* data, since this usually gives quick results and is well suited for testing.

To do so, open the ``run.sh`` file and modify the marked parameter block as shown below:

.. code-block:: bash

    # -----------------------------------------------------------------
    # AVAILABLE COMMANDLINE ARGUMENTS TO SET
    #
    # THE USER HAS TO SPECIFY THESE PARAMETERS:

    QUEUE='ecgate'
    START_DATE=None
    END_DATE=None
    DATE_CHUNK=None
    JOB_CHUNK=3
    BASETIME=None
    STEP=None
    LEVELIST=None
    AREA=None
    INPUTDIR=None
    OUTPUTDIR=None
    PP_ID=None
    JOB_TEMPLATE='jobscript.template'
    CONTROLFILE='CONTROL_CERA'
    DEBUG=0
    REQUEST=2
    PUBLIC=0


This would retrieve a one-day (8 September 2000) *CERA-20C* dataset with 3-hourly temporal resolution and a small 1° domain over Europe. Since the ``ectrans`` parameter is set to ``1``, the resulting output files will be transferred to the local gateway, into the path stored as the destination (see the installation instructions). The parameters listed in the ``run.sh`` file overwrite the corresponding settings in the ``CONTROL`` file.

To start the retrieval, simply run the script:

.. code-block:: bash

    ./run.sh

``Flex_extract`` will print some information about the job. If there is no error in the submission to the ECMWF server, you will see something like this:

.. code-block:: bash

    ---- On-demand mode! ----
    The job id is: 10627807
    You should get an email per job with subject flex.hostname.pid
    FLEX_EXTRACT JOB SCRIPT IS SUBMITED!


Once submitted, you can check the progress of the job using ``ecaccess-job-list``. After the job has finished, you should receive an e-mail with a detailed protocol of what was done.
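
For example, on the ECMWF server (``ecaccess-job-list`` is part of the ECaccess tools installed there):

.. code-block:: bash

    # list your submitted jobs and their status
    ecaccess-job-list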

If the job fails, you will receive an e-mail with the subject ``ERROR!`` and the job name. You can then check the e-mail for information, or look for debugging information in the ``$SCRATCH`` directory on the ECMWF server.

.. code-block:: bash

    cd $SCRATCH
    ls -rthl

The last command lists the most recent logs and temporary retrieval directories (usually ``extractXXXXX``, where XXXXX is the process id). Under ``extractXXXXX``, a copy of the ``CONTROL`` file is stored under the name ``CONTROL``, the protocol is stored in the file ``prot``, and the temporary files as well as the resulting files are stored in a directory ``work``. The original name of the ``CONTROL`` file is stored in this new file under the parameter ``controlfile``.

.. code-block:: bash
    :caption: "Example structure of the ``flex_extract`` output directory on the ECMWF servers."

    extractXXXXX
    ├── CONTROL
    ├── prot
    ├── work
    │   ├── temporary files
    │   ├── CE000908* (resulting files)

If the job was submitted to the HPC (``queue=cca`` or ``queue=ccb``), you may log in to the HPC and look into the directory ``/scratch/ms/ECGID/ECUID/.ecaccess_do_not_remove`` for job logs. The working directories are deleted after a job failure and thus normally cannot be accessed.

To check whether the resulting files were transferred to the local gateway server, you can use the command ``ecaccess-ectrans-list`` or check the destination path for the resulting files on your local gateway server.
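
For example (run wherever the ECaccess tools are available, e.g. on the ECMWF server or on the local gateway):

.. code-block:: bash

    # list recent ectrans transfers and their status
    ecaccess-ectrans-list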


Local mode
----------

To get to know the working process and to make your first submission, you can use one of the example ``CONTROL`` files stored in the ``Control`` directory as they are. For quick results and for testing, it is recommended to extract *CERA-20C* data.

Open the ``run_local.sh`` file and modify the marked parameter block as shown below. The differences between the two user groups are highlighted.

+-----------------------------------------------+-----------------------------------------------+
|   Take this for **member-state user**         |  Take this for **public user**                |
+-----------------------------------------------+-----------------------------------------------+
| .. code-block:: bash                          | .. code-block:: bash                          |
|    :emphasize-lines: 16,20,23                 |    :emphasize-lines: 16,20,23                 |
|                                               |                                               |
|    # -----------------------------------------|    # -----------------------------------------|
|    # AVAILABLE COMMANDLINE ARGUMENTS TO SET   |    # AVAILABLE COMMANDLINE ARGUMENTS TO SET   |
|    #                                          |    #                                          |
|    # THE USER HAS TO SPECIFY THESE PARAMETERs:|    # THE USER HAS TO SPECIFY THESE PARAMETERs:|
|    #                                          |    #                                          |
|                                               |                                               |
|    QUEUE=''                                   |    QUEUE=''                                   |
|    START_DATE=None                            |    START_DATE=None                            |
|    END_DATE=None                              |    END_DATE=None                              |
|    DATE_CHUNK=None                            |    DATE_CHUNK=None                            |
|    JOB_CHUNK=None                             |    JOB_CHUNK=None                             |
|    BASETIME=None                              |    BASETIME=None                              |
|    STEP=None                                  |    STEP=None                                  |
|    LEVELIST=None                              |    LEVELIST=None                              |
|    AREA=None                                  |    AREA=None                                  |
|    INPUTDIR='./Workspace/CERA'                |    INPUTDIR='./Workspace/CERApublic'          |
|    OUTPUTDIR=None                             |    OUTPUTDIR=None                             |
|    PP_ID=None                                 |    PP_ID=None                                 |
|    JOB_TEMPLATE=''                            |    JOB_TEMPLATE=''                            |
|    CONTROLFILE='CONTROL_CERA'                 |    CONTROLFILE='CONTROL_CERA.public'          |
|    DEBUG=0                                    |    DEBUG=0                                    |
|    REQUEST=0                                  |    REQUEST=0                                  |
|    PUBLIC=0                                   |    PUBLIC=1                                   |
|                                               |                                               |
+-----------------------------------------------+-----------------------------------------------+


This would retrieve a one-day (8 September 2000) *CERA-20C* dataset with 3-hourly temporal resolution and a small 1° domain over Europe. The destination for this retrieval is the ``Workspace`` directory under ``Run``; this can be changed to any path you like. The parameters listed in ``run_local.sh`` overwrite the corresponding settings in the ``CONTROL`` file.

To start the retrieval, run the script:

.. code-block:: bash

    ./run_local.sh

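After a successful run, the resulting GRIB files for this example date (``CE000908*``, cf. the directory tree above) should appear in the directory given above via ``INPUTDIR``, for instance:

.. code-block:: bash

    # member-state example: list the retrieved files
    ls -lh ./Workspace/CERA
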
While job submission on the local host is convenient and easy to monitor (on standard output), there are a few caveats with this option:

    1. There is a maximum size of 20 GB for a single retrieval via the ECMWF Web API. Normally this is not a problem, but for global fields with T1279 resolution and hourly time steps the limit may already apply.
    2. If the retrieved MARS files are large but the resulting files are relatively small (small local domain), then the retrieval to the local host may be inefficient, since all data must be transferred via the Internet. This scenario applies most notably if ``etadot`` has to be calculated via the continuity equation, as this requires global fields even if the domain is local. In this case, job submission via ecgate might be a better choice. It really depends on the use patterns and also on the Internet connection speed.



Selection and adjustment of ``CONTROL`` files
=============================================

This section describes how to work with the ``CONTROL`` files. A detailed explanation of the ``CONTROL`` file parameters and of the naming convention can be found `here <Documentation/Input/control.html>`_. The more accurately the ``CONTROL`` file describes the retrieval needed, the fewer command-line parameters need to be set in the ``run`` scripts. With version ``7.1``, all ``CONTROL`` file parameters have default values; they can be found in the section `CONTROL parameters <Documentation/Input/control_params.html>`_ or in the ``CONTROL.documentation`` file within the ``Control`` directory. Only those parameters which deviate from the defaults for a particular retrieval need to be set in a ``CONTROL`` file!
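
A minimal ``CONTROL`` file can therefore be quite short. The following is a purely hypothetical sketch (not a copy of one of the distributed files); it uses only parameters that are discussed later in this section, and everything else falls back to the defaults:

.. code-block:: bash

    START_DATE 20000908
    DTIME 3
    TYPE AN FC FC FC AN FC FC FC
    TIME 00 00 00 00 12 12 12 12
    STEP 00 03 06 09 00 03 06 09
    GRID 1.
    LEFT -15.
    LOWER 30.
    UPPER 75.
    RIGHT 45.
    ETA 1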

The restriction of the dataset to be retrieved should be done very carefully. The datasets differ in many ways and vary over time in resolution and parameterisation methods; in particular, the operational model cycle improves through frequent changes. If you are not familiar with the data, it might be useful or necessary to check the availability of data in ECMWF's MARS:

    - **Public users** can check and list the available data at the `Public datasets web interface <https://apps.ecmwf.int/datasets/>`_.
    - **Member-state users** can check the availability of data online in the `MARS catalogue <https://apps.ecmwf.int/mars-catalogue/>`_.

There you can select, step by step, which data suit your needs. This is the most straightforward way of checking for available data and therefore limits the risk of ``flex_extract`` failing. The following figure shows an example of the web interface:


.. _ref-fig-mars-catalogue-ss:

.. figure:: _files/MARS_catalogue_snapshot.png


Additionally, you can find many helpful links to dataset documentation, direct links to specific dataset web catalogues, and further general information in the `link collection <Ecmwf/ec-links.html>`_ of the ECMWF data section.


``Flex_extract`` is specialised to retrieve a limited number of datasets, namely *ERA-Interim*, *CERA-20C*, *ERA5*, and *HRES (operational data)* as well as *ENS (operational data, 15-day forecast)*. The limitation relates mainly to the dataset itself, the stream (what kind of forecast or which subset of the dataset), and the experiment number. In most cases, the experiment number is ``1``, indicating that the current version should be used.

The next level of differentiation is the field type, level type, and time period. ``Flex_extract`` currently only supports the main streams of the re-analysis datasets, and provides extraction of different streams for the operational dataset. The possible combinations of dataset and stream selection are represented by the current list of example ``CONTROL`` files, as reflected in their names:


.. code-block:: bash
    :caption: "Current example ``CONTROL`` files distributed with ``flex_extract``."

    CONTROL_CERA
    CONTROL_CERA.global
    CONTROL_CERA.public
    CONTROL_EA5
    CONTROL_EA5.global
    CONTROL_EI
    CONTROL_EI.global
    CONTROL_EI.public
    CONTROL_OD.ELDA.FC.eta.ens.double
    CONTROL_OD.ENFO.CF
    CONTROL_OD.ENFO.CV
    CONTROL_OD.ENFO.PF
    CONTROL_OD.ENFO.PF.36hours
    CONTROL_OD.ENFO.PF.ens
    CONTROL_OD.OPER.4V.operational
    CONTROL_OD.OPER.FC.36hours
    CONTROL_OD.OPER.FC.eta.global
    CONTROL_OD.OPER.FC.eta.highres
    CONTROL_OD.OPER.FC.gauss.highres
    CONTROL_OD.OPER.FC.operational
    CONTROL_OD.OPER.FC.twicedaily.1hourly
    CONTROL_OD.OPER.FC.twicedaily.3hourly


The main differences and features of the datasets are listed in the table below:


.. _ref-tab-dataset-cmp:

.. figure:: _files/dataset_cmp_table.png


A common problem for beginners in retrieving ECMWF datasets is a mismatch in the choice of values for these parameters. For example, if you try to retrieve operational data for 24 June 2013 or earlier and set the maximum level to 137, you will get an error, because this number of levels was introduced only on 25 June 2013. Thus, be careful with the combination of spatial and temporal resolution as well as the field types.


.. note::

    Sometimes it might not be clear how specific parameters in the ``CONTROL`` file have to be set in terms of format. Please consult the description of the parameters in the section `CONTROL parameters <Documentation/Input/control_params.html>`_ or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_.

In the following, we briefly discuss the typical retrievals for the different datasets and point to the respective ``CONTROL`` files.


Public datasets
---------------

The main characteristic in the definition of a ``CONTROL`` file for a public dataset is the parameter ``DATASET``. Its specification enables the selection of a public dataset in MARS; without this parameter, the request would not find the dataset.
For the two public datasets *CERA-20C* and *ERA-Interim*, an example file with the ending ``.public`` is provided and can be used straightaway.

.. code-block:: bash

    CONTROL_CERA.public
    CONTROL_EI.public

For *CERA-20C*, there seem to be no differences compared to the full dataset, whereas the *public ERA-Interim* has only 6-hourly analysis fields on model levels, without forecasts to fill the gaps in between. Therefore, it is only possible to retrieve 6-hourly data for *public ERA-Interim*.

.. note::

    In principle, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. As soon as this becomes possible, it will be announced on the community website and on the FLEXPART user e-mail list.


CERA
----

For this dataset, it is important to keep in mind that it is available for the period 09/1901 until 12/2010, and that the temporal resolution is limited to 3 h.
It is a pure ensemble data assimilation dataset and is stored under the ``enda`` stream.
There are 10 ensemble members. The example ``CONTROL`` files retrieve the first member only (``NUMBER 0``). You may change this to another member or to a list of members (e.g. ``NUMBER 0/to/10``).
Another important difference from all other datasets is that there is only one forecast per day, starting at 18 UTC. The forecast lead time is 24 hours and extends beyond the calendar day. Therefore, ``flex_extract`` also needs to extract the day before the first day for which data are desired; this is handled automatically.
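
For example, to retrieve a whole range of members instead of only the first one, the corresponding ``CONTROL`` file entry could read (using the syntax quoted above):

.. code-block:: bash

    NUMBER 0/to/10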


ERA5
----

This is the latest re-analysis dataset, with a temporal resolution of 1 h (analysis fields). At the time of writing, it is available up to April 2019, with new months being released regularly.
The original horizontal resolution is 0.28125°, which requires some caution in the definition of the domain, since the length of the domain in the longitude and latitude directions must be an integer multiple of the resolution. It is also possible to use ``0.25`` for the resolution; MARS will then automatically interpolate to this resolution, which is still close enough to be acceptable.
The forecast starting times are ``06/18 UTC``, which is important for the flux data. Correspondingly, one should set ``ACCTYPE FC``, ``ACCTIME 06/18``, and ``ACCMAXSTEP 12`` in the ``CONTROL`` file.
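
In ``CONTROL`` file syntax, these flux-related settings read:

.. code-block:: bash

    ACCTYPE FC
    ACCTIME 06/18
    ACCMAXSTEP 12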

.. note::

    *ERA5* also includes an ensemble data assimilation system, but the related fields are not yet retrievable with ``flex_extract``, since the deaccumulation of the flux fields works differently in this stream. Ensemble field retrieval for *ERA5* is a *to-do* for the future.



ERA-Interim
-----------

The production of this re-analysis dataset was stopped on 31 August 2019!
It is available for the period from 1 January 1979 to 31 August 2019. The ``etadot`` parameter is not available in this dataset. Therefore, one must use the ``GAUSS`` parameter, which additionally retrieves the divergence field and calculates the vertical velocity from the continuity equation in the Fortran program ``calc_etadot``. While the analysis fields are available only every 6 hours, the dataset can be made 3-hourly by adding forecast fields in between. No ensemble members are available.
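
A sketch of the corresponding ``CONTROL`` file switches (set the other way round compared to the ``ETA`` case shown in the vertical velocity section further below):

.. code-block:: bash

    GAUSS 1
    ETA 0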


Operational data
----------------

This dataset provides the output of the real-time atmospheric model runs in high resolution, including 10-day forecasts. The model undergoes frequent adaptations and improvements. Thus, retrieving data from this dataset requires extra attention in selecting the correct parameter settings. See the :ref:`[Table of datasets]<ref-tab-dataset-cmp>` for the most important parameters.
Currently, fields can be retrieved at 1 h temporal resolution by filling the gaps between the analysis fields with 1-hourly forecast fields. Since 4 June 2008, the eta coordinate vertical velocity is directly available from MARS; therefore, ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°``, which, in combination with ``137`` vertical levels, can lead to problems in terms of job duration and disk space quota.
It is recommended to submit such high-resolution cases as single-day retrievals (see the ``JOB_CHUNK`` parameter in the ``run.sh`` script) to avoid job failures due to exceeding limits.
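
For example, in the parameter block of ``run.sh`` (assuming ``JOB_CHUNK`` gives the number of days per job, as suggested by the default of ``3`` shown above):

.. code-block:: bash

    JOB_CHUNK=1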

``CONTROL`` files for standard retrievals with a mix of analysis and forecast fields are listed below:

.. code-block:: bash

    CONTROL_OD.OPER.4V.eta.global
    CONTROL_OD.OPER.FC.eta.global
    CONTROL_OD.OPER.FC.eta.highres
    CONTROL_OD.OPER.FC.gauss.highres

These files define the minimum number of parameters necessary to retrieve a daily subset. The given settings for the ``TYPE`` parameter are already optimised and should only be changed if you know what you are doing. Grid, domain, and temporal resolution may be changed according to availability.

.. note::

    Please see the `information about MARS retrievals <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ for hints on retrieval efficiency and troubleshooting.


Long forecast
    It is possible to retrieve long forecasts exceeding one day. The available forecast period depends on the date and the forecast field type. Please use the MARS catalogue to check the availability. Below are some examples for 36-hour forecasts of the *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)* types.
    The *CV* field type was only available 3-hourly from 2006 to 2016. It is recommended to use the *CF* type, since this is available from 1992 (3-hourly) up to today (1-hourly). The *CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)*, which currently works with 50 ensemble members.
    Please be aware that in this case it is necessary to set the type for the flux fields explicitly; otherwise, a default value might be selected which is different from what you expect!

    .. code-block:: bash

        CONTROL_OD.ENFO.CF.36hours
        CONTROL_OD.ENFO.CV.36hours
        CONTROL_OD.OPER.FC.36hours


Half-day retrievals
    If a forecast is wanted for half a day only, this can be done by substituting the analysis fields with forecast fields, as shown in the files with ``twicedaily`` in their name. They produce a full-day retrieval with pure 12-hour forecasts, twice a day. It is also possible to use the operational version, which obtains the time information from ECMWF's environment variables and therefore uses the newest forecast for each day. This version uses a ``BASETIME`` parameter which tells MARS to extract the exact 12 hours up to the selected date. If the ``CONTROL`` file with ``basetime`` in the filename is used, this can be done for any other date, too.

    .. code-block:: bash

        CONTROL_OD.OPER.FC.eta.basetime
        CONTROL_OD.OPER.FC.operational
        CONTROL_OD.OPER.FC.twicedaily.1hourly
        CONTROL_OD.OPER.FC.twicedaily.3hourly


Ensemble members
    The retrieval of ensemble members was already mentioned in the pure forecast section and for the *CERA-20C* data.
    This version of ``flex_extract`` allows retrieving the *Ensemble Long window Data Assimilation (ELDA)* stream from the operational dataset. Until May 2019, there were 25 ensemble members and a control run (``number 0``). Starting with June 2019, the number of ensemble members has been increased to 50. Therefore, an option was added to create 25 additional "pseudo-ensemble members" for periods before June 2019: the original 25 members are taken from MARS, and the difference between the member value and the control value is subtracted twice. This is done if the parameter ``DOUBLEELDA`` is included and set to ``1``.

    .. code-block:: bash

        CONTROL_OD.ELDA.FC.eta.ens.double
        CONTROL_OD.ENFO.PF.ens


Specific features
-----------------

rrint
    Selects the disaggregation scheme for the precipitation flux: old (``0``) or new (``1``). See :doc:`Documentation/disagg` for an explanation.
cwc
    If present and set to ``1``, the total cloud water content is retrieved in addition. This is the sum of the cloud liquid and cloud ice water content.
addpar
    With this parameter, an additional list of 2-dimensional, non-flux parameters can be retrieved. Use the format ``param1/param2/.../paramx`` to list the parameters. Please be consistent in using either the parameter IDs or the short names as defined by MARS.
doubleelda
    Use this to double the number of ensemble members by adding a further disturbance to each member (to be used with 25 members).
debug
    If set to ``1``, all temporary files are preserved. Otherwise, everything except the final output files will be deleted.
request
    This produces an extra *csv* file, ``mars_requests.csv``, in which the content of each MARS request submitted within the job is stored; this is useful for debugging and documentation. Possible values are ``0`` for normal data retrieval, ``1`` for not retrieving data and only writing out the MARS requests, and ``2`` for retrieving data and writing out the requests.
mailfail
    By default, e-mails are sent to the mail address defined for the ECMWF user account. It is possible to overwrite this by specifying one or more e-mail addresses (comma-separated list). In order to include the e-mail address associated with the user account, add ``${USER}`` to the list.
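
A sketch of how such optional parameters could look in a ``CONTROL`` file (values purely illustrative; ``ADDPAR`` keeps the placeholder format given above):

.. code-block:: bash

    RRINT 1
    CWC 1
    ADDPAR param1/param2
    DEBUG 1
    REQUEST 2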


Hints for proper definition of certain parameter combinations
--------------------------------------------------------------

Field type and time
    This combination is very important. It defines the temporal resolution and which field type is extracted at each time step.
    The time declaration for analysis (AN) fields uses the times of the specific analyses, while the (forecast time) step has to be ``0``.
    The forecast field types (e.g. FC, CF, CV, PF) need to declare a combination of (forecast start) time and (forecast) step; together, they define the actual time. It is important to know the forecast starting times of the dataset to be retrieved, since they differ between datasets. In general, it is sufficient to give information for the exact time steps, but it is also possible to list more combinations of ``TYPE``, ``TIME`` and ``STEP``, because the temporal (hourly) resolution given by the ``DTIME`` parameter selects the correct combinations.

    .. code-block:: bash
       :caption: Example of a setting for the field types and temporal resolution. It will retrieve 3-hourly fields, with analyses at 00 and 12 UTC and the corresponding forecasts in between.

        DTIME 3
        TYPE AN FC FC FC AN FC FC FC
        TIME 00 00 00 00 12 12 12 12
        STEP 00 03 06 09 00 03 06 09


Vertical velocity
    The vertical velocity for ``FLEXPART`` is not directly available from MARS and has to be calculated.
    There are several options for this, and the following parameters are responsible for the selection. See :doc:`Documentation/vertco` for a detailed explanation. Using ``ETADIFF 1``, ``OMEGA 1`` and ``OMEGADIFF 1`` is recommended for debugging and testing only.
    Usually, one has to decide between ``GAUSS 1`` and ``ETA 1``. ``GAUSS 1`` means that spectral fields of the horizontal wind and the divergence are retrieved, and that the vertical velocity is calculated using the continuity equation. ``ETA 1`` means that the horizontal wind fields and ``etadot`` are retrieved on a regular lat-lon grid. It is recommended to use ``ETA 1`` where possible, as there is a substantial computational overhead for solving the continuity equation.

    .. code-block:: bash
        :caption: Example setting for the vertical coordinate retrieval (recommended if etadot fields are available).

        GAUSS 0
        ETA 1
        ETADIFF 0
        DPDETA 1
        OMEGA 0
        OMEGADIFF 0


Grid resolution and domain
    The grid and domain parameters depend on each other. ``GRID`` refers to the grid resolution. It can be given as a decimal value (e.g., ``1.`` meaning 1.0°) or, as in previous versions of ``flex_extract``, as an integer value referring to 1/1000 degrees (e.g., ``1000`` also meaning 1°). The code applies common sense to determine which format is to be assumed.
    After selecting the grid, the domain has to be defined. The extent in the longitude and latitude directions must be an integer multiple of the grid resolution.

    The horizontal resolution for spectral fields is set by the parameter ``RESOL``. For information about how to select an appropriate value, please read the explanation of the MARS keyword RESOL `in this entry of the ECMWF online documentation <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and `this table (also ECMWF documentation) <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_.

    .. code-block:: bash
        :caption: Example setting for a domain covering the northern hemisphere with a grid resolution of ``0.25°``.

        GRID 0.25
        RESOL 799
        SMOOTH 0
        UPPER 90.
        LOWER 0.
        LEFT -179.75
        RIGHT 180.


Flux data
    Flux fields are always forecast fields and contain the values of the fluxes accumulated since the start of the respective forecast. As certain re-analysis datasets cover all time steps with analysis fields, it was necessary to define a separate set of parameters for the flux fields. The following parameters are used specifically for flux fields: ``ACCTYPE`` is the field type (must be a forecast type), ``ACCTIME`` the forecast starting time, ``ACCMAXSTEP`` the maximum forecast step, and ``DTIME`` the temporal resolution. ``ACCTYPE`` is assumed to be the same during the whole period given by ``ACCTIME`` and ``ACCMAXSTEP``. These values are set automatically if not provided in a ``CONTROL`` file.

    .. code-block:: bash
       :caption: Example setting for the definition of flux fields.

        DTIME 3
        ACCTYPE FC
        ACCTIME 00/12
        ACCMAXSTEP 36


.. toctree::
    :hidden:
    :maxdepth: 2

..    user_guide/oper_modes
..    user_guide/ecmwf
..    user_guide/how_to
..    user_guide/control_templates