.. source: flex_extract.git/Documentation/html/_sources/quick_start.rst.txt @ 30f7911
   (last changed by Anne Philipp: "reviewed installation section of online documentation; minor corrections")

***********
Quick Start
***********

``Flex_extract`` by itself is a command-line tool. With version 7.1, an upstream shell script was introduced which calls ``flex_extract`` with the command-line parameters.

To submit a job with ``flex_extract``, change the working directory to the ``Run`` directory, which is located directly under the ``flex_extract_vX.X`` root directory (``X.X`` is the version number):

.. code-block:: bash

    cd <path-to-flex_extract_vX.X>/Run

Within this directory you find everything you need to modify and run ``flex_extract``. The following tree shows a shortened list of directories and important files. The ``*`` serves as a wildcard. The brackets ``[]`` indicate that the presence of the file depends on the application mode and that it might not be there.

.. code-block:: bash

    Run
    ├── Control
    │   ├── CONTROL_*
    ├── Jobscripts
    │   ├── compilejob.ksh
    │   ├── job.ksh
    │   ├── [joboper.ksh]
    ├── Workspace
    │   ├── CERA_example
    │   │   ├── CE000908*
    ├── [ECMWF_ENV]
    ├── run_local.sh
    └── run.sh

The ``Jobscripts`` directory is used to store the Korn shell job scripts generated by a ``flex_extract`` run in the **Remote** or **Gateway** mode. They are used to submit the setup information to the ECMWF server and to start the jobs in ECMWF's batch mode. Typical users do not have to touch these files. They are generated from template files which are stored in the ``Templates`` directory under ``flex_extract_vX.X``. Usually there will be a ``compilejob.ksh`` and a ``job.ksh`` script, which are explained in the section :doc:`Documentation/input`. In the rare case of operational data extraction there will be a ``joboper.ksh``, which is designed to get the time parameters from environment variables on the ECMWF servers.

The ``Control`` directory contains a number of example ``CONTROL`` files. These ``CONTROL`` files represent the current range of possible dataset retrievals with ``flex_extract``. Some parameters in the ``CONTROL`` files can be adapted, while others should not be changed. In this :doc:`quick_start` guide we explain how an extraction with ``flex_extract`` can be started in the different :doc:`Documentation/Overview/app_modes` and point out some specifics of each dataset and ``CONTROL`` file.

Directly under ``Run`` you find the files ``run.sh`` and ``run_local.sh``, and, depending on the selected :doc:`Documentation/Overview/app_modes`, there might also be a file named ``ECMWF_ENV`` holding the user credentials for quick and automatic access to ECMWF servers.

From version 7.1 on, the ``run.sh`` (or ``run_local.sh``) script is the main access point to ``flex_extract``.

.. note::

    For experienced users (or users of older versions), it is still possible to access ``flex_extract`` directly via the ``submit.py`` script, which can be found in the directory ``flex_extract_vX.X/Source/Python``.



Job preparation
===============

To start a job with ``flex_extract``, it is only necessary to submit the ``run.sh`` or ``run_local.sh`` script on the command line. To specify the dataset and select the access mode, select and modify a ``CONTROL`` file and change the parameters in the user section of the ``run`` scripts. The following sections describe the differences between the application modes and where the results will be stored.

Remote and gateway modes
------------------------

For member-state users it is recommended to use the *remote* or *gateway* mode, especially for more demanding tasks, to retrieve and convert data on ECMWF machines and to transfer only the final output files to the local host.

Remote mode
    The only difference between the two modes is the user's working location. In the *remote* mode you have to log in to the ECMWF server and then go to the ``Run`` directory as shown above. On the ECMWF servers, ``flex_extract`` is installed in the ``$HOME`` directory. However, to be able to start the program you first have to load the ``Python3`` environment with the module system.

    .. code-block:: bash

        # Remote mode
        ssh -X <ecuid>@ecaccess.ecmwf.int

    .. code-block:: bash

        # On ECMWF server
        [<ecuid>@ecgb11 ~]$ module load python3
        [<ecuid>@ecgb11 ~]$ cd flex_extract_vX.X/Run


Gateway mode
    For the gateway mode you have to log in on the gateway server and go to the ``Run`` directory of ``flex_extract``:

    .. code-block:: bash

        # Gateway mode
        ssh <user>@<gatewayserver>
        cd <path-to-flex_extract_vX.X>/Run


From here on, the working process is the same for both modes.

For your first submission you should use one of the example ``CONTROL`` files stored in the ``Control`` directory. We recommend extracting *CERA-20C* data, since these retrievals usually deliver quick results and are well suited for testing.

Open the ``run.sh`` file and modify the marked parameter block as shown below:

.. code-block:: bash

    # -----------------------------------------------------------------
    # AVAILABLE COMMANDLINE ARGUMENTS TO SET
    #
    # THE USER HAS TO SPECIFY THESE PARAMETERS:

    QUEUE='ecgate'
    START_DATE=None
    END_DATE=None
    DATE_CHUNK=None
    JOB_CHUNK=3
    BASETIME=None
    STEP=None
    LEVELIST=None
    AREA=None
    INPUTDIR=None
    OUTPUTDIR=None
    PP_ID=None
    JOB_TEMPLATE='job.temp'
    CONTROLFILE='CONTROL_CERA'
    DEBUG=0
    REQUEST=2
    PUBLIC=0


This would retrieve a one-day (08.09.2000) *CERA-20C* dataset with 3-hourly temporal resolution and a small 1° domain over Europe. Since the ``ectrans`` parameter is set to ``1``, the resulting output files will be transferred to the local gateway into the path stored in the destination (see the installation instructions). The parameters listed in the ``run.sh`` file overwrite the corresponding settings in the ``CONTROL`` file.

To start the retrieval you only have to execute the script:

.. code-block:: bash

    ./run.sh

``Flex_extract`` will print some information about the job. If there is no error in the submission to the ECMWF server, you will see something like this:

.. code-block:: bash

    ---- On-demand mode! ----
    The job id is: 10627807
    You should get an email per job with subject flex.hostname.pid
    FLEX_EXTRACT JOB SCRIPT IS SUBMITED!


Once submitted, you can check the progress of the job using ``ecaccess-job-list``. After the job has finished, you should receive an email with a detailed protocol of what was done.

In case the job fails, you will receive an email with the subject ``ERROR!`` and the job name. You can then check the email for information, or check the ``$SCRATCH`` directory on the ECMWF server for debugging information.

.. code-block:: bash

    cd $SCRATCH
    ls -rthl

The last command lists the most recent logs and temporary retrieval directories (usually ``pythonXXXXX``, where XXXXX is the process id). Under ``pythonXXXXX``, a copy of the ``CONTROL`` file is stored under the name ``CONTROL``, the protocol is stored in the file ``prot``, and the temporary files as well as the resulting files are stored in a directory ``work``. The original name of the ``CONTROL`` file is stored in this new file under the parameter ``controlfile``.

.. code-block:: bash
    :caption: "Example structure of ``flex_extract`` output directory on ECMWF servers."

    pythonXXXXX
    ├── CONTROL
    ├── prot
    ├── work
    │   ├── temporary files
    │   ├── CE000908* (resulting files)

If the job was submitted to the HPC (``queue=cca``), you may log in to the HPC and look into the directory ``/scratch/ms/ECGID/ECUID/.ecaccess_do_not_remove`` for job logs. The working directories are deleted after job failure and thus normally cannot be accessed.

To check whether the resulting files have been transferred to the local gateway server, you can use the command ``ecaccess-ectrans-list`` or check the destination path for the resulting files on your local gateway server.


Local mode
----------

To get to know the working process and to make your first submission, you can use one of the example ``CONTROL`` files stored in the ``Control`` directory as they are. For quick results and for testing, it is recommended to extract *CERA-20C* data.

Open the ``run_local.sh`` file and modify the marked parameter block as shown below. The differences are highlighted.

+-----------------------------------------------+-----------------------------------------------+
|   Take this for **member-state user**         |  Take this for **public user**                |
+-----------------------------------------------+-----------------------------------------------+
| .. code-block:: bash                          | .. code-block:: bash                          |
|    :emphasize-lines: 16,20,23                 |    :emphasize-lines: 16,20,23                 |
|                                               |                                               |
|    # -----------------------------------------|    # -----------------------------------------|
|    # AVAILABLE COMMANDLINE ARGUMENTS TO SET   |    # AVAILABLE COMMANDLINE ARGUMENTS TO SET   |
|    #                                          |    #                                          |
|    # THE USER HAS TO SPECIFY THESE PARAMETERs:|    # THE USER HAS TO SPECIFY THESE PARAMETERs:|
|    #                                          |    #                                          |
|                                               |                                               |
|    QUEUE=''                                   |    QUEUE=''                                   |
|    START_DATE=None                            |    START_DATE=None                            |
|    END_DATE=None                              |    END_DATE=None                              |
|    DATE_CHUNK=None                            |    DATE_CHUNK=None                            |
|    JOB_CHUNK=None                             |    JOB_CHUNK=None                             |
|    BASETIME=None                              |    BASETIME=None                              |
|    STEP=None                                  |    STEP=None                                  |
|    LEVELIST=None                              |    LEVELIST=None                              |
|    AREA=None                                  |    AREA=None                                  |
|    INPUTDIR='./Workspace/CERA'                |    INPUTDIR='./Workspace/CERApublic'          |
|    OUTPUTDIR=None                             |    OUTPUTDIR=None                             |
|    PP_ID=None                                 |    PP_ID=None                                 |
|    JOB_TEMPLATE=''                            |    JOB_TEMPLATE=''                            |
|    CONTROLFILE='CONTROL_CERA'                 |    CONTROLFILE='CONTROL_CERA.public'          |
|    DEBUG=0                                    |    DEBUG=0                                    |
|    REQUEST=0                                  |    REQUEST=0                                  |
|    PUBLIC=0                                   |    PUBLIC=1                                   |
|                                               |                                               |
+-----------------------------------------------+-----------------------------------------------+


This would retrieve a one-day (08.09.2000) *CERA-20C* dataset with 3-hourly temporal resolution and a small 1° domain over Europe. The destination location for this retrieval is the ``Workspace`` directory within ``Run``. This can be changed to whatever path you like. The parameters listed in ``run_local.sh`` overwrite the corresponding settings in the ``CONTROL`` file.

To start the retrieval you then execute the script:

.. code-block:: bash

    ./run_local.sh


While job submission on the local host is convenient and easy to monitor (on standard output), there are a few caveats with this option:

    1. There is a maximum size of 20 GB for a single retrieval via the ECMWF Web API. Normally this is not a problem, but for global fields with T1279 resolution and hourly time steps the limit may already apply.
    2. If the retrieved MARS files are large but the resulting files are relatively small (small local domain), then retrieval to the local host may be inefficient, since all data must be transferred via the Internet. This scenario applies most notably if ``etadot`` has to be calculated via the continuity equation, as this requires global fields even if the domain is local. In this case, job submission via ecgate might be a better choice. It really depends on the usage patterns and on the Internet connection speed.




Selection and adjustment of ``CONTROL`` files
=============================================


This section describes how to work with the ``CONTROL`` files. A detailed explanation of ``CONTROL`` file parameters and naming conventions can be found `here <Documentation/Input/control.html>`_. The more accurately the ``CONTROL`` file describes the required retrieval, the fewer command-line parameters need to be set in the ``run`` scripts. With version ``7.1``, all ``CONTROL`` file parameters have default values. They can be found in section `CONTROL parameters <Documentation/Input/control_params.html>`_ or in the ``CONTROL.documentation`` file within the ``Control`` directory. Only those parameters which need to be changed for a dataset retrieval have to be set in a ``CONTROL`` file!

Restricting the dataset to be retrieved should be done very carefully. The datasets differ in many ways and vary over time in resolution and parameterisation methods; in particular, the operational model cycles have improved through many changes over the years. If you are not familiar with the data, it might be useful or necessary to check the availability of data in ECMWF's MARS:

    - **Public users** can check and list available data at the `Public datasets web interface <https://apps.ecmwf.int/datasets/>`_.
    - **Member-state users** can check the availability of data online in the `MARS catalogue <https://apps.ecmwf.int/mars-catalogue/>`_.

There you can select, step by step, the data that suits your needs. This is the most straightforward way of checking for available data and thereby reduces the chance that ``flex_extract`` fails. The following figure gives an example of what the web interface looks like:


.. _ref-fig-mars-catalogue-ss:

.. figure:: _files/MARS_catalogue_snapshot.png


.. todo::

    Add another screenshot of the parameter selection area.

Additionally, you can find a number of helpful links to dataset documentation, direct links to specific dataset web catalogues, and further general information in the `link collection <Ecmwf/ec-links.html>`_ of the ECMWF data section.


``Flex_extract`` is specialised in retrieving a limited number of datasets, namely *ERA-Interim*, *CERA-20C*, *ERA5* and *HRES (operational data)* as well as the *ENS (operational data, 15-day forecast)*. The limitation relates mainly to the dataset itself, the stream (what kind of forecast or what subset of the dataset) and the experiment number. Mostly, the experiment number is equal to ``1``, to signal that the current version should be used.

The next level of differentiation is the field type, level type and time period. ``Flex_extract`` currently only supports the main streams for the re-analysis datasets and provides extraction of different streams for the operational dataset. The possible combinations of dataset and stream selection are represented by the current list of example ``CONTROL`` files. You can see this in the naming of the example files:


.. code-block:: bash
    :caption: "Current example ``CONTROL`` files distributed with ``flex_extract``. "

    CONTROL_CERA
    CONTROL_CERA.global
    CONTROL_CERA.public
    CONTROL_EA5
    CONTROL_EA5.global
    CONTROL_EI
    CONTROL_EI.global
    CONTROL_EI.public
    CONTROL_OD.ELDA.FC.eta.ens.double
    CONTROL_OD.ENFO.CF
    CONTROL_OD.ENFO.CV
    CONTROL_OD.ENFO.PF
    CONTROL_OD.ENFO.PF.36hours
    CONTROL_OD.ENFO.PF.ens
    CONTROL_OD.OPER.4V.operational
    CONTROL_OD.OPER.FC.36hours
    CONTROL_OD.OPER.FC.eta.global
    CONTROL_OD.OPER.FC.eta.highres
    CONTROL_OD.OPER.FC.gauss.highres
    CONTROL_OD.OPER.FC.operational
    CONTROL_OD.OPER.FC.twiceaday.1hourly
    CONTROL_OD.OPER.FC.twiceaday.3hourly


The main differences and features of the datasets are listed in the table below:


.. _ref-tab-dataset-cmp:

.. figure:: _files/dataset_cmp_table.png

.. todo::

    Redo this table by hand.

A common problem for beginners retrieving ECMWF datasets is a mismatch in the definition of these parameters. For example, if you would like to retrieve operational data from before 25 June 2013 and set the maximum level to ``137``, you will get an error, because this number of levels was only introduced on that date. So be cautious with the combination of spatial and temporal resolution, and with field types that are not available for all periods.


.. note::

    Sometimes it might not be clear in which format specific parameters in the ``CONTROL`` file must be set. Please see the description of the parameters in section `CONTROL parameters <Documentation/Input/control_params.html>`_, or have a look at the ECMWF user documentation for `MARS keywords <https://confluence.ecmwf.int/display/UDOC/MARS+keywords>`_.


In the following, we briefly discuss the main retrieval options for the different datasets and categorise the ``CONTROL`` files.


Public datasets
---------------

The main difference in the definition of a ``CONTROL`` file for a public dataset is the setting of the parameter ``DATASET``. This specification enables the selection of a public dataset in MARS; without it, the request would not find the dataset.
For the two public datasets *CERA-20C* and *ERA-Interim*, example files with the ending ``.public`` are provided and can be used straightaway.

.. code-block:: bash

    CONTROL_CERA.public
    CONTROL_EI.public

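The distinguishing line in these files is the ``DATASET`` parameter mentioned above. A minimal sketch (the dataset key ``cera20c`` shown here is the one used by the ECMWF Web API for public *CERA-20C* access; check the shipped ``.public`` files for the exact values):

.. code-block:: bash

    DATASET cera20c
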
For *CERA-20C* there seem to be no differences from the full dataset, while the *public ERA-Interim* only has analysis fields every 6 hours, without forecast fields filling the gaps on model levels. It is therefore only possible to retrieve 6-hourly data for *public ERA-Interim*.

.. note::

    In general, *ERA5* is a public dataset. However, since the model levels are not yet publicly available, it is not possible to retrieve *ERA5* data to drive the ``FLEXPART`` model. As soon as this becomes possible, it will be announced on the community website and in the newsletter.

CERA
----

For this dataset it is important to keep in mind that it is available for the period 09/1901 until 12/2010 and that the temporal resolution is limited to 3-hourly fields.
It is a pure ensemble data assimilation dataset and is stored under the ``enda`` stream. It has ``10`` ensemble members. The example ``CONTROL`` files select only the first member (``number=0``). You may change this to another number or a list of numbers (e.g. ``NUMBER 0/to/10``).
Another important difference to all other datasets is the forecast starting time of 18 UTC, which means that the forecasts for flux fields in *CERA-20C* are 12 hours long. Since such a forecast extends beyond a single day, one day before and one day after the requested period have to be extracted as well. This is done automatically in ``flex_extract``.

ERA 5
-----

This is the newest re-analysis dataset, with a temporal resolution of 1-hourly analysis fields. To date, it is available until April 2019, with new months released regularly.
The original horizontal resolution is ``0.28125°``, which requires some caution in the definition of the domain, since the length of the domain in the longitude and latitude directions must be an exact multiple of the resolution. It might be easier to use ``0.25°`` as the resolution, which MARS will interpolate to automatically.
The forecast starting times are ``06/18 UTC``, which is important for the flux data. This should be set in the ``CONTROL`` file via the ``ACCTIME 06/18`` parameter, together with ``ACCMAXSTEP 12`` and ``ACCTYPE FC``.

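The flux-related settings just described can be collected in a small ``CONTROL`` file fragment (values taken directly from the text above; this is a sketch, not a complete ``CONTROL`` file):

.. code-block:: bash

    ACCTYPE FC
    ACCTIME 06/18
    ACCMAXSTEP 12
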
.. note::

    We know that *ERA5* also has an ensemble data assimilation system, but this is not yet retrievable with ``flex_extract``, since the de-accumulation of the flux fields works differently in this stream. Ensemble retrieval for *ERA5* remains a future to-do.



ERA-Interim
-----------

Production of this re-analysis dataset ends on 31st August 2019!
It is then available from 1st January 1979 to 31st August 2019. The ``etadot`` field is not available in this dataset. Therefore, ``flex_extract`` must select the ``GAUSS`` parameter, which additionally retrieves the divergence field; the vertical velocity is then calculated with the continuity equation in the Fortran program ``CONVERT2``. Since the analysis fields are only available every 6th hour, the dataset can be made 3-hourly by adding forecast fields in between. No ensemble members are available.

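For *ERA-Interim*, the vertical-velocity choice described above corresponds to the following ``CONTROL`` settings (a sketch in the parameter notation used in the hints section below; see the shipped ``CONTROL_EI*`` files for the full setup):

.. code-block:: bash

    GAUSS 1
    ETA 0
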

.. todo::

    @LEO: please check the complete description and functionality of the CONTROL FILEs

Operational data
----------------

This is the real-time atmospheric model in high resolution with a 10-day forecast, which means it has undergone regular adaptations and improvements over the years. Hence, retrieving data from this dataset needs extra attention in selecting correct parameter settings. See :ref:`ref-tab-dataset-cmp` for the most important parameters.
Nowadays, the data are available 1-hourly, by filling the gaps between the 6-hourly analysis fields with 1-hourly forecast fields. Since 4th June 2008 the eta coordinate is directly available, so ``ETA`` should be set to ``1`` to save computation time. The horizontal resolution can be up to ``0.1°``, which, in combination with ``137`` vertical levels, can lead to problems in retrieving this high-resolution dataset in terms of job duration and quota exceedance.
It is recommended to submit such high-resolution cases as single-day retrievals (see the ``JOB_CHUNK`` parameter in the ``run.sh`` script) to avoid job failures due to exceeded limits.

``CONTROL`` files for normal daily retrievals with a mix of analysis and forecast fields are listed below:

.. code-block:: bash

    CONTROL_OD.OPER.4V.eta.global
    CONTROL_OD.OPER.FC.eta.global
    CONTROL_OD.OPER.FC.eta.highres
    CONTROL_OD.OPER.FC.gauss.highres

These files define the minimum number of parameters necessary to retrieve a daily subset. The setup of field types is optimal and should only be changed if you understand the consequences. The grid, domain and temporal resolution can be changed according to availability.

.. todo::

    @LEO - explain the setup with 4V fields! Who can extract it, when would this be useful?


.. note::

    Please see `Information about MARS retrievals <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Retrievalefficiency>`_ for hints about retrieval efficiency and troubleshooting.


Pure forecast
    It is possible to retrieve pure forecasts extending over more than one day. The available forecast period depends on the date and the forecast field type. Please use the MARS catalogue to check the availability. Below are some examples of 36-hour forecasts of the *Forecast (FC)*, *Control forecast (CF)* and *Calibration/Validation forecast (CV)* types.
    The *CV* field type was only available 3-hourly from 2006 to 2016. It is recommended to use the *CF* type, since this is available from 1992 (3-hourly) up to today, now in 1-hourly temporal resolution. The *CV* and *CF* field types belong to the *Ensemble prediction system (ENFO)*, which contains 50 ensemble members.
    Please be aware that in this case it is necessary to set the specific type for flux fields explicitly; otherwise a default value might be selected which differs from what you expect!

    .. code-block:: bash

        CONTROL_OD.ENFO.CF.36hours
        CONTROL_OD.ENFO.CV.36hours
        CONTROL_OD.OPER.FC.36hours



Half-day retrievals
    If a forecast for just half a day is wanted, the analysis fields can also be substituted by forecast fields, as shown in the files with ``twiceaday`` in their names. They produce a full-day retrieval from pure 12-hour forecasts, twice a day. It is also possible to use the operational version, which takes the time information from ECMWF's environment variables and therefore retrieves the newest forecast of the day. This version uses a ``BASETIME`` parameter, which tells MARS to extract exactly the 12 hours up to the selected time. If the ``CONTROL`` file with ``basetime`` in the filename is used, this can be done for any other date, too.

    .. code-block:: bash

        CONTROL_OD.OPER.FC.eta.basetime
        CONTROL_OD.OPER.FC.operational
        CONTROL_OD.OPER.FC.twiceaday.1hourly
        CONTROL_OD.OPER.FC.twiceaday.3hourly



Ensemble members
    The retrieval of ensemble members was already mentioned in the pure-forecast section and for the *CERA-20C* data.
    This ``flex_extract`` version additionally offers the possibility to retrieve the *Ensemble Long window Data Assimilation (ELDA)* stream from the real-time dataset. This model version had (up to May 2019) 25 ensemble members and a control run (``number 0``); starting from June 2019 it has 50 ensemble members. Therefore, a possibility was created to double the original 25 members (before June 2019) to 50 by mirroring them: for each member, twice the difference between the member value and the control value is subtracted from the member value. This is done by selecting the parameter ``DOUBLEELDA`` and setting it to ``1``.


    .. code-block:: bash

        CONTROL_OD.ELDA.FC.eta.ens.double
        CONTROL_OD.ENFO.PF.ens

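    The member doubling described above is switched on with a single ``CONTROL`` parameter (a sketch; see ``CONTROL_OD.ELDA.FC.eta.ens.double`` for the full setup):

    .. code-block:: bash

        DOUBLEELDA 1
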
.. todo::

    @LEO: Please tell me why perturbed forecasts are possible. Is it because of some user rights? I have no opportunity to retrieve PF data.



Specific features
-----------------

rrint
    Decides whether the precipitation flux data uses the old (``0``) or new (``1``) disaggregation scheme. See :doc:`Documentation/disagg` for an explanation.
cwc
    Decides whether the total cloud-water content is additionally retrieved (set to ``1``). This is the sum of the cloud liquid and cloud ice water content.
addpar
    With this parameter, an additional list of 2-dimensional, non-flux parameters can be retrieved. Use the format ``param1/param2/.../paramx`` to list the parameters. Please be consistent in using either the parameter IDs or the short names.
doubleelda
    Use this to double the ensemble member number by adding a further disturbance to each member.
debug
    If set to ``1``, all temporary files are kept at the end. Otherwise, everything except the final output files will be deleted.
request
    This produces an extra *csv* file ``mars_requests.csv``, in which the content of each MARS request of the job is stored. Useful for debugging and documentation.
mailfail
    By default, the mail is sent to the email address connected with the user account. Additional email addresses can be added. As soon as you enter an address, however, the default is overwritten. If you would like to keep the mail from your user account, add ``${USER}`` to the (comma-separated) list of mail addresses.
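
As an illustration of the ``addpar`` format described above, a hedged example (the parameter IDs ``129`` and ``172``, i.e. geopotential and land-sea mask, are chosen purely for illustration and are not taken from the original text):

.. code-block:: bash

    ADDPAR 129/172
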


Hints for definition of some parameter combinations
---------------------------------------------------

Field types and times
    This combination is very important. It defines the temporal resolution and which field type is extracted per time step.
    The time declaration for analysis (AN) fields uses the times of the specific analyses, and the (forecast) steps have to be ``0``. The forecast field types (e.g. FC, CF, CV, PF) need a combination of (forecast starting) times and (forecast) steps; together, they define the actual time step. It is important to know the forecast starting times of the dataset to be retrieved, since they differ between datasets. In general, it is enough to give the information for the exact time steps, but it is also possible to list more combinations of ``TYPE``, ``TIME`` and ``STEP``, because the temporal (hourly) resolution given by the ``DTIME`` parameter selects the correct combinations.

    .. code-block:: bash
       :caption: Example of a setting for the field types and temporal resolution.

        DTIME 3
        TYPE AN FC FC FC AN FC FC FC
        TIME 00 00 00 00 12 12 12 12
        STEP 00 03 06 09 00 03 06 09

Vertical velocity
    The vertical velocity for ``FLEXPART`` is not directly available from MARS and therefore has to be calculated. There are a couple of different options, and the following parameters are responsible for the selection; see :doc:`Documentation/vertco` for a detailed explanation. The ``ETADIFF``, ``OMEGA`` and ``OMEGADIFF`` versions are only recommended for debugging and testing. Usually it is a decision between ``GAUSS`` and ``ETA``: for ``GAUSS``, spectral fields of the horizontal wind fields and the divergence are retrieved and used with the continuity equation to calculate the vertical velocity; for ``ETA``, the latitude/longitude fields of the horizontal wind fields and the eta-coordinate vertical velocity are retrieved. It is recommended to use ``ETA`` where possible, due to the reduced computation time.

    .. code-block:: bash
        :caption: Example setting for the vertical coordinate retrieval.

        GAUSS 0
        ETA 1
        ETADIFF 0
        DPDETA 1
        OMEGA 0
        OMEGADIFF 0


Grid resolution and domain
    The grid and domain selections depend on each other. The grid can be defined in the format of normal degrees (e.g. ``1.``) or, as in older versions, in 1/1000 degrees (e.g. ``1000`` for ``1°``).
    After selecting the grid, the domain has to be defined such that the length of the domain in the longitude and latitude directions is an exact multiple of the grid spacing.
    The horizontal resolution for spectral fields is set by the parameter ``RESOL``. For information about how to select an appropriate value, see the explanation of the MARS keyword `here <https://confluence.ecmwf.int/display/UDOC/Post-processing+keywords#Post-processingkeywords-resol>`_ and `this table <https://confluence.ecmwf.int/display/UDOC/Retrieve#Retrieve-Truncationbeforeinterpolation>`_.

    .. code-block:: bash
        :caption: Example setting for a northern-hemisphere domain with a grid of ``0.25°``.

        GRID 0.25
        RESOL 799
        SMOOTH 0
        UPPER 90.
        LOWER 0.
        LEFT -179.75
        RIGHT 180.

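    The "exact multiple" constraint mentioned above can be checked quickly on the command line; a small sketch using ``awk`` (the domain values are those from the example above):

    .. code-block:: bash

        awk 'BEGIN {
          grid = 0.25; upper = 90.0; lower = 0.0; left = -179.75; right = 180.0
          # the domain extents must be exact multiples of the grid spacing
          lat = upper - lower; lon = right - left
          ok = (lat % grid == 0) && (lon % grid == 0)
          print (ok ? "domain is consistent with grid" : "adjust domain or grid")
        }'
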
Flux data
    The flux fields are always accumulated forecast fields. Since some re-analysis datasets nowadays have a complete set of analysis fields in their temporal resolution, it was necessary to define a new parameter set for the flux fields, because the information can no longer be taken from ``TYPE``, ``TIME`` and ``STEP``. Select a forecast field type with ``ACCTYPE``, the forecast starting time with ``ACCTIME`` and the maximum forecast step with ``ACCMAXSTEP``. The ``DTIME`` parameter defines the temporal resolution for the whole period.

    .. code-block:: bash
       :caption: Example setting for the definition of flux fields.

        DTIME 3
        ACCTYPE FC
        ACCTIME 00/12
        ACCMAXSTEP 36



.. toctree::
    :hidden:
    :maxdepth: 2

..    user_guide/oper_modes
..    user_guide/ecmwf
..    user_guide/how_to
..    user_guide/control_templates
