WP2: Meteo GRID


Based on a draft of WP5, a GUI was developed and integrated into the EUROGRID environment. This GUI allows easy control of the LM run, e.g. regarding the model domain (anywhere on the globe), the mesh size (usually between 25 and 2 km), the forecast length (up to 48 hours), the initial date and time, and the output of result files.
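
To make the scope of these controls concrete, the following Python sketch shows the kind of run specification such a GUI assembles. All field names and the validation thresholds are illustrative assumptions, not the actual EUROGRID job description.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class LMRunSpec:
    """Illustrative container for the parameters the LM-GUI exposes.

    Field names are hypothetical; the real EUROGRID job description
    differs, but these are the quantities the GUI lets the user set.
    """
    lat_centre: float           # centre of the model domain (degrees N)
    lon_centre: float           # centre of the model domain (degrees E)
    n_lat: int                  # grid points in latitude
    n_lon: int                  # grid points in longitude
    mesh_size_km: float         # horizontal mesh size, usually 25 .. 2 km
    forecast_hours: int         # forecast length, up to 48 hours
    initial_time: datetime      # initial date and time of the forecast
    output_interval_h: int = 1  # interval between result files

    def validate(self) -> None:
        if not 2.0 <= self.mesh_size_km <= 25.0:
            raise ValueError("mesh size is usually between 25 and 2 km")
        if not 0 < self.forecast_hours <= 48:
            raise ValueError("forecast length is limited to 48 hours")

# Example: a 48-h forecast over central Europe at 7 km mesh size
spec = LMRunSpec(lat_centre=50.0, lon_centre=10.0, n_lat=325, n_lon=325,
                 mesh_size_km=7.0, forecast_hours=48,
                 initial_time=datetime(2002, 5, 10, 0))
spec.validate()
```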

 

The production of an LM forecast for a specified region consists of the following steps:

  1. A topographical data set for the specified region (the LM domain) has to be derived from high-resolution (1 km x 1 km) global data sets available at the DWD. The parameters which have to be computed as averages over model grid boxes (e.g. 7 km x 7 km) include terrain height, land fraction, soil type, roughness length, plant cover, leaf area index and root depth. This data set is computed only once for a given region and stored for later use (a block-averaging sketch follows this list).
  2. Those data from the global model GME which cover the LM domain are extracted from the Oracle database of the DWD for the specified initial date and time. These data serve as the initial data of the LM. Forecast fields of the GME at hourly steps are retrieved as well; they are used as lateral boundary conditions of the LM. The resolution of the GME is currently 60 km with 31 layers; in 2002, the mesh size will be reduced to 30 km with 35 layers.
  3. The GME data have to be transferred from the DWD to the HPC centre running the LM; the amount of data will be up to 100 MByte (up to 400 MByte after 2002), and the transfer should be completed within 15 to 30 minutes (a quick check of the implied transfer rate follows this list).
  4. The interface program GME2LM, which interpolates the GME data to the LM grid, has to run on the HPC system in question. It creates the initial and lateral boundary data on the LM grid. These data are stored on disk at the HPC centre; the required space is on the order of 12 to 20 GByte. Later versions of Meteo-Grid will allow for site-specific observational input data as well (an interpolation sketch follows this list).
  5. As soon as the initial data plus two lateral boundary data sets are ready, the LM runs on the HPC system and creates result files at (usually hourly) intervals, i.e. every 60 to 75 seconds of wallclock time. The typical forecast range of detailed regional numerical weather prediction runs is 48 hours.
  6. These result files consist of GRIB code data (GRIdded Binary data, a standard 8-bit byte code developed by the WMO, the World Meteorological Organisation, Geneva). The files have to be transmitted back to the user's system via high-speed links because the usual amount is between 2 and 4 GByte (for a 48-h forecast). The data are split into 49 files (0h, 1h, 2h, ..., 47h, 48h), each containing one forecast step; each file has a size between 40 and 80 MByte. The GRIB code data can easily be visualised with standard public-domain graphics packages like GrADS or Vis5D on workstations or PCs (a decoding sketch follows this list).
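
Step 1 amounts to coarse-graining the 1 km x 1 km source data onto the model grid. A minimal NumPy sketch of such block averaging is given below; the tile size and data are placeholders.

```python
import numpy as np

def block_average(field: np.ndarray, factor: int) -> np.ndarray:
    """Average a high-resolution field over square blocks of
    `factor` x `factor` cells, e.g. 1 km data onto a 7 km grid
    (factor=7). Edges that do not fill a whole block are trimmed."""
    ny, nx = field.shape
    ny, nx = ny - ny % factor, nx - nx % factor
    trimmed = field[:ny, :nx]
    return trimmed.reshape(ny // factor, factor,
                           nx // factor, factor).mean(axis=(1, 3))

# Hypothetical 1 km terrain-height tile reduced to 7 km grid boxes
terrain_1km = np.random.rand(700, 700) * 3000.0   # placeholder data
terrain_7km = block_average(terrain_1km, 7)       # shape (100, 100)
```

Continuous fields such as terrain height or plant cover can be averaged this way; a categorical field such as soil type would typically take the dominant class per grid box instead of the arithmetic mean.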
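
For step 3, the stated volumes and deadlines translate into a modest sustained transfer rate, as this small check shows:

```python
# Required sustained rate to move the GME data within the stated window
size_mbyte = 100                      # up to 100 MByte (400 MByte after 2002)
for minutes in (15, 30):
    mbit_per_s = size_mbyte * 8 / (minutes * 60)
    print(f"{size_mbyte} MByte in {minutes} min -> {mbit_per_s:.2f} Mbit/s")
# 100 MByte in 15 min -> 0.89 Mbit/s; in 30 min -> 0.44 Mbit/s
```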
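
For step 4, the core of GME2LM is an interpolation from the coarse GME grid to the fine LM grid. The sketch below illustrates the horizontal part only and assumes a regular latitude-longitude source grid for simplicity (the GME actually uses an icosahedral-hexagonal grid); all grid values are placeholders.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def interpolate_to_lm(gme_lats, gme_lons, gme_field, lm_lats, lm_lons):
    """Interpolate one coarse-grid field to the fine LM grid.

    A regular lat-lon source grid is assumed purely for illustration."""
    interp = RegularGridInterpolator((gme_lats, gme_lons), gme_field,
                                     method="linear")
    lat2d, lon2d = np.meshgrid(lm_lats, lm_lons, indexing="ij")
    points = np.column_stack([lat2d.ravel(), lon2d.ravel()])
    return interp(points).reshape(lat2d.shape)

# ~60 km source grid over Europe, ~7 km target grid (placeholder values)
gme_lats = np.arange(35.0, 65.5, 0.54)
gme_lons = np.arange(-10.0, 30.5, 0.54)
gme_t2m = 280.0 + np.random.rand(len(gme_lats), len(gme_lons))
lm_lats = np.arange(45.0, 55.0, 0.0625)
lm_lons = np.arange(0.0, 20.0, 0.0625)
t2m_lm = interpolate_to_lm(gme_lats, gme_lons, gme_t2m, lm_lats, lm_lons)
```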
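
For step 6, beyond visualisation with GrADS or Vis5D, the result files can also be decoded programmatically. The sketch below uses the third-party pygrib package (an assumption; it is not part of the LM suite) and a hypothetical file name.

```python
import pygrib  # third-party GRIB decoder, not part of the LM suite

# Hypothetical result file for forecast step 12h
grbs = pygrib.open("lm_forecast_012h.grb")
for grb in grbs:
    # each GRIB message is one 2-D field at one level and time
    print(grb.name, grb.level, grb.validDate)
grbs.seek(0)
t2m = grbs.select(name="2 metre temperature")[0]
values = t2m.values          # 2-D numpy array on the LM grid
lats, lons = t2m.latlons()   # coordinates of every grid point
grbs.close()
```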

Steps 2 to 6 have to be performed in parallel for real-time applications; thus distributed computing via the Internet, controlled by the EUROGRID software, is realised. The whole sequence of jobs is initiated via the Web-based GUI, and the user is informed about the progress of each component and receives error messages in case of hardware or software failures.
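
The following sketch mimics that pipelining with threads and queues: hourly GME fields are transferred, interpolated by GME2LM, and consumed by the running LM as lateral boundary data. In reality the steps are separate jobs scheduled by the EUROGRID middleware, so function bodies and file names are placeholders.

```python
import queue
import threading

transfer_q = queue.Queue()   # GME files arriving from the DWD
boundary_q = queue.Queue()   # boundary data interpolated to the LM grid
FORECAST_HOURS = 48

def transfer_gme():
    for step in range(FORECAST_HOURS + 1):
        gme_file = f"gme_{step:03d}h.grb"   # hypothetical file name
        transfer_q.put(gme_file)            # stands in for the network copy
    transfer_q.put(None)                    # end-of-stream marker

def run_gme2lm():
    while (gme_file := transfer_q.get()) is not None:
        boundary_q.put(gme_file.replace("gme", "lmbc"))  # interpolation stub
    boundary_q.put(None)

def run_lm():
    # The LM starts once the initial data plus two boundary sets exist,
    # then advances as further boundary data become available.
    while (bc_file := boundary_q.get()) is not None:
        pass  # integrate the model up to the time of bc_file

workers = [threading.Thread(target=f)
           for f in (transfer_gme, run_gme2lm, run_lm)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```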

The implementation of the relocatable weather prediction code was accompanied by a careful evaluation of the meteorological validity of the LM in any region of the globe for a wide range of mesh sizes between 25 and 2 km. The standard verification procedure consists of comparing the LM forecast to all observations available in the region. For this purpose, DWD developed a workstation-based interactive Meteorological Application and Presentation system (MAP) which allows the user to overlay, for any region of the globe, the available observations (including satellite images) and the LM forecast fields. MAP requires a medium-sized workstation for storing the huge amount of forecast and observational data as well as for the rapid display of these data.

The relocatable weather prediction model LM and its preprocessor GME2LM were installed at the HPC centres of CNRS-IDRIS, the University of Manchester and Research Centre Juelich as well as on the HPC system of the DWD. The LM is coded in standard Fortran90 and uses MPI for message passing, so in principle the code is highly portable. However, there are usually a number of installation-dependent parameters, such as compiler options or library versions, which have to be adapted for optimal performance under real-time conditions on the HPC system in question. For real-time applications of the LM, mechanisms are provided by the HPC centres and EUROGRID to guarantee the execution of the LM suite at high operational priority. This was demonstrated in a multi-user environment.
The functionality of the LM-GUI for defining and launching LM forecasts was evaluated by test users with no HPC experience. Their experience and suggestions are the basis for further development of the LM-GUI. A typical 48-hour prediction run of the LM for any region of the globe takes no longer than two hours including the preparation of the topographical data file. During the whole model suite, the user is informed about the current status of the LM run and the expected time for finishing the task.
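
A simple way to derive such an expected finishing time is linear extrapolation over the completed forecast steps; the helper below is a hypothetical illustration, not the actual monitoring code.

```python
from datetime import datetime

def estimate_finish(started: datetime, steps_done: int,
                    steps_total: int) -> datetime:
    """Linearly extrapolate the remaining run time, the kind of
    estimate a monitoring GUI can display (illustrative only)."""
    if steps_done == 0:
        raise ValueError("no completed steps to extrapolate from")
    now = datetime.now()
    per_step = (now - started) / steps_done
    return now + per_step * (steps_total - steps_done)
```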


d.mallmann@fz-juelich.de
10-May-2004
URL: <http://www.eurogrid.org/wp2.html>