"Victor Alessandrini"
HPC Centres

- Contribute to the acceleration of scientific discovery through the use of information technologies.
- Provide high performance supercomputing environments for dealing with science's most challenging problems.
- Act as a technology transfer agent between R&D in information technologies and computational science.
Targets for HPC in the next decade

- Capture more physics in the simulation of complex systems.
- Complex systems are characterized by multiple time and/or length scales.
- It is not easy to capture multiple scales in one code.
- Code coupling for multi-physics applications is a viable alternative under some conditions (see the sketch after this list).
- This leads naturally to computational grids.
- Heterogeneous algorithms map naturally onto heterogeneous grids.
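Code coupling of this kind is often realized over MPI, which the project already lists among its middleware. Below is a minimal sketch, assuming a single MPI job whose ranks are split into two solver groups that swap interface data every time step; the fluid/structure split, the interface size, and the exchange pattern are illustrative assumptions, not a specific EUROGRID coupling scheme.

```c
/* Minimal sketch: two coupled solvers inside one MPI job.
 * The lower half of the ranks plays the "fluid" code, the upper half
 * the "structure" code; the group roots swap interface data each step.
 * Sizes and the exchange pattern are illustrative. */
#include <mpi.h>

#define N_IFACE 1024            /* points on the shared interface */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Split the job into two solver groups. */
    int color = (rank < size / 2) ? 0 : 1;
    MPI_Comm solver_comm;
    MPI_Comm_split(MPI_COMM_WORLD, color, rank, &solver_comm);

    int local_rank;
    MPI_Comm_rank(solver_comm, &local_rank);

    double iface[N_IFACE] = { 0.0 };
    int peer_root = (color == 0) ? size / 2 : 0;  /* root of the other group */

    for (int step = 0; step < 10; ++step) {
        /* ... advance this solver by one time step ... */

        /* The two group roots exchange interface values ... */
        if (local_rank == 0)
            MPI_Sendrecv_replace(iface, N_IFACE, MPI_DOUBLE,
                                 peer_root, 0, peer_root, 0,
                                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* ... and redistribute them within each solver group. */
        MPI_Bcast(iface, N_IFACE, MPI_DOUBLE, 0, solver_comm);
    }

    MPI_Comm_free(&solver_comm);
    MPI_Finalize();
    return 0;
}
```

The same pattern extends to a grid setting: each solver group can run at a different centre when the MPI implementation spans sites (e.g. via the PACX integration mentioned later).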
Project motivations and strategies

- Focus on heterogeneous, very high performance supercomputing environments.
- Use grid technologies to provide a unified image of, and transparent access to, such environments.
- Deploy an application testbed across Europe by integrating the partners' HPC environments.
- Devote a major effort to developing and deploying distributed scientific applications (EUROGRID is roughly 2/3 applications development, 1/3 technology development).
EUROGRID middleware

- CUSTOM : UNICORE
- COMMODITY :
  - MPI (scientific standard, soon interoperable)
  - CORBA - JAVA (internet standard, with significant software engineering advantages, but not yet fully adapted to performance-focused complex applications)
  - ... (others)
UNICORE partners

- Forschungszentrum Jülich (FZJ, project leader)
- Rechenzentrum Universität Stuttgart (RUS)
- Deutscher Wetterdienst, Offenbach (DWD)
- Konrad-Zuse-Zentrum Berlin (ZIB)
- Leibniz-Rechenzentrum München (LRZ)
- Rechenzentrum Universität Karlsruhe (RUKA)
- Paderborn Center for Parallel Computing (PC2)
- Technische Universität Dresden (TUD)
- Pallas GmbH, Brühl (Pallas)
- Fecit, subcontractor to Pallas
UNICORE goals

- UNICORE develops a seamless, secure, and intuitive software infrastructure for access to HPC resources.
- It provides consistent batch access to heterogeneous remote systems ...
- ... with minimal intrusion into the centres.
- It supports multi-site and multi-system applications within one job.
- It exploits existing and emerging technologies.
UNICORE Plus Project (2000 - 2002)

- Resource Modelling (ZIB, Berlin)
- Data Management Enhancements (RUS, Stuttgart)
- Extended Job Control (DWD, Offenbach)
- Application Specific Interfaces (FZJ, Jülich)
- Co-scheduling (PC2, Paderborn)
- MPI and PACX integration (RUS, Stuttgart)
- Vampir extensions (TUD, Dresden)
- Advanced administration (FZJ, Jülich)
WP1 : Bio-Grid (ICM leading partner)

- A computation portal to bio-molecular applications.
- Build interfaces to well-known bio-molecular applications and simplify access to databases.
- Integrate these interfaces within the EUROGRID software.
WP2 : Meteo Grid (DWD leading partner)
WP3 : CAE Grid (EADS leading partner)

- Focuses on industrial CAE applications.
- Code coupling and multi-physics optimisations to improve system design.
- ASP-type services :
  - User interfaces that hide the complexity of HPC systems from industrial users.
  - Supercomputing as an e-business : accurate cost prediction of CAE simulations.
WP4 : HPC-GRID (IDRIS leading partner)

- Targets :
  - The establishment, by the HPC centre partners of EUROGRID, of an application testbed for general-purpose distributed HPC applications.
  - The installation and testing of EUROGRID software releases.
  - The development of new relevant grid applications, using existing middleware, in scientific areas not covered by WP1 to WP3.
HPC - GRID
WP4 : some prospects on applications

- Interactive monitoring and steering of complex simulations (running in batch mode), using JAVA - CORBA technology (see the sketch after this list).
- Coupling of atmospheric and hydrological models.
- Fluid-structure coupling for space propulsion.
- Direct numerical simulation of turbulent combustion.
- ... many others to come in the areas of electrodynamics, materials science, quantum chemistry, ...
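A minimal sketch of the steering pattern in the first bullet above: the batch simulation polls an external control channel between time steps. A plain text file stands in here for the JAVA - CORBA channel (whose plumbing is omitted); the file name steer.ctl and the parameter relax are hypothetical.

```c
/* Minimal sketch of simulation steering: a batch code polls an
 * external control channel between time steps. A text file stands
 * in for the CORBA channel; names are hypothetical. */
#include <stdio.h>

int main(void)
{
    double relax = 1.0;                    /* steerable parameter */

    for (long step = 0; step < 100000; ++step) {
        /* ... advance the simulation one step using 'relax' ... */

        if (step % 100 == 0) {             /* poll the control channel */
            FILE *f = fopen("steer.ctl", "r");
            if (f) {
                double v;
                if (fscanf(f, "%lf", &v) == 1)
                    relax = v;             /* apply the steering command */
                fclose(f);
            }
            /* A monitoring client would read state snapshots written
             * at the same point, e.g. for visualisation. */
        }
    }
    return 0;
}
```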
WP5 : Technology developments

- Efficient file transfer (FECIT leading partner) :
  - Optimisation of transfer bandwidth and cost.
  - Fail-safe and encrypted transfer.
  - Overlap of transfer and processing (see the first sketch after this list).
  - Emphasis on quality of service.
- Resource broker (UoM leading partner) :
  - Must handle static and dynamic information to match the user's computational requirements (see the second sketch after this list).
  - Builds on past CSAR experience.
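The overlap of transfer and processing mentioned under efficient file transfer is classically achieved by double buffering: fetch chunk n+1 while chunk n is being consumed. A minimal sketch with POSIX threads follows; fetch_chunk and process_chunk are hypothetical stand-ins for the network transfer and the consuming computation.

```c
/* Minimal sketch of overlapping transfer with processing via
 * double buffering: one thread fetches the next chunk while the
 * main thread processes the current one. */
#include <pthread.h>
#include <string.h>

#define CHUNK (1 << 20)                 /* 1 MiB per chunk (illustrative) */

static char buf[2][CHUNK];

/* Stand-in for the network transfer; 0 bytes returned = done. */
static size_t fetch_chunk(char *dst, int seq)
{
    if (seq >= 8) return 0;             /* pretend the file has 8 chunks */
    memset(dst, seq, CHUNK);
    return CHUNK;
}

/* Stand-in for the computation consuming the data. */
static void process_chunk(const char *src, size_t n)
{
    (void)src; (void)n;
}

struct fetch_arg { int seq; size_t got; };

static void *fetch_thread(void *p)
{
    struct fetch_arg *a = p;
    a->got = fetch_chunk(buf[a->seq % 2], a->seq);
    return NULL;
}

int main(void)
{
    size_t got = fetch_chunk(buf[0], 0);        /* prime the first buffer */
    for (int seq = 0; got > 0; ++seq) {
        struct fetch_arg next = { seq + 1, 0 };
        pthread_t t;
        pthread_create(&t, NULL, fetch_thread, &next);  /* fetch chunk seq+1 ... */

        process_chunk(buf[seq % 2], got);       /* ... while processing chunk seq */

        pthread_join(t, NULL);
        got = next.got;
    }
    return 0;
}
```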
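For the resource broker, one way to combine static and dynamic information is to filter candidate sites on static properties (architecture, installed memory and CPUs) and rank the survivors on dynamic state (free capacity, queue wait). The sketch below follows that pattern; all field names and the scoring rule are illustrative assumptions, not the broker's actual design.

```c
/* Minimal sketch of resource matching: static filter, dynamic rank. */
#include <stdio.h>
#include <string.h>

struct site {
    const char *name, *arch;        /* static information */
    int mem_gb, cpus;
    int free_cpus;                  /* dynamic information */
    double queue_wait_h;
};

struct request { const char *arch; int mem_gb, cpus; };

/* Returns a score (lower is better), or -1 if the site cannot run the job. */
static double match(const struct site *s, const struct request *r)
{
    if (strcmp(s->arch, r->arch) != 0) return -1.0;     /* static filter */
    if (s->mem_gb < r->mem_gb || s->cpus < r->cpus) return -1.0;
    /* Dynamic rank: penalize sites where the job would have to queue. */
    return (s->free_cpus < r->cpus) ? s->queue_wait_h : 0.0;
}

int main(void)
{
    struct site sites[] = {
        { "siteA", "vector", 64, 128, 16, 2.0 },
        { "siteB", "vector", 128, 256, 200, 0.5 },
    };
    struct request req = { "vector", 32, 64 };

    const struct site *best = NULL;
    double best_score = 0.0;
    for (int i = 0; i < 2; ++i) {
        double s = match(&sites[i], &req);
        if (s >= 0.0 && (best == NULL || s < best_score)) {
            best = &sites[i];
            best_score = s;
        }
    }
    if (best)
        printf("submit to %s (score %.1f)\n", best->name, best_score);
    return 0;
}
```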
WP5 : Technology developments (continued)

- ASP infrastructure (DEBIS leading partner) :
  - Supercomputing as an e-business?
  - Build a browser-based job submission GUI.
  - Build tools for cost estimation, accounting, and Web-based billing of services (see the sketch after this list).
- Interactive access (Parallab leading partner) :
  - Deals with all kinds of issues related to simulation steering by visualized output.
  - Integrate these facilities into the EUROGRID software.
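One way to picture the cost-estimation tool mentioned above is a simple linear model over CPU hours and data volume. The rates and the model below are illustrative assumptions, not EUROGRID's actual accounting scheme.

```c
/* Minimal sketch of an ASP-style cost estimate:
 * charge = CPU hours * hourly rate + data moved * per-GB rate.
 * Rates and the linear model are illustrative assumptions. */
#include <stdio.h>

struct job_estimate { double cpu_hours, data_gb; };

static double estimate_cost(const struct job_estimate *j,
                            double rate_per_cpu_hour, double rate_per_gb)
{
    return j->cpu_hours * rate_per_cpu_hour + j->data_gb * rate_per_gb;
}

int main(void)
{
    /* A hypothetical CAE run: 512 CPUs for 6 hours, 40 GB shipped back. */
    struct job_estimate j = { 512 * 6.0, 40.0 };
    printf("estimated cost: %.2f units\n", estimate_cost(&j, 0.10, 0.05));
    return 0;
}
```

An estimate of this kind, quoted before submission, is what makes "supercomputing as an e-business" concrete for industrial users.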
WP6 : (Pallas leading partner)

- Maintain working versions of the EUROGRID software.
- Integrate domain-specific extensions (WP1 to WP4) and new technology (WP5).
- EUROGRID 0.0 : UNICORE today
- EUROGRID 0.5 : data transfer, application couplings, interactive access
- EUROGRID 1.0 : resource broker, ASP infrastructure
- EUROGRID 2 : final version
Conclusions

- Integration of modern grid software technologies into European supercomputing infrastructures.
- A major effort in distributed application development in science and technology.