PDAF - The Parallel Data Assimilation Framework: Experiences with Kalman Filtering
The application of advanced data assimilation algorithms based on the Kalman filter with large-scale numerical models is computationally extremely demanding. In addition, the implementation of a data assimilation system on the basis of existing numerical models is complicated by the fact that these models are typically not prepared to be used with data assimilation algorithms. To facilitate the implementation of data assimilation systems and to reduce the computing time for data assimilation, the parallel data assimilation framework PDAF has been developed. PDAF makes it possible to combine an existing numerical model with data assimilation algorithms, such as statistical filters, with minimal changes to the model code. Furthermore, PDAF enables the efficient use of parallel computers by creating a parallel data assimilation system. This talk presents the structure and capabilities of PDAF. In addition, the application of filter algorithms based on the Kalman filter is discussed and their parallel performance within PDAF is shown.
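As background for the filter algorithms discussed in the talk, the sketch below shows a minimal analysis step of a stochastic ensemble Kalman filter (one of the Kalman-filter-based methods the abstract refers to) in Python/NumPy. It is purely illustrative; the function name, argument names, and the use of perturbed observations are assumptions for this sketch and do not correspond to PDAF's actual interface.

```python
# Illustrative sketch of a stochastic ensemble Kalman filter (EnKF) analysis step.
# All names are hypothetical and not part of PDAF's interface.
import numpy as np

def enkf_analysis(ens, obs, obs_err_std, H):
    """Update a state ensemble with perturbed observations.

    ens         : (n_state, n_ens) array of state ensemble members
    obs         : (n_obs,) observation vector
    obs_err_std : observation error standard deviation (assumed uncorrelated)
    H           : (n_obs, n_state) linear observation operator
    """
    n_state, n_ens = ens.shape
    n_obs = obs.size

    # Ensemble mean and state anomalies
    mean = ens.mean(axis=1, keepdims=True)
    A = ens - mean

    # Ensemble mapped into observation space, and its anomalies
    HE = H @ ens
    HA = HE - HE.mean(axis=1, keepdims=True)

    # Sample covariances (factor 1/(n_ens - 1))
    P_HT = A @ HA.T / (n_ens - 1)        # cross covariance  P H^T
    HPH  = HA @ HA.T / (n_ens - 1)       # projected covariance  H P H^T
    R = (obs_err_std ** 2) * np.eye(n_obs)

    # Kalman gain K = P H^T (H P H^T + R)^{-1}
    K = P_HT @ np.linalg.solve(HPH + R, np.eye(n_obs))

    # Perturbed-observation update of each ensemble member
    rng = np.random.default_rng(0)
    obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, size=(n_obs, n_ens))
    return ens + K @ (obs_pert - HE)
```

In a PDAF-based system such an analysis step is carried out by the framework's filter routines rather than by user code, which is why the existing model needs only minimal changes; the sketch above merely indicates the kind of computation that makes these methods demanding for large-scale models.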