Task 3 addresses the representation of extremes in climate models, the identification and representation of uncertainties in climate projections, and the cascade of uncertainty introduced by downscaling methods. The objective is to examine these issues through the model outputs used to treat the project's four case studies. The analyses will focus in particular on the methodologies used to analyse model results and on the reliability of model outputs for answering questions about climate change impacts. The added value of the new simulations performed for the forthcoming IPCC report (CMIP5) will also be tested. The aim is then to provide a user guide for model outputs in the face of the growing number of available data sets.



Task 3.1: data analysis - benefit of new data sets for the analysis of extremes

The different case studies developed in task 2 are based on the occurrence of specific events or of extremes. This subtask will investigate how the different events are reproduced in climate models and how their representation evolves when considering the large-scale CMIP3 multi-model ensemble simulations performed for the AR4, the new CMIP5 set that will be released by the end of 2010, or the regional simulations that will be distributed through the GICC-DRIAS project. The focus will be on the climate variables (such as temperature, precipitation and wind), specific events and indices considered in the different case studies, as well as on a subset of the STARDEX indices (http://www.cru.uea.ac.uk/projects/stardex/) that serve in several impact studies and provide a first estimate of the evolution of extremes. The analyses will be guided by different questions:

  • What are the limitations of climate models, and why?
  • Are there particular processes that are not well reproduced and that explain model biases?
  • What is the role of remote and local factors?
  • Can we identify improved performance with increased model resolution or with the use of complex model initialisation for decadal prediction?

The evaluation will make use of the long observation series available at Meteo-France, of different gridded data sets commonly used for climate analyses and evaluation, and of data from instrumented sites such as SIRTA near Paris, from which it is possible to test the ability of climate models to reproduce the evolution of the atmospheric boundary layer and of the coupling with the land surface. We will consider mainly the simulations of the last century and, for key events, the "hindcast" studies proposed in the decadal prediction intercomparison project, to test whether model initialisation has an impact on the representation of extremes. The different partners will join their efforts in the analysis of extreme events, looking at extreme indices, with IPSL and CNRM-GAME analysing in more depth the physical processes driving these particular events. A weather regime approach will be used to split model errors (model vs observations) and anomalies (future vs present-day climate) into a large-scale dynamical contribution and a regional-scale process contribution (e.g. cloud and land-surface feedbacks). In addition, CNRM-GAME will analyse in detail the 2009-2010 cold wave, which will provide additional physical criteria to test model physics, with a strong connection with Task 2.
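To fix ideas, the kind of percentile-based extreme index mentioned above (in the spirit of the STARDEX warm-day index) can be sketched as follows. This is a minimal illustration only: the function names are ours, the temperature series are invented, and the full STARDEX definitions use calendar-day percentiles over a standard reference period.

```python
# Sketch of a percentile-based extreme index: the fraction of days whose
# maximum temperature exceeds the 90th percentile of a reference period.
# All names and numbers are illustrative, not the project's actual code.

def percentile(values, q):
    """Empirical percentile by linear interpolation (0 <= q <= 100)."""
    s = sorted(values)
    pos = (len(s) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

def warm_day_frequency(reference_tmax, target_tmax, q=90):
    """Share of days in `target_tmax` above the reference q-th percentile."""
    threshold = percentile(reference_tmax, q)
    exceed = sum(1 for t in target_tmax if t > threshold)
    return exceed / len(target_tmax)

reference = [18, 19, 20, 21, 22, 23, 24, 25, 26, 27]  # toy reference series
future = [20, 26, 27, 28, 21, 29, 30, 22, 26, 31]     # toy model output
print(warm_day_frequency(reference, future))
```

Comparing such an index between observations, CMIP3 and CMIP5 runs gives a first, simple measure of whether a model's tail behaviour is credible before any deeper process analysis.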

A second activity in this subtask will consider the future evolution of these extremes and analyse the reasons for their changes. The evaluation criteria developed in the first part of the work will also serve as a guide to decide whether model results can be trusted on the basis of dynamical and physical arguments. Particular care will be taken in comparing the results of the large-scale and regional simulations for 2010-2030 with the new results that will be provided by decadal prediction, so as to assess the added value of these simulations for impact studies. We will also make use of the expertise of several climate specialists by synthesising some of the key results of the ESCRIME project that are relevant for SECIF. Indeed, the different climate experts are gathered under the ESCRIME banner. Thanks to the INSU-LEFE MISSTERRE programme, several scientific projects covering a wide range of specialities will be launched in the coming months.

Deliverables:

D.3.1.1: Provision of a suite of analyses to test the ability of climate models to reproduce a subset of extreme events of interest for the case studies developed in the project

D.3.1.2: Analysis of the evolution of extremes and of the reasons for the possible changes over different time periods, focusing mainly on the next decades and mid-century.



Task 3.2: uncertainty and risk analysis – model evaluation

Probabilities and uncertainties related to climate change are becoming increasingly relevant for decision-making processes at all scales. Stakeholders must be informed of the strengths and limitations of complex scientific findings about climate change, connect them to practical questions and problems, and deal with climate uncertainties without compromising their capacity to act. This highlights the importance of improved availability and accessible expertise regarding the use of quality climate information. Before communicating this information, preliminary work is needed to evaluate it. Several aspects will be considered in this subtask, going from simple approaches to more complex statistical ones. A survey of existing methodologies will first be performed, considering the different approaches proposed in the AR4 and the new solutions that have emerged as part of the EU project ENSEMBLES (ENSEMBLES, 2009, http://ensembleseu.metoffice.com/). Two questions will be treated in parallel. The first concerns uncertainty and how to represent it from ensemble simulations, whereas the second concerns the detection and attribution of the signal. These approaches still need scientific developments. In addition, the outputs of this work will be oriented towards end-users so as to make sure that the information provided can be used for impact studies or decision processes. This activity will be closely linked to the ongoing work in the GICC DRIAS project, for which a probabilistic presentation of the evolution of key climate variables is proposed using regional simulations for the French territory. Here we will go one step further: the results of these analyses should provide guidelines for additional key statistical analyses that are relevant for the different case studies and should also be distributed through DRIAS.
In addition to the survey and test of different methodologies by the different partners, we propose to provide an analysis of uncertainty based on the estimate of the probability density of key environmental quantities. In particular, we will examine the factors that control the probability distribution of variables simulated by climate models. These factors include the type of model and its resolution. We have started developing such tools in a Bayesian context in the FP7 NICE project (Kalache et al, 2010). They will be extended and disseminated to the stakeholders. CLIMPACT, which already contributes to the development of such methods, could provide its end-user point of view. Risk analysis will eventually be performed in coordination with stakeholders and will provide tools for decision making from uncertain predictions (Jones et al, 2000). This phase could preferentially be implemented for the "sewer system" case study.
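The simplest probabilistic presentation of ensemble results is a set of empirical quantiles of the projected change, which can be sketched as below. This is a toy stand-in for the fuller Bayesian treatment described above: the ensemble values are invented, and a real analysis would weight members and account for model dependence.

```python
# Summarising the spread of a multi-model ensemble by empirical quantiles.
# The projection values below are invented, purely for illustration.

def quantile(values, q):
    """Empirical quantile by linear interpolation (0 <= q <= 1)."""
    s = sorted(values)
    pos = (len(s) - 1) * q
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)

# Hypothetical regional warming (degC) from 11 ensemble members.
ensemble = [1.2, 1.5, 1.8, 2.0, 2.1, 2.3, 2.4, 2.6, 2.9, 3.1, 3.5]

summary = {q: round(quantile(ensemble, q), 2) for q in (0.05, 0.50, 0.95)}
print(summary)
```

A 5-50-95 summary of this kind is close to what DRIAS-style probabilistic products distribute, and is directly usable by end-users as a "likely range" around the median projection.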


Deliverables:

D.3.2.1: Probabilistic estimates of the evolution of extremes, focusing mainly on cold and heat waves

D.3.2.2: Probabilistic estimate of uncertainties and identification of the factors controlling the distribution, using the "sewer case" as an example.



Task 3.3: downscaling and other statistical methods – implementation for case studies

Industry requests about climate change increasingly concern data at regional/local scale (for instance: T2m at specific meteorological stations for energy consumption models used by the gas or electricity sectors). For this purpose, many downscaling methods have been developed, each adapted to a specific application. Research has demonstrated that mesoscale modeling is a useful tool for providing climate information at the scale appropriate for societal use (Leung et al. 2003). Two main approaches coexist. Comprehensive downscaling methods solve explicitly the prognostic equations of mesoscale meteorology (Dickinson et al., 1989 and Giorgi, 1990). They give explicit links between all meteorological parameters and represent the physics that produces extreme values. However, the implementation of such models requires large computing resources. Statistical downscaling methods implement statistical links between local parameters and global model outputs (Boe et al., 2007; Najac et al., 2008; Tisseuil et al., 2009), often based on weather regime classifications (Plaut et al., 2001; Vautard, 1990). To date, few studies have been performed to improve the temporal resolution of climate data, although this also represents a real challenge for industrial vulnerability assessment (hourly data are used in their management tools and models). Some recent works based on hourly generation models (Arnaud, 2007; Cantet, 2009; Flecher, 2009) have shown the potential use of these new products.

In this subtask, we propose first to develop an inventory and an analysis of the statistical methods, especially for the parameters involved in the four case studies (precipitation, temperature and winds). In a second step, a method will be developed and applied according to the needs of the case studies. The objective is to develop hourly data sets for temperature, precipitation and winds.
The WRF mesoscale model will be run for different IPCC scenarios, using multi-year time slices centered on selected years during this century. Output from CMIP3 and CMIP5 global climate models will be used to force the mesoscale model. Different initial times for the WRF RCM will be employed to create an ensemble simulation.
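As a minimal illustration of the statistical branch described above, a regression-based link between a coarse model value and a station observation can be sketched as follows. This is deliberately crude: the calibration numbers are invented, and the methods actually cited (regime-based approaches, quantile corrections) are considerably more elaborate than a single least-squares line.

```python
# Toy statistical downscaling: an ordinary least-squares link between a
# large-scale model temperature (predictor) and a local station series
# (predictand), calibrated on a "present" period, then applied to a
# hypothetical future model value. All numbers are invented.

def fit_ols(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    return a, my - a * mx

# Calibration: coarse GCM grid-cell temperature vs local station (degC).
gcm_present = [10.0, 12.0, 14.0, 16.0, 18.0]
station_present = [11.0, 13.5, 15.0, 17.5, 19.0]

a, b = fit_ols(gcm_present, station_present)

# Apply the calibrated link to a hypothetical future model value.
gcm_future = 20.0
print(round(a * gcm_future + b, 2))
```

The key assumption, shared by all statistical downscaling, is that the calibrated present-day link remains valid in the future climate; the comprehensive (dynamical) approach avoids that assumption at the cost of the computing resources noted above.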

Application to the heat or cold waves

In the present work, we will investigate, in the climate change context, the contribution of comprehensive downscaling techniques using mesoscale modeling:

  • Spatial distribution of extreme values for end-user design (gas or electricity networks and dispatching) - typical hourly values and persistence during a day for different time horizons.
  • How other meteorological parameters are coupled, in particular temperature/haste, temperature/humidity or temperature/wind.

Application to the wastewater pool flood

As the key parameters of wastewater treatment are both the hourly rainfall intensity (mm/h) and the total precipitation over the preceding days, we shall test the contribution of mesoscale modeling as a way of downscaling the global climate model outputs:

  • Due to the mesh resolution for convective rain computation, shower intensities (mm/h) are not direct outputs of global models.
  • The total rainfall and its spatial distribution over the whole water catchment area could also be directly evaluated.
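The two diagnostics above for the sewer case, peak hourly intensity and antecedent precipitation, can be sketched on a downscaled hourly series as follows. The series, window length and function names are illustrative only; real thresholds would come from the sewer system operators.

```python
# Sketch of the two sewer-case diagnostics: peak hourly rainfall intensity
# (mm/h) and the antecedent rainfall total over the preceding days,
# computed from a toy hourly series. All values are invented.

def peak_hourly_intensity(hourly_mm):
    """Maximum hourly rainfall (mm/h) over the series."""
    return max(hourly_mm)

def antecedent_total(hourly_mm, event_hour, days=3):
    """Total rainfall over the `days` preceding the given hour."""
    start = max(0, event_hour - days * 24)
    return sum(hourly_mm[start:event_hour])

# 5 days of toy hourly rainfall: dry, a wet spell, then a storm at the end.
hourly = [0.0] * 48 + [0.4] * 24 + [0.0] * 44 + [8.0, 12.0, 6.0, 1.0]

print(peak_hourly_intensity(hourly))             # storm peak, mm/h
print(antecedent_total(hourly, event_hour=116))  # mm in the 3 days before the storm
```

Combining both quantities matters because a moderate storm falling on a saturated catchment (high antecedent total) can be more critical for the treatment plant than an isolated intense shower.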


Deliverables:

D.3.3.1: Review of downscaling approaches: the end-user point of view

D.3.3.2: Implementation of a statistical and a comprehensive downscaling chain. Application to cold waves and to rainfall for the wastewater plant for the 2000, 2025, 2050 and 2100 time horizons.
