
Short usage overview:

The computer experiments concepts implemented in the following features (as plugins) of Promethee are based on several sources:


- **Deep Inside Computer Experiments (DICE)**: a French consortium funded by industrial firms and involving many academic units.
- **GDR MASCOT-NUM**: a French Research Group on Stochastic Analysis Methods for COdes and NUMerical treatments.

- Noisy Expected Improvement and on-line computation time allocation for the optimization of simulators with tunable fidelity (EngOpt 2010), by V. Picheny, D. Ginsbourger, Y. Richet
### Abstract

This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, the gain in accuracy being at a price of computational time. The contribution of this work is two-fold: firstly, we propose a quantile-based criterion for the sequential choice of experiments, in the fashion of the classical Expected Improvement criterion, which allows a rigorous treatment of heterogeneous response precisions. Secondly, we present a procedure that allocates on-line the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety.

### Keywords

Noisy optimization, Kriging, Tunable fidelity

### Content

http://hal.archives-ouvertes.fr/docs/00/49/42/29/PDF/engoptfinal.pdf

- Algorithm-assisted Assessment in Criticality Safety (ICNC 2011), by J. Crevel, G. Caplin, Y. Richet
### Abstract

Nuclear criticality safety evaluations aim at avoiding any criticality accident in both normal and abnormal conditions. The most penalizing configuration is to be identified by moving several physical and chemical parameters within their credible range of variation. Exhaustive analyses, very expensive in calculations for day-to-day usage, have already highlighted the limitations of both “parameter-by-parameter” or even “crossed-parameters” approaches to identify the maximum value of the effective neutron multiplication factor (keff).

Illustrated by a basic example of an interim storage of wet PuO2 powder, this paper shows that an algorithm-assisted search of the maximum value of keff would be able to help a nuclear criticality safety assessor to identify, with a few calculations, the interest zone(s) of high keff values of multi-parameters problems. Nevertheless, expert’s skills remain necessary to make an adequate nuclear criticality safety assessment.

The algorithm described, based on surrogate optimisation procedures to manage an adaptive design of experiments, was designed through the DICE industrial research consortium [6] and then tested within computer tools supporting the French CRISTAL criticality safety code package [6].

### Key Words

Algorithm, Optimization, Monte Carlo, Design of Experiments.

- Using Efficient Global Optimization Algorithm to assist Nuclear Criticality Safety Assessment (submitted to Nuclear Science and Engineering), by Y. Richet, G. Caplin, J. Crevel, D. Ginsbourger, V. Picheny
### Abstract

Nuclear criticality safety assessment often requires group-wise Monte Carlo simulations of k-effective in order to check sub-criticality of the system of interest. A typical task to be performed by safety assessors is hence to find the worst combination of input parameters of the criticality Monte Carlo code (i.e. leading to maximum reactivity) over the whole operating range. Then, checking sub-criticality can be done by solving a maximization problem where the input-output map defined by the Monte Carlo code stands for the objective function, or “parametric” model. This straightforward view of criticality parametric calculations complies with recent works in Design of Computer Experiments, an active research field in applied stochastics. This framework provides a robust support to enhance and consolidate good practices in criticality safety assessment. Indeed, supplementing the standard “expert driven” assessment by an optimization algorithm may be helpful to increase the reliability of the whole process, and the robustness of its conclusions. Such a new safety practice is intended to rely on both well-suited theoretical tools (compliant optimization algorithms) and computing infrastructure (a flexible grid computing environment). This paper presents an efficient solution to this two-sided theoretical and technical challenge.

### Keywords

Criticality, Monte Carlo, Design of Experiments, Optimization, Kriging

- Optimization of Noisy Computer Experiments with Tunable Precision (Technometrics 2011), by V. Picheny, D. Ginsbourger, Y. Richet, G. Caplin
### Abstract

This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, the gain in accuracy being at a price of computational time. The contribution of this work is two-fold: firstly, we propose a quantile-based criterion for the sequential choice of experiments, in the fashion of the classical Expected Improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Secondly, we present a procedure that allocates on-line the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety.

### Content

http://hal.archives-ouvertes.fr/docs/00/61/48/62/PDF/EQI_v2.pdf

- Fast kriging-based stepwise uncertainty reduction with application to the identification of an excursion set, by C. Chevalier, J. Bect, D. Ginsbourger, E. Vazquez, V. Picheny, Y. Richet
### Abstract

A Stepwise Uncertainty Reduction (SUR) strategy aims at constructing a sequence X1(f), X2(f), ... of evaluation points of a function f : R^d → R in such a way that the residual uncertainty about a quantity of interest S(f), given the information provided by the evaluation results, is small. In Bect, Ginsbourger, Li, Picheny and Vazquez, Statistics and Computing, 2011, several SUR approaches have been shown to be particularly efficient for the problem of estimating the volume of an excursion set of a function f above a threshold. Here, we build upon these results and we present fast implementations of some SUR strategies, which are based on two ideas. The first idea is to take advantage of update formulas for kriging. The second idea is to derive closed-form expressions for some integrals that appear in the SUR criteria. We are able to demonstrate significant speed-ups and we illustrate our algorithms on a nuclear safety application.

### Content

http://hal.archives-ouvertes.fr/docs/00/64/11/08/PDF/FastKrigingInversion.pdf

The main issues of computer experiments are divided into the following topics:

This type of algorithm provides an overview of the behaviour of the code. It is often used as a preliminary study, or to rank the influence of the code inputs.

- FAST

An algorithm to estimate the contribution of each input's variance to the output variance (known as first-order and total Sobol indices).

- Morris

A stable and low-cost method to estimate the effect (linearity and interactions) of each input on the output.

- SRRC

Standardized Regression Coefficients (SRC), or Standardized Rank Regression Coefficients (SRRC), are sensitivity indices based on linear (resp. monotonic) assumptions, in the case of independent factors.

- PRCC

Partial Correlation Coefficients (PCC), or Partial Rank Correlation Coefficients (PRCC), are sensitivity indices based on linear (resp. monotonic) assumptions, in the case of (linearly) correlated factors.

This engineering practice provides a model of the output uncertainty due to the input uncertainties.

- Random sampling

Random sampling of input variables; returns the output histogram and a summary from [R].

- Random sampling with statistic target

Random sampling of input variables until the given statistic reaches its target precision; returns the output histogram and a summary from [R].

- Monte Carlo

Random sampling of input variables; returns the Monte Carlo mean estimate and its standard deviation.

- Wilks

Random sampling of input variables to provide an upper bound for a given quantile; the sample size is given by the Wilks formula.
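The sample size prescribed by the first-order, one-sided Wilks formula can be computed directly: the sample maximum of n runs bounds the gamma-quantile with confidence beta as soon as 1 - gamma**n >= beta. A small sketch (the function name is ours):

```python
import math

def wilks_sample_size(gamma, beta):
    """Smallest n such that the maximum of n i.i.d. runs is an upper
    bound of the gamma-quantile with confidence beta (first-order,
    one-sided Wilks formula): 1 - gamma**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

# Classic 95% quantile / 95% confidence case used in safety studies:
print(wilks_sample_size(0.95, 0.95))  # -> 59
```

So a single batch of 59 independent runs is enough for a 95%/95% statement, regardless of the (unknown) output distribution.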

This common engineering issue arises in the design process of most high-tech products.

- Gradient Descent method

Simple local minimization based on a finite-difference gradient estimate.

- Conjugate gradients

Conjugate gradients optimization algorithm.

- BFGS method

Quasi-Newton optimization algorithm.

- L-BFGS-B method

Quasi-Newton optimization algorithm in a bounded space.

- Nelder-Mead method

Nelder-Mead optimization algorithm.

- Simulated Annealing method

Simulated annealing optimization algorithm.

- CMA-ES algorithm

Covariance Matrix Adaptation Evolution Strategy.

- GenOuD algorithm

GENetic Optimization Using Derivatives.

- NSGA2 multi-objective & constraints optimizer

Nondominated Sorting Genetic Algorithm 2.

- Efficient Global Optimization

Kriging-based optimization algorithm using the Expected Improvement criterion.

- Efficient Global Optimization - IECI

Kriging-based optimization algorithm using the Integrated Expected Conditional Improvement criterion. Allows constrained optimization.
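The Expected Improvement criterion underlying EGO can be sketched in closed form (for minimization; `mu` and `sigma` stand for the kriging posterior mean and standard deviation at a candidate point, and the function name is ours):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """EI for minimization:
        EI = (f_min - mu) * Phi(z) + sigma * phi(z),  z = (f_min - mu) / sigma
    where Phi and phi are the standard normal CDF and PDF."""
    if sigma <= 0.0:
        # No posterior uncertainty: improvement is deterministic.
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (f_min - mu) * Phi + sigma * phi

# A point predicted below the current best, or one with large posterior
# uncertainty, gets a high EI; EGO evaluates the code where EI is maximal.
```

The two terms make the exploitation/exploration trade-off explicit: the first rewards a low predicted mean, the second rewards remaining uncertainty.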

This issue may be defined as threshold exceedance probability estimation, calibration, or data assimilation. It is often used in regulatory tasks and assessments, or to build standards for process safety.

- Dichotomy

1D zero search based on second-order polynomial inversion. Relies on a monotonicity assumption on the output.

- Stepwise Uncertainty Reduction

Kriging-based inversion algorithm based on global exceedance volume uncertainty minimization.

- Ranjan's criterion

Kriging-based inversion algorithm using a proxy of local exceedance volume uncertainty minimization.

- Bichon's criterion

Kriging-based inversion algorithm using a proxy of local exceedance volume uncertainty minimization.

- Threshold Exceedance Product criterion

Kriging-based inversion algorithm using a two-sided product of Expected Exceedance.
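To illustrate the monotonicity-based dichotomy idea, here is a generic bisection sketch locating the input where a monotone output crosses a threshold (plain bisection for simplicity, not the plugin's actual second-order polynomial refinement; the function name is ours):

```python
def dichotomy(f, threshold, lo, hi, tol=1e-8):
    """1D search for the input where a monotonically increasing
    output f crosses a threshold, by bisection of [lo, hi]."""
    assert f(lo) <= threshold <= f(hi), "threshold must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < threshold:
            lo = mid  # crossing is to the right
        else:
            hi = mid  # crossing is to the left
    return 0.5 * (lo + hi)

# Example: find where x**3 crosses 0.5 on [0, 1] (true answer 0.5**(1/3)).
x_star = dichotomy(lambda x: x ** 3, 0.5, 0.0, 1.0)
```

If the monotonicity assumption fails, bisection may converge to one crossing while missing others, which is precisely why the kriging-based criteria above are preferred for non-monotone codes.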

These design plugins provide space-filling designs of the input space, used for instance by third-party software for meta-modelling.
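A minimal sketch of one common space-filling construction, Latin Hypercube Sampling (an illustration of the general idea, not a specific Promethee plugin; the function name is ours):

```python
import random

def latin_hypercube(n, d, seed=0):
    """Space-filling design: n points in the unit hypercube [0,1]^d,
    with exactly one point in each of the n equal-width strata
    along every axis (Latin Hypercube Sampling)."""
    rng = random.Random(seed)
    cols = []
    for _ in range(d):
        # Shuffle the stratum indices, then jitter within each stratum.
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return [[cols[k][i] for k in range(d)] for i in range(n)]

design = latin_hypercube(8, 2)
# Along each axis, the stratum indices int(coord * 8) form a
# permutation of 0..7: every stratum is sampled exactly once.
```

The one-point-per-stratum property gives good one-dimensional projections even for small samples, which is what makes such designs useful for fitting meta-models.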

How to post-process a Promethee project with my desktop tools (spreadsheet, R, ...)?

Depending on the step of the workflow concerned, here are some possible post-processing features:

- 2D/3D plots: you can export all the data via copy/paste to your spreadsheet software.
- Parallel plots (parplot): you can select any data axis as x, y, or z to plot in a 2D/3D plot, and then export the data.
- In many algorithms, the temporary R data used is stored in **.Rdata** files available in the cases directory.
- In the final report, all calculation points' inputs & outputs are stored in a .csv file.
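As a sketch of such post-processing, the final report's .csv of calculation points can be loaded with any standard CSV reader; note that the column names below (`x1`, `x2`, `keff`) are hypothetical and depend on your project's inputs and outputs:

```python
import csv
import io

# Stand-in for opening the report's .csv file; in practice, replace the
# StringIO with open("<your report file>.csv", newline="").
report = io.StringIO("x1,x2,keff\n0.1,0.3,0.91\n0.2,0.7,0.95\n0.5,0.5,0.89\n")
rows = list(csv.DictReader(report))

# Typical post-processing: locate the calculation point with the
# highest output value among all evaluated points.
best = max(rows, key=lambda r: float(r["keff"]))
# best["x1"], best["x2"] then give the corresponding input values.
```

The same file loads directly into R with `read.csv()` or into a spreadsheet, which is the point of the .csv export.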

© IRSN - All rights reserved - Legal information
