Making Decisions Under Uncertainty



Effective management of uncertainties and life-cycle risks is critical to the success of oil and gas projects. These uncertainties and risks are, in fact, the central challenge of the Exploration and Production business.

In May 2014, Alejandro Primera was invited to give a talk about the importance of capturing the uncertainties associated with reservoir characteristics. The work covered Experimental Design (ED) and the modelling of reservoir uncertainties in numerical simulation. The aim of the talk was to share with the audience examples and first-hand experience of making decisions under high subsurface and surface-facility uncertainty and the risks associated with it.


The topics discussed by Alejandro Primera emphasised the close link between the decision-making process and the uncertainties present in it. Because subsurface, surface and economic uncertainties are mixed together, applying ED sampling to studies can be a cumbersome process in which multiple models have to be coupled; as a result, a strict process for configuring such studies is far from being established in the industry, on top of the large number of parameters that need to be taken into consideration. Hence, Alejandro underlined that a complete evaluation of subsurface and surface uncertainty usually depends on geological, economic and technological uncertainties, each of which can affect, to varying degrees, the recovery process and the viability of the project.

During the talk, ED examples were shown to the audience to illustrate the level and range of uncertainties that can be incorporated into simulation modelling. In addition, the interaction between the different elements affecting critical decision making in a field development scenario was highlighted. Furthermore, the differences between the technical and organisational complications that arise when choosing between development options were also discussed.


More about ED

ED can play an essential role in the development of ranking and uncertainty analysis studies in several industries. In a set of experiments, one or more input parameters are intentionally changed in order to observe the effect those changes have on one or more responses, or Key Performance Indicators (KPIs). It is important to plan and develop the studies carefully, ensuring that the right type of data and a sufficient sample size and power are available to answer the research questions of interest as clearly and efficiently as possible. Several commercial and open-source tools can be used to select experiment sets and fit linear, quadratic or more complex models, as the sketch below illustrates.
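To make this concrete, here is a minimal sketch, in Python with NumPy, of a two-level full factorial design with a linear model fitted to the results. The factor names, ranges and the toy response function are illustrative assumptions, not material from the talk.

import itertools
import numpy as np

# Hypothetical uncertain inputs with (low, high) ranges -- illustrative only
factors = {
    "porosity":     (0.10, 0.25),
    "permeability": (50.0, 500.0),   # mD
    "oil_price":    (60.0, 100.0),   # $/bbl
}

# Every combination of low/high levels: 2**3 = 8 experiments
levels = [factors[name] for name in factors]
design = np.array(list(itertools.product(*levels)))

# Stand-in for the expensive simulator: a toy KPI response
def run_simulation(x):
    porosity, perm, price = x
    return 1000.0 * porosity * np.log(perm) * price

kpi = np.array([run_simulation(x) for x in design])

# Fit a linear model: KPI ~ b0 + b1*porosity + b2*permeability + b3*oil_price
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, kpi, rcond=None)
print(dict(zip(["intercept", *factors], coeffs)))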

Figure 1: ED sample workflow


In oil and gas, numerical simulation of subsurface fluid flow coupled with surface infrastructure is a time-consuming exercise. Computer experiments with quantitative factors require special types of ED, since it is often possible to include many different levels of each factor. Also, the experimental region is often too large to assume that a linear or quadratic model adequately represents the phenomenon under investigation. Consequently, it is desirable to fill the experimental space with points as well as possible (space-filling designs), in such a way that each run provides additional information even if some factors turn out to be irrelevant.
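As a sketch of such a space-filling design, the following uses the Latin Hypercube sampler in SciPy's scipy.stats.qmc module (available from SciPy 1.7 onwards); the factor bounds are the same illustrative assumptions as above.

from scipy.stats import qmc

# Illustrative bounds: porosity, permeability (mD), oil price ($/bbl)
lower = [0.10,  50.0,  60.0]
upper = [0.25, 500.0, 100.0]

sampler = qmc.LatinHypercube(d=3, seed=42)
unit_sample = sampler.random(n=20)             # 20 runs spread over [0, 1]^3
design = qmc.scale(unit_sample, lower, upper)  # map to engineering units

# Each row is one simulation case; further runs can be appended later to
# refine a proxy without discarding the experiments already simulated.
print(design[:5])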


Ranking Designs

In every project the range and type of input parameters will differ. What remains essential, however, is the identification of the most influential parameters. It is important to analyse the effect of each parameter, whether positive or negative, and the magnitude of that effect on the KPI; a small sketch of this ranking calculation follows the figure.

Figure 2: Tornado plot indicating the significance of each input variable on the NPV at $100/bbl.
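The quantity behind each bar of a tornado plot is the main effect of a factor: the mean KPI at the factor's high level minus the mean at its low level. A small self-contained sketch, using a coded two-level design and made-up KPI values, shows the calculation and the ranking.

import itertools
import numpy as np

names = ["porosity", "permeability", "oil_price"]

# Coded two-level full factorial design: -1 = low level, +1 = high level
design = np.array(list(itertools.product([-1, 1], repeat=3)))
# Made-up KPI values, one per experiment (e.g. NPV from the simulator)
kpi = np.array([12.0, 35.0, 15.0, 40.0, 22.0, 48.0, 25.0, 55.0])

# Main effect of each factor
effects = {
    name: kpi[design[:, j] == 1].mean() - kpi[design[:, j] == -1].mean()
    for j, name in enumerate(names)
}

# Largest absolute effect first, as the bars are ordered in a tornado plot
for name, effect in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name:>13}: {effect:+.1f}")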


Proxies and Monte Carlo Simulation

One of the advantages of ED is the possibility of using linear regression models (proxies of the more complex problem) to predict KPIs from the uncertain inputs in seconds. This has potential use in optimisation problems and in probabilistic exercises such as Monte Carlo (MC) simulation.

As an example, a numerical simulation model could take 2 hours to run. If we assume the capacity to run 10 concurrent scenarios, a 10,000-case MC simulation would take around 2,000 hours, or over 80 days. An ED sample set of 100 numerical simulation cases would take 20 hours, and a proxy generated from the regression process takes seconds to run. Moreover, the MC simulation can be repeated and the proxy improved with space-filling designs (adding a few more numerical simulation cases).
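The following sketch puts the whole idea together: fit a linear proxy to a modest set of "expensive" runs, then push 10,000 Monte Carlo samples through the proxy almost instantly. The stand-in simulator, the uncertainty ranges and the uniform input distributions are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def expensive_simulation(x):              # stands in for a 2-hour run
    porosity, perm, price = x.T
    return 1000.0 * porosity * np.log(perm) * price

# ED sample: 100 cases over the uncertainty ranges
lower = np.array([0.10,  50.0,  60.0])
upper = np.array([0.25, 500.0, 100.0])
ed_inputs = lower + (upper - lower) * rng.random((100, 3))
ed_kpi = expensive_simulation(ed_inputs)

# Linear proxy fitted by least squares
X = np.column_stack([np.ones(len(ed_inputs)), ed_inputs])
coeffs, *_ = np.linalg.lstsq(X, ed_kpi, rcond=None)

def proxy(x):
    return np.column_stack([np.ones(len(x)), x]) @ coeffs

# 10,000 MC samples evaluated through the proxy in well under a second
mc_inputs = lower + (upper - lower) * rng.random((10_000, 3))
npv = proxy(mc_inputs)
p10, p50, p90 = np.percentile(npv, [10, 50, 90])  # E&P convention may quote these as exceedance probabilities
print(f"P10={p10:,.0f}  P50={p50:,.0f}  P90={p90:,.0f}")

In practice the proxy's fit should be checked against a few blind simulation runs before the MC percentiles are trusted, which is exactly where the space-filling refinement mentioned above comes in.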


Figure 3: MC simulation for NPV at $100/bbl, along with the P10, P50 and P90 points.
