### Bayesian updating of a normal distribution

When a prior dataset can be roughly represented by a normal distribution, Bayesian statistics shows that sample information from the same process can be used to obtain a posterior normal distribution. The posterior distribution is the revised, or "updated", prior, based on the sample of new information. If we have better than a subjective guess, for instance a worldwide sampling of data, we can estimate the mean and variance of this prior; for prospect appraisal this is usually a worldwide sampling, as used in the Gaeapas appraisal program. Note that for prospect appraisal the data are almost exclusively mean values of a parameter, such as the mean porosity of a reservoir in a field, not individual porosities in sidewall plugs or pieces of core.

So the prior distribution is telling us how uncertain the mean is. Both the prior mean and the sample mean convey some information, a signal, about the true mean. By a standard result on the factorization of probability density functions (see also the introduction to Bayesian inference), the posterior distribution is again a normal distribution, with a mean and variance that combine the two signals. In the limit, all weight is given to the information coming from the sample and no weight is given to the prior. In the case of no sample information in our local prospect area, all we can do is accept the prior distribution of observations as the "predictive distribution" and use it as input to Gaeapas.

The following example of estimating the API gravity, or oil density, may help to see these formulas at work. I can recommend the explanation of the update process for a probability and for a normal distribution given by Jacobs, which is more complete than what I have given here, explains the derivation of the formulas, and includes the treatment of the variance.
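The precision-weighted update, and the way all weight shifts to the sample in the limit, can be sketched numerically. The sketch below uses the standard Bayesian update of a normal mean with known process variance; all numbers are hypothetical placeholders, not values from the Gaeapas datasets.

```python
# Precision-weighted Bayesian update of a normal mean (known process variance).
def posterior_mean_var(m0, s0_sq, xbar, s_sq, n):
    """Return the posterior mean and variance of the mean.

    m0, s0_sq : prior mean and prior variance of the mean
    xbar      : mean of the n new observations
    s_sq      : process (observation) variance
    n         : number of new observations
    """
    prior_prec = 1.0 / s0_sq    # precision of the prior signal
    sample_prec = n / s_sq      # precision of the sample signal, grows with n
    post_var = 1.0 / (prior_prec + sample_prec)
    post_mean = (prior_prec * m0 + sample_prec * xbar) * post_var
    return post_mean, post_var

# Hypothetical prior: mean 30, variance 25.  Sample mean 40, process variance 16.
for n in (1, 10, 1000):
    m1, v1 = posterior_mean_var(30.0, 25.0, 40.0, 16.0, n)
    print(n, round(m1, 3), round(v1, 3))
```

As `n` grows, the posterior mean moves from the prior mean (30) towards the sample mean (40), and the posterior variance of the mean shrinks towards zero.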
The formulas involved are shown here without giving the derivation (Jacobs; Winkler). Let the prior distribution of the mean be normal with mean $m_0$ and variance $\sigma_0^2$, and let the process, i.e. observation, variance be $\sigma^2$. If several new observations are made, the mean value $\bar{x}$ of these $n$ observations is used and compared to the prior distribution. The posterior distribution of the mean is then a normal distribution with mean and variance

$$m_1 = \frac{m_0/\sigma_0^2 + n\bar{x}/\sigma^2}{1/\sigma_0^2 + n/\sigma^2}, \qquad \sigma_1^2 = \frac{1}{1/\sigma_0^2 + n/\sigma^2}.$$

Note that the posterior mean $m_1$ is a weighted average of two signals, the prior mean and the sample mean: the greater the precision of a signal, the reciprocal of its variance, the higher its weight. The weight given to the sample mean increases with $n$, while the weight given to the prior mean does not.

The Bayesian process of obtaining a posterior distribution of observations, which can be used for sampling in a Monte Carlo procedure, uses the distribution of the mean of the observations. The mean of this predictive distribution is the same as the posterior mean, but its variance is a combination of the posterior variance of the mean and the process variance, $\sigma_1^2 + \sigma^2$. This will be shown in the next proof.

A prior distribution can be constructed by collecting data, or from "subjective experience" which cannot be formally processed. The worldwide prior distributions we have assembled are distributions of observed values. In practice a compromise is made that is in any case much better than not using the worldwide background of factual experience. The following data and calculation show what happens in this example.
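The predictive-distribution step, in which the posterior for the mean is widened by the process scatter before Monte Carlo sampling, can be sketched as below. The posterior mean and variance, and the process variance, are hypothetical placeholders, not actual Gaeapas priors.

```python
import random

# Hypothetical posterior of the mean, as obtained from the update formulas:
post_mean, post_var = 38.0, 2.0   # posterior mean and variance of the mean
process_var = 16.0                # process (observation) variance

# Predictive distribution of a new observation: same mean as the posterior,
# variance = posterior variance of the mean + process variance.
pred_var = post_var + process_var
pred_sd = pred_var ** 0.5

# Draw Monte Carlo samples from the predictive distribution.
random.seed(42)
samples = [random.gauss(post_mean, pred_sd) for _ in range(100_000)]

mean_est = sum(samples) / len(samples)
var_est = sum((x - mean_est) ** 2 for x in samples) / (len(samples) - 1)
print(round(mean_est, 2), round(var_est, 2))
```

The sample statistics recover the predictive mean (38) and variance (18) to within Monte Carlo noise; a prospect-appraisal run would draw from this predictive distribution rather than from the narrower posterior of the mean.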