Calculate the Kullback-Leibler Divergence between prior and posterior
Alias: none
Argument(s): none
The Kullback-Leibler (KL) Divergence, also called the relative entropy, provides a measure of the difference between two probability distributions. By specifying kl_divergence, the KL Divergence between the posterior and the prior parameter distributions is calculated such that

D_{KL} = \int f_{\Theta|D}(\theta) \, \log \frac{ f_{\Theta|D}(\theta) }{ f_{\Theta}(\theta) } \, d\theta,

where f_{\Theta|D}(\theta) is the posterior parameter density and f_{\Theta}(\theta) is the prior parameter density.
This quantity can be interpreted as the amount of information gained about the parameters during the Bayesian update.
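As a concrete illustration of the definition above, the sketch below (not part of Dakota) evaluates the integral in closed form for a one-dimensional Gaussian prior and posterior, for which an analytic expression exists. The parameter values and the function name kl_gaussian are illustrative assumptions only.

    # Closed-form D_KL(posterior || prior) for univariate normal densities;
    # illustrative only, not Dakota code.
    import numpy as np

    def kl_gaussian(mu_post, sig_post, mu_prior, sig_prior):
        return (np.log(sig_prior / sig_post)
                + (sig_post**2 + (mu_post - mu_prior)**2) / (2.0 * sig_prior**2)
                - 0.5)

    # A broad prior updated to a narrow posterior gives a large information gain;
    # a posterior that barely differs from the prior gives a value near zero.
    print(kl_gaussian(0.4, 0.05, 0.0, 1.0))   # approximately 2.58
    print(kl_gaussian(0.0, 0.95, 0.0, 1.0))   # approximately 0.0025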
Expected Output
If kl_divergence is specified, the calculated value will be reported to the screen at the end of the calibration, following the sample statistics of the response functions. Example output is given below.
Additional Discussion
The quantity calculated is a k-nearest neighbor approximation of the possibly multi-dimensional integral given above. Because this approximation is computed from samples, it carries statistical error; therefore, some applications whose true KL Divergence is quite close to zero may report a negative KL Divergence.
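For reference, below is a minimal sketch of one common k-nearest-neighbor KL divergence estimator (Wang, Kulkarni, and Verdú, 2009), computed from posterior and prior sample sets. It is not Dakota's implementation, and the function name knn_kl_divergence is an assumption made for illustration.

    # Sketch of a k-nearest-neighbor estimate of D_KL(posterior || prior)
    # from sample sets; illustrative only, not Dakota code.
    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_divergence(post, prior, k=1):
        # post: (n, d) posterior samples; prior: (m, d) prior samples
        post = np.asarray(post, dtype=float).reshape(len(post), -1)
        prior = np.asarray(prior, dtype=float).reshape(len(prior), -1)
        n, d = post.shape
        m = prior.shape[0]

        # k-th nearest-neighbor distance among the posterior samples themselves
        # (k+1 neighbors are queried because each point is its own nearest neighbor).
        rho = cKDTree(post).query(post, k=k + 1)[0][:, k]

        # k-th nearest-neighbor distance from each posterior sample to the prior samples.
        nu = cKDTree(prior).query(post, k=k)[0]
        if k > 1:
            nu = nu[:, k - 1]

        # Finite-sample noise in this estimate can make it negative even when
        # the true divergence is essentially zero.
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

Applied to two sample sets drawn from the same distribution, an estimator of this form fluctuates around zero and can dip below it, which is the behavior described above.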
Below is a method block of a Dakota input file that specifies the calculation of the KL Divergence:
method,
  bayes_calibration queso
    dram
    seed = 34785
    chain_samples = 1000
    posterior_stats kl_divergence
The calculated KL approximation is indicated in the screen output by "Information gained from prior to posterior", as shown below:
Sample moment statistics for each response function:
                            Mean           Std Dev          Skewness          Kurtosis
 least_sq_term_1  3.9982462078e-01  4.7683816550e-04 -2.3448518080e+00  7.7381497770e+00

Information gained from prior to posterior = 1.0066819600e+01

<<<<< Iterator bayes_calibration completed.
<<<<< Environment execution completed.