Dakota Reference Manual
Version 6.15
Explore and Predict with Confidence
Use the second Kraskov algorithm to compute mutual information
Alias: none
Argument(s): none
This algorithm is derived in [58]. The mutual information between $m$ random variables is approximated by

$$
I(X_1, \ldots, X_m) \approx \psi(k) + (m-1)\psi(N) - \frac{m-1}{k}
- \langle \psi(n_{x_1}) + \cdots + \psi(n_{x_m}) \rangle,
$$

where $\psi$ is the digamma function, $k$ is the number of nearest neighbors being used, and $N$ is the number of samples available for the joint distribution of the random variables. For each point $z_i$ in the joint distribution, $z_i$ and its $k$ nearest neighbors are projected into each marginal subspace. For each subspace $j = 1, \ldots, m$, $\epsilon_{j,i}$ is defined as the radius of the $l_{\infty}$-ball containing all $k+1$ points. Then, $n_{x_j,i}$ is the number of points in the $j$-th subspace within a distance of $\epsilon_{j,i}$ from the point $x_{j,i}$. The angular brackets denote that the average of $\psi(n_{x_j,i})$ is taken over all points $z_i$.
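Dakota evaluates this estimator internally; as a standalone illustration of the formula above, here is a minimal Python sketch of the KSG2 estimator for two scalar variables, using the max-norm in the joint space. The function name and the exact neighbor-counting convention are assumptions for this sketch, not Dakota's implementation or API.

```python
import numpy as np
from scipy.special import digamma


def ksg2_mutual_info(x, y, k=3):
    """Sketch of the second Kraskov (KSG2) mutual-information estimator.

    x, y : 1-D arrays of N paired samples; k : number of nearest neighbors.
    Returns an estimate of I(X;Y) in nats.
    """
    n = len(x)
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)

    # Pairwise distances in each marginal, and the l-infinity (max-norm)
    # distance in the joint space.
    dx = np.abs(x - x.T)
    dy = np.abs(y - y.T)
    dz = np.maximum(dx, dy)
    np.fill_diagonal(dz, np.inf)  # exclude each point from its own neighbors

    nx = np.empty(n)
    ny = np.empty(n)
    for i in range(n):
        # k nearest neighbors of point i in the joint space
        knn = np.argsort(dz[i])[:k]
        # Project the point and its k neighbors into each marginal subspace;
        # eps_* is the radius of the l-infinity ball containing all k+1 points.
        eps_x = dx[i, knn].max()
        eps_y = dy[i, knn].max()
        # n_x, n_y: points within eps_* in each marginal (self excluded)
        nx[i] = np.sum(dx[i] <= eps_x) - 1
        ny[i] = np.sum(dy[i] <= eps_y) - 1

    # KSG2 formula for m = 2 variables
    return (digamma(k) - 1.0 / k + digamma(n)
            - np.mean(digamma(nx) + digamma(ny)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(size=200)
    # Strongly dependent pair: estimate should be clearly positive.
    print(ksg2_mutual_info(a, a + 0.1 * rng.normal(size=200)))
    # Independent pair: estimate should be near zero.
    print(ksg2_mutual_info(a, rng.normal(size=200)))
```

For larger sample sizes a k-d tree (e.g. `scipy.spatial.cKDTree` with `p=np.inf`) would replace the dense distance matrices, but the quadratic version keeps the correspondence with the formula explicit.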
method
  bayes_calibration queso
    dram
    seed = 34785
    chain_samples = 1000
    posterior_stats mutual_info
      ksg2
method
  bayes_calibration queso
    dram
    chain_samples = 1000
    seed = 348
    experimental_design
      initial_samples = 5
      num_candidates = 10
      max_hifi_evaluations = 3
      ksg2