[1910.11219] A Bayesian nonparametric test for conditional independence
The ability to detect dependences of a highly nonlinear or even non-functional nature allows for much greater confidence in the robustness of any inference procedure in which this type of test is embedded.
Abstract: This article introduces a Bayesian nonparametric method for quantifying the
relative evidence in a dataset in favour of the dependence or independence of
two variables conditional on a third. The approach uses Pólya tree priors on
spaces of conditional probability densities, accounting for uncertainty in the
form of the underlying distributions in a nonparametric way. The Bayesian
perspective provides an inherently symmetric probability measure of conditional
dependence or independence, a feature particularly advantageous in causal
discovery and not employed by any previous procedure of this type.
Fig. 3: Construction of the Pólya tree distribution on Ω = [0, 1]. From each set C∗, a particle of probability mass passes to the left with (random) probability θ∗0 and to the right with probability θ∗1 = 1 − θ∗0, with all θ being Beta-distributed as described in the main text.
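The recursive splitting in Fig. 3 can be sketched in a few lines of Python. The snippet below draws one random probability measure from a Pólya tree prior on [0, 1], represented as masses on the dyadic intervals at a fixed depth. The Beta parameters grow as c·j² with level j, a common parameterization that yields absolutely continuous random densities; this choice, and the symmetric Beta(a, a) form, are assumptions for illustration and may differ from the paper's exact specification.

```python
import random

def polya_tree_sample(depth=8, c=1.0, seed=0):
    """Draw one random measure from a Pólya tree prior on [0, 1].

    Returns the masses of the 2**depth dyadic intervals at the final level.
    At level j each split probability theta ~ Beta(c*j**2, c*j**2); mass
    m on a set splits into m*theta (left child) and m*(1-theta) (right).
    """
    rng = random.Random(seed)
    masses = [1.0]  # mass of the root set Omega = [0, 1]
    for j in range(1, depth + 1):
        a = c * j * j  # Beta parameters increase with depth
        new_masses = []
        for m in masses:
            theta = rng.betavariate(a, a)  # random left-split probability
            new_masses.extend([m * theta, m * (1.0 - theta)])
        masses = new_masses
    return masses

masses = polya_tree_sample(depth=8)
print(len(masses))                      # 2**8 = 256 intervals
print(abs(sum(masses) - 1.0) < 1e-9)   # total mass is conserved
```

Because each split sends mass θ left and 1 − θ right, the total mass is exactly 1 at every level, so the output is always a valid probability distribution over the leaf intervals.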
Fig. 4: Pairwise dependence graph output by the Bayesian conditional independence test for five variables from the CalCOFI dataset (real data), conditional on T degC. Edges are present where p(H1|W) > 0.99.