[1910.10793] We Know Where We Don't Know: 3D Bayesian CNNs for Uncertainty Quantification of Binary Segmentations for Material Simulations
Our BCNN produces uncertainty maps that capture continuity and visual gradients, outperforms Monte Carlo dropout networks (MCDNs) on recent uncertainty metrics, and achieves equal or better segmentation accuracy than MCDNs in most cases.

Abstract: Deep learning has been applied with great success to the segmentation of 3D X-Ray Computed Tomography (CT) scans. Establishing the credibility of these segmentations requires uncertainty quantification (UQ) to identify problem areas. Recent UQ architectures include Monte Carlo dropout networks (MCDNs), which approximate Bayesian inference in deep Gaussian processes, and Bayesian neural networks (BNNs), which use variational inference to learn the posterior distribution of the neural network weights. BNNs hold several advantages over MCDNs for UQ, but due to the difficulty of training BNNs, they have not, to our knowledge, been successfully applied to 3D domains. In light of several recent developments in the implementation of BNNs, we present a novel 3D Bayesian convolutional neural network (BCNN) that provides accurate binary segmentations and uncertainty maps for 3D volumes. We present experimental results on CT scans of lithium-ion battery electrode materials and laser-welded metals to demonstrate that our BCNN provides improved UQ as compared to an MCDN while achieving equal or better segmentation accuracy. In particular, the uncertainty maps generated by our BCNN capture continuity and visual gradients, making them interpretable as confidence intervals for segmentation usable in subsequent simulations.
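The two UQ approaches the abstract contrasts can be sketched in a few lines. This is a minimal toy illustration, not the paper's 3D CNN: the layer, weights, and posterior parameters below are all hypothetical stand-ins. An MCDN keeps dropout active at test time so each forward pass samples a sub-network; a variational BNN instead samples the weights themselves from a learned Gaussian posterior. Either way, per-voxel uncertainty is the spread across stochastic forward passes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy stand-in for one trained segmentation layer: 8 "voxels" x 4 features.
# (Hypothetical weights; the paper's models are full 3D CNNs.)
x = rng.normal(size=(8, 4))
W = rng.normal(size=(4, 1))

def mcdn_pass(drop_p=0.5):
    # MCDN: dropout stays on at inference, sampling a different
    # sub-network on every call (approximate Bayesian inference).
    mask = (rng.random(W.shape) > drop_p) / (1.0 - drop_p)
    return sigmoid(x @ (W * mask))

# BNN: each weight has a learned Gaussian posterior (mean mu, raw scale rho);
# a forward pass draws a fresh weight sample via the reparameterization trick.
W_mu, W_rho = W, np.full_like(W, -2.0)

def bnn_pass():
    W_sigma = np.log1p(np.exp(W_rho))  # softplus keeps sigma > 0
    return sigmoid(x @ (W_mu + W_sigma * rng.normal(size=W.shape)))

# In both cases the uncertainty map is the spread over repeated passes.
samples = np.stack([bnn_pass() for _ in range(100)])
seg_prob = samples.mean(axis=0)        # segmentation probability per voxel
uncertainty = samples.std(axis=0)      # per-voxel uncertainty map
```

The same mean/std aggregation applied to `mcdn_pass` samples yields the MCDN's uncertainty map, which is what the paper's figures compare against.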
Figure 1: Schematic of our BCNN architecture with sample volume dimensions from the Graphite dataset. Measurements are (depth, height, width, channels). Best viewed in electronic format.

Figure 2: Results on Graphite Test Set Sample GCA2000, Slice 212. Note that our BCNN uncertainty is focused around the light gray edges of the material in the original slice, while the MCDN uncertainty is pixelated and uninterpretable.

Figure 3: Results on Laser Weld Test Set Sample S33, Slice 453. Note that our BCNN uncertainty captures the visual gradients around the edges of the material, while the MCDN uncertainty displays a pixelated line at best.

Figure 4: BCNN Segmentation Failure Case: Laser Weld Test Set Sample S4, Slice 372. Note that, while our BCNN produces a poor segmentation, its uncertainty map exactly corresponds to the areas of the image where the segmentation overpredicts. We show that the uncertainty-based refinement algorithm proposed by Martinez et al. [16] applied to the BCNN output with a threshold of 0.22 produces a high-accuracy segmentation.
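The refinement step referenced in the Figure 4 caption can be sketched as a threshold on the uncertainty map. The details of the Martinez et al. [16] algorithm are not given here, so this assumes the simplest reading: foreground predictions in voxels whose uncertainty exceeds the threshold (0.22 in the caption) are suppressed, removing exactly the overpredicted regions the uncertainty map flags. The function name and inputs are illustrative.

```python
import numpy as np

def refine(seg_prob, uncertainty, tau=0.22):
    """Uncertainty-based refinement (sketch, not the published algorithm).

    seg_prob    -- per-voxel segmentation probabilities in [0, 1]
    uncertainty -- per-voxel uncertainty map (e.g. std over MC samples)
    tau         -- uncertainty threshold above which predictions are dropped
    """
    binary = seg_prob > 0.5                # initial binary segmentation
    return binary & ~(uncertainty > tau)   # suppress high-uncertainty voxels

# Tiny example: a confidently overpredicted voxel survives only if its
# uncertainty stays below the threshold.
seg = np.array([0.9, 0.8, 0.6, 0.2])
unc = np.array([0.05, 0.10, 0.40, 0.30])
refined = refine(seg, unc)  # → [True, True, False, False]
```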