A Bayesian Analysis of the Radioactive Releases of Fukushima
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:1243-1251, 2012.
Abstract
The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to determine the types of nuclides and their concentration levels from the recorded mixture of radiation in order to take the necessary measures. We formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. The model can infer from the sparsely sampled measurements which nuclides are present as well as their concentration levels. An important property of the proposed model is that it admits unique recovery of the parameters. On synthetic data we demonstrate that our model is able to infer the underlying components, and on data from the Fukushima Daiichi plant we establish that the model accounts well for the data. We further demonstrate how the model extends to include all the available measurements recorded throughout Japan. The model can be considered a first attempt to apply unsupervised Bayesian learning in order to give a more detailed account of the latent structure present in the data of the Fukushima Daiichi disaster.
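To illustrate the general idea of inferring latent nuclide concentrations from a recorded mixture of radiation, the following is a minimal, hypothetical sketch and not the authors' actual model: measurements are assumed to be a non-negative mixture of known nuclide decay curves, and the latent concentrations are inferred with a simple random-walk Metropolis sampler. The decay constants, priors, noise level, and sampling scheme are illustrative assumptions only.

```python
# Hypothetical sketch of Bayesian inference of nuclide concentrations from
# mixed, sparsely sampled dose-rate measurements. All modelling choices here
# are assumptions for illustration, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

# Assumed half-lives (days) for three nuclides, e.g. I-131, Cs-134, Cs-137.
half_lives_days = np.array([8.02, 2.065 * 365.25, 30.17 * 365.25])
lam = np.log(2.0) / half_lives_days  # decay constants (1/day)

def signatures(t):
    """Decay curves for each nuclide at times t; shape (len(t), K)."""
    return np.exp(-np.outer(t, lam))

# Synthetic "measurements": sparse sampling times and noisy mixed dose rates.
# A long enough time span is needed to separate slowly decaying nuclides.
t_obs = np.sort(rng.uniform(0.0, 1000.0, size=40))   # days after release
true_conc = np.array([5.0, 2.0, 1.5])                # latent concentrations
y_obs = signatures(t_obs) @ true_conc + rng.normal(0, 0.2, size=t_obs.size)

def log_posterior(log_c, sigma=0.2):
    """Gaussian likelihood plus a broad log-normal prior on concentrations."""
    c = np.exp(log_c)
    resid = y_obs - signatures(t_obs) @ c
    log_lik = -0.5 * np.sum((resid / sigma) ** 2)
    log_prior = -0.5 * np.sum(log_c ** 2 / 4.0)
    return log_lik + log_prior

# Random-walk Metropolis over log-concentrations (keeps them positive).
samples, log_c = [], np.zeros(3)
lp = log_posterior(log_c)
for i in range(20000):
    prop = log_c + rng.normal(0, 0.05, size=3)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        log_c, lp = prop, lp_prop
    if i >= 5000:                                     # discard burn-in
        samples.append(np.exp(log_c))

post = np.array(samples)
print("posterior mean concentrations:", post.mean(axis=0))
print("true concentrations:          ", true_conc)
```

Under these assumptions the posterior mean recovers the latent concentrations from the mixed signal; how well the components can be separated depends on the distinctness of their decay curves and the span and density of the sampling, which is the kind of identifiability question the paper's uniqueness property addresses.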