Just found what is probably the most direct hit so far on the topic of detecting shielded nuclear weapons--once again by the Fetter crew, a paper titled "Detecting Nuclear Warheads." The thinking here could probably be generalized to any sort of nuclear weapon or improvised nuclear device.
The paper looks at the tradeoffs between detection distance and counting time. An intriguing observation it makes is that signals below the background can be detected given enough time. The idea is that the background counts are subject to the central limit theorem (many independent sources), so their fluctuations about the mean grow only in proportion to the square root of time. A single source, however weak, adds counts at a constant rate, so its total contribution grows linearly with time. After enough time the weak source can be unambiguously detected, provided the background is well characterized. The numbers in this paper are very conservative, and the authors also discuss how these assumptions are likely to be relaxed in practice.
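A quick back-of-the-envelope sketch of that scaling (the rates below are made up for illustration, not taken from the paper): the excess counts from the source after time t are s·t, while the background fluctuation is about sqrt(b·t), so the significance s·t / sqrt(b·t) grows as sqrt(t), and the time to reach a given confidence level follows directly.

```python
import math

def detection_time(source_rate, background_rate, n_sigma=5.0):
    """Counting time (seconds) for a steady source to stand out
    n_sigma above Poisson fluctuations in a well-known background.

    Significance after time t: s*t / sqrt(b*t) = s*sqrt(t/b),
    so t = n_sigma**2 * b / s**2.
    """
    return (n_sigma ** 2) * background_rate / source_rate ** 2

# Hypothetical rates: a source one-tenth the background rate.
b = 10.0  # background counts per second
s = 1.0   # source counts per second (well below background)
t = detection_time(s, b)
print(t)  # -> 250.0 seconds to reach 5 sigma

# Sanity check: at t = 250 s, excess = 250 counts,
# background fluctuation = sqrt(10 * 250) = 50, ratio = 5.
print(s * t / math.sqrt(b * t))  # -> 5.0
```

The quadratic dependence on 1/s is the painful part: halving the source rate (say, by adding shielding) quadruples the counting time needed.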
This and other related references can be found at http://www.puaf.umd.edu/faculty/papers/fetter/publications.htm