As well established as the practice of intelligence analysis may be, researchers continue to ask elementary questions about what analysis is, how it is done, and how it can be done better.
“Intelligence analysis involves a complex process of assessing the reliability of information from a wide variety of sources and combining seemingly unrelated events. This problem is challenging because it involves aspects of data mining, data correlation and human judgment,” one recent study (pdf) performed for the Office of Naval Research observed.
The study focused on development of computer tools to support the analytical method known as Analysis of Competing Hypotheses (ACH), previously explored by Folker (pdf), among others.
See “Assisting People to Become Independent Learners in the Analysis of Intelligence” by Peter L. Pirolli, Palo Alto Research Center, Inc., Final Report to the Office of Naval Research, February 2006.
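ACH itself is straightforward to describe: an analyst lists competing hypotheses, scores each piece of evidence as consistent, neutral, or inconsistent with each hypothesis, and then favors the hypothesis with the least contradicting evidence, since consistent evidence alone may fit several hypotheses at once. The sketch below illustrates that bookkeeping in Python; the hypotheses, evidence items, and scores are invented for illustration and are not taken from the study.

```python
# Illustrative sketch of Analysis of Competing Hypotheses (ACH) scoring.
# Scores: +1 consistent, 0 neutral/not applicable, -1 inconsistent.
# All names and values here are hypothetical examples.
evidence = {
    "intercepted communication": {"H1": +1, "H2": -1, "H3": 0},
    "satellite imagery":         {"H1": -1, "H2": -1, "H3": +1},
    "informant report":          {"H1": +1, "H2": 0,  "H3": -1},
}

def inconsistency_scores(evidence):
    """Sum the inconsistent (-1) entries for each hypothesis.

    ACH favors the hypothesis with the LEAST contradicting evidence,
    rather than the one with the most supporting evidence.
    """
    scores = {}
    for ratings in evidence.values():
        for hyp, score in ratings.items():
            scores.setdefault(hyp, 0)
            if score < 0:
                scores[hyp] += score
    return scores

scores = inconsistency_scores(evidence)
# Rank hypotheses: a score closer to zero means fewer contradictions.
ranked = sorted(scores, key=scores.get, reverse=True)
print(scores)
print(ranked)
```

In this toy example H2 is contradicted twice and falls to the bottom of the ranking, while H1 and H3 each carry a single inconsistency; a real ACH tool adds source-reliability weighting and sensitivity analysis on top of this core tally.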
No one will be surprised if we end up with a continuing resolution that pushes the shutdown deadline out past the midterms, so the real question is what else they will get done this summer.
Rebuilding public participation starts with something simple: treating the public not as a problem to manage, but as a source of ingenuity that government cannot function without.
If the government wants a system of learning and adaptation that improves results in real time, it has to treat translation, utilization, and adaptation as core functions of governance rather than as afterthoughts.
Coordination among federal science agencies is essential to ensure government-wide alignment on R&D investment priorities. Yet the federal R&D enterprise remains deeply siloed.