As well established as the practice of intelligence analysis may be, researchers continue to ask elementary questions about what analysis is, how it is done, and how it can be done better.
“Intelligence analysis involves a complex process of assessing the reliability of information from a wide variety of sources and combining seemingly unrelated events. This problem is challenging because it involves aspects of data mining, data correlation and human judgment,” one recent study (pdf) performed for the Office of Naval Research observed.
The study focused on development of computer tools to support the analytical method known as Analysis of Competing Hypotheses (ACH), previously explored by Folker (pdf), among others.
See “Assisting People to Become Independent Learners in the Analysis of Intelligence” by Peter L. Pirolli, Palo Alto Research Center, Inc., Final Report to the Office of Naval Research, February 2006.
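The core of ACH is a matrix that rates each piece of evidence against each competing hypothesis, then ranks hypotheses by how much evidence contradicts them rather than by how much supports them. A minimal sketch of that scoring step is below; the hypotheses, evidence items, and ratings are invented for illustration and are not drawn from the study.

```python
# Hedged sketch of ACH-style scoring: ratings are "C" (consistent),
# "I" (inconsistent), or "N" (neutral/not applicable). The matrix
# contents here are hypothetical examples, not real analytic data.
matrix = {
    "H1: insider leak": {"E1": "C", "E2": "I", "E3": "C"},
    "H2: external hack": {"E1": "C", "E2": "C", "E3": "I"},
    "H3: accidental disclosure": {"E1": "I", "E2": "I", "E3": "N"},
}

def inconsistency_score(ratings):
    """Count evidence items rated inconsistent with a hypothesis."""
    return sum(1 for r in ratings.values() if r == "I")

# Rank hypotheses by inconsistency: in ACH, the hypothesis with the
# least contradicting evidence is favored, not the one with the most
# supporting evidence.
ranked = sorted(matrix, key=lambda h: inconsistency_score(matrix[h]))
for h in ranked:
    print(h, inconsistency_score(matrix[h]))
```

The design choice worth noting is the ranking criterion: counting disconfirming rather than confirming evidence is what distinguishes ACH from intuitive hypothesis testing, and it is the step that tool support aims to make systematic.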
If carbon markets are going to play a meaningful role — whether as engines of transition finance, as instruments of accurate pricing across heterogeneous climate interventions, or both — they need the infrastructure and standards that any serious market requires.
Good information sources, such as curated collections, must be available and well maintained if companies are to deliver on the vision of AI for science expressed by their marketing and their executives.
Let’s see what rules we can rewrite and beliefs we can reset: a few digital service sacred cows are long overdue to be put out to pasture.
Nestled in the cuts and investments of interest to the S&T community is a more complex story of how the administration is approaching the practice of science diplomacy.