Monthly Archives: September 2011
We started IntelCloud a couple of months ago as an experiment to see what a blog might look like that talked about intelligence analysis and related issues in an open forum. During that time we’ve learned quite a bit, but we also learned that we may have been a bit too ambitious. So, for now, we’re shuttering IntelCloud and hoping that in the not-too-distant future we’ll apply what we learned there, along with a few new tricks, to a new and improved platform to provide you all with timely, relevant and interesting information about our craft and profession.
In the meantime, if you enjoyed what you read here and are looking for more of the same, we’d like to recommend the following blogs from two of our regular contributors:
Thank you for taking the time to read our stuff and we look forward to seeing you again.
I continue to work my way through journal articles as summer dwindles to a close…
Designing Effective Teaching and Learning Environments for a New Generation of Analysts. James G. Breckenridge. International Journal of Intelligence and Counterintelligence 23:2.
Beyond Analytical Tradecraft. Roger Zane George. International Journal of Intelligence and Counterintelligence 23:2.
How to prepare people for the task of intelligence analysis has been on people’s minds for quite some time now. After 9/11 the push was just to get asses into seats and fill the scores of newly created analyst positions with anybody who could get a security clearance (and that wasn’t even a requirement for most state and local positions). Only after spending gobs of money and seeing a steady stream of mediocre products did people begin to think that perhaps (just perhaps) some effort should be placed on training these people in analytical skills.
Breckenridge gives a blueprint of how to get from here to there but, unfortunately, comes a bit too late. In the past year or so there have been significant, weighty additions to the very small library of work dealing with the training (and the even less discussed selection and hiring) of analysts. I would therefore point you to works such as that by the National Research Council.
Still, Breckenridge does bring up a brilliant point that I’ve not seen addressed before: training the educators.
Training programs invest very little time and few resources to teach instructors how to create an innovative learning environment. Instructors tend to be selected solely for their subject matter expertise, plucked out of their cubicles, and thrust into the classroom with a lesson plan shoved into their hands; at that point, they are expected to engage students in a meaningful and skilled way.
I’d argue it’s even worse than that. Most instructors are warned in very strict terms not to deviate in any way from approved lesson plans. Now, I’m sure this is valuable when teaching technical skills (how to rebuild an engine or transplant a kidney, for example) but intelligence analysis doesn’t really work that way. I find that classes (like individual students) differ for a whole host of reasons, and what works for one doesn’t work for another. Our industrial, mass-production model simply doesn’t work when the whole point of the task you’re trying to teach is to be intellectually nimble and to refuse to be bound by individual, cultural or institutional cognitive straitjackets.
Of course, another reason for this has been the very issue that Breckenridge points out. Since no real effort has been made to evaluate instructors, we’ve seen some horror stories emerge. The reaction (wrongheaded, IMO) has been to limit the freedom of instructors so that rogues won’t be able to do too much damage (see: limit liability to the hosting organization) when they really should be focusing on making sure they have good instructors who deserve trust and can be counted on to give high-quality (which does not mean identical) instruction.
Really, should we expect universities across the country to teach the same classes (whether Intro to English or Biomechanics) identically, with the exact same course materials, exercises, tests, etc.? So why would we expect the same for intelligence analysts?
George looks at deficiencies of analytical training and tradecraft from the perspective of four key biases: cognitive, cultural, organizational and political. Each presents unique challenges to any intelligence analyst or organization, and while his article focuses on national security implications, I think there are equally important issues when looking at intelligence as applied to crime or terrorism. I highlight the most vexing (IMO) issues here.
Cognitive: “…the more expert one becomes, relying upon a highly developed mental model of the intelligence target, the more the analyst becomes prone to missing major discontinuities or key changes in a [subject]…” This might be one of the strongest cases for allowing a good mix of generalists and specialists to commingle in an intelligence shop. A nice mix can guard against missing subtle shifts in the environment (the realm of the expert) and fundamental changes that defy conventional wisdom (the job of the generalist). I haven’t seen much thought (let alone practice) in this area, and while most discussions on this subject are in the national security field, I’d suggest that it would be just as valuable in examining criminal activity. The emergence of crack cocaine in the 1980s should be seen as an example of strategic surprise that could have been foreseen but, like the collapse of the Soviet Union or the Iraqi invasion of Kuwait, was overlooked because experts adhering to the conventional wisdom knew how drug markets, superpowers and Middle Eastern despots acted.
Cultural: “What an intelligence analyst might view as the most logical way to calculate the risks and benefits of different actions is, perhaps more often than not, not the way leaders in Arab, Asian, or African cultures will calculate.” Law enforcement analysts tend to overlook cultural issues in many cases, yet the differences can be profound. Analysts tend not to come from an overly diverse background, and I question, for example, how easily an analyst from a typical upper-middle-class, suburban background can (without study and effort) understand the cost/benefit calculations of those on extreme ends of the economic and social scales. Law enforcement officers can learn this on the job over years of direct contact, and those that do it well are at a marked advantage over those who can’t. Why don’t we exert any effort to orient analysts to these cultural differences?
Organizational: Many analysts aren’t aware of the bias their organization (or their community more generally) has on their work. It is rare for law enforcement analysts to ask those outside their community to review analysis for alternate perspectives or interpretations. Additionally, there are obstacles to reviewing analyses even within the community. As George points out:
In the academic world, such peer review is customary; however, peer review can be accepted or rejected by the academics circulating their work. But it cannot be in the CIA [I include this passage here because I believe it applies much more widely than in the CIA and even more broadly than in the national security intelligence community. TwS]. In the Agency, and in other intelligence agencies, only “one organizational view” of an issue can prevail. Such conformity defies both human nature and logic.
One way to ensure organizational conformity is to frame the questions in such a way as to preclude alternatives. So, for example, in the law enforcement field, problems are often framed in a way that allows only for crime suppression responses rather than looking more broadly at possible ways to reduce criminality.
One of the things I most welcome in the George article is the explicit identification of the need for an outreach program to educate intelligence customers. Again, while this passage is focused on national security, apply locally:
…[intelligence agencies need] to develop an outreach program to policymakers in both the Legislative and Executive branches. Analysts spend lots of time trying to understand the policy process to improve their support to policymakers. But no time is devoted to educating the policymakers themselves…Educating policymakers about the analytical business might just reduce their urge to misconstrue or selectively present intelligence analysis, although it is no guarantee. At a minimum, they would thereafter understand more about analytical tradecraft and make better use of analysts.
Well said, sir!