A similar situation applies to our earlier example involving the security engineer and the system administrator. Consider the following statements.

(a) System administrator: “I estimate the value of Pr(C | ~P) is 0.1.”

(b) Security engineer: “I estimate the value of Pr(C | ~P) is 0.25.”

If both the system administrator and security engineer are using the personal interpretation, we can use the definition of personal probabilities to show why these two statements are not literally in contradiction.

(a’) System administrator: “My degree of belief that C will occur, conditional upon ~P, is 0.1.”

(b’) Security engineer: “My degree of belief that C will occur, conditional upon ~P, is 0.25.”

Since each person is merely describing their own degree of belief, (a’) and (b’) are not contradictory.

**(2) Different ISRA methodologies use different probability theories.** Most quantitative ISRA methodologies rely upon a frequency theory of probability, while qualitative methodologies tend to rely on a non-objective theory of probability. While I have not discussed the arguments for and against each of the interpretations of probability, suffice it to say there is an entire literature devoted to the subject (see Gillies 2000 for a comprehensive discussion). A genuine criticism of a probability theory is automatically a criticism of any ISRA methodology that relies upon it.

**(3) The theories suggest audit or test procedures for validating probability estimates.** The theories clarify what sort of justification is needed for a probability value in an ISRA. For example, the frequency approach requires data about what sorts of events happen in the long run. Thus, if an auditor has been asked to determine whether an organization has performed an ISRA, one test an auditor could perform would be to request empirical evidence about the series of events used to calculate the frequency probability. If that evidence is available, the auditor could then perform a further test by double-checking the math used to derive the frequency probability.
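The auditor's second test above amounts to recomputing a relative frequency from the raw event series and comparing it to the figure reported in the ISRA. A minimal sketch of that check, where the event records, field names, and numbers are all illustrative assumptions rather than anything from an actual audit:

```python
def relative_frequency(events, predicate):
    """Fraction of recorded events satisfying `predicate`.

    A frequency probability is only as good as the event series
    behind it, so an empty series is treated as an error.
    """
    if not events:
        raise ValueError("frequency probabilities need a non-empty event series")
    hits = sum(1 for e in events if predicate(e))
    return hits / len(events)

# Hypothetical evidence: 200 observed intrusion attempts,
# 18 of which resulted in a compromise.
observed = [{"compromise": i < 18} for i in range(200)]
estimate = relative_frequency(observed, lambda e: e["compromise"])

reported = 0.09  # value claimed in the (hypothetical) ISRA under audit
assert abs(estimate - reported) < 1e-9  # the math checks out
```

If the recomputed value and the reported value disagree, the auditor has a concrete finding; if the event series itself cannot be produced, the frequency interpretation says the probability estimate lacks the justification it requires.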

## 2 Comments

Great post, Jeff.

One thing I’ll add is to counter the criticism that ISRA relies on *predictions* of the future, which is another way of saying “knowledge about the future”. Most InfoSec people, in their gut, feel that such knowledge is unattainable or infeasible.

But ISRA is really not about predicting the future or having highly certain knowledge about the future. Instead, its benefit is to help us ORGANIZE OUR UNCERTAINTY. It’s the systematic treatment of uncertainty and ignorance in all its forms, with a goal of promoting continuous learning and adaptation.

Russell Cameron Thomas

Thanks, Russell. I’m glad you liked the post!

Regarding the issue of ‘predicting’ the future, I think I agree with your point, but I would word it in a slightly different way. I would say that risk analyses do make ‘predictions’ about the future, but these predictions are hedged in various ways. For example, personal probabilities and intersubjective probabilities represent our degrees of belief (and, accordingly, our uncertainty) regarding various information security-related hazards. Additionally, as my discussion of single-case probabilities hopefully makes clear, frequency probabilities typically don’t make a prediction about a single event. On the other hand, estimated relative frequencies do … estimate the actual relative frequency in the real world, and hence the corresponding ‘actual’ frequency probability. Thus, for example, an ISRA may not provide an inductively correct argument for concluding that

*this* web server will be attacked at *this* time, but it may be able to show that *some* system will be attacked at *some* time during a given time span. In that sense, I would say that ISRA does make predictions. This does not deny what I think is your point, however, that the criticism of ISRA falsely assumes that ISRA is committed to making a series of predictions about single events.

Jeff