
How to measure Cybersecurity requirements

Just ask!

Put together “N” cybersecurity professionals and, in my experience, I can guarantee that if such a conversation is initiated, you will end up with a total of “X” definitions for each of the following concepts:

Where, for every concept, sadly, X > N.

Add to the mix Confidentiality, Integrity, Availability, Possession, Utility, Risk, Authentication, Authorization, Audit, Reliability, Access Control, Identification, Privacy, Anonymity, Business Continuity, Non-Repudiation or Accountability, among other less popular ones, and the Tower of Babel will reach the sky.

I looked into how things are defined in physics for inspiration, as physicists use something called “operational definitions”. These operational definitions have some neat attributes:

The objective measurement of a given attribute of a given object should always render the same result, independently of who makes the measurement, when the measurement is made, or what method is used to measure the attribute. This is why the scientific method uses operational definitions: they provide an almost complete independence from the observer, the method, and the timing.

This seems, to me, a manager’s dream, as measurements that render metrics are key for management.

The one thing the operational definition does not provide directly is insight into the nature of the attribute that is measured. I guess that is why they are not popular among cybersecurity professionals: after all, you want to know what “the thing” is, because you work with “the thing” all the time.

An operational definition (if you have resisted the urge to check Wikipedia so far) defines something by the method used to measure it.

While non-scientific definitions define an attribute by its essence or nature, operational definitions define attributes by the method used to measure the attribute. For example: “The weight of an object is the number and units that appear when that object is placed on a weighing scale”. Definitions of weight that are not operational, like “The amount of mass an object has”, are easier to understand intuitively, but don’t enable objective measurement.
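The weighing-scale example can be sketched in code. This is a minimal illustration, not a real instrument: the point is that the operational definition *is* the procedure, and the procedure returns a number with units.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    value: float
    units: str

def weigh(scale_reading_kg: float) -> Measurement:
    # Operational definition of weight: whatever number the scale
    # renders, together with its units. Nothing about "essence".
    return Measurement(value=scale_reading_kg, units="kg")

m = weigh(72.5)
print(m.value, m.units)  # prints: 72.5 kg
```

Any observer running the same procedure on the same object gets the same `Measurement`, which is exactly the observer-independence the article describes.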

Besides the three advantages mentioned above, operational definitions have an additional cool feature: You can define something in relation to something else as in this graphic :

System of Units (from Wikipedia)

If we are to use operational definitions in cybersecurity, we need to define how we measure:

And ideally this definition will lead to definitions of:

My approach is the following: Information does not have value by itself; it has value to the owner of the information. The measurement procedure is to ask the owner questions that quantify what that value is. We can call each of the values that are measured a “Security Requirement”. Getting a little technical, a Security Requirement is an emergent property [1] (or attribute) that arises from a user using an information system. If the user or the information system does not exist, or the user does not use the information system, the security requirement does not arise. [3][5]
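One way to picture this measurement procedure is as a record of the owner's answers. The sketch below is hypothetical (the owner, system, and questions are illustrative, not taken from the article): each answered question yields one measured security requirement, tied to the user and the system it emerges from.

```python
from dataclasses import dataclass

@dataclass
class SecurityRequirement:
    owner: str     # who the information has value for
    system: str    # the information system the requirement arises from
    question: str  # the measurement procedure: the question we ask
    answer: str    # the owner's quantified answer

def measure(owner: str, system: str, questions: dict) -> list:
    # Each answered question yields one measured requirement.
    return [SecurityRequirement(owner, system, q, a)
            for q, a in questions.items()]

reqs = measure("CFO", "quarterly-results", {
    "Who would you want to share this information with?": "The Board",
    "Who would you not want to share this information with?": "Anyone else",
})
```

Note that a `SecurityRequirement` only exists once an owner and a system are named and the question is asked, which mirrors the emergent-property framing above.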

The following questions are of interest and should render quantifiable answers:

Traditional analysis of security requirements use risk analysis or risk assessment techniques, rendering results similar to the following three different scenarios:

But this type of analysis does not render results with units (figures are not units) or actionable lists, and it does not give clear success criteria that can drive management.

In the same three scenarios we could ask, for the sake of example, the following questions:

a) Who would you want to share this information with?
b) Who would you not want to share this information with?

And obtain answers like:

It is quite clear to me which type of analysis leads more readily to actions that will satisfy the security requirements of the owner of the information.

In the first example, if the Board can’t access the information, that is an incident; if someone who is not on the Board can access the information, that is an incident as well. On the other hand, every time the Board can access the information we have succeeded, and when someone who is not on the Board fails to access the information, again we succeed.
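The four cases in that example reduce to a single rule: an event is a success when the access outcome matches the owner's expectation, and an incident otherwise. A minimal sketch, with a hypothetical membership list:

```python
BOARD = {"alice", "bob", "carol"}  # hypothetical Board membership

def classify(user: str, access_granted: bool) -> str:
    # Expectation: Board members get access, everyone else does not.
    # Any event that fails that expectation is an incident.
    expected = user in BOARD
    return "success" if access_granted == expected else "incident"

classify("alice", True)     # Board member got in: success
classify("mallory", True)   # outsider got in: incident
classify("alice", False)    # Board member kept out: incident
classify("mallory", False)  # outsider kept out: success
```

The success criterion is explicit and checkable, which is what makes this framing actionable for management.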

Asking questions with the operational approach instead of C-I-A will help you avoid conversations like this (from my old videoblog):

We can define an incident as any instance in which a security expectation of a user is not met. From this point of view, the definition of an incident is not company-independent. For one company, trade secrets will be key to its success, while another company will have no secrets at all. One company won’t survive three days without its information systems, while another will go out of business after only eight hours.

Security requirements are independent of the observer, so different professionals will measure the same values.

Security requirements have units and are immediately actionable, as the success criteria measured are specific enough to readily determine which security efforts will contribute, directly or indirectly, to meeting them.

Security requirements are relevant to their context. This greatly helps direct security efforts towards controls that are relevant to the business.

The barrier of communication between the security professionals and the users disappears, as there is no need to explain any specialist’s concepts.

Demonstrating the value of security becomes easier, as an agreement on security requirements makes it transparent what the business is getting for the investment it is making.

I am afraid that with this article I am contributing to making X greater than N, or perhaps to making X = 1? Time will tell.

This article is part of a series that starts here: Principles of Evidence Based Cybersecurity Management

[1] Emergent properties in complex systems are attributes that their constituent parts don’t exhibit. For example, ripples in the sand on a beach are an attribute that individual sand grains and air don’t exhibit by themselves. If there were no sand, or no air, the ripples would not exist. Attributes of ripples include, for example, their height or separation.
[2] A Measurement is the procedure of obtaining a reduction in the uncertainty of a number that is characteristic of an attribute.
[3] Using questions for measurement is a well-known method in science wherever the human factor is present. Questions are used extensively in polls, and by the Delphi method, which was developed by the RAND Corporation in the late 1950s.
[4] Mayfield’s Paradox states that to keep everyone out of an information system requires an infinite amount of money, and to get everyone onto an information system also requires infinite money, while costs between these extremes are relatively low.
[5] Level of measurement is a classification that describes the nature of information within the values assigned to variables. The best known has four levels, or scales, of measurement: nominal, ordinal, interval, and ratio.
