Analyzing security: understanding and analyzing risk in a networked environment

by MAJ Sheryl French

Once upon a time (that's how good fairy tales begin), security pundits believed perfect security could be achieved in a networked environment, and cryptography was the panacea. They were wrong.

Information-security expert Bruce Schneier, founder and chief technical officer of Counterpane Internet Security Inc. and author of Applied Cryptography (John Wiley and Sons, 1996), prefaces his latest book, Secrets and Lies: Digital Security in a Networked World (John Wiley and Sons, 2000), with an apology of sorts. He unequivocally states that the mathematics-based security utopia described in Applied Cryptography is false. It's not that cryptography is less effective today; it's the realization that cryptography is but one component of network security.

Security also involves people and computers. People are imperfect. They have good days and bad. They get tired, or lazy, or busy. They make mistakes. Computers are developed by businesses, eager to make a profit, engineered and coded by people, and we know about people. The result? Security is imperfect, and risks must be managed.

Changing paradigms

This shift in philosophy profoundly impacts U.S. Army tactical units. Cryptography has been considered the ultimate information security for decades. Automated information systems within the tactical battlefield operated in stovepipes, exchanging messages using U.S. Message Text Format or Joint Variable Message Format through point-to-point connections. These systems did not exchange information outside of their environment, and encryption of these point-to-point connections provided perfect security.

As the Army transitioned to using commercial standards and protocols, advances in information technology profoundly changed military operations. Modern tactical-information systems exchange information using transmission-control protocol/Internet protocol vertically and horizontally, both internal and external to the tactical battlespace. Situation awareness, the common tactical picture and the common operational picture depend on this exchange of information. Cryptography continues to provide transmission security for the links, ensuring radio transmissions cannot be intercepted.

However, the fact is that every computer is virtually connected to every other computer at all times. External connections from the tactical command-and-control network, through the Defense Information Systems Network's standard tactical-entry point or tactical reachback into an Army installation's secure local-area network, extend this virtual connection to every computer on the secure IP-routed network. Risk analysis becomes paramount in the decision process for establishing and maintaining connectivity, both internally and externally.

Applying risk analysis to the networked environment

Risk analysis allows commanders to identify and manage risks to their networked environment by balancing risk, cost and benefit. It will also result in a clear understanding of the residual risk and the cost of mitigating that risk. Risk analysis isn't new to the commander; however, applying risk analysis to the tactical information-networking environment may be.

To effectively conduct risk analysis, you must first define your environment, critical systems within your environment, and interaction between your environment and those outside it. Next, identify the relative criticality of each element, without respect to specific risks. What impact would the loss or failure have on your operation? This first step will allow you to focus your efforts in the risk-assessment and management phases of the risk-analysis process. This step will also become a key element in evaluating risk-management alternatives.
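The definition step above can be sketched as a simple inventory ranked by criticality. This is a minimal illustration, not a prescribed Army tool; the element names, interface labels and 1-to-5 criticality scale are all assumptions for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """One system or node in the defined environment."""
    name: str
    external_interfaces: list = field(default_factory=list)
    criticality: int = 1  # hypothetical scale: 1 (low) to 5 (mission-critical)

# Hypothetical inventory; real entries come from defining your own environment
# and its interactions with systems outside it.
inventory = [
    Element("C2 server", ["STEP reachback"], criticality=5),
    Element("admin workstation", [], criticality=2),
    Element("situational-awareness feed", ["coalition LAN"], criticality=4),
]

# Rank elements by impact of loss, without respect to specific risks, so the
# assessment and management phases focus effort where failure hurts most.
by_impact = sorted(inventory, key=lambda e: e.criticality, reverse=True)
```

The ranking itself carries no threat information; it only records how badly the operation suffers if each element is lost, which is exactly what the later phases build on.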

The risk-assessment phase begins with risk identification: enumerating possible hazards that would threaten a system's survivability or dependability. Definitions of risk vary greatly. The simplest comes from Merriam-Webster's dictionary, which defines risk as the "possibility of loss." However, this definition falls short for information security because it omits the probability that adverse effects will occur and how that probability relates to the severity of the loss should the hazard occur.

For example, if a system is struck by lightning, the probability of adverse effects is very high, as is the severity of the loss, whereas a soda spilled into a keyboard would result in high probability of adverse effect to the keyboard but a low severity of loss.
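The lightning-versus-soda comparison can be made concrete by scoring each risk on likelihood and severity together. The article defines no numeric scale, so the 1-to-5 ordinal scales and the multiplicative combination below are assumptions chosen purely for illustration.

```python
def risk_score(likelihood, severity):
    """Combine likelihood of occurrence and severity of loss into one
    ordinal score on hypothetical 1-5 scales (higher means worse)."""
    assert 1 <= likelihood <= 5 and 1 <= severity <= 5
    return likelihood * severity

# Lightning strike: adverse effects near-certain and the loss is severe.
lightning = risk_score(likelihood=5, severity=5)

# Soda in the keyboard: adverse effect to the keyboard is likely,
# but the severity of the loss is low.
soda_spill = risk_score(likelihood=5, severity=2)
```

Any monotonic combination would preserve the point being made: two risks with the same likelihood can still rank very differently once severity enters the score.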

Risks aren't strictly physical. Connectivity identified in the definition phase will factor into and produce a number of risks to the environment. A team approach will enhance the process of identifying risks, bringing in multiple perspectives. For instance, a technical-network specialist may consider the abundant risks posed by the nefarious hacker but fail to consider physical security, training or environmental issues.

Identification and quantification of risks should be an exhaustive exercise that will change with each implementation. However, once completed thoroughly the first time, it will be readily tailored to meet future needs.

Thus far, you've only identified the risks, not the likelihood that each will occur or the consequences or impact if it does. Those factors round out the risk-assessment phase of risk analysis.

Risk management follows risk assessment in the risk-analysis process. Risk management is a discipline for dealing with uncertainty by taking steps to protect vital assets and resources. Building on the risk assessment's results, identify viable options to mitigate, control or, if possible, avoid the identified risk(s).

There may be multiple options for each risk. Selecting the appropriate option will be based on both the solution's cost/benefit and the risk's priority. While it may be possible to completely eliminate a specific risk, other, higher-priority risks may dictate a less complete, lower-cost mitigation.

Identify all costs and benefits of each option. Costs may extend beyond the financial: the cost could be the loss or delay of information from eliminating specific connections, inserting air gaps and man-in-the-loop precautions, or diverting manpower from one operation to another.

Evaluate courses of action to determine the optimal mix of costs, benefits and residual risks to meet mission and security requirements. This evaluation should include representatives from the user community as well as management. Decisions that will impact operations cannot be left solely to the system administrator or security specialist.
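The course-of-action evaluation described above can be sketched as comparing each option's cost against the residual risk it leaves behind. All figures below, the option names, costs, risk-reduction fractions and the budget, are hypothetical; in practice they come from the risk assessment and the cost/benefit work, not from a script.

```python
# Hypothetical mitigation options for a single identified risk.
options = [
    {"name": "air gap",       "cost": 80, "risk_reduction": 0.9},
    {"name": "patch + audit", "cost": 30, "risk_reduction": 0.6},
    {"name": "accept risk",   "cost": 0,  "risk_reduction": 0.0},
]

def evaluate(option, baseline_risk, budget):
    """Return (affordable?, residual risk) for one course of action."""
    residual = baseline_risk * (1 - option["risk_reduction"])
    return option["cost"] <= budget, residual

# Among affordable options, choose the one leaving the least residual risk.
budget, baseline = 50, 100.0
affordable = [o for o in options if evaluate(o, baseline, budget)[0]]
best = min(affordable, key=lambda o: evaluate(o, baseline, budget)[1])
```

Note what the sketch makes explicit: the most effective option (the air gap) loses on cost, and the chosen option still leaves residual risk, which is exactly the quantity the commander must understand and accept.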

Lastly, establish a process for continuous evaluation. Decisions and changes made after the initial risk analysis and implementation may significantly impact the effectiveness of previous measures.

Role of vulnerability testing in managing risk

Vulnerability testing contributes significantly to risk management. The Defense Department has defined a process, vulnerability analysis, which is "the systematic examination of an information system or product to determine the adequacy of security measures; identify security deficiencies; provide data from which to predict the effectiveness of proposed security measures; and confirm the adequacy of such measures after implementation," according to Joint Publication 1-02, DoD's Dictionary of Military and Associated Terms.

The Army has adopted the same definition for vulnerability assessments. Whichever term is used, the objectives are to expand on the risk assessment's output, evaluate the effectiveness of measures implemented in risk management and provide feedback on residual vulnerabilities and risks.

Legacy system or new development: variations in intent

Vulnerability assessments of current systems or systems-of-systems are designed to provide commanders with an accurate picture of these systems' survivability in today's environment. They are based on current threat assessments and estimates. Using a methodology referred to as "red teaming," current systems are analyzed, probed and exploited to identify design flaws and shortfalls in risk management.

Many legacy systems in use weren't designed with security as a fundamental component. Therefore, these vulnerability assessments focus primarily on proper implementation of known patches and operational mitigation measures.

Vulnerability assessments of systems during the development process must extend the limits of testing beyond current threat assessments to identify potential future threats.

These assessments often become experimental, allowing the assessor to freely probe systems using trial and error, computer-generated attack scenarios and brute force to identify susceptibilities that one day could be exploited. Assessments cannot be constrained by what we know to be our adversary's current capability, or even by our own capabilities. These experiments are equally an assessment of the systems' robustness and an opportunity for our scientific and test community to expand its capabilities.

The risk to the system is that it could be irreparably damaged, but without this "no holds barred" approach, we can't hope to maximize the survivability of our networked systems.

Who benefits from vulnerability-assessment testing?

Field commanders or materiel developers are the vulnerability assessment's main beneficiaries. They are the requestors and, therefore, the first recipients of all results. However, with the high cost of vulnerability assessments, it would be shortsighted to limit lessons-learned to this restricted audience. Vulnerabilities rarely apply to only one command or system.

But sharing the results of specific vulnerability assessments is a double-edged sword. Results of an operational assessment, which identifies failure to implement known patches or weaknesses in operational-security procedures, could be considered adverse information to the command. If these results were routinely published, it wouldn't be long before commanders stopped requesting assessments altogether. Also, the assessment could identify vulnerabilities for which there's no identified mitigation. This information would be extremely detrimental in an adversary's hands.

The current approach is to generalize results across multiple commands, referring not to a specific unit but to security in general. This has both positive and negative effects. It allows the information to be disseminated, but it dilutes the actual impact once specificity is removed. The "it could never happen to us" paradigm still exists.

For new-system developments, commonalities in operating systems, applications and tools, as well as interfaces between systems, mean vulnerabilities identified in one system have direct applicability to other systems. The results of any vulnerability assessment would prove valuable to developers of systems across a broad spectrum. However, they would also provide critical information that could be exploited by an adversary to compromise the system. The key is managing information, classifying information appropriately and allowing accessibility on a need-to-know basis.

The Army is implementing a centralized relational database that will store specific vulnerability assessments' results. Access to this database will be rigidly controlled. Each data element will be controlled on a need-to-know basis, rather than operating at a "system high" level, where all who have authorized access have access to all data.
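The contrast between per-element need-to-know control and "system high" access can be sketched in a few lines. The record IDs and group names below are hypothetical, and a real implementation would live in the database's access-control layer, not application code.

```python
# Each record carries its own need-to-know list, rather than the database
# granting every authorized user access to all data ("system high").
records = {
    "vuln-001": {"need_to_know": {"pm-abc", "red-team"}},
    "vuln-002": {"need_to_know": {"red-team"}},
}

def can_read(user_groups, record_id):
    """Grant access only if the user holds a group on this record's
    need-to-know list; authorization to the database alone is not enough."""
    allowed = records[record_id]["need_to_know"]
    return bool(allowed & set(user_groups))
```

Under "system high," a single check at login would suffice; here the check repeats per record, which is what keeps one compromised account from exposing the whole repository.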

This database is in its infancy, but the concept is to compile a repository of known, public-domain vulnerability information and the results of Army-system vulnerability assessments, as well as configuration of systems tested. Once the database is fully implemented, the Army plans to reap benefits by:

Testing specific versions of software only once;
Identifying product susceptibilities during the predevelopment stage, allowing informed acquisition decisions regarding systems� survivability; and
Tackling the difficult task of securing systems-of-systems by understanding the weak link in any confederation of systems.

Closing thoughts

Clearly, risk analysis is the critical component in securing a networked environment. "Black boxes" and locked doors still play their roles, but perfect security is no longer a reality. At least, not in a networked environment. Understanding risks, applying risk management and testing the results allow managers to approach an acceptable risk level within operational and resource constraints.

Ultimately we must consider the Nigerian proverb, "Not to know is bad; not to wish to know is worse." We must always wish to know our weaknesses, even when we wish we didn't.

MAJ French is chief of the Vulnerability and Protection Division, Information Assurance Directorate, Office of the Director of Information Systems for Command, Control, Communications and Computers. She's responsible for providing technical and policy solutions to protect information within Army tactical forces and for developing the information-assurance infrastructure for warfighter networks. MAJ French also manages the Army's information-operations vulnerability-assessment program, ensuring all weapons systems exchanging information over tactical networks are secure from computer-network attack.

She's a single-tracked Signal officer with the added specialty of network management. Before being assigned to Army headquarters, MAJ French served as Signal officer for the Stabilization Forces in Sarajevo, Bosnia, and as S-3/executive officer of 43d Signal Battalion in Heidelberg, Germany.

MAJ French is also enrolled at National Defense University in both the Chief Information Officer Certification Program and the Certified Information System Security Professional Program.

Acronym QuickScan
DoD � Department of Defense
IP � Internet protocol

Army Communicator is part of Regimental Division, a division of Office Chief of Signal.