Seminal Papers in Cybersecurity: A Review (Part 2 of 2)
Around the same time that Anderson was thinking about a reference monitor solution, David E. Bell and Len LaPadula were working at the MITRE Corporation, assigned the task of producing a mathematical formalization that guaranteed secure access control [3]. While the mathematical proof of assured security is well defined in the paper, there is a glaring issue with proving that any subsequent system correctly implements these formally defined controls. Bell and LaPadula addressed the obvious question head-on: how can you guarantee that the machine rules, as implemented, are inviolable? The formal mathematical system would need to be coded on a machine without error, and not only would the code for the formalism need to be flawless, but everything involved in how the computer operated would need to be flawless as well. They offer no answer to this question; they simply acknowledge it as an obvious one.
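Bell and LaPadula's formalization is usually summarized by two mandatory rules: the simple security property ("no read up") and the *-property ("no write down"). The full model also includes need-to-know categories and a discretionary access matrix; the Python sketch below is a deliberate simplification, assuming a hypothetical linear ordering of classification levels, meant only to illustrate those two rules.

```python
from enum import IntEnum

class Level(IntEnum):
    """Hypothetical linear classification levels (the real model
    uses a lattice of levels and categories)."""
    UNCLASSIFIED = 0
    CONFIDENTIAL = 1
    SECRET = 2
    TOP_SECRET = 3

def may_read(subject: Level, obj: Level) -> bool:
    # Simple security property ("no read up"): read only if the
    # subject's clearance dominates the object's classification.
    return subject >= obj

def may_write(subject: Level, obj: Level) -> bool:
    # *-property ("no write down"): write only if the object's
    # classification dominates the subject's clearance, so that
    # high information cannot leak into low objects.
    return obj >= subject

assert may_read(Level.SECRET, Level.CONFIDENTIAL)        # read down: allowed
assert not may_read(Level.CONFIDENTIAL, Level.SECRET)    # read up: denied
assert may_write(Level.CONFIDENTIAL, Level.SECRET)       # write up: allowed
assert not may_write(Level.SECRET, Level.CONFIDENTIAL)   # write down: denied
```

Of course, such code can only be as trustworthy as the machine that runs it, which is precisely the implementation question Bell and LaPadula left open.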
I found it amusing that David Bell, during a 2012 interview, recalled his reaction on first receiving his assignment at MITRE to study and develop a formal method of computer security: "That sounds pretty boring" [4]. It seems rare that a person initially dismisses something as boring and then proceeds to become a world-renowned expert who makes major contributions to the advancement of that very thing. While he is best known as a great and influential figure in computer security, I equally admire him for his intellectual curiosity and open-mindedness.
A couple of years later, in 1975, J. Saltzer and M. D. Schroeder of MIT published the paper "The Protection of Information in Computer Systems," which appears to contain the first published notion of what is now referred to as the CIA triad [5]. In addition, they proposed design characteristics of secure computer systems that are still largely in use today. The paper may include the first reference to two-factor authentication, and it makes a vigorous case for the importance of simplicity. Saltzer and Schroeder also assert that the design of a secure computer system should be open and not based on secrets. This should not be construed to imply that passwords or keys need not be kept secret, but rather that the design and implementation of the system should not demand secrecy. In present-day terminology, a design whose security depends on its secrecy would be described as security by obscurity, an approach widely regarded as inadequate, as the case Saltzer and Schroeder present in their paper demonstrates.
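The open-design principle survives today as the working assumption behind modern cryptography: algorithms are published and scrutinized, and only the keys are secret. The sketch below is an intentionally anachronistic illustration (HMAC postdates the 1975 paper by decades) using Python's standard library; the key and message are invented for the example.

```python
import hashlib
import hmac
import secrets

# Open design: HMAC-SHA-256 is a fully published algorithm.
# The only secret in the system is the key itself.
key = secrets.token_bytes(32)           # the sole secret
message = b"breaker 5 status: closed"   # hypothetical message

tag = hmac.new(key, message, hashlib.sha256).digest()

# A verifier holding the same key can confirm the message is
# authentic; an attacker who knows the algorithm but not the key
# cannot forge a valid tag.
assert hmac.compare_digest(
    tag, hmac.new(key, message, hashlib.sha256).digest()
)
```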
Many of the characteristics of secure cyber systems that were written about some 50 years ago are still in use today. There are two particularly glaring exceptions. The first is what Saltzer and Schroeder call "fail-safe defaults," best described today as deny-by-default: access is refused unless it has been explicitly granted (a minimal sketch of the idea follows the quotation below). The very first switched computer network designs allowed all traffic to flow across networks to any connected machine, leaving it up to each machine to authenticate. That practice largely remains in place today. Why? Likely because of market forces.
In his article “The Origin and Early History of the Computer Security Software Products Industry,” J.R. Yost writes:
“The goals of greater efficiency and keeping overhead down tended to trump strong security for most firms in the late 1970s, 1980s, and in many cases, beyond.” [6]
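To make the contrast concrete, here is a minimal Python sketch of a deny-by-default packet filter. The rule set, addresses, and field names are hypothetical, invented purely for illustration; the point is only that the final, default action is to deny. The allow-all network designs described above would, in effect, end with `return True` instead.

```python
import ipaddress

# Hypothetical allow list: traffic not matched here is denied.
ALLOW_RULES = [
    {"src": "10.0.0.0/8", "dst_port": 443},     # internal hosts, HTTPS only
    {"src": "10.1.2.3/32", "dst_port": 22},     # one admin host, SSH only
]

def permitted(src_ip: str, dst_port: int) -> bool:
    for rule in ALLOW_RULES:
        if (ipaddress.ip_address(src_ip) in ipaddress.ip_network(rule["src"])
                and dst_port == rule["dst_port"]):
            return True
    # Fail-safe default: traffic that matches no explicit rule is denied.
    return False

print(permitted("10.1.2.3", 22))      # True: explicitly allowed
print(permitted("198.51.100.7", 23))  # False: denied by default
```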
The other exception relates to Anderson's critique of ad hoc security measures. He writes:
“Unless security is designed into a system from its inception, there is little chance that it can be made secure by retrofit.” [2]
The focus on secure-by-design never really caught on. As a result, devices and software were designed and placed in service, found to have vulnerabilities, and patched; or, worse yet, encapsulated by a separate device or piece of software that mediated the first vulnerability but could well have introduced more. The vicious cycle is obvious. The sales cycle is equally obvious.
Contributor
Nicholas Seeley
Senior Vice President of Engineering Services
nicholas_seeley@selinc.com
[1] W. H. Ware, "Security Controls for Computer Systems: Report of Defense Science Board Task Force on Computer Security," Rand Corporation, Santa Monica, 1969.
[2] J. P. Anderson, "Computer Security Technology Planning Study," USAF, Bedford, 1973.
[3] D. E. Bell and L. LaPadula, "Secure Computer Systems: Mathematical Foundations," MITRE, Bedford, 1973.
[4] D. E. Bell, "An Interview with David Elliott Bell," interview, 12 September 2012.
[5] J. Saltzer and M. D. Schroeder, "The Protection of Information in Computer Systems," Proceedings of the IEEE, vol. 63, no. 9, pp. 1278-1308, 1975.
[6] J. R. Yost, "The Origin and Early History of the Computer Security Software Products Industry," IEEE Annals of the History of Computing, pp. 46-58, 2015.
Contribute to the conversation
We want to hear from you. Send us your questions, thoughts on ICS and OT cybersecurity, and ideas for what we should discuss next.