Glossary

ARPAnet A research network developed by the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense (DoD). It initially linked a few research organizations in the United States. The Milnet is a spinoff of the ARPAnet, for DoD use. The ARPAnet is the precursor of the Internet.

Compromise An accidentally or intentionally caused situation that prevents a system from fulfilling its intended requirements. Of particular concern are compromises resulting from malicious human actions such as system penetrations, subversions, and other intentional misuses. The term is also used more generally; a compromise may also result from a system malfunction, spontaneous external event, or other causes.

COMPUSEC Computer security.

COMSEC Communications security.

CTCPEC The Canadian Trusted Computer Product Evaluation Criteria [20].

Criteria Definitions of properties and constraints to be met by system functionality and assurance. See TCSEC, ITSEC, and CTCPEC.

Criticality A characteristic of a requirement whose nonsatisfaction can result in serious consequences, such as damage to national or global security or loss of life. A system is critical if any of its requirements is critical.

Dependability Defined with respect to some set of properties, a measure of whether, and how well, a system can satisfy those properties.

Dependence A subject is said to depend on an object if the subject may not work properly unless the object (possibly another subject) behaves properly. One system may depend on another system.

DES Data Encryption Standard.

Digital signature A string of characters that can be generated only by an agent that knows some secret, and hence provides evidence that such an agent must have generated it.
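
As a minimal sketch of the idea, the following Python fragment uses the Ed25519 scheme from the third-party cryptography package (the choice of package and scheme is illustrative, not part of the definition): only the holder of the private key can produce a signature that the corresponding public key will verify.

    # Illustrative only; requires the third-party "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric import ed25519

    private_key = ed25519.Ed25519PrivateKey.generate()  # the signer's secret
    public_key = private_key.public_key()               # freely shareable

    message = b"wire $100 to account 42"
    signature = private_key.sign(message)  # only the secret's holder can produce this

    # Any holder of the public key can check the signature;
    # verify() raises InvalidSignature if either input was altered.
    public_key.verify(signature, message)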

Digital Signature Standard (DSS) A standard for digital signatures based on a variant of an ElGamal algorithm [40, 41] due to C.P. Schnorr [148].

DOS Disk Operating System (as in MS-DOS for personal computers).

DSS See Digital Signature Standard.

Feature 1. An advantage attributed to a system. 2. A euphemism for a fundamental flaw that cannot or will not be fixed.

Formal Having a punctilious respect for form—that is, having a rigorous mathematical or logical basis.

Gateway A system connected to different computer networks that mediates transfer of information between them.

Guard A component that mediates the flow of information and/or control between different systems or networks.

Human safety A property that a system must satisfy to preserve personal and collective safety.

Identification The association of a name or other identifier with a user, subject, object, or other entity. Note that no authentication is implied in identification by itself.

Implementation A mechanism (in software or hardware, or both) that attempts to correctly realize a specified design.

INFOSEC Information security. Includes both COMPUSEC and COMSEC.

Internet A worldwide network of networks linking computer systems.

ITSEC The Information Technology Security Evaluation Criteria, the Harmonized criteria of France, Germany, the Netherlands, and the United Kingdom [42].

Letter bomb A logic bomb contained in electronic mail, which will trigger when the mail is read.

Logic bomb A Trojan horse set to trigger on the occurrence of a particular logical event.

Mediation The action of an interposed arbiter (literally, one positioned in the middle) that decides whether to permit a subject to perform a given operation on a specified object.

Message digest A checksum, hash-code, compression code, or other generally nonreversible transformation of a message into a generally much shorter form. Message digests can be used in authentication and certification. (MD4 [136] is an example of a message digest algorithm.)
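
As a sketch, the following Python fragment computes a digest with SHA-256 from the standard hashlib module (MD4 itself is obsolete and often unavailable): whatever the length of the input, the output has a fixed short length, and even a one-character change in the input yields an unrelated digest.

    import hashlib

    message = b"a message of arbitrary length" * 1000
    digest = hashlib.sha256(message).hexdigest()
    print(len(digest))  # 64 hex characters, regardless of input size

    # Changing even one character produces an unrelated digest.
    altered = hashlib.sha256(b"A message of arbitrary length" * 1000).hexdigest()
    assert digest != altered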

Milnet The U.S. Department of Defense spinoff of the ARPAnet.

MLI See Multilevel integrity.

MLS See Multilevel security.

Model An expression of a policy or a system design in a form that can be used for analysis or other reasoning about the policy or the system.

Monitoring Recording of relevant information about each operation by a subject on an object, maintained in an audit trail for subsequent analysis.

Multics An operating system developed beginning in 1965 by MIT, with participation of Bell Laboratories from 1965 to 1969, and with hardware and software support initially from General Electric and then Honeywell. Multics (Multiplexed Information and Computing Service) was an early pioneer in virtual memory, directory hierarchies, several innovative approaches to security, multiprocessing, abstraction, symbolic naming, and the use of a higher-level programming language for system development.

Multilevel integrity (MLI) An integrity policy based on the relative ordering of multilevel integrity labels.

Multilevel security (MLS) A confidentiality policy based on the relative ordering of multilevel security labels (really multilevel confidentiality).
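
A simplified sketch of the label ordering in Python, with hypothetical levels and compartments: a subject's label dominates an object's label when its level is at least as high and its compartments include all of the object's.

    # Hypothetical labels: a hierarchical level plus a set of compartments.
    LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

    def dominates(subject_label, object_label):
        """True if the subject's label dominates the object's label."""
        s_level, s_comps = subject_label
        o_level, o_comps = object_label
        return LEVELS[s_level] >= LEVELS[o_level] and o_comps <= s_comps

    # "Reading down" is permitted ...
    assert dominates(("SECRET", {"NATO"}), ("CONFIDENTIAL", {"NATO"}))
    # ... but not reading a compartment the subject does not hold.
    assert not dominates(("SECRET", {"NATO"}), ("SECRET", {"NATO", "CRYPTO"}))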

NCSC The National Computer Security Center.

Noncompromisibility The ability of a system to withstand compromise.

Nonrepudiation An authentication that, with high assurance, cannot be refuted subsequently.

Nontamperability The ability of a system to withstand tampering.

Operating system A collection of programs intended to control directly the hardware of a computer, and on which all the other programs running on the computer generally depend.

OPSEC Operations security. Encompasses security concepts that transcend system and network technology per se (that is, COMPUSEC and COMSEC, respectively). Includes such techniques as TEMPEST, communication channel randomization, and covert channel masking.

Orange Book The familiar name for the basic document defining the TCSEC [101], derived from the color of its cover. The Orange Book provides criteria for the evaluation of different classes of trusted systems, and has many documents relating to its extension and interpretation. See Red Book, Yellow Book.

Pest program A collective term for programs with deleterious and generally unanticipated side effects—for example, Trojan horses, logic bombs, viruses, and malicious worms.

Policy An informal, generally natural-language description of intended system behavior. Policies may be defined for particular requirements, such as confidentiality, integrity, availability, and safety.

Private key 1. In a public-key (asymmetric) cryptosystem, the private-key counterpart of a public key—namely, the key that is private to the owner and does not need to be shared at all. In contrast, see Public key. 2. In a shared-key (symmetric) cryptosystem, the key that must be shared by the encrypter and decrypter, but (hopefully) by no one else. See Shared key (also referred to as a secret key). (We avoid the confusion between these two definitions by referring to the key in the latter case as a shared key rather than as a private key.) 3. In a symmetric cryptosystem, a key that is not shared, but rather is held by only one entity—used, for example, for storage encryption. See Secret-key cryptosystem, definition 2.

Process Generally, a sequential locus of control, as in the execution of a virtual processor. It may take place on different processors or on a single processor, but with only a single execution point at any one time.

Public key In a public-key (asymmetric) cryptosystem, the public-key counterpart of a private key—namely, the key that is public and does not need to be protected. In contrast, see Private key.

Public-key cryptosystem An encryption algorithm or its implementation, using a public key and a corresponding private key. Also known as an asymmetric cryptosystem. See RSA as an example of a public-key encryption algorithm.
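
A toy sketch of the asymmetry in Python, using familiar textbook RSA parameters; real keys use primes hundreds of digits long, and this miniature version is insecure by construction.

    # Textbook RSA with tiny primes -- illustrative only, never for real use.
    p, q = 61, 53
    n = p * q                  # public modulus (3233)
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent, coprime to phi
    d = pow(e, -1, phi)        # private exponent (2753); Python 3.8+

    m = 65                     # a message encoded as an integer below n
    c = pow(m, e, n)           # anyone can encrypt with the public key (e, n)
    assert pow(c, d, n) == m   # only the holder of d can decrypt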

Red Book Familiar name for the Trusted Network Interpretation of the TCSEC [100].

Reference monitor A system component that mediates usage of all objects by all subjects, enforcing the intended access controls. It might typically include a kernel plus some trusted functionality.
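
A toy sketch in Python, with hypothetical subjects, objects, and rights: every access is funneled through a single mediation function that consults an access matrix.

    # Hypothetical access matrix: (subject, object) -> permitted operations.
    ACCESS = {
        ("alice", "payroll.db"): {"read"},
        ("bob", "payroll.db"): {"read", "write"},
    }

    def mediate(subject, operation, obj):
        """The single choke point through which every access must pass."""
        if operation not in ACCESS.get((subject, obj), set()):
            raise PermissionError(f"{subject} may not {operation} {obj}")
        # ... perform the operation on the subject's behalf ...

    mediate("bob", "write", "payroll.db")      # permitted
    # mediate("alice", "write", "payroll.db")  # would raise PermissionError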

Requirement A statement of the system behavior needed to enforce a given policy. Requirements are used to derive the technical specification of a system.

Risk Intuitively, the adverse effects that can result if a vulnerability is exploited or if a threat is actualized. In some contexts, risk is a measure of the likelihood of adverse effects or the product of the likelihood and the quantified consequences. There is no standard definition.
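
Under the product interpretation, for example, an event with a likelihood of 0.01 per year and a quantified consequence of $500,000 would be assigned a risk of $5,000 per year (the figures are purely illustrative).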

RSA Acronym for the Rivest-Shamir-Adleman public-key encryption algorithm [137].

Secret key A key that is (supposedly) kept secret. See Shared key; see also Private key, definition 2. It is sometimes actually a key known only to one entity; see Secret-key cryptosystem, definition 2.

Secret-key cryptosystem 1. In common usage, equivalent to shared-key (symmetric) cryptosystem. 2. One-key (symmetric) encryption and decryption (perhaps of a file) in which the single key is known only to one entity (such as a person or computer system).

Security 1. Protection against unwanted behavior. In present usage, computer security includes properties such as confidentiality, integrity, availability, prevention of denial of service, and prevention of generalized misuse. 2. The property that a particular security policy is enforced, with some degree of assurance. 3. Security is sometimes used in the restricted sense of confidentiality, particularly in the case of multilevel security (that is, multilevel confidentiality).

Separation of duties A principle of design that separates functions of differing security or integrity into separate protection domains. Separation of duties is sometimes implemented as an authorization rule in which two or more subjects are required to authorize an operation.
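
A toy sketch of the two-person variant in Python (the names and duty domains are hypothetical): an operation is authorized only when approved by two distinct subjects from two distinct domains.

    def authorize(operation, approvals):
        """approvals: a list of (subject, duty_domain) pairs."""
        approvers = {subject for subject, domain in approvals}
        domains = {domain for subject, domain in approvals}
        if len(approvers) < 2 or len(domains) < 2:
            raise PermissionError(f"{operation}: needs two approvers in distinct domains")

    authorize("release payment", [("alice", "accounting"), ("bob", "audit")])  # permitted
    # authorize("release payment", [("alice", "accounting")])  # would raise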

Shared key In a shared-key (symmetric) cryptosystem, the key that must be shared by encrypter and decrypter, and (hopefully) by no one else. See Secret key.

Shared-key cryptosystem A symmetric system for encryption and decryption, using a single key that is shared presumably only by the sender and the recipient. See DES as an example of a shared-key algorithm.
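
A sketch in Python using Fernet from the third-party cryptography package as a stand-in (DES itself is long obsolete): a single key serves for both encryption and decryption, so both parties must hold it and no one else should.

    # Illustrative only; requires the third-party "cryptography" package.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # the one key both parties must share
    cipher = Fernet(key)

    token = cipher.encrypt(b"attack at dawn")          # sender uses the shared key
    assert cipher.decrypt(token) == b"attack at dawn"  # recipient uses the same key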

Signature See Digital signature.

Smart card A small computer with the approximate dimensions of a credit card. It is typically used to identify and authenticate its bearer, although it may have other computational functions.

Specification A technical description of the intended behavior of a system. A specification may be used to develop the implementation and provides a basis for testing the resulting system.

Spoofing Taking on the characteristics of another system or user for purposes of deception. In the present contexts, spoofing is generally prankish rather than overtly malicious, although it is often used elsewhere in a malicious context.

State An abstraction of the total history of a system, usually in terms of state variables. The representation could be explicit or implicit.

State machine In the classical model of a state machine (attributable to George Mealy [90]), the outputs and the next state of the machine are functionally dependent on the inputs and the present state. This model is the basis for all computer systems.
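
A minimal Mealy machine in Python: the output and the next state are both looked up from the current state and the input. This toy machine reports "yes" whenever it sees two consecutive 1s in a bit string.

    # (state, input) -> (next state, output)
    TRANSITIONS = {
        ("seen0", "0"): ("seen0", "no"),
        ("seen0", "1"): ("seen1", "no"),
        ("seen1", "0"): ("seen0", "no"),
        ("seen1", "1"): ("seen1", "yes"),
    }

    def run(inputs, state="seen0"):
        outputs = []
        for symbol in inputs:
            state, out = TRANSITIONS[(state, symbol)]
            outputs.append(out)
        return outputs

    print(run("01101"))  # ['no', 'no', 'yes', 'no', 'no']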

Subversion A compromise that undermines integrity.

System 1. A state machine with an associated state, which, when provided with inputs, yields a set of outputs and results in a new machine state. (See State machine.) 2. An interdependent collection of components that can be considered as a unified whole—such as a networked collection of computer systems, a distributed system, a compiler, an editor, or a memory unit.

Tampering An intentionally caused event that results in modification of the system and of its intended behavior.

TCB See Trusted computing base.

TCSEC The Department of Defense Trusted Computer System Evaluation Criteria. See Orange Book [101], Red Book [100], and Yellow Book [102].

TENEX An operating system developed by Bolt, Beranek and Newman for Digital Equipment Corporation mainframe computers.

Threat A potential danger that a vulnerability may be exploited intentionally, triggered accidentally, or otherwise exercised.

Time bomb A Trojan horse set to trigger at a particular time.

Token authenticator A pocket-sized computer that can participate in a challenge-response authentication scheme. The authentication sequences are called tokens.
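
A sketch of one round in Python, using an HMAC over a random challenge (the construction is illustrative; real tokens use varying schemes): the token proves knowledge of the shared secret without ever transmitting it.

    import hashlib, hmac, secrets

    shared_secret = b"provisioned into both token and server"

    challenge = secrets.token_hex(8)  # the server sends an unpredictable challenge

    # The token computes its response from the secret and the challenge.
    response = hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()

    # The server recomputes the expected value and compares in constant time.
    expected = hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()
    assert hmac.compare_digest(response, expected)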

Trapdoor A hidden flaw in a system mechanism that can be triggered to circumvent the system’s security. A trapdoor is often placed intentionally, but can be created accidentally.

Trojan horse A computer entity that contains a malicious component whose use by an unsuspecting user can result in side effects desired by the creator of the Trojan horse and generally unanticipated by the user. A Trojan horse typically operates with all of the privileges of the unsuspecting user. It may give the appearance of providing normal functionality.

Trust Belief that a system (or person) meets its specifications or otherwise lives up to its expectations.

Trusted computing base (TCB) The portion of a system that enforces a particular policy. The TCB must be nontamperable and noncircumventable. Under the TCSEC, it must also be small enough to be systematically analyzable.

Trusted guard A computer system that acts as a guard and that is trusted to enforce a particular guard policy, such as ensuring the flow of only unclassified data from a classified system or ensuring no reverse flow of pest programs from an untrusted system to a trusted system.

Trusted system A system believed to enforce a given set of attributes to a stated degree of assurance (confidence).

Trustworthiness Assurance that a system or person deserves to be trusted.

Unix A family of extremely popular multiprogramming operating systems, originally developed by Ken Thompson and Dennis Ritchie at Bell Laboratories, beginning in 1969. See [146].

Vaccine A program that attempts to detect and disable viruses.

Virus A program that attaches itself to other programs and has the ability to replicate. In personal computers, viruses are generally Trojan horse programs that are replicated by inadvertent human action. In general computer usage, viruses are self-replicating Trojan horses.

Vulnerability A weakness in a system that can be exploited to violate the system’s intended behavior. There may be security, integrity, availability, and other types of vulnerabilities. The act of exploiting a vulnerability represents a threat, which has an associated risk of exploitation.

Worm A program that distributes itself in multiple copies within a system or across a distributed system. A worm may be beneficial or harmful.

Worm attack The harmful exploitation of a worm that may act beyond normally expected behavior, perhaps exploiting security vulnerabilities or causing denials of service.

Yellow Book Familiar name for a document providing guidance for applying the TCSEC to specific environments [102].
