Yep, that’s NSA as in the National Security Agency, and you don’t have to burn it after reading! This is a great way to get up to speed on code risks. Zoom over to
GUIDANCE FOR ADDRESSING MALICIOUS CODE RISK (you can tell it is serious because it is in all caps).
So act now and grab this publication, paid for by you, the US taxpayer! In this article you are given the definition of Malicious Code!
Here is the vocabulary that is part of the document, and it is a good source of definitions. Students: prior to interviews, you might want to be able to discuss any of these terms, just in case.
Accountability;
The property that ensures that the actions of an entity can be traced uniquely to the entity.
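The property is abstract, but in code it usually shows up as an audit trail. Here is a minimal, hypothetical sketch (the `AuditLog` class and its names are my own illustration, not from the NSA document):

```python
import datetime

class AuditLog:
    """Append-only log tying each action to the entity that performed it."""
    def __init__(self):
        self._entries = []

    def record(self, entity, action):
        # Timestamped entry attributing one action to one entity.
        stamp = datetime.datetime.now(datetime.timezone.utc).isoformat()
        self._entries.append((stamp, entity, action))

    def actions_by(self, entity):
        # Trace recorded actions uniquely back to a single entity.
        return [action for _, who, action in self._entries if who == entity]

log = AuditLog()
log.record("alice", "read /etc/passwd")
log.record("bob", "login")
log.record("alice", "delete report.pdf")
print(log.actions_by("alice"))  # alice's two actions, none of bob's
```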
Adware;
Software whose primary function is generating revenue by advertising targeted at the user of the computer on which the software resides.
Anomaly;
Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents.
Asset;
Anything that has value (e.g., data, an executing process) to a stakeholder (e.g., an organization).
Assurance;
Grounds for confidence that an entity meets its security objectives.
Availability;
(1) Timely, reliable access to data and information services for authorized users. (2) Property of data or software that ensures that it is operational at or above the minimum level of performance and accessible by all of its intended users.
Back Door;
Surreptitious mechanism used to circumvent security controls and provide access. Synonymous with trap door.
Buffer Overflow;
An action in which more data is placed into a buffer or data-holding area than the capacity that has been allocated by the system. Synonymous with buffer overrun.
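Python bounds-checks for you, but the definition is easy to see in a toy fixed-capacity buffer (my own sketch, not from the document) that performs exactly the check whose absence causes overflows in languages like C:

```python
class FixedBuffer:
    """Toy fixed-capacity buffer: rejects writes larger than its
    allocation, the check whose absence lets C's strcpy overrun."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = bytearray(capacity)

    def write(self, payload: bytes):
        if len(payload) > self.capacity:
            # Without this check, the extra bytes would land in
            # whatever memory sits past the allocation.
            raise ValueError("write exceeds allocated capacity (would overflow)")
        self.data[:len(payload)] = payload

buf = FixedBuffer(8)
buf.write(b"hello")          # fits: 5 <= 8
try:
    buf.write(b"A" * 16)     # 16 > 8: rejected instead of overrunning
except ValueError as e:
    print("blocked:", e)
```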
Commercial Off the Shelf (COTS);
Software or hardware products developed and distributed by commercial suppliers. COTS
software products are ready-made and made available for sale, lease, or license (usually for a
fee) to the general public.
Confidentiality;
Assurance that information is not disclosed to unauthorized individuals, processes, or devices.
Correctness;
(1) The degree to which software is free from faults in its specification, design, and implementation. (2) The degree to which software, documentation, or other items meet specified requirements. (3) The degree to which software, documentation, or other items meet user needs and expectations, whether those needs/expectations are specified or not.
Custom-Developed Software;
Newly developed software, most often for use in a specific system or application, where the government has control over its development. Contrast with “pre-existing software”.
Denial of Service;
Prevention of authorized access to a system resource or the delaying of system operations and functions.
Dependability;
Integrating concept that encompasses the following attributes: reliability, safety, maintainability, integrity, and availability. When addressing security, additional attributes have great prominence: confidentiality and accountability.
Error;
The difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
Event;
An occurrence of some specific data, situation, or activity.
Fail Safe;
Pertaining to a system or component that automatically places itself in a safe operating mode in the event of a failure. See also fault secure and fault tolerance.
Fail Secure;
Pertaining to a system or component that automatically places itself in a secure operating mode in the event of a failure.
Failure;
The inability of a system or component to perform its required functions within specified performance requirements.
Fault;
The adjudged or hypothesized cause of an error.
Government Off the Shelf (GOTS);
Software and hardware products that are developed by the technical staff of the government agency for which they are created, or by an external entity with funding and specifications from the agency. Agencies can directly control all aspects of GOTS products.
High-Consequence Software;
High-consequence software systems are those in which a failure could result in serious harm to a human being in the form of loss of life, physical injury or damage to health, loss of political freedom, loss of financial well-being, or disastrous damage to the human’s environment. Software systems that support a very large number of users are also considered high-consequence, not only because they are difficult to recover after a failure, but also because it would be extremely difficult and/or expensive to make reparations to the affected humans for the damages that would result from such a failure. Examples of high-consequence software systems include the software elements of national security systems, medical control systems, banking systems, Supervisory Control And Data Acquisition (SCADA) systems for critical infrastructures, and electronic voting systems.
Ilities;
Aspects or non-functional requirements of a system. They are so named because most of them end in "-ility." A subset of them (Reliability, Availability, Serviceability, Usability, and Installability) is together referred to as "RASUI".
Information Assurance;
Protection and defense of information and information systems by ensuring their availability, integrity, authentication, confidentiality, and non-repudiation. These measures include providing for restoration of information systems by incorporating protection, detection, and reaction capabilities.
Integrity;
(1) Property of data or software that assures that it has not been altered or destroyed in an unauthorized manner. (2) Quality of an information system reflecting the logical correctness and reliability of the operating system; the logical completeness of the hardware and software implementing the protection mechanisms; and the consistency of the data structures and occurrence of the stored data. Note that, in a formal security mode, integrity is interpreted more narrowly to mean protection against unauthorized modification or destruction of information.
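The first sense ("not altered in an unauthorized manner") is commonly checked in practice with a cryptographic digest. A minimal sketch (the data and names here are invented for illustration):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 digest: any change to the data changes the fingerprint.
    return hashlib.sha256(data).hexdigest()

original = b"quarterly-report-v1"
stored_digest = fingerprint(original)  # recorded at a trusted point in time

# Later: verify the data has not been altered.
assert fingerprint(b"quarterly-report-v1") == stored_digest  # intact
assert fingerprint(b"quarterly-report-v2") != stored_digest  # tampered
```

Note that a bare hash only detects accidental or naive tampering; detecting a capable adversary who can also replace the stored digest requires a keyed construction such as an HMAC or a digital signature.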
Justification;
The actions, arguments, and evidence that collectively provide a basis for justified reduction in uncertainty.
Least Privilege;
Principle requiring that each subject be granted the most restrictive set of privileges needed for the performance of that subject’s authorized tasks. Application of this principle limits the damage that can result from accident, error, or unauthorized use of a component or system.
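The principle can be sketched as a default-deny privilege table where each subject gets only what its task requires (the subjects and operations below are hypothetical examples, not from the document):

```python
# Default-deny privilege table: each subject holds only the
# privileges its authorized task needs, nothing more.
PRIVILEGES = {
    "backup-job": {"read"},                     # backups never need write/delete
    "admin":      {"read", "write", "delete"},
}

def allowed(subject: str, operation: str) -> bool:
    # Unknown subjects have no privileges at all (default deny).
    return operation in PRIVILEGES.get(subject, set())

assert allowed("backup-job", "read")
assert not allowed("backup-job", "delete")  # limits damage if the job is compromised
assert not allowed("stranger", "read")
```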
Logic Bomb;
(1) Malicious software that will adversely affect systems under certain conditions, such as at a certain time or upon receipt of a certain packet. (2) Resident computer program triggering an unauthorized act when particular states of an IS are realized.
Malicious Code, or Malware;
Software or firmware intended to perform an unauthorized process that will have an adverse impact on the confidentiality, integrity, availability, or accountability of an information system. Also known as malicious software.
Modified Off the Shelf (MOTS);
Software (either modified or modifiable off-the-shelf, depending on the context) whose code has been modified to meet the needs of a particular customer.
*OTS (Off the Shelf);
Existing software that is potentially available. Includes COTS, MOTS, and GOTS.
Penetration Testing;
Security testing in which evaluators attempt to violate security properties of a system.
Pharming;
A method of redirecting Internet traffic to a fake web site through domain spoofing.
Phishing;
Tricking individuals into disclosing sensitive personal information through the use of e-mails that appear to originate from a trusted source.
Pre-Existing Software;
An existing software component or software product that has been obtained for use rather than custom-developed (i.e., built “from scratch”) for the system in which it will be used. Pre-existing software could be used as a “stand alone” application or service or could be integrated into a larger “pre-existing” or custom-developed system. Pre-existing software may be “off-the-shelf” (e.g., commercial off-the-shelf (COTS), government off-the-shelf (GOTS), modified off-the-shelf (MOTS), or any other variation of *OTS), “legacy”, freeware, open source, or shareware.
Protection Profile;
An implementation-independent set of security requirements for a category of IT products or systems that meet specific consumer needs.
Reliability;
The ability of a system or component to perform its required functions correctly and predictably under stated conditions for a specified period of time.
Risk;
The potential that a given threat will exploit vulnerabilities of an asset or group of assets and thereby cause harm to the organization. It is measured in terms of a combination of the probability of an event and its consequence. [Source: ISO/IEC 13335-1:2005 Information technology—Security techniques—Management of information and communications technology security—Part 1: Concepts and models for information and communications technology security management] Combination of the probability of an event and its consequence. [Source: ISO/IEC Guide 73:2002 Risk management. Vocabulary. Guidelines for use in standards]
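The "combination of probability and consequence" framing is often computed, in its simplest form, as a product. A toy sketch (the function name and figures are illustrative, not from either standard):

```python
def risk_score(probability: float, consequence: float) -> float:
    """Simplest combination of the probability of an event and its
    consequence: their product."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    return probability * consequence

# Rare-but-severe and frequent-but-minor events can carry equal risk:
assert risk_score(0.25, 40.0) == risk_score(0.5, 20.0) == 10.0
```

Real risk frameworks usually replace the bare product with ordinal likelihood/impact matrices, but the underlying combination is the same.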
Robustness;
The degree to which a component or system can function correctly in the presence of invalid inputs or stressful environmental conditions, including those that are intentionally and maliciously created.
Rootkit;
A set of tools designed to conceal an attacker and offer a backdoor after the attacker has compromised the machine. [Source: Hoglund, Greg, and Gary McGraw. Exploiting Software: How to Break Code. Addison-Wesley, 2004]
Security Critical Software;
Software whose failure could have an impact on a system’s security. See high-consequence software.
Secure Software;
Software in which there is a high (though not absolute) level of justifiable confidence in the presence of a substantial set of explicit security properties and functionality, including all those required for its intended usage. For software to be secure, it must avoid defects in its implementation that introduce vulnerabilities, regardless of whether the majority of development involves from-scratch coding or integration/assembly of acquired or reused software components.
Security;
All aspects related to defining, achieving, and maintaining confidentiality, integrity, availability, non-repudiation, accountability, and authenticity.
Security-Enforcing Software;
A portion of software that (based on system architecture) is responsible for enforcing the system security policy.
Security-Relevant Software;
A portion of software that (based on system architecture) is not itself responsible for enforcing the system security policy, but is in a position to subvert the enforcement of it.
Software Acquisition;
To obtain software development services or software products, whether by contract or by other means (e.g., downloading open source software from the Internet).
Software Assurance;
The level of confidence that software is free of vulnerabilities, either intentionally or unintentionally designed or inserted during the software’s development and/or the entire software life cycle. (There are a number of other definitions of software assurance.)
Spyware;
Programs that observe and report on users; any technology that aids in gathering information about a person or organization without their knowledge.
Standard;
An agreement among any number of organizations that defines certain characteristics, specifications, or parameters related to a particular aspect of computer technology.
Subversion;
Changing a process or product so as to provide a means to compromise security.
Target of Evaluation;
An IT product or system and its associated guidance documentation that is the subject of an evaluation.
Testing;
Testing is an activity performed to evaluate product quality, and to improve it by identifying defects and problems. It is the verification of the behavior of a program on a finite set of test cases, suitably selected from the usually infinite execution domain, against the expected behavior. The five most prevalent types of software/system testing are: Penetration, Interoperability, Acceptance, Vulnerability, and Functionality.
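The "finite set of test cases, suitably selected from an infinite domain" idea is easy to show concretely. A hypothetical example (the `clamp` function is mine, chosen only because its input domain is obviously infinite):

```python
def clamp(value, low, high):
    """Keep value within the inclusive range [low, high]."""
    return max(low, min(value, high))

# A finite, suitably selected set of cases: a typical value,
# both boundaries, and out-of-range inputs on each side.
cases = [
    ((5, 0, 10), 5),    # typical
    ((0, 0, 10), 0),    # lower boundary
    ((10, 0, 10), 10),  # upper boundary
    ((-3, 0, 10), 0),   # below range
    ((99, 0, 10), 10),  # above range
]
for args, expected in cases:
    assert clamp(*args) == expected
print("all", len(cases), "cases passed")
```

Boundary and out-of-range cases are chosen deliberately because that is where defects cluster; exhaustive testing of the infinite domain is impossible.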
Threat;
(1) A potential cause of an incident that may result in harm to a system or organization. (2) Any circumstance or event with the potential to adversely impact an Information System through unauthorized access, destruction, disclosure, modification of data, and/or denial of service.
Trojan Horse;
Malicious program that masquerades as a benign application.
Trust;
A relationship between two elements, a set of activities, and a security policy in which element x trusts element y if and only if x has confidence that y will behave in a well-defined way (with respect to the activities) that does not violate the given security policy.
Trustworthy;
An entity is considered trustworthy only when there is sufficient credible evidence leading one to believe that the entity will satisfy a set of given requirements.
Virus;
Self-replicating, malicious code that attaches itself to an application program or other executable system component and leaves no obvious signs of its presence.
Vulnerability;
A weakness in an asset or group of assets. An asset’s weakness could allow it to be exploited and harmed by one or more threats. [Source: ISO/IEC 13335-1:2004-11-15 (earlier draft of 13335-1)]
Watermarking;
Process to embed information into software in a manner that makes it hard for an adversary to remove without damaging the software’s functionality. Commonly referred to as “digital watermarking” or “DWM”.
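Software watermarking proper uses program transformations, but the embed-without-damaging idea can be sketched with the classic least-significant-bit trick on a byte payload (a toy of my own devising, not a technique from the document):

```python
def embed(carrier: bytearray, mark: str) -> bytearray:
    """Hide mark's bits in the least-significant bit of each carrier byte.
    Only the LSBs change, so the carrier's values are barely perturbed."""
    bits = [(ord(c) >> i) & 1 for c in mark for i in range(8)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for the watermark")
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit
    return out

def extract(carrier: bytes, length: int) -> str:
    """Recover a length-character watermark from the carrier's LSBs."""
    chars = []
    for c in range(length):
        byte = sum((carrier[c * 8 + i] & 1) << i for i in range(8))
        chars.append(chr(byte))
    return "".join(chars)

carrier = bytearray(range(200, 248))      # 48 bytes of "content"
marked = embed(carrier, "NSA")
assert extract(marked, 3) == "NSA"
# Each byte changed by at most 1: functionality (the values) survives.
assert all(abs(a - b) <= 1 for a, b in zip(carrier, marked))
```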
Weakness;
A weakness is an inadequacy in a portion of a computer system’s hardware or software that makes that system susceptible to subversion, theft of information, or sabotage by an attacker. A weakness could be the result of an intentional or inadvertent flaw in the design, an error in the implementation, or an inadequacy in another aspect of the software life cycle process. If the weakness exists in one of the technological components of the system (e.g., an algorithm, a sequence of code, a configuration setting) and is exploitable by an attacker, the weakness is termed a vulnerability. However, not all weaknesses are vulnerabilities. Some weaknesses originate from inadequacies in non-technical aspects of the system, such as the system’s requirements specification, security policy, or administrative or operating procedures.
Worm;
A computer program that can run independently, can propagate a complete working version of itself onto other hosts on a network, and may consume computer resources destructively.