CHAPTER 13:
THE FEDERAL INFORMATION SECURITY MANAGEMENT ACT (FISMA)

That depends on how an agency goes about doing its work. FISMA has put together a framework, but if [an agency] does it just for compliance, then it’s purely a paperwork exercise.108

Karen Evans, Office of Management and Budget

In this chapter:

The e-Government Act of 2002
FISMA report card
What FISMA is NOT – FISMA misunderstood
FISMA and its achievements
10 questions for FISMA compliance

108 Gautham Nagesh, “Feds Losing War on Information Security,” GovernmentExecutive.com, 13 March 2008.

We can truly say that an “A” on the FISMA scorecard does not always mean you are a more secure agency – but it is a start. When we started in C&A in the civilian federal agencies in 2002, it seemed to be an endless labor of developing security documentation for systems that could never meet the requirements. But that did not seem to matter – the systems were accredited anyway.

By “accepting the risk,” DAAs (designated approving authorities) or authorizing officials (often agency CIOs) were getting closer to a higher grade in security without doing more than producing more documentation about a system. Coming from the security environment in the Intelligence Community (IC) and DOD, we were very surprised to learn how most federal agency CIOs looked at security and FISMA (or simply did not look at them at all). To them, FISMA was only a speed bump and an unnecessary expense on the way to deploying their mission systems. To add to the problem, security was often not adequately funded, and information security was simply not part of most agencies’ cultures.

Working with many federal agency CIOs, we started to understand why information security was considered a pain and not a positive factor in their agencies. The US Federal Government told them to plan and implement security programs, but did not provide funding or training.

Consequently, the agencies had to respond to new requirements (many of which they did not fully understand) with no increase in the funding to do so, but also still had to execute their day-to-day missions. On top of that, information system attacks, although temporarily embarrassing for lower-level officials and sometimes inconvenient to operations, did not get published or reported anywhere else – so with time, they were simply forgotten.

How the world has changed! Many agencies now have mature and funded (some well funded – some less so) information system security programs. Information systems security is becoming a function of the agencies’ system life cycle (SLC). Like all things in the government, this took some time.

The e-Government Act of 2002 and FISMA

The E-Government Act was signed into law in 2002, and we are still changing and maturing security and information assurance to this day. Title III of the E-Government Act, known as the Federal Information Security Management Act (FISMA), is where the United States began to show that it was really serious about information systems security.

The E-Government Act of 2002 (P.L. 107-347) recognized the importance of information security to the economic and national security interests of the United States. Title III of the E-Government Act (FISMA), states that effective information security programs include:

Periodic assessments of risk, including the likelihood and magnitude of harm that could result from the unauthorized access, use, disclosure, disruption, modification, or destruction of information and information systems that support the operations/assets of the organization.

Policies and procedures that are based on risk assessments, that cost effectively reduce information security risks to an acceptable level, and address information security throughout the life cycle of information systems.

Plans for providing adequate information security for networks, facilities, information systems, or groups of information systems, as appropriate.

Security awareness training to inform personnel (including contractors and other users of information systems that support the operations and assets of the organization) of the information security risks associated with their activities and their responsibilities in complying with organizational policies and procedures designed to reduce these risks.

Periodic testing and evaluation of the effectiveness of information security policies, procedures, practices, and security controls to be performed with a frequency depending on risk, but no less than annually.

A process for planning, implementing, evaluating, and documenting remedial actions to address any deficiencies in the information security policies, procedures, and practices of the organization.

Procedures for detecting, reporting, and responding to security incidents.

Plans and procedures for continuity of operations for information systems that support the operations and assets of the organization.

FISMA, the Paperwork Reduction Act of 1995, and the Information Technology Management Reform Act of 1996, explicitly emphasize the employment of a risk-based policy for cost-effective security. In support of this legislation, the Office of Management and Budget (OMB) through Circular A-130, Appendix III, Security of Federal Automated Information Resources, requires executive agencies within the federal government to:

plan for security;

ensure that appropriate officials are assigned security responsibility;

review the security controls in their information systems; and

authorize system processing prior to operations and periodically thereafter.

The FISMA report card

OMB has been criticized for creating a paperwork exercise in its implementation of FISMA. One basis for this critique is the use of the FISMA report card to grade federal agencies on certain information systems security elements considered important by OMB.

Many consider that the emphasis on the “good grade” takes away from the real focus on true security and forces agencies to spend their limited IT security budgets on creating this report. The basic criticisms are:

Security is complex and does not lend itself to an elementary grading scale. The grading committee is often roundly criticized for applying the same standards to all agencies. Grades are based on a possible 100 points, awarded according to compliance percentages in seven categories. However, the points are the same whether the agency has 100 information systems or 10,000. Nor does the grading consider whether an agency has an internal policy of addressing mission-critical systems first. In other words, a point is a point – and if you lose more than 40 of them, you fail.

The report card does not really measure security; it measures compliance with the Federal Information Security Management Act itself. FISMA may be an important tool in determining critical elements in an information system security program. But FISMA compliance does not always equate to good security, and poor compliance reporting does not always mean bad security.
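The point arithmetic behind the report card can be sketched as follows. This is an illustrative model only: the seven category names and the equal weighting are assumptions for the example, not OMB’s actual scoring methodology (which changed from year to year).

```python
# Hypothetical sketch of the FISMA report card arithmetic: 100 possible
# points spread across seven categories, with a failing grade if more
# than 40 points are lost. Category names and equal weights are invented
# for illustration.

CATEGORIES = [
    "inventory", "c_and_a", "controls_testing", "contingency_testing",
    "training", "incident_reporting", "configuration_management",
]

def fisma_score(compliance: dict) -> float:
    """Award up to 100 points from per-category compliance percentages (0-100)."""
    weight = 100 / len(CATEGORIES)          # equal weighting assumed
    return sum(compliance[c] / 100 * weight for c in CATEGORIES)

def letter_grade(points: float) -> str:
    """A point is a point: lose more than 40 of them and you fail."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if points >= cutoff:
            return grade
    return "F"

agency = {c: 85 for c in CATEGORIES}        # 85% compliant in every category
print(letter_grade(fisma_score(agency)))    # -> B
```

Note that the model captures the criticism above: the score is blind to how many systems sit behind each percentage and to which systems the agency addressed first.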

The FISMA report requirements

Each year – usually sometime in the summer of the reporting year – OMB issues its FISMA reporting guidance. This occurs in the form of a memorandum addressed to the heads of executive departments and federal agencies. You should always follow the most recent guidance, as OMB may change the reporting requirements from year to year. For example, 2008 was the first year federal agencies had to report on their privacy practices for the protection of personal information.

The FISMA report card grades on several elements, organized in sections, some of which are directly related to the information system authorization process. In 2008,109 federal agency CIOs were required to answer the following questions:

109 An example of the 2008 reporting template is included on the CD.

Table 41: FISMA reporting questions

Question | Reporting requirement
1 | FISMA systems inventory
2 | Certification and accreditation, security control testing, and contingency plan testing
3 | Implementation of security controls in NIST SP 800-53
4 | Incident detection, monitoring and response capabilities
5 | Security awareness training
6 | Peer-to-peer file sharing
7 | Configuration management
8 | Incident reporting
9 | New technologies and emerging threats
10 | Performance metrics for security policies and procedures

Let’s look at each of these reporting requirements from 2008 in more detail.

FISMA systems inventory

To respond to this question, your agency’s CIO must report on the number of both federal agency and contractor information systems by component/bureau and FIPS 199 security impact categorization (high, moderate, low). Agency systems include information systems used or operated by the agency; contractor information systems include those used or operated by a contractor to the agency or by another organization on behalf of the agency.

One important note: OMB holds the federal agencies responsible for the security of contractor information systems. As a result, self-reporting by a contractor does not meet the requirements of the law.
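The inventory tally described above amounts to counting systems by bureau, operator, and FIPS 199 impact level. A minimal sketch, with invented sample data (the bureau names and systems are purely illustrative):

```python
# Hedged sketch of the FISMA inventory tally: counting agency and
# contractor information systems by component/bureau and FIPS 199
# security impact categorization. All data below is invented.
from collections import Counter

systems = [
    # (bureau, operator, FIPS 199 impact)
    ("Bureau A", "agency",     "moderate"),
    ("Bureau A", "contractor", "high"),
    ("Bureau B", "agency",     "low"),
    ("Bureau B", "agency",     "moderate"),
]

inventory = Counter(systems)
for (bureau, operator, impact), count in sorted(inventory.items()):
    print(f"{bureau} | {operator} | {impact}: {count}")
```

Note that contractor systems are counted alongside agency systems: as the text explains, the agency, not the contractor, answers for them.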

Certification and accreditation, security controls testing, and contingency plan testing

For each category of information system listed in the inventory, federal agencies must provide the total number of systems that have a current authorization to operate (ATO), whose security controls have been tested and reviewed within the past reporting year, and which have a contingency plan and have actually tested it.

In addition to the total number of information systems that have met the requirements, the federal agency is required to indicate what percentage of all information systems that number represents. For example, suppose an agency reports a total of 100 agency and contractor information systems, of which 90 have a current certification and accreditation (i.e. authorization to operate). The agency would report the total of 90 systems as meeting the requirement and indicate that this represents 90% of all the information systems reported. The same would apply for annual testing of security controls and contingency plans.
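The count-plus-percentage reporting arithmetic can be sketched as follows (the field names are illustrative, not OMB’s actual template fields):

```python
# Minimal sketch of the reporting arithmetic above: for each requirement
# (current ATO, annual controls testing, contingency plan testing), report
# both the count of compliant systems and the percentage of the total
# inventory that count represents. Field names are invented.

def compliance_summary(total_systems: int, compliant: int) -> dict:
    if not 0 <= compliant <= total_systems:
        raise ValueError("compliant count must be between 0 and the total")
    return {
        "systems_meeting_requirement": compliant,
        "percent_of_inventory": round(compliant / total_systems * 100, 1),
    }

# The worked example from the text: 90 of 100 systems have a current ATO.
print(compliance_summary(100, 90))
# -> {'systems_meeting_requirement': 90, 'percent_of_inventory': 90.0}
```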

Implementation of NIST SP 800-53 security controls

In response to this question, the CIO has to provide a YES/NO response. In addition, the federal agency must describe its process for annual security controls testing and continuous monitoring.

NOTE: DOD does not use the NIST SP 800-53 security controls baseline, so the FISMA report for DOD would indicate compliance with the DODI 8500.2 IA control baseline as an alternative.

Incident detection, monitoring, and response

Federal agencies are required to implement a process for incident detection, monitoring and response. This requirement can be met with either an internal capability, or for smaller agencies, by an agreement with another federal incident response team.

In addition to the YES/NO response to the primary question, the agency must also indicate which tools are used to conduct incident detection, monitoring, and response. The number of information systems protected by the incident detection, monitoring, and response capabilities must also be provided.

Security awareness training

Each federal agency is responsible for implementing a security awareness training program that meets the requirements of NIST SP 800-50, Building an Information Technology Security Awareness and Training Program, and NIST SP 800-16, Information Technology Security Training Requirements: A Role- and Performance-Based Model. The 2008 FISMA report required the agency to provide the total number of employees trained in accordance with the established program – both government and contractor. The total cost for providing this training is also part of the report.

NOTE: DOD has issued DOD 8570.1-M, Information Assurance Workforce Improvement Program, which prescribes the DOD standard for information assurance training. As an alternative to reporting compliance in accordance with the NIST special publications, DOD reports the status of compliance with the standards in the DOD 8570.1-M.

Peer-to-peer file sharing

Here, the agency is required to answer YES or NO to the question of whether the use of collaborative technologies and peer-to-peer file sharing are addressed in the annual security awareness training, ethics training, or in some other agency-wide training program.

Configuration management

In addition to the simple YES/NO response to the question of whether the agency has implemented configuration management processes, this question requires the agency to also provide an estimate of the percentage of systems that employ a common configuration baseline.

The federal government has also developed a Federal Desktop Core Configuration (FDCC) standard, and agencies are required to indicate the percentage of information systems that are compliant with the FDCC.

Incident reporting

Here the federal agency responds YES or NO to several questions about its use of documented policies and procedures to report incidents internally, to US-CERT, or to law enforcement.

NOTE: The DOD has established its own robust computer network defense service provider (CNDSP) with a mandated set of policies and procedures.

New technologies and emerging threats

There is no way to stem the emergence and use of new technologies and capabilities. Since this is a recognized fact, federal agencies must be able to implement processes and procedures for reviewing these new technologies for their security implications and for ensuring the secure integration of new capabilities into the existing information infrastructure.

This question requires each federal agency to indicate YES or NO regarding whether it has addressed the emergence of new technologies and threats. If they respond YES, they are further required to provide a brief synopsis of the procedures they have established.

Security performance metrics

Finally, in 2008, federal agencies were required to provide three metrics for evaluating the performance of their policies and procedures. The metrics had to be different from those in the FISMA reporting requirements, but could be derived from NIST SP 800-55, Performance Measurement Guide for Information Security.

2008 was also the first year that OMB required agencies to formally report on how they meet the privacy requirements for the protection of personally identifiable information (PII). According to the January 2008 memorandum from OMB announcing this additional reporting requirement, federal agencies were required to report on the following:

By agency, the number of each type of privacy review conducted during the last fiscal year.

Information about the advice – formal written policies, procedures, guidance, or interpretations of privacy requirements issued by the agency – provided by the senior agency official for privacy during the last fiscal year.

The number of written complaints for each type of privacy issue allegation received by the senior agency official for privacy during the last fiscal year to include: (1) process and procedural issues (consent, collection, and appropriate notice); (2) redress issues (non-Privacy Act inquiries seeking resolution of difficulties or concerns about privacy matters); or (3) operational issues (inquiries regarding Privacy Act matters not including Privacy Act requests for access and/or corrections).

For each type of privacy violation issue received by the senior agency official for privacy during the last fiscal year, the number of complaints the agency referred to another agency with jurisdiction.

FISMA misunderstood – What FISMA is NOT

Every year, OMB announces the annual FISMA grades for federal agencies with a great deal of hoopla and pronouncements of dire security results. In response, the media portrays a grim state of affairs at the federal agencies, often interpreting these grades as the single benchmark for measuring information systems security. But let’s take a closer look at how FISMA and its intentions are often misunderstood by both the federal agencies and the reporting media:

First, FISMA really does not provide a means to accurately measure the security posture of all of the federal agencies. In fact, if you just look at the final grades issued by OMB, most of the US’s largest and perhaps most critical agencies appear to be doing an awful job of securing their information systems. For example, the DOD and the Department of Veterans Affairs (VA) both received very low grades on the 8th FISMA Report Card on Information Security issued by OMB in May 2008, as indicated in the figure below.

Figure 37: FISMA report card

Source: Adapted from the National Institute of Standards and Technology

However, the truth regarding the security of these agencies is quite complex – and perhaps not as dire as originally thought. For example, the DOD plays a major role in the design and development of many of the policies and processes used worldwide – by the DOD itself, the private sector, and even other nations – to secure information, networks, and information systems. So why has the DOD consistently scored so poorly on eight straight FISMA audits? It is really because the FISMA reports don’t actually measure security – they measure how well an organization can demonstrate its compliance with the annual FISMA reporting requirements.

Next, the FISMA reporting process does not align well with measuring systems security progress over time. If you look only at the scores over the past eight years, you would think that the US government hasn’t made much progress in its pursuit of better information security. Taking eight years to raise the overall grade from an F to a C isn’t exactly an indicator of rapid progress! However, as we noted earlier, OMB may change the reporting criteria and/or the scoring methodology every year.

Also, the FISMA grade is a composite of both the agency’s self-reporting and the results of an inspector general (IG) audit. The office of the IG assigned to each agency reviews the agency data and provides an analysis of the FISMA results. However, the skill and personnel resources of the agency IG offices can vary widely.

For example, a large agency may have in excess of 100,000 systems, but may have an IG staff of less than 10. This means that the IG might only be able to test a percentage of the agency’s systems. On the other hand, a federal agency with a larger IG staff might be able to actually investigate more than 90% of the agency’s systems – perhaps with a more incriminating result. Additionally, the FISMA requirements are not always specific, perhaps leading the IGs to interpret them differently.

Finally, the scores from the IG’s FISMA audit might even be impacted by the CISO’s working relationship with the agency’s IG and the ability to agree on the reporting requirements.

FISMA and its achievements

Despite much of the wailing about FISMA and its burdensome reporting requirements, FISMA has resulted in several tangible information system security benefits. First, it has had a good degree of success in motivating government executives to pay attention to information systems security – if only because it has been linked to their budgets!

There has been an increase in security awareness by both the members of Congress and the members of the individual federal agencies. Prior to FISMA, information systems security was often a neglected requirement – well, certainly not always a top priority.

Lessons learned from the failures in the FISMA reporting process have led to the recognition that it must and can be made better. Optimal security would really require federal agencies to monitor their information systems, conduct penetration testing and forensic analysis, and mitigate vulnerabilities in a timely manner. FISMA further reinforces the need for federal agencies to develop and implement robust security plans and policies.

In a statement before the House Oversight and Government Reform Committee in February 2008, Tim Bennett, President of the Cybersecurity Industry Alliance made the following recommendations as a result of FISMA lessons learned:

Provide CIOs and chief information security officers (CISOs) with authority over information systems security. This would include both the power to ensure and to enforce.

Give the CIOs and CISOs the staff and funding to accomplish the mission of information systems security.

Hold management accountable for meeting information systems security requirements.

Develop a comprehensive approach to information systems security, to include assessment, continuous monitoring, and speedy remediation.

Mandate more effective and accurate inventories of the information systems supporting federal agencies, to include both government and contractor.

Implement useful performance metrics for information systems security.

Support efforts to institutionalize a culture of security in federal agencies.

Increase information systems security funding at the federal level.

Seek better efficiencies in the federal procurement process.

Improve the protection of privacy information and require accountability for measures to protect personally identifiable information.

10 critical questions for FISMA compliance

If your agency can respond YES to the following list of the 10 most critical questions, you have a reasonable indication of a viable information system security program – and a good likelihood of earning a passing FISMA grade. So, here they are:

1. Do you have a complete and current inventory of your information systems? Can you produce the required evidence?

2. Have you established an ongoing configuration management process? Do you have the associated documentation?

3. Do you have a consistent, executive level process for assessing, analyzing, and managing risk?

4. Have you developed IT contingency plans and have you tested these at least annually? Can you produce the evidence?

5. Do your information systems have current authorizations to operate? Do you have the associated documentation?

6. Are your security controls reviewed and/or tested at least annually? Do you have evidence of compliance?

7. Have you prepared a POA&M for any security control weaknesses and is your POA&M updated as required, or at least quarterly?

8. Does your agency maintain and document a formal security education training and awareness program?

9. Have you conducted a privacy impact assessment and do you have the associated documentation?

10. Do you have an incident response plan? Do you have an internal agency computer incident response capability or have you coordinated with another agency to provide that support?
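The ten questions above can be treated as a simple all-or-nothing self-assessment. A minimal sketch, where the question keys are shortened paraphrases of the text:

```python
# Illustrative self-assessment for the ten compliance questions above.
# The keys are shortened paraphrases; an agency answering True to all
# ten has a reasonable indication of a viable security program.

QUESTIONS = [
    "current system inventory, with evidence",
    "ongoing configuration management process, documented",
    "executive-level process for assessing and managing risk",
    "IT contingency plans tested at least annually, with evidence",
    "current authorizations to operate, documented",
    "security controls reviewed/tested at least annually, with evidence",
    "POA&M for weaknesses, updated at least quarterly",
    "documented security education, training and awareness program",
    "privacy impact assessment, with documentation",
    "incident response plan and internal or coordinated response capability",
]

def ready_for_fisma(answers: dict) -> bool:
    """True only if every question is answered YES; a missing answer counts as NO."""
    return all(answers.get(q, False) for q in QUESTIONS)

answers = {q: True for q in QUESTIONS}
print(ready_for_fisma(answers))   # -> True
answers[QUESTIONS[6]] = False     # a stale POA&M sinks the assessment
print(ready_for_fisma(answers))   # -> False
```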

The 30,000 foot view of FISMA compliance

Compliance with the requirements of FISMA may appear impossible at first, but if you look at the FISMA mandates from the 30,000-foot view, the picture simplifies somewhat. The figure below provides the bird’s-eye perspective of FISMA compliance.

Figure 38: A FISMA compliance model

Source: Adapted from the National Institute of Standards and Technology

Automated C&A tools can help!

Despite the simplicity of this model, there is still a lot of underlying complexity. Automated tools can help – a lot! So, here is a list of some C&A and FISMA compliance tools, all of which can help automate parts of the process. NOTE: This list is not intended to be all-inclusive, nor does it represent any form of endorsement for these tools.

DIACAP toolset: a free automated tool; information available at http://www.i-assure.com/products/diacap/diacap_scorecard.htm.

Enterprise Mission Assurance Support Service (eMASS): developed under the auspices of the US DOD; information available at http://www.disa.mil/peo-ian/.

SecureInfo Risk Management System (RMS): information available at http://www.secureinfo.com/solutions/certification-accreditation/rms.aspx.

Trusted Agent FISMA (TAF): developed under the auspices of the US Department of Justice; information available at http://www.trustedintegration.com/ti/TAFISMA.html.

Xacta IA Manager: information available at http://www.telos.com/solutions/information%20assurance/.

Further reading

Taylor, Laura and Shepherd, Matthew. FISMA Certification & Accreditation Handbook. Syngress Publishing, November 2006.

References

Federal Information Security Management Act 2002, Title III, the full text: http://csrc.nist.gov/drivers/documents/FISMA-final.pdf.

National Institute of Standards and Technology (NIST) Special Publication 800-37, Guide for the Security Certification and Accreditation of Federal Information Systems, May 2004.

Office of Management and Budget Memorandum, FY 2008 Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management, 14 Jul 2008.

Prepared Testimony of Tim Bennett, President, Cybersecurity Industry Alliance, to the House Oversight and Government Reform Committee, 14 February 2008. Available at: http://informationpolicy.oversight.house.gov/documents/20080214132119.pdf.
