CHAPTER 1:
AN ABRIDGED HISTORY OF INFORMATION TECHNOLOGY AND INFORMATION SYSTEMS SECURITY

Security can be achieved only through constant change, through discarding old ideas outliving their usefulness, and adapting others to current facts.

William O. Douglas, US Supreme Court Justice (1898-1980)

In this chapter:

An abridged history of information technology
Information systems and information systems security – merging concerns

Information security8 itself is not a new concept – decision makers have taken steps to protect critical information since the emergence of governments and supporting infrastructures. New technologies, however, have forever changed the way information is developed, stored, published, and shared. In order to help you to understand the value of information today and the role of information systems security authorization in its protection, we need to take a walk back in time and engage in a short retrospective of information, information systems, and information systems security.

From physical to virtual – a highly abridged history of information technology

Until the relatively recent emergence of information systems (otherwise known as computers), information9 was held largely in physical form – as documents, manuscripts, and books – starting with etchings in stone and ending with mass-printed materials. So, here’s a very abridged history of the evolution of information management from the physical to the virtual.

At some point around 20,000 years ago, mankind invented the means to record data, and thereby information, in pictures or symbols. The use of these pictures and symbols, later alphabets and words, gradually evolved to create powerful information content, which now required decisions on how to store the information and who should be allowed access to it.

Over time, the storage of information evolved into paper form, which then led to the emergence of mass printing, in turn rendering the information increasingly easy to reproduce and distribute. New methods for storing different forms of information evolved, such as recordings and film.

8 Information security is defined by Carnegie Mellon University Software Engineering Institute as “the concepts, techniques, technical measures, and administrative measures used to protect information assets from deliberate or inadvertent unauthorized acquisition, damage, disclosure, manipulation, modification, loss, or use.” [See presentation with the definition by McDaniel 94]

9 There have been many attempts to define information over the generations. One of the primary constructs is that information consists of multiple elements of data, each of which may not be useful on its own. Over time, these individual points of data evolve into more sophisticated content.

From early on, the developers and distributors of information recognized that the information itself possessed value and, as such, deserved protection. Protection was achieved almost exclusively through physical means, such as fences, guards, secure containers, and access control to buildings.

In the last few decades, and certainly within the memory of most of us today, information and its production, storage, and sharing have undergone radical change brought about by the development of the computer. Many claim that the first computer was the abacus, developed thousands of years ago in China largely as a device for counting money.

For over a thousand years after this first computing device, little progress was made in designing an automated means to count and solve number-related problems. The next advance came from Blaise Pascal, who designed the first mechanical adding machine in 1642. His device proved so successful that history also records the first wave of “technophobia”10 among mathematicians, who feared the device would render them unnecessary.

Charles Babbage drove the next leap forward in computing technology, beginning work on his “difference engine”11 in 1822. He followed this in 1833 with the “analytical engine,” a design for a general-purpose decimal computer using instructions stored on punched cards. Although never completed in his lifetime, the design anticipated virtually every aspect of computing as we know it today.

10 Merriam Webster’s Online Dictionary defines technophobia as “fear or dislike of advanced technology or complex devices and especially computers.” (http://www.merriam-webster.com/dictionary/technophobia)

11 The difference engine was a fully automatic, steam-powered device commanded by a fixed instruction program. (http://www.computerhistory.org/babbage/)

Punch-card computing machines remained the mainstay until the mid-1900s. These progressed through the Harvard Mark I12 to the breakthrough ENIAC13 machine. In the early 1950s, the ENIAC gave way to the EDVAC, one of the first computers to use binary rather than decimal arithmetic. 1958 saw the breakthrough invention of the integrated circuit at Texas Instruments, opening the way to replace inefficient vacuum tubes and to begin the process of “miniaturizing” the computer.

The first microprocessor was released in 1971 by Intel. This was the turning point, after which the path to today’s computing environment became irreversible.

Computers had been almost exclusively the province of the military, universities, and very large corporations, simply because they were extremely expensive and complex to maintain. In 1975, the cover of Popular Electronics featured a story on the world’s first minicomputer kit to rival commercial models – the Altair 8800 – produced by a company called Micro Instrumentation and Telemetry Systems (MITS). The Altair retailed for $397, finally making computing affordable for the masses, including a small but growing hacker community.

Since the Altair hit the market, there has been a veritable explosion of mass market computing devices. Today, most individuals have their own desktop computers – each of which has more processing power than the entire suite of computers powering the first NASA excursions into space.

12 The Mark I was constructed by Howard Aiken in combination with engineers from IBM and was the first of a series of computers that were fully automatic and could execute long calculations largely without human intervention.

13 ENIAC stands for Electronic Numerical Integrator and Computer; it was a giant computing machine developed at the University of Pennsylvania. ENIAC was kept in successful use from 1946 to 1955.

Information systems and information systems security – merging concerns

The Internet, personal computers, laptops, and other mobile computing devices are so pervasive in today’s society that it is hard to remember that just 40 years ago they didn’t exist. Considering the sheer volume of security laws and regulations today, it is also difficult to believe that the concepts of information systems security are a relatively recent development.

40 years ago: The Dinosaur Age – the mainframe

The Internet, or the web of interconnected computing devices, came about in the early 1960s through the work of a group of visionary thinkers who saw value in developing a capability to share research and development information.

The Internet Age of the Dinosaur was populated by mainframe computers – large, unwieldy machines occupying entire rooms. Scientists and engineers interacted one-on-one with the systems, continuously rewiring their circuits to perform specific functions and managing the processes through the use of punch cards. Each mainframe was isolated and the only way to communicate between machines was to share punch tape or cards, and later through huge magnetic tapes. Physical protections and access restrictions were the primary means of security during the Dinosaur Age.

30 years ago: The caveman and the wheel – ftp, email, and telnet

In the 1970s, ARPA was renamed DARPA (the Defense Advanced Research Projects Agency), and its network became known as the DARPANET. By now, a large percentage of universities were connected to the network and – of course – they were much more interested in the free sharing of information than in access restriction.

Email, ftp, and telnet commands were standardized, making it significantly easier for non-technical individuals to use the network. It was not simple by today’s standards, but these protocols opened up the network to more people, who used it to communicate and to share files and resources more quickly and easily.

As the use of the network grew, so did the need for security. More attention was paid to security, largely as a result of a report authored for the US DOD by RAND, entitled Security Controls for Computer Systems.14 This paper is regarded as a seminal work in the study of computer security. For the first time, there was an explicit call to shift away from thinking about the protection of computers solely in terms of physical and hardware measures, towards a concept of security expressed in terms of data, users, and infrastructure. The RAND report called for the recognition of the data itself as a commodity, with user credentials needed in order to keep the commodity safe. It also recognized the need for security of specific types of systems, especially those processing critical national security information.
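The shift the report called for, guarding data with user credentials rather than with walls alone, can be sketched in miniature. The following is a toy illustration only, not anything taken from the report itself; every name and level in it is invented:

```python
# A toy model of credential-based access control in the spirit of the
# report's data-centric view of security. All names are illustrative.

LEVELS = {"public": 0, "confidential": 1, "secret": 2}

# Each user holds a credential (a clearance level) ...
users = {"alice": "secret", "bob": "public"}

# ... and each piece of data carries a sensitivity label.
documents = {"budget.txt": "confidential", "readme.txt": "public"}

def may_read(user: str, document: str) -> bool:
    """A user may read a document only if their clearance
    meets or exceeds the document's sensitivity label."""
    clearance = LEVELS[users[user]]
    sensitivity = LEVELS[documents[document]]
    return clearance >= sensitivity

print(may_read("alice", "budget.txt"))  # True: secret >= confidential
print(may_read("bob", "budget.txt"))    # False: public < confidential
```

The point of the sketch is that the decision depends only on attributes of the user and the data, not on where the machine physically sits.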

20 years ago: The automobile meets the road – rise of the personal computer

As the 1970s moved into the 1980s, there was a quantum leap in information systems technology. In the space of just a few years, the trend moved away from mainframe computers and the personal computer exploded onto the scene. By 1977, personal computers were crowding the store shelves. All of this occurred despite Ken Olson’s15 1977 prediction: “There’s no reason anyone would want a computer in their home.”

Individual personal computers (or PCs) were soon integrated into small local area networks, which gradually linked into a labyrinth of networks, each with varying degrees of security (or insecurity). A simple common set of protocols, the internet protocol suite, separated the network from its physical implementation, forming a global inter-network that came to be called the Internet. The Internet spread throughout the world as it became the de facto international standard and global network.
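That separation of the network from its physical implementation survives in the socket interface built on the protocol suite. In the minimal, self-contained sketch below, two endpoints exchange data without knowing anything about the underlying link (here the loopback interface; the message and names are illustrative):

```python
import socket
import threading

def serve(listener, results):
    """Accept one connection and collect everything it sends."""
    conn, _ = listener.accept()
    with conn:
        data = b""
        while chunk := conn.recv(1024):
            data += chunk
        results.append(data)

# The server binds to an address, not to a physical medium;
# port 0 lets the operating system pick a free port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

results = []
t = threading.Thread(target=serve, args=(listener, results))
t.start()

# The client likewise addresses a host and port; whether the bytes
# travel over Ethernet, Wi-Fi, or loopback is invisible to it.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internetwork")

t.join()
listener.close()
print(results[0].decode())  # prints: hello, internetwork
```

Each side sees only addresses and byte streams; the protocol stack hides everything physical, which is precisely what allowed the inter-network to become global.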

As the network expanded, so did attacks on computer systems. Organizations began to invest in preliminary efforts towards a security infrastructure. DOD and the National Computer Security Center collaborated on the Rainbow Series.16 The Rainbow Series documents are still referenced today. The series consists of approximately 37 volumes, each in a different color and each addressing a specific information systems security need. The primary document of the set was known as the Trusted Computer System Evaluation Criteria (5200.28-STD, known as the Orange Book), published in 1985.

15 Kenneth Olson was the founder of the Digital Equipment Corporation (DEC). Although there were many voices within DEC seeking to influence Olson to produce a single-user, desktop style of computer, he was dead-set against the idea. This led ultimately to the demise of the company.

16 The Rainbow Series (also called the Rainbow Books) is a comprehensive series of computer security standards published by the US government during the 1980s and 1990s. They were originally published by the US DOD Computer Security Center, and then later by the National Computer Security Center (NCSC). The term Rainbow Series comes from the fact that each book was a different color. They can be found and downloaded from http://csrc.nist.gov/publications/secpubs/rainbow/.

Using the Rainbow Series standards, US government entities (as well as private firms) now required formal certification17 of computer technology and its security, applying these processes as part of their criteria.

10 years ago: The Autobahn – the information super-highway

Throughout the 1990s and early 2000s, the Internet grew beyond all previous imagination into a massive and largely uncontrolled network.

Throughout the 1990s, wave after wave of enthusiasm about new Internet and information technologies deluged the marketplace. But now the innovation was not limited to finding new ways to employ information systems – it now extended into new attacks and new protection technologies: firewalls, encryption, virtual private networks, intrusion detection, and the public key infrastructure.

Today: The sky is the limit – networking without boundaries!

After the unprecedented growth of the Internet during the previous decade – one that continues unabated – the world is continuing to see new trends in information technology.

17 The same document describes certification as “The technical evaluation of a system’s security features, made as part of and in support of the approval/accreditation process, which establishes the extent to which a particular computer system’s design and implementation meet a set of specified security requirements.”

The growth of the Internet has also spawned a massive on-line marketplace and business environment. Consumers can do almost everything through their computers, from purchasing a home and scheduling a vacation to paying their taxes.

Computing mobility has also untethered us from the office and has allowed us to take advantage of communications from just about anywhere – from the local Starbucks to the waiting rooms at the airport. Traditional concepts of securing the network were based on the ability to protect a boundary through a layered series of assurance devices, such as firewalls, proxy servers, intrusion detection systems, and corporate anti-virus systems. The problem has become how to extend these same protections where there are no defined boundaries.

As a result of these and other trends influenced by the unprecedented level of information gathering, storing, and sharing through the use of information technology, these past 10 years have also seen a dramatic increase in legislation addressing information systems security.

The following chapter will therefore attempt to review the most significant information systems security regulations in today’s certification and accreditation field.

Further reading

De Leeuw, Karl and Bergstra, Jan. The History of Information Security. Elsevier Publishing: August 2007.

Hoyle, Michelle. The History of Computing Science. http://lecture.eingang.org/toc.html.

Khosrow-Pour, Mehdi. Emerging Trends and Challenges in Information Technology Management. IGI Global Publishing, May 2006.

References

Berkus, David. Ten Trends in Technology. A Presentation at the 2005 Harvard Business Conference, Anaheim, California.

Elon University/Pew Internet Project. Imagining the Internet: A History and Forecast. “Imagining the Internet: A Quick Look at the Early History of the Internet.” Elon University/Pew Internet Project. Available at: http://www.elon.edu/predictions.

Hauben, Michael. Behind the Net: The Untold History of the ARPANET. Available at: http://www.dei.isep.ipp.pt/~acc/docs/arpa.html.

Kanellos, Michael. Gordon Moore on 40 Years of His Processor Law. Available at: http://news.cnet.com/Gordon-Moore-on-40-years-of-his-processor-law/2008-1006_3-5657677.html.

Leiner, Barry M; Cerf, Vinton G; Clark, David D; Kahn, Robert E; Kleinrock, Leonard; Lynch, Daniel C; Postel, Jon; Roberts, Larry G; Wolff, Stephen. A Brief History of the Internet. Available at: http://www.isoc.org/internet/history/brief.shtml.

Moor, James H. What is Computer Ethics? Available at: http://www.southernct.edu/organizations/rccs/resources/teaching/teaching_mono/moor/moor_definition.html.
