1.3 Computing as a Tool and a Discipline

In the previous section on the history of computer software, we highlighted the ever-changing role of the user. At the end of the first generation, users were split into two groups: systems programmers, who developed tools to make programming easier, and applications programmers, who used those tools. Later, applications programmers built large domain-specific programs such as statistical packages, word processors, spreadsheets, intelligent browsers, virtual environments, and medical diagnosis applications on top of the traditional language tools. These application programs were, in turn, used by practitioners with no computer background.

So who is using the computer as a tool? Everyone, except for those people who are creating the tools for others. For these toolmakers, either computing is a discipline (low-level tools) or the discipline of computing has made their tools possible (applications built upon applications).

A discipline is defined as a field of study. Peter Denning defines the discipline of computer science as “the body of knowledge and practices used by computing professionals in their work. . . . This discipline is also called computer science and engineering, computing, and informatics.”19 He continues, “The body of knowledge of computing is frequently described as the systematic study of algorithmic processes that describe and transform information: their theory, analysis, design, efficiency, implementation, and application. The fundamental question underlying all of computing is, What can be (efficiently) automated?”

Denning states that each practitioner must be skilled in four areas:

  • Algorithmic thinking, in which one is able to express problems in terms of step-by-step procedures to solve them

  • Representation, in which one is able to store data in a way that it can be processed efficiently

  • Programming, in which one is able to combine algorithmic thinking and representation into computer software

  • Design, in which the software serves a useful purpose
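The first three of these skills can be glimpsed even in a very small program. The sketch below is our own illustration, not from the text (Python is assumed): the list is the chosen representation of the data, the function body expresses a step-by-step procedure for finding the largest value, and the program as a whole combines the two into software.

```python
# Representation: the data is stored in a Python list.
# Algorithmic thinking: a step-by-step procedure for finding the largest value.
def find_largest(values):
    largest = values[0]        # step 1: assume the first value is the largest
    for v in values[1:]:       # step 2: examine each remaining value in turn
        if v > largest:        # step 3: keep whichever value is larger
            largest = v
    return largest             # step 4: report the result

# Programming: the procedure and the representation combined into software.
print(find_largest([12, 7, 31, 5]))   # prints 31
```

Design, the fourth skill, concerns whether the software serves a useful purpose; even in a toy program like this, choosing a clear name and a simple interface is a small act of design.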

A debate has long raged about whether computing is a mathematical discipline, a scientific discipline, or an engineering discipline. The reality is that it has aspects of all three. Computing certainly has strong roots in mathematical logic. The theorems of Turing tell us that certain problems cannot be solved, Boolean algebra describes computer circuits, and numerical analysis plays an important role in scientific computing. Scientific disciplines apply a rigorous, analytical process for testing hypotheses. The natural sciences (such as biology, physics, and chemistry) exist to “fill in the instruction book that God forgot to leave us.”20 But the scientific method can be applied to any organized investigation. Computer scientists test hypotheses related to various aspects of computing. When we design and build larger and more complex computing systems, we use techniques from engineering.

The Big Ideas of Computing

The AP Computer Science Principles curriculum lists seven big ideas that are essential to understanding computing.21 They are summarized in TABLE 1.1.

TABLE 1.1 Big Ideas in Computing

Creativity: Computing has changed the way humans can be creative, including the creation of video, animation, infographics, and audio.

Abstraction: Abstraction is the mechanism used to model the world and facilitate communication among humans and machines.

Data and information: Crucial to computing is the management and interpretation of data and information, leading to the creation of knowledge.

Algorithms: Algorithms allow us to develop and express solutions to problems.

Programming: Of course, computing involves programming, which allows us to implement the solutions to problems.

The Internet: Not only does the Internet provide a mechanism for communicating and sharing resources among people and machines, it also has become the means by which computing is accomplished in many situations.

Global impact: Computing enables innovation, which has potential benefits and harmful effects in various contexts.


When you think about creativity, you might think about artists. You might not think of a computer as a device that promotes creativity, but it certainly is. The artifacts created using a computer are still the result of a human being’s ingenuity. Presentations, images, audio, video, web pages, and programs are all brought about by human acts of creation. We use software tools to do so, just as a sculptor uses a chisel, but the end product is created by the human’s efforts, not the tool.

Even the process of writing a program, often considered a core computing activity, is a creative effort. The purpose of a program depends on the goals of the programmer and the end user (the person who runs the program). A program might be created to satisfy personal curiosity, to solve a world-changing problem, or to develop insight into data. Often, the results of running a program affect large numbers of people and can change organizations and even entire societies.

Despite its old-fashioned roots, the process of programming, and the resulting product, does not have to be a stale, boring thing, made only for crunching numbers. Programs may use various forms of input and output, including visual (text, graphics), audible (sound effects, music, human speech), or tactile (touch screens, vibrating controllers, and virtual reality gloves).

The process of creation using a computer involves a variety of techniques and tools, many of which we discuss throughout this text. Some you’ll be familiar with; others will likely be new to you. Many involve applying old ideas in new ways.

In this text we explore, at an introductory level, all of these big ideas to varying degrees. This text does not aim to make you a better computer user, although it should undoubtedly have that side effect. Instead, we want you to walk away with a thorough knowledge of how computing systems work, where they are now, and where they may go in the future.

SUMMARY

This book is a broad study of computer systems, including the hardware that makes up the devices, the software programs executed by the machine, and the data managed and manipulated by both. Computing systems can be divided into layers, and our organization of this book follows those layers from the inside out.

The history of computing reveals the roots from which modern computing systems grew. This history spans four generations, each characterized by the components used to build the hardware and the software tools developed to allow the programmer to make more productive use of the hardware. These tools have formed layers of software around the hardware.

Throughout the rest of this book, we examine the different layers that make up a computing system, beginning with the information layer and ending with the communication layer. Our goal is to give you an appreciation and understanding of all aspects of computing systems.

You may go on to study computer science in depth and contribute to the future of computing systems. Or you may go on to be an application specialist within other disciplines, using the computer as a tool. Whatever your future holds, given how prevalent computing systems are, a fundamental understanding of how they work is imperative.

KEY TERMS

EXERCISES

For Exercises 1–10, choose from the following list of people.

  1. Leibniz

  2. Pascal

  3. Babbage

  4. Lovelace

  5. Hollerith

  6. Byron

  7. Turing

  8. Jacquard

  1. What French mathematician built and sold the first gear-driven mechanical machine that did addition and subtraction?

  2. Who built the first mechanical machine that did addition, subtraction, multiplication, and division?

  3. Who designed the first mechanical machine that included memory?

  4. Who was considered the first programmer?

  5. Who proposed that a punched card be used for counting the census?

  6. Who edited Babbage’s work?

  7. Who was Ada Lovelace’s father?

  8. Who would have been mentioned in the book The Codebreakers?

  9. Who developed the concept of punched holes used in weaving cloth?

  10. Who is associated with IBM?

For Exercises 11–23, match the hardware listed to the appropriate generation.

  1. First

  2. Second

  3. Third

  4. Fourth

  5. Fifth

  11. Circuit boards

  12. Transistor

  13. Magnetic core memory

  14. Card input/output

  15. Parallel computing

  16. Magnetic drum

  17. Magnetic tape drives

  18. Integrated circuits

  19. Personal computer

  20. Vacuum tube

  21. Large-scale integration

  22. Magnetic disk

  23. Networking

For Exercises 24–38, match the software or software concepts listed to the appropriate generation.

  1. First

  2. Second

  3. Third

  4. Fourth

  5. Fifth

  24. Assemblers

  25. FORTRAN

  26. Operating systems

  27. Structured programming

  28. Time sharing

  29. HTML (for the Web)

  30. Loaders

  31. Spreadsheets

  32. Word processors

  33. Lisp

  34. PC-DOS

  35. Loaders/linkers bundled into an operating system

  36. Java

  37. SPSS

  38. C++

Exercises 39–59 are short-answer questions.

  39. What do we mean by the statement that “the 1980s and 1990s must be characterized by the changing profile of the user”?

  40. Why was Mosaic important?

  41. Discuss the browser wars.

  42. Describe how the Web changed after 2002.

  43. Of the predictions listed in this chapter on pages 25 and 26, which do you consider the biggest error in judgment? Explain.

  44. Name the four areas in which the practitioner must be skilled.

  45. Distinguish between computing as a tool and computing as a discipline.

  46. Is computing a mathematical discipline, a scientific discipline, or an engineering discipline? Explain.

  47. Distinguish between systems areas and applications areas in computing as a discipline.

  48. Define the word abstraction and relate it to the drawing in Figure 1.2.

  49. What is cloud computing?

  50. Define the word protocol and explain how it is used in computing.

  51. Distinguish between machine language and assembly language.

  52. Distinguish between assembly language and high-level languages.

  53. FORTRAN and COBOL were two high-level languages defined during the second generation of computer software. Compare and contrast these languages in terms of their history and their purpose.

  54. Distinguish between an assembler and a compiler.

  55. Distinguish between a systems programmer and an applications programmer.

  56. What was the rationale behind the development of operating systems?

  57. What constitutes systems software?

  58. What do the following pieces of software do?

    1. Loader

    2. Linker

    3. Editor

  59. How was the program SPSS different from the programs that came before it?

THOUGHT QUESTIONS

  1. Identify five abstractions in your school environment. Indicate which details are hidden by the abstraction and how the abstraction helps manage complexity.

  2. Discuss the role of abstraction in the history of computer software.

  3. Which big ideas in Table 1.1 do you think are the most important? Why?

  4. Do you know anyone today who does not have Internet access?
