Preface: The Birth of Videogames

Living in a society so dependent on smartphones, laptops, and the Internet, it can be difficult to remember when videogames and computers were pure science fiction. Yet, just over four decades ago, videogames were as futuristic as holodecks and hoverboards are today. Just imagine how miraculous a Wii U would have seemed back in 1952, when one of the world’s first videogame players—a group of highly trained engineers—hunched around a room-sized Electronic Delay Storage Automatic Computer (EDSAC) to challenge the lumbering beast to a round of tic-tac-toe. Although no one could have predicted it at the time, these humble videogames were the first glimpses of a phenomenon that would change the way the world played.

The origins of today’s digital computing devices can be traced back to World War II. In December 1943, a British engineer named Tommy Flowers and his team secretly unveiled the world’s first electronic, digital, programmable computer, the Colossus Mark 1. Operational by early February of the following year, the Colossus and its successors were used by British codebreakers to read encrypted German messages.1 These codebreakers included the likes of Alan Turing,2 whose seminal 1936 paper on the notion of a “universal machine” capable of performing the tasks of any other machine showed that anything computable could be represented by 1s and 0s, an insight that proved critical when trying to decipher codes with 15 million million possibilities. The improved Colossus Mark 2 went into operation on June 1, 1944, in time for the pivotal Normandy landings. In total, ten Colossus computers were in operation by the war’s end. Unfortunately, their strict focus on codebreaking and their top secret status, maintained until the 1970s, prevented their diffusion into the marketplace.

Like the armed forces of the other world powers at the time, the US Army was also on a continuous quest to gain an upper hand against its enemies, and several promising—if far-fetched—projects were given funding on the off chance that a few might be successful. One such proposal was to create a high-speed electronic device to calculate ballistics firing tables, a task that was being performed manually by female “computers,” a word that meant “one who computes.”

As a result of that proposal, development of the Electronic Numerical Integrator and Computer—better known as ENIAC—began on June 5, 1943. It was fully operational by 1946, when it became the first reprogrammable, electronic general-purpose computer. Conceived and designed by John Mauchly and John Eckert, the room-sized ENIAC was a modular computer, composed of individual panels that performed different functions. It weighed over 30 tons, and contained more than 18,000 vacuum tubes that burned 200 kilowatts of power.3 More flexible than the Colossus and not constrained by the secrecy of the war effort, the ENIAC was able to more profoundly influence the development of later, increasingly smaller and more powerful computers from a variety of commercial companies. Thus began the transition from centuries-old mechanical and analog paradigms to digital.

The bulky and unreliable vacuum tubes used into the 1950s were phased out in favor of more reliable and less expensive transistors in the 1960s. These transistors were soon incorporated into the Integrated Circuit (IC), which packed large numbers of these semiconductor devices onto a single small silicon chip. Introduced in the late 1950s by Fairchild Camera and Instrument (later Fairchild Semiconductor), the IC was faster, smaller, and more energy-efficient than the vacuum tube, a breakthrough with important implications for both computers and calculators and a huge step forward for the burgeoning computer industry.4

Several decades of innovation in circuitry and refinements in operation and utility followed, including a switch to a stored-program methodology that offered a fully reprogrammable environment. Despite these many advancements, large and expensive mainframe computers remained the norm. Fortunately for us, not all programmers were content to slave away on serious applications, and more and more games found their way onto mainframes. As history has shown, any new computing device capable of running games will, by hook or by crook, soon have them available.5

The Nimrod, a single-purpose computer designed to demonstrate the principles of digital computing to the general public by playing the game of Nim, was showcased at the Exhibition of Science during the 1951 Festival of Britain. However, we argue that the grid of light bulbs that passed for its display was too abstract to qualify it as the type of videogame we’ll be discussing throughout this book. Nevertheless, it was an important milestone. Interestingly, in that same year and into the next, the UK played host to other innovations, with the Pilot ACE computer simulating draughts (checkers) and the Ferranti Mark 1 computer running the first computer-generated music and one of the earliest attempts at solving chess problems.

The first known instance of an actual recognizable videogame implementation was Alexander Douglas’s 1952 creation of OXO, a simple graphical tic-tac-toe game that pitted a single player against the computer on the EDSAC mainframe at the University of Cambridge. Although more a proof of concept than a compelling gameplay experience, OXO nevertheless set the precedent of using a computer to create a virtual representation of a game.

In 1958, for a visitors’ day at the Brookhaven National Laboratory in Upton, New York, William Higinbotham and Robert Dvorak created Tennis for Two, a small analog computer game that used an oscilloscope for its display. Tennis for Two rendered a moving ball in a simplified side view of a tennis court. Each player could rotate a knob to change the angle of the ball, and the press of a button sent the ball toward the opposite side of the court. As with OXO, few people had the chance to play Tennis for Two, but it can be considered the first dedicated videogame system. Without the benefit of hindsight, this historic milestone was even lost on the game’s creators, who, after a second visitors’ day one year later, disassembled the machine’s components for use in other projects.

It wouldn’t be until 1962 that the most famous early computer game, Spacewar!, blasted onto the scene. Initially designed by Steve Russell, Martin Graetz, and Wayne Wiitanen, with later contributions from Alan Kotok, Dan Edwards, and Peter Samson, the game was the result of brilliant engineering and hundreds of hours of hard work. Developed on the DEC PDP-1 mainframe at MIT, Spacewar!’s gameplay was surprisingly sophisticated and ambitious, pitting two spaceships against each other in an armed duel around a star that exhibited gravitational effects on the two craft. Each player controlled a ship via the mainframe’s front-panel test switches or optional external control boxes, adjusting each respective craft’s rotation, thrust, fire, and hyperspace (a random, evasive screen jump that may cause the user’s ship to explode). Over the years, the game was improved many times and inspired countless clones and spiritual successors, including the first arcade videogames in 1971, Galaxy Game and Computer Space.

It was this ability to inspire that was perhaps Spacewar!’s greatest contribution to the future of computing. Even though access to the host hardware severely limited the game’s exposure, a few visionaries realized early on that computers had applications that went way beyond the sober needs of businesses, universities, and the government. They could also delight and entertain. At the time, the idea that one day ordinary people would buy computers just to play games on them was absolutely absurd.

Nevertheless, the computer industry was steadily moving away from the corporate model, in which only highly trained experts actually operated computers. With the needs of the individual in mind, Drs. Thomas Kurtz and John Kemeny took the next major step towards computing for the masses at Dartmouth College with the creation of the Beginner’s All-purpose Symbolic Instruction Code, or BASIC, programming language, in 1964. While BASIC was not as efficient as “low-level” languages such as machine code and assembly language, it did not require nearly as much skill with math or science to use effectively. For the first time, an ordinary person had a fighting chance of programming a computer.

BASIC emphasized natural language syntax and simple logic, something that enthusiasts of all ages and disciplines could appreciate and more easily work with. In another stroke of genius, Kurtz and Kemeny made their BASIC compiler available free of charge, hoping it would help spread the language far and wide. This altruistic strategy worked, with variations of their original BASIC language becoming a staple on several key computing systems of the time. Their language dominated the first few decades of personal computing, serving as the entry point for countless aspiring programmers.

Throughout the late 1960s and 1970s, breakthrough innovations appeared that we take for granted in our modern computing devices. One of the most notable was Douglas Engelbart’s “The Mother of All Demos,” in late 1968, which featured a mouse, hyperlinks, and video conferencing. Alan Kay’s “Dynabook” concept (1968–1972) predicted form factors and use cases that are realized in today’s laptops and tablets. Xerox PARC, founded in 1970, created a stunning working office environment that featured locally networked desktop computers with bitmapped graphics, graphical user interfaces, object-oriented programming, and “what you see is what you get” (WYSIWYG) output to high-quality laser printers. Still, despite such advancements, computers remained the province of a small, highly trained professional class.

The major roadblock was the standard operating model of the time, which was to leverage a single large computer that needed to be shared and have its time partitioned among many users. This model was certainly effective and a significant improvement over previous decades when a single user’s activity would tie up a computer for hours, or even days, but it didn’t scale well, and proved costly for the would-be end-user to access. A change was needed. It came from Intel.

By 1971, Intel had developed and released the first mainstream microprocessor, or single-chip Central Processing Unit (CPU). The Intel 4004 became the heart of many small-scale digital computer projects. It offered a clock speed of just 740 kHz, thousands of times slower than modern processors. Still, as humble as it might seem today, there would have been no home computer and videogame market without it.

Although microprocessors held great promise for home computer applications, it took another three years before they really caught on with manufacturers and consumers. This meant that “hit” games continued to appear almost exclusively on mainframe systems throughout the 1970s. These included the early dungeon-crawling game dnd (1974), by Gary Whisenhunt and Ray Wood for the versatile PLATO computer instruction mainframe system, and Will Crowther’s PDP-10 computer game Adventure (1975)—the first significant text adventure.

The rise of true home computing began in late 1974 with the release of the MITS Altair 8800 computer kit, based on the Intel 8080 microprocessor released earlier in the year. Advertised in the January 1975 edition of Popular Electronics magazine, the kit was an unexpected success, enthusiastically supported by groups of eager hobbyists who had long waited to get their hands on a computer to call their own. While no one would accuse the Altair 8800 of being user-friendly, it was a computer that a hobbyist could actually afford.

The Altair 8800 had a front panel of red LEDs and toggle switches for directly programming the system. There were no other display or input options, and little could be done with the default configuration. Still, the system was a step up from prior kits and plans that required hobbyists to track down or fabricate their own parts. MITS’ designers built most of the machine’s intelligence around removable cards, leaving the motherboard, normally the heart of a computer that manages its system resources, as little more than a means of interconnecting the components. Since the motherboard accepted 100-pin expansion cards, it eventually became known as the S-100 bus, an important industry standard into the early to mid-1980s, often seen in computers paired with the versatile CP/M operating system. In fact, by late 1975, the first of many greatly improved clones appeared in the form of IMS’s IMSAI 8080 kit, which ran a modified version of CP/M. The IMSAI 8080 may also mark the first time a real personal computer was the star of a movie, in this case the 1983 hit WarGames with Matthew Broderick.

Nowadays, when we think of sharing code and expertise, it’s usually in the context of Linux and other open source and free software projects. Computing, however, has rarely been a solo pursuit. In the 1960s, social networks formed between engineers, scientists, and academics, who freely shared their knowledge and code with each other to advance the nascent field. Since the money was assumed to be in the hardware rather than software, programmers seldom took issue with others borrowing and building on their code. This community spirit naturally found its way to the home market after the introduction of the Altair 8800. Enthusiastic groups of hobbyists were allowed to rub elbows with some of the industry’s greatest pioneers, spurring progress across the board.

The most famous of these early groups was the Homebrew Computer Club in Silicon Valley, which first met in March 1975. The club’s early appeal was the enthusiastic and free exchange of ideas, technology, and information amongst its talented members. Club membership consisted not only of hobbyists, but also engineers and other professionals, like future Apple cofounders Steve Jobs and Steve Wozniak, who, along with many others, would go on to shape the path of the computer industry for the next several decades.

Not everyone liked the idea of free software, however. A young and brash Bill Gates, then of Micro-Soft, wrote an open letter for the Homebrew Computer Club’s second newsletter condemning commercial software piracy. His Altair BASIC, cowritten with Paul Allen, was the first language available for the system, and hobbyists were illegally copying the desirable, but expensive, paper tape software in droves. This letter marked the first notable rift between the ideals of free software development and the potential of a retail software market. Gates and the quickly renamed Microsoft went on to produce versions of BASIC, operating systems, and other types of software for nearly every personal computer, building an important, if infamous, business empire in the process.

With computer user groups and clubs on the rise, specialized retail stores opened to cater to their needs, and publishers churned out in-depth enthusiast magazines like Byte. The first major computer fair, the Trenton Computer Festival, took place in April 1976 and featured speakers, forums, user group meetings, exhibitor areas, and an outdoor flea market, setting the template for future trade shows. As computers made the transition from do-it-yourself kits to pre-built systems in 1977, they became more appealing to a wider segment of the population.

Still, home computers were the domain of nerds and professionals; they were much too expensive and complex for the average kid who just wanted to play videogames. Fortunately for them, an engineer named Ralph Baer had already envisioned the videogame console back in the 1950s. His work would eventually culminate in the Magnavox Odyssey, a landmark moment for the videogame industry.

We take videogame consoles for granted today, but Baer’s concept for a television videogame was so novel that he was unable to garner enough support to build working prototypes until the mid-1960s. His first attempt, Chase, was a simple game of tag featuring two squares, which he later expanded into his celebrated “Brown Box” prototype. The prototype included several additional diversions, including paddle and ball as well as target shooting games. After being rejected by several TV manufacturers, Baer finally signed an agreement in 1971 with Magnavox, who released a refined version of the prototype the following year, renaming it the Odyssey Home Entertainment System.

Although relatively limited in its capabilities, requiring considerable manual intervention and imagination from its players, the Odyssey nevertheless boasted several features that became industry standards. These features included detachable controllers, additional controller options (a light rifle/gun), and interchangeable game cartridges. These cartridges appeared to offer players an assortment of different games to play, but were really just plug-in cards that turned the console’s built-in features on or off like a complex selector switch. Twelve games were included with the system, with an additional ten eventually released separately.6 The Odyssey could display only white squares and lines on a black background, so two different sizes of color overlays were provided to enhance gameplay and accommodate different types of televisions. Many games also included external enhancements such as playing cards, maps, dice, and game boards. Much of the system’s playability came from these accessories, since the on-screen interaction was so limited. The system only registered object collisions, and there was no sound or score tracking.

Perhaps the Odyssey’s most enduring legacy was inspiring Nolan Bushnell at a Magnavox product demonstration in 1972. Later that same year, Bushnell founded Atari and, with engineer Al Alcorn, developed Pong for the arcade, the first hit videogame. Pong was clearly derivative of one of the Odyssey’s paddle and ball (tennis) games, a design that was unfortunately quite easy for others to imitate. Much to Bushnell’s chagrin, the success of Pong was its own undoing, leading several other companies to copy the game’s concept. It also did not sit well with Baer, who was understandably upset that Atari had ignored his patents. Magnavox eventually filed a successful lawsuit against Atari for infringement, forcing the fledgling company to settle by paying a lump sum and compelling other manufacturers to pay hefty licensing fees.

Pong’s simple but compelling gameplay was in stark contrast to Bushnell and Ted Dabney’s earlier Computer Space for Nutting Associates. Despite its striking cabinet design, relatively large screen, and four control buttons, Computer Space was too complex for the general public. Bushnell later admitted that the game appealed mostly to his engineering friends who had enjoyed Spacewar!, the even more complicated game that Computer Space was based on. Although technologically less impressive, it was Pong, not Computer Space, that set the foundation for the modern videogame industry.

Although the Odyssey received a small sales boost from the popularity of Pong and the various clones that sprang up in the arcade, it never really overcame the limits of its technology or poor marketing. Magnavox’s marketing strategy was focused on its television dealerships, which reinforced the unfortunate misperception that the console would only work on Magnavox televisions. When Atari created a home version of Pong, complete with automatic scoring and sound, dominant retailer Sears agreed in 1975 to distribute it in its sporting goods department7 under its own brand name, Tele-Games. It was a huge success, and showed that Baer had been right all along about the viability of videogames for the home. Atari released its own branded version of the console starting in 1976, just as an explosion of Pong clones saturated the home videogame market. Although these machines were popular and offered increasingly sophisticated feature sets, there were simply too many systems for the market to bear. They were also soon challenged by fully programmable consoles that used true interchangeable cartridges for more diverse gameplay possibilities, starting with Fairchild’s Video Entertainment System (VES) in 1976. This home videogame breakthrough was followed less than a year later on the home computer side by the release of the preassembled and relatively user-friendly Apple II, Commodore PET, and Tandy TRS-80 systems, each of which featured its own interchangeable software, first on cassette tapes and later on disks. The legendary trinity of Apple, Commodore, and Tandy marked the first time fully assembled, programmable computers were readily available and usable by ordinary folks.

With the two markets in place by the late 1970s, it only took until the early 1980s for nearly all of today’s familiar videogame and computer elements to take shape. These elements ranged from input devices such as multifunction digital and analog controllers to online services, like the proprietary CompuServe Information Service and The Source, each of which featured a selection of relatively sophisticated multiplayer games accessible to anyone willing to pay by the hour.

Comparing Pong and almost any modern game—including those for smartphones—might suggest there are hundreds of years between them, but in fact only a little over 40 years have passed. By contrast, it took Hollywood over 30 years just to introduce sound into its movies! Fortunately for gamers, game developers have enjoyed and continue to enjoy a wide variety of platforms to develop for—each with its own advantages and limitations. The competitive hardware industry has often left software developers struggling to catch up, building increasingly sophisticated games to justify the expensive hardware, much of which is discussed throughout this book.

1  A large number of electromechanical calculating devices known as “bombes” helped decipher encrypted messages, specifically those from the infamous German Enigma machines.

2  Turing contributed several pivotal mathematical and computing ideas in his tragically short life. For instance, in a famous 1950 paper he posed the thought-provoking question, “Can machines think?”, which led to the “Turing test” concept, a measure of a machine’s ability to exhibit behavior indistinguishable from that of a human.

3  Nobelprize.org, “The History of the Integrated Circuit,” www.nobelprize.org/educational/physics/integrated_circuit/history.

4  And it was becoming a “computer” industry, as the terms “electronic brain” or “mechanical brain” were slowly being phased out in popular usage.

5  The “Loguidice Law.”

6  Like any videogame console worth its salt, the Odyssey has seen additional homebrew games added to its library, starting with Odball in 2009. As you’ll read throughout this book, dedicated hobbyists have created new games for classic systems that often rival or exceed the best of what was available when these platforms were originally commercially available.

7  This type of backdoor entry into the marketplace foreshadowed Nintendo’s own Trojan Horse, or Trojan Robot, R.O.B., the Robotic Operating Buddy, which allowed the company to initially push its Nintendo Entertainment System as more of a toy than a console to retailers still wary from the Great Videogame Crash.
