Chapter 3
The International Race of Top Supercomputers and Its Implications

Kam Ki Tang and Thanh Quang Le

Introduction

In September 2013, the US National Aeronautics and Space Administration (NASA) achieved a landmark in space exploration. Its Voyager-1 spacecraft, launched 36 years earlier in September 1977, became the first human-made object to go beyond the Solar System.1 Voyager-1’s interstellar journey became a possibility when, back in 1961, Michael Minovitch solved the notoriously difficult “three-body problem” in celestial mechanics. The problem had been described as one of the hardest in that field, and it had tested and “defeated” some of the finest scientists, including Isaac Newton (Riley and Campbell 2012). Minovitch solved the problem with the help of the IBM 7090 – the fastest computer of its time.

The fact that a calculating machine helped solve a mathematical problem and, as a consequence, pushed the boundaries of human exploration beyond planet Earth is a powerful demonstration of De Solla Price’s (1983) thesis of “instrumentality.” According to this thesis: “advances in instrumentation and experimental techniques have been of major importance in stimulating and enabling both radical theoretical advance in fundamental science, and radical innovations in practical application.” De Solla Price illustrates the idea of instrumentality with another cosmic example – the development of deep-dished concave lenses. Such lenses allowed Galileo, in the early seventeenth century, to construct ever more powerful telescopes and to observe the Moon in greater detail and clarity than ever before. This not only opened a new chapter for astronomy, but also created strong popular interest in telescopes. The boom of the telescope industry then led to the development of other optical instruments, including the microscope. The microscope in turn created a new way to examine biological specimens such as cells, and this changed the discipline of biology forever. In short, a scientific instrument can trigger a chain of unanticipated and entirely different discoveries and innovations.

Nowadays the most ubiquitous instrument across all fields of science and industry is without doubt the computer (both hardware and software). What makes a supercomputer “super” among its peers is its speed: a supercomputer is simply a very fast computer. As such, supercomputers enable tasks that are beyond the capacity of regular computers, and thereby help accelerate the pace of scientific discovery and product innovation. One of the benefits of their speed is the capacity to model and simulate large-scale, complex systems. For instance, scientists at Argonne National Laboratory in Illinois used a supercomputer to simulate how the universe evolved from the Big Bang until today – a period spanning over 13.7 billion years – by tracing the movement of trillions of “particles” in their computer model, in order to study the mysterious dark energy (Moskowitz 2012). Similarly, the need for accurate and timely weather forecasting makes supercomputer simulation and modeling an indispensable tool for meteorologists. Theory and experimentation are traditionally considered the two legs of science and engineering; computational modeling and simulation is now regarded as the “third leg” (Middleton 2013), which underscores the crucial instrumentality of supercomputers.2

Supercomputers are also used widely across industries to design and create products. Cummins uses simulation to design better diesel engines faster and more cheaply; Goodyear uses it to design safer tyres much more quickly; Boeing uses it to build more fuel-efficient aircraft; and Johnson & Johnson uses it to improve the absorbency of nappies (Middleton 2013; Nuttall 2013). Even the humble light bulb has had its lifespan extended thanks to supercomputer simulation (Cray 2011). The importance of supercomputers in molecular modeling explains why big pharmaceutical companies own some of the world’s largest supercomputers in the private sector. Supercomputing capability also has implications for national security: simulation allows scientists to conduct virtual nuclear tests and to monitor existing nuclear stockpiles.

Furthermore, the brute power of supercomputers lends itself to handling “Big Data.” Applications range from investigating large volumes of genetic data (McBride 2013), to processing medical reports and diagrams pertaining to cancer patients (Lenzner 2013), to decoding and analyzing vast streams of information gathered for intelligence purposes (Tsay 2013).3 IBM’s CEO Ginni Rometty calls data the world’s “next natural resource.” How well a company or an organization “mines” that resource will determine whether it is a winner or a loser in its field (Lenzner 2013).

With faster processing speeds, supercomputers obviously reduce the time between the inception of an experiment and its final outcome. Beyond this time effect, supercomputers can also make a qualitative difference to the outcome. By shortening the lag between asking a question and seeing the answer, researchers can quickly pose follow-up questions, find answers, and keep iterating. As Eccles (1989) puts it:

When results come more slowly, we are more conservative in the questions we can ask, and we leave many more stones unturned. When the answers come back while the questions are still fresh, the train of thought can continue, and less time is spent trying to remember where one left off and what line of reasoning was being explored.

In other words, creativity is enhanced by a continuous stream of bolder thoughts.

To the extent that supercomputer capacity indicates a country’s current investment in cutting-edge research and development (R&D), it also projects the country’s future scientific and technological prowess. As the success of Voyager-1 demonstrates, the United States’ leading supercomputing capability back in 1961 is still giving the country an unparalleled advantage, more than half a century later, in the exploration of the “final frontier.” The social and economic benefits of having the highest supercomputing power can be broad and deep. The former US Energy Secretary Steven Chu sums it up:

The nation that leads the world in high-performance computing will have an enormous competitive advantage across a broad range of sectors, including national defense, science and medicine, energy production, transmission and distribution, storm weather and climate prediction, finance, commercial product development, and manufacturing. (NNSA 2012)

Against this background, the objective of this chapter is to examine the distribution of top supercomputers across nations and to explore its implications for national technological and economic competitiveness. Given that technological and economic competitiveness is a very complex issue, focusing only on supercomputers may seem somewhat narrow. Our limited scope can nevertheless be justified for a number of reasons. First, top supercomputers can be viewed as indicators of the frontier R&D that drives the development of the high-tech sector. Second, rankings of top supercomputers have attracted increasing attention from the general public, policymakers, and stakeholders, and have the potential to evolve into valuable new benchmarking tools. Third, although there are studies providing measures of countries’ technological and innovation capacities,4 they do not focus explicitly on high-tech and have not yet made extensive use of top supercomputer data.5

Measure of Supercomputer Capacity

According to the Encyclopedia of Computer Science, “the most powerful computers of any time have been called high-speed computers, supercomputers, high-performance computers and, most recently, high-end computers” (Padua and Hoeflinger 2003). The first supercomputer, the Control Data Corporation (CDC) 6600, released in 1964, was up to 10 times faster than any other computer of its day.6 As any computer user knows, it does not take many years for what is considered a very fast computer to become an ordinary one, if not a vintage one. The time-honored example is that a present-day smartphone has more computing power than the NASA computers behind the Apollo program, which sent three men to the Moon in 1969. What underlies this exponential growth in computing power is the fact that the number of transistors on a computer chip doubles approximately every two years – a regularity known as Moore’s Law. As such, the term supercomputer is reserved for the fastest computers in the world at the present time.
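The compounding behind this rapid obsolescence is easy to make explicit. The following is a standard exponential-growth sketch of Moore’s Law (our illustration, not a formula used by the TOP500 project):

```latex
P(t) = P_0 \cdot 2^{t/2}
\qquad\Rightarrow\qquad
P(10) = 2^{5} P_0 = 32\,P_0,
\qquad
P(20) = 2^{10} P_0 \approx 1000\,P_0
```

where P_0 is a machine’s performance today and t is measured in years. Chip improvements alone thus imply roughly a thousandfold gain over two decades; in practice, top-ranked systems have improved even faster, because newer machines also harness ever larger numbers of processors in parallel.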

Since 1993, a group of high-performance computer (HPC) experts have published a list of the 500 most powerful computers in the world, referred to as TOP500. Its purpose is to “provide a reliable basis for tracking and detecting trends in high-performance computing.”7 The TOP500 list is compiled twice a year with input from HPC experts, computational scientists, manufacturers, and the Internet community in general. To date, it is the only public document that details the configuration, purpose, and processing power of HPCs around the world, and thus it is the most authoritative source on supercomputer capacity.

The project ranks computers by their speed in solving a set of numerical problems widely used in industry for benchmarking, known as the LINPACK benchmark. Speed is measured in “flops” (floating-point operations per second) – essentially the number of calculations per second. The more powerful a supercomputer, the higher its flops count. The measurement scale of flops, in ascending order, is: kilo (1,000), mega (1 million), giga (1 billion), tera (1 trillion), peta (1 quadrillion = 1,000 trillion), and exa (1 quintillion = 1,000 quadrillion). To put this into perspective, the fourth-generation iPad’s graphics processing unit has a computing power of 76.8 gigaflops and a human brain’s equivalent power is estimated to be 2.2 petaflops (Fischetti 2011),8 while the fastest supercomputer on Earth in 2013 has a power of 33.86 petaflops.
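As a quick check on these magnitudes, the short sketch below (plain Python, using only the figures quoted in this paragraph) converts each example to flops and compares them:

```python
# Flops prefixes: kilo=1e3, mega=1e6, giga=1e9, tera=1e12, peta=1e15, exa=1e18
GIGA, PETA = 1e9, 1e15

ipad_gpu = 76.8 * GIGA      # 4th-generation iPad GPU
human_brain = 2.2 * PETA    # rough estimate cited above (Fischetti 2011)
tianhe_2 = 33.86 * PETA     # fastest supercomputer, June 2013 TOP500 list

print(f"Brain vs. iPad GPU:    {human_brain / ipad_gpu:>10,.0f}x")  # ~28,646x
print(f"Tianhe-2 vs. brain:    {tianhe_2 / human_brain:>10,.1f}x")  # ~15.4x
print(f"Tianhe-2 vs. iPad GPU: {tianhe_2 / ipad_gpu:>10,.0f}x")     # ~440,885x
```

On these (admittedly crude) estimates, the fastest machine of 2013 out-calculates the human brain by a factor of about 15, and the iPad’s GPU by a factor of nearly half a million.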

It is, however, commonly accepted that the number of flops alone is far from the best way to measure a computer’s performance. For example, software is crucial in determining how fast a supercomputer runs real applications, but this is not accounted for in the LINPACK test. The fact that a human brain (still) has much stronger cognitive abilities – such as emotion recognition or concept formation – than artificial intelligence, despite having an estimated raw processing power below that of many supercomputers, also demonstrates clearly the limitation of one-dimensional benchmarks. Despite this, TOP500 remains a very useful benchmark for a number of reasons. First, a single measure based on flops is highly transparent and provides a focal point for discussion, especially among non-specialists. Furthermore, the performance data for individual supercomputers in a country can be aggregated into a single measure of that country’s supercomputer capacity. Lastly, the consistency of the benchmarking method allows supercomputer capability to be compared not only across countries at a given time, but also over time within each country.
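To illustrate how such a country-level aggregate can be constructed from the list, here is a minimal sketch in Python with pandas. The column names and the handful of Rmax values are our own illustrative stand-ins; the actual TOP500 data releases use their own schema:

```python
import pandas as pd

# Hypothetical extract of one TOP500 release: one row per ranked system.
top500 = pd.DataFrame({
    "country": ["US", "US", "China", "Japan", "Germany"],
    "rmax_tflops": [17590.0, 16324.8, 33862.7, 10510.0, 2897.0],  # LINPACK Rmax
})

# Aggregate system counts and total flops by host country.
by_country = top500.groupby("country").agg(
    systems=("rmax_tflops", "size"),
    total_tflops=("rmax_tflops", "sum"),
)
by_country["system_share_%"] = 100 * by_country["systems"] / len(top500)
by_country["performance_share_%"] = (
    100 * by_country["total_tflops"] / top500["rmax_tflops"].sum()
)
print(by_country.round(1))
```

The two aggregates computed here – system share and performance share – are exactly the measures used in the figures and tables that follow.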

World Distribution of Supercomputer Capacity

In the data analysis that follows, we focus on national aggregate data without adjusting for country size. For instance, we look at the number of top supercomputers countries host in absolute terms, rather than per capita. Aggregate measures obviously favor larger countries over smaller ones. We do so because our purpose is to use the distribution of top supercomputers as an indicator of the disparity in technology and innovation capacities across countries, and it is a country’s total stock of technology and knowledge, not its total population, that determines its international competitiveness.

Figure 3.1 shows the distribution of TOP500 supercomputer systems by segment in 1993 and 2013 respectively. A segment’s system share is the share of all TOP500 computers hosted by that segment. The TOP500 list is updated twice a year; all data used in this chapter are from the mid-year June list, except for 1993, for which the November list is used.9 In 1993, 28.4%, or 142, of the TOP500 supercomputers were deployed by industry. By 2013 that share had nearly doubled to 53.8%, or 269 systems, at the expense of all other segments. This reflects the increasing importance of HPC for industrial innovation.


Figure 3.1 The system distribution of TOP500 by segments.

Source: Data from TOP500.org.

However, there is limited information on the exact applications of these supercomputers. In 2013, 2.4% of the TOP500 machines were known to be used for weather and climate research, 1.8% for energy, 1.4% for defense, 1% for benchmarking, and less than 1% each for applications relating to the environment, aerospace, information services, finance, software, life sciences, semiconductors, web services, Internet provision, and geophysics. All of these amount to less than 10% of the total. The remainder were used for unspecified “research” (13.8%) or for applications “not specified” (76.4%). In other words, there is a great deal of secrecy about the applications of, and access to, top supercomputers.

The distribution of HPC power across regions has changed considerably since the launch of TOP500. Figure 3.2 shows the continental distribution of supercomputers as of 1993. A continent’s system share is based on the total number of TOP500 computers located in that continent, while its performance share is based on the total flops of those computers. Back then the Americas had 50.4% of the system share and 61.7% of the performance share; that is, the Americas hosted 252 of the top 500 supercomputers in the world, and their combined speed was 61.7% of the total combined speed of all TOP500 computers. The second biggest performance share belonged to Asia, with 22.8% of systems and 25.2% of performance. Europe actually hosted slightly more systems (25%) but commanded a much smaller performance share (12.4%), placing it third. Oceania held a distant fourth (and last) place.


Figure 3.2 The distribution of TOP500 in 1993 by continent.

Source: Data from TOP500.org.

Figure 3.3 shows the continental distribution of supercomputers as of 2013. Over the intervening two decades, although the Americas’ system share increased slightly to 52.8%, their performance share dropped substantially to 48.8%. By contrast, Asia’s system share increased slightly to 23.8%, while its performance share surged to 32.8%. Europe’s system share slid to 22.4%, but its performance share increased to 17.4%. This means that while Asia nowadays hosts roughly the same share of TOP500 machines as it did two decades ago, those machines are relatively more powerful than before.


Figure 3.3 Continental distribution of TOP500 in 2013.

Source: Data from TOP500.org.

As expected, within each of the four continents the countries themselves perform very differently. In Tables 3.1 and 3.2, therefore, we break the continental-level data down to the country level, not just for 2013 but for every year from 1993 onward. Table 3.1 shows how the world share of TOP500 systems by country has changed over time, while Table 3.2 shows the corresponding results in terms of performance. To keep the figures intelligible, only the top 10 countries in 2013 (plus “others”) are listed.

Table 3.1 Share of TOP500 systems by countries 1993–2013.

Source: TOP500.org.

% US China Japan UK France Germany India Canada Russia Sweden Others
1993 49.0 0.0 21.6 4.2 4.0 10.6 0.0 0.6 0.0 0.2 9.8
1994 52.4 0.0 21.0 4.4 3.8 8.8 0.0 0.6 0.0 0.4 8.6
1995 54.8 0.2 13.2 3.2 5.4 10.4 0.0 1.8 0.0 0.8 10.2
1996 51.0 0.2 18.6 2.4 3.6 9.6 0.0 1.2 0.0 1.2 12.2
1997 53.0 0.0 17.4 4.8 3.8 9.0 0.0 1.4 0.2 1.6 8.8
1998 57.0 0.0 16.6 5.0 2.0 9.2 0.0 0.4 0.2 1.2 8.4
1999 58.4 0.0 11.4 5.8 3.6 9.4 0.0 1.6 0.0 1.4 8.4
2000 51.6 0.4 12.4 5.6 4.0 13.0 0.0 1.8 0.0 1.0 10.2
2001 50.6 0.0 10.8 6.2 4.2 12.8 0.0 1.8 0.0 0.8 12.8
2002 45.2 0.6 10.6 7.4 4.6 12.8 0.0 2.6 0.2 1.0 15.0
2003 49.0 1.2 8.2 7.0 3.8 11.2 0.4 1.8 0.4 0.8 16.2
2004 52.4 2.8 7.0 6.8 3.8 7.4 1.2 1.4 0.4 0.6 16.2
2005 55.4 3.8 4.6 6.4 2.2 8.0 1.6 1.4 0.6 0.6 15.4
2006 59.6 5.6 5.8 7.0 1.6 3.4 0.0 1.8 0.2 0.2 14.8
2007 56.0 2.6 4.6 8.8 2.6 4.6 1.6 2.0 1.0 2.0 14.2
2008 51.6 2.4 4.4 10.4 6.8 9.4 1.2 0.4 1.6 1.8 10.0
2009 58.2 4.2 3.0 8.6 4.6 6.0 1.2 1.6 0.8 2.0 9.8
2010 56.0 5.0 3.6 7.6 5.8 4.8 1.0 1.4 2.2 1.6 11.0
2011 51.0 12.2 5.2 5.4 5.0 6.0 0.4 1.6 2.4 1.0 9.8
2012 50.4 13.6 7.0 5.0 4.4 4.0 1.0 2.0 1.0 0.8 10.8
2013 50.4 13.2 6.0 5.8 4.6 3.8 2.2 1.8 1.6 1.4 9.2

Table 3.2 Share of TOP500 performance by countries 1993–2013.

Source: TOP500.org.

% US China Japan UK France Germany India Canada Russia Sweden Others
1993 59.61 0.00 23.83 3.00 2.34 4.67 0.00 1.54 0.00 0.07 4.94
1994 56.56 0.00 27.01 3.93 2.50 4.62 0.00 0.99 0.00 0.09 4.31
1995 55.49 0.15 24.97 3.49 2.92 5.72 0.00 1.59 1.97 0.51 3.18
1996 47.28 0.10 31.15 2.72 2.58 6.82 0.00 1.22 0.00 0.66 7.47
1997 51.78 0.00 23.30 3.64 3.44 9.69 0.00 0.85 0.13 1.07 6.11
1998 56.82 0.00 16.46 5.99 2.31 10.45 0.00 1.14 0.09 1.07 5.68
1999 60.69 0.09 14.23 6.79 2.42 8.96 0.00 1.23 0.00 0.90 4.68
2000 56.20 0.16 15.33 6.09 3.37 11.08 0.00 1.14 0.00 0.61 6.01
2001 58.91 0.36 14.62 5.15 2.75 8.84 0.00 1.09 0.00 0.63 7.64
2002 44.66 0.29 25.79 5.35 4.19 9.05 0.00 1.04 0.25 0.70 8.68
2003 53.73 0.86 16.92 6.19 3.21 7.30 0.31 1.14 0.29 0.64 9.41
2004 56.76 3.40 11.19 7.13 2.68 5.29 0.70 1.49 0.18 0.47 10.72
2005 61.71 3.16 7.59 5.05 1.17 4.75 0.83 1.39 0.46 0.43 13.48
2006 62.21 3.30 9.10 4.83 2.36 3.29 1.29 1.31 0.23 0.17 11.91
2007 62.13 1.95 5.77 6.83 3.15 4.97 0.92 1.11 0.63 1.03 11.52
2008 61.56 1.15 4.53 6.63 5.86 8.01 1.56 0.19 1.27 2.21 7.02
2009 60.60 3.48 3.86 5.44 4.44 9.75 1.09 1.58 0.74 1.52 7.50
2010 55.25 9.30 3.86 5.50 5.41 6.93 0.87 1.25 2.51 1.28 7.82
2011 42.87 12.11 18.98 3.18 5.40 5.50 0.32 1.09 2.28 0.83 7.45
2012 48.60 9.23 14.59 5.37 5.15 6.59 0.64 1.15 1.05 0.43 7.19
2013 47.77 21.23 9.08 3.61 4.00 5.08 1.20 0.79 0.90 0.52 5.82

The United States dominates throughout the whole period, with a world share of TOP500 systems ranging roughly between 45% (2002) and 60% (2006). As of 2013 it hosted 252 TOP500 supercomputers, or just over 50% – almost the same share as two decades earlier. Its share of TOP500 performance tells a similar story, ranging roughly between 43% (2011) and 62% (2006); as of 2013 it stands at around 48%.

The surge of China is nothing short of spectacular. As recently as 2001 the country did not have a single machine on the TOP500 list; by 2013 it had 66 (13.2%), more than any other country bar the United States. Its world share of TOP500 performance has risen even more dramatically, from 0% to over 21% in little more than a decade. The rise of China came at the expense not of the United States but of the previous runner-up, Japan. In 1993 Japan commanded nearly 22% of the world’s TOP500 systems and 24% of performance; by 2013 these figures had declined to just 6% and 9% respectively. Germany also experienced a substantial drop in its world share of TOP500 systems, from 10.6% in 1993 to 3.8% in 2013, but it has maintained a performance share of roughly 5% across the two periods.

Due to space limitations, Tables 3.1 and 3.2 do not show the individual shares of countries beyond the top 10. An examination of the full list, however, reveals another interesting phenomenon. In 1993, there were 23 countries on the TOP500 list, all of which were contemporary or future OECD members bar Brazil and Taiwan. A decade later, in 2003, the list had grown to 34 countries, including newly industrialized economies like Singapore and Hong Kong, emerging economies like Malaysia and Thailand, and less prominent ones like Egypt and Oman. Yet another decade on, in 2013, the number of countries on the TOP500 list had fallen to 27, and the list was once again dominated by OECD countries, the exceptions being China, India, Russia, Saudi Arabia, Brazil, Hong Kong, and Taiwan. This finding suggests that despite the seemingly globalized nature of computer technology, the “supercomputer club” has been, and will likely continue to be, a very exclusive group.

High-Tech Output and Supercomputing Capacity

An argument in support of supercomputer capacity being used as a competitiveness indicator is that it is a measure of a country’s science and engineering prowess. In this section, we look at countries’ high-tech sector performance and examine if there is any evidence in support of this assertion. All the data regarding high-tech output and exports are drawn from SEI (2012).10

Figure 3.4 shows the world share of high-tech manufacturing exports (HTME) of the top five countries, based on current-value measures. Surprisingly, China managed to beat the United States to the top spot. This is probably because of a shift of final assembly lines for high-tech products from other Asian and developed economies to China after 2001 (SEI 2012), which means that the gross HTME value greatly overstates China’s value-added. It is therefore necessary to consider high-tech imports as well as exports. Even so, China has been a net high-tech exporter since 1998. In 2010, China’s net HTME was valued at USD 156.7 billion, making it the world’s largest net high-tech exporter. By contrast, the United States has been a net high-tech importer since 1995; in 2010 its net high-tech manufacturing imports were valued at USD 94.3 billion. These figures also need to be interpreted with caution, however, because it is possible that China’s large net HTME value is still largely attributable to its sheer volume of low- to mid-tech processing work.


Figure 3.4 World shares of high-tech exports (%).

Source: Data from SEI (2012).

To shed more light on the issue, Figure 3.5 shows the world share of high-tech manufacturing value-added (HTMVA) of the top five countries. At first glance, the figure shares three features with the supercomputer data in Tables 3.1 and 3.2 and with Figure 3.4, namely the dominance of the United States over the whole sample period, the rise of China since 2000, and the decline of Japan.


Figure 3.5 World shares of high-tech manufacturing value-added (%).

Source: Data from SEI (2012).

To verify whether there is any association between TOP500 share and HTMVA share, Figure 3.6 presents a scatter plot of the world share of HTMVA against the world share of TOP500 systems, from 1993 to 2010.11 The figure shows a strong positive correlation (0.82) between the two world shares. However, the United States is a clear outlier; if it is excluded, the correlation drops to 0.68. If Japan is also treated as an outlier and removed together with the United States, the correlation drops further to 0.46. China, however, seems to follow the path of Japan, suggesting that it may be inappropriate to treat Japan as an outlier after all. Indeed, for consistency, if China is removed together with Japan and the United States, the correlation actually rises to 0.63, as depicted in Figure 3.7. In other words, the high correlation between the two world shares is robust to the exclusion of the top three countries. Yet even a strong correlation does not imply causation, and the results should therefore be interpreted with caution.


Figure 3.6 World shares of high-tech manufacturing value-added and TOP500 systems (%), 1993–2010.

Source: Authors’ own calculation.


Figure 3.7 World shares of high-tech manufacturing value-added and TOP500 systems (%), 1993–2010, excluding the US, China, and Japan.

Source: Authors’ own calculation.
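The outlier analysis reported above boils down to recomputing a Pearson correlation on progressively restricted samples. A minimal sketch of that procedure follows (with made-up numbers; the actual series behind Figures 3.6 and 3.7 come from SEI (2012) and TOP500.org):

```python
import numpy as np

# World shares of HTMVA (x) and of TOP500 systems (y), one pair per observation.
# Values below are illustrative only.
countries = np.array(["US", "Japan", "China", "Germany", "UK", "France"])
htmva_share = np.array([32.0, 18.0, 12.0, 6.0, 4.0, 3.5])
top500_share = np.array([52.0, 10.0, 9.0, 5.5, 6.0, 4.0])

def corr_excluding(excluded):
    """Pearson correlation after dropping the named countries."""
    keep = ~np.isin(countries, excluded)
    return np.corrcoef(htmva_share[keep], top500_share[keep])[0, 1]

print(corr_excluding([]))                        # full sample
print(corr_excluding(["US"]))                    # drop the clearest outlier
print(corr_excluding(["US", "Japan", "China"]))  # drop the top three
```

The robustness claim in the text is simply that the last of these numbers remains sizeable even after the three dominant countries are removed.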

Figures 3.6 and 3.7 only depict the correlation between HTMVA shares and TOP500 system shares, without controlling for any confounding factors. In a related study (Le and Tang 2014), we use regression analysis to examine how growth in a country’s supercomputer capacity affects growth in its HTMVA. The data used in that study are very similar to those used here, with some differences. In particular, that study also takes into account the capacity of supercomputers that have dropped off the TOP500 list over time, because a supercomputer’s operational lifespan is much longer than the average time it stays on the list, and a non-TOP500 supercomputer can still contribute to its host country’s high-tech output.

An advantage of the regression approach is that it can control for other observed and unobserved factors, allowing us to isolate the impact of supercomputer capacity on high-tech output more precisely. In terms of observed factors, the study controls for R&D capital stock, measured by cumulative R&D expenditure, and for high-skilled labor, measured by the percentage of the population aged 25 and over that has completed tertiary education. It also controls for unobserved, time-invariant country heterogeneity, such as the cultural and institutional environment, using country fixed effects, and for unobserved, time-varying global factors, such as the demand for high-tech products or oil prices, using time fixed effects.12 The study covers 27 to 28 OECD and emerging economies (depending on the model specification) and spans the years 1991 to 2010.
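In generic form, a growth regression with the controls just listed can be written as follows. This is a stylized rendering of our description above; the exact specification in Le and Tang (2014) may differ:

```latex
\Delta \ln \mathit{HTMVA}_{it}
  = \beta \,\Delta \ln \mathit{SC}_{it}
  + \gamma_{1}\,\mathit{RD}_{it}
  + \gamma_{2}\,\mathit{SKILL}_{it}
  + \mu_{i} + \lambda_{t} + \varepsilon_{it}
```

Here SC_it is country i’s supercomputer capacity in year t, RD_it its R&D capital stock, SKILL_it its share of tertiary-educated adults, mu_i and lambda_t the country and time fixed effects, and beta the marginal effect of supercomputer capacity growth on HTMVA growth.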

One concern is potential reverse causality: a country with a stronger high-tech sector than other countries might also have a stronger incentive to invest in supercomputers. The methodology of the study alleviates this concern. The empirical analysis is based on within-country variations in the data; that is, we look at the growth in a country’s supercomputer capacity from one year to the next and see how it relates to the growth of the country’s HTMVA over the same period. Even so, one might still argue that if a country’s high-tech sector grows faster in a given year, it may also invest more in its supercomputer capacity in that year. However, investment takes considerable time to materialize into functioning capital, so it is implausible that growth in a country’s HTMVA could contemporaneously affect the growth of its supercomputer capacity. We are therefore reasonably comfortable interpreting the results as evidence of causation.

The study finds that if the annual growth rate of supercomputer capacity increases by 10 percentage points, the annual growth rate of HTMVA increases by 0.27 percentage points, a finding that is statistically significant at the 5% level. While the magnitude of this marginal effect may seem very small, it should be read against the fact that the average annual growth rate of supercomputer capacity in the dataset is over 65%, which, according to the estimates, translates into a 1.76 percentage point increase in the annual growth rate of HTMVA.
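The quoted 1.76 figure follows directly from that estimate by simple scaling:

```latex
\frac{0.27\ \text{pp}}{10\ \text{pp}} \times 65\ \text{pp}
  \;=\; 0.027 \times 65
  \;\approx\; 1.76\ \text{percentage points}
```

That is, at the sample-average growth rate of supercomputer capacity, the estimated contribution to annual HTMVA growth is economically meaningful rather than negligible.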

The study also examines whether the supercomputer capacities of the academic, government, and industry sectors have differential effects. Growth in the academic sector’s supercomputer capacity is found to have a statistically larger impact on HTMVA growth than growth in the government sector’s capacity. However, there are no statistically discernible differences between the academic and industry sectors, or between the industry and government sectors.

Taking all the evidence into account, it seems reasonable to conclude that a country’s supercomputer capacity does have a positive impact on the growth of its high-tech manufacturing sector.

A New Arms Race?

On October 4, 1957, the Soviet Union successfully launched the first Earth-orbiting satellite, Sputnik 1, beating the United States in the first round of their space race. Losing that round spurred the United States to set up the Advanced Research Projects Agency (ARPA) within five months, in order to regain military technological supremacy.13 Since then, US politicians have used the term “Sputnik moment” to stress the need for greater efforts in science and technology development lest the United States be outperformed by other countries. Fast-forward to 2011: in his State of the Union address, President Barack Obama warned that the United States faced another Sputnik moment – this time, having lost to China the title of hosting the world’s fastest supercomputer.

In late 2010, Tianhe-1A (or Milky Way-1A), a supercomputer developed by China’s National University of Defense Technology, became the country’s first supercomputer to reach the pinnacle of TOP500, knocking the US Jaguar down to second place. Prompted by China’s technological ascension, President Obama called for higher levels of R&D spending on biomedical research, clean energy, and information technology. The message is clear: the technology race between China and the United States has intensified.

In June 2012, the US Department of Energy’s 16-petaflops Sequoia regained the crown of fastest supercomputer for the United States; it was in turn overtaken in November by the DOE’s Titan. China responded by committing four billion yuan (about USD 650 million) to building a 50–100 petaflops supercomputer by 2015 as part of its 12th Five-Year Plan for Science and Technology Development (2011–2015) (Zhong 2011). China then took an impressive step toward that target well ahead of schedule: the TOP500 list released in June 2013 confirmed that China’s Tianhe-2 had become the world’s new number one system.14 At a proven speed of 33.86 petaflops, it is almost twice as fast as the previous champion and new number two, Titan.

China’s successes in 2010 and 2013 in building the world’s fastest supercomputer should not be taken as one-off events. China’s investment in HPC technology is for the long haul, and the development of Tianhe-2 signals the amount of resources the Chinese authorities are willing to commit to this race. Some experts estimate that Tianhe-2 cost around USD 400 million to build, four times the cost of Titan (Tsay 2013), and the electricity bill for running it could be up to USD 24 million per year (Thibodeau 2013). This level of investment is all the more impressive given that China’s per capita income, measured in purchasing power parity (PPP) terms, is still only one-sixth that of the United States.

Although it is clear that China has become one of the world’s top players in HPC and has the potential to challenge the hegemony of the United States (Service 2013a),15 its rise should be put into perspective. On the one hand, the United States still has the largest share of TOP500 machines – 3.8 times the share of China (and 2.3 times in terms of performance). The United States can also arguably claim some credit for China’s success: the “brain” of a supercomputer is its processor chips, and Tianhe-2 was built using chips produced by Intel Corporation. On the other hand, despite Tianhe-2’s brain being “Intel inside,” the rest of its body – the operating system, software, and interconnect – was designed and built in China. Industry experts believe that China is well on its way to developing its own processor chips, and that it will soon command a full package of home-grown supercomputing technology (Thibodeau 2013). The economic implication is that China could soon be ready to compete with the key chipmakers Intel and AMD in the commercial chip market (Service 2013b). This scenario is not far-fetched, given China’s good track record in commercializing other high-tech outputs such as satellite and rocket technology.16

The current generation of supercomputers is petascale; the next generation will be exascale, one thousand times faster. Discussing which countries are committed to putting resources into building the first exascale HPC, Service (2013a) echoes the comment of Steven Chu quoted earlier:

The answer will determine more than who gets bragging rights for leading high-performance computing technology into the future. Because the effort is expected to revolutionize the design of everything from computer logic and memory to the interconnections and software that make them all run, the race could determine which country’s high-tech firms are likely to dominate computer technology in the decades ahead. And because scientists who take advantage of the world’s top computers tend to be leaders in their own fields, the race to exascale could also affect which nation’s researchers will drive developments in disciplines including materials science, alternative energy production, and climate research.

In that regard, the Chinese government has already approved the next five-year plan (2016–2020) to develop an exascale HPC (Zhong 2011). In the United States, by contrast, despite the Obama administration’s 2011 call to increase R&D spending, the need to fix the federal government’s debt problems has actually reduced federal R&D expenditure in dollar terms since then; as a share of Gross Domestic Product, it is now at its lowest level in 40 years (Zambon 2013). This has raised concerns that US supremacy in the technology realm might be eroded as a result. The EU, India, Japan, and Russia have also geared up their investment in the supercomputer race (Ramachandran 2013; Izvestia 2012), although that will probably only slow the rate at which they fall behind China and the United States. All indications, then, point to a two-horse race between China and the United States.

With so much emphasis on computer speed in public commentary, it is important to stress that, in the end, it is not raw HPC power that matters but what a country does with that power. In that regard, China’s supercomputers have been highly underutilized compared with their US counterparts (Stone and Xin 2010; Tsay 2013). The main reason is that China’s investment in software, including human capital, has lagged behind its investment in hardware – a common problem in many other Chinese industries.

In their thought-provoking book Race Against the Machine, Brynjolfsson and McAfee (2011) recount how chess competitions involving humans and computers demonstrate that less powerful computers working together with humans can be far superior to a supercomputer alone. In 1996, world chess champion Garry Kasparov played against IBM’s supercomputer Deep Blue, which had been programmed with the help of a team of chess experts. Kasparov won that match. In 1997 the two met again, and this time an improved Deep Blue defeated the champion, raising questions about the superiority of artificial intelligence. With computer power growing exponentially, the contest between human and computer chess players has over time become ever more one-sided. The situation becomes more fluid, however, in “freestyle” competitions, where any combination of human and computer is allowed. In one of the freestyle competitions discussed in the book, a pair of amateur chess players managed to defeat opponents with either grandmaster skills or greater computing power by working with their computers more effectively than the others. Kasparov commented on the result: “Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process” (Brynjolfsson and McAfee 2011: 55).

Supercomputers, like ordinary computers, are essentially tools; their power ultimately lies with the humans who use them. Countries that win the game of speed could still lose every other game if the ultimate users – their people – cannot use the tools creatively and effectively. Recognizing the limitations of speed-based benchmarking, the US administration warns that:

While it would be imprudent to allow ourselves to fall significantly behind our peers with respect to scientific performance benchmarks that have demonstrable practical significance, a single-minded focus on maintaining clear superiority in terms of FLOPS count is probably not in our national interest. Engaging in such an “arms race” could be very costly, and could divert resources away from basic research aimed at developing the fundamentally new approaches to HPC that could ultimately allow us to “leapfrog” other nations, maintaining the position of unrivaled leadership that America has historically enjoyed in high performance computing. (PCAST 2010: 67)

To sum up, if there is an HPC race between countries, it is about much more than the TOP500 ranking. The true race is not even in HPC per se, but in the fundamentals that support the development of top HPCs and other frontier technologies – research, education, and business. In other words, it is a race to be the most advanced knowledge-based nation.

Concluding Remarks

Computational modeling and simulation is described as the third pillar of scientific inquiry, especially in areas where experimentation is dangerous (e.g., testing nuclear weapons), expensive (e.g., investigating aerodynamics), or simply impossible (e.g., recreating the Big Bang). Supercomputers are at the forefront of this kind of inquiry. Some even describe top HPCs as “time machines,” because they give their users access today to capabilities that will not be broadly available for another five to ten years.17 The countries that have such machines can conduct new experiments, design new products, and develop new processes well ahead of the others (NPR 2013). In short, they enjoy a first-mover advantage.

This chapter has looked at the tally of top supercomputers across countries and what it means for their international competitiveness. Our analysis is based on TOP500, which is compiled from publicly available information together with the contributions of HPC experts and vendors. Certain government agencies, such as the military, obviously command huge computing power but will not reveal it for national security reasons; likewise, large private enterprises such as Google or Amazon will not reveal theirs for reasons of trade secrecy. A country’s true prowess in HPC therefore cannot be measured from TOP500 alone. Misrepresentation may also occur at the lower end of an HPC league table. For nations like China, or companies like IBM, reaching the top ranks of the TOP500 list generates positive publicity; countries that would only appear near the bottom, however, may not be willing to put in the effort to participate (Jackson 2013). If this pattern is persistent and widespread, it could undermine the representativeness of the league table.

Also, the results of any league table are specific to the criteria used to rank its subjects. While TOP500 is considered the prime benchmark for HPC, it is not without limitations and competing alternatives. For instance, another HPC benchmark, Graph500,18 emphasizes data-intensive applications instead of raw calculation speed. According to this benchmark, as of June 2013 the US DOE’s Sequoia ranked first while China’s Tianhe-2 ranked only sixth (positions unchanged in November 2014). Because rankings can change when different criteria are used, one should be cautious not to read too much into any snapshot from any particular league table. Focusing on trends is recommended instead, because systematic biases due to specific benchmarking criteria are likely to remain stable over time. On that note, a clear trend in the top supercomputer rankings is the emergence of China as a global player able to challenge the technological supremacy of the United States. This observation is broadly consistent with conclusions drawn from other indicators of innovation input or performance, such as R&D expenditure and patents: the United States is still leading, but China is closing in (see, e.g., Naude, Szirmai, and Lavopa 2013).

One reason why supercomputer rankings have drawn so much attention is that they are viewed as a proxy for countries’ capabilities in advanced technology more broadly, beyond computing alone. Projecting countries’ technological sophistication from their HPC capacity ranking is not a stretch, given the wide-ranging applications of supercomputing in scientific discovery and product innovation. In this chapter we have tried to place the distribution of top supercomputers within the broader picture of international competitiveness, while remaining mindful of how small a part of that picture it occupies.

The broad picture of international competitiveness obviously contains far more elements than technology or knowledge creation. Political and social institutions are equally important, if not more so. Countries like China, India, and Russia, though commanding some of the most advanced technology, lag behind in some very basic areas of development. In his Hong Kong art exhibition, the dissident Chinese artist Ai Weiwei built a huge map of China out of milk powder tins as a reflection on a man-made tragedy in China in 2008, in which 300,000 infants were poisoned by a milk formula powder containing industrial chemicals, and six of them eventually died. He commented: “A country like this can put a satellite into space but it can’t put a safe bottle teat into a child’s mouth. I think it’s extremely absurd” (Reuters 2013).

If China’s rise in the TOP500 league tables is a sign of a technology superpower, then its countless social scandals and cases of corruption are equally the symptoms of a sick nation. These two seemingly polar developments are likely to coexist in the medium term. In the long run, a country’s advancement in science and technology requires not only the state’s mobilization of resources, but also a stable and functioning institutional environment in which laws and regulations are implemented impartially, and in which investment is governed by economic principles rather than political interests.

Last but not least, the fast pace of computer evolution is well known, and supercomputers are no exception: Moore’s Law implies that they evolve at the pace of one generation every few years. As such, the national and international landscape of HPC in a decade or two could look very different from today’s. Indeed, in the future, defining the high-tech landscape by national borders may no longer suffice. For instance, the emergence of numerous private spaceflight companies (e.g., SpaceX) competing for a slice of the aerospace travel/transportation market – created partly by the termination of NASA’s space shuttle program – has tested the decades-old idea that the space race must be like the Olympic Games: one nation against another. A similar situation can be observed in supercomputing. As shown in Figure 3.1, over the past two decades the industry sector has increasingly become the dominant force in HPC, at the expense of the government and academic sectors. To the extent that the companies that can afford to invest in HPC tend to be large multinational corporations, taking their cross-border activities into account when measuring national technological competitiveness could prove a challenging issue.

References

  1. Anthony, Sebastian. 2012. “The History of Supercomputers.” ExtremeTech, April 10. http://www.extremetech.com/extreme/125271-the-history-of-supercomputers (accessed December 19, 2014).
  2. Brynjolfsson, E., and A. McAfee. 2011. Race Against the Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy. Lexington, MA: Digital Frontier Press.
  3. China Daily. 2013. “Chinese Seek Greater Share of Satellite Market.” China Daily, June 20.
  4. Cray. 2011. “LED Lighting Comes Out of the Dark.” Cray.com. http://www.cray.com/Assets/PDF/products/xe/XE-NERSC-LED-0811.pdf (accessed December 19, 2014).
  5. De Solla Price, D.J. 1983. “The Science/Technology Relationship, the Craft of Experimental Science, and Policy for the Improvement of High Technology Innovation.” Research Policy 13(1): 3–20.
  6. Eccles, B. 1989. “Deciding to Acquire a Powerful New Research Tool – Supercomputing.” In Supercomputers: Directions in Technology and Application, 73–80. Washington, DC: National Academy Press.
  7. Fischetti, M. 2011. “Computers Versus Brains.” Scientific American, October 25.
  8. Hawkins, J. 2004. On Intelligence. New York: Times Books.
  9. Izvestia. 2012. “Russia Joins the Supercomputer Race.” Izvestia, August 24.
  10. Jackson, J. 2013. “Growing Inequality in Supercomputing Power.” PCWorld, November 20.
  11. Le, T., and K. Tang. 2014. “Impacts of Academic R&D on High-Tech Manufacturing Products: Tentative Evidence from Supercomputer Data.” Studies in Higher Education, published online April 14.
  12. Lenzner, R. 2013. “IBM CEO Ginni Rometty Crowns Data as the Globe’s Next Natural Resources.” Forbes, March 7.
  13. Markoff, J. 2011. “The iPad in Your Hand: As Fast as a Supercomputer of Yore.” The New York Times, May 9.
  14. McBride, R. 2013. “Big Data Hits Eastern Europe as IBM Fires Up Supercomputer.” FierceBiotechIT.com, April 5. http://www.fiercebiotechit.com/story/big-data-hits-eastern-europe-ibm-fires-supercomputer/2013-04-05 (accessed December 19, 2014).
  15. Middleton, J. 2013. “The World’s Fastest Computers.” Australian Personal Computer 33(5): 64–69.
  16. Moskowitz, C. 2012. “Supercomputer Recreates Universe from Big Bang to Today.” Space.com, September 11. http://www.space.com/17530-universe-dark-energy-supercomputer-simulation.html (accessed December 19, 2014).
  17. Naude, W., A. Szirmai, and A. Lavopa. 2013. “Industrialization Lessons from BRICS: A Comparative Analysis.” IZA Working Paper 7543.
  18. NNSA. 2012. “Titan Supercomputer Named Fastest in the World, NNSA’s Sequoia Supercomputer Ranked Second.” National Nuclear Security Administration, November 13. http://nnsa.energy.gov/blog/titan-supercomputer-named-fastest-world-nnsa%E2%80%99s-sequoia-supercomputer-ranked-second (accessed December 19, 2014).
  19. NPR. 2013. “A Calculating Win for China’s New Supercomputer.” NPR.org, June 21. http://www.npr.org/2013/06/21/194230816/a-calculating-win-for-chinas-new-supercomputer (accessed December 19, 2014).
  20. Nuttall, C. 2013. “Supercomputers: Battle of the Speed Machines.” Financial Times, July 9.
  21. Olds, D. 2012. “Supercomputing vs. Our Computing: iPad Owns Cray-2? Wife’s Desktop Beat All?” Gabriel Consulting Group, March 8.
  22. Padua, D., and J. Hoeflinger. 2003. “Supercomputers.” In Encyclopedia of Computer Science, 4th edn, ed. A. Ralston, E. Reilly, and D. Hemmendinger. Chichester, UK: John Wiley & Sons Ltd.
  23. PCAST. 2010. Designing a Digital Future: Federally Funded Research and Development in Networking and Information Technology: Report to the President and Congress. President’s Council of Advisors on Science and Technology.
  24. Ramachandran, T. 2013. “India Tries to Regain Lost Ground in Supercomputer Race.” The HINDU, July 4.
  25. Reuters. 2013. “China’s Ai Weiwei Takes Inspiration from Milk Scandal.” Reuters, March 17.
  26. Riley, C., and D. Campbell. 2012. “The Maths That Made Voyager Possible.” BBC News, October 23. http://www.bbc.co.uk/news/science-environment-20033940 (accessed December 19, 2014).
  27. Scherer, M. 2012. “Inside the Secret World of the Data Crunchers Who Helped Obama Win.” Time Swampland, November 7.
  28. SEI. 2012. Science and Engineering Indicators 2012. Arlington, VA: NSB.
  29. Service, Robert F. 2013a. “Who Will Step Up to Exascale?” Science 339(6117): 264–266.
  30. Service, Robert F. 2013b. “China’s Supercomputer Regains No. 1 Ranking.” Science Insider, June 18.
  31. Shadbolt, P. 2013. “China Sets Course for Lunar Landing this Year.” CNN.com, August 30.
  32. Stone, R., and H. Xin. 2010. “Supercomputer Leaves Competition – and Users – in the Dust.” Science 330(6005): 746–747.
  33. Thibodeau, P. 2013. “What China’s Supercomputing Push Means for the U.S.” Computerworld, June 10.
  34. Tsay, B. 2013. “The Tianhe-2 Supercomputer: Less Than Meets the Eye?” SITC Bulletin Analysis, July. University of California Institute on Global Conflict and Cooperation.
  35. Zambon, K. 2013. “AAAS Analysis Shows Uncertain Future for Federal R&D Spending.” American Association for the Advancement of Science, May 2. http://www.aaas.org/news/aaas-analysis-shows-uncertain-future-federal-rd-spending (accessed December 19, 2014).
  36. Zhong, J. 2011. “Recent Advances of Chinese Efforts in HPC.” International Exascale Software Project (IESP) Exploratory Meeting 6, San Francisco, CA, USA, April 6–7. http://www.exascale.org/mediawiki/images/b/b8/Talk25-zjin.pdf (accessed December 19, 2014).

Notes
