What happened during the downturn in the 2000s?
Please God, just one more bubble.
—Popular Silicon Valley bumper sticker, 2003

In many ways, the second installment in the boom-and-bust history of computer science is easier to explain than the first. The growth in interest that began in the mid 1990s coincided with the advent of the web and the elimination of commercial restrictions from the Internet. These developments ushered in a period of frenetic growth in the computing industry generally referred to as the dot-com boom or, when it is important to emphasize its ephemeral nature, the dot-com bubble. The excitement generated by both the new technologies and the opportunities provided by the startup culture attracted many students back to the field. In the years of flat enrollments between 1991 and 1996, departments had been able to rebuild their faculties, which meant that there was capacity—at least at the beginning—to accommodate a rise in student numbers. When the tech bubble burst in 2001, students began to move away from computer science, which led in turn to a multiyear decline. Figure 3 shows the rise and fall during this cycle of history.

Figure 3. BS degrees in computer science, 1996-2009

Even though the causes of both phases of this cycle are easy to identify, it is still worth considering this period of history in more detail. From 1997 to 2003, the number of computer science graduates rose by an average of 15 percent per year, with many institutions seeing considerably larger increases. That rapid rate of growth raised echoes of the expansion of the early 1980s. Several committees were formed to study the problem in the hope that academic computer science could avoid the meltdown it had suffered a decade and a half before.
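The cumulative effect of that growth rate is worth pausing on: compounding 15 percent annually over the six years from 1997 to 2003 more than doubles degree production. The short Python sketch below works through the arithmetic; the 1997 baseline of 25,000 degrees is an illustrative assumption, not a figure from this chapter.

```python
# Illustrative arithmetic only: the chapter reports an average annual
# growth of 15 percent in CS degree production from 1997 to 2003.
# The 1997 baseline below is an assumed round number for illustration.
start_degrees = 25_000   # assumed 1997 baseline (hypothetical)
rate = 0.15              # 15 percent average annual growth
years = 6                # 1997 -> 2003

factor = (1 + rate) ** years          # cumulative growth multiplier
projected = start_degrees * factor    # implied 2003 degree production

print(f"growth factor over {years} years: {factor:.2f}")
print(f"projected 2003 degrees: {projected:,.0f}")
```

At 15 percent per year, the multiplier works out to about 2.31, so any department that merely kept pace with the average saw its graduating class more than double over the period.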

In 2000, academic computer science did avoid a meltdown, but not for any reason one would like to repeat. In a perverse sense, academia got lucky. The industry collapsed first. Had it not done so, it seems likely that capacity limitations would have forced universities to restrict enrollment, with all the negative consequences that the field endured in the late 1980s.

The aspect of the collapse in the early 2000s that seems hardest to explain is why student interest took so long to recover. The industry bounced back quickly and was hiring at the pre-crash rate by 2004. Student numbers, however, did not begin to rise substantially until after the subprime mortgage crisis in 2007. Somehow, a belief took hold in the public consciousness that computing jobs were in danger of imminent collapse, either because they would be automated out of existence or because all software development jobs would be shipped offshore. Although there was no evidence to justify those fears—and ample data to refute them—that mythology kept students out of computer science until disaster struck in a different sector of the economy.

The sections that follow look more closely at the effect of the dot-com expansion on academic capacity and the failure of student interest to recover even after the dot-com collapse had passed.

The effect of the dot-com boom on enrollments

The frenzy of excitement around the dot-com explosion in the mid 1990s generated enormous student interest in computer science. As Figure 3 illustrates, the number of bachelor’s degrees in computer science rose steadily from 1997 to 2003. Because degree totals lag students’ choice of major by roughly two years, those numbers reflect decisions made between approximately 1995 and 2001, which aligns with the years of the dot-com boom.

During those years, the situation facing computer science departments corresponded closely to the rapid expansion of the early 1980s and generated a similar set of pressures. Along with the double-digit annual growth in student numbers, departments faced a shortage of available faculty. In the December 1998 issue of the SIGCSE Bulletin, Paul Myers and Henry Walker published a review of academic hiring, which concluded that there was “a very serious shortage of new Ph.D.s in computer science,” to the point that in 1997-98 “only about half of the open tenure-track positions were filled.”8 While that level of undersupply falls far short of the seven-positions-for-every-applicant crisis of the 1980s, it nonetheless generated considerable concern, not only in university departments, but also in the media, industry, and government.

In 1999, the Computing Research Association published a report entitled The Supply of Information Technology Workers in the United States, detailing the shortage of workers in both industry and academia. The report observed that academic institutions faced a special problem, invoking the “seed-corn problem” popularized by Peter Denning a decade earlier.9

Many educators, industrial laboratory leaders, and government science officials are concerned that the high industrial demand for information technology (IT) workers will siphon out of the educational systems many students who would otherwise pursue an advanced degree. This diminishes [the] pool of people who will join the university faculties that perform basic research and teach the next generation of students. This problem is compounded when industry also successfully recruits current faculty members, including junior faculty who would become the academic leaders of the profession in the coming decades. This is known as the “seed-corn” problem—an analogy to those who consume too much of this year’s crop, reserving too little for next year’s planting.

As was true in the mid 1980s, the problems of faculty recruitment were noted by the media. In September 1999, The Chronicle of Higher Education included a news story entitled “Computer scientists flee academe for industry’s greener pastures” that begins with the following evocative paragraphs:10

Just as he prepared to leave Cornell University last spring to help start a new high-technology company, Thorsten von Eicken got word that the computer-science department at Cornell had voted to grant him tenure.
    He left anyway.
    Mr. von Eicken is part of a stampede of bright, young Ph.D.s in computer science who are abandoning academe for the corporate world.
    High-paying, fast-paced jobs in the computer industry are attracting both seasoned academics and newly minted Ph.D.s who, in the past, would have opted for careers in higher education. The upshot: Computer-science and computer-engineering departments are suffering a serious shortage of professors at a time when undergraduate enrollments are booming.
    Many departments are losing professors faster than they can hire them. The University of Illinois at Urbana-Champaign recruited five new professors in electrical and computer engineering to start this fall, but lost five others who were already on its faculty. The University of Washington recruited four scholars to its department of computer science and engineering but lost five. Cornell hired three but lost six.

The difficulty of faculty recruitment was also picked up by The New York Times, which ran an article entitled “Computer science departments are depleted as more professors test entrepreneurial waters” on August 9, 2000.11 The article quotes Ed Lazowska, then chair of the Computer Science and Engineering Department at the University of Washington, as follows:

It is difficult to hold a computer science department together these days. You’d like to keep a lot of that entrepreneurial energy here. Faculty recruiting and retention are difficult. Ten years ago, industrial research labs were the enemy; now it’s the lure of startups.

In 2001, the National Academies released a major study entitled Building a Workforce for the Information Economy that looked broadly at the questions of the computing and information-technology workforce, including several issues concerning education. At the request of the study panel for the National Academies and with the endorsement of the ACM Education Board, I submitted a white paper that focused on how the shortage of faculty candidates was making it impossible for universities and colleges to meet the demand from employers for graduates with the necessary level of expertise.12 That white paper was cited several times in the final report, which issued the following conclusion with respect to higher education:

The academic research enterprise in IT continues to be strong, but industry and academia are competing for the same small pool of highly productive, creative individuals. Ph.D. production and faculty recruitment and retention are both threatened by the lure of the commercial sector. Some faculty and graduate students are leaving academia for better-compensated positions in industry; others leave because only industry (especially start-ups supported by venture capital) offers them the opportunity to pursue their intellectual and research interests. . . . Compared to the benefits to be found in industry and start-ups, academic life—with the attendant burdens of low salaries, teaching, and the need to obtain grant support—is increasingly seen as unattractive to many graduate students. The long-term significance of these perceptions is at present unclear, but they do not bode well for the long-term health of the IT field.

Although many of the discussions that led to the National Academies report took place in 1999 and 2000, the final version was not released until 2001. By that time, the situation in the computing industry had changed entirely. The speculative bubble that had fueled the growth of a vast array of dot-com companies collapsed, and the industry went into a tailspin. The NASDAQ composite index—which had risen from 740.47 at the beginning of 1995 to a high of 5,132.52 on March 10, 2000—collapsed to 1,108.49 by October 10, 2002. With that collapse, investors lost trillions of dollars, the wide-open job market of the late 1990s disappeared (if only for a couple of years), and students who had been lining up to major in computer science, like many faculty members and graduate students before them, started to look for greener pastures.
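For a sense of the scale of that collapse, the index values quoted above imply a peak-to-trough loss of roughly 78 percent, a decline that can be checked with a few lines of Python:

```python
# Index values as quoted in the text above.
peak = 5132.52     # NASDAQ intraday high, March 10, 2000
trough = 1108.49   # NASDAQ level, October 10, 2002

decline = (peak - trough) / peak   # fractional peak-to-trough loss
print(f"peak-to-trough decline: {decline:.1%}")
```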

The slowness of the recovery after the dot-com collapse

On one level, the decline in computer science enrollments after the dot-com collapse is easy to understand. At the time, the news was full of stories of the demise of Internet startups, so recently the darlings of Wall Street, many of which lost their entire value overnight. As startups collapsed and large companies started downsizing, the high-tech industry did not seem like a good bet for stable employment over the long term. Students headed off in other directions.

What is paradoxical about the downturn in student interest is that it persisted for many years after the industry had fully recovered. While there was a small dip in overall employment in the information-technology sector between 2000 and 2002, employment numbers quickly surpassed their pre-crash levels. By 2005, employment prospects for students completing a bachelor’s degree in computer science and other specialties in information technology were considerably better than they were for students in any other discipline. That conclusion is underscored in the following excerpt from a December 2005 publication from the National Science Foundation:13

Continuing a pattern that has been evident for decades, recent bachelor’s and master’s engineering graduates and computer science graduates at the bachelor’s level are more likely than graduates in other fields to be employed full time after graduation, and upon entering the workforce, they are rewarded with higher salaries.

These conclusions were echoed in the popular press. In May 2006, Money magazine rated “software engineer” as the best job in America on the basis of a combination of factors including salary, job availability, potential for growth, flexibility, and creativity.14

One of the best analyses about the shortage of software professionals appears in a talk by John Sargent, Senior Policy Analyst at the Department of Commerce, which he presented at the CRA Computing Research Summit in February 2004.15 Although the entire presentation is worth viewing—in part because it is striking how little things have changed over the past decade—the talk is particularly memorable for the slide that appears in Figure 4, which combines data on degree production from the Department of Education with job projections from the Bureau of Labor Statistics.

Figure 4. Degree production and projected job openings in computing-related fields

Because of its unusual effectiveness, several people have updated this slide as new releases of the relevant data become available. The most recent version I’m aware of was created by my colleague Phil Levis and is available from http://csl.stanford.edu/~pal/ed/. The take-home message of the graph, however, has remained constant over the past decade: universities are producing far too few graduates in computer science to meet industry demand.

The bar that towers over the other data points in Figure 4 makes it immediately clear that the number of jobs in computing-related disciplines far exceeds the number of students trained in those fields. If students were responding to market forces, the imbalance between degree production and job growth would have sparked a stampede toward computer science. That stampede did not happen. Despite the many economic advantages available to those with computer science degrees, students shied away from the field until after the subprime mortgage crash in 2007.

The reasons behind the lingering unpopularity of computer science during the 2000s are complex. The factors certainly included memories of the pain associated with the collapse of the dot-com bubble and a widespread fear that computing jobs would soon be shipped offshore to low-wage countries. The fear of offshoring was particularly intense, even though a 2006 ACM report entitled Globalization and Offshoring of Software found no evidence that software jobs were disappearing in developed countries. In fact, the report found that “despite a significant increase in offshoring over the past five years, more IT jobs are available today in the U.S. than at the height of the dot-com boom” and, moreover, that “IT jobs are predicted to be among the fastest-growing occupations over the next decade.”16

An interesting illustration of the disconnect between the available economic data and popular perception appears in the online response to a keynote address at the CIO Leadership Conference by Maria Klawe, then Dean of Engineering at Princeton University, with the title “Blue skies ahead for IT jobs.”17 The abstract for Klawe’s talk reads as follows:

Contrary to popular belief, career opportunities in computer science are at an all-time high. We’ve got to spread that message among students from a rainbow of backgrounds, or risk becoming a technological backwater.

The comments that Klawe’s talk elicited—which have, unfortunately, vanished along with the original website at CIO Magazine—ran at least ten-to-one against her assessment of the sunny outlook for the field. Here are a few typical reactions that I downloaded at the time:

The last of these comments, which is quoted in its entirety, seems particularly telling. The reader offers no alternative data, just a deep-seated belief that the optimistic forecasts of the Labor Department must be wrong. Evidence counted for little in this debate.

Ironically, popular fears about the tenuous future of the discipline were in some cases encouraged by comments from within the academic community. In July 2008, Communications of the ACM published a debate about future directions for the technology curriculum.18 Professor Stephen Andriole at Villanova University predicted that the need for programmers would soon diminish:

Of course there will be programming jobs for our students. But the number of those jobs will decline, become more specialized, and distributed across the globe. . . . Today, Fortune 1000 companies have far fewer programmers than they did because of the rise of packaged applications and the labor-rate-driven sourcing options they now have. This trend will accelerate resulting in fewer programming jobs for our students. Should we continue to produce more programmers?

In my response, I argued that Andriole was looking only at one sector of the technology industry and that the number of jobs across the industry as a whole would continue to rise, in line with the predictions from the Bureau of Labor Statistics. In recent years, the sustained increase in the number of software jobs makes it clear that my analysis was closer to the mark.

Although I never published an analysis of why students continued to stay away from computer science long after the industry recovered, I did prepare a report for the ACM Education Board that considers this paradox in more detail.19