Information Technology Policy: An International History

Richard Coopey

Print publication date: 2004

Print ISBN-13: 9780199241057

Published to Oxford Scholarship Online: September 2007

DOI: 10.1093/acprof:oso/9780199241057.001.0001


The Supply of Information Technology Workers, Higher Education, and Computing Research: A History of Policy and Practice in the United States


Chapter:
3 The Supply of Information Technology Workers, Higher Education, and Computing Research: A History of Policy and Practice in the United States
Source:
Information Technology Policy
Author(s):

William Aspray

Publisher:
Oxford University Press
DOI:10.1093/acprof:oso/9780199241057.003.0003

Abstract and Keywords

This chapter presents the history of policies and practices in the supply of IT workers, especially as they relate to policy for higher education and computing research, in the United States. There is no direct worker policy that mandates how many workers of various types need to be trained. Instead, policy about the supply of IT workers is vested in other kinds of policy: national research output, education, defence, social welfare, immigration, national economic competitiveness, and taxation. Concerns about the national supply of IT workers are tied directly to scientific research and higher education policy, and these are the topics that are investigated most extensively in this chapter. Both the National Science Foundation (NSF) and the National Academies of Science and Engineering have been key players in this discussion.

Keywords:   IT workers, National Science Foundation, higher education, Sputnik era, computer science, government policy

Introduction

During the late 1990s, the supply of Information Technology (IT) workers became a major policy issue in the United States. The Department of Commerce, following the lead of a major trade association (the Information Technology Association of America, ITAA), identified a serious shortage of IT workers and argued that this shortage could do serious harm to the economic fortunes of the nation. However, the General Accounting Office questioned the evidence cited by Commerce and expressed skepticism about the existence of a shortage. Debate crystallized around proposed legislation to increase the number of visas awarded annually under the H-1B temporary visa program for high-tech workers, with the trade associations in favor of an increase in the number of visas and the labor unions opposed. Compromise legislation was signed into law, but lobbying began within months to increase the number even further.

Several related policy issues have also been under scrutiny at the national level. The number of workers from minority populations (other than Asian Americans) entering the IT workforce is very small. The percentage of women training for jobs in IT has been declining throughout the 1990s, after some promising gains in the 1980s. Many people are worried about a “seed-corn” problem—that industry will siphon off so many college faculty and graduate students from computer science departments that there will not be an adequate supply of faculty to train the next generation of students for the IT workforce. Concerns about the negative features of the current academic research environment—such as inadequate research funding, the focus on short-term results, and onerous teaching and administrative responsibilities—have led to proposals from the White House, now under consideration by Congress, for an unprecedented increase in federal support for computing over a 5-year term. The Department of Education and the National Science Foundation (NSF), meanwhile, are bolstering institutions, such as community colleges, that produce low-end IT workers. Legislation is also under consideration to provide tax credits to companies that invest in training for their IT workers. Numerous studies conducted at the state and regional level have resulted in programs to develop a larger and better-trained workforce for the local employers.

The current policy issues concerning IT have been discussed elsewhere and are not the main subject of this chapter.1 Instead, this chapter presents the history of policies and practices in the United States concerning the supply of IT workers, especially as they relate to policy for higher education and computing research. There is no direct worker policy that mandates how many workers of various types will be trained. Instead, policy about the supply of IT workers is vested in other kinds of policy: national research output, education, defense, social welfare, immigration, national economic competitiveness, and taxation. Concerns about the national supply of IT workers are tied directly to scientific research and higher education policy, and these are the topics that are investigated most extensively in this chapter. Both the NSF and the National Academies of Science and Engineering have been key players in this discussion.

Some caveats should be mentioned. In several places, the chapter diverges from worker policy to cover the more general history of policy concerning higher education and research. The chapter is heavily slanted toward the NSF. It perhaps pays too little attention to the Department of Education (and its predecessor organizations) and the National Academies (and their research arm, the National Research Council, NRC), whose computing policies and programs have not been seriously investigated by historians. The chapter does not consider the fellowship programs for graduate students offered by the National Aeronautics and Space Administration (NASA), the National Institutes of Health (NIH), the Department of Energy (DOE), or the Armed Services. In considering NSF programs, the chapter does not track all the general fellowship programs and infrastructure support to community colleges and historically black universities, which have some bearing on computer training. The chapter also ignores general immigration policy and tax credits to industry for research and training, which probably have had only modest bearing on IT workers. The focus is primarily on supply issues and, in particular, on support for higher education.

Not Enough Mathematicians! (1945–1959)

The supply of IT workers was not a national policy issue in the 1940s and early 1950s. At most, there was limited monitoring of the situation by research-oriented agencies, such as the NSF and the Office of Naval Research (ONR). The reasons are straightforward. First, there were few computers, and hence few people were needed to build, operate, or use them. By 1954, there were at most a few hundred computers, even when one counts automatic, electronic calculating devices that are not stored-program computers, such as card-programmed calculators. Second, the expense, size, power consumption, and high maintenance of computers seemed at that time to limit the future market to large companies, research universities, and government. Third, the computer was originally conceived as a giant computational engine, rather than as a data processing machine, and the number of scientific applications would turn out to be much smaller than the number of business applications. Fourth, it was hard to obtain a computer, even if you were willing to suffer the expense and inconveniences; only in the mid-1950s did an industry coalesce to build computing machinery.

The idea of the computer as a machine for business applications began to emerge in the early 1950s. This, combined with the anticipated need for more computers and more computer workers, led to the first conference on training personnel for the computing machine field, held at Wayne University in 1954. Wayne University had an active, early university computing program, strengthened by its partnerships with the local Detroit industries; thus it was a logical choice to host this training conference. The conference provides a good snapshot of the supply of computing workers at the time.

Remarks made at the conference indicate that some people still regarded the computer as a mathematical instrument, while others were beginning to recognize its role as a data processing machine. Representatives of the federal agencies that support science, perhaps not surprisingly, examined the training issues entirely in terms of the computer as a mathematical instrument. For example, NSF's Leon Cohen wrote: “First, the effective use of the computing machine depends on the development of appropriate mathematical methods. Second, the development of mathematical methods depends on the development of mathematicians.”2 F. J. Weyl, ONR, argued: “Particular scrutiny has been given by this conference to our current mathematical education—and rightly so.”3 However, others at the conference believed that the supply of workers was an issue primarily because of the growing commercial applications. The conference chair, Arvid Jacobson of Wayne University, indicated this in his opening remarks: “With the new applications of electronic techniques in accounting and business management during the past year, it is timely that the problems of training and manpower should be examined.”4 G. T. Hunter of IBM stated: “In the next few years there will be considerably more men required for the commercial field than for the scientific field.”5 Lt. Col. C. R. Gregg from the Air Materiel Command chose to restrict his comments about computing in the federal government to the “non-scientific areas of use” in part because “the real quantity requirements for personnel, with the possible exception of maintenance technicians and electronic engineers, will be in this area.”6

Even for those participants who saw the business applications as the growth area of computing, the emphasis was on training mathematicians to do the work: “We cannot ignore—or even for a moment forget—that sophisticated mathematical techniques have a very definite place in the nonscientific as well as in the scientific applications.”7 Several speakers, including Leon Cohen from NSF and Franz Alt from the National Bureau of Standards (NBS), pointed to the small supply of mathematicians in the United States as a serious labor problem for the burgeoning computer field. A survey conducted in 1951 had shown that there were approximately 2000 PhD mathematicians in the nation.8 The 1950 national census had counted about 5600 professors and instructors of mathematics, as well as 1700 other mathematicians. Cohen noted that the average age of a PhD mathematician was 41, indicating that most of these mathematicians had completed their studies before the computer was invented.9

There was great uncertainty about the present and future demand for personnel in the computing field. One of the more outspoken members of the profession, Herbert Grosch of General Electric (GE), estimated the number of current jobs in the computing field at between 2000 and 4000. He argued that the number of positions would double annually for the foreseeable future and would reach 1 million jobs by the end of the 1960s. (In fact, although the computing field grew rapidly, the number of IT workers did not reach 1 million until 1985.)10

At the conference, Milton Mengel of the Burroughs Corporation presented a talk on “Present and Projected Computer Manpower Needs in Business and Industry,” based on a survey he conducted of 500 of the largest manufacturing companies in the United States. Banks, insurance companies, utilities, and retailers were excluded from the survey primarily because Mengel believed them to have special computing needs. Responses were received from 139 companies (28 percent). Of that number, twenty-seven were currently using computers or had them on order, while another twenty-three were considering the possibility and six had studied computers and found them uneconomical. Of the companies that already had computers in use, 65 percent were using them for engineering calculation, 30 percent for scientific and research studies, and 5 percent for business data processing. But it was clear that the trend was toward increased use in business applications. When those computers that were on order were added in, the percentages for engineering, scientific, and business applications shifted to 58, 26, and 16, respectively. The business applications that were most commonly mentioned were those that were computationally most demanding, such as sales and production forecasting, personnel evaluation, shop load, and production scheduling—as opposed to payroll preparation, invoicing, and other arithmetically simple business problems.

The survey asked a number of other questions. The types of workers that would be needed by computer users were reported to be: Analyzers 43 percent (1000), engineers 8 percent (175), programmers 17 percent (400), operators 25 percent (590), technicians 3 percent (70), and others 4 percent (90)—total 100 percent (2325). The numbers in parentheses represent the number of workers of each type that would be needed by all 500 of the companies surveyed, based upon a linear projection from the 28 percent sample. Mengel noted that a greater percentage of the responses came from larger companies, meaning that the linear projection might overstate demand slightly. Asked whether they had difficulty in fulfilling their manpower requirements, fourteen companies replied in the affirmative and nine in the negative. Companies with computers recruited their computer workers from a variety of sources: Colleges and universities 37 percent, company training programs 37 percent, suppliers of equipment 12 percent, government agencies 7 percent, and advertisements 7 percent. Company-operated training programs were regarded as very important. Asked to what extent their company had found it necessary to train its own personnel, 50 percent reported they trained all of their workers, 35 percent trained at least half of their workers, and 15 percent trained less than half of their workers. Out of twenty-seven companies, twenty-six stated that they believed colleges and universities should pay more attention to training for the computer field, with one company having no opinion.
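Mengel's projection is a simple proportional scale-up, and the arithmetic can be made explicit. The sketch below is ours, not Mengel's: it back-calculates illustrative sample counts from the projected totals quoted above, multiplies by the scale factor 500/139, and reproduces the percentages in the text.

    # Mengel's linear projection: counts reported by the 139 responding
    # companies (28 percent of the 500 surveyed) are scaled up by 500/139
    # to estimate demand across all 500 companies. The sample counts below
    # are back-calculated from the projected totals quoted in the text and
    # are illustrative, not the survey's raw data.
    projected = {            # workers needed by all 500 companies
        "analyzers": 1000, "engineers": 175, "programmers": 400,
        "operators": 590, "technicians": 70, "others": 90,
    }
    scale = 500 / 139        # linear projection factor, roughly 3.6
    sample = {job: round(n / scale) for job, n in projected.items()}
    total = sum(projected.values())   # 2325
    for job, n in projected.items():
        print(f"{job:12s} sample ~{sample[job]:4d}  projected {n:5d}"
              f"  ({100 * n / total:4.1f}% of {total})")

The caveat Mengel himself noted applies to any such scale-up: because larger companies were overrepresented among the respondents, a straight linear projection probably overstates total demand somewhat.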

Although the responses to the survey question about the difficulty in finding computer workers did not suggest a serious problem, a number of conference participants indicated that they believed there was a shortage. G. T. Hunter argued that there was “a general shortage of technically trained people.”11 W. H. Frater (General Motors) opened his presentation by speaking of “a universal feeling that there is a definite shortage of technically trained people in the computer field.”12 E. P. Little (Wayne University) stated:

You have heard at this conference the estimates of manpower needs for computer applications. The figure is astounding compared to the facilities for training people for this work. Less than twenty institutions of higher learning have formal courses directly related to large automatic computing machines. Less than half of these have a large machine available for the use of teachers and students, and few of these machines are being used in the study of problems related to the data processing and management decision requirements of business.13

Speaking of government needs in the computing area, Col. Gregg indicated “our biggest difficulty in this area…is the attracting of adequate numbers of qualified people. This is particularly so in localities where we are placed in competition with industry salary-wise.”14 This surprised Gregg somewhat because “the primary importance is their [the computer's] ability to perform…without human intervention…However, depending on the types of problems being solved, we may find no decrease in the overall numbers of people required. As a matter of fact, in a pure ‘job-shop’ there may be an actual increase.”15 Gregg noted that the need varied by occupation—the government had a great deal of trouble getting enough qualified programmers, but no difficulty in obtaining enough data-preparation people. A few government organizations that were heavy users of computers formed their own training programs.16 Aberdeen Proving Ground, for example, carried out 21,000 hours of machine computation in 1953 and employed seventy people in its Analysis and Computation Branch. Aberdeen preferred to hire people with at least a master's degree in mathematics. Those hired with only a bachelor's degree in mathematics were given training at the BRL Ballistics Institute (organized in 1948) to ensure that they would be effective programmers.17

Mission agencies, including the Army, Navy, Air Force, and Atomic Energy Commission (AEC), sponsored a limited number of university research projects in the first half of the 1950s. This funding provided some support for graduate students and for university infrastructure, thus helping to establish the universities as part of the supply system. The principal purpose of this funding, however, was to meet the needs of the government agency, not to train a national workforce of computing professionals.

The federal agency most concerned at the national level about scientific education and research was the NSF. But NSF did not have a program in computing at this time. Computing was then the responsibility of the agency's mathematics program. The program in mathematics included funding for sabbatical leaves for faculty, release time for research, research assistantships, group research grants, predoctoral and postdoctoral fellowships, and summer conferences for mathematics teachers. Although each of these kinds of funds could have been applied to computing, there were virtually no grants of these kinds made for computing prior to 1955. Despite Col. Gregg's statement that “often repeated interest is shown in this field by Congress, by the Office of the Secretary of Defense, and by active groups within each of the military departments and in other departments of the government,” there was no indication that the federal government recognized any national computing labor shortage at this time, and there were certainly no federal programs to increase the supply.18

By the mid-1950s, the computer had become an important tool of national defense. Had the federal government experienced a shortage of people to staff the construction and operation of its computers, there might have been federal action. However, the number of personnel needed to operate and program the machines in government organizations was still modest. This was due, in part, to the fact that the construction of large computer systems for the government was contracted out to private industry. IBM, for example, was contracted to design and build the computer hardware for the Semi-Automatic Ground Environment (SAGE) air defense system. Thus, much of the labor requirement devolved from the federal government to private industry. Companies, such as the System Development Corporation (SDC), did indeed have difficulty in finding enough computer personnel to staff these projects.

The SDC, formed in 1956 to carry out the programming for the SAGE air defense system, was perhaps the largest employer of programmers in the 1950s. The company employed 700 programmers in the late 1950s, and the number rose into the thousands in the 1960s. It originally sent its new employees to a short course in programming offered by IBM, but soon it formed its own training school. Between 1956 and 1961 it trained 7000 programmers and systems analysts. Turnover was high, and many of the programmers working in the United States had learned to program at SDC. A company advertisement of the late 1950s indicates that it was looking for mathematicians: “Use your past math background for your future in computer programming.” But, like many other large employers of computer workers, it also used aptitude testing (the Thurstone Primary Mental Abilities test) and personality testing (the Thurstone Temperament Schedule) to screen applicants without mathematics backgrounds for programming jobs.19

By the late 1950s, business applications were growing more rapidly than scientific applications.20 The Bureau of Labor Statistics reported that there were several thousand programmers in the United States in 1958, mostly located in metropolitan areas where the main offices of companies and government agencies were located. A company would typically employ from ten to thirty programmers if it had a large computer system, but only two or three if the computer was of medium size. Companies were using aptitude tests to measure general intelligence and special tests to measure “ability to think logically and do abstract reasoning.” There were also personality tests to seek out individuals with patience, “logical and systematic approach to the solution of problems,” “extreme accuracy,” close attention to detail, and imagination. Organizations seeking scientific programmers tended to hire new college graduates (e.g. 89 percent of the people trained at SDC between 1956 and 1961 were between 22 and 29 years of age), while companies hiring for business applications were mainly taking people with subject knowledge and giving them programming training. The Bureau of Labor Statistics reported:

Men are preferred as programmer trainees in most areas of work. Although many employers recognize the ability of women to do programming, they are reluctant to pay for their training in view of the large proportion of women who stop working when they marry or when they have children. Opportunities for women to be selected as trainees are likely to be better in government agencies than in private industry.21

Training was provided primarily at company expense, mainly through courses offered by the computer manufacturers, but sometimes in courses offered by the companies using large computers once they had several years of experience with automatic computing. For business programmers, the government required all new recruits to have a college degree or equivalent experience. Industry was not as strict as government about educational credentials. When employers did seek candidates with formal training, they most often looked for course work in accounting, business administration, and statistics. Mathematics was losing its privileged place in the training of computer professionals: “Many employers no longer stress a strong background in mathematics if candidates can demonstrate an aptitude for the work. However, programmers of scientific and engineering problems are usually college graduates with a major in one of the sciences or in engineering and some course work in mathematics.”22

The Sputnik Era (1957–1963)

The most important issue shaping science and technology policy in the United States in the late 1950s and early 1960s was the launch of the Soviet artificial satellite Sputnik in 1957. It made American lawmakers rethink their assumptions about US dominance of world science and technology. It also led to a major federal investment in academic science and technology—to improve research funding and infrastructure, to enhance science education from the K-12 through the university level, and to increase the number of practicing scientists and engineers.

After the Sputnik crisis in 1957 and the passage of the National Defense Education Act, NSF became more heavily involved in all areas of science education. It increased the direct support of students with fellowships and traineeships, devoted more effort to improving the science curricula, and expanded teacher-training institutes. Fellowships for computer science students came originally from the mathematics program, amounting to 10 percent of their fellowships awarded in 1965 and 20 percent by 1974.

The NSF sponsored approximately 140 one- and two-month summer institutes for teachers each year as part of the response to Sputnik. The first one on computing was held in 1960, and the number of institutes on computing grew to five or six each year between 1964 and 1968. Curriculum development grants in computing began in 1959, and typically two to six grants were awarded in this area each year during the 1960s.

As early as 1953, NSF had begun to receive research proposals calling for support for computing time as an aid in carrying out some other kind of scientific investigation. As the interest in using the computer as a research tool increased in the scientific community, the NSF formed an Advisory Panel on University Computing Facilities under the direction of Institute for Advanced Study mathematician John von Neumann. The panel reported back to NSF in 1955 that computers should be treated like other high-capital research equipment, such as radio telescopes and particle accelerators, and that it was appropriate for NSF to supply computing machines to universities for scientific research. For the next 3 years, NSF reprogrammed modest funds to support campus computers and computer training programs, but the agency could not keep up with university demand.

The NSF established a regular computer facilities program for colleges and universities in 1959.23 Funding was fairly strong in the early 1960s, presumably because of the political climate that favored national infrastructure investments to assure the place of the United States in the scientific race. The facilities program continued until 1973, and during most of its years it was NSF's most important activity in the computing area. NSF awarded 414 computer facilities grants during this period, about equally divided between first computer acquisitions and equipment improvements for established computer centers. This was an era when campus computing grew rapidly. The number of academic computer facilities in the United States grew from 100 in 1961, to 300 in 1963, to 700 in 1965, to more than 2000 at 1600 institutions in 1969. NSF's facilities program was constantly challenged by this rapidly growing demand in both the number of academic users and the amount of computing power they required, and by new technologies such as time-sharing. Time-sharing computers were much better suited for instructional purposes than batch-operating computers, but they were expensive. NSF placed computers on campus primarily to support research in the various scientific and engineering disciplines, but these facilities also served—both intentionally and unintentionally—to educate future computer workers.

The other major change in federal computing support in this period was the emergence of a major new funder of academic and industrial computer research. The new entrant was the Defense Advanced Research Projects Agency (DARPA).24 In 1958, responding to the Sputnik crisis and to the inter-service rivalry among the military branches, which was seen as creating inefficiency and waste in the development of advanced technology for the nation's defense, President Eisenhower formed the Advanced Research Projects Agency (ARPA), which later added “Defense” to its name. DARPA's first task was the military space program, but it soon became involved in advanced research on materials, reconnaissance satellites, nuclear test verification, and other projects.

DARPA began to support research on computing in 1962, when it formed its Information Processing Techniques Office (IPTO) under the guidance of J. C. R. Licklider, a psychologist from the Massachusetts Institute of Technology (MIT). DARPA poured massive funding into computing because it believed computing was the solution to the military's need for advanced command-and-control systems. The best-known computing project supported by DARPA in its early years was Project MAC at MIT, which developed time-sharing computers and an environment for using them effectively. Within a year, DARPA was providing more funding for extramural computing research than all other federal agencies combined. While DARPA projects made fundamental contributions to research on time-sharing, networking, graphics, and artificial intelligence, DARPA was a mission agency; its support for research was intended to build up the basic technologies that would be needed by the military in coming years. DARPA provided large grants, sustained over many years, to a few preferred academic principal investigators. This support enabled many of the recipient universities, such as MIT, Carnegie-Mellon, and Stanford, to become leaders in computer science. DARPA did little, however, to strengthen the ability of the larger higher educational system to supply large numbers of computer workers.

Establishing Computer Science as an Academic Discipline (1946–1972)

The first university courses on computing were taught at Harvard, Columbia, the University of Pennsylvania, and a few other universities within the first several years after the Second World War. Dozens of universities became interested in computer science and computer engineering research and education in the 1950s. Research and teaching were carried out primarily in computer centers, which first appeared in most research universities in the 1950s. Some universities offered computing within existing departments—most often mathematics or electrical engineering, but occasionally in departments ranging from business schools to agronomy. Of the sixty-eight universities with some activity in digital or analog computing identified in a 1954 survey by the Institute of Radio Engineers Professional Group on Electronic Computers, twenty-nine offered one or more computing courses, and nine offered at least three courses.25 These courses were offered by twenty-two electrical engineering departments, ten mathematics departments, and by other departments at eight universities. The first doctoral degree in computer science was awarded (by the University of Pennsylvania) and the first departments of computer science were formed (at Purdue and Stanford) in 1962 and 1963, respectively. The number of schools forming computer science departments grew rapidly, and half of the computer science departments that exist today in the United States were formed in the decade 1962–1972.

There had been considerable discussion in the professional community since the 1950s about the kinds of instruction to give in computing. There was no well-formulated body of knowledge known as computer science (or computer engineering), and there were different views about the degree to which it was a discipline of mathematics, engineering, programming, or applications. At first, courses were offered primarily at the graduate level and usually involved training in either machine design or numerical analysis. Later the curricula expanded greatly, and in the second half of the 1960s most of the enrollment growth in computer science was at the bachelor's and master's levels. By 1970 there were more students graduating with a bachelor's degree in computer science than with a doctorate.

Professional societies began to address curricular issues in the 1960s as the need for computer education in the nation's colleges and universities became more acute. The Association for Computing Machinery (ACM) was the first to do so when it established a permanent curriculum committee in 1964. NSF supported this effort by subsidizing the committee throughout the 1960s and 1970s. The committee's Curriculum 68 was very influential and widely adopted in US universities and colleges. Between 1967 and 1972, the COSINE committee of the National Academy of Engineering developed a curriculum for computer engineering, and in 1977 both the ACM and the IEEE Computer Society proposed curriculum revisions.26

Computing Becomes a National Concern (1962–1967)

During the decade of the 1960s, computing emerged as a concern of the federal government. It was growing in importance, not only to scientific and technological research and national defense, but also to education and the general welfare of US citizens. This is manifest not only in the growing budgets for computing research, education, and facilities at DARPA and NSF, but also in the emergence of white papers and data collection projects on computing. No fewer than six studies were conducted in the 1960s by NSF or the National Academies on the nation's computing needs and on the past, present, and future role of the federal government in meeting them, whereas none had been conducted previously.27 From the 1960s until well into the 1990s, discussions about national needs for computer personnel were phrased largely in terms of support for academic computing. Because NSF historically has been the agency most closely associated with science education, this is largely a story in the 1960s and 1970s about NSF policy and programs. The National Academies of Science and Engineering also played a role by carrying out studies on computing funded by NSF and others.

The first of the six studies undertaken in the 1960s was initiated by the NRC in 1962. The report was released in 1966 under the title “Digital Computer Needs in Universities and Colleges.” It was known more commonly as the Rosser Report after its chairman, J. Barkley Rosser, a mathematician who moved from Cornell to the University of Wisconsin during the study.28 The report documented the rapid growth of computing, including computer personnel, both nationally and on campus: national investment in computing had grown from $700 million in 1958 to $7 billion in 1964; 35,000 computer staff positions were being created each year in the United States; the number of university computing centers had grown from 40 in 1957 to 400 in 1964; and campus expenditures for computing were doubling every 2 years, at twice the rate of growth of overall campus expenditure for research. The report noted the critical role that universities had played in the development of key computing technologies—having conceived and built many of the first computers in the 1940s and 1950s and having developed new modes of operation (time-sharing) to improve access in the 1960s. The report argued that academic computing was a national concern because the growth of computing had already outstripped the financial means of universities and the shortfall was being met by federal support.
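A rough consistency check on these growth figures (our arithmetic, not the committee's): a rise from $700 million in 1958 to $7 billion in 1964, tenfold over six years, implies a doubling time of

\[ T_{\text{double}} = \frac{6\,\ln 2}{\ln(7000/700)} = \frac{6 \times 0.693}{2.303} \approx 1.8 \ \text{years}, \]

which accords with the report's observation that campus expenditures for computing were doubling every 2 years.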

Federal investment in academic computing had amounted to $65 million in 1964, and future investment was regarded as critical. The federal government would need to pay the majority of costs for academic computer centers for several reasons: (a) The high cost of computers placed them beyond the means of normal university budgeting; (b) their rapid obsolescence made them poor candidates for philanthropic donations or large appropriations by state legislatures; (c) the rate of product innovation and pricing variation made it difficult to work within the multiple-year budgeting lead time associated with public funding; and (d) the expanding support costs, which frequently exceeded the original purchase price, further deterred state legislatures.

The Rosser Committee's principal recommendation was to double the nation's academic computing capacity within 4 years. Some new funding was to be spent on research computing facilities, but an even greater amount was to be spent on educational computing, in order to more than double the number of undergraduate students trained annually to use computers in their professional careers. In 1964, only 10 percent of academic computer time was devoted to educational purposes.29 The report recommended as rapid an increase as possible in the number of students being trained as computer specialists. Use of regional computer centers was promoted both to leverage the effect of federal computing dollars and to share the computer expertise at the research universities with local 2- and 4-year colleges. The committee called for better coordination of auditing and funding among the eight federal agencies then supporting academic computing and with the universities.30

The report had some value as a planning tool for staff at the granting agencies—particularly at NSF—but it was politically ineffective. It sounded too self-serving to the Bureau of the Budget for a committee of academics to argue that they needed hundreds of millions of dollars to do their research and teaching.31 Another NSF staff member recalled: “They botched the job…[Rather than providing] the kind of talk that any Congressman would understand and appreciate…instead, Barkley [Rosser]…had I don't know how many people submitting reports, and eventually [these were] distilled into 200 pages of highly technical language—completely mystifying to any legislator. And I don't think anything ever came of it.”32

In order to respond to the Rosser Report and a National Science Board request for a survey of total federal computing support (caused by some problems in the way that federal auditors required universities to charge for computer time), NSF organized a Working Group on Computer Needs in Universities and Colleges.33 The Working Group, which first met in May 1965, found that little information was available on federal support for computing, and that the Rosser Report projections and recommendations had been based on very few reliable data. NSF itself could not ascertain with certainty how much it was spending on support for computing, mainly because it did not know how much of the funding in its Institutional Grants Program was being spent on computing. Data on funding of academic computing by other federal agencies were even more difficult to obtain.

If NSF could not appraise national support for academic computing by querying the suppliers (the federal agencies), it decided it would have to query the recipients (the universities). Thus it commissioned what became the first of a series of surveys, prepared by John W. Hamblen of the Southern Regional Education Board. The first survey covered the period 1964–1965 and made projections for 1968–1969. Hamblen found that in 1965 universities expended $103 million on computing, while computer manufacturers contributed an additional $41 million in the form of gifts of equipment and allowances on purchases and rentals. Of the $103 million, $43 million (40 percent) came from federal agency grants and contracts. Of the federal funds, $25 million had been earmarked for computer activities: $13 million for buildings and equipment rental or purchase, $7 million for operating costs, $3 million for computer time for research and graduate teaching, $1.5 million for computer science activities, and less than $0.5 million for undergraduate instruction.

Hamblen's study indicated that the main focus, up until this time, had been on providing computing facilities to researchers in the sciences. In the mid-1960s, however, there was a widespread new interest in the role of computers in undergraduate education—mainly in teaching undergraduates about computers, but also in using the computer as a general instructional tool. One expression of this interest occurred during 22 days of congressional hearings about NSF, held during the summer of 1965.

One of the key witnesses, Jerome Wiesner, Dean of Science at MIT and former Science Advisor to President Kennedy, testified that the situation in computing was the “most serious single problem in higher education.”34 He pointed out that most universities needed additional computing equipment, and that even well-positioned institutions such as MIT were involved in “a series of maneuvers involving ad hoc arrangements to keep its computational activities solvent and to provide some time for academic use.” He noted that all schools were searching for a means to pay for instructional use of computers. Under questioning, he estimated that the computing problem “is several times the scale of the [National] Science Foundation's resources to deal with.”35

Others also addressed the educational aspects of computing in their testimony. Richard Sullivan, President of Reed College, reported the growing faculty consensus that not only did science students need to understand computers as a research tool, but that all students should be made familiar with computers “as a problem that they are going to have as citizens.”36 Reacting to the trend to save money by sharing academic computing facilities, such as through regional computing centers, he noted that off-site computing facilities were sometimes satisfactory for research but were not very effective for instructional purposes. In the closing round of questions, during a discussion of NSF's support for “big science,” NSF Director Leland Haworth testified that “one field that I wouldn't classify as big science, because of its usefulness all across the board, but it is big money, is the computer field.”37 Haworth argued there would be a “terrific growth” in the need for computers for the current scientific applications, the social sciences, and education. He tied this growth not only to the advancement of the natural and social sciences, but also to the training of skilled personnel for industry.

The sentiment expressed in this testimony was consonant with President Lyndon Johnson's ardent desire to improve the nation's educational system. In direct response to President Johnson's educational platform, the President's Science Advisory Committee convened a Panel on Computers in Higher Education, chaired by John R. Pierce, an electrical engineer at Bell Telephone Laboratories. This panel emphasized higher and secondary education rather than scientific research in the universities, which had been the main topic of the Rosser Report. The Pierce Report updated the projections in the Rosser Report and addressed some of the same issues, including the need for graduate education in computer science.

The Pierce Panel found that in 1965 less than 5 percent of the nation's college students had access to adequate computing facilities, and virtually all of those who did were located “at a relatively few favored schools.” They contrasted this figure with the 35 percent of all undergraduates who they believed could benefit from exposure to computers. The report recommended that the government cooperate with the universities to supply adequate computing facilities for educational purposes by 1971. The Panel expected the cost to escalate to $400 million per year by 1971 and proposed that the federal government bear most of the cost. The report also called for extensive training of faculty by sponsoring intensive 2–6-week courses and through other means to meet the anticipated rising demand for computer courses. Finally, the report called upon the federal government to expand its support of computer research and education, and for NSF and the Office of Education to “jointly establish a group which is competent to investigate the use of computers in secondary education.”38

Many years later, Milton Rose, who headed the computing program at NSF in its early years, assessed the impact of the Pierce Report:

The thing which led…to OCA [the Office of Computing Activities]…was the growing realization by a number of people that computing was…not just another thing but really was going to have a significant role in all of…science and education, that actually its influence would extend beyond science to the general culture. And while all of those things were said, they would not have had a political influence…The thing I think that transformed the argument very successfully was John Pierce's and John Kemeny's [report], what became known as the Pierce Report…Now that was significant because it not only highlighted things we had been trying to push up through the system, but because…of John Pierce's contacts…at the highest levels of the Science Advisor and therefore into OMB [Office of Management and Budget]…And, as anyone who has ever worked in these systems know[s]…trying to push something from the bottom is [like] trying to push spaghetti upward that has been cooked for five hours. But if you can grab it from the top then you can really do something. And so the significance of the Pierce Report and Kemeny's influence in that was enormously important. Well, in fact, that led…to a reevaluation at NSF by [Director] Lee Haworth, and in part also from pressure I think from OMB.39

President Johnson saw the computer as a tool that could be used to advance the aims of his Great Society program, giving the rich and poor alike a chance to get ahead through education. The influence of the Pierce Report was reflected in his 1967 message to Congress on Health and Education, where he directed NSF to “establish an experimental program for developing the potential of computers in education.”40 The Bureau of the Budget, planning for this activity to begin in fiscal year (FY) 1969, proposed the formation of an interagency advisory group led by NSF to coordinate support for computer education. The most visible outcome of this activity was the formation on July 1, 1967 of the Office of Computing Activities (OCA) at NSF to pursue programs in computer facilities and education.41

The Pierce Committee found that there was a growing disparity in the quality of computing facilities, and therefore in the quality of education, between the “have” and “have-not” institutions of higher learning. Bringing to all American undergraduates the kind of facilities available to students at the most computer-privileged institutions, such as Dartmouth or California-Berkeley, would require a major infusion of money—projected to increase to $400 million by 1971–1972. The Pierce Committee recommended that the federal government share in this cost.

The OCA sponsored a combination of old and new programs. It continued the program to provide computer facilities to universities that had been started in the late 1950s. There was a new priority on the development of regional computing centers that could connect universities, colleges, junior colleges, and high schools. The regional centers were to include not only computing machinery, but also faculty training and curriculum development programs. OCA also gave a new emphasis to the development of computer science departments. The development of computer science research was closely linked (at least rhetorically) to the educational program during the first several years of OCA. OCA estimated in 1967 a 50 percent shortage of faculty to teach computer science.42 Thus the program, as originally established, supported individual research projects, research conferences, and computer science curriculum development.

A Hostile Climate for Science (1968–1976)

The late 1960s and first half of the 1970s were not a favorable time for federal support of computing. In order to pay for the Vietnam War, the administration began to cut spending for domestic programs during the summer of 1968. In FY 1969, NSF made its reductions by cutting back grants on capital items such as buildings and equipment, in order to preserve funding levels for individual research grants. This action jeopardized the computer facilities program, which for a decade had been the hallmark of NSF's computer program.

The financial pressure on university computing centers was worse than this cut in the Institutional Computing Services program suggests. Over the previous several years, computer manufacturers had reduced their educational gifts and discounts. By 1969 industry was providing only 15 percent of the support for academic computing, whereas only 4 years earlier industry had provided almost 30 percent of the support. At the same time, the universities were experiencing other across-the-board cuts in funding from government agencies. Faced with smaller research grants from federal agencies, faculty researchers reduced the budget lines in their grant proposals to pay for computing services, figuring that the operating costs of a computer center are fixed costs that the university would have to absorb in some other way. These various factors compounded the financial pressures on the computer centers, which were not only unable to purchase new equipment, but were also having trouble meeting their operating costs.

The NSF's computer advisory panel called for a new federal policy on academic computing. The panel noted that ARPA was spending $20 million annually on a small number of research projects at universities, but usually was not providing support for general campus facilities. NIH was spending $10.5 million annually on computing facilities for medical schools, but was relying on university computing centers to serve its sponsored academic biomedical research. The grants of these and other federal agencies increased the load on academic computing centers, while NSF alone among the federal agencies was providing general support for these facilities. The advisory committee urged the National Science Board to issue a report to Congress to “illuminate the profound implications of computer developments and clarify the major scientific, technological and educational problems,” as well as to propose a coordinated federal policy for supplying universities with computers for research and education.43 It encouraged NSF to take the lead in setting federal policy for academic computing because the Bureau of the Budget had too limited a view of academic computer needs and the White House Office of Science and Technology had not developed a plan. For whatever reason, the National Science Board never issued such a report.

The political climate in Washington in the late 1960s was drifting in a direction that was less sympathetic to basic science. Support for academic science slowed throughout all the federal agencies. After 1966 NSF's allocations for the Mathematical and Physical Science Division increased at a rate slower than inflation. Congress was becoming increasingly interested in supporting the application of science rather than basic research, and it looked increasingly to NSF to support scientific research aimed at solving national problems such as environmental pollution. Congress was also dissatisfied with NSF's practice of defining its programs and projects according to the guidance and initiative of the scientific community; Congress felt itself held captive by the very academics it was supporting. There were various calls from Congress for greater political control over the Foundation's agenda. It is in this context that one can understand the National Science Foundation Act of 1968 passed by Congress (known commonly as the Daddario Bill, after Congressman Emilio Daddario of Connecticut). It expanded the Foundation's responsibilities to include applied research and the social sciences. Congress believed these subjects had relevance to contemporary national social problems.

Computing fared better than most of the sciences in the support it received from NSF after passage of the Daddario Bill—probably because computing was regarded as having practical applications. The Daddario Bill explicitly directed the Foundation to foster and support the development of computer science and technology, primarily for research and teaching in the sciences. This was the first piece of legislation explicitly authorizing NSF to support the computing field. Nevertheless, the Daddario Bill sliced the NSF funding into more pieces, leaving less for computing.

The Daddario Bill instituted annual authorization hearings before Congress to determine NSF's budget. Milton Rose gave testimony at the first of these hearings (for the FY 1970 budget). Training appeared as a priority for the first time because of the increasing demand for academic computing courses being felt across the nation, and because of projections of rapidly escalating national needs for trained computer personnel of all types.44 Rose called for “grants to enable department units to develop a concerted and integrated revision of entire curricula for major science areas, including specialized computers and programming support…particularly in the development of computer science programs.” As for facilities support, he pushed the economies of scale that could be achieved through regional computing centers. His briefing notes included a call for establishing ten new regional centers each year. A program with this objective was begun several years later.

The election of Richard Nixon as president in 1968 affected the administration of NSF and its computing programs more than any presidency until that of Ronald Reagan. The Nixon administration appointed William McElroy as NSF director. He pursued the administration's interest in using the Foundation as a tool to apply science to problems of national importance. To that end, the president nominated non-scientists and industrial scientists in record numbers to the National Science Board. The Daddario Bill had increased the policymaking role of the National Science Board, giving greater autonomy to the director and his staff in the operation of the Foundation. Under the Daddario Bill, the president not only appointed the director, but also a deputy director and four assistant directors.

During the Nixon presidency, the OMB wielded unprecedented power over federal programs and agencies. By manipulating the purse strings of various agencies, including NSF, OMB exacted programmatic changes. The assault on the computer facilities program was part of an all-embracing attack by OMB on institutional grants awarded by NSF. OMB argued that by replacing general-purpose institutional grants, which subsidized academic resources including computers, with individual research grants, which paid directly for only the resources actually used, grantees would be more directly accountable for controlling costs. This may have been true, but it also served to dismantle the substantial federal programs to aid education built up during the Johnson administration. Whatever the motive, this change imperiled academic computing centers, which relied heavily on these institutional grants.

In the early 1970s, the NSF advisory committee assigned highest priority to expanding computer science research and training. The committee reasoned that, despite growing industrial dominance in the computing field, there was still a place for academic research on both hardware and software, and that a strong academic research center would provide a strong environment for training. The OCA had an uphill battle to support the training of computer scientists because computing's needs were out of synch with those of the other sciences. By 1970 there was perceived to be a glut of mathematicians and physical scientists, and in response OMB slashed Foundation funding for fellowships and traineeships. However, computer science had a critical shortage of trained personnel. A National Academy of Sciences meeting on Computer Science Education in 1969 projected the need for 1000 new PhDs per year by 1975, at a time in 1970 when the national output was under 100.45 In response, the OCA advisory board recommended expanded support for graduate education in computer science.46

Congressional authorizations did not match this perceived need to grow. The FY 1971 budget continued the pattern of decline for the third year in a row. The total budget for OCA in 1971 was $15 million, approximately $7 million less than the allocation for FY 1968—all while the computing field continued its rapid expansion. There was no new support for facilities; the only facilities grants were the residuals from earlier, multiple-year awards. The computer education budget held about steady at $6 million, while there was a modest increase in the research budget to $4.5 million and a significant rise in support for applications.

John Pasta, who was appointed director of OCA in 1970, set a number of changes in motion. He arrived at the Foundation at a time when the staff and the OCA advisory committee were fighting a rearguard campaign to retain the facilities program, which had been the Foundation's most successful computer venture since the late 1950s. If the Foundation could not afford a full-scale facilities program to place a computer in every university, the advisory committee believed the Foundation should at least aggressively pursue a program to put a computer in every regional consortium of colleges and universities. Pasta decided to let the facilities program go: “The Bureau of the Budget has attacked this program and my personal opinion is that they are right. As a compromise with the Committee position we are proposing a reoriented program which will hopefully answer the objections of the Bureau (and my own private misgivings).”47 As a replacement for the specialized social problem-solving centers advocated by the advisory committee, Pasta advanced his own plan for cross-disciplinary computer centers.48 But the economic and political environment of the early 1970s laid waste to Pasta's 5-year plan.

(p.71) The only real funding increase came in FY 1972, and this was a one-time exception created by the Mansfield Amendment. This amendment to the Military Procurement Authorization Act of 1970, introduced by Senator Mike Mansfield of Montana, narrowed the scope of research pursued by mission agencies to only those scientific research projects that could be justified as directly relevant to their missions. Although the amendment applied specifically to the military, it reflected the mood of the Congress; and all the mission agencies, not only the military ones, narrowed their support of basic research.

The amendment created considerable havoc because NASA, AEC, ARPA, and the three military service research agencies (ONR, Air Force Office of Scientific Research—AFOSR, Army Research Office—ARO) were all providing sizeable support to projects that did not meet the Mansfield criterion of mission relevance. On short notice, each of these agencies divested itself of a considerable number of projects. The Foundation was the recipient of many of their "dropout" projects. Congress provided an additional $41 million to the Foundation research budget in FY 1972 for this purpose. This amount included $3.5 million for computer activities, which the staff used toward supporting some of the thirty-seven computing research projects, costing a total of $5.5 million, dropped by the AEC, NASA, and the DOD. Thus, of the $4 million increase in the OCA budget from 1971 to 1972, $3.5 million is attributable to a one-time adjustment caused by the Mansfield Amendment.49 Because these projects were scattered across OCA's programs, there was little change in the complexion of NSF's computing grant portfolio. If anything, these one-time transfers disrupted Pasta's plans.

Foundation Director McElroy needed congressional approval to reprogram these funds to OCA, and his request to do so occasioned an inquiry from the House Committee on Science and Astronautics about total federal support for computing research. The House findings provide a context for understanding NSF support for computing through 1972. During 1968–1972, NSF support for physics, chemistry, mathematics, astronomy, and engineering research only sometimes increased as fast as inflation. In the same period, support for computing research tripled. Whereas the budget for computer research had been one-eighth of that for mathematics research in FY 1968, by FY 1972 the ratio had grown to two-fifths. As would probably be the case with any rapidly emerging discipline growing from a small base, Congress saw these increases as generous, while the practitioners regarded them as miserly.

In the 1950s, AEC, ONR, and AFOSR had supplied almost all of the funding for computing research. During the 1960s, support from these three agencies was essentially flat. NSF funding grew from practically nothing in 1960 to the point in 1971 where it almost equaled the total support from AEC, ONR, and AFOSR. The most significant change in the 1960s, however, was ARPA support, which grew from zero in FY 1961, to slightly more than the total support from all other federal agencies combined in FY 1963, to approximately triple the support of NSF in FY 1971. Historical figures by year were not available for NIH or NASA. However, by interviewing program officers at all the federal agencies sponsoring computer (p.72) research, the OCA staff was able to provide a statistical portrait of federal computer research support by subject area for 1 year, FY 1971, including support from NIH and NASA. The total government expenditure on computing research and development (R&D) in 1971 was $52 million. ARPA spent $26.5 million and NASA $13.1 million; each of these agencies spent half its total on system development. NSF led the remaining six sponsoring agencies with a total of $4.3 million. NSF's share continued to grow: by 1976, NSF and DARPA were each providing approximately 40 percent of the total federal investment in basic computer research, with the remainder coming from the combined efforts of NASA, NIH, the Energy Research and Development Administration (ERDA), AFOSR, ONR, the Naval Research Laboratory (NRL), and ARO.

Under Pasta, NSF tried to build up academic computer science programs, and hence their abilities to train personnel, by building up computing research. Congress, the General Accounting Office, the OMB, and even the National Science Board repeatedly sought justifications for spending public money on computer research. Questions were lodged about the arcane nature, practical relevance, and social impact of computing research.50 Also questioned was why a public investment was needed when industry was already carrying out an extensive research program. It was known that the computer industry expended a greater percentage (7 percent) of its gross revenue on R&D than any other major industry, including aircraft, chemicals, and transportation. The closest follower was the aerospace industry—at 4.5 percent.

In order to address these concerns, NSF surveyed industrial computer research in 1975. The findings supported the continuation of a computer research program at the Foundation: much of the industry's large R&D expenditure was devoted to software development, an essential business cost unique to the computer industry. When software was factored out, computer industry R&D expenditure was comparable to that in other industries. The survey found only eight computer companies engaged in any basic research, and some of that was done in conjunction with universities.51 The nation's industrial personnel assigned to basic computing research numbered only 150 scientists (and an equal number of support staff). Without exception, the industrial researchers who were surveyed stressed the importance of academic research. Special note was taken of basic research that had not been of interest to industry at one time, but which became so after being developed by academic researchers.52

The high rate of inflation exacerbated the problem. Although NSF budgets were growing in these years, they could not keep pace with double-digit inflation—and the budget declined in real dollars by approximately 15 percent between 1968 and 1976.53 This caused minor disruptions for fields such as mathematics and physics, where the number of practitioners was shrinking slightly. It had much more serious consequences for computer science, which was experiencing explosive growth in undergraduate enrollments and slow growth in the number of faculty. For example, in mathematics the number of bachelor's degrees granted annually fell from 21,000 in 1973 to 11,000 in 1979, and doctoral output declined from 1000 in 1973 to 724 in 1979. By comparison, computer science bachelor's degrees rose from 4700 in 1973 to 11,000 in 1979, while doctoral output rose (p.73) slightly from 198 in 1973 to 239 in 1979.54 In the late 1970s, the level of support for computer time, equipment, and student assistants, as well as overall award amounts, was reduced in individual research grants in order to increase the number of researchers receiving awards. This was in contrast to the practice of other NSF offices, which were awarding larger grants to offset the effects of inflation.
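The real-dollar arithmetic behind the 15 percent figure can be sketched with a back-of-envelope calculation; the annual rates used here are illustrative assumptions chosen to match the cited result, not figures from the NSF record. If a budget grows nominally at rate $g$ per year while prices rise at rate $i$ per year over $n$ years, its real value changes by the factor $((1+g)/(1+i))^{n}$. With, say, $g \approx 4$ percent nominal growth, $i \approx 6$ percent average inflation, and $n = 8$ years:

\[
\left(\frac{1+g}{1+i}\right)^{n} = \left(\frac{1.04}{1.06}\right)^{8} \approx 0.86,
\]

that is, a real decline of roughly 14–15 percent, in line with the figure cited above.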

Meeting National Needs for Computer Scientists (1967–1977)

The previous section described some of the national political context as it shaped computing at NSF. This section focuses more specifically on computer labor issues as NSF addressed them during the same era. NSF carried out a number of programs in computer science education in the name of national "manpower" needs: support to professional societies to design model curricula for computer science and computer engineering, and numerous efforts to help individual computer science departments and programs establish uniform and effective curricula and, more generally, build up institutional size and strength. Although national needs to build up the professional community had not been entirely neglected by either the computing community or NSF in the 1950s and early 1960s, this did not become a priority until the mid-1960s, when the tremendous growth in the computing field began to create demands for computer professionals in all sectors of American society.

Alan Perlis, a computer scientist at Carnegie-Mellon and a member of the OCA advisory committee, identified eleven strong graduate computer science departments in 1967 and projected that eighty-one graduate programs of varying quality would exist by 1968.55 In 1967, 200 computer science faculty could be counted. Altogether they produced forty new PhDs in 1967—of which 80 percent accepted faculty positions. Perlis estimated, however, that the nation's colleges and universities would need 400 computer science faculty in 1968 just to keep up with the teaching demand—an impossible doubling of faculty in 1 year. These estimates were based upon Perlis's assumptions that there was a national need to teach one course in computer science to 223,000 undergraduates and 61,000 graduate students, as well as to provide further course work for 5560 undergraduate computer science majors and 4328 computer science graduate students.

In 1971 OMB noted that there had been a significant surplus of American scientists in the late 1960s, compared with the availability of scientific research funds, and that consequently scientists with graduate degrees were unemployed, underemployed, or leaving their fields of training altogether.56 To rectify this situation, the Nixon administration pressured NSF into terminating its student traineeship program in 1971. This may have been appropriate for physics and mathematics, but it only exacerbated an already acute shortage of computer scientists with graduate training.

At the end of the 1960s, training in computer science emerged as a priority for the OCA because of the unremitting increase in demand for computer science course offerings. The shortage of computer personnel at all levels—from machine (p.74) operators, to programmers, to computer scientists—created a demand that the colleges and universities could not meet. In 1964–1965, national enrollments in data processing and computer science programs had totaled 4300 undergraduates and 1300 graduate students. At the time, these enrollment figures had been projected to quadruple by 1969. But the growth was more rapid than anticipated, and quadrupling occurred by 1967.57

In October 1969, NSF Director William McElroy asked the OCA advisory committee to identify new areas for emphasis in planning for FY 1972.58 The committee gave first priority to supporting computer science departments with research funds, and second priority to specialized centers for both disciplinary and interdisciplinary research. Education was also very much on the minds of the advisory committee. It explained the pressing demand for academic training and new research capabilities, contrasting the "adequate…or very close to it" support for training mathematicians and scientists with the "completely inadequate" support for training new computer scientists. The committee called for NSF to provide special allocations for student fellowships and traineeships in computer science. When budgets were cut, however, these fellowships did not materialize.

While there was a general belief in the 1970s that there were grave shortages of computer scientists at all levels, the top-ranked graduate programs, such as Cornell, Purdue, and Stanford, were reporting acceptance rates of less than 25 percent (which the Foundation attributed to shortages of graduate fellowship support and faculty), compared with acceptance rates of about 60 percent in other graduate departments at these same schools. Although many of the computer science departments had been founded only in the preceding 5 years, they had already grown as large as the largest science and engineering departments. National enrollment at the graduate level increased fivefold between 1964 and 1967, while undergraduate enrollment also quintupled, to more than 21,000. More than half of all students taking a bachelor's degree in mathematics, but not going on to graduate school, had been drawn to employment in the computing field.

The problem was not only a shortage of faculty and lack of support for graduate students. This new field was so unlike the traditional disciplines that it was difficult for university administrations and computer science departmental administrations alike to assess its intellectual merits. As soon as OCA was formed, it began making grants to help universities improve graduate computer science programs and establish undergraduate programs.59 Following 2 trial years of support for departments, the OCA advisory committee recommended a continuing program to expand computer science education. Because of the unrelenting demand for undergraduate course offerings, the most severe problem was the shortage of trained faculty. Thus the committee recommended that OCA begin by supporting the expansion of graduate programs.60

By 1975 there were sixty-two departments in the United States granting doctoral degrees in computer science and an additional fifty-three departments (of mathematics, engineering, information science, statistics, etc.) granting degrees with a computer science emphasis.61 These departments employed 1500 computer (p.75) science faculty and 600 research faculty and research associates. There were also 500–600 computer scientists in other science and engineering programs at these institutions. Approximately 2000 graduate students were enrolled in computer science PhD programs across the nation. Approximately 200 computer science PhDs were produced in 1974. Of these, about 50 percent accepted academic positions, 40 percent took industrial positions, and the remaining 10 percent went to government and other positions. Very few of those entering industry were involved in basic research. Most were involved in design, development, or supervision instead.

A Crisis in Computing (1977–1990)

By 1977, there was a widely held belief in academic and industrial circles that a crisis was at hand in computer science. One manifestation was a personnel problem. Expansion of computer science programs in the universities had reached its peak around 1976 and had begun to fall off. The national production of doctorates in computer science was in stasis. At many schools computer science teaching loads were extremely heavy, and it was not uncommon for computer science departments to have five or more faculty positions they could not fill.62 There was also an insufficient number of computer scientists at the bachelor's and master's levels to fill available government and industry positions. The crisis was also manifest in academic research, which had become dominated by theoretical studies. No longer were academic researchers contributing as significantly to systems design or other experimental research areas as they had from the late 1940s through the early 1970s.

The increase in the national production of computer science doctorates that had been widely anticipated by the professional leadership throughout the 1970s did not materialize.63 By the mid-1970s the expansion of computer science in the universities had peaked. The number of doctorates granted in 1975 totaled 213, only slightly more than the previous year. Doctoral production peaked in 1976 at 244, then declined to 216 in 1977 and 196 in 1978. At the same time, the number of bachelor's degrees granted in computer science continued to increase (from 4757 in 1974 to 7201 in 1978), but at a linear rather than the exponential rate seen before 1975. Master's degrees were more erratic in number than bachelor's or doctoral degrees; the annual number awarded in the mid-1970s was generally in the 3000s.

The rapid increase in computer science enrollments and the shortage of faculty made university teaching less attractive. An NSF survey of people who had left the university for industry found that heavy teaching loads and job insecurity were the major complaints, far above salary considerations. While the entire engineering field was experiencing a loss of faculty to industry in the late 1970s, computer science losses were twice the percentage of any other field. Fewer than half of the new doctorates were entering teaching careers—and this rate was thought to be insufficient to train the growing number of students. For those who chose academic careers in computer science, attaining tenure was complicated by the fact that few university administrators understood the characteristics of hardware and (p.76) software research, which had high cost and time requirements but a low yield of scientific publications.

The focus in the remainder of this section is on the response of the NSF. It should be emphasized that NSF was not the only organization to respond to the crisis. The problems were studied by the computing research community at the 1978 and 1980 Snowbird Conferences.64 As a result, industry as well as NSF began to provide graduate student support. Universities added significant numbers of faculty positions. Companies agreed informally to restrain themselves from hiring away faculty and to provide incentives for graduate students to complete their degree programs.

In order to formulate its response to the crisis, NSF convened a special workshop in November 1978. The published results are known as the Feldman Report, after the principal editor, computer scientist Jerome A. Feldman of the University of Rochester.65 The workshop organizers asked all PhD-granting computer science departments for comments and suggestions on the problems they faced with experimental computer science. The replies indicated that many of the best faculty, staff, and graduate students were being recruited away from the universities by industry; and while larger industrial salaries were a factor, the greater availability of good equipment in the industrial laboratories was more significant.

The Feldman Report recommended building academic strength in experimental research and the infrastructure to sustain it. The proposal was to build, over a 5-year period, twenty-five “centers of excellence” with multimillion dollar grants that supported coherent, multi-investigator research projects in experimental system design research and that built up a research group with “critical mass.” The report proposed taking funds from other programs, if necessary, to build up this “critical mass.” This recommendation ran counter to the NSF's policy of limiting the size of computer research grants in order to spread the money to as many researchers as possible. The report also called for new government policies that would encourage industry–university interaction. These included new tax laws, patent procedures, and antitrust legislation that would not discourage industry from contributing equipment to the universities.

Not everyone was happy with the recommendations of the Feldman Report. In particular, there was strong disagreement from the Executive Committee of the ACM. Its criticism was especially notable because the ACM's flagship journal, Communications, had been the publication venue for the report. The ACM Executive Committee agreed that there was a crisis, that the solution lay in invigorating the PhD programs, and that government policy changes to encourage industrial investment in the universities were desirable.66 However, the ACM leadership preferred the correctives recommended by the Foundation's Advisory Panel for Computer Science at its May 1979 meeting: Traineeships, expanded equipment grants, and a research computer network.

The crux of the disagreement between the ACM Executive Committee and the Feldman Report was whether to concentrate or distribute resources. The Feldman Report recommended fellowships, which went to the institutions at which the (p.77) "fellows" enrolled and hence tended, by natural selection, to be concentrated in the elite institutions. ACM favored a traineeship program, in which the support was given directly to the institutions and so could be distributed more evenly across a large number of institutions. Even more in dispute was the report's recommendation that a considerable concentration of capital at a single institution was necessary to support adequate facilities and sufficient numbers of students. ACM believed the Foundation's Research Equipment Program, initiated in 1977, had already begun to mitigate the problem by providing modern equipment to computer science departments.67 ACM proposed an extension of the equipment grants, but it did not support a competition for a few large grants:

we have serious doubts that huge grants to the so-called “centers of excellence” would achieve the desired objective. The Feldman Report envisions up to twenty-five centers being started over a five year period. There are now, however, sixty-two universities in the United States that grant Ph.D.'s in computer science. We believe that the embittered feelings of, and the drain-off of resources from the institutions not favored by this program would severely divide the community just when unity and common programs are most important. We believe that the community is best helped by providing that all available research funds, whether from existing sources or from new ones, be available equally to all qualified computer science research groups.68

The ACM proposed networking as an alternative to free-standing centers of concentrated equipment: "Modern minicomputer technology and common carrier data networks can be combined to permit research groups to connect at modest costs that are well within the reach of an equipment grant program."69

NSF Director Richard Atkinson wrote in July 1979 to twenty-five industrial computer research directors, enclosing a copy of the Feldman Report and asking them to assess possible NSF actions. The research directors favored supporting experimental research facilities and large research projects, regarded additional fellowships as having some value, but showed little enthusiasm for an increase in the number of small research grants of the type the Foundation was currently awarding.

When it came time to act on the Feldman Report in the fall of 1979, the NSF staff had to make hard choices about how to implement its recommendations within the constraints of the budget. NSF decided to fund a scaled-down version, called the Coordinated Experimental Research (CER) Program, on a total budget of approximately $1 million cobbled together from existing resources. It recognized that this amount would not buy much equipment and that it would mean a loss of support, in constant dollars, for the individual-researcher, small-grant program.

The original idea was to fund large research projects like MIT had…Project MAC and so forth—at places other than MIT, Stanford, and CMU. MIT, Stanford, and CMU, it has always been said, volunteered not to compete in this program. That's the legend. I believe it, but I think they were volunteered by…the computer science advisory panel. But anyway, the idea wasn't just infrastructure, just buying equipment and paying for some support staff… these were to be Project MAC style—some of them. Some of them were going to be straight infrastructure-centered, but some were going to be centered around a particularly large research project or another operating systems research project.70

(p.78) As NSF funding became tighter, much of the support requested in CER proposals for anything other than infrastructure was scaled back. The CER program continued throughout the 1980s. Four or five new awards were granted each year; however, the level of support to individual centers was not as high as the Feldman Report recommended. By 1989 twenty-nine schools had received grants, and eleven of these were awarded a second 5-year grant. The well-coordinated research programs called for when the CER program was established (and present in some of the early CER proposals) were largely absent in later awards. The program shifted away from an emphasis on coordinated research and focused instead on the provision of equipment for experimental research. The total collection of proposed research projects had to warrant a major grant, but no single project had to be large; nor did the many experimental projects have to be closely interrelated. The notion of a "critical mass" became a less significant factor in making the awards.

In 1986, the program title of CER was changed to Institutional Infrastructure and the scope was broadened to cover all facilities needs in computer science and computer engineering. In 1988 the Minority Institutions Program and the Small Scale Program were added. The Minority Institutions Program supported the planning, establishment, enhancement, and operation of experimental computing facilities to support research and education in colleges and universities with large minority enrollments. The grants in the program ranged from 1-year, $50,000 grants to improve proposal writing and planning, up to 5-year, $1.5 million grants to fund faculty positions, curriculum development, equipment, maintenance, support staff, expert consultants, network membership fees, and other research and educational costs. These grants have had a salutary impact on such institutions as North Carolina A&T and the University of Puerto Rico. The Small Scale Program was intended to provide facilities for small research groups within a computer science department.

Most programs beyond [the top] 20 or 25 [schools]…don't have enough breadth…to mount a general infrastructure proposal…they may have a strong group of a half dozen people. They may have 25 faculty, but they don't have 25 good faculty. They would come up against Wisconsin's 33rd renewal and they would get blown out of the water because Wisconsin not only was good when they started, but, two CERs in a row, perhaps three, have been successfully started. It has done them a world of good. I mean, they are a first-class department, and [the second tier institution] just doesn't hold up.71

Thus, a smaller infrastructure grant would be given to support the small core of strength within the second-tier institution.

The main criterion that has been used within the Foundation to gauge the success of the CER program has been the annual production of PhDs in computer science. On this measure, the program has been a success. Beginning in 1980, the number of PhDs granted in computer science began to increase at approximately 10 percent per year, after a decade in which production remained in stasis at an annual rate slightly above 200 graduates. Departments with CER funds were expanding their PhD production twice as fast as departments without these funds. The first nine departments to receive CER grants reported a two-thirds increase in (p.79) the number of students passing qualifying examinations from 1980 to 1985. By 1990, the number of PhDs produced each year was around 600.72
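The growth claim can be checked with simple compounding; the starting value below is an illustrative reading of "slightly above 200," not an exact figure from the record. Ten years of roughly 10 percent annual growth from about 220 PhDs gives

\[
N_{1990} \approx 220 \times (1.10)^{10} \approx 220 \times 2.59 \approx 570,
\]

which is consistent with the roughly 600 PhDs per year reported by 1990.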

There were other indications of the success of the CER program.73 CER grants provided departments with leverage to obtain more positions, more space, and more support of other kinds. The newly found strength made the departments more competitive for DARPA and ONR funding. Many CER schools developed new interactions with industry, which led to gifts or large discounts on equipment as well as joint research. Faculty teaching loads decreased, and faculty had more time for research and individual interaction with graduate students. The CER schools attracted more and better graduate students.

Despite the efforts of NSF, industry, the universities, and the professional societies, the attempts to train an adequate supply of PhD computer scientists were not entirely successful. Doctoral production increased steadily during the 1980s, but the goal of 1000 doctorates per year established in 1967 was not met until 1995. Computer science struggled to achieve adequate financial support for graduate students, and the faculty workload remained much higher than in related disciplines.74

The CER program was tailored to make the university environment attractive for research. The justification was based on the fact that academic posts are specialized and involve a long preparation process of education and training. Only by keeping an increasing number of the highly trained people in the academic system would there be any hope of increasing the teaching labor pool and of eventually expanding the number of advanced students to meet the industrial demand. Even if this strategy were to be successful, it was acknowledged that it might take a generation to reach any sort of equilibrium even in the face of a serious economic decline that would reduce the demand.75

Scientific Research Tools and National Competitiveness (1980–1996)

Computing had originally been of interest to the federal government as a tool of national defense. While defense and national infrastructure issues are still an important part of federal computing policy today, the range of policy issues touching on IT has grown over time. Since the 1950s, support for computers has been seen as a way to strengthen the nation's research base. In the 1960s, the Johnson administration regarded the computer as a tool of social welfare—a way to bring education to poor, rural, and minority communities. Federal programs to use the computer for social purposes proved to have limited success in the 1970s, primarily because of shortcomings in the technology of the time. Indeed, one might argue that the computer has driven a further wedge between the "haves" and the "have-nots." However, the spirit of affirmative action and equal opportunity programs, so strongly supported by the Johnson administration, did continue into the late 1970s and early 1980s. As a result, a climate was created that gave women an opportunity to participate in the computing field at all levels in strong numbers for the first time (see the next section). In the 1980s, during the administration of (p.80) President Ronald Reagan, the computer became a factor in another kind of policy concern. The computer was seen as spawning an industry critically important to the nation's economic well-being, as well as being a tool that drove other industries both directly and through the results of research carried out using the computer as a tool.

Both scientific and economic issues drove federal computer policy in the 1980s and early 1990s. In 1980, the NSF Physics Advisory Committee appointed a subcommittee to investigate the availability of advanced computing facilities for physics research. Physicists obtained most of their computer time either because they were conducting directed research for an organization with these kinds of facilities or because they were able to make special arrangements to beg or borrow computer time from government or industrial organizations that owned supercomputers. The advisors worried that these ad hoc arrangements for gaining access to supercomputers were drying up, just as the computation required in physical studies was exploding in scale. They noted that many American physicists were forced to travel to Europe to do their research calculations because they were unable to obtain the needed facilities in the United States.76 They recommended that NSF build a network with a supercomputer at its center and connect it to other centers that owned medium-scale computers.77

To obtain a broad perspective on what might be needed and to attract wide support, a meeting was convened in June 1982 by NSF and the DoD, with additional support from NASA and the DoE. It was chaired by Peter Lax, a mathematician from New York University.78 The Foundation issued the “Report of the Panel on Large Scale Computing in Science and Engineering” (commonly known as the Lax Report) in December 1982.79 It recommended increased access for scientists to the most advanced supercomputers, new research on computational methods and software for supercomputer use, training of personnel to use supercomputers, and development of more advanced supercomputers that would meet the needs of scientists. It proposed that government agencies with an interest in supercomputing should form an interagency coordinating committee to act on these recommendations.

In April 1983, an NSF Working Group on Computers for Research was formed to act on the Lax Report recommendations and more generally to assess the situation of computer facilities for scientific research and graduate education. The Bardon–Curtis Report was the result. Published in 1983, it called for a widely cooperative effort, across government agencies and into the private sector, to provide better local computing facilities, national supercomputer centers, and networks to link research laboratories with one another and with the supercomputer centers.80 The cost was expected to be in the hundreds of millions of dollars.

The political environment was favorable to this kind of investment in 1983 because the Reagan administration was beginning to see high technology as important to national strength and international competitiveness. As soon as some security restrictions on use by Eastern bloc scientists were worked out, supercomputer centers were established in 1985 and 1986 at five locations: Princeton, (p.81) Pittsburgh, Cornell, Illinois, and San Diego.81 During the first 3 years of operation, the five centers supported 2000 research projects at a total cost of $92.2 million. Physicists consumed nearly 35 percent of the total resources. The supercomputer centers program received strong support from Congress and NSF. The start-up of these centers, when aggregated with the computer research funding, amounted to an increase of 129 percent for computer science at NSF. Funding continued at high levels in subsequent years.

Beginning in 1987, the federal supercomputing program began to face competition from individual states, which saw supercomputer acquisitions as part of the infrastructure needed for a state and its educational institutions to be competitive—to produce research and train workers. Ohio State University, for example, established a center with state funds and attracted Allison Brown and Kenneth Wilson, the wife-and-husband team who had previously led the Cornell supercomputer center, to manage its operation.

The Office of Advanced Scientific Computing, which was responsible for the initiation of the supercomputing program at NSF, was also responsible for the establishment of NSFNET. Initiated in 1984, NSFNET was intended originally to provide access to the national supercomputer facilities. However, its scope was soon expanded to connect the nation's scientists with one another. This amounted to a realization of the "network for science" that NSF staff had envisioned since the mid-1970s, but had been prevented from building by the OMB. OMB had been concerned about NSF's ability to manage an ongoing network, and some political leaders did not want NSF competing with private industry in this area.

At first, existing networks were used to interconnect the supercomputer centers and to provide access to remote users.82 A new physical network was then built, including a "backbone"—a high-speed data communications link—connecting the five supercomputer centers, the National Center for Atmospheric Research in Boulder, Colorado, and eventually other centers. The total investment approached $50 million.83 Federal interagency cooperation, which was unusually close and effective, was enabled by the Federal Research Internet Coordinating Committee (FRICC), consisting of representatives of NSF, DARPA, DOE, NASA, and the Department of Health and Human Services. By 1990, the Internet connected not only networks—including ARPANET—in the United States, but also networks in Canada, Mexico, Japan, Europe, New Zealand, and Australia. It is estimated that well over 100,000 computers were accessible through these interconnected networks.84

As the development of the network became more visible, interest increased in Congress and the White House. In November 1987, the executive branch's Office of Science and Technology Policy (OSTP) proposed a 5-year R&D strategy on high-performance computing and networking. The proposal would build on NSFNET to extend the Internet into a National Research and Education Network (NREN). The program assumed that the government would continue the approximately $500 million in combined annual support that all federal agencies were investing in computer science R&D.

(p.82) This interest resulted in the High Performance Computing and Communications initiative, which agencies began discussing in 1989 and which became legislation in 1991. It resulted in a budget that grew eventually to exceed $1 billion, and involved at least twelve agencies.85 It included support for both infrastructure and research activities in computers and communications. The initiative greatly strengthened understanding of networking, parallel computing, and advanced computer technologies; and it provided the national infrastructure that underlies the leading role of the United States in the Internet today. Albert Gore played a prominent role in promoting this legislation while in Congress. While the emphasis was originally on linking scientific researchers and giving them adequate tools, the network reached out to include many other groups in the educational and library communities. The authorizing legislation ended in 1996, but it has been succeeded by the Next Generation Internet initiative.

Protecting American Business Interests (1990–present)

In the 1990s, the reasons for federal action on IT changed somewhat. IT policy continues to be driven to some degree by a continuing concern about scientific research and education, as well as by national defense needs, but there has been a new emphasis on building an adequate workforce to meet the needs of American business. The continued improvement in the price performance of chips and the emergence of the Internet have created a heated demand for IT workers, not just among manufacturers of computer hardware or software, but in all sectors of industry and in the public sector. The Y2K problem has also added a sharp peak in episodic demand, with individual companies spending as much as hundreds of millions of dollars to make their computer systems Y2K-compliant.

In order to meet the concerns of the business community, the federal government would, of course, like to be in a position to regulate the number of IT workers so that supply and demand are equal. If supply is less than demand, wages are driven up (making American companies less competitive) and certain projects cannot be completed or are delayed. If supply is greater than demand, there are unemployed workers to deal with. In the market economy of the United States, however, the government has only limited ability to act. It can provide inducements—to individuals to train for a certain occupation, or to companies to hire more workers or to accept workers whose skills fall short of the employer's ideal—but it cannot mandate any of these outcomes.

The recent intervention of US government agencies to address the perceived shortages of scientific and technical workers does not provide an encouraging picture. During the late 1980s, senior management at NSF warned of looming “shortfalls” of scientists and engineers. These warnings were based on methodologically weak projection models of supply and demand that were originally misinterpreted as credible forecasts, rather than as simulations dependent upon certain key assumptions. The projections yielded numerical estimates of the shortfalls anticipated, eventually reported to be 675,000 scientists and engineers by the year 2006.

Based in part on these worrisome pronouncements, Congress increased funding for NSF science and engineering education programs. Several years later, in 1990, (p.83) again influenced by the shortfall claims, Congress agreed to greatly expand the number of visas available to foreign scientists and engineers, for both permanent and nonpermanent residents. Many educational institutions moved to increase the number of graduate students in science and engineering. By the time these large cohorts of graduate students emerged with their newly earned doctorates, the labor market had deteriorated badly, and many found their career ambitions seriously frustrated. This experience proved embarrassing, leading to congressional hearings in 1992 and harsh criticisms of NSF management from several prominent congressional supporters of science and engineering. This experience has served as a caution throughout the 1990s as Congress has considered acting on other technical labor shortage claims.86

In response to labor shortages of nurses reported by hospitals and other US employer groups, Congress passed the Immigration Nursing Relief Act of 1989, which provided nonpermanent (HIA) visas for registered nurses for a 5-year period. Responding in part to NSF's concerns about a shortage of scientists and engineers, Congress authorized a new temporary visa category (HIB) for technical workers and a few other specialty occupations (specialty foreign cooks and fashion models among them!). The records are not clear, but approximately 25,000 technical workers, including many computer scientists, were coming to the United States under this visa program in the mid-1990s (out of a total of 65,000 visa certificates provided each year).

In 1997, the Information Technology Association of America (ITAA), a large trade association, reported a shortage of 190,000 IT workers in the United States and a supply system that could not come close to meeting that demand. A second study conducted the following year by the ITAA, which surveyed mid-sized as well as large companies, showed an even larger shortage of 346,000 unfilled information technology positions—approximately 10 percent of all IT jobs in the United States. The Department of Commerce's Office of Technology Policy then issued a report that mirrored the ITAA findings. However, the General Accounting Office criticized the methodology used to gather the data put forward by both ITAA and Commerce, and questioned the existence of a shortage. A heated debate ensued, crystallized around industry's desire to increase the number of HIB visas that could be awarded annually. Compromise legislation was passed and signed into law in late 1998, approximately doubling the number of HIB visas that could be awarded for 3 years, before the cap reverted to the original limit of 65,000 visas per year. However, the newly increased cap was reached only 8 months into the government's fiscal year, and lobbying began in the summer of 1999 to increase the number of visas once again.

Underrepresented Groups and Affirmative Action (1980s and 1990s)

Several groups of Americans are represented in the IT workforce in percentages that are far lower than their percentage representation in the population as a whole.87 These include African Americans, Hispanics, Native Americans, and women generally.

(p.84) Women are heavily underrepresented both in IT occupations and at every educational level in the formal system for educating IT workers. According to the Department of Commerce, only 1.1 percent of undergraduate women choose IT-related disciplines, as compared with 3.3 percent of male undergraduates. Tables 3.1 and 3.2 provide statistics about the percentage of women being educated in IT fields.

Table 3.1. Number of degrees awarded in computer and information sciences by level and gender

Academic year   PhDs awarded   %Women   MS awarded   %Women   BA/BS awarded   %Women
1984–1985       240            10.0     6,942        28.9     38,589          36.8
1986–1987       374            13.9     8,481        29.4     39,590          34.7
1988–1989       551            15.4     9,414        28.0     30,454          30.8
1989–1990       627            14.8     9,677        28.1     27,257          29.9
1990–1991       676            13.6     9,324        29.6     25,083          29.3
1991–1992       776            13.8     9,534        27.8     24,578          28.7
1992–1993       808            14.7     10,171       27.1     24,241          28.1
1993–1994       810            15.4     10,416       25.8     24,200          28.4

Source: National Center for Education Statistics, Digest of Education Statistics.

Table 3.2. Degrees awarded in computer science by level and gender

Academic year   PhDs awarded   %Women   MS awarded   %Women   BA/BS awarded   %Women
1984–1985       326            11.0     –            –        –               –
1985–1986       412            12.1     –            –        –               –
1986–1987       559            9.7      –            –        –               –
1987–1988       744            9.0      5,159        –        12,687          –
1988–1989       807            13.3     5,457        –        9,681           –
1989–1990       907            12.6     5,116        –        9,681           –
1990–1991       1,074          12.1     4,993        –        9,353           –
1991–1992       1,113          11.3     5,121        –        9,813           –
1992–1993       997            13.3     4,523        –        8,218           –
1993–1994       1,005          15.6     5,179        19.1     8,216           17.9
1994–1995       1,006          16.2     4,425        19.7     7,561           18.1
1995–1996       915            11.7     4,260        20.0     8,411           15.9
1996–1997       894            14.4     4,430        22.3     8,063           15.7

Source: Computing Research Association, Taulbee Survey. PhD numbers for 1984–1986 are for CS and CE departments; all other years are CS departments only. A dash (–) indicates data not reported.

(p.85) Table 3.1 shows the number of women in formal degree programs in computer and information science at all US colleges and universities, whereas Table 3.2 shows the number of women in formal degree programs in computer science and computer engineering at only the PhD-granting institutions.88

One obvious pattern in these two tables is that the percentage of women entering the computer science pipeline and earning bachelor's degrees in these IT fields has been declining steadily since 1984. While the number of computer and information science degrees awarded decreased every year between 1986 and 1994, the decrease occurred at a proportionately faster rate for women. This is in contrast to general trends in the graduation figures of US colleges and universities over these same years, during which the percentage of bachelor's degree recipients who were women increased from 50.8 to 54.6 percent. It is also in contrast to the trends in scientific and engineering disciplines generally. The decrease in bachelor's degrees awarded to women has also affected the number of women in the graduate degree pipeline, contributing to the decrease in women completing a master's degree in the computer and information sciences area. The percentages at the doctoral level have stayed somewhat flat, with a reduction in the number of US women apparently offset by an increase in the number of female foreign students entering the system at the graduate level. There are no reliable data on the number of women in the IT workforce.

The decline in women engaging in formal IT training since 1984 is in sharp contrast to the pattern of the late 1970s and early 1980s. In that period, concerted efforts were made to recruit women to the field, and these efforts resulted in a rapid increase in the number of women students. Thus the subsequent decline in the percentage of women entering the field has been especially disheartening.

There has been much speculation about the reasons for the decline in women entering the IT training pipeline. Reasons cited include:

  • (1) lack of opportunity to gain early experience with the technology;

  • (2) lack of K-12 teachers and guidance counselors who are knowledgeable about the wide variety of career paths and opportunities in IT;

  • (3) an image of computing as involving a lifestyle that is not well rounded or conducive to family life;

  • (4) an image of IT work as being carried out in an environment in which one has to deal regularly with more competition than collaboration;

  • (5) courses in mathematics and science that are requirements for degree programs in computer science and computer engineering, which women have not been encouraged to pursue based on outdated stereotypes of aptitude and interest;

  • (6) a lack of women role models; and

  • (7) a large percentage of foreign-born teaching assistants and faculty, some of whom have cultural values that are perceived as not being supportive of women being educated or joining the workforce.89

(p.86) Various programs are now under consideration at the national level to increase the participation of women in science and engineering generally, and in the IT community in particular. However, as the discussion below on minorities indicates, the goals and means of these programs have to be framed in an acceptable way in a political climate that has become hostile to the affirmative action programs that have been in effect since the Johnson administration in the 1960s.

The number of persons from most minority groups training or working in information technology occupations is very low. While African Americans, Hispanics, and Native Americans comprise 23 percent of the US population, they make up only 4.5 percent of those holding science doctorates (considering all scientific fields, not just computer science). One probable reason is the small number of minority students moving through the educational pipeline. Considering only those students who graduate from college, the percentages of Native Americans, African Americans, and Hispanics receiving a degree in computer or information science are actually higher than the percentage among non-Hispanic white males. However, this promising statistic is more than offset by the fact that minorities attend college in much lower percentages than whites do. Table 3.3 shows the low percentages of African Americans, Hispanics, and Native Americans training in IT-related disciplines.

Many of the reasons that discourage women from IT careers also apply to minorities. There are very few minority role models in IT. Minority students are less likely to have computers at home or at school on which to gain early exposure to IT.90

Table 3.3. PhD degrees awarded in computer science and engineering by minority ethnicity

Academic year   PhDs awarded   African American (# / %)   Hispanic (# / %)   Native American (# / %)   Asian or Pacific Islander (# / %)   Other (# / %)
1984–1985       326            3 / 1.0                    7 / 2.1            –                         92 / 28.2                           –
1985–1986       412            6 / 1.5                    6 / 1.5            –                         151 / 36.7                          –
1986–1987       559            3 / 0.5                    9 / 1.6            –                         197 / 35.2                          –
1987–1988       744            6 / 0.8                    8 / 1.0            –                         281 / 37.8                          –
1988–1989       807            0 / 0.0                    12 / 1.5           –                         299 / 37.0                          –
1989–1990       907            4 / 0.4                    11 / 1.2           –                         281 / 31.0                          148 / 16.3
1990–1991       1074           8 / 0.7                    26 / 2.4           –                         349 / 32.5                          151 / 14.0
1991–1992       1113           11 / 1.0                   17 / 1.5           –                         412 / 37.0                          131 / 11.8
1992–1993       997            7 / 0.7                    13 / 1.3           –                         319 / 32.0                          118 / 11.8
1993–1994       1005           14 / 1.4                   9 / 0.9            0 / 0.0                   154 / 15.3                          76 / 7.6
1994–1995       1006           9 / 0.9                    28 / 2.8           1 / 0.1                   149 / 14.8                          92 / 9.1
1995–1996       915            11 / 1.2                   27 / 3.0           5 / 0.5                   143 / 15.6                          59 / 6.4
1996–1997       894            6 / 0.6                    3 / 0.3            0 / 0.0                   107 / 12.0                          60 / 6.7

A dash (–) indicates data not reported.

(p.87) Students who attend historically black colleges and universities face limited computing facilities, compared with students at the typical US college or university. But there are other reasons as well. For example, minority students who want to devote their lives to helping their communities do not regard IT as a social-conscience field. Students with that goal are much more likely to train for careers in law, medicine, or politics. Since the 1970s, NSF had reserved a portion—typically 15 percent—of its graduate research fellowships for underrepresented minorities. However, this practice was abandoned in 1999 in the face of a lawsuit from a white student who claimed that the separate competition discriminated against the majority population.91 This lawsuit is just one instance of a larger assault on affirmative action, which is causing federal agencies, universities, and private foundations to redesign their programs for underrepresented groups of all types.92 Institutions are turning to other kinds of programs. They are giving financial incentives, without quotas, to "majority" universities to lure and retain faculty and students from underrepresented groups. They are establishing programs targeted at indigent communities, rather than at groups defined by ethnicity or gender. They are establishing mentoring programs. They are brokering partnerships between research universities and nearby schools that have historically had large minority populations (e.g. Johns Hopkins University with the historically black Coppin State and Morgan State Universities in Maryland). It is too soon to know whether these programs will be effective or whether they will pass political muster.

A New Seed-Corn Problem? (1990s)

Many educators, industrial laboratory leaders, and government science officials are concerned that the high industrial demand for IT workers will siphon out of the higher educational system many students who would otherwise pursue an advanced degree. This diminishes the pool of people who will join the university faculties that perform basic research and teach the next generation of students.93 This problem is compounded when industry also successfully recruits current faculty members, including junior faculty who would become the academic leaders of the profession in the coming decades.

There are early signs of another cycle of "eating our seed corn." The conditions are similar in many ways to the situation in 1980, described in an earlier section. Aggressive recruiting by industry is luring high-quality undergraduates away from considering graduate school. Doctoral-caliber graduate students are leaving graduate programs after completing only a master's degree. Faculty members are shying away from high-pressure teaching positions. Burgeoning undergraduate enrollments are creating large class sizes, an inflated student-to-faculty ratio, and an overcommitted faculty.94 Not surprisingly, there has been a downward trend in the number of computer science doctorates awarded annually during the 1990s (1074 awarded in 1990–1991, 894 in 1996–1997).95 The share of new doctoral graduates (p.88) entering academia is slightly more than 40 percent when postdoctoral and academic research positions, as well as faculty positions, are counted. This percentage has not been increasing, so with fewer doctorates overall, the total number entering the teaching field has fallen. Meanwhile, the number of faculty positions being advertised has skyrocketed. Advertisements in Computing Research News, for example, have doubled in recent years.

Other signs of a seed-corn problem are appearing. Universities have already experienced severe faculty shortages in several research areas, including networking, databases, and software engineering. Faculty recruiting is becoming much more difficult. There are fewer qualified applicants, positions are taking longer to fill, and multiple positions are going unfilled—even at strong research universities. The general attitude of the computing research community at the moment is to monitor the situation closely, until the data and qualitative evidence make it more apparent that a serious seed-corn problem does indeed exist. If this is determined, then actions similar to those taken in the 1980s by government, industry, and academia working together may be warranted.

The situation today is, however, different in some respects from 1980.96 Today, computer facilities in universities are more comparable to those in industry than they were in 1980; and a healthy research program in experimental computer science now exists in the universities. However, the focus of university research has become much more short term than it used to be, making it less different from industrial research; this change has removed one incentive for faculty and graduate students to remain in the universities. High-level information technology professionals today are employed across a much larger number of employers, including many outside the IT sector. This makes it more difficult for companies to work together, as they did in the 1980s, to restrain the raiding of faculty and graduate students from universities.

Conclusion

The United States has never had a computer worker policy in the traditional sense of a planned economy in which central authorities dictate supply and demand of workers. Instead, the implicit worker policy has focused on improving the infrastructure for the supply system, providing incentives such as fellowships to individuals to enter the field, and more recently providing incentives such as tax credits to employers to do more in training their workers. The vast majority of federal efforts concerning computer workers fall under its policies for higher education and scientific research. Sometimes the worker issue has been an explicit goal in these policies, but seldom is it the single or defining issue. Often the primary intention of policies or programs that benefit the computer worker situation has been to build strong universities, strengthen the national scientific research effort, or ensure that defense needs are met.

Beginning in the 1960s, national policies that positively affected the computer workforce were tied to social welfare concerns. These included the use of the (p.89) computer to enhance educational opportunities for poor, rural, and minority communities as part of President Johnson's Great Society program, and the affirmative action programs that increased opportunities for women and minorities. The computerized education initiative failed, largely because the technology was not up to the task. The affirmative action programs are today being rapidly dismantled, and it is not clear what kinds of programs will replace them.

In the 1980s and 1990s, economic competitiveness has increasingly been a driving force in IT policy in general and in IT worker policy in particular. In this environment, IT worker policy has been addressed more directly, using traditional tools of government such as tax incentives and immigration law. There still seems, however, to be a reluctance to be heavy-handed in the use of these legislative remedies.

Acknowledgments

Thanks to Eleanor Babco and Catherine Gaddy of the Commission on Professionals in Science and Technology for collecting statistical data for this chapter. Paul Ceruzzi and Nathan Ensmenger kindly helped me to locate sources. Lisa Thompson provided some analysis of recent IT policy issues. Jean Smith provided editorial assistance. Many sections of this chapter rely heavily on an unpublished report for the NSF, written jointly by the author with Bernard Williams and Andrew Goldstein.97

Notes


(1.) Peter Freeman and William Aspray, The Supply of Information Technology Workers in the United States, Washington, DC, Computing Research Association, 1999, provides basic information and cites most of the relevant literature.

(2.) Arvid W. Jacobson, ed., Proceedings of the First Conference on Training Personnel for the Computing Machine Field, held at Wayne University, Detroit, Michigan, June 22 and 23, 1954, Detroit, MI, Wayne University Press, 1955. The quotation is from p. 81.

(3.) Wayne University Conference, p. 85.

(4.) Ibid., p. 3.

(5.) Ibid., p. 17.

(6.) Ibid., p. 11.

(7.) Ibid., p. 11.

(8.) Manpower Resources in Mathematics. National Science Foundation and the Department of Labor, Bureau of Labor Statistics.

(9.) Wayne University Conference, p. 82.

(10.) Eleanor Babco of the Commission on Professionals in Science and Technology prepared a data report based on data from the US Department of Labor, Bureau of Labor Statistics, Current Population Survey. See Appendix I.

(11.) Wayne University Conference, p. 16.

(12.) Ibid., p. 21.

(13.) Ibid., p. 79.

(14.) Wayne University Conference, p. 10.

(15.) Ibid., p. 10. In this era, Grace Hopper, Saul Gorn, and others talked of “automatic programming,” which one might think of as making the programming process relatively human-free. What it actually meant was the building of assemblers, compilers, and diagnostic tools. These did not make any given programming task less time-consuming for the human programmer; they simply opened the floodgates to doing more programming. See, for example, Symposium on Automatic Programming for Digital Computers, US Department of Commerce, Office of Technical Services, May 13–14, 1954, Washington, DC. This was also an era of great concern about automation and the displacement of workers. See, for example, the testimony of Vannevar Bush, John Diebold, Walter Reuther, and others in “Automation and Technological Change,” Hearings, 84th Congress, October 14–28, 1955; also “Automation and Recent Trends,” Hearings, Subcommittee on Economic Stabilization, Joint Economic Committee, 85th Congress, Vols 14–15, 1957. These automation issues were also a frequent subject in Fortune and Datamation.

(16.) Speakers from academia and industry described their various efforts to provide educational and training programs, but it would take us too far afield to discuss them here. See, for example, the talk by Harry Huskey (UC Berkeley) on “Status of University Educational Programs Relative to High Speed Computation,” pp. 22–25; Kenneth Iverson (Harvard University) on “Graduate Instruction and Research,” pp. 25–29; and M. P. Chinitz (Remington Rand) on “Contributions of Industrial Training Courses in Computers,” pp. 29–32. These issues are discussed in many other papers at the conference as well.

(17.) See the talk by Joseph Fishbach of the Ballistic Research Laboratories, pp. 72–74.

(18.) Wayne University Conference, p. 10.

(19.) This material is drawn from Claude Baum, The Systems Builders: The Story of SDC, Santa Monica, CA, SDC, 1981. For more on psychological profiling, see Gerald M. Weinberg, The Psychology of Computer Programming, New York, Van Nostrand Reinhold, 1971, which includes references to various papers and conference talks. The Association for Computing Machinery sponsored an annual computer personnel research conference, beginning in the early 1960s.

(20.) The material in this extended paragraph is taken from “Automation and Employment Opportunities for Office Workers,” US Department of Labor, 85th Congress, Bulletin 1241, October 1958.

(21.) Ibid.

(22.) Wayne University Conference, p. 11.

(23.) See William Aspray and Bernard O. Williams, “Arming American Scientists: NSF and the Provision of Scientific Computing Facilities for Universities, 1950–1973,” Annals of the History of Computing, 16: 60–74, 1994.

(24.) The history of DARPA's contribution to computer science is told best in Arthur L. Norberg, Judy O'Neill, and Kerry Freedman, Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986, Baltimore, MD, Johns Hopkins University Press, 1996.

(25.) Harry D. Huskey, “Status of University Educational Programs Relative to High Speed Computation,” in Jacobson, op. cit., pp. 22–25.

(26.) Gerald L. Engel, “Comparison of ACM/C3S and the IEEE/CSE Model Curriculum Subcommittee Recommendations,” Computer 12 (December): 121–123, 1977.

(27.) The six studies were the following:

  • NAS–NRC, Committee on Uses of Computers, J. Barkley Rosser, Chairman, Digital Computer Needs in Universities and Colleges, Washington, DC, NAS–NRC Publication 1233, 1966.

  • NSF, Working Group on Computer Needs in Universities and Colleges, “Federal Support of Computing Activities,” report presented to the Advisory Committee for Mathematical and Physical Sciences; see Summary, Minutes of the Advisory Committee, March 31–April 1, 1966, NSF Historian's Files; the Working Group minutes of May 13, 1965 and June 17, 1965; and the memo to Geoffrey Keller from Milton Rose, June 17, 1965, in “Documents cited by the Administrative History of NSF during the Lyndon Baines Johnson Administration,” draft copy in NSF Historian's Files.

  • President's Science Advisory Committee, Panel on Computers in Higher Education, (p.92) John R. Pierce, Chairman, Computers in Higher Education, Washington, DC, Government Printing Office, February 1967.

  • John W. Hamblen, “Computers in Higher Education: Expenditures, Sources of Funds, and Utilization for Research and Instruction 1964–1965, with Projections for 1968–1969,” Atlanta, GA, Southern Regional Education Board, 1967.

  • John W. Hamblen, “Inventory of Computers in U.S. Higher Education 1966–1967, Utilization and Related Degree Programs,” Atlanta, GA, Southern Regional Education Board, August 1, 1970.

  • John W. Hamblen, “Inventory of Computers in U.S. Higher Education 1969–1970, Utilization and Related Degree Programs,” Atlanta, GA, Southern Regional Education Board, March 1, 1972.

(28.) NAS–NRC, Committee on Uses of Computers, J. Barkley Rosser, Chairman, Digital Computer Needs in Universities and Colleges, Washington, DC, NAS–NRC Publication 1233, 1966.

(29.) It was estimated that this would require a total federal investment starting at $65 million and rising to $180 million per year, including a $200 million investment over the 4-year period. It was anticipated that American universities would have a need in this period for 20 very large (costing $2–8 million apiece), 30 large-to-medium ($500,000 to $2 million apiece for large systems and $300,000 to $1.5 million apiece for medium systems), and 800 small computing systems ($32,000–$180,000 apiece) to supplement those already in place. The report also proposed funding a regional centers program at the level of $10 million per year.

(30.) Those eight agencies were NASA, NSF, NIH, AEC, ARPA, AFOSR, US Army Research Office, and ONR. The Rosser Report (p. 29) indicates that the largest contributions to computing came, in roughly equal amounts, from NSF and NIH, with approximately 60% as much support from each of AEC and ARPA, and much smaller levels of support from the other four.

(31.) Thomas Keenan, oral history interview with author, Charles Babbage Institute archives, 1990.

(32.) Arthur Grad, oral history interview with author, Charles Babbage Institute archives, 1990.

(33.) Geoffrey Keller, Division Director for Mathematical and Physical Sciences, April 20, 1965, letter to Leland J. Haworth, Director, NSF; subject: National Science Board, Committee I meeting of April 14, planning for NSF support of computers and associated educational activities at universities; Number 43 in “Documents cited by the Administrative History of NSF during the Lyndon Baines Johnson Administration,” draft copy in NSF Historian's Files.

(34.) US House of Representatives, “Government and Science: Review of the National Science Foundation,” Hearings before the Subcommittee on Science, Research and Development of the Committee on Science and Astronautics, 89th Congress, 1st Session, June–August 1965, Volume I, Washington, Government Printing Office, 1965, pp. 651, 660, 667–669.

(35.) Hearings, June–August 1965.

(36.) Ibid., p. 747.

(37.) Ibid., pp. 785–786.

(38.) President's Science Advisory Committee, Panel on Computers in Higher Education, John R. Pierce, Chairman, Computers in Higher Education, Washington, DC, Government Printing Office, February 1967, p. 6.

(39.) Milton Rose, oral history interview with author, Charles Babbage Institute archives, 1990.

(40.) President Johnson's message was a direct response to the Pierce Report, orchestrated by Joseph Califano, Special Assistant to President Johnson (Rose, oral history).

(41.) According to Rose, there was concern in the Bureau of the Budget, shared by Joseph Califano, that there would be “political steam” behind this computers-in-education business and that this would lead to an uncontrolled source of funding for the Department of Education, which would not command the technical aspects well enough to use the funding prudently and effectively. It was therefore decided that the funding should go instead to the NSF to implement the recommendations of the Pierce Report. However, there was concern that the NSF would address only scientific developments in education and that it did not have any well-developed education programs of a general character (Rose, oral history). Also see Leland Haworth's memo to Joseph Califano, April 17, 1967, in Director's Note Files, NSF Historian's Files.

(42.) NSF, Advisory Committee for Computing Activities, Background Materials and Agenda of the Third Meeting, April 11–12, 1968, Record Accession No. 307-74-038, Box 1, Washington Federal Records Center.

(43.) NSF, Daniel Alpert, Chairman, Advisory Committee for Computing Activities, letter to Leland J. Haworth, December 20, 1968, Records Accession No. 307-70A-3621, Box 22, Washington Federal Records Center.

(44.) Industry forecasts projected that the need for programmers and analysts would grow by 50,000 per year, rising from 250,000 in 1965 to 750,000 in 1975. In 1965, combined enrollments in data processing and computer science programs were 4300 undergraduates and 1300 graduate students. (Draft of Statement of Dr Milton E. Rose, Head, Office of Computing Activities, before the Subcommittee on Science, Research, and Development of the Committee on Science and Astronautics, US House of Representatives, March 1969, copy in NSF, Office of the Director, Subject Files, 1969, Record Accession No. 307-75-052, Box 2, Washington Federal Records Center.)

(45.) NSF, letter to W. D. McElroy from S. D. Conte, Chairman, Computer Sciences Department, Purdue University, and Chairman, Advisory Panel for Computer Science, Supporting Documentation, January 16, 1970, Office of the Director, Subject Files—1970, Records Accession No. 307-75-053, Box 2, Washington Federal Records Center.

(46.) The base for this recommendation was the $1.3 million given in FY 1968 and $1 million in FY 1969 to support graduate (and one undergraduate) computer science programs, with average grant sizes of $150,000–$350,000.

(47.) John R. Pasta, Head, Office of Computing Activities, June 10, 1970, memorandum to David E. Ryer, Special Assistant to the Director, Subject: Comments on the OCA Committee Annual Report, NSF, Office of the Director, Subject Files—1970, Records Accession No. 307-75-053, Box 2, Washington Federal Records Center, p. 2.

(48.) Pasta, memorandum to David E. Ryer. See also the Office of Computing Activities Draft of Five Year Plan, April 27, 1970, attached to the Agenda for the Seventh Meeting of Advisory Committee for Computing Activities, June 11–12, 1970, NSF, Office of the Director, Subject Files—1970, Records Accession Number 307-75-053, Box 2, Washington Federal Records Center.

(49.) Office of Computing Activities, Draft of Five Year Plan, April 27, 1970.

(50.) Pasta answered such questions so often that he compiled a list of the most frequent ones: What is computer science? Why do we study it? Why not let industry do it? Does it have societal impact? Is complexity theory worth studying? What will come out of all this basic research? Is it relevant? (John R. Pasta, “Conclusions,” Director's Program Review: Computer Research, May 20, 1975, pp. 45–47.)

(51.) The companies were IBM, Burroughs, Honeywell, Univac, Digital Equipment, Bell Telephone Laboratories, GE, and Xerox.

(52.) The prominent example cited was work on computational complexity. Juris Hartmanis had to leave his employment at GE for Cornell University in order to pursue his interest in this subject. After he and his academic colleagues had developed it, IBM, GE, and Bell Telephone Laboratories became interested in the subject.

(53.) See Bruce L. R. Smith and Joseph J. Karlesky, The State of Academic Science, New York, Change Magazine Press, 1977, pp. 16–19.

(54.) Agenda, Mathematical and Physical Sciences Division Director's Retreat, November 15–16, 1982, Appendix, “Degrees Awarded in Computer Science,” Office of the Director Subject Files, File MPS, Records of the NSF, Accession No. 307-87-219, Box 3, Washington Federal Records Center.

(55.) The departments identified by Perlis as “strong” were California-Berkeley, Carnegie-Mellon, Harvard, Illinois, MIT, Michigan, New York University, Pennsylvania, Purdue, Stanford, and Wisconsin.

(56.) NSF, Report of the Meeting of the Advisory Committee for Mathematics and Physical Sciences, March 7–8, 1969, Office of the Director, Subject Files 1969, Records Accession No. 307-75-052, Box 2, Washington Federal Records Center.

(57.) Statement of Dr Milton E. Rose, Head, Office of Computing Activities, before the Subcommittee on Science, Research, and Development of the Committee on Science and Astronautics, US House of Representatives, March 1969, attached to Agenda, Fifth Meeting of Advisory Committee for Computing Activities, May 22, 1969, p. H-9.

(58.) NSF, letter to W. D. McElroy from S. D. Conte, Chairman, Computer Sciences Department, Purdue University, and Chairman, Advisory Panel for Computer Science, January 16, 1970, Office of the Director, Subject Files—1970, Records Accession No. 307-75-053, Box 2, Washington Federal Records Center.

(59.) In its first year (FY 1968), the OCA made awards to Johns Hopkins, Ohio State, and New York University to improve their graduate programs and to Colgate University to establish an undergraduate program. Grants totalling $1.0 million were made in FY 1969 to graduate programs at California-Berkeley, Purdue, the University of Rhode Island, the University of Southern California, SUNY-Stony Brook, and Washington University, St Louis.

(60.) NSF, letter to W. D. McElroy from S. D. Conte, Chairman, Computer Sciences Department, Purdue University, and Chairman, Advisory Panel for Computer Science, January 12, 1970, Office of the Director, Subject Files—1970, Records Accession No. 307-75-053, Box 2, Washington Federal Records Center.

(61.) Kent Curtis, “University and Industry Research,” Computer Science, Director's Program Review, May 20, 1975, NSF.

(62.) See Harry Hedges, oral history interview, Charles Babbage Institute archives, 1990. Also the Snowbird Report: Peter Denning, ed., “A Discipline in Crisis,” Communications of the ACM, 24 (June): 370–374, 1981.

(63.) Agenda, Mathematical and Physical Sciences Division Director's Retreat, 15–16 November 1982, Appendix, “Degrees Awarded in Computer Science,” Office of the Director Subject Files, File MPS, Records of the NSF, Accession No. 307-87-219, Box 3, Washington Federal Records Center.

(64.) The 1978 Snowbird Conference is summarized in J. F. Traub, “Quo Vadimus: Computer Science in a Decade,” Communications of the ACM, 24 (June): 351–355, 1981. The 1980 Snowbird Conference is reported in Denning 1981.

(65.) Jerome A. Feldman and William R. Sutherland, eds., “Rejuvenating Experimental Computer Science: A Report to the National Science Foundation and Others,” Communications of the ACM, 22 (September): 497–502, 1979.

(66.) ACM pointed to other studies that corroborated the findings of the Feldman Report: Daniel D. McCracken, Peter J. Denning, and David H. Brandin, “An ACM Executive Committee Position on the Crisis in Experimental Computer Science,” Communications of the ACM, 22 (September): 503–504, 1979. The President's Federal ADP Reorganization Study (FADPRS) reported that shortages of computer science personnel might impede technological advance. The Council on Wage and Price Stability (COWPS) ruled that (in certain instances) computer scientists (but not systems analysts or programmers) were an “endangered species” and therefore could be excluded from the President's wage and price guidelines.

(67.) The Research Equipment Program was continued into the 1980s. By 1986 state-of-the-art machines had been provided to eighty universities and colleges. See “LRP material submitted to OBAC,” April 25, 1986, in CISE Administrative Records, NSF, p. 4.

(68.) McCracken, Denning, and Brandin, op. cit., p. 504.

(69.) Ibid.

(70.) W. Richards Adrion, oral history interview, Charles Babbage Institute archives, 1990.

(71.) Adrion, oral history.

(72.) Hedges, oral history.

(73.) Ibid.

(74.) In 1983, the ratio of computer science bachelor's degrees awarded to faculty was twice that in electrical engineering and four times that in any other related discipline. The ratio of full-time computer science graduate students to faculty was 50% higher than in electrical engineering and other engineering fields, and three times as high as in other related disciplines.

(75.) Kent Curtis, “Computer Manpower—Is there a Crisis?,” in Robert F. Cotellessa, ed., Identifying Research Areas in the Computer Industry to 1995, Park Ridge, NJ, Noyes, 1984, pp. 12–53.

(76.) Many different people referred to the case of Larry Smarr, an astrophysicist at the University of Illinois who had to travel to Germany to do his research calculations, and who today directs the national supercomputer center at the University of Illinois. When asked whether the Bardon–Curtis recommendations should be implemented, Smarr waxed patriotic:

America cannot possibly afford not to make this investment. America won't be the leader in basic research and technology if it pulls back from this…This investment will completely revitalize (p.95) American universities, industries, and American basic research…The only barrier to America surpassing other countries in basic research is a lack of leadership and national will, a lack of vision. We make the computers. We have the scientists. We have the money. We need the vision. (As quoted by Gene Dallaire, “American Universities Need Greater Access to Supercomputers,” Communications of the ACM, 27(4): 292–298, 1984, quoted from p. 294.)

(77.) NSF, “Prospectus for Computational Physics,” Report by the Subcommittee on Computational Facilities for Theoretical Research to the Advisory Committee for Physics, Division of Physics, March 15, 1981.

(78.) An organizing committee was formed of representatives of NSF, DOE, NASA, ONR, NBS, and AFOSR.

(79.) “Report of the Panel on Large Scale Computing in Science and Engineering,” Peter D. Lax, Chairman, December 26, 1982, NSF. (Agenda for Workshop on Large-Scale Computing for Science and Engineering, June 21–22, 1982, p. 20.)

(80.) NSF, “A National Computing Environment for Academic Research,” prepared under the direction of Marcel Bardon by the NSF Working Group on Computers for Research, Kent K. Curtis, Chairman, July 1983.

(81.) Colin Norman, “Supercomputer Restrictions Pose Problems for NSF, Universities,” Science 229(12): 148, 1985.

(82.) W. R. Adrion, D. J. Farber, F. E. Ko, L. H. Landweber, and J. B. Wyatt, “A Report of the Evolution of a National Supercomputer Access Network: Sciencenet,” NSF, 1984.

(83.) Eliot Marshall, “NSF Opens High-Speed Computer Network,” Science 243(6): 22–23, 1989.

(84.) Tracy LaQuey and Jeanne C. Ryer, The Internet Companion: A Beginner's Guide to Global Networking, Reading, MA, Addison-Wesley, 1993, pp. 194–195.

(85.) See Marjory S. Blumenthal, “Federal Government Initiatives and the Foundations of the Information Technology Revolution: Lessons From History,” Clio and the Economic Organization of Science, AEA Papers and Proceedings, 88(2): 34–39, 1998.

(86.) This paragraph and the previous one are taken almost directly from Freeman and Aspray, op. cit., p. 46, and are based in part on analysis by Michael Teitelbaum of the Alfred P. Sloan Foundation.

(87.) Much of the material in this section is taken from Freeman and Aspray, op. cit., ch. 7. Thanks to Mary Jane Irwin of Pennsylvania State University for her help with the analysis of the women's issue.

(88.) Table 3.2 works from a smaller sample, but it provides more current information. The percentages of women in bachelor's and master's programs are much lower in Table 3.2, which is attributed not to any methodological problem with either dataset, but rather to the fact that Table 3.1 includes information systems degrees and Table 3.2 does not.

(89.) Some methodologically rigorous research on this issue is under way. See Allan Fisher and Jane Margolis, Computer Science Department, Carnegie-Mellon University, “Women in Computer Science: Closing the Gender Gap in Higher Education,” www.cs.cmu.edu/~gendergap.

(90.) Minority students who do have computers in their schools are more likely to use them for drilling repetitive math skills than for simulations and real-life applications of mathematical concepts. See Educational Research Service, “Does it Compute? The Relationship between Educational Technology and Student Achievement in Mathematics.”

(91.) David Kestenbaum, “DOD Axes Grant Student Program,” Science, 26 June 1998, issue 5372, p. 2037.

(92.) Examples of these assaults, in addition to the NSF lawsuit, are a 1996 California referendum (Proposition 209) that prohibits race-based criteria in admissions and hiring at state institutions, and the Hopwood v. Texas federal appellate court ruling that sets similar rules for Texas, Louisiana, and Mississippi. For a general discussion of this issue of science programs for minorities, see Jeffrey Mervis, “Wanted: A Better Way to Boost Numbers of Minority Ph.D.s,” Science, 28 August 1998, pp. 1268–1270.

(93.) This discussion of the seed-corn problem in the 1990s is taken from Freeman and Aspray, op. cit.

(94.) See the anecdotal account of this situation in Bronwyn Fryer, “College Computer Science Enrollment Skyrockets,” Computerworld, October 22, 1998.

(95.) Computing Research Association, Taulbee Surveys.

(96.) The President's Information Technology Advisory Committee has addressed some of these issues in its report to the president. See www.ccic.gov/ac/report.

(97.) William Aspray, Bernard O. Williams, and Andrew Goldstein, Computer as Servant and Science: The Impact of the National Science Foundation, report to the NSF, 1991.