Annals Hist Comput (1989) 10: 257-275
© American Federation of Information Processing Societies

Electronics Technology and Computer Science, 1940-1975: A Coevolution

PAUL CERUZZI
This paper explores the relationship between two disciplines, electrical engineering and computer science, over the past 40 years. The author argues that it was the technology of electronics--the exploitation of the properties of free electrons--that finally permitted Babbage's concepts of automatic computing machines to be practically realized. Electrical Engineering (EE) activities thus "took over" and dominated the work of those involved with computing. Once that had been done (around the mid-1950s), the reverse takeover happened: the science of computing then "took over" the discipline of Electrical Engineering, in the sense that its theory of digital switches and separation of hardware and software offered EE a guide to designing and building ever more complex circuits.

Categories and Subject Descriptors: K.2 [Computing Milieux]: History of Computing--hardware, software, systems, theory. A.1 [General Literature]: Introductions and Survey.

General Terms: Design, Reliability, Theory.

Additional Terms: Computer Science, Electrical Engineering.
Introduction
In 1976, a colorful brochure put out by the IBM
Corporation had a startling title: “It Was to Have
Been the Nuclear Age. It Became The Computer
Age: the Evolution of IBM Computers” (Figure
1).¹ Leaving aside the question whether it is proper
to identify any period of time by a piece of tech-
nology, the title does call attention to the fact that
the computer seems to have sprung up suddenly
and unexpectedly, to dominate much of the na-
tion’s technology, economy, and culture.
But a sophisticated description of a digital computer appeared in the writings of Charles Babbage in the 1830s. Why, then, the appearance of a "computer age" in the past three decades, and not sooner? A complete answer to this question would include a mix of economic and social as well as technical factors. This essay focuses on an aspect of the internal development of computer technology that was as important as any: namely that after 1940, Babbage's conceptual formulation of a computer was joined to another technology that was well suited to its realization. That technology was electronics.
¹An earlier version of the paper was presented to the annual meeting of the Society for the History of Technology (SHOT), October 23, 1986, Pittsburgh, Pennsylvania.
Author’s Address:
Dept. of Space Science and Exploration,
National Air and Space Museum, Smithsonian Institution,
Washington, D.C. 20560. (202) 357-2828.
Electronics emerged as the “technology of
choice” for implementing the concept of a com-
puting machine between 1940 and 1955. As it did
so, it enabled persons not trained in Electrical
Engineering to exploit the power and versatility
of computers. This activity led to the study of
“computing” independently of the technology out
of which “computers” were built. In other words,
it led to the creation of a new science: “Computer
Science.”
The term “coevolution” implies that there was
a continuous and reciprocal interaction between
electronics and computing. Such interaction did,
in fact occur. As computer science matured, it re-
paid its debt to electronics by offering that en-
gineering discipline a body of theory which served
to unify it above the level of the physics of the
devices themselves. In short, computer science
provided electrical engineering a paradigm, which
I call the “digital approach,” which came to de-
fine the daily activities of electrical engineers in
circuits and systems design.2
Though continuous, the interaction between
Computer Science and Electrical Engineering was
marked by two distinct phases. In the first phase,
between 1940-1955, electronics took over the
practice of computing. In the second, from 1955
to 1975, computing took over electronics. I shall
look at each in turn.
Between 1940 and 1950, a scattered group of
persons, without knowledge of one another, put
Babbage’s ideas into working machinery. These
inventors were interested in building machines
that could carry out a sequence of elementary
arithmetic operations, store intermediate results,
and recall those results automatically as needed,
and display or print the final results of the cal-
culation. They were not, for the most part, con-
cerned with the engineering details of their im-
plementation, except insofar as they wished to
have a machine that worked reliably (Cohen
1985). As things turned out, the first reliable,
working computers-in other words, the first
machines to implement Babbage’s idea of an au-
tomatic computing machine--used relays or sim-
ilar electromechanical elements to carry and ma-
nipulate numbers. Using relays (a technology
borrowed from the telephone industry), George
Stibitz of Bell Laboratories and Konrad Zuse of
the Henschel Aircraft Company in Berlin each
built calculators that could carry out three to five
arithmetic operations a second. And using a com-
bination of relays and toothed wheels borrowed
from punched-card accounting machines, How-
ard Aiken at Harvard built a powerful “Auto-
matic Sequence Controlled Calculator” with a
similar operating speed (Ceruzzi 1981).
Relay computers played the vital role of intro-
ducing the concept of automatic, sequential cal-
culation to an often skeptical community. It was
with electromechanical relay technology that the first automatic calculators, finally, after years of hope and promise, came into existence. But almost from the start they were eclipsed by machines using the much faster vacuum tube as their computing element. The story of the invention of the electronic digital computer has been told elsewhere, and in that story the issue of the vacuum tube's perceived unreliability, as well as its heavy power demands, are among the difficulties cited for the initial skepticism as to its practicality. These were indeed serious issues, but they were addressed. Once they were, vacuum tube technology, with its higher operating speeds, was perceived as an alternative to relays.
²Throughout this paper I will be concentrating on that branch of Electrical Engineering that is more accurately described as "electronic" engineering. This term will be defined later in the text, but essentially I will not address that branch of EE that deals with Power Engineering or the so-called "Heavy Currents."
One reason for the rapid ascendancy of elec-
tronic devices for computing elements was that
events during the war, mainly unrelated to
building computers, had transformed electronics
itself, raising it above the level considered (and
rejected) by the computer pioneers like Aiken or
Stibitz. One development-radar-was critical,
and became the bridge across which electronics
entered the realm of computing.
Paul Ceruzzi was born in Bridgeport, Connecticut, and attended Yale University, where he received a B.A. in 1970. He attended graduate school at the University of Kansas, from which he received his Ph.D. in American Studies in 1981. His graduate studies included a year as a Fulbright Scholar at the Institute for the History of Science in Hamburg, West Germany, and he received a Charles Babbage Institute Research Fellowship in 1979. Before joining the staff of the National Air and Space Museum, he taught History of Technology at Clemson University in Clemson, South Carolina.

Dr. Ceruzzi's main scholarly work has been in the history of computing since 1935. He has written a book on this subject (Reckoners, the Prehistory of the Digital Computer, 1935-1945, Greenwood Press, 1983), and he is presently working on a major new gallery at the National Air and Space Museum about the computer's impact on air and space flight.
Figure 1. Brochure from IBM, undated, about 1976 (IBM Corporation). The cover reads, "It was to have been the Nuclear Age. It became..."
The role of radar is not usually considered as part of the generational lineage of computer his-
tory, in contrast to, say, the invention and mar-
keting of mechanical adding machines. Yet among
those involved in modern computing’s first de-
cade, there was no question as to its influence.
Radar required vacuum tube circuits that han-
dled discrete pulses of current at high frequen-
cies, in contrast to radio transmitters and receiv-
ers which had been the mainstay of prewar
electronics engineering. Both these requirements
matched the needs of computer engineering. Ra-
dar sets typically contained over a hundred tubes,
again in contrast to the more modest four- or five-
tube radio sets of the day.
One link from radar to the computer was the
mercury delay line: an ingenious and tricky de-
vice developed with some difficulty for storing
radar pulses. After the war those who had ex-
perience with it (e.g., Maurice Wilkes at Cam-
bridge University in England and Presper Eckert
in Philadelphia) could adapt it for use as a com-
puter memory device. Those who were less fa-
miliar with it had less success, many (e.g. Aiken)
believing that such a device was so fragile that
it would never work in a computer (Wilkes 1985,
p. 128). Mercury delay lines were indeed difficult
to build and operate; nevertheless they played the
role of being the memory device for four of the
first five stored program computers to be built in
the United States and England: EDSAC, BINAC, SEAC, and Pilot ACE (the exception was the Manchester computer, later called "Mark I," which used a Williams-tube memory).³
In 1953, when the IRE Proceedings issued a special "Computer Issue," Werner Buchholz, the guest editor, stated that although many computer projects were started during WW II,

. . . Still, the present growth of the computer in-
dustry did not start until the results of the enor-
mous development of electronic technology dur-
ing World War II were brought into the field. It
is interesting to note that many computer proj-
ects started around a nucleus of wartime radar
experts. Electronics not only provided the tech-
nological means for greatly increased speed and
capacity, and thereby enhanced the usefulness of
computers many times, but the availability of
cheap, mass-produced components and of engi-
neers trained to use them made it possible to ex-
periment on a greater scale and at a lower cap-
ital investment than before (IRE 1953, p. 1220).
As a result of that development of electronics
technology between 1939 and 1945, employment
in the American electronics industries had risen
from 110,000 to 560,000 (Electronics 1980, pp. 150-210).
The vacuum tube ascendancy was not imme-
diate, however. Experience with radar had at-
tacked many of the problems of reliability, but
these problems still remained. Just as serious was
the fact that the much faster operating speeds of
tubes required a rethinking of the overall struc-
ture of a computing machine, especially the way
it received its instructions. High electronic speeds
meant nothing if the computing circuits received
their orders by mechanical devices such as paper
tape or punched card readers. Likewise the high
arithmetic speeds had to be carefully matched to
equally high speeds for storing and retrieving in-
termediate results from memory devices. It was
also recognized that higher arithmetic speeds re-
quired greater memory capacities. Each of the first
electronic calculators (i.e., machines whose com-
puting program was not directed by a stored pro-
gram), namely the Atanasoff computer, the
ENIAC,
the British Colossus, and the IBM SSEC,
addressed these problems in different, and in
hindsight inelegant, ways. Electronic computing
was held up by the need for a consensus on what
a digital computer ought to look like.⁴

³See Table 1 for a full listing of these and other early computers and their memory devices.
This last bottleneck was broken in 1945, with
the emergence of the concept of the stored pro-
gram principle as the way to organize the various
units of a computer. The origins of this concept
are a matter of controversy, but the informal dis-
tribution, in 1945 and 1946, of a “First Draft of
a Report on the
EDVAC"
by John von Neumann
was what brought the computing community a
general awareness of the concept (von Neumann
1945).
Von Neumann's report described the EDVAC in terms of its logical structure, using a notation borrowed from neurophysiology. The EDVAC's implementation in vacuum tube circuits, though mentioned, is not the focus of von Neumann's energies. Instead he focuses on the main functional units of the computer--its arithmetic unit, its memory, input and output, and so on. The report also described the idea, and the advantages, of storing both instructions and data in one high-speed internal memory.
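To make the stored-program idea concrete, the following sketch (a modern illustration in Python, not anything drawn from the EDVAC report itself; the three-instruction repertoire is invented for the example) keeps instructions and data in a single memory and runs a fetch-decode-execute loop over it:

    # A minimal, hypothetical stored-program machine: instructions and data
    # share one memory, and the processor fetches its orders from that same
    # store rather than from paper tape or punched cards.
    def run(memory):
        acc = 0      # accumulator
        pc = 0       # program counter: next cell to fetch an instruction from
        while True:
            op, addr = memory[pc]        # fetch and decode
            pc += 1
            if op == "LOAD":             # acc <- memory[addr]
                acc = memory[addr]
            elif op == "ADD":            # acc <- acc + memory[addr]
                acc += memory[addr]
            elif op == "STORE":          # memory[addr] <- acc
                memory[addr] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program, cells 4-6 hold data; the machine itself
    # cannot tell the difference. The program computes memory[6] = memory[4] + memory[5].
    mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", 0), 2, 3, 0]
    print(run(mem)[6])   # prints 5

Because the program resides in the same high-speed store as the data, the machine can fetch its next order at electronic speed, which is what made the high arithmetic speeds of vacuum tubes worth having.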
The “First Draft” had an effect on every aspect
of computing. One effect was to hasten the de-
mise of the relay and assure the place of elec-
tronic circuits as the technology of choice for
building a computer. Once the logical design of
a digital computer was laid out in a way not tied
to a specific technical implementation (as von
Neumann’s was), then it became no more diffi-
cult to construct a computer according to that de-
sign using vacuum tubes than it was to construct
it out of relays or anything else. There were
problems of reliability with tubes, but these were
not overwhelming, nor were they that much
greater than similar problems (e.g., transient faults) with relays (Stibitz 1945). What that meant was that for an incremental investment of time and money to utilize vacuum tube technology, one got a thousandfold increase in speed. That advantage was overwhelming, and it meant that the argument of tubes vs. relays was over before it had a real chance to begin.

⁴It is sometimes argued that those who were skeptical of vacuum tube technology because of its alleged unreliability were correct, as evidenced by the fact that vacuum tubes were eventually themselves replaced by the presumably more reliable "solid-state" devices such as diodes, transistors, and later on integrated circuits. Transistors did offer far greater reliability and lower power consumption than tubes, but it was a full decade after the transistor's invention that it became practical to produce transistors in quantity with uniform characteristics such that they could be used in computers. In the intervening decade (1948-1958), it was questions first of logical structure, and then of memory technology, that dominated debates over computer design.
Maurice Wilkes, whose
EDSAC
was among the
first stored program computers to be completed,
in 1949, was among those who saw the Report as
the answer to many of the organizational prob-
lems associated with building computers:
. . .
In [the
EDVAC
Report], clearly laid out, were
the principles on which the development of the
modern digital computer was to be based: the
stored program with the same store for numbers
and instructions, the serial execution of instruc-
tions, and the use of binary switching circuits for
computation and control. I recognized this at once
as the real thing, and from that time on never
had any doubt as to the way computer develop-
ment would go (Wilkes 1985, p. 109).
As the first stored program electronic com-
puters finally began operating in the early 1950s
their superiority was quickly recognized. The Ab-
erdeen Proving Ground provided a good test en-
vironment-at that facility a variety of mechan-
ical and electronic calculators, relay sequence
calculators, and stored program computers were
installed by 1953. Franz Alt, who was at Aber-
deen at that time, later remarked:
. . . relay computers were in competition with
them [electronic computers], and they didn’t hold
their own. They were much too slow by compar-
ison . . .
After a few years people lost interest
in
them. They had been built as an insurance
against the possibility that electronic computing
might not work (Alt 1969).
The range and evolution of machine types, and the emergence of the stored program approach, is revealed by the listing in Table 1 of digital computer installations, broken down by their type of design. (Analog computers and devices are examined in a separate section.) Table 1 summarizes the various types of automatic computing machines built and installed between 1940 and 1955. For each machine, a date is given for its completion or first installation, followed by an estimated number of installations (in many cases this information is approximate).⁵
Table 1. Computer Installations, 1940-1955.

Name                                  Year      No. Installed

I. Electromechanical and Electronic Calculators

A. Relay or Mechanical Calculators
Bell Labs Model 1                     1940      1
Zuse Z-3                              1941      1
Bell Labs Model 2                     1942      1
Harvard Mark I                        1944      1
Bell Labs Model 3                     1944      1
Bell Labs Model 4                     1945      1
Zuse Z-4                              1945      1
Bell Labs Model 5                     1946      2
Harvard Mark II                       1947      1
ARC (see text)                        1948/52   1
Bell Labs Model 6                     1949      1
ONR/ERA relay computer "Abel"         1950      1
BARK                                  1950      1
AURA                                  1952      1
NEC Mark I (Tokyo)                    1952      1

B. Electronic Calculators, General Purpose but Externally Programmed
IBM PSRC                              1944      5
ENIAC                                 1945      1
IBM 603 Multiplier                    1946      100
IBM 604                               1948      5600 eventually
IBM SSEC                              1948      1
Harvard Mark III                      1949      1
Northrop Aircraft/IBM CPC             1949      700
Harvard Mark IV                       1952      1
ERA Logistics Computer                1953      1
Burroughs E-101                       1955      100
Monrobot                              1955      5 approx.
Elecom 50                             1955      2
C. Special Purpose Electronic Calculators
Jaincomp                              1950      4
USAF Fairchild                        1950      1
OMIBAC                                1950      1
Teleregister                          1952      >4
SPEEDH                                1952      1
Reservisor                            1953      1
BAEQS                                 1954      >3
Magnefile                             1954      1
TRADIC (Bell Labs)                    1954      >1
MDP-MSI (Haller Ray & Brown)          1955      1
MIDSAC                                ?         1

D. Digital Differential Analyzers
Northrop MADDIDA                      1949      15
CRC 101 & 105                         1951      >5
QUAC                                  1952      1
Bendix D-12                           1954      2
Wedilog                               ?         1
Table 1. Computer Installations, 1940-1955 (continued).

Name                                  Year      No. Installed

II. Stored Program Electronic Computers

A. Serial, "EDVAC type"
BINAC                                 1949      1
EDSAC                                 1949      1
SEAC                                  1950      1
Pilot ACE                             1951      1
RAYDAC                                1951      1
UNIVAC                                1951      46 eventually
C.S.I.R.O. Mark I (Australia)         1952      1
CUBA (France)                         1952      1
EDVAC                                 1952      1
LEO                                   1952      1
DYSEAC                                1953      1
FLAC                                  1953      1
MIDAC                                 1953      1
DEUCE                                 1954      3 by 1955, 32 eventually
B. Drum Memory, "ERA 1101 type"
ERA 1101                              1950      1
OARAC                                 1952      1
Burroughs Laboratory Comp.            1952      5 approx.
CADAC 102                             1952      14
Elecom 100                            1952      5 approx.
PTERA                                 1952      1
Bendix G-15                           1953      >400 eventually
CALDIC                                1953      ?
CIRCLE                                1953      ?
Hughes Airborne                       1953      5 approx.
IBM 650                               1953      >2000 eventually
MINIAC                                1953      3
ALWAC                                 1954      5
ORDFIAC                               1954      1
WISC                                  1954      1
PENNSTAC                              1955      1
LGP 30                                1955      >100 after 1955
READIX                                1955      1
C. Parallel Memory, "von Neumann type"
Whirlwind                             1950      1
SWAC                                  1950      1
Manchester (Ferranti) Mark I          1951      9
AVIDAC                                1951      1
IAS (von Neumann)                     1951      1
ILLIAC                                1952      1
IBM 701                               1952      19 by 1955
MANIAC                                1952      1
NAREC                                 1952      1
ORDVAC                                1952      1
ARC (see text)                        1948/52   1
ERA 1103                              1953      10
JOHNNIAC                              1953      1
IBM 702                               1954      14
Ferranti Mark II                      1954      19
NORC                                  1954      1
ORACLE                                1954      1
IBM 704                               1955      1 in 1955, many later
The figures verify Alt’s impressions of this era:
relay calculators and computers initiated the dig-
ital era, but they were quickly eclipsed by elec-
tronic devices. Of the electronic machines, gen-
eral purpose calculators having a limited degree
of programmability (e.g. Northrop/IBM CPC, IBM
603/604) were installed in large numbers in the
early part of this period, and served as the work-
horse of digital computation until inexpensive
stored program computers became available, be-
ginning around 1953. Special purpose electronic
machines, including the Digital Differential
Analyzer, likewise fall into this category.
Stored program electronic computers did not
begin to appear in large numbers until around
1953, especially with the introduction of the IBM
650. Those that were installed in large numbers
tended to be the slower, but less expensive, drum
types (such as the 650). These numbers should be
considered in the context of the greater speed,
memory capacity, and overall computing power
of the large scale machines such as the
UNIVAC
and IBM 701, of which only a few were installed
in the early years.
The "von Neumann type" design (stored program, parallel memory access), which provided the fastest performance, and which became the standard architecture down to the present day, was slow in being established in the form of installed machines, even though its advantages were widely known through a series of reports by von Neumann and others at the Institute for Advanced Study after 1946. This was mainly due to the fact that the viability of the parallel architecture depended on a reliable, fast, and relatively cheap memory device; something which did not really become commercially available (in the form of magnetic cores) until the end of this era.

Perhaps symbolic of this phase of the history of the computer was the ARC computer, built by A. D. Booth of London's Birkbeck College, and first operational in 1948. Booth had followed the American developments closely, and was convinced early on of the advantages of the von Neumann, stored program approach to computer design. He proceeded to design and build a computer along these lines, however using relays instead of vacuum tubes in the interests of saving money and time. But almost as soon as the ARC was completed, he set out replacing the relay circuits with their equivalent vacuum tube circuits to implement the same logical functions (Booth 1949).
⁵This table is a summary of a report that has been compiled from various sources, primarily the ONR Digital Computer Newsletter and the Ballistic Research Laboratory's Survey of Automatic Computers, three volumes of which were also published during this interval (Weik 1955). The complete listing, with references for each machine, is on file at the Computer Museum, Boston, and the Charles Babbage Institute, Minneapolis. In looking at this table it is important to recognize that the definition of computing machine was not constant during that period, to the extent that a machine that might appear on the table as a "computer" in 1945 would not qualify as a "computer" by 1955. However, in all cases I have sought to include machines that went at least a step beyond performing simple arithmetic on pairs of numbers, but which could with some degree of automatic control evaluate mathematical expressions of at least a modest length.
Beginnings of Computer Science,
1955-1975
Between 1955 and 1975, a science of computing,
in North America adopting the name “Computer
Science,” emerged. Its focus was on the elec-
tronic, stored program digital computer (with its
magnetic core memory) invented in the previous
decade. As this new science emerged, the elec-
tronics technology from which it sprang changed
in response. The first change was a steady pro-
gression of computer related papers in the Elec-
tronics journals.⁶ Eventually it affected the very
definition of “electronics” itself. Before examin-
ing the emergence of a science of computing after
1955, consider the accepted definition of “elec-
tronics” at that time.
The many changes in the practice of electron-
ics during the Second World War, of which com-
puting was but one, led electrical engineers to
reexamine the nature of the discipline. The orig-
inal definition of electronics was that of the
movement of electrons in a gas or vacuum, and
was intended to distinguish radio and commu-
nications work from the power engineering out
of which Electrical Engineering first emerged
(McMahon 1984). This definition stemmed from
Edison’s observation in 1883 that an evacuated
tube could carry a current, and from the inven-
⁶There is a steady increase of computer related papers, as indexed in Science Abstracts, B (Electrical Engineering), from zero in 1945 to 105 in 1965. In 1960, computer papers were divided into analog and digital, with analog papers dropping off to about 25 in 1965. After 1965, as argued later in this paper, digital computing began to dominate all electronics papers, so that the frequency of computer papers indexed is no longer a measure of its dominance in the field.
tion of the diode and triode vacuum tubes in the
first decade of the 20th century.7
With the advent of servomechanisms, radar,
computers, and the transistor (which did not in-
volve movement of electrons through a vacuum),
the definition had to change. In a guest editorial
in a 1952 issue of the IRE Proceedings, William
Everitt proposed a new definition:
Electronics is the Science and Technology which
deals primarily with the supplementing of man’s
senses and his brain power by devices which col-
lect and process information, transmit it to the
point needed, and there either control machinery
or present the processed information to human
beings for their direct use (Everitt 1962, p. 899).
In subsequent issues, several readers objected,
saying that the notion of “information” was too
vague. Many felt that Everitt was correct in
broadening it beyond the movement of electrons
in a vacuum, but they suggested that a better,
yet still precise, definition might be something
along the lines of “the movement of electrons, in
solid, gas, or vacuum" (McMahon 1984, pp. 231-232).
The increasing awareness of the computer as
a machine that integrates all aspects of infor-
mation handling, including communication, was
implied in a 1959 address by Simon Ramo, of
Hughes Aircraft, to the Fifth National Commu-
nications Symposium, where he proposed a new
term, “Intellectronics,” defined as “the science of
extending man's intellect by electronics" (Ramo 1959).⁸ And Zbigniew Brzezinski coined the term
“Technetronics” to describe essentially the same
transformation of society (Brzezinski 1970).
By 1977 the computer-and-information defi-
nition had become accepted, at least as a general
overall definition of electronics, as indicated by
the lead article by John Pierce for a special issue
of Science on "The Electronics Revolution":
⁷Shortly after Edison's observation, J. J. Thomson explained the effect by hypothesizing that a stream of negatively charged particles carried the current through the vacuum. In 1894 these particles were given the name "electrons"--hence "electronics"--by the Irish physicist George Stoney. Alan Turing, whom I shall describe later in this paper as one of the founders of Computer Science, was a distant relative of Stoney (Hodges 1983).
⁸Ramo's term did not catch on, although a decade later a group of engineers working in what has since become known as "Silicon Valley" founded a company called "Intel," a contraction of either Ramo's term or of the words "Integrated Electronics" (Hanson 1982, Chapter 5). Today, what in America is known as "Computer Science" is called "Informatics" in Continental Europe.
What is electronics? Once we associated elec-
tronics with vacuum tubes, but vacuum tubes are
almost obsolete. Perhaps electronics is semicon-
ductor devices. But then, what of magnetic cores
and bubbles and liquid crystals? I think that
electronics has really come to mean all electrical
devices for communication, information process-
ing, and control . . . (Pierce 1977).⁹
Recent publications hint at a new definition
that is in the same spirit as Everitt’s 1952 defi-
nition of electronics as a matter of communica-
tion and control. Mainly as a result of the de-
velopment of so-called very large scale integration
(VLSI)--integrated circuits with hundreds of
thousands of elementary devices on one chip-
there is a perception that it is the job of electri-
cal engineers to “manage complexity.” Although
it is still of concern to design the elementary
transistors, resistors, and so on, what is now the
critical issue is how to interconnect thousands,
even millions of similar and fairly simple devices
to one another.
In the final chapter of Karl Wildes' and Nilo
Lindgren’s book on the history of Electrical En-
gineering and MIT, for example, several current
faculty and administrators are asked to define
what they see as the essence of their depart-
ment. Most agree that the computer, and espe-
cially its implementation in VLSI circuits, had
come to dominate the practice of the electrical
engineer.¹⁰ For one of the administrators inter-
viewed (Fernando Corbato), “if there is a single
theme . . . it is the problem of complexity (Wildes
and Lindgren 1985).¹¹ This last definition, if it
becomes generally accepted, reveals a strong in-
⁹Notice that although the definition of electronics-as-information resolved the question of whether solid-state devices like the transistor and integrated circuit properly belonged to electronics, it introduced the confusion of allowing electromagnetic relay devices to be included. While in the general view there is nothing wrong with this inclusion, when applied to the context of the early digital computer era, it is inappropriate.
¹⁰They further agreed that the decision made in the late 1970s to keep Computer Science as a part of EE and not let it break off as did many other universities was a wise one; the name of the department is now Electrical Engineering and Computer Science. I shall have more to say on this later.
¹¹One example might serve to illustrate the increase in complexity of electronic circuits that has taken place over the past few decades: When the World Trade Center was built in Lower Manhattan beginning in the late 1960s, it displaced a block known as "Radio Row": a group of shops selling surplus radio parts, mainly of World War II vintage. Though many lamented the passing of this area, it should be noted that in terms of active circuits, the entire contents of all the shops on Radio Row are today contained on one or two VLSI chips.
fluence from the theory and practice of comput-
ing.
Definition of “Computer Science”
While the definition of electronics had been
changing, a new term appeared, a term whose
definition would also evolve: “Computer Sci-
ence.” Though it is now one of the most popular
subjects taught in universities, “Computer Sci-
ence” has had a variety of definitions.12 Its his-
tory is brief, and any statements as to what it is
become dated quickly. Nevertheless there are
general areas of agreement as to its nature.
One definition, stated in its extreme, is that it
is not a science at all. (In debates found in the
letters column of computer trade and profes-
sional journals, one sometimes reads the aphor-
ism, “Any science that needs the word ‘science’
in its name is by definition not a science.“) A less
extreme form of this statement is that computer
science is driven by electronics technology, and
that computer scientists do little more than ob-
serve and collate into general rules the behavior
of the machines the engineers supply them.
Though useful, such rules are not scientific prin-
ciples such as those on which Electrical Engi-
neering itself was based (e.g. Maxwell’s theory of
electromagnetism, semiconductor physics). As
such, computer science is not a true science but
one of the “engineering sciences,” in Edwin Lay-
ton’s terms, which “took on the qualities of a sci-
ence in their systematic organization, their reli-
ance on experiment, and in the development of
mathematical theory”
(Layton 1971). Computer
science’s rules and laws tend to be about observ-
able and practical things, such as the time or the
memory requirements for a certain program that
sorts a file, and not about the fundamental prop-
erties of computing, whatever they may be.
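As a concrete (and purely illustrative) example of such a rule, one can measure how many comparisons a simple sorting program makes as the file grows, and observe the roughly quadratic growth empirically, without deriving it from any deeper principle:

    import random

    # Hypothetical illustration: an empirical "rule" about a sorting program,
    # obtained by counting its comparisons rather than by theory.
    def insertion_sort(items):
        items = list(items)
        comparisons = 0
        for i in range(1, len(items)):
            j = i
            while j > 0:
                comparisons += 1                     # one comparison per test
                if items[j - 1] <= items[j]:
                    break
                items[j - 1], items[j] = items[j], items[j - 1]
                j -= 1
        return items, comparisons

    for n in (100, 200, 400):
        _, count = insertion_sort([random.random() for _ in range(n)])
        print(n, count)    # the count roughly quadruples each time n doubles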
This perception has its historic roots in the
1945-1955 period, when it took heroic engineer-
ing efforts to get a computer to work at all. The
feeling was that any talk about a theory of com-
puters was premature when it is questionable
whether one could get a stored program com-
puter to run without failure for
more
than a few
minutes at a time. (The one exception to this was
¹²A recent survey of colleges and universities shows that in 1983, 25,000 bachelor's degrees were awarded in Computer Science, compared with 18,000 in Electrical Engineering, 11,000 in Chemistry, 12,600 in Mathematics and Statistics, and 3800 in Physics (Gries et al. 1986).
of course the “theory” of the stored program as
stated in von Neumann’s
EDVAC
Report.) This at-
titude survives among contemporary computer
engineers, who although often too young to know
of the difficulties of electronic computing’s first
decade, are nonetheless paid only to get a ma-
chine “out the door” of the factory. Other than
having the machine pass certain benchmarks to
validate its performance, they are not concerned
with what the customer (including the computer
scientists) does with it (Kidder 1981).
One of the first general textbooks on elec-
tronic computers stated flatly that “The out-
standing problems involved in making machines
are almost all technological rather than mathe-
matical" (Bowden 1953). As computer science ma-
tured, others noted that for at least a century there
had been a tradition in a branch of mathematics
of studying the notion of a mechanistic process,
a tradition that began with George Boole’s In-
vestigation into the Laws of Thought
. . . , pub-
lished in 1854, and which included the work of
Frege, Hilbert, Gödel, and many others on the for-
malization of mathematical expressions (van Hei-
jenoort 1967; Aspray 1980). But this tradition”
. . . was smothered after 1940 by a great tech-
nological explosion" (Wegener 1970). In the fa-
mous Moore School lectures held in 1947, shortly
after the public unveiling of the
ENIAC,
there was
almost no mention of any of these men and
women-despite the fact that the title of the lec-
tures was “Theory and Techniques for Design of
Electronic Digital Computers” (Moore School
1947). In particular, the tone of the Moore School
sessions was that any discussion of the theory of
computer design had to take a back seat to the
pressing technical problem of designing a fast and
reliable memory system.¹³ (The stored program
computers then being built used for their mem-
ory either mercury delay lines or specially built
television tubes. Both methods had extreme lim-
itations, especially regarding reliability, cost per
¹³For example, Claude Shannon's work, which showed that the rules of symbolic logic were well suited as a design tool in the construction of relay circuits that performed arithmetic, was hardly mentioned at all. For many of the participants at these lectures the important thing was to find out what kind of hardware was inside the many "block diagrams" the lecturers kept putting on the blackboards. As noted above, the one exception was the theory of computer design as described by von Neumann, but even in this instance there was a feeling at the sessions that theoretical design for the EDVAC was being emphasized too much, in the place of a more narrative description of the ENIAC, a computer based on an ad hoc theory but one that was at least functioning in 1947.
bit of storage, and capacity (Redmond and Smith 1980).)
A few years later, a few companies and uni-
versities had succeeded in building working com-
puters. Among them there was a modest debate
over the value of logical state diagrams as a guide
to computer design, as opposed to straight engi-
neering borrowed from radar circuits that han-
dled electrical pulses. For a while the former was
known as the “West Coast” approach to computer
design, but by the mid 1950s the debate fell by
the wayside as computer design finally estab-
lished itself on a firmer foundation of symbolic
logic (Sprague 1972).
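The kind of correspondence at stake, noted above in connection with Claude Shannon's work, can be suggested with a small sketch (an illustration only, not an example taken from Shannon or from the debates just described): the Boolean expressions for a one-bit half adder translate directly into networks of switches, whether those switches are relays or tubes.

    # Symbolic logic as a circuit-design tool: a one-bit half adder.
    # Each Boolean operator corresponds to a small switching network.
    def half_adder(a, b):
        sum_bit = a ^ b      # exclusive OR of the two input bits
        carry = a & b        # AND of the two input bits
        return sum_bit, carry

    # The truth table enumerates every switch setting and its contact closures.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))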
Another indication of how technology drove
perceptions of computing comes from the way the
history of computers was marketed. When IBM
introduced the System/360 series of computers
in 1964, they helped promote the notion of com-
puters belonging to three "generations," defined
according to the technology by which they were
implemented: vacuum tubes, transistors, and in-
tegrated circuits. This classification, which has
since become accepted and even expanded on (viz.
the Japanese “Fifth Generation” project), had at
least two effects on the perception of the history
of computing: first, it relegated all computer ac-
tivities before the
ENIAC
into a limbo of either
“prehistory” or irrelevant prologue; second, it de-
fined progress in computing strictly in terms of
the technology of its hardware circuits.
Recently this view of computer science being
technology driven has been repeated by C. Gor-
don Bell, for many years chief of engineering for
the Digital Equipment Corporation and one of the
inventors of the minicomputer. Speaking of the
“invention” of the personal computer in the late
1970s he said:
A lot of things are called inventions when, ac-
tually, they were inevitable. I believe technology
is the driving devil. It conspires, and if there's a concept half-there or a computer half-designed, technology will complete it (Bell 1985).
Elsewhere, he has said of the computer industry:
It is customary when reviewing the history of an
industry
to
ascribe events to either market pull
or technology push. The history of the com-
puter industry . . . is almost solely one of tech-
nology push (Bell et al. 1978).
And in Tracy Kidder’s chronicle of the team of
young engineers in 1973 who were building a new
computer for Data General:
Some engineers likened the chips to an unassem-
bled collection of children’s building blocks. Some
referred to the entire realm of chip design and
manufacture as ‘technology,’ as if to say that
putting the chips together was something else. A
farmer might feel this way: ‘technology’ is the new
hybrid seeds that come to the farm on the rail-
road, but growing those seeds is a different ac-
tivity-it’s just raising food (Kidder 1978, p. 122).
After 1955 there arose compelling arguments
that computer science was a genuine science, al-
beit one that differed in many ways from the
classical sciences. The first argument to appear
was one that emerged in response to the pressure
that computing activities were putting on tradi-
tional disciplinary boundaries, especially in the
universities. With one exception (Wiesner, noted
below), the first published statements about
“computer science” revealed a perception that a
science was
being born, and it needed to be es-
tablished at least on organizational and admin-
istrative grounds; the question of just what it
“was” could be answered later. By the late 1950s
it was recognized that many topics that had much
in common with each other (and all in common
with the computer) were being taught in various
departments around most universities. The feel-
ing was that those who were concerned with the
computer aspects of their work in these other de-
partments would perhaps not be recognized and
adequately rewarded by their peers for doing good
work. Establishing a separate department of
“computer science” would address that concern.
By the second volume (1959) of the Communications of the ACM (the flagship journal for the
Association for Computing Machinery), the term
“computer science and engineering” had begun to
appear. That September, an article entitled “The
Role of the University in Computers, Data Processing, and Related Fields," by Louis Fein, dis-
cussed the need to consolidate, under a single or-
ganizational entity, the various studies orbiting
around the computer in various academic de-
partments such as business and economics,
mathematics, linguistics, library science, phys-
ics, and electrical engineering. After mentioning
a few names for this entity, including “informa-
tion sciences” (which he attributed to Jerome
Wiesner), “intellectronics” (which he says was sug-
gested by Simon Ramo), and "synnoetics" (which
Fein himself had suggested elsewhere) he sug-
gested "the 'computer sciences'"; later in the ar-
ticle shortened to “Computer Science,” (singular,
and in quotations). This I believe is the origin of
the term (Fein 1959; Fein 1961; Leech 1986).
In describing what this new discipline was, Fein
made the further point about what it was NOT:
“Too much emphasis has been placed on the com-
puter equipment in university programs
that in-
clude fields in the ‘computer sciences’ . . . In-
deed an excellent integrated program in some
selected fields of the computer sciences should be
possible without any computing equipment at all,
just as a first-rate program in certain areas of
physics can exist without a cyclotron” (Fein 1959,
p. 11).
The establishment of Computer Science’s ad-
ministrative and organizational boundaries was
followed five years later by the first attempts to
establish it as a true science based on its internal
nature. These attempts centered on the concept
that, like any other, Computer Science is the sys-
tematic study of certain phenomena, only in this
case the object of study is an artificial, not a nat-
ural one. In other words, Computer Science is
simply the study of computers. It is not to worry
that computers are artificial and not natural
phenomena. This was the argument of Herbert
Simon, author of
The Sciences of the Artificial,
and of his colleagues Allen Newell and Alan Per-
lis, then at Carnegie-Mellon University, whose
letter to the editor of
Science
in 1967 was the first
explicit statement to this effect (Newell et al.
1967).
For Newell, Perlis, and Simon, Computer Sci-
ence is the study of computers, just as Astronomy
is the study of stars. But in Astronomy, stars will
be stars, no matter what the astronomer says
about them. This is not so with computers. If what
the computer scientist says about computers in
theory does not agree with observed behavior, he
or she can always change the computer (more
correctly, get an electrical engineer to change the
computer). For most of the period 1950-1980,
there was sufficient continuity of the von Neu-
mann, stored program model of computer archi-
tecture that this did not present a problem.¹⁴ As
long as the basic architecture remains constant,
so too will the definition emphasize the non-
hardware aspects of computing, such as in the
1976 Encyclopedia of Computer Science:

Computer Science is concerned with information processes, with the information structures and procedures that enter into representations of such processes, and their implementation in information processing systems. . . . The central role of the digital computer in the discipline is due to its near universality as an information processing machine. With enough memory capacity, a digital computer provides the basis for modeling any information processing system, provided the task to be performed by the system can be specified in some rigorous manner. . . . the stored program digital computer . . . provides a methodologically adequate, as well as a realistic, basis for the exploration and study of a great variety of concepts, schemes, and techniques of information processing (Amarel 1976).
¹⁴Lately, with the development of so-called parallel multiprocessors, things have changed, and there is a corresponding change in the attitude of computer science that increasingly sees hardware issues as relevant. We may now see a return to the pre-1950 era when hardware issues dominated discussions of the theory of computing. I discuss this issue further in a later section of this paper.
Since the publication of the Newell et al. letter
to
Science,
a different internal definition has
emerged, and it is this one which dominates the
day-to-day practice of computer science today, es-
pecially in the universities. That is the definition
of computer science as the study of algorithms--effective procedures--and their implementation
by programming languages on digital computer
hardware. Implied in this definition is the notion
that the algorithm is as fundamental to comput-
ing as Newton's Laws of Motion are to physics;
thus Computer Science is a true science because
it is concerned with discovering natural laws about
algorithms which are not engineering rules of
thumb or “maxims,”
in Layton’s terms (Layton
1971, p. 566; Vincenti 1979). Computer Science
thus becomes a science because it has a theoret-
ical foundation on which to build, such as: “The
notion of a mechanical process and of an algo-
rithm (a mechanical process that is guaranteed
to terminatei are as fundamental and general as
the concepts that underlie the empirical and
mathematical sciences (Wegener 1970, pp. 70-
78)." It is no coincidence that this theoretical
foundation is essentially based on the work of
Hilbert, Turing, Church, and others whose pre-
1940 work in mathematics was neglected by those
building the first electronic computers of the
1940s.¹⁵
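What "a mechanical process that is guaranteed to terminate" means can be seen in the standard textbook example of Euclid's algorithm for the greatest common divisor (used here purely as an illustration; it is not one of the examples cited above):

    # Euclid's algorithm: a mechanical process guaranteed to terminate.
    # Termination argument: the second argument strictly decreases at every
    # step and is bounded below by zero, so the loop cannot run forever.
    def gcd(a, b):
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))   # prints 21

Statements about such procedures--that they halt, how fast they halt, how their cost grows with the size of the input--are claims about the algorithm itself, independent of whatever hardware executes it.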
The event that, more than any other, gave the
algorithmic basis currency was the publication,
in 1968, of a book entitled
Fundamental Algorithms.
¹⁵A few texts in Computer Science assert that the fundamental principle of Computer Science is the so-called Turing-Church hypothesis, which, informally stated, says that all algorithmic procedures are equivalent to a class of mathematical functions known as general recursive functions (Aspray 1980, Ch. 2). At the same time it should be noted that few if any textbooks in Computer Science devote much space to an elaboration of this hypothesis.
Figure 2. Preliminary curriculum from the ACM's Committee on Computer Science, 1965. [Note the single course on "electronics" under the headings of "supporting" and "other electives."] (From ACM Curriculum Committee on Computer Science, "An Undergraduate Program in Computer Science: Some Recommendations," Communications of the ACM, 1965, p. 546.)
first published statements about what we now call
“computer science.” In an address at a ceremony
opening the IBM San Jose Laboratory in 1958,
Jerome Wiesner made the following comments:
Information processing systems are but one facet
of an evolving field of intellectual activity called
communication sciences. This is a general term
which is applied to those areas of study in which
interest centers on the properties of a system or
the properties of arrays of symbols which come
from their organization or structure rather than
from their physical properties; that is, the study
of what one MIT colleague calls ‘the problems of
organized complexity' (Wiesner 1958).¹⁹
Currently this definition’s most vocal proponent
has been Edsger W. Dijkstra. Beginning in the
¹⁹The "MIT colleague" is not identified.
late 1960s, Dijkstra consistently argued that de-
spite an apparent basis on algorithms, Computer
Science departments were teaching only engi-
neering rules of thumb about programming lan-
guages. For Dijkstra it was (and is) imperative
that Computer Science distance itself from not
only hardware issues but also from the mastery
of programming languages as its principal activ-
ity in the schools. His writings, which often take
the form of brief notes, serially numbered and
privately circulated to his colleagues, echo Wies-
ner’s statement as well as those of MIT’s Elec-
trical Engineering Department, for example:
. . .
[N]ow the teaching of programming com-
prises the teaching of facts--facts about sys-
tems, machines, programming languages etc.--
and
it is very easy to be explicit about them, but
the trouble is that these facts represent about 10
percent of what has to be taught; the remaining
90 percent is problem solving and how to avoid
unmastered complexity (Dijkstra 1982, p. 107).
. . But programming, when stripped of all its
circumstantial irrelevancies, boils down to no
more and no less than the very effective thinking
so as to avoid unmastered complexity (Dijkstra
1982, p. 163).
For him, the goal of computer science was to con-
cern itself with the attempt “to define program-
ming semantics independently of any underlying
computational model . . . or, in other words, to
“forget that program texts can also be inter-
preted as executable code” (Dijkstra 1982, p. 275).
But although Dijkstra is held in high esteem by
academic computer scientists, Computer Science
as it is taught (especially in the United States)
emphasizes the study and mastery of existing
programming languages (and operating systems)
that can be executed on existing digital com-
puters.
Based on these observations, Computer Sci-
ence, as it was formally recognized and taught
between 1955 and 1975, was an engineering sci-
ence, according to Layton’s term. It also fits Wal-
ter Vincenti’s criteria for engineering science in
that in its first two decades, progress in Com-
puter Science occurred in the absence of any for-
mal or useful theory (Vincenti 1979, pp. 742-746).²⁰
But in the context of its roots in formal mathe-
matics (and in its ever increasing levels of ab-
straction and formality since 1975) it is now a
pure science, albeit one that is still groping for
an agreed upon set of fundamental principles and
one that has a different character from classical
physics or chemistry.
The issue of what is Computer Science ironi-
cally has little to do with its having been shaped
by administrative, government, military, and
university policies-indeed, the same sort of pol-
icy factors are characteristic of nearly all post-
World War II science, including (even especially)
physics. Computer Science concerns the system-
atic study of algorithms, especially in the expres-
sion of those algorithms in the form of computer
programs that can be executed on commercially
²⁰In one aspect Computer Science represents a departure from Vincenti's thesis. That is his assertion (Vincenti 1979, p. 746) that ". . . the use of working scale models is peculiar to technology. Scientists rarely, if ever, have the possibility of building a working model of their object of concern." As noted by Newell et al. above, the object of study for Computer Science is precisely such a model--a universal model at that.
sold digital computers. Computer hardware, and
hence electrical engineering, are part of com-
puter science, but at present the university struc-
ture of computer science departments treats com-
puting hardware as a given, and the more one
can ignore purely hardware issues the more
progress can be made on the study of algorithms.
To a lesser extent there is a trend to look at ex-
isting programming languages in the same way.
Analog vs. Digital
Despite the many pieces of common ground be-
tween computing and electronics. the two activ-
ities remained distinct. One reason was due to a
fundamental difference in the ways each disci-
pline approached the handling of signals of elec-
tron currents, a difference that had characterized
the evolution of each discipline from its earliest
days. Electrical Engineering evolved as a study
of using devices (like the vacuum tube) to am-
plify continuous signals (McMahon 1984). Com-
puting Engineering, later on Computer Science,
was concerned with using electrons to count and
switch. The one was analog, the other digital.
Analog computing devices, electronic or other-
wise, belong to the history of computers (the
ENIAC
owed as much a debt to wartime analog comput-
ing projects as it did to radar or to digital me-
chanical computing). But analog computers do not
belong to Computer
Science,
as the discipline es-
tablished itself in the late 1950s. The reason is
simple: Computer Science centers on the pro-
grams that execute algorithmic procedures; but
whatever advantages analog machines have over digital
machines, their inability to be programmed eas-
ily put them forever at a disadvantage, and pre-
cluded their being part of the discipline. Several
examples illustrate this difference.
The first concerns the fate of the various Dif-
ferential Analyzer projects, centered at MIT un-
der the leadership of Vannevar Bush. These ma-
chines were certainly among the world’s first
“computers,” in the sense that they were the first
machines capable of automatically evaluating
fairly complicated mathematical expressions. Yet
they never really fulfilled their promise, and of
all the reasons this was so, it was the difficulties
involved in reprogramming them for different
tasks that was decisive (Owens 1986).²¹
²¹Owens argues, for example, that it was the Rockefeller Analyzer's inability to be reprogrammed easily that was the main cause for its quick demise. Ironically, in grappling with this problem for the Rockefeller Analyzer, Perry Crawford and others at MIT developed their plugboard system of programming for it, a technique which was then applied to the programming of the ENIAC.
Another example concerns the introduction of
digital computing to the aviation industry. This
community, consistently one of the largest cus-
tomers for computer equipment, had a long tra-
dition of using analog computing devices for air-
craft stability and control, as well as for ground
based applications such as missile tracking and
guidance. But at the same time this industry was
among the first to adopt the digital approach, as
soon as it felt that the problems of reliability and
size could be managed. They did so, despite nu-
merous engineering difficulties, again because of
the digital computer's greater flexibility (Ceruzzi 1989).
And at MIT in the early 1950s, a strong re-
search program had developed on automatic con-
trol of machinery, especially for automating fac-
tory processes and production. This work had its
roots in the (essentially analog) engineering of
MIT’s Servomechanisms Laboratory. By 1952,
when a special issue of Scientific American on
“Automatic Control” appeared, it was recognized
that digital techniques were preferable in all but
one aspect. That was the ability of analog devices
to operate in “real time.” But it was also noted
that the steady progress of digital computing in-
dicated that these machines would soon have the
speeds necessary for such operations. And when
they did, automatic control would be done by dig-
ital computers (Ridenour 1952).
Throughout the 1950s and 1960s, advances in digital computer programming permitted its incursion into areas where analog devices had formerly held sway: consider the replacement of the slide rule by the electronic calculator, or the replacement of the engineer's drafting table and machinist's jigs by CAD/CAM (Noble 1984). The final blow, and the event which completed Computer Science's break with Electrical Engineering, was the discovery of the Fast Fourier Transform, an algorithm which permitted digital computers to tackle signal processing and analysis, a discovery which "thus penetrated the major bastion of analog computation" (Newell 1982).
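To make that point concrete, the sketch below (plain Python, added for this edition as an illustration and not taken from the paper or its sources) shows the radix-2 Cooley-Tukey recursion usually associated with the Fast Fourier Transform: by splitting the transform into even- and odd-indexed halves, it reduces the work from roughly N^2 operations for the naive discrete Fourier transform to about N log N, which is what made digital signal processing competitive with analog filtering.

```python
import cmath

def fft(samples):
    """Recursive radix-2 Cooley-Tukey FFT.

    `samples` must have a length that is a power of two. The divide-and-
    conquer recursion does O(N log N) work, versus O(N^2) for the naive
    evaluation of the discrete Fourier transform sum.
    """
    n = len(samples)
    if n == 1:
        return list(samples)
    even = fft(samples[0::2])   # transform of the even-indexed points
    odd = fft(samples[1::2])    # transform of the odd-indexed points
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle
        out[k + n // 2] = even[k] - twiddle
    return out

# An 8-point transform of a simple rectangular pulse.
spectrum = fft([1, 1, 1, 1, 0, 0, 0, 0])
print([round(abs(x), 3) for x in spectrum])
```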
In short, analog computing faded because no one was able to build an analog computer that had the property of being universally programmable in the sense of a Turing Machine. And in triumphing over analog, digital computing established the notion that the study of the software side of computing was a valid activity, and it validated the departments of Computer Science (not Electrical Engineering) as the places where this study would take place. Computer Science thus emerged and was split from Electrical Engineering between 1940 and 1970 as a result of the resolution of the analog/digital split in favor of digital.22

22 The combining of the two in MIT's administrative structure is an isolated case. To the extent it points to a future trend, it is because digital computing has run up against some fundamental limits, including the basic quantum granularity of materials. Therefore progress in computing may require a turning away from the basic principles on which Computer Science has been founded, including, among others, the superiority of digital over analog.
Computer Science Takes over Electronics
Electronics technology took over computing; after Computer Science established itself, it repaid the debt by taking over electronics. That is, the notion of using electronic components as digital switches, to perform functions that are specified not by their circuit wiring but by "software," came to dominate the activities of the electronics engineer. This notion had its origins in von Neumann's EDVAC Report, and by the end of this period it had been formalized by computer scientists. They in turn furnished a paradigm that became an organizing force for the practice of electronics engineering above the physics and engineering of the device level. In other words, in the early decades of computing (1940-1960), the theory of computing drew its conceptual framework from electronics. In the next two decades (1960-1980) computing supplied to electronics a paradigm, namely the digital approach, that has reshaped that discipline.
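The contrast between function fixed by wiring and function supplied by software can be suggested with a toy sketch (plain Python, a hypothetical illustration added for this edition, not anything described by Ceruzzi): the same general-purpose switching element behaves as an AND gate, an XOR gate, or anything else, depending only on the table of bits loaded into it.

```python
def programmable_gate(truth_table):
    """A two-input 'switching element' whose behavior is fixed by data,
    not by wiring: truth_table maps an input pair (a, b) to an output bit."""
    return lambda a, b: truth_table[(a, b)]

# The same mechanism, 'programmed' three different ways.
AND = programmable_gate({(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1})
XOR = programmable_gate({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})
NAND = programmable_gate({(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0})

def half_adder(a, b):
    """A half adder composed from the 'soft' gates: returns (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))   # (0, 1): sum 0, carry 1
print(NAND(1, 1))         # 0
```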
Consider the following example: the 25 October, 1973 issue of Electronics was devoted to "The Great Takeover." The magazine's publisher introduced the issue by saying:

The proliferation of electronics' multifarious technologies into new products, new applications, and new markets - indeed, into new services never before considered possible - is the theme of this special issue of Electronics. On the cover, we have called the pervasive movement of electronics into just about every area of human endeavor "The Great Takeover." And, in many ways it was inevitable, as the cost advantages of
electronic technologies and cost-effectiveness of electronic solutions took over more and more jobs from the venerable mechanical and electromechanical technologies. (Electronics 1973, p. 4)
Inside, a 100-page section described in detail how electronic devices were rapidly sending older technologies to the scrap-heap. Examples included retail sales (where point-of-sale terminals were replacing cash registers), pocket calculators replacing slide rules, electronic circuits replacing mechanical clockwork in watches, computers in banking replacing older posting and accounting techniques, and many others.
Although nowhere was it explicitly stated in this issue, the reason electronics was taking over the world was that digital circuits were taking over electronics. Every example given described a digital circuit. Buried at the end of the section on "technology" was a half-page piece entitled "Don't Forget Linear" (p. 84). The implication was that "linear" circuits, that is, analog circuits, were indeed all but forgotten. And of the linear circuits described, a large percentage were those that performed a conversion between analog and digital.
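What such converter circuits do can be sketched in a few lines (plain Python, an idealized illustration added for this edition rather than a description of any device in the issue): sample a continuous waveform at a fixed rate and quantize each sample to one of a small number of integer codes, the step at which an analog signal becomes material for digital processing.

```python
import math

def adc(signal, duration, sample_rate, bits):
    """Idealized analog-to-digital conversion: sample, then quantize.

    `signal` is a function of time returning a value in [-1.0, 1.0];
    the result is a list of integer codes in the range [0, 2**bits - 1].
    """
    levels = 2 ** bits
    step = 1.0 / sample_rate
    codes = []
    t = 0.0
    while t < duration:
        v = max(-1.0, min(1.0, signal(t)))                   # clip to the input range
        codes.append(round((v + 1.0) / 2.0 * (levels - 1)))  # map [-1, 1] to an integer code
        t += step
    return codes

# A 5 Hz sine wave sampled at 40 Hz by a 4-bit converter.
print(adc(lambda t: math.sin(2 * math.pi * 5 * t), 0.2, 40, 4))
```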
After about 1973, Electronics Engineering became digital computer engineering. Radio and communications applications, from which electronics sprang and which dominated it in its earlier period, were still there, but insofar as they were, they were treated as a subset of digital techniques. Even the humble radio, from which modern electronics engineering grew, has now lost its tuning dial to a calculator-style digital keypad. Thanks to the mass-produced microprocessor, it has become easier to take a computer and program it to "act like a radio" than it is to design and build a radio from scratch.
Analog circuits, now called "linear applications," are still found of course, but they occupy an inferior position. Many do believe, however, that progress in computer engineering will hinge on a redefinition and breakthrough in analog circuits, as digital circuits approach the physical limits of the ultimate granularity of matter (Sutherland and Mead 1977).23 Digital computing techniques, and their expression in the microprocessor, offer overwhelming advantages over any other approaches to, say, building a radio or an automatic control system or whatever. Thus it has become not only possible but compelling to recast as much of electronics practice as possible into a digital computing mold. Increasingly, digital computing appears as a natural extension of the very properties of electronics that have always been part of its appealing characteristics.

23 In recent years computer science has had to return to a closer look at technological questions. This phenomenon is outside the scope of this paper, but briefly it can be summarized as a reaction to the introduction of the microprocessor in 1974, which has driven the cost of computing to near zero. Ivan Sutherland, a founder of computer graphics, and Carver Mead, one of the founders of a theory of VLSI, summed up this phenomenon as follows: "Computer science has grown up in an era of computer technologies in which wires were cheap and switches were expensive. Integrated circuit technology reverses the cost situation, making switching elements essentially free and leaving wires as the only expensive component" (Sutherland and Mead 1977, pp. 210-228). As a result, the theory of computing has to be revised, but as of their writing (1977) this revision had only begun. With the advent of the microprocessor it is now more practical to stamp out very complex computers on a chip, and then deliberately hobble them to do a more mundane task, than it is to design and build from scratch the simpler circuit to do that mundane task.
Conclusion
Electronics took over computing in the late 1940s
because of its inherent advantages over other
techniques. Digital computing, the theory for
which grew out of Computer Science, took over
electronics because it provided a path for those
inherent advantages to progress. In sum, those
advantages are as follows:
Like electronics in general, digital computing offered speed. The "instant" communications offered by the Morse telegraph were matched by the relentless drive by the computer engineer for faster switching circuits, and by the computer scientist's development of algorithms that do not "blow up," that is, take up exponentially greater numbers of cycles as the complexity of the problem increases by a small increment.
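A toy comparison (plain Python, added for this edition as an illustration and not drawn from the paper) shows what "blowing up" means in practice: the same quantity computed by a naive recursion whose step count grows exponentially as n grows, and by a reformulation whose step count grows only linearly.

```python
def fib_naive(n, counter):
    """Exponential growth: the call tree roughly doubles each time n grows by 1."""
    counter[0] += 1
    if n < 2:
        return n
    return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

def fib_linear(n, counter):
    """Linear growth: one loop iteration per unit increase in n."""
    a, b = 0, 1
    for _ in range(n):
        counter[0] += 1
        a, b = b, a + b
    return a

for n in (10, 20, 30):
    naive_steps, linear_steps = [0], [0]
    assert fib_naive(n, naive_steps) == fib_linear(n, linear_steps)
    print(f"n={n:2d}: naive steps = {naive_steps[0]:>9,}  linear steps = {linear_steps[0]}")
```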
Like electronics in general, digital computing offered leverage. One early notion of electronics (still used in Europe) was that it concerned the applications of "weak currents," in contrast to the "strong currents" of traditional electrical engineering. Weak currents of electricity, carried on thin and light wires or as weightless signals through the ether, do the heavy work of carrying messages, motion pictures, and signals that control heavy machinery. With computers it is the same: a tiny chip and its accompanying ethereal software do the heavy work of "crunching numbers" and moving and processing huge quantities of data.
Finally, the digital approach extended the concept in electrical engineering of the separation of a machine's structural design from its implementation. When electric power was introduced into the factory, one of its major advantages was that it allowed the machinery of production to be arranged on the factory floor according to the logic of production, without regard to the logic of the distribution of power, as had been the case with mechanical transmission. The design tool of the electronics engineer has been the schematic diagram, which, unlike, say, the architectural drawing, enjoys the luxury of postponing the physical and spatial details of implementation until later. During the Second World War, aviation electronics gave us the "black box": radio, radar, and control systems whose internal organization was invisible, and irrelevant, to those using them.
From Computer Science came an explicit division of a technical process into "hardware" and "software." Progress in the latter depends on the fact that programmers need not be concerned with the details of the former. It is this separation that electrical engineering has seized upon. By deferring the thorny issues of applications to a later phase of software development, the electrical engineer is better able to cope with the physical problems of constructing circuits having hundreds of thousands of individual active devices, each of which has to mesh with all the others. It is in this sense that the computer scientist has repaid the debt owed to electrical engineering, by providing a way for the engineer to construct circuits several orders of magnitude more complex than were possible otherwise. Digital engineering has become a paradigm for electrical engineering, in Kuhn's sense of a general organization of a body of theory, whose acceptance by a community facilitates its day-to-day work (Kuhn 1970).24

24 My discussion of the "digital paradigm" for modern Electrical Engineering is based on Kuhn's earliest writings on the subject. The subsequent elaboration and refutation of Kuhn's thesis, by numerous writers including Kuhn himself, does not in my view affect the validity of his original insight as a way of understanding the practice of what he calls "normal science," in this case normal engineering science. The distinction between hardware and software is present in all technology, though usually never so easy to discern as in a computer. It is furthermore a distinction which exists in a hierarchy of levels: the microprocessor, for example, is such a versatile circuit because it does not require that its functions be fixed as it leaves the factory. It leaves the electronics engineer free to concentrate on the job of fabricating the chip itself, without having to worry at that time about precisely what it will be used for.
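The footnote's point about the microprocessor can be made concrete with a toy interpreter (plain Python, a hypothetical sketch added for this edition, not taken from the paper): the "chip" below is finished once and for all, yet what it computes is settled only when a program is loaded into it.

```python
def tiny_machine(program, x):
    """A fixed piece of 'hardware': it knows only four instructions.
    What it computes is determined entirely by the program loaded into it."""
    acc = x
    for op, arg in program:
        if op == "ADD":
            acc += arg
        elif op == "SUB":
            acc -= arg
        elif op == "MUL":
            acc *= arg
        elif op == "NEG":
            acc = -acc
        else:
            raise ValueError(f"unknown instruction {op!r}")
    return acc

# Two different 'products' from the identical chip, differing only in software.
double_then_offset = [("MUL", 2), ("ADD", 3)]     # f(x) = 2x + 3
negate_then_scale = [("NEG", None), ("MUL", 5)]   # g(x) = -5x

print(tiny_machine(double_then_offset, 10))  # 23
print(tiny_machine(negate_then_scale, 10))   # -50
```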
But even as the paradigm of the stored-program digital computer came to pervade the nature of electronics, important distinctions remained between computer science and electronics technology. The two did not become synonymous, and with the exception of a few institutional settings, they are likely to remain separate academic disciplines (Figure 3). Those who called themselves computer scientists, though their livelihood depended on the existence of computer technology, distanced themselves from hardware issues insofar as possible. They did so in part for institutional reasons, but also because they felt a need to focus their energies on establishing a foundation, based on scientific principles, for the construction and analysis of the algorithms that a computer implements. This was an activity they felt would never be considered a proper branch of electronics engineering.
By the end of the 1950s electronics engineers
agreed with computer scientists that their work
concerned the processing, storage, and transmis-
sion of information, although the definition of
“information” was neither precise nor the same
for each group. Twenty years later, both groups
have begun to see their work as the "management of complexity," a term once again not precisely defined.

[Figure 3. The Merging of Computer Science and Electrical Engineering?]
Both disciplines continue to evolve and change rapidly. For both Electronics Engineering and Computer Science, the present definitions, institutional settings, and research programs will evolve, and coevolve, in the future.
Acknowledgments
An earlier draft of this paper was presented at
the Annual Meeting of the Society for History of
Technology (SHOT), October 1986. The author
wishes to thank W. David Lewis, Edwin T. Lay-
ton Jr., William Aspray, Michael Williams, Mi-
chael Dennis, and Paul Forman for their critical
reading and comments on that draft. The Smith-
sonian Institution’s Research Opportunities Fund
contributed financial support toward the prepa-
ration of this paper.
References
ACM Curriculum Committee on Computer Science. 1965. "An Undergraduate Program in Computer Science: Some Recommendations," Communications ACM 8 (September), pp. 543-552.
ACM Curriculum Committee on Computer Science. 1968. "Curriculum 68: Recommendations for Academic Programs in Computer Science," Communications ACM 11 (March), pp. 151-197.
ACM Curriculum Committee on Computer Science. 1977. ACM SIGCSE Bulletin, 9, pp. 1-16.
Alt, F. 1969. Interview. Smithsonian Computer History Project.
Amarel, S. 1976. "Computer Science," in Anthony Ralston (ed.), Encyclopedia of Computer Science, New York: Van Nostrand, pp. 314-318.
Aspray, W. 1980. "From Mathematical Constructivity to Computer Science: Alan Turing, John von Neumann, and the Origins of Computer Science in Mathematical Logic," Dissertation, University of Wisconsin, 1980.
Bell, C. G. 1985. Computerworld, Oct. 14, 1985, p. 19.
Bell, C. G., J. C. Mudge, and J. McNamara. 1978. Computer Engineering: A DEC View of Hardware Systems Design, Bedford, Massachusetts: Digital Press, p. 27.
Booth, A. D. 1949. "The Physical Realization of an Electronic Digital Computer," Electronic Engineering 21 (1949), p. 234; 22 (1950), pp. 492-498.
Bowden, B. V. 1953. Faster Than Thought, London: Pitman, p. 30.
Brzezinski, Z. 1970. Between Two Ages: America's Role in the Technetronic Era, New York: Viking.
Ceruzzi, P. E. 1981. Reckoners: The Prehistory of the Digital Computer, 1935-1945, Westport, Connecticut: Greenwood Press.
Ceruzzi, P. E. 1989. Beyond the Limits: Flight Enters the Computer Age, Cambridge, Massachusetts: MIT Press.
Cohen, I. B. 1985. Foreword to the 1985 reprint of A Manual of Operation for the Automatic Sequence Controlled Calculator, Cambridge, Massachusetts: MIT Press, 1985, p. x.
Dijkstra, E. 1982. Selected Writings on Computing: A Personal Perspective, New York: Springer.
Electronics, 1980. "Fifty Years of Achievement: A History," Electronics, Special Commemorative Issue, 17 April, 1980.
Electronics, 1973. "The Great Takeover," Special Issue of Electronics, 25 October.
Everitt, W. 1952. "Guest Editorial," IRE Proceedings 40, p. 899.
Fein, L. 1959. "The Role of the University in Computers, Data Processing, and Related Fields," CACM 2 (Sept.), pp. 7-14.
Fein, L. 1961. "The Computer-related Sciences (Synnoetics) at a University in the Year 1975," American Scientist, 49/2 (June), pp. 149-168.
Gries, D., R. Miller, R. Ritchie, and P. Young 1986. "Imbalance between Growth and Funding in Academic Computing Science: Two Trends Colliding," Comm. ACM, 29 (September), pp. 870-876.
Hanson, D. 1982. The New Alchemists: Silicon Valley and the Microelectronics Revolution, Boston: Little, Brown.
Hodges, A. 1983. Alan Turing: The Enigma, New York: Simon & Schuster.
IRE 1953. "Computer Issue," IRE Proceedings 41 (October).
Kidder, T. 1981. The Soul of a New Machine, Boston: Little, Brown, 1981, pp. 245 ff.
Knuth, D. E. 1968. The Art of Computer Programming; Vol. 1: Fundamental Algorithms, Reading, Massachusetts: Addison-Wesley.
Knuth, D. E. 1973. The Art of Computer Programming; Volume Three: Sorting and Searching, Reading, Massachusetts: Addison-Wesley.
Knuth, D. E. 1976. "The State of the Art of Computer Programming," Stanford University, Computer Science Dept., Report STAN-CS-76-551, May.
Knuth, D. E. 1982. "A Conversation with Don Knuth," Annals of the History of Computing, 4/3 (July), pp. 257-274.
Kuhn, T. 1970. The Structure of Scientific Revolutions, 2nd Ed., University of Chicago Press.
Layton, E. T. 1971. "Mirror-Image Twins: The Communities of Science and Technology in 19th Century America," Technology and Culture, 12 (October), pp. 562-580.
Leech, J. 1986. Letter to the author.
McMahon, A. Michal 1984. The Making of a Profession: A Century of Electrical Engineering in America, New York: IEEE Press.
Moore School of Electrical Engineering, 1947. "Theory and Techniques for Design of Electronic Digital Computers: Lectures Given at the Moore School, 8 July 1946-31 August, 1946," Philadelphia: Moore School of Electrical Engineering, Univ. of Penn., 1947; reprinted 1985, MIT Press, Cambridge, Massachusetts.
Newell, A., A. Perlis, and H. Simon. 1967. Letter to the Editor of Science 157 (22 Sept.), pp. 1373-1374.
Newell, A. 1982. "Intellectual Issues in the History of Artificial Intelligence," Carnegie-Mellon University, Department of Computer Science, Report CMU-CS-82-142.
Noble, D. F. 1984. Forces of Production: A Social History of Industrial Automation, New York: Knopf.
Owens, L. 1986. "Vannevar Bush and the Differential Analyzer: The Text and Context of an Early Computer," Technology and Culture, 27 (January), pp. 63-95.
Pierce, J. R. 1977. "Electronics: Past, Present, and Future," Science, 195 (18 March), pp. 1092-1095.
Pollack, S. 1977. "The Development of Computer Science," in S. Pollack (ed.), Studies in Computer Science, Washington, D.C. 1982, pp. 1-51.
Ramo, S. 1959. Address to 5th National Communications Symposium, Utica, NY, 7 Oct. 1959. Quoted in Computers and Automation, 9 (Jan. 1960), p. 6.
Redmond, K. C. and T. M. Smith 1980. Project Whirlwind: The History of a Pioneering Computer, Bedford, Massachusetts: Digital Press.
Ridenour, L. 1952. "The Role of the Computer," Scientific American, Special Issue on "Automatic Control," 187 (September), pp. 116-130.
Sprague, R. E. 1972. "A Western View of Computer History," Comm. ACM 15 (July), pp. 686-692.
Stibitz, G. 1945. "Relay Computers," National Defense Research Committee, Applied Mathematics Panel, AMP Report 171.1R, February 1945, pp. 21-22.
Sutherland, I. and C. Mead. 1977. "Microelectronics and Computer Science," Scientific American, 237 (September), pp. 210-228.
van Heijenoort, J. 1967. From Frege to Gödel: A Source Book in Mathematical Logic, 1879-1931, Cambridge, Massachusetts: Harvard Univ. Press.
Vincenti, W. 1979. "The Air-Propeller Tests of W. F. Durand and E. P. Lesley: A Case Study in Technological Methodology," Technology and Culture, 20 (October), pp. 712-751.
von Neumann, J. 1945. "First Draft of a Report on the EDVAC," Philadelphia: Moore School of Electrical Engineering, 30 June.
Wegener, P. 1970. "Three Computer Cultures: Computer Technology, Computer Mathematics, and Computer Science," Advances in Computing, 10, p. 9.
Weik, M. 1955. A Survey of Domestic Electronic Digital Computing Systems, Aberdeen, Maryland: Ballistic Research Laboratory, Report No. 971.
Wiesner, J. 1958. "Communication Sciences in a University Environment," IBM J. Res. Dev. 2/4 (October), pp. 268-275.
Wildes, K. L. and N. A. Lindgren 1985. A Century of Electrical Engineering at MIT, 1882-1982, Cambridge, Massachusetts: MIT Press, 1985.
Wilkes, M. V. 1985. Memoirs of a Computer Pioneer, Cambridge, Massachusetts: MIT Press.
