-
Period: to
The Rise of Mathematics
-
Hilbert's address to the International Congress of Mathematicians
In 1928, the German mathematician David Hilbert addressed the International Congress of Mathematicians, posing three questions: (1) Is mathematics complete, i.e., can every mathematical statement be either proved or disproved? (2) Is mathematics consistent, that is, is it true that statements such as "0=1" cannot be proved by valid methods? (3) Is mathematics decidable, that is, is there a mechanical method that can be applied to any mathematical assertion and will eventually tell whether that assertion is true or not? -
Kurt Gödel answered two of Hilbert's questions
In 1931, Kurt Gödel answered two of Hilbert's questions. He showed that every sufficiently powerful formal system is either inconsistent or incomplete, and that if an axiom system is consistent, its consistency cannot be proved within the system itself. The third question remained open, with 'provable' substituted for 'true'. -
The Turing machine
In 1936, Alan Turing resolved Hilbert's Entscheidungsproblem in the negative by constructing a formal model of a computer -- the Turing machine -- and showing that there are problems such a machine cannot solve. One such problem is the so-called "halting problem": given a Pascal program, does it halt on all inputs?
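As a present-day aside (not part of the original timeline), the heart of Turing's argument can be sketched in modern code. The sketch treats the closely related question of whether a program halts on one particular input; the names halts and paradox are hypothetical, and the point is only that any claimed halting decider leads to a contradiction.

    def paradox_factory(halts):
        """Given any claimed decider halts(program, argument) -> bool,
        build a program that defeats it."""
        def paradox(program):
            # Do the opposite of whatever halts predicts about
            # running 'program' on its own source.
            if halts(program, program):
                while True:   # loop forever
                    pass
            return            # halt immediately
        return paradox

    # Applying paradox to itself: paradox(paradox) halts exactly when
    # halts(paradox, paradox) claims it does not, so no correct,
    # always-terminating halts can exist.
-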
Period: to
1940's: Wartime brings the birth of the electronic digital computer
-
The Z3 - first operational, general-purpose, program-controlled calculator
Meanwhile, in Germany, Konrad Zuse (1910-1995) built the first operational, general-purpose, program-controlled calculator, the Z3, in 1941. -
Mark I electromechanical computer
The calculations required for ballistics during World War II spurred the development of the general-purpose electronic digital computer. At Harvard, Howard H. Aiken built the Mark I electromechanical computer in 1944, with the assistance of IBM. -
The EDVAC, a stored-program electronic computer
In 1944, John Mauchly, J. Presper Eckert, and John von Neumann were already at work designing a stored-program electronic computer, the EDVAC. Von Neumann's report, "First Draft of a Report on the EDVAC", was very influential and contains many of the ideas still used in most modern digital computers, including a mergesort routine. Eckert and Mauchly went on to build UNIVAC. -
The ENIAC, a general-purpose electronic computer originally intended for artillery calculations
John V. Atanasoff discussed his electronic computing machine, the Atanasoff-Berry Computer, with John William Mauchly, who later, with J. Presper Eckert, Jr., designed and built the ENIAC, a general-purpose electronic computer originally intended for artillery calculations. Exactly what ideas Mauchly got from Atanasoff is not completely clear, and whether Atanasoff or Mauchly and Eckert deserve credit as the originators of the electronic digital computer was the subject of legal battles and remains a topic of historical debate. -
The invention of the transistor
The invention of the transistor in 1947 by John Bardeen (1908-1991), Walter Brattain (1902-1987), and William Shockley (1910-1989) transformed the computer and made possible the microprocessor revolution. For this discovery they won the 1956 Nobel Prize in physics. -
Turing Test
In a paper that appeared in the journal Mind in 1950, Alan Turing introduced the Turing Test, one of the first efforts in the field of artificial intelligence. He proposed a definition of "thinking" or "consciousness" based on a game: a tester would have to decide, on the basis of written conversation, whether the entity in the next room responding to the tester's queries was a human or a computer. If this distinction could not be made, then it could fairly be said that the computer was "thinking". -
A compiler
Grace Murray Hopper (1906-1992) invented the notion of a compiler at Remington Rand in 1951. -
Dijkstra's algorithm
Edsger Dijkstra invented an efficient algorithm for shortest paths in graphs as a demonstration of the ARMAC computer in 1956. He also invented an efficient algorithm for finding a minimum spanning tree in order to minimize the wiring needed for the X1 computer.
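As a present-day illustration (not from the original timeline), here is a minimal sketch of Dijkstra's shortest-path idea in Python; the adjacency-list format and the function name dijkstra are assumptions made for this example.

    import heapq

    def dijkstra(graph, source):
        """Single-source shortest paths on a graph with non-negative weights.
        'graph' maps each vertex to a list of (neighbor, weight) pairs;
        returns a dict of shortest distances from 'source'."""
        dist = {source: 0}
        heap = [(0, source)]                  # (distance so far, vertex)
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    # Example: dijkstra({"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}, "a")
    # returns {"a": 0, "b": 2, "c": 3}.
-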
FORTRAN
Fortran was originally developed at IBM in the 1950s, by a team led by John Backus, for scientific and engineering applications, and it subsequently came to dominate scientific computing. It has been in use for over six decades in computationally intensive areas such as numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography and computational chemistry. -
LISP
LISP, a list-processing language for artificial intelligence programming, was invented by John McCarthy about 1958. -
The integrated circuit
In hardware, Jack Kilby (Texas Instruments) and Robert Noyce (Fairchild Semiconductor) independently invented the integrated circuit in 1959. -
Period: to
1960's
In the 1960's, computer science came into its own as a discipline. In fact, the term was coined by George Forsythe, a numerical analyst. -
The first computer science department
The first computer science department was formed at Purdue University in 1962. The first person to receive a Ph.D. from a computer science department was Richard Wexelblat, at the University of Pennsylvania, in December 1965. -
System/360
Operating systems saw major advances. Fred Brooks at IBM designed System/360, a line of computers with the same architecture and instruction set, ranging from small machines to top-of-the-line models. -
BASIC
Dartmouth BASIC is the original version of the BASIC programming language. It was designed in 1964 by two Dartmouth College professors, John G. Kemeny and Thomas E. Kurtz. With the underlying Dartmouth Time Sharing System (DTSS), it offered an interactive programming environment to all undergraduates as well as the larger university community. -
Computer mouse
Douglas C. Engelbart invented the computer mouse c. 1968, at SRI. -
ARPAnet
At the end of the decade, ARPAnet, a precursor to today's Internet, began to be constructed. -
Period: to
1970's
-
First microprocessor
Ted Hoff (b. 1937) and Federico Faggin at Intel designed the first microprocessor (computer on a chip) in 1969-1971. -
A Relational Model of Data for Large Shared Data Banks
The theory of databases saw major advances with Edgar F. Codd's work on relational databases, introduced in his 1970 paper "A Relational Model of Data for Large Shared Data Banks". Codd won the Turing Award in 1981. -
C
C is a general-purpose programming language. It was created in the early 1970s by Dennis Ritchie at Bell Labs, and it remains very widely used and influential. By design, C's features cleanly reflect the capabilities of the targeted CPUs. It has found lasting use in operating systems, device drivers, and protocol stacks, though it is used decreasingly for application software. C is commonly used on computer architectures that range from the largest supercomputers to the smallest microcontrollers and embedded systems. -
Unix
Unix, a very influential operating system, was developed at Bell Laboratories by Ken Thompson and Dennis Ritchie, beginning in 1969. -
CRAY-1
The 1970's also saw the rise of the supercomputer. Seymour Cray designed the CRAY-1, which was first shipped in March 1976. It could perform 160 million operations per second. The Cray X-MP came out in 1982. Cray Research was later taken over by Silicon Graphics. -
Period: to
1980's
This decade saw the rise of the personal computer, thanks in part to Steve Wozniak and Steve Jobs, founders of Apple Computer. -
The first computer viruses
The first computer viruses were developed c. 1981. The term was coined by Leonard Adleman, now at the University of Southern California. -
The Macintosh
In 1984, Apple first marketed the Macintosh computer. -
NSFnet
In 1987, the US National Science Foundation started NSFnet, a precursor to part of today's Internet. -
Period: to
1990's and Beyond
Parallel computers continue to be developed.
Biological computing, with the recent work of Leonard Adleman on performing computations via DNA, has great promise. The Human Genome Project is attempting to sequence all the DNA in a single human being. Quantum computing gets a boost with Peter Shor's discovery that integer factorization can be performed efficiently on a (theoretical) quantum computer. And computers keep getting smaller and smaller, marking the birth of nanotechnology.