ENIAC was typical of early computers. It relied on vacuum tubes rather than transistors (which had not yet been invented) and was used by the US military to calculate the trajectories of artillery shells. It worked on a decimal system, which meant that the hardware holding a single digit needed a counter with a separate element for each possible value from 0 to 9. This made each counter large and prone to error. ENIAC was superseded by the EDVAC, which instead counted in binary and was likewise used for trajectory calculations.
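The storage cost of decimal versus binary counting can be illustrated with a little arithmetic. This is only a sketch, not a model of ENIAC's actual circuitry: it simply counts how many on/off storage elements each scheme needs for a ten-digit number.

```python
import math

def ring_counter_elements(digits: int) -> int:
    # Decimal ring counter: one storage element per possible
    # value (0-9) for every digit, only one element "on" at a time.
    return 10 * digits

def binary_elements(digits: int) -> int:
    # Bits needed to cover the same range, 0 .. 10**digits - 1.
    return math.ceil(math.log2(10 ** digits))

# ENIAC's accumulators held ten-digit decimal numbers.
print(ring_counter_elements(10))  # 100 elements in decimal
print(binary_elements(10))        # 34 bits suffice in binary
```

The roughly threefold saving (and the fact that a binary element needs only to distinguish on from off, not one position in ten) is part of why binary machines such as the EDVAC displaced decimal designs.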
A report on the EDVAC by the US-based scientist John von Neumann led to his description of what would be necessary for a stored-program computer: the von Neumann (also called the Princeton) architecture. Such computers have devices to input information (e.g. a keyboard), an output unit (a monitor or printer), a control unit, and an arithmetic unit, and they store data and instructions in the same memory. More modern computers make use of the so-called Harvard architecture (or a ‘modified Harvard architecture’), where instructions and data are stored separately. As a result of von Neumann’s description, universities in the US and UK began to build their own computers for research purposes. In many ways, this was the birth of e-science. Such computers included the ILLIAC, built at the University of Illinois, and the Manchester Mark I, built at the University of Manchester in the UK. The Manchester machine was used for explorations of number theory in mathematics and for calculations in optics. Later iterations of the ILLIAC would be used for early analysis of particle physics data, but the first revision was used by the former industrial chemist Lejaren Hiller to make the first computer music, the Illiac Suite. Sixty years later, scientists are still using the audio and graphics power of computers to understand data and make their science accessible.
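The stored-program idea can be sketched as a toy interpreter. The instruction names and layout here are entirely hypothetical, chosen for illustration; the point is that instructions and data sit in the same memory array, the defining trait of the von Neumann design.

```python
# One memory holds both the program (cells 0-3) and the data (cells 5-7).
memory = [
    ("LOAD", 5),   # 0: load memory[5] into the accumulator
    ("ADD", 6),    # 1: add memory[6] to the accumulator
    ("STORE", 7),  # 2: store the accumulator into memory[7]
    ("HALT", 0),   # 3: stop
    0,             # 4: (unused)
    2,             # 5: data
    3,             # 6: data
    0,             # 7: result goes here
]

acc = 0   # the arithmetic unit's accumulator
pc = 0    # the control unit's program counter

while True:
    op, addr = memory[pc]   # fetch an instruction from the same memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[7])  # 5
```

Because the program is just data in memory, it can itself be loaded, copied, or modified like any other data, which is what made stored-program machines so much more flexible than hard-wired ones such as the original ENIAC.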
Seymour Cray’s name was almost synonymous with supercomputing for more than two decades. Cray realised that a powerful processor was just one aspect of a fast computer – making a machine truly fast requires careful attention to all the electronics leading to and from the CPU, and to every other component involved in building the computer. It was not until the advent of massively parallel architectures in the late 1980s that his computers were overtaken in speed.
Supercomputers continue to get ever faster. One benefit for ordinary computers is that this cutting-edge technology filters down over time. It has been said that a top-end smartphone today is comparable in power with the fastest supercomputers of 15 years ago.