The world's first general-purpose electronic digital computer, ENIAC (Electronic Numerical Integrator And Computer), was built in 1946 at the University of Pennsylvania. Because feeding input into ENIAC and reprogramming it by rewiring switches was particularly cumbersome, John von Neumann, a famous mathematician and consultant to the ENIAC project, proposed storing program instructions together with the data they operate on in memory. This famous stored-program concept became the fundamental principle of how computers work; Alan Turing proposed essentially the same idea around the same time. The world's first stored-program computer, EDSAC (Electronic Delay Storage Automatic Calculator), was built in 1949 at the University of Cambridge in the UK; it used 3,000 vacuum tubes and could perform 700 additions per second. In 1953, IBM produced its first commercial stored-program computer. Over the past 50 years, the performance of computer systems has improved dramatically while prices have dropped significantly. The development of computers has passed through four generations and is now in its fifth. By the level of semiconductor technology, the five generations are divided as follows:

First generation (1945-1954): vacuum tubes and relays.
Second generation (1955-1964): transistors and magnetic-core memory.
Third generation (1965-1974): small- and medium-scale integrated circuits (SSI/MSI).
Fourth generation (1975-1990): LSI/VLSI and semiconductor memory.
Fifth generation (1990 to present): ULSI/GSI (ultra- and giga-scale integration) circuits.
Two factors mark the replacement of one computer generation by the next. The first is the continuous development of devices, which steadily improves the operating speed, efficiency, integration density, and reliability of computer systems while driving prices down. The second is improvement in computer system architecture. Statistics show that from 1965 to 1975 the performance of computer systems improved nearly 100-fold: improvements in device performance accounted for a tenfold increase, and improvements in system architecture accounted for another tenfold. Over more than 50 years of development, device technology has improved at a relatively steady pace, while architectural improvements have fluctuated more. In the last 20 years especially, because computer system design has become more dependent on integrated circuit technology, the performance growth rates of different classes of computers have diverged. The performance growth of supercomputers has benefited from improvements in both device technology and system architecture; the performance growth of mainframes has relied mainly on improvements in device manufacturing processes, as their architecture has seen no new breakthroughs; and the development of minicomputers is due partly to significant improvements in implementation methods and partly to the adoption of many effective, advanced techniques from mainframes. Even so, for these three classes of computers, the average annual performance growth from 1970 to 1990 was only about 18%. In stark contrast, microcomputer performance grew very quickly, at an average annual rate of about 35%, because microcomputers benefited most directly from advances in integrated circuit technology.
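To make these growth figures concrete, a minimal sketch of the compounding arithmetic follows. The function name `growth_multiple` is an illustrative choice, not something from the original text; the rates (18% and 35% per year over the 1970-1990 period) come from the figures above.

```python
# Compound annual growth: the overall performance multiple
# after `years` years at a constant annual growth rate.
def growth_multiple(annual_rate: float, years: int) -> float:
    return (1 + annual_rate) ** years

# 1965-1975: roughly 100x overall, i.e. about 10x from devices
# multiplied by another 10x from architecture (10 * 10 = 100).

# 1970-1990 at ~18% per year (supercomputers, mainframes, minicomputers):
print(growth_multiple(0.18, 20))  # roughly 27x over 20 years

# 1970-1990 at ~35% per year (microcomputers):
print(growth_multiple(0.35, 20))  # roughly 400x over 20 years
```

The contrast is striking: a seemingly modest gap in annual rates (18% versus 35%) compounds into more than an order of magnitude difference over two decades, which is why microcomputer performance pulled so far ahead.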
Since the 1980s, microprocessor technology has in fact been the technology of choice when designing new system architectures or updating old ones. Starting in 1985, a system architecture with a novel design style, RISC (Reduced Instruction Set Computing), gained favor in the computer industry. It organically combines advances in integrated circuit technology, improvements in compiler technology, and new ideas in system architecture design, allowing the performance of computer systems designed in this style to improve at the high rate of doubling every year. It should be pointed out that this improvement rests on quantitative analysis of simulation and measurement data on how previous computers were actually used. Some scholars call this a quantitative design style for computer system architecture, which is clearly far more rigorous than the traditional qualitative design style. The two pioneers of RISC technology, Prof. D. Patterson of the University of California, Berkeley, and Prof. J. Hennessy of Stanford University, are the main advocates of this quantitative design method.
Original source: http://www.ahserver.com/plus/view-70-1.html