History of computer development
On August 10, 2002, Kristen Nygaard, a pioneer of computer programming languages, died of a heart attack at the age of 75. Nygaard helped lay the foundations of the Internet and made great contributions to the computer industry. According to Norwegian media reports, he died in Oslo, Norway.
Nygaard, a professor at the University of Oslo, was internationally renowned for developing the Simula programming language, which laid groundwork for MS-DOS and the Internet. Kristen Nygaard was born in Oslo in 1926, graduated from the University of Oslo in 1956 with a master's degree in mathematics, and then devoted himself to computing and programming research.
From 1961 to 1967, Nygaard worked at the Norwegian Computing Center and took part in developing object-oriented programming languages. For this outstanding work, Nygaard and his colleague Ole-Johan Dahl received the 2001 A.M. Turing Award, among other honors. The Association for Computing Machinery, which presented the award, said their work cleared the way for the wide application of Java, C++ and other programming languages in personal computers and home entertainment devices: "Their work has fundamentally changed how software systems are designed and programmed, bringing about reusable, reliable and scalable software."
A Century of Discovery: from the Turing Machine to the von Neumann Machine
In 1936, the British scientist Alan Turing published his famous paper "On Computable Numbers, with an Application to the Entscheidungsproblem". The paper put forward the concept of a machine for reasoning about computation in principle, the Turing machine, and it advanced the development of computer theory. In 1945 Turing joined the National Physical Laboratory and began designing an automatic computer. In 1950 he published the paper "Computing Machinery and Intelligence", which opened with the question "Can machines think?" and proposed the famous Turing test: judging, through question and answer, whether a computer possesses intelligence comparable to a human being's.
Turing proposed this abstract computing model to define precisely which functions are computable. A Turing machine consists of a controller, a tape that can be extended without limit, and a read-write head that moves left and right along the tape. Simple as the concept is, such a machine can in theory compute any intuitively computable function. As a theoretical model of the computer, the Turing machine is widely used in the study of computability and computational complexity.
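To make the model concrete, here is a minimal sketch in Python of the three parts just described (controller, tape, read-write head). The transition table, a toy machine that flips every bit of its input, is invented purely for illustration and is not taken from the article.

```python
# Minimal Turing machine simulator: controller (a transition table), tape
# (a dict, so it can grow in either direction), and a read-write head.

def run_turing_machine(transitions, tape_input, start_state="q0",
                       halt_state="halt", max_steps=10_000):
    tape = {i: sym for i, sym in enumerate(tape_input)}
    head, state = 0, start_state
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, "_")                   # "_" is the blank symbol
        state, new_symbol, move = transitions[(state, symbol)]
        tape[head] = new_symbol                        # write
        head += 1 if move == "R" else -1               # move the head
    return "".join(tape[i] for i in sorted(tape))

# Toy machine: invert every bit of a binary string, then halt at the blank.
flip_bits = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "10110"))  # prints 01001_ (input flipped, plus the blank)
```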
The computer is an information-processing tool made by human beings. If other human-made tools are extensions of the hand, then the computer, a tool that processes information in place of the human brain, can be called an extension of the brain. The earliest real computers were built to solve numerical calculation problems. In the later stages of World War II, code breaking and trajectory calculation for military purposes became more and more complicated; the huge volumes of data and complicated formulas consumed enormous manpower and time even with electromechanical calculators. Against this background, people began to develop electronic computers.
The world's first electronic computer, Colossus, was born in Britain; its development began in March 1943. The main purpose of building Colossus was to break messages encrypted by the German "Lorenz" cipher machine. Breaking such a message by other methods took six to eight weeks, whereas Colossus needed only six to eight hours. In early 1944 Colossus went into operation, and from then on large quantities of high-level German military secrets were quickly deciphered, greatly strengthening the Allies. Colossus came out more than two years before the American ENIAC and deciphered a great many German messages during World War II, but it was secretly destroyed after the war and so remained little known.
Although the first electronic computer was born in Britain, Britain did not seize the opportunity of the technological and industrial revolution that computers triggered. The United States, by contrast, grasped this historical opportunity and encouraged the development of computer technology and industry, giving rise to a large number of computer industry giants and greatly strengthening America's comprehensive national power. In 1943 the U.S. military organized an ENIAC research group led by John Mauchly and J. Presper Eckert, and the Hungarian-born mathematician John von Neumann, then working at Princeton and later regarded as a founder of the modern computer, also joined the research. The work succeeded in 1946, producing ENIAC, the world's first general-purpose electronic digital computer. Built from some 18,000 vacuum tubes, this machine, despite its huge size, staggering power consumption and limited capability, genuinely saved manpower and time and opened a new era in computer science and technology, which may well have been more than the scientists who built it expected.
Although the earliest computers had limited capabilities and differed greatly from modern machines, they already contained the basic components of a modern computer: the arithmetic unit, the controller and the memory.
The arithmetic unit is like an abacus: it performs numerical and logical operations to produce results. The controller is like the computer's headquarters: it directs the work of every part of the machine, issuing its commands as a series of control signals.
Programs and data, along with the intermediate and final results produced during computation, all need a place to be kept; that place is the third component of the computer, the memory.
A computer computes automatically, and the basis of automatic computation is the program stored inside it. Modern computers are all stored-program computers, also called von Neumann machines, because the stored-program concept was put forward by von Neumann. Starting from a mathematical description of the problem to be solved, people write a program in a "language" the computer can accept, enter it and store it in the machine; the computer then completes the computation automatically according to the programmer's intent and outputs the results at high speed. The program must tell the computer what data to operate on, what operations to perform and in what order.
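As a sketch only, assuming a made-up three-and-a-bit instruction machine rather than any real one, the following Python fragment illustrates the stored-program idea: instructions and data sit in the same memory, and a fetch-decode-execute loop works through them.

```python
# Toy stored-program (von Neumann style) machine: program and data share one
# memory, and a fetch-decode-execute loop walks through the stored instructions.
# The instruction set (LOAD / ADD / STORE / HALT) is invented for illustration.

def run(memory):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while True:
        op, arg = memory[pc]          # fetch
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[arg][1]      # read a data cell
        elif op == "ADD":
            acc += memory[arg][1]
        elif op == "STORE":
            memory[arg] = ("DATA", acc)
        elif op == "HALT":
            return acc

# Memory holds instructions (cells 0-3) and data (cells 4-5) side by side.
memory = [
    ("LOAD", 4),     # acc <- memory[4]
    ("ADD", 5),      # acc <- acc + memory[5]
    ("STORE", 4),    # memory[4] <- acc
    ("HALT", 0),
    ("DATA", 2),
    ("DATA", 3),
]
print(run(memory))   # prints 5
```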
The emergence of microelectronics brought new opportunities to the development of computers and made their miniaturization possible. Microelectronics can be traced back to the appearance of the transistor. In 1947, three scientists at AT&T's Bell Laboratories, Bardeen, Brattain and Shockley, made the first transistor, opening the era in which transistors replaced vacuum tubes.
The appearance of the transistor can be called the prelude to the integrated circuit. Once transistors existed, scientists found that circuit components and wiring could be fabricated on a silicon chip just as transistors were, miniaturizing the circuit. After roughly ten years of development, the first integrated circuit appeared in 1958.
The development of microelectronics and the appearance of integrated circuits first brought great changes to computer technology. In modern computers the arithmetic unit and controller are usually fabricated together and are called a microprocessor. With the integration of microprocessors (computer chips), microcomputers came into being and developed rapidly in the 1970s and 1980s; in particular, the appearance of the IBM PC personal computer opened the door to popularization and pushed computers into every walk of life. Machines that were once so expensive, bulky and power-hungry that they could be used only in a few large military or scientific research facilities have today, thanks to large-scale integrated circuits, entered ordinary offices and homes.
One indicator of the level of integrated circuits is the degree of integration, that is, how many transistors can be made on a chip of a given size. Since the integrated circuit appeared, the pace of development has been astonishing: chips keep getting smaller, with far-reaching effects on production and daily life. The ENIAC computer occupied about 150 square meters, weighed 30 tons and consumed well over a hundred kilowatts of power, yet the calculations it performed can be done by an advanced pocket calculator today. This is the miracle created by microelectronics and the integrated circuit.
Present situation and prospect
American scientists recently pointed out that after more than 30 years of development, the miniaturization of computer chips has approached the limit. The further development of computer technology can only rely on brand-new technologies, such as new materials, new transistor design methods and molecular computing technology.
For more than 30 years the semiconductor industry has essentially followed Moore's law: the number of transistors that can be placed on a silicon chip doubles roughly every 18 months. Chips keep getting smaller while holding more and more transistors, and the etched line width keeps shrinking; as a result, computer performance keeps rising and prices keep falling. It has been suggested, however, that this trend can continue for at most another 10 to 15 years.
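The arithmetic behind that doubling rule is simple exponential growth. The short Python sketch below works it out for a hypothetical starting transistor count; the starting figure is an arbitrary round number chosen for illustration, not a value from the article.

```python
# Back-of-the-envelope Moore's-law arithmetic: if transistor counts double every
# 18 months, the growth factor over n years is 2 ** (12 * n / 18).

def moores_law_factor(years, doubling_months=18):
    return 2 ** (years * 12 / doubling_months)

start = 1_000_000                      # hypothetical transistor count in year 0
for years in (3, 9, 15):
    print(years, "years:", round(start * moores_law_factor(years)))
# 3 years  -> about 4 million   (2 doublings)
# 9 years  -> about 64 million  (6 doublings)
# 15 years -> about 1 billion   (10 doublings)
```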
Paul A. Packan, a scientist at Intel Corporation, the largest chip maker in the United States, recently wrote in the American journal Science that Moore's law (the 1965 prediction that semiconductor capacity would grow geometrically) may run into an insurmountable obstacle within the next 10 years: the miniaturization of chips is approaching its limit. No way past this limit has yet been found, and some scientists call it "the biggest challenge facing the semiconductor industry".
At present the most advanced VLSI manufacturing processes achieve a minimum line width of about 0.18 micron, roughly one five-hundredth the width of a human hair, and the insulating layer inside a transistor is only 4 to 5 atoms thick. In early 2000 Japan will begin mass-producing chips with a line width of only 0.13 micron, and such chips are expected to be in wide use within two years. The next step is a chip with a line width of 0.1 micron. Packan noted that at such small dimensions a transistor would be made up of fewer than 100 atoms.
When the line width becomes small enough, adjacent lines easily interfere with one another because they lie so close together; and when the current through a line is very weak, only a few dozen or even a few electrons, the signal is overwhelmed by background noise. If dimensions shrink further still, quantum effects come into play and traditional computer theory breaks down completely. At that point scientists must turn to entirely new materials, design methods and even theories of operation if the semiconductor and computer industries are to break through the limits of traditional theory and find another way forward.
What is the mainstream of computer development at present? The consensus at home and abroad points to the following directions.
Reduced instruction set computing
RISC stands for Reduced Instruction Set Computer. An instruction set is the collection of operation commands a computer can execute; a program ultimately becomes a sequence of such instructions, and each computer recognizes and executes only the instructions of its own instruction set. Recognizing an instruction means decoding the binary code that represents an operation, turning it into the control signals that carry out the operation the instruction requires. Conventional instruction sets are generally rich and powerful; a RISC design simplifies the instruction set, with the aim of shortening instruction execution time and raising processing speed. Traditional computers typically fetch and issue one instruction at a time, whereas RISC systems adopt a multiple-issue structure that issues several instructions at once, which of course requires additional execution units on the chip.
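As a small illustration of what "decoding" means, the Python sketch below splits a 32-bit instruction word into an opcode and register fields. The field layout and the opcode value are invented for the example and do not correspond to any real RISC architecture.

```python
# Decoding sketch: splitting a 32-bit instruction word into fields.
# Layout assumed here: 8-bit opcode, then three 8-bit register numbers.

def decode(word):
    opcode = (word >> 24) & 0xFF
    rd     = (word >> 16) & 0xFF
    rs1    = (word >> 8)  & 0xFF
    rs2    =  word        & 0xFF
    return opcode, rd, rs1, rs2

# Encode a made-up "ADD r3, r1, r2" with opcode 0x01, then decode it again.
word = (0x01 << 24) | (3 << 16) | (1 << 8) | 2
print(decode(word))   # prints (1, 3, 1, 2)
```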
Parallel processing technology
Parallel processing is another important route to higher processing speed. A traditional computer generally has a single central processing unit, which runs one program at a time; instructions are executed one after another in sequence, and data flow through the processor one item at a time, so this is called serial execution. Parallel processing lets several related or independent programs run simultaneously on several processors. At present there are two kinds of parallel systems: one uses 4, 8 or even 32 processors (a multiprocessor system); the other assembles more than 100 processors into a large-scale processing system. The two differ not only in the number of processors but also in how the processors are interconnected, how memory is attached, how the operating system supports them and where they are applied.
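A minimal sketch of the idea, assuming Python's standard multiprocessing module: several independent tasks are handed to a pool of worker processes and run at the same time on separate processor cores. The tasks themselves are stand-ins invented for the example.

```python
# Minimal parallel-processing sketch: independent units of work distributed
# across a pool of worker processes.

from multiprocessing import Pool

def heavy_task(n):
    # Stand-in for an independent piece of work (summing the first n squares).
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [1_000_000, 2_000_000, 3_000_000, 4_000_000]
    with Pool(processes=4) as pool:          # four worker processes
        results = pool.map(heavy_task, inputs)
    print(results)
```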
There was a time when supercomputers were built from different materials than ordinary computers. The earliest Cray-1 was assembled by hand from exotic chips mounted on copper-plated, liquid-cooled circuit boards. The Cray-2 looked even stranger: its boards sat bubbling in a bath of liquid fluorocarbon, cooled by "artificial blood". Parallel processing technology has changed all that. The fastest computer in the world at present, ASCI Red in the United States, performs 2.1 trillion operations per second, and it is built from the same components as personal computers and workstations; the supercomputer simply uses far more of them, being equipped with about 9,000 standard Pentium chips. Judging from current trends, the difference between supercomputers and other computers is genuinely beginning to blur.
At least in the near future, this trend will obviously continue. So, which upcoming technologies may subvert the pattern of computing technology and trigger the next supercomputing technology revolution?
There are at least three such technologies: photonic computers, biological computers and quantum computers. None of them may ever be realized, but all are worth studying because each has the potential to set off a revolution.
Photonic computers
Photonic computing is probably the most conventional of these three new technologies. For decades its applications have been confined to a few narrow areas, notably military signal processing.
In photonic computing, light carries information just as electricity does, or even better. A beam of light carries information from one place to another better than electricity, which is why telephone companies use optical fiber for long-distance communication. Light is useful for communication because, unlike electricity, it does not interact with its surroundings: two beams of light can pass straight through each other without either being disturbed. Over long distances light signals propagate roughly 100 times faster than electronic signals, and optical devices consume very little energy. It is estimated that a photonic computer might run 1,000 to 10,000 times faster than today's supercomputers.
Unfortunately, it is precisely this extreme independence that makes an all-optical computer hard to build, because computation depends on interaction. To build a true photonic computer, optical transistors must be developed so that one beam of light can switch another on and off. Such devices already exist, but making optical transistors with suitable performance characteristics will require a major breakthrough in materials science.
Biological computers
Compared with photonic computing, large-scale biological computing is harder to realize, but its potential is greater. Imagine a supercomputer the size of a grapefruit capable of real-time image processing, speech recognition and logical reasoning. Such computers already exist: they are human brains. People began studying biological computers (also called molecular computers) in the 1970s, and as biotechnology advances steadily we are beginning to understand, and to manipulate, the genetic machinery that builds the brain.
The performance of biological computers would surpass that of both electronic and optical computers. If technological progress continues at its current pace, it is conceivable that such supercomputers will appear in large numbers within ten or twenty years. This may sound like science fiction, but experiments in the field have already been done; for example, a "biochip" has been made in which neurons are arranged in a special pattern on a silicon chip.
In other laboratories, researchers have encoded data onto single strands of DNA so that those strands can be manipulated in a flask. These biological computing experiments are still far from practical use, but then, in 1958 people thought no more of the integrated circuit.
Quantum computers
Quantum computing is the third technology with the potential to set off a supercomputing revolution. The idea appeared later than photonic or biological computing, but its revolutionary potential is greater. Because a quantum computer exploits the counterintuitive laws of quantum mechanics, its potential speed far exceeds that of electronic computers; indeed, the possible speedup is almost unlimited. A quantum computer with about 5,000 qubits could solve, in roughly 30 seconds, a prime factorization problem that a conventional supercomputer would need on the order of a trillion years to finish.
As it happens, there is already an important use for this seemingly esoteric task. Computer data are protected by encrypting the codes that represent them, and the mathematical "key" to decryption involves a very large number, often some 250 digits long, and its prime factors. Such encryption is considered unbreakable because no conventional computer can find the prime factors of so huge a number in any reasonable time. Yet, at least in theory, a quantum computer could deal with these prime-based encryption schemes with ease. A hacker with a quantum computer could therefore obtain not only the credit card numbers and other personal information that travel across computer networks (including the Internet), but also government and military secrets. This is one reason why some government agencies, on the principle of leading rather than lagging, have been investing heavily in quantum computer research.
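To see why factoring carries this weight, here is a toy Python sketch, using tiny made-up primes rather than anything from a real cryptosystem: multiplying two primes is quick, but recovering them by trial division takes on the order of the square root of the product, which becomes hopeless for numbers 250 digits long.

```python
# Toy illustration of the asymmetry behind factoring-based encryption:
# multiplying two primes is easy; finding them again by trial division is slow.
# The primes below are tiny and chosen purely for illustration.

def trial_division(n):
    """Return a factor of n and the number of divisions that were tried."""
    d, tries = 2, 0
    while d * d <= n:
        tries += 1
        if n % d == 0:
            return d, tries
        d += 1
    return n, tries

p, q = 10_007, 10_009                # small primes: the easy direction is p * q
n = p * q
factor, tries = trial_division(n)    # the hard direction: recover p from n
print(factor, n // factor, tries)    # prints 10007 10009 10006
```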
Quantum super network engine
Quantum computers are unlikely to destroy the integrity of the Internet; on the contrary, they may ultimately bring it great benefits. Two years ago Lov Grover, a researcher at Bell Laboratories, discovered a way to use a quantum computer for one of the most ordinary of tasks: finding a piece of information hidden in a huge database. Searching a database is like rummaging through a briefcase: if different combinations of qubit states search different parts of the database, then one of those combinations will hit upon the information being sought.
Owing to certain technical limitations, the speedup from quantum search is not as large as one might hope. For example, to find one address among 100 million, a conventional computer needs on average about 50 million tries, while a quantum computer needs only about 10,000; that is still enormous progress, and the larger the database, the bigger the gain. Moreover, database search is such a basic computing task that any improvement could affect a great many applications.
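The comparison rests on simple arithmetic: classical search of N items takes about N/2 tries on average, while Grover's algorithm needs on the order of the square root of N queries (about (pi/4) times sqrt(N)). The Python sketch below tabulates both for a few database sizes; the sizes are illustrative.

```python
# Rough arithmetic behind the classical-versus-quantum search comparison.

import math

def classical_tries(n):
    return n // 2                            # average linear-search effort

def grover_queries(n):
    return round(math.pi / 4 * math.sqrt(n)) # order-sqrt(N) Grover queries

for n in (10**8, 10**10, 10**12):
    print(f"N = {n:>14,}  classical ~{classical_tries(n):>15,}  quantum ~{grover_queries(n):>10,}")
# For N = 100 million: ~50,000,000 classical tries versus ~7,854 quantum queries,
# consistent with the "about 10,000 attempts" figure quoted above.
```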
So far, few researchers are willing to predict whether quantum computers will find wider use, but the overall trend is encouraging. Although many physicists, if not most, initially believed that the puzzling nature of quantum mechanics was bound to place elusive, deep-rooted obstacles in the way of practical quantum computing, profound and extensive theoretical research has not yet led to a working machine.
So what does the research boom in quantum computing mean? The history of computing shows that breakthroughs in hardware and software tend to arrive before the problems that need them. Perhaps quantum computers will truly come into their own when we need to search enormous databases that an ordinary computer would take months to get through. Developing a technology to replace the electronic computer is not easy; after all, parallel computers built from standard microprocessors make great strides every few years, so any would-be replacement has to be outstanding. Yet progress in computing has always been rapid and full of surprises, and predictions about the future are never reliable. In hindsight, those who declared things "infeasible" look the most foolish.
Beyond supercomputers, in what directions will computers develop in the future?
Multimedia technology
Multimedia technology is a new technology that further widens the computer's field of application. It uses the computer to handle text, data, graphics, images and sound as an integrated whole, carrying the computer into applications that combine sound, text and pictures. A multimedia system needs a variety of external devices, such as displays, keyboards, mice, joysticks, video and CD players, cameras, input/output devices and telecommunication links. It brings computers, household appliances and communication equipment together under the computer's control and management, and it will have a great impact on human society.
Networks
At present most computer systems are networked. A network is a system of several geographically dispersed, independent computers interconnected by communication lines. By the size of the area they cover, computer networks are divided into local area networks and wide area networks; a network may be as small as a factory workshop or an office, or span continents and oceans. The Internet is developing into an invisible yet powerful force in human society: it quietly delivers all kinds of information and eases people's work and lives by the fastest, most advanced means, and its growth is tending to turn the world into a "global village".
Experts believe the PC will not disappear immediately, but terminal devices with a single or limited function, such as handheld computers and smart phones, will challenge the PC's position as the driving force of computer innovation. "Set-top box" computers such as Internet TVs, which combine Internet access and e-mail with limited computing functions, will soon become popular, and single-function terminals will in the end be easier to use.
Intelligent computer
Our understanding of the brain is still superficial, but the work of making computers intelligent cannot wait until the brain is fully understood. Making computers smarter has been a goal from the very beginning, and progress in computer-aided design, translation, retrieval, drawing, writing, chess playing and mechanical work has already taken steps toward computer intelligence. As computer performance keeps improving, artificial intelligence, after wandering for some 50 years, has finally found a chance to show itself. When Kasparov, the world's top chess player, bowed to Deep Blue, people tasted defeat at the hands of a computer for the first time; never before had human beings worried so deeply, or felt so strongly the need to know themselves.
Most computers today are von Neumann machines, and they are particularly poor at tasks such as reading text, recognizing pictures, understanding speech and thinking in images. To make computers more intelligent, scientists have begun to let them simulate the functions of the human brain. In recent years the developed countries have put great weight on research into artificial neural networks, an important step toward making computers intelligent.
The characteristics and advantages of artificial neural networks show up mainly in three respects. First, they can learn by themselves. In image recognition, for example, if many different image templates and their corresponding recognition results are fed into an artificial neural network, the network learns, through this self-learning ability, to recognize similar images. Self-learning matters greatly for prediction: neural-network computers of the future are expected to supply economic forecasts, market forecasts and benefit forecasts for human beings, and the prospects are very broad.
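A minimal sketch of that self-learning idea in Python, assuming a single artificial neuron (a perceptron) and a handful of invented three-pixel "templates": the network is shown labelled examples and adjusts its weights whenever it classifies one wrongly.

```python
# Minimal "self-learning" sketch with one artificial neuron (a perceptron).
# The 3-pixel images and their labels are invented purely for illustration.

def train_perceptron(samples, epochs=20, lr=0.1):
    weights, bias = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for pixels, label in samples:
            activation = sum(w * x for w, x in zip(weights, pixels)) + bias
            prediction = 1 if activation > 0 else 0
            error = label - prediction                  # learn from the mistake
            weights = [w + lr * error * x for w, x in zip(weights, pixels)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, pixels):
    return 1 if sum(w * x for w, x in zip(weights, pixels)) + bias > 0 else 0

# Toy "templates": images with the first two pixels lit (label 1) versus dark ones (label 0).
samples = [([1, 1, 0], 1), ([1, 1, 1], 1), ([0, 0, 1], 0), ([0, 0, 0], 0)]
weights, bias = train_perceptron(samples)
print(predict(weights, bias, [1, 0, 1]))   # classifies a new, similar image
```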
Second, they have an associative memory function. The human brain associates: if someone mentions your childhood classmate Zhang Moumou, you immediately recall many things about Zhang Moumou. This kind of association can be realized with a feedback artificial neural network.
Third, they can search for optimal solutions at high speed. Finding the optimal solution to a complex problem often requires enormous computation; a feedback neural network designed for the problem, exploiting the computer's high-speed computing power, may find the optimal solution quickly.
Artificial neural networks open a new frontier for the application of electronic technology. An intelligent computer may well consist of a von Neumann machine as the host with an artificial neural network attached as an intelligent peripheral.
It is generally believed that intelligent computers will arrive as surely as Moore's law (the prediction, made in 1965, that semiconductor capacity would grow geometrically) has been fulfilled. Gordon Moore, honorary chairman of Intel and author of the law, agrees. He believes that "silicon intelligence will develop to the point where it is hard to tell computers from people". And computer intelligence will not stop there. Many scientists assert that machine intelligence will one day surpass that of Albert Einstein and Stephen Hawking combined. Hawking believes that, just as humans can design computers with superb number-handling ability, intelligent machines will create computers that perform even better. By the middle of the next century at the latest (perhaps sooner), the intelligence of computers may be beyond human understanding.