Giving an unambiguous answer to the question "Who invented the computer?" is actually not so easy. As with many other inventions, many people working in different countries contributed to the emergence of the computer, and the question of which device actually deserves to be called the first computer can be answered in different ways. So, this post is about the inventors of the computer.
What is a computer? On the one hand, a computer is a kind of computing device, but its key feature is the ability not merely to perform calculations, however complex, but to execute an arbitrary given program. That is, devices designed to solve only particular tasks do not fit the definition: a computer is a universal computing device that can be programmed.
The history of computers begins in the 19th century. In 1804, the French weaver Joseph Marie Jacquard invented a loom that could produce not just plain fabric, but fabric with arbitrary patterns. In effect, it was a programmable machine. The pattern was set using plates with holes punched in a certain order: punched cards.
Punched cards for the Jacquard machine
In 1832, the Russian inventor Semyon Nikolaevich Korsakov published a design for special machines that processed information using punched cards. In effect, they were database machines. However, the invention received no official support: the commission that reviewed the project opined that "Mr. Korsakov spent too much reason on teaching others to do without reason."
Who came up with the design of the first programmable computing device, that is, a computer? That man was the Englishman Charles Babbage. Babbage was an extremely versatile person, but he is best known for his computing projects. In 1822, he built a machine for calculating logarithmic tables, later known as the small difference engine. Babbage then decided to build a full-scale version of the difference engine and received a government subsidy, but he met neither the deadlines nor the budget. Instead of the planned three years and £1,500, Babbage spent 11 years and £17,000 without completing the machine. Only in 1991, for Babbage's bicentennial, was a working version of this difference engine built in London.
Babbage's Difference Engine
The difference engine is a rather complex but still highly specialized computing device; it cannot be called a computer. However, while working on the difference engine, Babbage developed the design of an even more complex and versatile analytical engine, which was, in essence, a mechanical computer. This machine had a unit for storing numbers and could perform calculations according to a program written on punched cards. Alas, the machine was so complex that even today enthusiasts have not ventured to reproduce it.
In the 19th and early 20th centuries, the development of computing technology continued, but it was still intended for highly specialized calculations. In 1936, the English mathematician Alan Turing described an abstract machine suitable for arbitrary computation, now known as the Turing machine. In effect, Turing defined the criteria by which one could determine whether a computing machine was universal.
Alan Turing
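To make the idea concrete, here is a minimal Turing machine simulator in a few lines of Python. The states, tape alphabet, and sample rule table are illustrative inventions, not Turing's original formulation; the example program simply walks right along the tape, inverting 0s and 1s until it hits a blank.

```python
# A minimal Turing machine: a tape, a head position, a current state,
# and a transition table mapping (state, symbol) -> (write, move, next state).
def run_turing_machine(rules, tape, state="start", halt="halt"):
    tape, head = list(tape), 0
    while state != halt:
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += 1 if move == "R" else -1
        if head == len(tape):            # extend the tape with blanks on demand
            tape.append("_")
    return "".join(tape)

# Illustrative program: invert every bit, halt on the first blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(rules, "0110_"))  # -> 1001__
```

Any single-purpose calculator corresponds to one fixed rule table; Turing's insight was that a single machine can read the rule table itself as data, which is exactly the criterion of universality mentioned above.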
By the end of the 1930s, there were two approaches to building computers. More common were electromechanical machines, combining electrical and mechanical elements. They computed very slowly: one operation could take several seconds. But another concept appeared at the time: using vacuum tubes as the building blocks. Vacuum-tube machines, that is, electronic ones, could compute much faster, but the tubes were expensive, not very reliable, and often burned out.
The first computers appeared between the late 1930s and late 1940s. The only question is which device should be considered the first real computer. Let's consider the candidates.
1) Konrad Zuse's machines
Konrad Zuse was a German engineer who took up the development of computers on his own initiative. In 1938, with his own money, he designed and built his first electromechanical machine, the Z1, and even made it programmable, but it worked unreliably. In 1939, the Second World War began and Zuse was called up to the front, from which he managed to return and build a second version of his machine, the Z2, and in early 1941 the Z3. These machines were probably the first actually working electromechanical computers. In 1941, Zuse was called up again. However hard he tried to convince the Wehrmacht leadership of the importance of his computers, they would not listen. Only after the intervention of the Henschel company, an aircraft manufacturer where Zuse had previously worked as an engineer, was he allowed to return to work on his machines. They were expected to be used for calculating the aerodynamic parameters of aircraft. The Wehrmacht leadership, however, saw little value in the work and financed it very reluctantly. Zuse finished his next model, the Z4, only after the war; in 1950 he sold it to Switzerland.
Z3 (restored copy) in a German museum
The Z3 could read a program from punched tape and perform calculations accordingly. However, the machine was electromechanical, so it worked very slowly, and it could not explicitly execute conditional jump instructions, which are considered an essential part of a computer program. Can the Z3 be considered the world's first computer, and Konrad Zuse its inventor? Some say yes, others no.
2) Atanasoff-Berry computer
In 1942, the American physicist of Bulgarian origin John Atanasoff, together with the engineer Clifford Berry, built the first fully electronic computer, with no mechanical parts. This machine was not universal and was intended mainly for solving systems of linear equations; nevertheless, in 1973 a US federal district court recognized it as the "first computer." Perhaps more would have come of this machine had Atanasoff not been drafted into the American army.
Atanasoff-Berry computer
3) The British "Bombe" and "Colossus"
During World War II, the British were faced with the task of deciphering German messages. It was impossible to break the German ciphers by hand. Then the British resorted to the help of computers.
In 1940, the first electromechanical machine for breaking the German Enigma cipher was built in Great Britain to a design by Alan Turing. It was named the "Bombe." One such machine weighed 2.5 tons, and in order to decipher as many messages as possible, the British had built 210 of them by 1944.
The "Bombe"
But for transmitting the most important messages, the Germans used a different, even more complex cipher: the Lorenz cipher. To break it, a powerful electronic computer called "Colossus" was designed and built (ten machines in all). It was programmable and quite powerful for its time, but still a highly specialized rather than universal machine. The Colossus machines were designed and built by the English engineer Tommy Flowers.
4) ENIAC
Let's move to the USA. In 1943, the University of Pennsylvania scientists John Mauchly and John Eckert set out to build a powerful electronic computer. It was intended mainly for calculating artillery tables, a tedious and painstaking job that the American army had entrusted to the university. Previously, the tables were calculated by people with adding machines, which took them a great deal of time. The device was named ENIAC, short for "Electronic Numerical Integrator and Computer," and it could calculate 2,400 times faster than a person with an adding machine.
ENIAC
ENIAC was completed by the autumn of 1945. It contained more than 10,000 vacuum tubes, weighed about 27 tons, and consumed 150 kW of electricity. By that time, the urgent need for artillery tables had passed, and the computer was put to other uses, such as calculations for the hydrogen bomb, the aerodynamics of supersonic aircraft, and weather forecasting.
ENIAC can, without much reservation, be considered a real computer. It was a fully electronic machine that demonstrated the full potential of computers. Moreover, ENIAC became the first widely known computer: information about the Zuse and Atanasoff machines surfaced later, and the British codebreaking computers were classified (and almost all destroyed) by order of Churchill. So ENIAC probably did deserve the title of the world's first computer.
Still, working with ENIAC was not very convenient. The computer was programmed by changing the positions of cables and switches, and preparing for a calculation often took far longer than the calculation itself. Even before the work was finished, the American mathematician John von Neumann proposed an architecture for future computers in which both instructions and data would be stored in memory. This architecture became the basis for the development of subsequent computers.
Let's sum up and finally answer who invented the computer. In one way or another, the following people were involved in the invention and creation of the first computers:
- Charles Babbage - the author of the first project of a (mechanical) computer;
- Alan Turing - described the scheme of a universal computer and designed the British electromechanical codebreaking machine "Bombe";
- Konrad Zuse - creator of the first electromechanical programmable computer;
- John Atanasoff - creator of the first electronic non-programmable computer;
- Tommy Flowers - designer of the British electronic codebreaking computer "Colossus";
- John Mauchly and John Eckert - designers of the first universal electronic computer ENIAC;
- John von Neumann - one of the participants in the development of the first American computers, proposed the architecture that underlies the design of all modern computers.
The first computers appeared after the Second World War, when the discoveries of mathematicians and other scientists made it possible to realize a new way of processing information. And although today these machines seem like outlandish artifacts, they became the progenitors of the modern PCs familiar to the layman.
Manchester "Mark I" and EDSAC
The first computer in the modern sense of the word was the Manchester Mark I, created in 1949. Its uniqueness lay in the fact that it was entirely electronic and stored its program in RAM. This achievement of British specialists was a great leap forward in the long history of computing. The Manchester Mark I used Williams tubes and magnetic drums as its storage.
Today, many years later, the question of which machine can be called the first computer remains controversial. The Manchester Mark I is the most popular candidate, but there are other contenders. One of them is EDSAC; without this machine, the history of the computer as an invention would have been completely different. While the Mark I appeared in Manchester, EDSAC was created by scientists at the University of Cambridge. This computer was put into operation in May 1949, when the first program was executed on it, which squared the numbers from 0 to 99.
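In modern terms, that first EDSAC program amounted to the following few lines (a hypothetical Python equivalent, of course; the original was written in EDSAC's own order code and printed its results on a teleprinter):

```python
# Compute and print the squares of 0..99, as EDSAC's first program did in May 1949.
squares = [n * n for n in range(100)]
for n, square in enumerate(squares):
    print(n, square)
```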
Z4
The Manchester Mark I and EDSAC were built for specific tasks. The next step in the evolution of computing machines was the Z4, a device distinguished not least by the dramatic story of its creation. The computer was built by the German engineer Konrad Zuse. Work on the project began during the final stage of the Second World War, which greatly slowed the development: Zuse's laboratory was destroyed in an air raid, and with it all the equipment and the preliminary results of years of work were lost.
Nevertheless, the talented engineer did not give up. Work resumed after the war, and in 1950 the project was finally completed. The story of its creation turned out to be long and thorny. The computer immediately interested the Swiss Federal Institute of Technology, which bought the machine. The Z4 interested specialists for a reason: it was freely programmable, that is, the first multifunctional device of its kind.
In that same year, 1950, the history of computing in the USSR was marked by an equally important event: MESM, the Small Electronic Calculating Machine, was created at the Kiev Institute of Electrical Engineering. A group of Soviet scientists worked on the project under the leadership of Academician Sergei Lebedev.
The machine contained six thousand vacuum tubes. Its considerable power allowed it to take on tasks previously unprecedented for Soviet technology: it could perform about three thousand operations per minute.
Commercial Models
At the first stage of computer development, machines were built by specialists at universities or other government institutions. In 1951, the LEO I appeared, created with investment from the British private company Lyons and Company, which owned restaurants and shops. With this device the history of computing reached another important milestone: LEO I was the first computer used for commercial data processing. Its design was similar to that of its conceptual predecessor, EDSAC.
The first American commercial computer was UNIVAC I, which appeared in the same year, 1951. In total, forty-six of these machines were sold, each costing about a million dollars. One of them was used in the US census. The device contained more than five thousand vacuum tubes and used mercury delay lines as memory, each of which could store up to a thousand words. In developing UNIVAC I, it was decided to abandon punched cards and switch to metallized magnetic tape, through which the device could connect to commercial storage systems.
"Strela"
Meanwhile, Soviet computing had a history of its own. The Strela computer, which appeared in 1953, became the first serially produced machine of its kind in the USSR. It was built at the Moscow Plant of Calculating and Analytical Machines. Over three years of production, eight machines were made. These unique computers were installed at the Academy of Sciences, Moscow State University, and design bureaus located in closed cities.
Strela could perform 2,000-3,000 operations per second, a record figure for domestic technology at the time. Data was stored on magnetic tape that could hold up to 200,000 words. The machine's developers received state awards, and chief designer Yuri Bazilevsky became a Hero of Socialist Labor.
The second generation of computers
The transistor was invented as early as 1947. By the end of the 1950s, transistors had replaced the power-hungry and fragile tubes. With their advent, a new chapter in computer history began: machines built with these new parts were later recognized as second-generation models. The main gain was that printed circuit boards and transistors made it possible to significantly reduce the size of computers, making them far more practical and convenient.
Where earlier computers occupied entire rooms, they now shrank to the proportions of office desks. Such, for example, was the IBM 650. But even transistors did not solve another important problem: computers were still extremely expensive, which meant they were built only to order, for universities, large corporations, or governments.
Further evolution of computers
Integrated circuits were invented in 1959, marking the beginning of the third generation of computers. The 1960s became a turning point: production and sales grew rapidly, and the new components made devices cheaper and more accessible, although they were still not personal. These computers were bought mainly by companies.
In 1971, Intel launched the first microprocessor on the market, and fourth-generation computers appeared on its basis. Microprocessors solved several important problems inherent in the design of any computer: a single such part performed all the logical and arithmetic operations written in machine code, a function that had previously been spread across many small components. The appearance of a single universal part heralded the development of small home computers.
Personal computers
In 1977, Apple, founded by Steve Jobs, introduced the Apple II to the world. Its fundamental difference from all previous computers was that the young Californian company's device was intended for sale to ordinary citizens, a breakthrough that until recently had seemed simply unheard of. Thus began the history of personal computers. The model remained in demand into the 1990s; about seven million devices were sold in that period, an absolute record for the time.
Subsequent Apple models received a distinctive graphical interface, a keyboard familiar to modern users, and many other innovations. It was also Apple that popularized the computer mouse. In 1984, the company introduced its most successful model, the Macintosh, which began a whole product line that still exists today. Many discoveries by Apple's engineers and developers became the basis for today's personal computers, including those built by other manufacturers.
Domestic developments
Due to the fact that all revolutionary discoveries related to computers took place in the West, the history of the creation of computers in Russia and the USSR remained in the shadow of foreign successes. This was also due to the fact that the development of such machines was controlled by the state, while in Europe and the USA the initiative gradually passed into the hands of private companies.
In 1964, the first Soviet semiconductor computers, "Sneg" ("Snow") and "Vesna" ("Spring"), appeared. In the 1970s, Elbrus computers came into use in the defense industry: they were employed in the missile defense system and at nuclear centers.
Have you ever wondered who invented the machine that lets you read those very words while listening to music, maintaining a social profile, and shooting down terrorists in games, all at the push of a button?
This outstanding man was Charles Babbage (December 26, 1791 - October 18, 1871). He designed the first programmable computer in 1833. Thanks to this priceless invention, Babbage is considered the father of the computer.
Did you know?
Until the 19th century, the term "computer" was applied to people assigned to perform "mathematical calculations"!
The image shows Babbage's creation.
Charles Babbage's father, Benjamin Babbage, was a wealthy businessman. Thus, young Charles went to many prestigious schools until he got to Holmwood Academy in Enfield. It was there that his love for mathematics began.
Later, he went to Peterhouse, Cambridge, for further study. At Peterhouse he studied analytic philosophy and went on to study mathematics. Even before finishing, he was awarded an honorary degree in mathematics without examination.
In addition to being a talented mathematician, Babbage was also a philosopher and an avid cryptographer.
Babbage noticed that human calculations, especially of logarithms, were often wrong. This led him to the idea of a machine capable of doing calculations essentially without the possibility of error. Ada Lovelace, who helped Babbage program his machine, is considered the world's first computer programmer.
Interestingly, the history of programming itself does not begin with Babbage's Analytical Engine. The first programmable device in the world was actually a loom: the Jacquard loom, invented by Joseph Marie Jacquard, was the first programmable machine in history. Both the Jacquard loom and Babbage's computer were programmed by means of punched cards. Babbage also invented a mechanical predecessor of the printer as an output device for his machine.
The next leap forward in computer history was made almost simultaneously by Konrad Zuse and John Atanasoff, with different designs. Atanasoff built the world's first digital computer using vacuum tubes. The Atanasoff-Berry computer laid the foundation for what would become one of the most useful and widespread devices in the world; however, it was not programmable. Konrad Zuse, on the other hand, built a programmable computer, the Z3, which was electromechanical rather than electronic.
Despite the respective shortcomings of both designs, Atanasoff and Zuse are both considered among the most important names in computer technology. George Stibitz is also credited as one of the inventors of the digital computer.
The numerous input, output, and peripherals connected to modern computers were not part of these early designs. They were invented by the following scientists:
- Monitor (CRT): Allen DuMont (1931)
- Mouse: Douglas Engelbart (1963)
- QWERTY keyboard: Christopher Scholes (1867 - on typewriters)
- Scanner: Giovanni Caselli / Édouard Belin (1858 / 1913)
In 1991, a fully functioning model of Charles Babbage's difference engine was built, showing the true brilliance of its prescient inventor. The model has also encouraged research into possible applications of mechanical computing, which can be useful in conditions that digital computers cannot physically withstand. In 2011, British scientists launched a project to build Babbage's Analytical Engine to his original designs, to be completed by 2021. It would indeed be a fitting tribute to the man who set the world on a constant path of unimaginable technological advancement.
Human life in the twenty-first century is directly linked to computing. Knowing the main milestones in the creation of computers is a mark of an educated person. The development of computers is usually divided into five stages: it is customary to speak of five generations.
1946-1954 - first generation computers
The first generation of computers was built on vacuum tubes. Scientists at the University of Pennsylvania (USA) developed ENIAC, the world's first such computer, which was officially put into operation on February 15, 1946. About 18 thousand vacuum tubes went into the machine. By today's standards it was colossal: an area of 135 square meters, a weight of 30 tons, and a hefty appetite for electricity of 150 kW.
It is well known that this machine was used to help solve some of the hardest problems in the creation of the atomic bomb. The USSR was rapidly closing the gap: in December 1951, under the guidance and with the direct participation of Academician S. A. Lebedev, the MESM (Small Electronic Calculating Machine) was put into operation. It could perform about three thousand operations per minute.
1954 - 1964 - computers of the second generation
The next step was the development of computers running on transistors. Transistors are devices made from semiconductor materials that control the current flowing in a circuit. The first stable working transistor was created in America in 1947 by a team of physicists including Bardeen, Brattain, and Shockley.
In speed, these computers differed dramatically from their predecessors, reaching hundreds of thousands of operations per second. Their dimensions shrank and their power consumption fell, while their range of applications grew considerably, driven by the rapid development of software. The best Soviet computer of the generation, the BESM-6, had a record speed of 1,000,000 operations per second. It was developed in the mid-1960s under chief designer S. A. Lebedev.
1964 - 1971 - third generation computers
The main feature of this period is the adoption of microcircuits with a low degree of integration. Using sophisticated technologies, scientists were able to place complex electronic circuits on a small semiconductor wafer with an area of less than one square centimeter. The microcircuit was invented in 1958 by Jack Kilby. This revolutionary invention improved every parameter: dimensions decreased to roughly the size of a refrigerator, while speed and reliability increased.
This stage in the development of computers is also characterized by the use of a new storage device: the magnetic disk. The PDP-8 minicomputer was first introduced in 1965.
In the USSR, such machines appeared much later, in 1972, and were analogues of models already on the American market.
1971 - present - fourth generation computers
The innovation of fourth-generation computers is the use of microprocessors: arithmetic logic units and control circuitry placed on a single chip with a high degree of integration, meaning the circuitry takes up even less space. In other words, a microprocessor is a small brain that performs millions of operations per second according to the program embedded in it. Dimensions, weight, and power consumption dropped drastically, while performance reached record heights. And that is when Intel entered the game.
The first microprocessor, the Intel 4004, was assembled in 1971. It was only 4 bits wide, but at the time it was a giant technological breakthrough. Two years later, Intel introduced the eight-bit Intel 8008, and in 1975 the Altair 8800 was born: the first personal computer, built around Intel's 8080 processor.
This was the beginning of a whole era of personal computers. Such machines came to be used everywhere, for all kinds of purposes. A year later, Apple entered the game. The project was a great success, and Steve Jobs became one of the most famous and richest people on Earth.
The undisputed standard among computers became the IBM PC. It was released in 1981; its processor could address up to 1 megabyte of RAM.
It is noteworthy that IBM-compatible machines currently account for about ninety percent of the computers produced. It is also impossible not to mention the Pentium. The development of Intel's first processor with an integrated coprocessor, the 80486, was completed in 1989, and the Pentium trademark that followed it remains an undisputed authority in the development and application of microprocessors.
As for the future, it lies, of course, in developing and deploying the newest technologies: very-large-scale integrated circuits, magneto-optical elements, and even elements of artificial intelligence.
Self-learning electronic systems, referred to as the fifth generation of computers, are the foreseeable future.
People seek to erase the barrier in communicating with the computer. Japan worked on this for a very long time and, unfortunately, without success, but that is a topic for another article. At the moment such projects are only in development, but at the current pace of progress this future is not far off. The present is the time when history is being made!
Charles Babbage, while designing the Analytical Engine in the 1840s, developed the basic ideas for building a machine that could work according to a predetermined program without human intervention.
A hundred years passed, and the first computers (electronic computing machines) appeared.
Mark-1 on electromechanical relays
In 1943, the American Howard Aiken, using electromechanical relays and the technology of the 20th century, built such a machine, called the Mark-1, at one of IBM's plants.
“If Babbage had lived 75 years later,” Aiken later stated, “I would have been left without a job.”
Even earlier, Babbage's ideas had been rediscovered by the German engineer Konrad Zuse, who built a similar machine in 1941. But his work never gained the fame of the American machine, so this fact is far less widely known.
ENIAC on vacuum tubes
Radio technology developed rapidly in the first half of the 20th century. The main element of radio receivers and radio transmitters at that time were vacuum tubes.
Starting in 1943, a group of specialists led by John Mauchly and Presper Eckert in the United States began designing a machine similar to the Mark-1, but based on vacuum tubes rather than relays.
Their machine was called ENIAC, short for Electronic Numerical Integrator and Computer.
The counting speed of this machine exceeded that of the Mark-1 a thousandfold. When ENIAC was demonstrated in 1946, the American press immediately dubbed it the "Giant Brain."
The mass of the system was 27 tons. ENIAC was used, in particular, for calculations related to the creation of the hydrogen bomb.
However, to enter the program by which ENIAC was to calculate, its cables had to be connected in the right way, which took hours or even days. There was no keyboard yet, and no monitor either.
Von Neumann architecture
To simplify programming, Mauchly and Eckert began designing a new machine that could store the program in its own memory.
In 1945, the famous mathematician John von Neumann, together with other scientists, was involved in the work.
In 1946, John von Neumann, together with Arthur Burks and Herman Goldstine, published the paper "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument." It clearly and simply outlined the general principles of the design and operation of computers, chief among them the stored-program principle, according to which the data and the program are placed in a common memory.
The fundamental description of the design and operation of a computer is commonly called its architecture. The ideas outlined in the article above are known as the "principles of John von Neumann" or the "von Neumann architecture."
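The stored-program idea is easy to illustrate with a toy simulator in which instructions and data share one memory array. The four-instruction set below is invented purely for illustration; it is not the instruction set of any real machine of that era.

```python
# Toy von Neumann machine: program and data live in the same memory.
# Instructions are (opcode, operand) pairs; data cells are plain numbers.
def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":
            acc = memory[arg]            # fetch a data word into the accumulator
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc            # write the accumulator back to memory
        elif op == "HALT":
            return memory

# Program: memory[5] = memory[4] + memory[4]
memory = [
    ("LOAD", 4),   # 0
    ("ADD", 4),    # 1
    ("STORE", 5),  # 2
    ("HALT", 0),   # 3
    21,            # 4: data word
    0,             # 5: result cell
]
print(run(memory)[5])  # -> 42
```

Because the program is just more memory contents, it can be loaded, replaced, or even modified like any other data, which is what freed stored-program machines from ENIAC-style rewiring.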
The EDVAC machine
The next model after ENIAC, a joint development of Mauchly, Eckert, and von Neumann, was the EDVAC machine (short for Electronic Discrete Variable Automatic Computer). Its more capacious internal memory held not only data but also the program. Unlike ENIAC, this computer was based on binary arithmetic, not decimal.
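The switch from decimal to binary mattered because a two-state electronic element (a tube that is on or off) maps directly onto one binary digit, whereas a decimal machine like ENIAC needed a ten-state ring counter for every digit. A quick sketch of the encoding:

```python
# The year 1946 written in binary: each digit corresponds to one on/off element.
n = 1946
bits = bin(n)[2:]                     # '11110011010'
# Reconstruct the decimal value from the bits to confirm the encoding.
value = sum(int(b) << i for i, b in enumerate(reversed(bits)))
print(bits, value)                    # -> 11110011010 1946
```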
Like the ENIAC, the EDVAC was developed at the US Army Ballistic Research Laboratory and is the first computer built on the principles of John von Neumann.
These machines existed in single copies. Factory, serial production of computers began in the developed countries in the 1950s.
MESM in the USSR
In our country (the USSR), the first computer was created in 1951. It was called MESM, the Small Electronic Calculating Machine. Its designer was Sergei Alekseevich Lebedev, under whose leadership the serial tube computers BESM-2 and M-20 were built in the 1950s.
A number of subsequent machines and developments by S. A. Lebedev contributed to the creation of ever more advanced computers.
When computers were big
A hard drive (an entire cabinet) that in the early 1960s could hold only one photograph of the kind taken with a modern digital camera
In conclusion, I would like to bring to your attention a short video report from the Museum of Informatics in Paris. You will see with your own eyes:
- a vacuum tube,
- punched cards,
- a CPU,
- a microprocessor,
- a modem,
and learn about the binary number system and the principles of the early Internet.