On what principle was the computer invented?

Technology · Updated on 2024-03-07
8 answers
  1. Anonymous user · 2024-02-06

    A computer (full name: electronic computer) is a modern intelligent electronic device that can run programs and process massive amounts of data automatically and at high speed. A computer consists of hardware and software; a machine with no software installed at all is called bare metal.

    Common forms include desktop and notebook computers; more advanced types include biological computers, photonic computers, quantum computers, and so on.

    Computers developed from early mechanical and electromechanical calculators. In 1946, the world's first electronic digital computer, ENIAC, appeared; it was built for calculating ballistics by the Moore School of Electrical Engineering at the University of Pennsylvania in the United States.

    In 1956, the transistor computer was born, marking the second generation of electronic computers. It could be housed in just a few large cabinets, and its calculation speed was greatly improved. In 1959, the third generation of computers, built on integrated circuits, appeared.

    The original computer is often credited to John von Neumann (at that time its computing power was roughly equivalent to today's calculators); it was the size of three warehouses and developed gradually from there. ENIAC, which appeared in 1946, was mainly used to calculate ballistics. Built by the Moore School of Electrical Engineering at the University of Pennsylvania, it was bulky: it covered more than 170 square meters, weighed about 30 tons, and consumed nearly 150 kilowatts of electricity.

    Obviously, such a computer was costly and inconvenient to use. In fact, a 1973 U.S. federal court ruling established that the earliest electronic digital computer was the "ABC" (Atanasoff-Berry Computer), built by John Atanasoff, an associate professor of physics at Iowa State University, and his graduate assistant Clifford E. Berry (1918-1963).

    The reason for this misunderstanding is that Mauchly, a member of the ENIAC team, had drawn on John Atanasoff's work, which he saw in 1941, and applied for a patent on it in 1946. For various reasons, the record was not set straight until 1973. (See the encyclopedia entry on John Atanasoff for details; remember the ABC and John Atanasoff, and hopefully future textbooks will correct this error.)

    Later, in recognition of John Atanasoff's great contributions to the field of computing, President George H. W. Bush awarded him the National Medal of Technology, the highest technology honor in the United States, in 1990.

  2. Anonymous user · 2024-02-05

    Computers are built from logic gates, such as 'and' gates, 'or' gates, and 'not' gates, implemented with specific circuits; a small sketch of this idea follows.
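
    The sketch below, a minimal illustration of my own rather than anything from this answer, combines such gates into a one-bit adder in Python:

    ```python
    # Basic logic gates modeled as functions on the bits 0 and 1.
    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):
        # XOR expressed purely in terms of AND/OR/NOT, as a gate-level circuit would.
        return AND(OR(a, b), NOT(AND(a, b)))

    def half_adder(a, b):
        """Add two 1-bit numbers; return (sum, carry)."""
        return XOR(a, b), AND(a, b)

    def full_adder(a, b, carry_in):
        # Chain two half adders to add three bits, the building block of binary addition.
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, OR(c1, c2)

    print(full_adder(1, 1, 0))  # (0, 1): 1 + 1 = binary 10
    ```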

  3. Anonymous user · 2024-02-04

    The world's first electronic computer was born in 1946 at the University of Pennsylvania in the United States, and its name was ENIAC.

    "The world's first computer was invented during World War II. Because it was used to draw ballistics maps for the U.S. Navy. It's the size of half a football field, contains 500 miles of wires, and uses electromagnetic signals to move mechanical parts.

    It ran very slowly, taking 3-5 seconds per calculation, adapted poorly to new tasks, and was suitable only for specialized uses; it was not yet a general-purpose computer. Yet in just a few decades, computers have gone through four generations of remarkable development.

    The first generation ran on vacuum tubes. Eleven years later, the second generation of transistor computers appeared. Seven years after that came the third generation, built on integrated circuits.

    Ten years later still came today's computers based on large-scale integrated circuits. In the space of a few decades, computers have improved with each generation, becoming more advanced and more versatile, so much so that they have become the pride of modern society.

  4. Anonymous user · 2024-02-03

    The first laptop computer ran the CP/M operating system and came with the WordStar word processor, the SuperCalc spreadsheet, the Microsoft MBASIC programming language, the CBASIC language, and other software.

    As laptops have developed, portability is no longer the only emphasis. Because today's laptops can easily be made small enough, performance, appearance, entertainment, and other factors have become the main criteria by which they are measured.

    The first laptop in history was a different story. In an era with no mobile processors, no mature LCD technology, no thermal design to speak of, and very bulky external storage units, making a portable "notebook" was far more difficult.

    Trends:

    With the continuous improvement of technology in the notebook industry, and with the arrival of second-generation Sandy Bridge processors, today's 8- to 12-inch-screen laptops have moved beyond simple netbooks and are tending toward a balance between performance and portability.

    Ultra-thin laptops are not inferior to larger laptops in performance, whether in the processor, the memory, or major components such as the hard drive.

    Of course, there are still differences. The most obvious is that ultra-thin laptops generally do not use high-end configurations, owing to constraints of volume, heat dissipation, memory, and so on, and the size of the battery also, to a certain extent, determines the trade-off between performance and battery life.

  5. Anonymous user · 2024-02-02

    John von Neumann (1903-1957) was the inventor of the electronic computer, and he has always been known as the "father of the electronic computer".

    An electronic computer (hereinafter referred to simply as a computer) is a machine that processes data according to a series of instructions.

    There are many types of computers. In reality, computers are generally tools for processing information. According to Turing machine theory, a computer with the most basic functions should be able to do what any other computer can do.

    So, as long as time and storage capacity are not taken into account, everything from a personal digital assistant (PDA) to a supercomputer should be able to do the same job. That is, given suitable modifications, computers of the exact same design should be usable for everything from company payroll management to controlling unmanned spacecraft. Thanks to rapid advances in technology, each new generation of computers significantly outperforms its predecessors, a phenomenon sometimes referred to as "Moore's Law". A sketch of this universality idea follows below.
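
    The sketch models a Turing machine as a tape plus a rule table; the rule format and the example program are my own illustration, not anything from the answer above:

    ```python
    # A very small machine - a tape plus a table of rules - is already enough,
    # in principle, to carry out any mechanical computation.

    def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
        """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
        tape = dict(enumerate(tape))   # sparse tape indexed by position
        pos = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = tape.get(pos, blank)
            new_symbol, move, state = rules[(state, symbol)]
            tape[pos] = new_symbol
            pos += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape)).strip(blank)

    # Example rule table: add 1 to a binary number written on the tape.
    increment = {
        ("start", "0"): ("0", "R", "start"),   # scan right to the end of the number
        ("start", "1"): ("1", "R", "start"),
        ("start", "_"): ("_", "L", "carry"),   # step back onto the last digit
        ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
        ("carry", "0"): ("1", "L", "halt"),    # 0 + carry -> 1, done
        ("carry", "_"): ("1", "L", "halt"),    # carried past the front digit
    }

    print(run_turing_machine(increment, "1011"))  # prints "1100"
    ```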

    Computers vary greatly in form and composition. Early computers were the size of a house, while today some embedded computers may be smaller than a deck of playing cards. Of course, even today there are still large numbers of huge supercomputers serving the specialized scientific computing or transaction-processing needs of large organizations.

    Smaller computers designed for personal use are called microcomputers, or personal computers. This is what we usually mean when we use the word "computer" in daily life today. However, the most common form of computer in use today is the embedded computer.

    Embedded computers are often relatively simple, small, and used to control other devices – whether it's airplanes, industrial robots, or digital cameras.

    The above definition of an electronic computer includes many devices that can only be used for specific purposes or that have limited functionality. However, the most important characteristic of modern computers is that, given the correct instructions, any computer can emulate the behavior of any other computer (limited only by its own storage capacity and execution speed). Accordingly, modern electronic computers are also called general-purpose electronic computers, in contrast to early electronic computers. A toy illustration of this emulation idea follows.
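
    The three-instruction toy machine below is invented purely for this sketch; a few lines of Python interpret its program, showing one machine carrying out another machine's instructions:

    ```python
    def emulate(program, registers=None):
        """Interpret a toy machine whose instructions are (op, args...) tuples."""
        regs = dict(registers or {})
        pc = 0                                   # program counter of the emulated machine
        while pc < len(program):
            op, *args = program[pc]
            if op == "set":                      # set R, value
                regs[args[0]] = args[1]
            elif op == "add":                    # add R1, R2  ->  R1 += R2
                regs[args[0]] += regs[args[1]]
            elif op == "jnz":                    # jnz R, target  ->  jump if R != 0
                if regs[args[0]] != 0:
                    pc = args[1]
                    continue
            pc += 1
        return regs

    # A program for the emulated machine: compute 5 * 3 by repeated addition.
    program = [
        ("set", "acc", 0),
        ("set", "i", 3),
        ("set", "step", -1),
        ("add", "acc", "five"),
        ("add", "i", "step"),
        ("jnz", "i", 3),
    ]
    print(emulate(program, {"five": 5})["acc"])  # prints 15
    ```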

  6. Anonymous user · 2024-02-01

    In 1946, the Hungarian-American von Neumann proposed the "stored-program" principle.

  7. Anonymous user · 2024-01-31

    In January 1672, Leibniz produced a wooden model of the machine and demonstrated it to members of the Royal Society. But this model only illustrated the principle; it could not yet operate properly. After that, in order to speed up the development of the machine, Leibniz settled in Paris for four years.

    In Paris he collaborated with a famous clockmaker, Olivier. Leibniz only needed to give Olivier a few brief instructions, and the craftsman carried out the actual work on his own. In 1674, the final machine was assembled by Olivier himself.

    Leibniz's multiplication machine was about 1 meter long, 30 centimeters wide, and 25 centimeters high. It consisted of two parts: a fixed counting mechanism and a movable positioning mechanism. The whole machine was driven by a set of gears, and its key component was the stepped drum, which made simple multiplication and division operations possible; the arithmetic principle is sketched below.
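
    A minimal sketch, assuming (for illustration only, not as a description of the actual mechanism) that the machine multiplies by adding the multiplicand repeatedly, shifting one decimal place for each digit of the multiplier:

    ```python
    def stepped_reckoner_multiply(multiplicand, multiplier):
        """Multiply two non-negative integers by repeated addition:
        add the multiplicand once per unit of each multiplier digit,
        shifting one decimal place per digit position."""
        total = 0
        for position, digit in enumerate(reversed(str(multiplier))):
            shifted = multiplicand * 10 ** position      # the carriage shift
            for _ in range(int(digit)):                  # turn the crank 'digit' times
                total += shifted                         # each turn adds once
        return total

    print(stepped_reckoner_multiply(47, 36))  # prints 1692, i.e. 47 * 36
    ```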

    The prototype designed by Leibniz was exhibited in Paris and London. He was elected a Fellow of the Royal Society for his outstanding achievements in computing equipment. In 1700, he was elected a member of the Academy of Sciences in Paris.

    Leibniz was also the first to recognize the importance of binary notation, and he systematically worked out the rules of binary arithmetic. Binary would have a profound impact on the development of the computer more than 200 years later.

    When Leibniz settled in France, he was in close contact with Joachim Bouvet (Bai Jin), a missionary then in China. Bouvet, who had taught mathematics to the Kangxi Emperor, was very interested in the Chinese I Ching, and in 1701 he sent Leibniz two I Ching diagrams, one of which was the famous "Fuxi Sixty-four Hexagram Circle Chart". Leibniz was surprised to find that the sixty-four hexagrams corresponded exactly to the 64 six-digit binary numbers (a quick check is sketched below).
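
    The check below assumes each of the six lines of a hexagram is read as a binary digit, broken as 0 and solid as 1; the encoding is chosen here only for illustration:

    ```python
    from itertools import product

    # Every six-line pattern of broken/solid lines, i.e. every 6-bit binary number.
    hexagrams = list(product("01", repeat=6))
    print(len(hexagrams))                                        # prints 64, since 2**6 = 64
    print("".join(hexagrams[0]), "...", "".join(hexagrams[-1]))  # 000000 ... 111111
    ```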

    Leibniz considered the Chinese Bagua to be the world's earliest binary notation. To this end, in 1716 he published the essay "On Chinese Philosophy", which was devoted to Bagua and binary, pointing out that binary and Bagua have something in common.

    Leibniz longed for and admired the ancient civilization of China, and he presented a replica of the multiplication machine he had developed to the Chinese Emperor Kangxi to show his respect for China.

  8. Anonymous user · 2024-01-30

    The computer, known in English as a "computer", is also called a calculating machine; it was first developed to make calculation easier.

    The world's first computer was designed and developed in 1945 by Mauchly and Eckert at the University of Pennsylvania. However, computers had been studied for a long time before that, so strictly speaking, the computer is the result of the joint efforts of many scientists.
