When was the computer invented? In what era was the computer invented?

Updated on 2024-04-11
4 answers
  1. Anonymous user, 2024-02-07

    The first machine that can truly be called a computer was born in the United States in 1946, invented by Mauchly and Eckert, and named ENIAC. It used vacuum tubes to process signals, so it was bulky (it took up an entire room) and consumed a great deal of power (the story goes that the whole town knew when it was running, because the lights in every house dimmed). Its memory capacity was tiny (only a little more than 100 words), but it was already a major advance in human technology.

    We usually call this kind of vacuum-tube computer the first generation of computers. A first-generation computer was the size of two classrooms, very different from the personal computers we use today. Its components were vacuum tubes (which can hardly be found today), and programs and data were stored on punched cards and paper tape.

  2. Anonymous user, 2024-02-06

    Computer? In fact, the computer started out as nothing more than a calculating machine, and it is very different from the human brain. The world's first electronic computer was designed and built in 1946 by two professors at the University of Pennsylvania in the United States, Mauchly and Eckert, and was named ENIAC (Electronic Numerical Integrator and Computer).

    It was originally built to calculate firing tables for new artillery for the US Army's Ballistic Research Laboratory. It weighed 30 tons, covered about 1,500 square feet, used more than 18,000 vacuum tubes, and could perform 5,000 addition operations per second.

  3. Anonymous user, 2024-02-05

    I think it was a product of the third scientific and technological revolution of the 1960s.

  4. Anonymous user, 2024-02-04

    The history of computer development can be divided into four stages (generations), from the mid-20th century to the present:

    The first generation: vacuum-tube digital computers (1946-1958).

    In terms of hardware, the logic elements were vacuum tubes, and main memory used mercury delay lines, cathode-ray-tube electrostatic storage, magnetic drums, and magnetic cores; external storage was magnetic tape. In terms of software, machine language and assembly language were used. The main application fields were military and scientific computing.

    The disadvantages were large size, high power consumption, and poor reliability. These machines were slow (typically thousands to tens of thousands of operations per second) and expensive, but they laid the foundation for later computer development.

    The second generation: transistor digital computers (1958-1964).

    In terms of software, operating systems, high-level languages, and their compilers appeared. The application fields were mainly scientific computing and transaction processing, and computers began to enter the field of industrial control. This generation was characterized by smaller size, lower energy consumption, improved reliability, and faster computing speed (generally around 100,000 operations per second, up to 3 million), a significant improvement in performance over the first generation.

    The third generation: integrated-circuit digital computers (1964-1970).

    In terms of hardware, the logic components were medium- and small-scale integrated circuits (MSI, SSI), and main memory still used magnetic cores. On the software side, time-sharing operating systems and structured, modular programming methods emerged. This generation was characterized by faster speed and significantly improved reliability, while prices fell further and products moved toward generalization, serialization, and standardization.

    The application fields began to include word processing and graphic and image processing.

    The fourth generation: large-scale integrated circuit computers (1970-present).

    In terms of hardware, the logic components are large-scale and very-large-scale integrated circuits (LSI and VLSI). In terms of software, database management systems, network management systems, and object-oriented languages have emerged. In 1971, the world's first microprocessor was born in Silicon Valley, ushering in the era of the microcomputer.

    The application fields have gradually expanded from scientific computing, transaction management, and process control into the home.
