-
Here's a look at the most fundamental differences between caches and buffers.
The term "cache" generally appears wherever there is a hierarchy of transmission or storage efficiency. Its role is to act as a small but fast storage container that holds the most frequently used data: the CPU cache relative to main memory, main memory relative to the disk, and the disk relative to network transfer are all examples of a fast store sitting in front of a slower one. The data in a cache is therefore temporary. In programming, a cache mechanism is enabled or implemented only when it is needed to improve the efficiency of the whole system (e.g., using memcache in big-data processing).
A buffer, by contrast, has no notion of "higher and lower efficiency levels" for data transmission and storage. It sits between an upstream producer and a downstream consumer and usually refers to the storage space used to pass data between modules. In programming, buffers appear far more often than caches: for example, declaring char buf[256]; and calling fgets(buf, 256, fp) passes data from the standard input module of the C runtime to the user program's module.
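As a minimal, hedged sketch of that fgets example (the echo loop and the use of stdin are illustrative additions, not part of the original text):

```c
#include <stdio.h>

int main(void) {
    char buf[256];          /* user-space buffer owned by the program        */
    FILE *fp = stdin;       /* any FILE * would do; stdin keeps it runnable  */

    /* fgets copies up to 255 characters plus a '\0' from the C runtime's
       internal stdio buffer into buf, i.e. it hands data from the runtime's
       input module over to the user program's module. */
    while (fgets(buf, sizeof buf, fp) != NULL) {
        fputs(buf, stdout); /* pass the line on through stdout's buffer      */
    }
    return 0;
}
```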
-
Both are data held in RAM. In simple terms, a buffer holds data that is about to be written to disk, while a cache holds data that has been read from disk.
-
The main purpose of a cache is to resolve the mismatch between the CPU's computing speed and the read/write speed of main memory.
The basic principle of caching is to exploit the locality of program accesses, both spatial locality and temporal locality.
When a computer runs a program, it first loads the program from disk into main memory; the CPU then fetches instructions and data from main memory and executes them. Reading and writing main memory directly (usually built from DRAM) is slow, which is why the cache was introduced.
Before execution, the hardware tries to move the instructions and data that will be needed from main memory into the cache, and the running program then accesses the cache directly. If the instructions and data are found in the cache, they can be read quickly; this is called a "hit".
If the instructions and data are not in the cache, they still have to be fetched from main memory; this is called a "miss". The hit ratio is therefore critical to cache performance.
In fact, cache is a broad concept: main memory can be regarded as a cache for the disk, and the cache inside the CPU is a cache for main memory. The purpose of using a cache is to create the illusion of a memory with the capacity of a lower-level store (such as a disk) and the speed of a register (such as a general-purpose register); simply put, to make storage look both big and fast.
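To make "hit" and "miss" concrete, here is a toy simulation of a direct-mapped cache; the geometry (16-byte lines, 64 lines) and the sequential access pattern are assumptions chosen only to show how spatial locality produces a high hit ratio.

```c
#include <stdio.h>

#define LINE_SIZE 16            /* bytes per cache line (assumed)            */
#define NUM_LINES 64            /* number of lines in the toy cache          */

int main(void) {
    long tags[NUM_LINES];
    int  valid[NUM_LINES] = {0};
    long hits = 0, accesses = 0;

    /* Sequential byte addresses: neighbouring accesses share a cache line,
       so spatial locality turns most of them into hits.                     */
    for (long addr = 0; addr < 100000; addr++) {
        long line = addr / LINE_SIZE;      /* which memory block             */
        int  idx  = line % NUM_LINES;      /* where it maps in the cache     */
        long tag  = line / NUM_LINES;      /* identifies which block is held */

        accesses++;
        if (valid[idx] && tags[idx] == tag) {
            hits++;                        /* "hit": data already cached     */
        } else {
            valid[idx] = 1;                /* "miss": fetch from main memory */
            tags[idx]  = tag;
        }
    }
    printf("hit ratio: %.2f%%\n", 100.0 * hits / accesses);
    return 0;
}
```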
-
Cache memory is commonly built from SRAM. SRAM is called static memory: "static" means that once data is written to SRAM, it stays unchanged until new data overwrites it or the power is turned off.
Because the CPU is much faster than main memory and the hard disk, it has to wait when accessing data, which slows the computer down. SRAM is faster than other memory and the hard disk, so it is used as the computer's cache.
With a cache, data can be written into it in advance and read directly from it when needed, which reduces CPU wait time. The reason a cache can speed up the system rests on a statistical rule: the control logic on the motherboard keeps track of which data in memory is used most frequently and keeps that data in the cache, and when the CPU wants the data it looks in the cache first, which improves overall running speed. Generally speaking, a 256 KB cache can increase the overall speed of the machine by roughly 10% on average.
Buffer literally means a buffering area, and we will simply call it a buffer here; note that the word is both a noun and a verb ("to buffer").
A buffer is a place where a series of data is stored. The data a client receives can come either directly from the result of executing a program or from a buffer, but the two approaches differ in speed. On the web, when an ASP program is not requested very often, there is essentially no difference between them, or at least none that is noticeable.
When many people request the same ASP program, however, the speed is no longer the same. Without buffering, each client request causes the ASP program to be executed once more; with the program's output buffered in advance, each client gets the buffered result rather than the result of a fresh execution. For example, if 1,000 users access an ASP page at the same time and the output is not buffered, the program is executed a thousand times, the load on the server grows, and pages open slowly on the client side. If the output is buffered, the result is different: each client reads the data directly from the buffer, the server does not execute the program more often as traffic grows, and pages open faster than in the previous case.
That's the benefit of buffers.
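The ASP description above is essentially result caching: compute the page once, then serve every later request from the stored copy. A minimal sketch of that idea in C (render_page, the fixed page text, and the 1000-request loop are all hypothetical stand-ins for the ASP program and its clients):

```c
#include <stdio.h>

static char cached_page[1024];
static int  cache_valid = 0;
static int  executions  = 0;

/* Stand-in for "executing the ASP program": the expensive work that
   produces the page content and fills the cache. */
static const char *render_page(void) {
    executions++;
    snprintf(cached_page, sizeof cached_page, "<html>hello</html>\n");
    cache_valid = 1;
    return cached_page;
}

/* Every request checks the cached copy first; only the first one renders. */
static const char *handle_request(void) {
    return cache_valid ? cached_page : render_page();
}

int main(void) {
    for (int i = 0; i < 1000; i++)     /* simulate 1000 clients */
        handle_request();
    printf("program executed %d time(s) for 1000 requests\n", executions);
    return 0;
}
```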
-
Cache and buffer look like the same thing at first glance, but the two terms mean different things.
In hardware terms, the purpose of a cache is to bridge two devices running at different speeds, such as registers and memory, the CPU and the PCI bus, or the IDE bus and the hard disk.
The original meaning of buffer is something like a spring-loaded damper, used to reduce or absorb the impact of shocks. A buffer is a form of data staging: it temporarily stores data and passes it on at a rate different from the rate at which it was received, and it can be flushed automatically at intervals. A cache, by contrast, cares about the hit ratio: it keeps the small amount of data that is used most frequently in the current period in a fast device so it can be read and written quickly. In program development there may be no literal high-speed and low-speed devices, but data sources still differ in read/write efficiency. For small amounts of data, a plain text file is usually faster to read and write than a database, and a file on tmpfs is faster than one going through direct disk I/O.
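On the buffer side, C's stdio makes the write buffer explicit through setvbuf: a fully buffered stream collects many small writes in memory and flushes them in large chunks instead of touching the disk for every byte. A small sketch (the file name and buffer size are arbitrary choices for illustration):

```c
#include <stdio.h>

int main(void) {
    FILE *fp = fopen("out.txt", "w");
    if (!fp) return 1;

    static char iobuf[8192];
    /* _IOFBF: fully buffered -- bytes accumulate in iobuf and are flushed
       to the OS in large chunks; _IONBF would force a write per call.      */
    setvbuf(fp, iobuf, _IOFBF, sizeof iobuf);

    for (int i = 0; i < 100000; i++)
        fputc('x', fp);         /* each call only touches the buffer        */

    fclose(fp);                 /* flush the remaining buffered bytes       */
    return 0;
}
```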
A cache is physical storage. Main memory used to be slow, so exchanging data with the processor was slow and the whole machine ran slowly; the cache was created to fix this. Because the cache can be read much faster, frequently used data from memory is placed in the cache first, the processor works against the cache, and the results are written back to memory when the work is done. This removes much of the bottleneck that memory transfer speed imposes on processing speed.
The similarity is that both exploit the principle of program locality: the program is divided into many blocks of information, these blocks are automatically moved from the slower memory to the faster one during execution, and a replacement strategy is used to keep the hit ratio high as execution continues. Both use the same kinds of address translation, address mapping methods, and replacement algorithms.
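One common replacement strategy is least recently used (LRU). The sketch below is purely illustrative (a four-entry cache with linear search, and key * 10 standing in for the slow fetch): when the cache is full, the entry with the oldest access time is evicted.

```c
#include <stdio.h>

#define CAPACITY 4

struct entry { int key, value, last_used, valid; };

static struct entry cache[CAPACITY];
static int clock_tick = 0;

/* Return the cached value for key, or fetch it (here: key * 10) and
   evict the least-recently-used entry if the cache is full. */
static int lookup(int key) {
    int lru = 0, free_slot = -1;
    for (int i = 0; i < CAPACITY; i++) {
        if (cache[i].valid && cache[i].key == key) {
            cache[i].last_used = ++clock_tick;      /* refresh recency      */
            return cache[i].value;                  /* hit                  */
        }
        if (!cache[i].valid) free_slot = i;
        else if (cache[i].last_used < cache[lru].last_used) lru = i;
    }
    int slot = (free_slot >= 0) ? free_slot : lru;  /* miss: choose a slot  */
    cache[slot] = (struct entry){ key, key * 10, ++clock_tick, 1 };
    return cache[slot].value;
}

int main(void) {
    int keys[] = {1, 2, 3, 4, 1, 5, 1, 2};          /* key 5 evicts the LRU entry */
    for (int i = 0; i < 8; i++)
        printf("lookup(%d) = %d\n", keys[i], lookup(keys[i]));
    return 0;
}
```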