-
Disk cache is divided into read cache and write cache.
Read caching means that, when free memory is available, the operating system keeps file data that has already been read in a region of memory (this region is called the "memory pool"), so that the next time the software or the user reads the same file, it does not have to be fetched from the disk again, which speeds up access.
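To make the idea concrete, here is a minimal Python sketch of a read cache: a plain dictionary plays the role of the "memory pool", and repeat reads of the same file are served from memory instead of the disk. The names `memory_pool` and `cached_read` are illustrative only, not part of any real operating system API.

```python
# Minimal sketch of an OS-style read cache ("memory pool").
# The dict maps a file path to its cached contents; repeat reads are
# served from memory instead of going back to the disk.
memory_pool = {}

def cached_read(path):
    if path in memory_pool:          # cache hit: no disk access needed
        return memory_pool[path]
    with open(path, "rb") as f:      # cache miss: read from disk once
        data = f.read()
    memory_pool[path] = data         # keep it for the next reader
    return data
```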
Write caching means that data destined for the disk is first stored in the memory space the system allocates for the write cache; once the data held in the memory pool reaches a certain amount, it is written out to the hard disk. This reduces the number of actual disk operations, helps protect the disk from the wear of repeated read and write operations, and shortens the time needed to write.
Depending on how writes are handled, there are two types: write-through and write-back. When data is read from the hard disk, the system first checks whether the requested data is already in the cache; if it is, the cache returns the data directly, which is called a hit. The system then does not need to access the data on the hard drive, and since SDRAM is much faster than magnetic media, data transfer is accelerated.
Write-back caching checks the cache first when data is written to the hard disk; if the data is found there, the cache takes over the write and sends the data to the disk later. Most current hard disks use write-back caching, which greatly improves performance.
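The write-back behaviour described above can be sketched in a few lines: writes accumulate in memory and are pushed to the disk only when the buffered amount crosses a threshold, so many small writes become a few larger disk operations. The class name and the 64 KiB threshold are made-up example values, not a real driver implementation.

```python
# Minimal write-back cache sketch: writes accumulate in memory and are
# flushed to disk only when the buffered amount crosses a threshold.
FLUSH_THRESHOLD = 64 * 1024          # 64 KiB, an arbitrary example value

class WriteBackCache:
    def __init__(self, path):
        self.path = path
        self.pending = bytearray()   # data held in memory, not yet on disk

    def write(self, data: bytes):
        self.pending += data         # stored in memory first (write-back)
        if len(self.pending) >= FLUSH_THRESHOLD:
            self.flush()

    def flush(self):
        if self.pending:
            with open(self.path, "ab") as f:
                f.write(self.pending)   # the actual disk operation
            self.pending.clear()
```

Under write-through, by contrast, every call to `write()` would go straight to the disk instead of being collected in memory first.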
A cache is also a form of memory, with fast data exchange and a high operating frequency. A disk cache is a region of ordinary physical memory that the operating system allocates for disk input and output.
The buffer of the hard disk.
The buffer of the hard disk is where the hard disk exchanges data with the external bus. When data is read from the hard disk, the magnetic signal is converted into an electrical signal, the buffer is filled and emptied again and again, and the data is sent out step by step in time with the PCI bus cycle. The buffer also improves performance, but it differs from the disk cache in the following ways:
First, it is hardware with a fixed capacity, unlike the cache, which the operating system can allocate dynamically in memory. Second, its impact on performance far exceeds that of the disk cache, because without a buffer every word transferred (usually 4 bytes) would require its own disk read or write.
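The second point, that without a buffer each 4-byte word would need its own disk operation, can be illustrated by contrasting word-at-a-time writes with a single buffered write. This is only an illustration of the access pattern; the file names and word count are arbitrary.

```python
# Contrast of per-word I/O with buffered I/O, to illustrate why the
# hard disk's buffer matters: without it, every 4-byte word would be a
# separate transfer.
import os
import struct

words = [struct.pack("<I", i) for i in range(1024)]   # 1024 four-byte words

# Unbuffered style: one operation per word (what the text says would
# happen without a buffer) -- 1024 separate writes.
fd = os.open("unbuffered.bin", os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
for w in words:
    os.write(fd, w)
os.close(fd)

# Buffered style: the words are collected first and written in one go.
with open("buffered.bin", "wb") as f:
    f.write(b"".join(words))
```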
-
The larger the cache, the faster the hard drive can read and write.
-
Caches are divided into L1 (level-1) and L2 (level-2) caches. When running, the CPU reads data from the L1 cache first, then from the L2 cache, and only then from main memory and virtual memory, so the capacity and speed of the caches directly affect CPU performance. The L1 cache is built into the CPU and runs at the same speed as the CPU, which effectively improves the CPU's efficiency.
The larger the L1 cache, the more efficient the CPU, but its capacity is kept small by the constraints of the CPU's internal structure. The L2 cache also has a large effect on CPU efficiency. Today the L2 cache is generally integrated into the CPU, but it comes in two forms: an L2 cache integrated on the die runs at the same frequency as the CPU (a full-speed L2 cache), while an L2 cache outside the die runs at half the CPU's operating frequency (a half-speed L2 cache) and is therefore less efficient.
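The lookup order described here (L1 first, then L2, then main memory) can be modelled with a toy Python sketch. The dictionary "levels", the made-up contents of memory, and the idea of promoting data into L1 are simplifications for illustration; none of the names correspond to real hardware interfaces.

```python
# Toy model of the lookup order described above: the CPU tries L1 first,
# then L2, then main memory.  Sizes and contents are made-up numbers.
l1 = {}                                                # smallest, fastest
l2 = {}                                                # larger, slower
memory = {addr: addr * 2 for addr in range(1000)}      # pretend RAM

def load(addr):
    if addr in l1:
        return l1[addr], "L1 hit (fastest)"
    if addr in l2:
        l1[addr] = l2[addr]            # promote into L1 for next time
        return l2[addr], "L2 hit"
    value = memory[addr]               # slowest path: main memory
    l2[addr] = value
    l1[addr] = value
    return value, "memory access"

print(load(42))   # first access goes all the way to "memory"
print(load(42))   # second access is an L1 hit
```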
The CPU accesses data very quickly and can process on the order of a billion instructions and data items per second.
Caching is very important for CPUs.
So what exactly is this cache?
Enterprise-class hard drives are usually SCSI hard drives with high spindle speeds.
Cache memory is a memory chip on the hard disk controller with extremely fast access speed; it acts as a buffer between the hard disk's internal storage and the external interface. Because the hard disk's internal data transfer speed differs from the transfer speed of the external interface, the cache bridges the two. The size and speed of the cache are directly related to the hard disk's transfer speed and can greatly improve the overall performance of the hard disk.
This also reduces how often the page file on the system disk has to be read and eases the load on the system disk; the maximum value cannot exceed the free space remaining on the current hard disk.
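The last constraint, that the value may not exceed the drive's remaining free space, can be checked programmatically. This sketch simply clamps a requested size to the free space reported for an example path; the function name, the path, and the 8 GiB request are illustrative values.

```python
# Sketch of the constraint above: a user-chosen cache size is clamped
# so it never exceeds the free space left on the drive.
import shutil

def clamp_cache_size(requested_bytes, drive="/"):   # "/" is an example path
    free = shutil.disk_usage(drive).free
    return min(requested_bytes, free)

print(clamp_cache_size(8 * 1024**3))   # ask for 8 GiB, get at most the free space
```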