Cache

We explain what the cache is, what types exist, how it works, and what the advantages of this alternate memory are.

The cache stores data temporarily.

What is the cache?

In computing, cache memory or fast-access memory is the name given to one of the resources a CPU (Central Processing Unit) uses to temporarily store recently processed data in a special buffer, that is, in an auxiliary memory.

Cache memory works in much the same way as the computer's main memory, but at greater speed despite being much smaller. Its effectiveness gives the microprocessor extra time to access the most frequently used data without having to fetch it from its place of origin every time it is needed.

This alternate memory thus sits between the CPU and the RAM (Random Access Memory), giving the system an additional boost in speed while saving resources. Hence its name, which in English means “hiding place”.

There are several types of cache, including the following:

  • Disk cache. A portion of RAM associated with a particular disk, where recently accessed data is stored to speed up loading.
  • Track cache. Similar to RAM, this robust type of cache, used by supercomputers, is powerful but expensive.
  • Web cache. It stores data from recently visited websites to speed up their subsequent loading and save bandwidth. This type of cache can in turn serve a single user (private), several users at the same time (shared), or an entire network managed by a server (gateway).

How does the cache work?

The cache provides access to copies of the data, not the originals.

The way this alternate memory operates is simple: when we access any data on our computer system, a copy of the most relevant data is immediately created in the cache, so that subsequent accesses to that information find it at hand and do not have to fetch it again from its place of origin.

Thus, by accessing the copy rather than the original, processing time is saved and speed is gained, since the microprocessor does not have to go to main memory all the time. It is, so to speak, a constantly updated working copy of the most frequently used data.
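
To make the idea concrete, here is a minimal sketch in Python, purely for illustration and not tied to any particular system: a made-up slow_read function stands in for the place of origin of the data, and a simple dictionary plays the role of the cache.

    import time

    def slow_read(key):
        # Simulated "place of origin" (main memory, a disk, a remote server):
        # a deliberately slow lookup, purely for illustration.
        time.sleep(0.1)
        return "value for " + key

    cache = {}                      # the cache: a small, fast working copy

    def cached_read(key):
        if key in cache:            # cache hit: the copy is at hand, no slow trip needed
            return cache[key]
        value = slow_read(key)      # cache miss: go back to the place of origin...
        cache[key] = value          # ...and keep a copy for the next access
        return value

    cached_read("page.html")        # first access: slow, the copy is created
    cached_read("page.html")        # second access: fast, served from the copy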

Clearing the cache doesn't delete your files

Clearing the cache does not alter the information on the hard drive.

Like any memory, the cache can fill up or become so disorganized that checking whether a requested piece of data is available in the cache (a check every microprocessor performs routinely) is delayed. This can slow down the machine, producing exactly the opposite of the intended effect, or it can cause cache read or copy errors.

Whatever the case, you can clear the cache manually, asking the system to free up the alternate space and refill it as needed. This operation does not alter the content of our information on the hard drive at all, much less in our email or social network accounts. The cache is a working copy, and deleting it simply leaves us facing the original, identical but in another location.
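
Continuing the illustrative sketch above, clearing the cache only discards the working copies; the original data, simulated here by the same slow_read function, remains untouched at its source.

    # Clearing the cache only discards the copies it holds.
    cache.clear()                   # the working copies are gone...
    print(slow_read("page.html"))   # ...but the original data is still intact at its source
    cached_read("page.html")        # the next access is slower, and simply refills the cache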

Advantages of clearing the cache

It is recommended to clear the cache on a regular basis.

Freeing the cache serves two fundamental purposes:

  • Eliminating old or unnecessary data (since we do not always use the same data in the system), such as old files or processes that we will not need again but that are kept there "just in case" to speed up their execution.
  • Accelerating and streamlining the system by giving it new free space into which to copy the data currently in use, shortening processing times.

This maintenance work should be done periodically, though not to excess, as that would prevent the cache from fulfilling its mission.

If we erase it continually, the data it held will have to be found at its original location and copied over again, resulting in longer processing times for each program.
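
To make the trade-off concrete, we can extend the same illustrative sketch and time five accesses to a made-up file, first reusing the copies and then clearing the cache before every access.

    import timeit

    def reuse_the_copies():
        cache.clear()                   # start empty, then let the copies do their job
        for _ in range(5):
            cached_read("report.pdf")   # one slow read, then four fast hits

    def clear_every_time():
        for _ in range(5):
            cache.clear()               # throw the copy away before each access...
            cached_read("report.pdf")   # ...so all five reads go back to the source

    print(timeit.timeit(reuse_the_copies, number=1))   # about 0.1 s with the simulated source
    print(timeit.timeit(clear_every_time, number=1))   # about 0.5 s: five slow reads instead of one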
