We explain what precision is and why it matters in measurement, with examples and its differences from accuracy.
That which is precise consistently obtains correct results.
What is precision?
In general, when we talk about the precision of something or someone, we mean its capacity to hit the target, that is, to obtain the expected results or results very close to them. Although in everyday speech it can be a synonym for accuracy, the two terms should not be confused.
The word precision comes from the Latin praecisionis, derived from the verb praecidere, which can be translated as "to cut well", "to cut at both ends" or "to separate completely by cutting away what is left over". This verb was composed of the elements prae- ("forward" or "in advance") and caedere ("to cut" or, at times, "to kill").
Originally this word was used to refer to what had been cut or severed from the body (eunuchs, for example, were called praecisus, "cut off"), while its current meaning comes from its figurative application in rhetoric, that is, in oratory.
There, praecisus referred to what was "well cut", that is, well defined and well focused, and which therefore kept to the subject at hand in the best way. In other words, that which is pertinent, which keeps to what is necessary.
Thus, today we refer to precision as the ability to hit the target, or close to it, across different attempts. For example, a darts player has three chances to throw at the bullseye, and once he has done so, he can judge how close to the center his throws landed and how precise he was.
This type of value can be of great importance in scientific disciplines, engineering, and statistics.
Precision in measuring instruments
Measuring instruments are the tools and devices that allow us to express a given natural magnitude in numerical values. These measurements can be more or less precise; that is, they carry a certain margin of error attributable to contextual and unpredictable factors. Thus, a set of measurements can vary from one reading to another even though the same magnitude is being measured.
Let us imagine, by way of example, that we take our body temperature with a thermometer, and that we do so several times to be sure there is no inadvertent error. If we notice that the measurements all tend toward the same value (the real temperature, or in any case the estimated value), we will know that it is a precise thermometer, that is, that it registers its values quite consistently.
In other words, an instrument that consistently measures the same way is precise. On the other hand, if the temperature varies wildly between one measurement and another, we must conclude that the thermometer has lost its necessary precision, since some measurements will be closer to the real value and others far from it. And how do we know which is which?
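The thermometer scenario can be sketched numerically. In this sketch (an illustration, not a standard metrology formula set), we treat precision as the spread of repeated readings and accuracy as the distance between their mean and the true value; the instrument names and figures are made up:

```python
import statistics

def spread(readings):
    """Precision proxy: standard deviation of repeated readings.
    A smaller spread means a more precise instrument."""
    return statistics.stdev(readings)

def bias(readings, true_value):
    """Accuracy proxy: distance between the mean reading and the true value.
    A smaller bias means a more accurate instrument."""
    return abs(statistics.mean(readings) - true_value)

# Repeated body-temperature readings from two hypothetical thermometers
true_temp = 37.0
precise_thermo = [36.5, 36.5, 36.6, 36.5]   # consistent readings, but biased low
erratic_thermo = [36.0, 38.1, 37.0, 35.9]   # centered on the truth, but scattered

print(spread(precise_thermo), bias(precise_thermo, true_temp))
print(spread(erratic_thermo), bias(erratic_thermo, true_temp))
```

The first thermometer is precise (small spread) yet not very accurate; the second is the reverse, which is exactly the ambiguity the paragraph above points out.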
Examples of precision
As examples, we can consider some cases in which precision is a determining factor:
- Every batter in a professional baseball league has a batting average, a measure of his batting performance. This average is a numerical approximation of his precision at bat, that is, of how many hits he gets out of all his at-bats in a game.
- A soldier training for war fires 100 rounds from his rifle at a target. He then goes and checks the number of hits on the dummy, and can thus estimate his precision, that is, how many shots hit or came close to the mark, and how many missed.
- During a medieval siege, catapult operators attempt to hurl stones at the enemy walls. But the catapult is poorly calibrated, and each rock they launch follows a different trajectory: some hit the walls, others land in the nearby river, and others fall on the battlefield, where they crush allied troops. Logically, it is a very imprecise catapult, since its shots do not tend to land where it was aimed.
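The first two cases above reduce to the same arithmetic: hits divided by attempts. A minimal sketch, with made-up numbers, shows the calculation:

```python
def hit_rate(hits, attempts):
    """Fraction of attempts that hit the target (e.g. a batting average)."""
    if attempts == 0:
        raise ValueError("no attempts recorded")
    return hits / attempts

# Hypothetical figures for the examples in the text
batting_average = hit_rate(hits=30, attempts=100)   # written ".300" in baseball
soldier_hit_rate = hit_rate(hits=85, attempts=100)  # 85 of 100 rounds on target
print(batting_average, soldier_hit_rate)
```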
Precision and accuracy
In the sciences, in engineering, and in statistics, it is important to distinguish the notion of precision from that of accuracy, even though in everyday speech they are often used as synonyms. This difference is particularly important when it comes to understanding or interpreting the results obtained during a measurement, and it rests on the following:
- Precision, as we have seen, is determined by the capacity of an instrument or measurement technique to record similar values across a number of successive measurements, since these can vary from one another within a margin of error. The closer the measurements are to one another, the greater the precision of the device.
- Accuracy, on the other hand, has to do with the closeness of the measurements to the expected or real value. In other words, it reflects how close a measurement is to reality. The closer the measurements are to the expected or actual value, the more accurate the instrument.
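The two definitions above can be contrasted in code. Assuming we characterize a series of shots by its spread (precision) and by the distance of its center from the target (accuracy), a short sketch separates the possible combinations; the tolerance thresholds are arbitrary, chosen purely for illustration:

```python
import statistics

def describe(shots, target, spread_tol=0.5, bias_tol=0.5):
    """Classify a series of one-dimensional shot positions relative to a target.
    Low spread -> precise; low bias -> accurate. Thresholds are illustrative."""
    spread = statistics.stdev(shots)             # how tightly the shots cluster
    bias = abs(statistics.mean(shots) - target)  # how far the cluster sits from the target
    precise = spread <= spread_tol
    accurate = bias <= bias_tol
    return f"{'precise' if precise else 'imprecise'} and {'accurate' if accurate else 'inaccurate'}"

target = 10.0
print(describe([10.1, 9.9, 10.0, 10.2], target))   # tight cluster on the target
print(describe([12.1, 11.9, 12.0, 12.2], target))  # tight cluster, wrong place
print(describe([8.0, 12.0, 9.5, 10.5], target))    # scattered, but centered
```

Note that a set of shots can be precise without being accurate, and vice versa, which is why the two notions must be measured separately.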
This difference can be easily understood with an example. Suppose a golfer tries to make a hole in one to break a record. Although he is a good golfer, many variables influence his shots: the wind, the humidity, the condition of the golf ball, the force he puts into the swing; so he will have to try many times before he finally succeeds.
If we judge how close to the hole his balls have landed, we obtain a measure of his accuracy, since we know that the reference value is the hole itself. On the other hand, if we look at how tightly his shots cluster together across the total number of attempts made, we can gauge his precision, that is, what margin of error his shots show overall.