**Benford’s law**, also known as the **first-digit law** or the **law of anomalous numbers**, states that in many sets of numbers drawn from real life, the leading digit is 1 far more often than any other digit, and that the larger the first digit, the less likely it is to occupy that position. The law applies to events in the natural world as well as to social data.

Benford’s Law is framed within the __theorems of information theory__ and is today an invaluable tool in many fields of science.

Some even speculate that Benford’s Law, logarithmic distributions, and invariance under changes of scale could figure in the “__Theory of Everything__” that physicists long to discover, a search that for many amounts to the quest for the __Holy Grail__.

Summary


- 1 Introduction
- 2 Background
- 3 Stigler’s Law: Frank Benford rediscovers Newcomb
- 4 Use of Benford’s law
- 5 References

Introduction

We are used to living with statistical explanations: we are told that height follows a normal distribution and that rare events follow a __Poisson distribution__, and we accept these facts through the techniques of statistical inference. Yet Benford’s Law remains little known, even though it describes many natural phenomena with remarkable accuracy.

Just as the __normal distribution__ was born to explain the errors that occurred in astronomical measurements, Benford’s Law was studied to explain the frequency of the first significant digit of the data under study.

Background

In the 19th century there were neither computers nor calculators. All calculations were done by hand or, at best, with __abacuses__ or __slide rules__. Books and tables with the values of functions in common use, such as __sines__, __cosines__ and __logarithms__, were published to ease the work of scientists, engineers, sailors, and others.

Through dedication and professionalism, a good librarian has always observed that in books, magazines, tables, and the like, most readers who start a volume never finish it: the first pages end up far more worn than the last, which remain almost like new.

This was surely what occurred to the Canadian astronomer __Simon Newcomb__ when, in 1881, he realized that the same thing happened with the __logarithm tables__ of the __Office of the Nautical Almanac__ of the United States __Naval Observatory__, of which he was director.

The logarithm tables were in constant use, and Newcomb noted that their first pages were manifestly more worn than the final ones. From this he deduced that the leading digits of the numbers looked up (at least by those who had consulted the tables in the course of his work) were apparently not equiprobable: 1 appeared as the most frequent initial digit, followed by 2, and so on down to 9, the least frequent. A logarithm table is not a book that one reads from start to finish.

In logarithm tables the numbers are ordered, so those whose first digit is 1 are grouped on the first pages, while those starting with 9 are at the end. Newcomb deduced that the initial digits (excluding zero) of the numbers consulted in those tables were not equally probable; rather, the probability decreases from 1 to 9: 1 is the most frequent digit and 9 the least.

Without offering a formal proof, Newcomb stated a logarithmic law for the occurrence of numbers: “the law of probability of the occurrence of numbers is such that the mantissae of their logarithms (that is, their fractional parts) are equally probable.”
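Newcomb’s formulation can be checked numerically: if the mantissas (fractional parts) of the base-10 logarithms of the data are uniformly distributed, the leading digits follow the logarithmic law. A minimal sketch, using a synthetic sample built so that its log-mantissas are uniform by construction:

```python
import math
import random

random.seed(0)

# Synthetic sample: x = 10 ** (k + u), with u uniform on [0, 1), so the
# mantissas of log10(x) are uniform, exactly as in Newcomb's statement.
sample = [10 ** (random.randint(0, 5) + random.random()) for _ in range(100_000)]

def leading_digit(x: float) -> int:
    """First significant (nonzero) digit of a positive number."""
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

counts = {d: 0 for d in range(1, 10)}
for x in sample:
    counts[leading_digit(x)] += 1

# Uniform mantissas imply the logarithmic first-digit law.
for d in range(1, 10):
    print(d, round(counts[d] / len(sample), 3), round(math.log10(1 + 1 / d), 3))
```

The observed frequencies line up with log10(1 + 1/d) for each digit d, which is the content of Benford’s Law stated below.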

Newcomb published this hypothesis under the title *Note on the Frequency of Use of the Different Digits in Natural Numbers*, The American Journal of Mathematics 4 (1881), pp. 39-40.

Stigler’s Law: Frank Benford Rediscovers Newcomb

Benford’s law, named in honor of the General Electric engineer __Frank Benford__, who stated it in 1938 as the “Law of Anomalous Numbers” or **Law of the First Digit**, also confirms __Stigler’s Law__, which *holds that a scientific discovery or law is usually not named after its original discoverer*: the first to observe it and leave a written record, in 1881, was the American astronomer __Simon Newcomb__.

But Newcomb’s observation about the probabilities of the initial digits of numbers was forgotten until 1938, when the American physicist __Frank Benford__, in an article titled “The Law of Anomalous Numbers” (Proc. Amer. Phil. Soc. 78, pp. 551-572), made the same observation in the logarithm tables. After empirically checking more than 20,000 numbers from 20 different samples, including river areas, town populations, stock prices, physical constants, molecular weights, mathematical constants, death rates, street address numbers, and even numbers drawn from a magazine, he postulated the law of anomalous numbers, now known as Benford’s Law.

Benford found experimentally that the first nonzero digit n of a number drawn from real-world data appears with logarithmic probability:

P(n) = log10(n + 1) - log10(n) = log10(1 + 1/n),  for n = 1, ..., 9
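The nine probabilities given by this formula can be tabulated directly; a quick sketch:

```python
import math

# First-digit probabilities predicted by Benford's law: log10(n+1) - log10(n)
benford = {n: math.log10(n + 1) - math.log10(n) for n in range(1, 10)}

for n, p in benford.items():
    print(f"P(first digit = {n}) = {p:.3f}")

# The terms telescope, so the nine probabilities sum to log10(10) = 1.
print(f"total = {sum(benford.values()):.3f}")
```

This yields roughly 0.301 for digit 1 down to 0.046 for digit 9, the frequencies quoted later in this article.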

“… the frequency of occurrence of a given initial digit, or equivalently the probability that this digit takes a given value, decreases steadily as the digits increase from 1 to 9 …”

The logarithmic law is the only probability distribution that satisfies the double condition of invariance under changes of scale and of base.

Benford’s law exhibits invariance under changes of scale, revealed in self-similarity, that is, the resemblance of the parts to the whole, establishing a parallel with __Mandelbrot fractals__. Understanding the origin of scale invariance has been one of the fundamental tasks of modern statistical science. Surprising as it may seem, Benford’s law predicts the above frequencies for certain sets or lists of numbers, provided the data satisfy a series of conditions (for example, independence, invariance of scale and base with respect to the law, and spanning several orders of magnitude), systematized in 1995 by __T. Hill__; later, in 2008, __Nicolas Gauvrit__ and __Jean-Paul Delahaye__ proposed more general and simpler conditions.
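Scale invariance can be illustrated empirically: multiplying a Benford-distributed data set by any positive constant leaves the first-digit frequencies essentially unchanged. A sketch, assuming (as the conditions above require) data spanning several orders of magnitude:

```python
import math
import random

random.seed(1)

def leading_digit(x: float) -> int:
    # First character of the scientific-notation form, e.g. '3.14e+02' -> 3
    return int(f"{abs(x):e}"[0])

def first_digit_freq(values):
    counts = {d: 0 for d in range(1, 10)}
    for v in values:
        counts[leading_digit(v)] += 1
    return {d: c / len(values) for d, c in counts.items()}

# Log-uniform sample over six orders of magnitude: it follows Benford's law,
# and multiplying every value by a constant should not change the frequencies.
data = [10 ** random.uniform(0, 6) for _ in range(50_000)]
scaled = [3.7 * x for x in data]   # 3.7 is an arbitrary scale factor

freq_data = first_digit_freq(data)
freq_scaled = first_digit_freq(scaled)
for d in range(1, 10):
    print(d, round(freq_data[d], 3), round(freq_scaled[d], 3))
```

Both columns stay close to log10(1 + 1/d): rescaling shifts which numbers start with which digit, but not the overall distribution.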

Counterintuitive as it is, Benford’s law is not truly a “law” or mathematical theorem but an empirical observation. In a list of statistical data meeting certain conditions, the most frequent first significant digit is 1 (30.1% of observations), followed by 2 (17.6%), 3 (12.5%), and so on down to 9 (4.6%), in line with a logarithmic distribution.

Using Benford’s Law

Benford’s Law, although little known, is used to detect fraud in financial statements and tax returns, once the transactions made by a company are known.

If the first significant digits of the accounting data recorded in the incoming and outgoing entries follow Benford’s law, the statement has probably not been manipulated.

In general, whoever fakes accounting data, or commits fraud with socioeconomic data, tends to distribute the significant digits relatively uniformly without realizing it. __Mark J. Nigrini__ has developed effective tests based on Benford’s Law to detect tax and accounting fraud.
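One common form such testing takes is a goodness-of-fit check of the observed first-digit counts against the Benford frequencies; the sketch below uses a chi-square statistic (this is an illustration of the general idea, not Nigrini’s exact procedure, and the two ledgers are hypothetical). The threshold 15.51 is the 5% chi-square critical value for 8 degrees of freedom.

```python
import math
import random
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(amount: float) -> int:
    return int(f"{abs(amount):e}"[0])

def benford_chi_square(amounts):
    """Chi-square statistic of observed first-digit counts against Benford."""
    n = len(amounts)
    observed = Counter(first_digit(a) for a in amounts)
    return sum((observed.get(d, 0) - n * p) ** 2 / (n * p)
               for d, p in BENFORD.items())

def looks_manipulated(amounts, critical=15.51):
    # 15.51 is the 5% chi-square critical value for 8 degrees of freedom.
    return benford_chi_square(amounts) > critical

# Hypothetical ledgers: one spanning several orders of magnitude
# (Benford-like), one with artificially uniform leading digits.
random.seed(2)
honest = [10 ** random.uniform(1, 5) for _ in range(2_000)]
faked = [random.uniform(100, 999) for _ in range(2_000)]
print(looks_manipulated(honest), looks_manipulated(faked))
```

The uniform-digit ledger produces a very large statistic and is flagged, while the Benford-like one passes, matching the intuition that fraudsters inadvertently flatten the digit distribution.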

Another possible use is searching for data on a computer. It has been verified that the first significant digit of file sizes in bytes follows Benford’s law, so if the files are ordered by their first digit (rather than by size), there is roughly a 30% chance that the file sought is among those whose first significant digit is 1.
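This observation is easy to reproduce. A sketch that walks a directory tree and tallies the first significant digit of each file’s size in bytes (the path in the example is a placeholder):

```python
import os

def leading_digit(n: int) -> int:
    """First digit of a positive integer."""
    while n >= 10:
        n //= 10
    return n

def first_digit_counts(root: str) -> dict:
    """Tally the first significant digit of every file size under root."""
    counts = {d: 0 for d in range(1, 10)}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                size = os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # broken symlink, permission error, etc.
            if size > 0:
                counts[leading_digit(size)] += 1
    return counts

# Example (the path is a placeholder):
# counts = first_digit_counts("/home/user")
# total = sum(counts.values())
# print({d: round(c / total, 3) for d, c in counts.items()})
```

On a typical disk, digit 1 ends up with close to 30% of the files, in line with the claim above.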

Interestingly, the Newcomb-Benford pattern also holds for any 200 or more terms of the __Fibonacci sequence__, whether the original (1, 1, 2, 3, 5, 8, …) or a sequence obtained from two seed integers chosen at random (3, 7, 10, 17, 27, 44, …).
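This is straightforward to verify, since the leading-digit distribution of the Fibonacci sequence is deterministic. A sketch counting the first digits of the first 200 terms:

```python
import math

def fibonacci_like(n: int, a: int = 1, b: int = 1):
    """First n terms of a Fibonacci-like sequence with seeds a, b."""
    terms = []
    for _ in range(n):
        terms.append(a)
        a, b = b, a + b
    return terms

def leading_digit(n: int) -> int:
    return int(str(n)[0])

counts = {d: 0 for d in range(1, 10)}
for t in fibonacci_like(200):
    counts[leading_digit(t)] += 1

for d in range(1, 10):
    print(d, counts[d] / 200, round(math.log10(1 + 1 / d), 3))
```

Running the same tally with `fibonacci_like(200, 3, 7)` gives a very similar distribution, as the text claims for randomly chosen seeds.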

Benford’s law has also become a tool for testing the quality of nuclear decay models, by checking whether their experimental distributions reproduce the Newcomb-Benford law.

The __Boltzmann-Gibbs__, __Fermi-Dirac__ and __Bose-Einstein__ statistics fluctuate slightly around, or conform exactly to, the Benford distribution, and their intermediate values and integrals converge to Benford’s law exactly. The Benford distribution thus appears to be a general pattern in statistical physics. It is generally believed that the more chaotic and heterogeneous the probability distributions, the better the overall data set fits Benford’s law. Through a bold conjecture, one may thus postulate that Benford’s law is a fact of nature and can be used as an indicator of the randomness of our world.