Grosch’s Law

Grosch’s Law. Named after Herbert Grosch, an astronomer who worked at IBM’s Watson Laboratory, this law notes that, while the processing power of chips has been increasing year after year (Moore’s Law), their prices have also been falling steadily.

Summary


  • 1 Background of the Law
  • 2 Curiosity between Moore’s Law and Grosch’s Law
  • 3 Importance of Grosch’s Law
  • 4 Observations
  • 5 Sources

Background of the Law

Many information “laws”, most of them empirical observations about how information technologies evolve or about how we use information at a personal or organizational level, are frequently cited in conferences, courses and articles. Perhaps the oldest and best known are Moore’s “law” on the evolution of transistor density on a chip, and Grosch’s “law”.

In the early 1960s, computer design was running into diminishing returns. The prevailing idea at the time was to equip CPUs with as many instructions as possible, so that programs could be as small and as economical with memory as possible. This, of course, produced very complex computers, especially in an era when CPUs were built by soldering individual transistors by hand.

One solution to this problem, first explored at that time, was overlapping, which led to what we now know as instruction pipelining: it allows a CPU to work on small parts of several instructions at the same time.
The other solution was parallel computing: building a computer out of a number of general-purpose CPUs. The idea is for the computer to keep all the CPUs busy, asking each one to work on a small part of the problem and then collecting the partial results into the final answer.
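
A minimal sketch of that idea in modern terms follows; the task, the chunk sizes and the worker count below are illustrative assumptions, not part of the original text.

```python
# Split a problem into small pieces, keep several general-purpose CPUs
# busy on them, then combine the partial results into the final answer.
from multiprocessing import Pool

def partial_sum(bounds):
    """Work on one small piece of the problem: sum one sub-range."""
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    n = 10_000_000
    workers = 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)          # make sure the last chunk reaches n

    with Pool(processes=workers) as pool:    # roughly one process per CPU
        partials = pool.map(partial_sum, chunks)

    print(sum(partials))                     # collect the results into the final answer
```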

During the last two decades, Moore’s Law has been systematically fulfilled: the number of transistors on each silicon chip doubles every 18 months, increasing its processing and/or storage capacity. Simultaneously, Grosch’s Law has been fulfilled: the price of those same chips is halved every three or four years. Both laws have informed best practices in software engineering to this day.
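
As a rough illustration of how those two rates compound, the following sketch simply restates the figures above (doubling every 18 months, halving every 4 years) over a hypothetical 20-year span; nothing else is assumed.

```python
# Compound the two stated rates over 20 years: transistor counts doubling
# every 18 months (Moore) and chip prices halving every 4 years (as the
# text attributes to Grosch). Purely illustrative arithmetic.
years = 20
transistor_growth = 2 ** (years * 12 / 18)   # one doubling every 18 months
price_factor = 0.5 ** (years / 4)            # one halving every 4 years

print(f"Transistors per chip: x{transistor_growth:,.0f}")  # roughly x10,000
print(f"Chip price:           x{price_factor:.3f}")        # roughly x0.03
```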

Curiosity between Moore’s Law and Grosch’s Law

Moore’s Law was formulated in the same year, 1965, as Grosch’s Law.

In 1965, Electronics Magazine turned 35, and Moore was asked for an article predicting what electronics would look like in the near future, roughly 10 years out. Moore looked at integrated circuits, which were then 4 years old, and at how they had evolved so far. He noted that the number of transistors and resistors per chip was doubling each year. So that is exactly what he predicted:

“The number of components of an integrated circuit will continue to double every year, and in 1975 they will be a thousand times more complex than in 1965.”

Doubling every year for ten years is a factor of 2^10 ≈ 1024, which is the thousandfold increase Moore predicted. Moore’s Law is cited frequently whenever new microprocessors reach the market, and it is worth knowing in depth. Grosch’s law, in turn, is usually stated as:

“The performance of a computer increases with the square of the cost. If computer A costs twice as much as computer B, computer A should be expected to be four times faster than computer B.”

The law has been formulated with the expression C = k·s^(1/2), that is, cost proportional to the square root of speed, where C is the cost of the processor, s is its speed, and k is a proportionality constant. Thus, taking into account that Moore’s Law tells us that processor speed doubles every 18 months, or every 24, depending on the source consulted (that is, it roughly quadruples every 3 or 4 years), the cost of a given amount of processing power is halved every 3 to 4 years.
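
A minimal sketch of that arithmetic, assuming a proportionality constant k that stands for the technology of the day (k is introduced here only to make the reasoning explicit):

```latex
% Grosch: performance grows as the square of cost,
% equivalently cost grows as the square root of speed.
\[
  s \propto C^{2}
  \qquad\Longleftrightarrow\qquad
  C = k\,s^{1/2}
\]
% Moore: at a fixed price point, speed roughly quadruples every 3--4 years.
% For the same cost C to buy 4s instead of s, the constant must halve:
\[
  C = k\,s^{1/2} = k'\,(4s)^{1/2} = 2k'\,s^{1/2}
  \quad\Longrightarrow\quad
  k' = \tfrac{k}{2}
\]
% Hence the cost of a given amount of processing power halves every 3 to 4 years.
```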

Importance of Grosch’s Law

Until not long ago, centralized systems were the order of the day. For years, people followed the advice of Dr. Herbert Grosch, who argued that the larger the system, the better the performance-to-price ratio. Known as Grosch’s Law, this belief reigned in the industry for decades.

So much so that the fitted statistical curve is that performance is proportional to the square of the cost, the Grosch Law of 1965.

On the one hand, Grosch’s Law is today regarded as historical evidence of the misperceptions computer science has held.

On the other hand, over time microprocessors burst into the “glass house” era of the mainframe, bringing with them performance improvements and a drastic reduction in hardware costs that made it possible to pack a great deal of processing power into small, distributed servers. Many thought this would be the end of Grosch’s Law.

Distributed computing became the norm for many, who installed PC LANs and distributed servers spanning the entire enterprise. Although this model is still appropriate for some companies, there are many that are beginning to question the cost and complexity of managing distributed systems.

Grosch’s Law fit well with the mainframe technology of its time and led many organizations to buy a single machine, the largest they could get. With microprocessor technology, Grosch’s law is no longer valid. For a few hundred dollars, it is possible to buy a CPU chip that can run more instructions per second than one of the biggest mainframes of 1980 did.

Observations

There are dozens, if not hundreds, of “laws” enunciated like the ones mentioned here. They do not constitute scientific “laws” at all, but conclusions derived from repeated observation of situations for which a reasonable or reasoned explanation is not yet available.

 
