The Effect of Moore’s Law on Technology

Scientific and technological innovation is advancing at an ever-increasing pace. This has never been as clear as in the last few decades, with the introduction of computers, medical advances (and thus increasing life expectancy), new forms of energy, artificial intelligence, and more. Groundbreaking discoveries are made on a virtually daily basis. Never before have human knowledge and abilities evolved at such a tremendous speed.

This is due to many factors: the use of new technologies (from direct tools such as construction machinery to indirect innovations such as the increased possibilities for collaboration offered by the internet and computer software), globalization (allowing more brainpower to collaborate on common projects), free markets and economic advances (boosting innovation as a way to create profit, while also providing a stable environment that reduces the risk of long-term investments), and social factors (individualism driving people to seek recognition through their intellectual and innovative contributions).

Although not every factor contributes equally, and not every factor continues to evolve (while, on the other hand, new factors also come into existence), one can recognize a more or less stable exponential trend in innovation.

One of the most famous and obvious examples is Moore’s Law, which observes that the number of transistors that can be placed on an integrated circuit for the same price doubles roughly every two years. Moore’s Law is no exception, though; similar observations have been made in countless other fields, such as the number of nanotechnology citations, magnetic data storage density, and so on.

In general, we could say it took humankind thousands of years to invent the wheel, learn how to make fire, and discover where to find metal, whereas scientific evolution is now so fast that no single person could even find the time to read about every new advancement.

When people make forecasts, however, they usually use linear trends instead of the correct exponential ones.

In mathematics, when one wants to calculate the slope of an arbitrary function, one can use derivatives, which focus on an infinitesimally small part of the graph, so that the curve can be considered locally linear. Based on this linear approximation, one can closely approximate the true value of the slope.

But if one takes a relatively large chunk of the graph and applies a linear approximation there, the resulting value may be far from the correct one.

Until recently, a ten-year investment project covered a small span of time relative to the pace of scientific (and economic) progress, so an easy, intuitive linear trend was an acceptable tool.

But we are now entering the part of the graph where the curve is so steep that, in many sectors, a linear trend gives a poor to very poor prognosis of future conditions over a span of five years, or even just a few months. It is therefore important to learn to use exponential trends if we want to make good business decisions.
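
To make this concrete, here is a minimal sketch (with made-up numbers) of how a linear trend fitted to an exponentially growing series underestimates more and more severely the further out one extrapolates:

```python
# Illustrative only: a series that doubles every 2 years, forecast linearly.

def exponential(t, start=1.0, doubling_period=2.0):
    """Value of a quantity that doubles every `doubling_period` years."""
    return start * 2 ** (t / doubling_period)

# "Observed" history: years 0..4.
history = [(t, exponential(t)) for t in range(5)]

# Naive linear trend through the first and last observed points.
(t0, v0), (tn, vn) = history[0], history[-1]
slope = (vn - v0) / (tn - t0)

for t in (6, 8, 10):
    linear = vn + slope * (t - tn)
    actual = exponential(t)
    print(f"year {t:2d}: linear={linear:6.1f}  actual={actual:6.1f}  "
          f"off by x{actual / linear:.1f}")
```

The error factor grows without bound: the linear forecast is off by a factor of about 1.5 two years past the observed window, and almost 4 after six.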

This paper analyzes the different trends and considerations involved in making predictions about the future, and examines a number of business decisions to see whether companies take these trends into account: if so, how; if not, what their cost (or lost opportunity) might be.

Research on Scientific Theories

Introduction: A History of the Universe

The focus of this paper is on modern-day evolution: the evolution happening at this very moment, and its implications. It may be instructive to first take a step back and look at the universe, and how it has evolved since it came into existence.

Carl Sagan, a famed astronomer, created a “Cosmic Calendar” to offer a more comprehensible view of the universe’s progress. He asks the reader to imagine the entire history of the universe compressed into a single year.

In this scenario, the Big Bang would take place on the 1st of January. Not much would happen until the formation of our solar system on September 9th. The earth itself wouldn’t show signs of life until November, when the first multicellular organisms appear. Dinosaurs would start walking around on Christmas Eve, and humans would start walking upright at 9:24 PM on New Year’s Eve. The invention of agriculture wouldn’t come before 11:59:20 PM. Jesus Christ would be born at 11:59:56 PM. The Renaissance in Europe, which brought the emergence of the experimental method in science, would take place at 11:59:59 PM. And then there is the last second. Most scientific inventions originate from that very last second: electricity, computers, cars, the printing press, the Human Genome Project, nanotechnology, the internet, and more.

This demonstrates that exponential growth is by no means a new concept; it seems to be a very natural phenomenon at its core. Of course, one could argue that scientific (and technological) advancement is man-made, not a natural phenomenon at all, and that the theory therefore cannot be extrapolated to future (man-made) evolution. On the other hand, one could argue that man-made inventions are nothing but a prolongation of natural evolution, and that, although it might not seem a “natural” process, in fact it is.

This discussion is at an almost philosophical level. The next part of this chapter will analyze theories of (accelerating) evolution, grounded in contemporary trends in technology, economics, psychology and philosophy.

Technological Theories

Moore’s Law

One of the most famous and prominent examples of accelerating returns is “Moore’s Law”, which describes a long-term trend in the history of computing hardware. Since the invention of the integrated circuit in 1958, the number of transistors that can be placed inexpensively on an integrated circuit of a fixed size has increased exponentially, doubling approximately every two years.
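
Stated as a formula, Moore’s Law is a simple exponential. The sketch below uses the Intel 4004 of 1971 (roughly 2,300 transistors) as an illustrative starting point; real chip counts only scatter around this idealized curve:

```python
# Idealized Moore's Law curve; base figures are illustrative, not a dataset.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a fixed two-year doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"~{transistors(year):,.0f} transistors")
```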

Figure: Moore’s Law

Kurzweil’s Law of Accelerating Returns

Kurzweil’s “Law of Accelerating Returns” extends Moore’s Law to describe exponential growth in technological progress generally. Moore’s Law describes an exponential growth pattern in the complexity of integrated semiconductor circuits only; Kurzweil extends this to include technologies from long before the integrated circuit, as well as future forms of technology. According to Kurzweil, whenever a technology approaches some kind of barrier, a new technology will emerge that allows us to cross that barrier and enter a new paradigm. He predicts that such paradigm shifts have become and will continue to become increasingly common, leading to “technological change so rapid and profound it represents a rupture in the fabric of human history”.

According to Kurzweil, Moore’s Law of integrated circuits was not the first but the fifth paradigm to deliver accelerating price-performance in computing. Computing devices have been consistently multiplying in power (per unit of time) not since the invention of the integrated circuit, but since the mechanical calculating devices used in the 1890 U.S. Census: from Newman’s relay-based “Heath Robinson” machine that cracked the Nazi Lorenz cipher, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer.

Kurzweil’s point is that Moore’s Law applies not just to integrated circuits, but to all five paradigms. Ever since the mechanical calculating devices of the late 19th century, calculating power per unit of price has grown at a steady exponential rate.
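
As an illustration of what “accelerating” adds to a plain exponential, here is a toy model (my own construction, not Kurzweil’s formulation): each paradigm is assumed to last 20 years and to shrink the doubling period by an assumed factor, so total growth outpaces any single exponential:

```python
# Toy model of accelerating returns; all parameter values are assumptions.

def cumulative_doublings(paradigms, years_per_paradigm=20.0,
                         initial_doubling_years=3.0, speedup=0.8):
    """Total doublings accumulated across successive paradigms."""
    total, doubling = 0.0, initial_doubling_years
    for _ in range(paradigms):
        total += years_per_paradigm / doubling  # doublings within this paradigm
        doubling *= speedup                     # next paradigm doubles faster
    return total

for n in range(1, 6):
    print(f"after paradigm {n}: about 2**{cumulative_doublings(n):.0f} growth")
```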

According to Kurzweil, the next paradigm after the integrated circuit is the 3D-structured chip. This is essentially the same technology as an integrated circuit, but instead of silicon layers stacked on top of each other with an insulating layer between them, the layers are intertwined, creating many more connections between individual transistors and allowing speeds many times faster than what is possible today.

Figure: Kurzweil’s extension of Moore’s Law

Economic Theories

Schumpeter’s Creative Destruction

In “Capitalism, Socialism and Democracy”, Schumpeter used the term “Creative Destruction” to describe the process of transformation that accompanies radical innovation.

According to him, innovative input by entrepreneurs was the force that sustained long-term economic growth, even as it destroyed the value of established companies and laborers that enjoyed some degree of monopolistic power derived from previous technological, organizational, regulatory, and economic paradigms.


This means that companies that decide not to adopt new innovations will eventually be filtered out of the economy, giving way to companies that embrace new technologies.

With this theory, Schumpeter was the first prominent economist to describe how the economy and scientific progress go hand in hand, and how this progress must be taken into account when projecting how trends will evolve in the future.

Wikinomics

Tapscott introduces various new concepts, all revolving around mass collaboration in a business environment. The fundamental ideas underlying his work are openness, peering, sharing and acting globally.

These new concepts can only materialize when a company is in touch with its users, and when those users can adapt and enrich the product in such a way that it changes for everyone, even for those who have already bought it.

The author sees mass collaboration in business environments as a logical continuation of the trend toward outsourcing, i.e. externalizing functions to other business entities that were previously internal to the company. An important change that occurs with this kind of outsourcing is that an entity specifically designed to fulfill a unique function is replaced by a collaboration of free agents who come together to cooperate and solve a problem. This system can, but need not, be incentivized by a reward system. Tapscott refers to this type of outsourcing as “Crowdsourcing”.

The author also introduces new terms, such as “Prosumers”, a combination of the words “producer” and “consumer”. It denotes a business model in which users add value to the product by adding their own content. As an example, he describes how users of Second Life are expected to create their own avatar, their own house, and so on. This is a mandatory part of the game that enriches its diversity. The user-made creation doesn’t stop there: if users feel like it, they can design furniture, clothing, services and more, and even sell them for real money in the digital environment.

Another interesting trend he introduces is what he calls “Marketocracy”, a process obtained through a form of peering in a mutual fund. It works through the collective intelligence of the investment community, instead of classical hierarchical fund management under the lead of a superstar stock picker.

Tapscott also commented on Coase’s Law, which states that “a firm will tend to expand until the costs of organizing an extra transaction within the firm become equal to the costs of carrying out the same transaction on the open market”. According to Tapscott, the changing way the internet is used (mostly thanks to Web 2.0) has made transaction costs drop so significantly that the theorem should be inverted: “a firm will tend to expand until the costs of carrying out an extra transaction on the open market become equal to the costs of organizing the same transaction within the firm”. The author believes the costs of communication have fallen so much that companies that do not adapt to this new reality, and change their structure accordingly, will disappear. Thus, only companies that use mass collaboration will survive, according to Tapscott.
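
The logic of Coase’s Law, in either direction, reduces to a per-transaction cost comparison. A minimal sketch with hypothetical cost figures:

```python
# Coase's boundary condition for the firm (hypothetical numbers).

def keep_in_house(internal_cost: float, market_cost: float) -> bool:
    """Organize the transaction internally only while it is the cheaper option."""
    return internal_cost < market_cost

# Illustration: falling communication costs shrink the market cost over time,
# flipping the decision even though the internal cost stays stable.
internal = 100.0
for year, market in [(1990, 150.0), (2000, 110.0), (2010, 60.0)]:
    verdict = "organize internally" if keep_in_house(internal, market) else "outsource"
    print(year, verdict)
```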

Behavioral Economics

Origins

Kahneman and Tversky can be seen as the founders of behavioral economics. They identified numerous common human logical errors arising from heuristics and biases.

This relatively new science thus describes how classical economic theories have always been erroneous to a certain degree, by assuming humans to be rational beings.

In their paper “On the Psychology of Prediction”, Kahneman et al. explore the intuitive predictions humans make in both category prediction and numerical prediction. They argue that people, instead of using statistics, use a limited set of heuristics that usually lead to reasonable results, but sometimes to serious errors.

People seem to make predictions based on how representative the given evidence feels. Sometimes representativeness coincides with likelihood, but often it has no correlation with the reliability or likelihood of the evidence.

People often think in terms of similarity rather than statistical likelihood. They rely on descriptions, not on the level of predictability attached to them. In experiments, even when people were told that a qualitative description had low or high predictability, this had little effect on their statistical considerations.

In numerical prediction, people likewise assume the descriptions are highly accurate, even when they are told this is not the case. People fail to regress toward the mean even when told that the input data is very unreliable.

Another human flaw is our incapacity to think in exponential terms. Linear trends are easy to handle, but exponential ones seem somehow unnatural to the human brain, and people therefore tend to make very serious mistakes when predicting exponential outcomes.

Techniques for Correcting Judgmental Forecasting Errors

Goodwin & Wright point out that subjects grossly underestimate exponential growth in time series. This is an important finding, as it means that merely knowing and accepting that certain trends are exponential is still no guarantee that a subject will take appropriate measures.

Therefore, they summarize some techniques for improving “judgmental forecasting” beyond the direct application of extrapolations. The techniques examined include:

Making judgmental adjustments to quantitative extrapolations.

Deciding how to decompose the forecasting problem.

Combining judgmental and extrapolation forecasts (a simple version is sketched below).
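
A minimal sketch of that third technique, with illustrative numbers: the judgmental and statistical forecasts are blended by a weighted average (equal weights are a simple default here, not necessarily the authors’ recommended scheme):

```python
# Combining a judgmental forecast with a statistical extrapolation.

def combined_forecast(judgmental: float, statistical: float,
                      weight_judgmental: float = 0.5) -> float:
    """Weighted combination of two forecasts; equal weights by default."""
    return weight_judgmental * judgmental + (1 - weight_judgmental) * statistical

# Hypothetical example: an expert expects 120 units next year, while the
# extrapolation model says 180; an equal-weight combination splits the gap.
print(combined_forecast(120.0, 180.0))  # -> 150.0
```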

Their work provides advice only for two kinds of cases:

Judgmental forecasting where the subject has only information about the time series of interest.

Judgmental forecasting where the subject has both time series data and domain knowledge.

In other words, their advice is only valid for cases involving time series data, which is exactly what is relevant to our study of exponential growth and stakeholders’ perception of it.

Their work yields several conclusions. One recommends using graphical time series displays when short-term forecasts are required, provided the series is non-seasonal and the data noise-free. Some of the conclusions are counterintuitive but very interesting: training a person’s forecasting abilities proves ineffective, as the human brain seems unable to internalize the concept of exponential growth over time.

Although Goodwin & Wright’s work has contributed to reducing judgmental forecasting errors, they point out that much more research is still needed.

Philosophical Theories

Technological Determinism

Technological determinism is a reductionist philosophical theory which holds that technology shapes society, rather than the other way around. The development of technology is assumed to follow a predictable course, which can only modestly be influenced by cultural, economic, political or other factors. Society does not actively pick and choose which technologies to develop; instead, it organizes itself to continue developing a technology once it has been invented. This implies that technology has an effect on society, but not the other way around.

Both “Moore’s Law” and Kurzweil’s “Law of Accelerating Returns” are thus technologically deterministic, as they assume technology will continue to evolve at a steady rate, such that one can predict the date at which certain technologies will be available at a certain price, independently of what else happens in the world.

Conclusion of the Scientific Theories

We’ve been looking at diverse branches of scientific research.

History teaches us that evolution has been exponential since the beginning of the universe, although this was natural evolution, without human input; from this alone we cannot conclude that the same trend holds for human scientific progress.

Technological determinism holds that technology drives society, thereby implying that once an invention has been made, it is not human will that makes it evolve: society as a whole “wants” to continue research on the matter to improve its capabilities. Society will adapt to integrate the new technology, although it is not society that decides which technology it wants to have.

This explains how “Moore’s Law” and Kurzweil’s “Law of Accelerating Returns” can exist, and how these laws can quite accurately predict tomorrow’s technological power and its price tag, by extrapolating the current exponential evolutionary trend into the future.


The science of behavioral economics draws our attention to the fact that humans are not very good at intuitively understanding exponential trends, as they reach conclusions using a limited set of simple heuristics handicapped by biases. Acknowledging this human shortcoming, Goodwin & Wright reviewed a few techniques to help people make better predictions of exponential trends in time series.

When a business misjudges the importance of incorporating certain new technologies, it risks being swept out of the market by the process Schumpeter called “Creative Destruction”.

Tapscott describes a new evolution in business style. Not only is technological innovation important; businesses should also adapt the way they do business and create value, by using “Crowdsourcing”, “Prosumers”, and so on. Companies that do not conform to this new trend risk having their business destroyed for lack of creativity; in other words, falling victim to Schumpeter’s “Creative Destruction”.

Perception of Stakeholders (Industry / Entrepreneurs / Organizations) of Future Trends

Now that we have a better overview of current trends and theories, we can look for entities that have used, or could have used, this knowledge of exponential growth instead of the more intuitive (but wrong) assumption of linear growth.

Research on this matter is impossible to carry out through quantitative methods. I am therefore forced to use qualitative methods: looking at diverse examples of entities that have taken advantage of exponential trends, and what they gained by it, as well as examples of entities that have failed to take advantage of them, and what cost or missed opportunity this has caused.

Entities using Exponential Trends Successfully

In this section I will identify examples of companies, organizations and entrepreneurs that are successfully using exponential trends.

Human Genome Project

When the Human Genome Project started sequencing human DNA in 1990, there was much controversy about the completion date being set at 2005. At the sequencing speed possible with the technology available in 1990, the project would have taken over 100 years to complete. But taking the exponential trend (based on the Law of Accelerating Returns) into account leads to a very different prediction: the project would indeed be completed by 2005. In fact, the 2005 prediction still underestimated the exponential trend; the project was completed in April 2003.

As Kurzweil states: “The amount of genetic data that’s been sequenced has doubled every year since the human genome project began in 1990, and the cost per base pair has come down by half each year, from $10 in 1990 to about a penny today.”

This is an excellent example of a research project that successfully based its deadline on the theory of accelerating returns.

Note that the rate of acceleration in genome sequencing is not the same as the one described by Moore’s Law: the number of transistors per unit of money doubles every two years (Moore’s Law), whereas the amount of genetic data sequenced doubles every year.
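
A back-of-the-envelope sketch of why the deadline was realistic, using the round numbers above (a job worth 100 years of 1990-era work, with throughput doubling every year; the model is idealized, so it compresses the actual 13-year history):

```python
# Cumulative work under annually doubling throughput (assumed round numbers).

def years_to_finish(total_work_years=100.0):
    """Years until cumulative throughput covers the whole job.

    Year k contributes 2**k units of work, where one unit equals one year
    of 1990-era sequencing; cumulative work after n years is 2**n - 1.
    """
    done, rate, year = 0.0, 1.0, 0
    while done < total_work_years:
        done += rate
        rate *= 2
        year += 1
    return year

print(years_to_finish())  # -> 7: most of the work falls in the final years
```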

Gmail online storage

When Google launched its Gmail service in April 2004, offering each of its customers 1 GB of online storage, it seemed to many people like a true miracle: at the time, Microsoft was offering 2 MB of storage with its Hotmail service, and Yahoo 100 MB with Yahoo Mail.

In reality, Google did not yet have the hardware needed to provide that 1 GB per customer, but it had calculated that storage would become cheap enough to provide it by the time its customers’ accounts filled up and they really needed the space.
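
A sketch of that bet with hypothetical prices and fill rates: if the cost per gigabyte halves every two years while mailboxes fill slowly, the hardware cost of the promise stays small and keeps falling:

```python
# All figures are hypothetical; the point is the shape of the curve.

def cost_per_user(years_after_launch, launch_cost_per_gb=1.0, halving_years=2.0):
    """Hardware cost of the storage a user actually occupies at a given time."""
    used_gb = min(1.0, 0.1 * years_after_launch)  # assumed gradual fill rate
    price_per_gb = launch_cost_per_gb * 0.5 ** (years_after_launch / halving_years)
    return used_gb * price_per_gb

for t in (0, 2, 4, 6):
    print(f"year {t}: ~${cost_per_user(t):.3f} per user")
```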

The fact that Google took the rapidly falling price of hard disks into account obviously lured a lot of customers to its mail service. Since its introduction, Gmail has quickly become one of the biggest free webmail services available.

As the technique proved very successful, Google kept alive this tradition of giving huge (partly not-yet-existing) amounts of storage to its customers. At the time of writing, Google offers 7,582 MB of storage.

As a final note, one could summarize this as a huge marketing stunt (it attracted many new customers) at a very good price for Google. Its smart use of Moore’s Law was richly rewarded: Gmail’s share of the webmail market grew very quickly.

Schumpeter’s “Creative Destruction” also applies here: the other webmail providers, which allocated space more conservatively (making sure they had the full amount of storage equipment at their disposal before increasing the storage allowance per account), had to give up market share to Gmail.

Encyclopedias: Encarta and Wikipedia

Encyclopedias have a long history. The oldest still in existence is the “Naturalis Historia”, a collection of 37 books written in 75-77 AD. Since then, encyclopedias have had a long history of slow evolution: little by little, new content could be added as new discoveries were made. In the Renaissance, with the invention of the printing press, the encyclopedia became available to the masses. As the Renaissance also brought a great deal of scientific progress, encyclopedias grew in number of volumes; they became less easy to use, and could not contain an infinite amount of detail.

Encarta

At the end of the 20th century, encyclopedias found a much better medium: the CD-ROM. It is small, holds a lot of information, is easy to organize, and offers the user a search function.

Microsoft’s Encarta was introduced in 1993, after Microsoft had bought non-exclusive rights to the “Funk & Wagnalls Encyclopedia”. Microsoft had first approached “Encyclopædia Britannica”, which had been the standard encyclopedia for over a century, but it declined, fearing that its print sales would suffer. In 1996, just three years after Encarta’s appearance, the Benton Foundation was forced to sell “Encyclopædia Britannica” below book value, because it could not compete with the more advanced and far more capable Encarta.

At the end of the 1990s, Microsoft enriched its encyclopedia by buying Collier’s Encyclopedia and New Merit Scholar’s Encyclopedia from Macmillan and incorporating them into Encarta. Neither encyclopedia stayed in print for long after being integrated. By then, Encarta consisted of roughly 60,000 articles.

By that time, Encarta had become the reference among encyclopedias, containing the articles of three separate well-known encyclopedias while providing them in a much more practical format than the traditional printed editions.

This is again a nice example of Schumpeter’s “Creative Destruction”: the traditional encyclopedias had seen neither the interest in, nor the need for, moving to digital media. The speed at which this evolution happened had never been seen before; while the Encyclopædia Britannica had been the reference for over a hundred years, it was put out of business in just three years. The “destruction” in Schumpeter’s “Creative Destruction” thus happens at an ever-increasing pace, and companies have to make decisions quickly.

Wikipedia

At that moment, Microsoft believed it had achieved a monopoly on encyclopedias. It relaxed, convinced it had reached the summit of encyclopedic solutions.

Meanwhile, in January 2001, Wikipedia was created by Larry Sanger and Jimmy Wales as a complementary project to Nupedia. Wikipedia started as an innovative test project: Wales defined the goal of making a publicly editable encyclopedia, while Sanger was responsible for the strategy of implementing a wiki to reach that goal.

A wiki is an ingenious format that allows a structured database to be created on the internet in such a way that any user can create, as well as modify, an entry. Contributions are then peer reviewed; based on those reviews, an editor’s status increases for good input or decreases for bad input. The higher the status, the more validity is given to the editor’s input.
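
A minimal sketch of the status scheme described above (an illustration of the idea, not Wikipedia’s actual software; the adjustment factors are assumptions):

```python
# Peer reviews raise or lower an editor's status; status weights their input.

class Editor:
    def __init__(self, name: str, status: float = 1.0):
        self.name = name
        self.status = status  # assumed starting reputation

    def receive_review(self, good: bool) -> None:
        """A peer review outcome adjusts the editor's status up or down."""
        self.status *= 1.1 if good else 0.9

    def input_weight(self) -> float:
        """Validity attached to this editor's future contributions."""
        return self.status

alice = Editor("alice")
for outcome in (True, True, False, True):
    alice.receive_review(outcome)
print(f"{alice.name}: weight {alice.input_weight():.2f}")  # ~1.20
```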

This wiki system, combined with an encyclopedia, revolutionized the encyclopedia business. By the end of its first year, Wikipedia already boasted about 20,000 articles in 18 languages, and it continued to grow at an exponential rate. On September 9, 2007, the English Wikipedia passed the two-million-article mark, breaking the record of China’s Yongle Encyclopedia, which had held the title for 600 years. By now, Wikipedia has 18 million articles, of which 3.6 million are in English.


Figure: Wikipedia’s content creation and participants over time

In March 2009, Microsoft announced that it would discontinue its Encarta software releases, and the MSN Encarta website was closed by the end of 2009. Microsoft attributed the termination to changes in the way people look for information, and to changes in the traditional encyclopedia and reference material market.

The encyclopedia business is once again a clear and harsh example of Schumpeter’s Creative Destruction: the whole business was revolutionized twice in less than 20 years, leaving players that had not adapted in time out of the market. Microsoft, the monopolist of the digitized conventional encyclopedia, was swept out of the market in just ten years.

Wikipedia has shown that not only technological progress, but also changes in business culture and value creation, can transform entire markets. It introduced practical crowdsourcing and prosumers, concepts now used in many other markets, such as gaming and social networks.

Entities Failing to Use Exponential Trends

In this section I will identify examples of companies, organizations and entrepreneurs that are mistakenly using linear trends instead of exponential ones, and I will try to identify the resulting cost or missed opportunity.

Belgian subsidies on solar energy

Our economy is at the mercy of power generation, and in the near future there are a few threats we need to overcome in order to keep the world running. Most of our energy supply comes from burning fossil fuels, which are only available in limited amounts. Another threat is global warming, which might threaten our existence on earth.

As humanity’s hunger for energy keeps growing, prices are driven up to record highs. An efficient candidate for fuelling this hunger, while neatly addressing the aforementioned threats, already exists: nuclear energy. But as has been shown several times in the past, most recently at Japan’s Fukushima nuclear power plant, this technology carries great risks, such as radioactive leaks and nuclear waste.

This drives society to look for alternatives, most famously renewable energy. There are plenty of choices (wind, solar, tidal and biomass are just a few examples), but unfortunately most of them are not yet efficient enough to rely on.

Many countries have therefore come together to consider how to overcome these problems. Out of this came the Kyoto Protocol, which committed the adhering countries to stabilizing, and eventually lowering, their CO2 output, with goals set at certain deadlines.

These goals pushed many countries to rush into renewable energy; in Belgium, this meant especially wind turbines and solar panels. In this section I will focus on the solar panels.

Belgium has incentivized ordinary people to create their own small power source at home by offering steep discounts on solar panels (in the form of subsidies), while guaranteeing that the grid would buy the energy at a minimum price over the next 20 years. This has been a huge success for the buyers of solar panels, as it was financially very advantageous. But, as could have been predicted, the costs are high. The government has rerouted the cost of the subsidy to final consumers, who pay the price on behalf of the solar panel owners. The average household will see its monthly electricity bill increase by €6 in 2011, to subsidize an electricity source with a remarkably low share of 0.2% of total energy production in Belgium.

On the technological side of solar energy production, we see exponential growth: performance per price unit doubles every two years, and this has held not for the last two years but for the last twenty. Today solar energy covers 0.5% of global energy production, which is very little. But that is just eight doublings, meaning sixteen years, away from 100% (leaving aside solar power’s major disadvantage: when there is no sun, there is no electricity). Eventually, more people will gravitate towards solar power, not because they are fans of green energy, but purely because of its cost efficiency, unless another technology appears that is even more efficient and, especially, more stable.
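
The doubling arithmetic in the paragraph above can be checked directly from its own figures (0.5% of global production today, doubling every two years):

```python
import math

share, target, doubling_years = 0.005, 1.0, 2.0
doublings = math.ceil(math.log2(target / share))
print(doublings, "doublings ->", doublings * doubling_years, "years")
# -> 8 doublings -> 16.0 years
```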

In other words, the high cost that Belgium, both its government and its people, is paying now and for the next 20 years is a pain that in all probability could have been avoided by waiting just a couple of years. This mistake was made both by the Belgian government and by the guidelines of the Kyoto Protocol, which expects CO2 output to stabilize “too quickly” (if we take the exponential evolution of technology into account, we should understand that a technology needs to be mature enough before we adopt it).

One could argue that the subsidies have helped fund research into renewable energy; but why not invest the money directly in research, then?

Conclusion

We have looked at many different theories of accelerating evolution, from an economic perspective as well as from historical, technological, psychological and philosophical perspectives.

Next, we tried to identify examples in which businesses had to react to such exponential growth. In some cases the examined entity had to face threats or opportunities, with more or less success; in other cases it had to make the right predictions about future technological power per price unit.

In the case of the Human Genome Project, it was all about anticipating what hardware would be available in the near future, so as to put a realistic deadline (and thus business plan) on the DNA research. The Law of Accelerating Returns was built into the model, and this gave very satisfactory results.

The Gmail storage case was very similar, in that Google needed to predict the future price of storage so that it could offer more space than the competition today, even though it did not actually have that storage yet. This resulted in a gain in market share.

The case of Encarta was a nice example of Schumpeter’s Creative Destruction: it revolutionized the market for printed encyclopedias by upgrading the product to a new paradigm (that of digital media), to the point of driving competitors out of the market or absorbing them.

Although Creative Destruction also applies to the case of Wikipedia, the point to note is that technological advances are not the only thing that makes the difference; new ways of doing business and creating value can make all the difference as well (although it should be said that this new way of creating value could only be realized through the development of a new technology: the wiki).

The Belgian subsidies on solar energy showed us how high costs could be avoided by waiting until a technology ripens before adopting it. Since solar panels are becoming more efficient and prices are falling exponentially, it makes little sense to invest heavily today in what will be cheap tomorrow. In a sense, this case mirrors Google’s Gmail storage strategy, but in the opposite direction.

As a final conclusion, it is very important to realize that certain evolutions follow an exponential trend, and to take this into account. A good understanding of the trend can make a tremendously positive difference in final results; neglecting it can drive a profitable business, or even an entire sector, into bankruptcy.

Not only is understanding these trends important, but so is acting on them quickly. Note that “quickly” is a relative term: what is quick today will be slow tomorrow, because of the exponential nature of our evolution. In other words, we need to be really quick.
