Why there may be more to computing than Moore's Law

by Alan Woodward, Chief Technology Officer at the business and information technology consultancy Charteris plc.

Technological progress comes from pushing hard at the limits of what is currently possible, not from merely following trends others have set. In computing, a good illustration of this principle is the life and work of the nineteenth-century computer pioneer Charles Babbage (1791-1871), who spent most of his adult life trying to build a digital computer. Babbage first conceived such a machine in 1834. He called it the Analytical Engine.

Designed to contain tens of thousands of cogwheels, the Analytical Engine was never in fact built, mainly because the precision engineering industry of Babbage's day could not supply sufficient numbers of exactly machined cogwheels at a reasonable cost and within a practical timescale. All the same, by thinking far beyond what was feasible at the time, Babbage certainly designed a machine that was to all intents and purposes a computer. The Analytical Engine had a memory, a processor and even a punched-card system for data storage and programming. Computers today work incomparably faster than even Babbage could have imagined, but ultimately they are all essentially a form of Analytical Engine.

How will computers work in the future - in around 2020, for example - and where is computing going?

According to the famous law formulated in 1965 by Gordon Moore, the co-founder of Intel, the number of transistors on a microprocessor doubles roughly every eighteen months.

Experience gained over the past twenty years of building computers suggests that Moore's Law holds good. Extrapolating it into the future, to 2020, for example (a year that, like all future dates that once seemed so remote, will arrive with astonishing haste), suggests that by then the transistors on a microprocessor will have shrunk to the atomic scale. This is another way of saying that microprocessors will have become about as small and compact as they are ever likely to be.
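As a rough illustration of the doubling arithmetic behind this extrapolation, the short Python sketch below projects a transistor count forward under an eighteen-month doubling period. The 1971 starting point (the Intel 4004, with roughly 2,300 transistors) is an assumption added here purely for illustration; it is not a figure taken from the article, and the result is only what the exponential formula predicts, not an observed count.

def project_transistors(start_year, start_count, target_year, doubling_period_years=1.5):
    """Project a transistor count forward, assuming it doubles every doubling_period_years."""
    doublings = (target_year - start_year) / doubling_period_years
    return start_count * 2 ** doublings

# Illustrative assumption: start from the Intel 4004 of 1971 (about 2,300 transistors)
# and project forward to 2020 under an eighteen-month doubling period.
projected_2020 = project_transistors(1971, 2300, 2020)
print(f"Projected transistors per chip in 2020: {projected_2020:,.0f}")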

On the face of it, when microprocessors reach the atomic scale, that will be the end of the evolution of computers. The machine that Charles Babbage first imagined in 1834 will have reached a dead end.

Or will it? Babbage's dream of an Analytical Engine only became a reality after its inventor's death, when the new technology of electronics provided a way to build a machine that did everything Babbage envisaged, and more. Similarly, a growing circle of computer scientists is coming to believe that another new technology may provide a way to build a completely new generation of computers once conventional electronic computing has reached the point of diminishing returns.

The new technology is quantum computing. Quantum computing exploits the curious effects described by quantum mechanics, the science that studies the behaviour of energy and matter at the atomic level. The effects of quantum mechanics are in fact present in our everyday lives, but they are not observable (or at least not obvious) at macroscopic scales. For processes that take place at atomic and subatomic levels, however, the consequences of quantum mechanics are decisive, and they open up opportunities that matter even to people concerned only with everyday, macroscopic experience.

In particular, quantum mechanics offers the opportunity to build quantum computers.
