From its inception nearly two hundred years ago, the computer has seen truly prodigious increases in its power, sophistication and application, with almost no aspect of life left untouched by its impact. It has revolutionised industries such as commerce (Amazon), decimated others such as postal services (email) and even created whole new industries such as panda video compilation (YouTube). At the heart of this increase in power has been the rapid decrease in the size and cost of the computer's constituent elements, which has allowed it to perform ever more complicated calculations showing ever more complicated simulations of violent crime. This has allowed the computer to go from performing simple calculations in spreadsheets or playing simple games like Snake to performing unbelievably complicated calculations such as weather forecasting, or producing incredibly realistic video games in real time – all in roughly fifty years since computers went mainstream.
At the most basic level, the computer is built from very simple logical circuits made from a few components, which can be combined in huge quantities to perform highly complex calculations. These logical circuits are principally made up of transistors: small devices which can do nothing more than switch an electric current on and off – much like a light switch, but with an electrical signal used instead of a finger. Transistors can then be combined in simple circuits to perform logical operations, such as in the ‘AND’ gate shown. This will output electricity at the point labelled ‘Out’ only if electricity is sent to both of the inputs labelled ‘A’ and ‘B’. Whilst this is a simple circuit, it is possible to arrange transistors in patterns like this to perform all of the computation and flinging of annoyed avian creatures of which today's computers are capable.
The symbol for a transistor – the input, or base, labelled B, can be used to control the flow of electricity between E and C
An AND gate made from transistors
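The gate's behaviour can be sketched in a few lines of code – a toy model rather than real electronics, in which each transistor is just a switch that passes current only while its base is driven:

```python
def transistor(base, collector_in):
    """A transistor modelled as an electrically operated switch: current
    flows from collector to emitter only while the base is driven."""
    return collector_in and base

def and_gate(a, b):
    # Two transistors in series: current from the supply reaches 'Out'
    # only if both bases (the inputs A and B) are switched on.
    supply = True
    return transistor(b, transistor(a, supply))

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", and_gate(a, b))
```

Only when both inputs are on does the output go on – exactly the truth table of AND.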
Knowing this, we can now attempt to explain the massive increase in computing performance and capabilities which has been seen over the last half a century. In short, this has been made possible by the continued shrinking of the transistor, such that more can be fitted in a computer chip, allowing a chip of the same size and cost to perform more complicated calculations. Most of this development has happened over the last fifty years, since the first computer chip was produced by Jack Kilby in 1958, which allowed transistors to be put in a small package where they could be used for computation. The rate of this progress has roughly followed a prediction known as Moore's Law, issued by Gordon Moore, co-founder of Intel, in 1965: that the number of transistors on a chip would double approximately every two years. This ‘law’ has held remarkably well since then, as it has been taken as a target on which the industry has based its development, allowing us to see an enormous pace of advance in computing capabilities.
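To give a feel for the power of this doubling, here is a rough back-of-the-envelope calculation. The starting figure is illustrative: Intel's 4004 of 1971 had around 2,300 transistors.

```python
# Moore's Law: transistor counts double roughly every two years.
def transistors(start_count, start_year, year, doubling_period=2):
    """Projected transistor count after (year - start_year) years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# From ~2,300 transistors in 1971, fifty years of doubling gives
# tens of billions -- roughly the scale of today's largest chips.
print(round(transistors(2300, 1971, 2021)))
```

Fifty years is twenty-five doublings, a factor of over thirty million – which is why exponential growth of this kind is so remarkable, and so hard to sustain.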
Over this period, most of the decrease in transistor size (or increase in the number of transistors per chip) has been achieved by simply scaling the transistor down according to a set of laws defined by Robert Dennard in 1974. We are now, however, facing the prospect of the breakdown of these laws and the so-called ‘death of Moore's Law’, which would see an abrupt halt to the rapid advance of computing technology.
The first problem we shall discuss is one of thermodynamics. Until recently, the power (and therefore heat) dissipated by a transistor decreased according to the scaling laws, meaning that a transistor half the size would release half the heat. This meant that more transistors could be squeezed onto a chip, seemingly ad infinitum, while the total heat released remained the same. Unfortunately, as the transistor becomes so small that it is on the same scale as the atom, a larger amount of electricity than the scaling laws dictate is required to switch the transistor effectively. This is because the amount of energy used to switch the transistor is becoming close to the energy of the electrons being controlled. The result is that the heat released by the transistor no longer obeys the scaling laws, so adding more transistors would eventually cause the chip to melt, as the heat produced could no longer be removed.
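The idealised scaling described above can be sketched numerically. In this toy model, dynamic power per transistor is taken as proportional to C·V²·f; under Dennard's laws every dimension and the voltage shrink by a factor k, so heat per unit area stays constant – but once the voltage can no longer shrink, the heat density climbs:

```python
# A toy model of Dennard scaling: shrink a transistor's dimensions by 1/k.
def power_density(k, voltage_scales=True):
    """Relative heat per unit chip area after scaling by factor k."""
    density = k ** 2                          # k^2 more transistors per area
    c = 1 / k                                 # capacitance shrinks with size
    v = 1 / k if voltage_scales else 1.0      # voltage ideally scales too
    f = k                                     # smaller transistors switch faster
    power_per_transistor = c * v**2 * f       # dynamic power ~ C * V^2 * f
    return density * power_per_transistor

print(power_density(2))                        # ideal scaling: heat density unchanged
print(power_density(2, voltage_scales=False))  # voltage stuck: heat density quadruples
```

With ideal scaling the result is 1.0 – the same heat per square millimetre no matter how many transistors are packed in. With the voltage stuck, halving the transistor's size quadruples the heat density, which is the melting-chip problem in miniature.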
Another problem faced when trying to make the transistor smaller is, again, a result of it becoming comparable in size to the atom. At this incredibly small scale, the normal ‘classical’ rules of physics start to break down and we begin to see the effects of the bizarre and sometimes unpredictable ‘quantum’ physics. One consequence of this is that the transistor starts to suffer from the effects of quantum tunnelling. This essentially means that at a small enough scale it is possible for electrons to jump, seemingly magically, across barriers which ‘classically’ they could not.
One area of the transistor in which this can be problematic is the gate, the part which controls the flow of electrons within the transistor. With recent technologies the transistor has reached the size where the gate is only a few atoms thick, and at this scale, as a result of quantum tunnelling, some electrons are able to pass straight through it. This is obviously problematic, as the transistor is meant to control electricity (stop electrons flowing through it), and consequently inefficiencies are introduced, with energy being lost and the effectiveness of the transistor reduced.
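The severity of this effect can be illustrated with the standard textbook estimate for tunnelling through a barrier, T ≈ e^(−2κd), which falls off exponentially with the barrier thickness d. The barrier height and thicknesses below are illustrative round numbers, not measurements of any real device:

```python
import math

HBAR = 1.054571817e-34    # reduced Planck constant, J*s
M_E = 9.1093837015e-31    # electron mass, kg
EV = 1.602176634e-19      # one electronvolt in joules

def tunnel_probability(thickness_nm, barrier_ev=3.0):
    """Crude estimate T ~ exp(-2*kappa*d) for an electron tunnelling
    through a rectangular barrier of the given height and thickness."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

print(tunnel_probability(3.0))   # a few-nanometre barrier: leakage tiny
print(tunnel_probability(1.0))   # a few atoms thick: leakage vastly larger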
We will now consider solutions to these problems which may allow for the continuance of the hitherto rapid increase in computing performance. One solution is to simply work around these problems to try and get the most from existing technologies, as has been achieved many times over the last fifty years, such as by replacing the materials used in the transistor.
There is, however, a limit to how far this can go, so we must look to other solutions. One is simply to use the transistors we have today in more intelligent ways, instead of improving the basic component itself. Conventional computers are made using chips designed for so-called ‘general purpose computing’, which means that one chip can perform any computation, from decoding DNA to rotating a photograph. This is very useful, as it allows computers to perform a wide range of tasks, but it is not very efficient, as the transistors are arranged to perform as many tasks as possible rather than one task well. The alternative is to use FPGAs, chips which can be reprogrammed to essentially change the way the transistors are arranged, meaning they can be used most efficiently for the task in hand. This will allow us to continue the improvements in performance without changing the transistor.
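A minimal sketch of the idea behind an FPGA: its logic cells are built around look-up tables (LUTs), tiny memories whose contents define which boolean function the cell computes, so ‘reprogramming’ the chip amounts to loading new truth tables into the same hardware:

```python
class LUT:
    """A look-up table: the basic programmable logic cell of an FPGA."""

    def __init__(self, truth_table):
        self.table = truth_table      # one output bit per input combination

    def __call__(self, *inputs):
        # Treat the input bits as an index into the stored truth table.
        index = sum(bit << i for i, bit in enumerate(inputs))
        return self.table[index]

and_lut = LUT([0, 0, 0, 1])   # programmed to behave as an AND gate
xor_lut = LUT([0, 1, 1, 0])   # the same cell, reprogrammed as XOR

print(and_lut(1, 1), xor_lut(1, 1))
```

The hardware never changes; only the stored table does – which is what lets an FPGA be rearranged to suit one task efficiently rather than all tasks adequately.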
Whilst this technology is readily available today and does allow for significant speed increases, it will not allow us to continue the previous rate of advancement for very long. Another solution is to use entirely different systems for computation, such as DNA instead of transistors. With this technology the input data is encoded into DNA molecules, which are then manipulated using a number of biological reactions to process the data and produce output data. In this manner, chemicals can be used to perform logical operations like those performed by transistors. Whilst this technology could allow for significant speed improvements, potentially on the scale seen over the last fifty years, it is a long way from being ready.
In conclusion, there are a number of challenges to be overcome in order to continue the computing progress to which we have become accustomed, and it is unlikely to continue for very long without some major technological breakthroughs. Progress is, however, certainly being made towards them, and it would not be foolish to hope to see this trend continue into the future.
Images used under Creative Commons license