
2.5.1 Moore's Law

A transistor is an electronic switch that can alternate between two states, "on" and "off," representing one bit of information. Modern microchips contain millions of transistors, each so small that it cannot be seen with the naked eye. Gordon Moore, one of the founders of Intel, observed in 1965 that microchip capacity (the number of transistors contained on a single chip) had doubled every year. This trend, which has become known as Moore's Law, continues into the present, although the rate of change has slowed so that chip capacity now doubles every 12-18 months rather than every year. Moore's Law, an example of exponential growth, refers specifically to the capacity of microchips, and it might be stated this way: the number of transistors that can be put on a microchip will double every 12-18 months, until physical limitations are reached.
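As a sketch of what this doubling implies, the short projection below computes a transistor count forward in time. The starting count of one million and the 18-month doubling period are illustrative assumptions, not figures for any specific chip.

```python
def transistors(initial_count, months, doubling_period_months=18):
    """Project a transistor count forward, assuming one doubling every
    `doubling_period_months` (the conservative end of Moore's Law)."""
    doublings = months / doubling_period_months
    return initial_count * 2 ** doublings

# Hypothetical chip with 1 million transistors today:
print(round(transistors(1_000_000, 36)))  # 36 months = two doublings: 4000000
print(round(transistors(1_000_000, 90)))  # 90 months = five doublings: 32000000
```

Because the growth is exponential, the same function shows why small changes in the doubling period matter: over a decade, an annual doubling yields 2^10 (about 1000x) growth, while an 18-month doubling yields only about 100x.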

To illustrate the power of exponential growth, consider the parable of the inventor of chess and his emperor. The emperor offered to reward the inventor with anything he wanted for creating the game of chess. The inventor requested one grain of rice for the first square of the chessboard, with each additional square holding double the previous square's amount of rice. The emperor immediately granted his wish. There are 64 squares on a chessboard. By the 32nd square, about 4 billion grains of rice would have been given, roughly one large field's worth of rice. The next square alone would need about 4 billion grains, the one after that about 8 billion, the next about 16 billion, and so on. The 64th square would need about 9×10^18 grains of rice, more than could be produced even if the entire earth's surface were used to grow rice.
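The parable's arithmetic can be checked in a few lines of Python:

```python
# Square k of the chessboard holds 2**(k - 1) grains of rice.
def grains_on_square(k):
    return 2 ** (k - 1)

# Total over the first 32 squares is 2**32 - 1.
total_first_32 = sum(grains_on_square(k) for k in range(1, 33))
print(total_first_32)        # 4294967295, about 4 billion grains so far
print(grains_on_square(33))  # 4294967296, the 33rd square alone
print(grains_on_square(64))  # 9223372036854775808, about 9 * 10**18
```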

The number of transistors on a single chip has increased at just such an exponential rate, doubling every 12-18 months. Below is a graph illustrating the exponential increase in the number of transistors on processors introduced over the years.

Figure 1 Illustration of Moore's Law applied to Intel Processors

Below is a log-scaled graph that provides a different perspective on the exponential growth of transistors on a microchip.

Figure 2 Illustration of Moore's Law applied to Intel Processors in log scale

For more recent data, see the press kit from Intel.

From the exponential growth of transistor density on microchips, analysts can infer and predict other developments in the computer industry. Extending the scope of Moore's Law leads to the following predictions:

  1. Processing power (speed) doubles every 12-18 months.

  2. Storage capacity of RAM doubles every 12-18 months.

Other observations are that storage capacity of hard disk drives is also increasing exponentially, and the cost for consumers to purchase computer parts is decreasing over time.

The reason Moore's Law continues to hold true is that circuitry is becoming ever smaller. Circuits that used to require hundreds of square microns of silicon (a micron is a millionth of a meter) now fit into just a few square microns. This trend has enabled more and more circuits to be packed into the same area. Processors, memory chips, and special-purpose chips for controlling peripheral devices are all becoming denser. Although Moore's Law only predicts the increase in circuit density, this increase in density reduces the time required for inter-component communications, which also means that chips can process data faster.

Improvements in microchip technology are being matched by improvements in several other technologies found in computer systems. Disk capacity is increasing for a variety of reasons. Improvements in magnetic media (the iron oxide coating on the surface of a disk, flatter platters, etc.) and read/write electronics are increasing the capacity of hard disk drives. Introduction of new optical disk technologies is another source of increased storage capacity for personal computers. Corresponding increases in processor speed and bus bandwidth enable computers to take full advantage of the growth in storage capabilities.

Despite the growth in processing speed and storage capacity, the cost per byte of data processed or stored decreases over time, as lower-capacity memory chips become outdated.

An interesting counter to improvements in capacity and throughput is known as Parkinson's Law of Data, which says that data expands to fill the space available. In other words, as more memory or disk space becomes available, the demand for more memory or disk space increases accordingly. For example, when computers had only a few kilobytes (KB) of memory, their simple operating systems fit in as little as 4 KB. Today's microcomputers typically have 256 MB or more of memory and, as Parkinson's Law would predict, today's operating systems are much more elaborate and require tens of megabytes of memory for their own use. Similarly, as disk drive capacity increases, people begin using drives in new ways. Early computers with 360 KB floppy disks mainly stored small text files. Today, when computers routinely come with multi-gigabyte hard drives, people store musical recordings, short video clips (each file several megabytes in length), and even collections of feature-length films on DVD (typically about 5 gigabytes).
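Using the figures above, a short calculation shows how many capacity doublings separate a 360 KB floppy disk from a 5 GB DVD (taking 1 GB = 1024^3 bytes for simplicity):

```python
import math

floppy_bytes = 360 * 1024   # 360 KB floppy disk
dvd_bytes = 5 * 1024 ** 3   # roughly one DVD's worth of data

ratio = dvd_bytes / floppy_bytes
doublings = math.log2(ratio)
print(f"{ratio:,.0f} times larger, about {doublings:.1f} doublings")
```

At one doubling every 12-18 months, those roughly 14 doublings correspond to 14-21 years of growth, which is consistent with the time that separated floppy-based personal computers from DVD-era machines.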

Parkinson's Law drives the entire computing industry: vendors know that applications will always expand to keep pace with Moore's Law. As capacity increases, users ask for even more performance in order to accomplish more ambitious tasks. Thanks to Moore's Law, we can expect continued technological improvements to meet consumer demand for greater performance at affordable prices. (Note, however, that Moore's Law doesn't cover all aspects of computer technology. It says nothing about increases in system reliability, or about the quality of the software programs used in computer systems.)

Without fundamental changes in chip technology, the laws of physics suggest that there are limits to how far computing performance can improve. For example, the circuit pathways have to be wide enough for electrons to pass through. Another limitation is the wavelength of light. Light is used to etch circuits into silicon, and the width of the pathways etched is related directly to the wavelength of the light used to do the etching: the shorter the wavelength, the narrower the pathway. Ultraviolet light has a shorter wavelength than visible light, and X-rays are shorter still, but there are technical problems with using wavelengths that short. What happens when the limit is reached? We don't know, but experience suggests that progress will continue, possibly in unanticipated directions. At some point, the cost of producing ultra-dense chips may restrict their use to the most expensive supercomputers.
