The first functional optical processor was built at AT&T Bell Laboratories in the hope that light would one day replace electricity in high-speed parallel computers.
Despite the many benefits that classical computers (‘classical’ here means computers in which signals are carried electrically) have brought to our lives, they have limitations that prevent further improvement in the speed and volume of the signals they carry. These limitations are inherent in the way such computers work.
For example, classical electric circuits carry information units serially, one by one, and there are lower size limits below which such circuits cannot be built: beyond that limit they simply cannot process information reliably. Another handicap is that electrons flowing in circuits can interfere with one another, and this interference, incidentally, is one reason why engineers cannot produce smaller circuits. By contrast, photons, the particles of light that would carry the signals in an optical computer, simply do not interact with each other, because they carry no charge.
An optical computer could run faster than one that shuttles electrons, theoretically at the speed of light, with signals travelling along optical fibres: guide-wires specifically designed to carry photons between chips without distortion.
One of the main advantages of optical computers is their ability to process more than one piece of information at the same moment: multiple light beams can be processed in a single chip. This would allow engineers to use parallel processing, which greatly enhances the speed of the computer.
Lasers would, naturally, be the source of light in this new generation of computers. Scientists and engineers all over the world are trying to build appropriately tiny lasers emitting precise frequencies of infrared light. But they face a number of practical hurdles. One has to do with making lasers of appropriate size and efficiency. Current technology does not provide the means to build optical chips comparable in size to ‘classical’ ones, and the efficiency of the lasers is not high enough for the specifications required. Most of the energy used to run these lasers escapes as heat rather than being put to use. Since only one or at most two per cent of this energy can be transformed into useful light, the rest generates a great deal of heat, which is dangerous to the chips.
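The energy budget described above can be made concrete with a back-of-the-envelope sketch. The 1 watt of input power below is a hypothetical illustrative value, not a figure from the article; the 2 per cent conversion efficiency is the article's upper bound:

```python
# Back-of-the-envelope laser energy budget.
# Assumptions: 1.0 W of electrical input (hypothetical), and at most
# 2% electrical-to-light conversion efficiency (the article's figure).

input_power_w = 1.0   # electrical power fed to the laser (assumed)
efficiency = 0.02     # at most 2% becomes useful light

light_out_w = input_power_w * efficiency
heat_w = input_power_w - light_out_w

print(f"useful light: {light_out_w:.2f} W")  # 0.02 W
print(f"waste heat:   {heat_w:.2f} W")       # 0.98 W
```

At this efficiency, 49 times more power leaves the laser as heat than as light, which is why cooling becomes the limiting concern for densely packed optical chips.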
Making the right lasers is not the only problem on the road to fully optical computers. Switches are at the heart of optical computers, but as photons do not interact with each other, there are substantial difficulties in building switches.
One solution to this problem is to build computers which are part electrical, part optical. Many scientists now believe that the most viable use for optical technology is in this type of hybrid system combining optics and electronics. Researchers are now focusing their work on optical interconnections between chips, which could be a reality in as little as one or two years. This type of connection can vastly increase the amount of data moving in and out of chips.
Such a machine would have to contain prisms, mirrors, and lasers to channel the light, as well as gallium arsenide chips that convert pulses of laser light into electrons so as to function as switches. If all this does happen, there will be a need for new computer architectures, that is, new computer structures.
However, some scientists are following a different route. They are trying to find ways to use current transistor technology to detect laser beams as part of information processing. NPN-type transistors without a metal cover would be appropriate because they are faster. This approach also allows existing designs to be adapted, with all the advantages in time and cost savings that brings.
The first optical processor, developed at AT&T Bell Labs, measured about two feet by two feet. Scientists hope one day to fit it all into three square inches. A fully optical computer is more than five years away.
Scientists have set themselves a target for the year 2000: 1,000 I/O (input/output) channels running at 1 gigabit per second. That is a thousand times faster than today's computers.
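The target's aggregate bandwidth follows from simple arithmetic: 1,000 channels, each carrying 1 gigabit per second, give one terabit per second in total. A minimal check:

```python
# Aggregate bandwidth of the year-2000 target described in the article:
# 1,000 I/O channels, each running at 1 gigabit per second.

channels = 1_000
per_channel_gbit_s = 1  # 1 Gbit/sec per channel

aggregate_gbit_s = channels * per_channel_gbit_s
aggregate_tbit_s = aggregate_gbit_s / 1_000  # 1,000 Gbit = 1 Tbit

print(f"aggregate: {aggregate_gbit_s} Gbit/s = {aggregate_tbit_s} Tbit/s")
# aggregate: 1000 Gbit/s = 1.0 Tbit/s
```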
It is a pity that we must wait for a decade, while scientists and engineers try to accomplish this difficult task. But what an exciting wait!