What Is a Transistor & What Effect Did Its Invention Have on Computers?

Early transistors were a fraction of the size of the vacuum tubes they replaced.

A transistor is a miniature electronic component made from a semiconductor material such as silicon. Before transistors, the only way to control the flow of current in an electronic circuit was with large, energy-hungry vacuum tubes, which limited the size and power of the computers that could be built. The invention of the transistor revolutionized computer design: A modern microprocessor typically contains hundreds of millions of transistors on a single silicon chip.

1 Invention of the Transistor

The transistor was invented in 1947 by John Bardeen, Walter Brattain and William Shockley at Bell Laboratories in New Jersey. The invention was the culmination of a long-running effort to develop a viable alternative to the vacuum tube using semiconductor technology. It earned Bardeen, Brattain and Shockley the Nobel Prize for Physics in 1956. Compared to vacuum tubes, transistors have numerous advantages: They are smaller and less fragile, they require less power to operate and they can function at lower voltages.

2 How a Transistor Works

If an electric current is passed between two of a transistor's three terminals, the magnitude of this current can be controlled by a voltage applied to the third terminal. This has two basic applications: as a switch or as an amplifier. The first widespread application of the transistor was as an amplifier in portable radios. In computers, however, it is used as a switch rather than an amplifier. Computers using vacuum tubes as switches were limited by size constraints to a few thousand such devices. With the invention of the transistor, the situation changed dramatically.
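The switching behavior described above can be sketched in code. This is an illustrative model only, not a physical simulation: it treats a transistor as a device that conducts when its control voltage exceeds a threshold, and the 0.7-volt threshold is a hypothetical value chosen for the example.

```python
THRESHOLD_V = 0.7  # hypothetical switching threshold, in volts

def transistor_switch(control_voltage: float) -> bool:
    """Model a transistor as a switch: it conducts (returns True,
    a logical 1) when the control voltage exceeds the threshold,
    and is off (returns False, a logical 0) otherwise."""
    return control_voltage > THRESHOLD_V

def and_gate(voltage_a: float, voltage_b: float) -> bool:
    """Combining such switches yields logic gates; here, an AND gate
    that outputs 1 only when both inputs are above the threshold."""
    return transistor_switch(voltage_a) and transistor_switch(voltage_b)

print(transistor_switch(0.2))  # False: control voltage below threshold
print(and_gate(1.5, 1.5))      # True: both inputs above threshold
```

It is by wiring millions of such on/off switches into logic gates that a microprocessor performs computation.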

3 Effect on Computers

Transistor-based computers were not only smaller than their vacuum-tube predecessors but also more powerful, because more components could be packed inside. Transistorization also made computers faster, because the more compact dimensions meant that electrical signals didn't have to travel as far. As time went on, the number of transistors that could be fabricated on a single chip of silicon increased. This trend was quantified in 1965 by Gordon Moore, in the observation now known as Moore's Law.
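The exponential growth that Moore's Law describes can be illustrated with a simple doubling rule. The sketch below assumes the commonly cited two-year doubling period and starts from the roughly 2,300 transistors of Intel's 4004 chip of 1971; both figures are used here purely for illustration.

```python
def transistor_count(start_count: int, start_year: int, year: int,
                     doubling_period_years: float = 2.0) -> int:
    """Project a transistor count forward, assuming the count
    doubles once every doubling period (Moore's Law)."""
    periods = (year - start_year) / doubling_period_years
    return int(start_count * 2 ** periods)

# 20 doublings over 40 years: 2,300 transistors grow to about 2.4 billion
print(transistor_count(2300, 1971, 2011))
```

Twenty doublings multiply the starting count by roughly a million, which is how a few-thousand-transistor chip of the early 1970s becomes a multi-billion-transistor processor four decades later.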

4 Transistors in the Digital Age

Transistors have become one of the basic building blocks of modern technological life. As the essential switching device in all types of microprocessors, they are found in smartphones, digital cameras, electronic games and satellite navigation systems, as well as in desktop and laptop computers. A modern microprocessor typically contains hundreds of millions of transistors, although the most advanced designs have transistor counts in excess of 2 billion.

Andrew May has more than 25 years of experience in academia, government and the private sector. A full-time author since 2011, he wrote "Bloody British History: Somerset" and "Pocket Giants: Isaac Newton" (to be published in 2015). He is a regular contributor to "Fortean Times" magazine, and also contributed to "30-Second Quantum Theory." May holds a Master of Arts in natural sciences from Cambridge University and a Ph.D. in astrophysics.