New materials promise ultra-low-power computing
UC Berkeley researchers unveil materials that could potentially reduce the amount of voltage required for a CPU to operate, paving the way for ultra-low-power computing.
You might not need a whole 1.21 gigawatts to travel through time, after all.
Computer engineers at the University of California at Berkeley have found a way to reduce the minimum voltage required to store a charge in a capacitor--an electron-storing device that works somewhat like a battery--paving the way for ultra-low-power computing. The work grew out of a project started in 2008, led by Asif Khan, a UC Berkeley electrical engineering graduate student, and Sayeef Salahuddin, a UC Berkeley assistant professor of electrical engineering.
The engineers took advantage of ferroelectrics, a class of materials that can hold both positive and negative electric charges, even when there's no voltage applied. On top of that, the electrical polarization in ferroelectrics can be reversed with an external electric field.
The team was able to demonstrate that when a capacitor made of ferroelectric-based materials was paired with an electric insulator, the charge accumulated for a given voltage could be amplified in a phenomenon called "negative capacitance." This means you can create a charge that would normally require a higher voltage. And this, when applied to transistors--the on-off switch components that generate the zeros and ones that are the core of binary computing used in all personal computers--would translate into lower minimum voltage required to operate a computer processor.
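The amplification effect described above follows from how capacitances combine in series. Here is a minimal sketch, with purely illustrative values (not measured device parameters from the Berkeley work): when one layer in the stack behaves as a negative capacitance, the combined capacitance exceeds that of the insulator alone, so the same voltage stores more charge.

```python
# Sketch: effective capacitance of a ferroelectric layer exhibiting
# "negative capacitance" in series with an ordinary dielectric insulator.
# Values are arbitrary units, chosen only to illustrate the effect.

def series_capacitance(c_fe: float, c_di: float) -> float:
    """Two capacitors in series: 1/C_total = 1/C_fe + 1/C_di."""
    return 1.0 / (1.0 / c_fe + 1.0 / c_di)

c_dielectric = 1.0   # ordinary insulator
c_ferro = -2.0       # ferroelectric layer acting as a negative capacitance

c_total = series_capacitance(c_ferro, c_dielectric)
print(c_total)  # 2.0 -- larger than the dielectric alone
# Since Q = C * V, the same voltage now stores twice the charge.
```

With two ordinary (positive) capacitors, the series combination is always smaller than either one; the sign flip is what turns series stacking into amplification.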
Traditionally, a computer's processor (or chip) is made of transistors. The more transistors a chip has, the more processing power it offers. When first introduced in the early 1970s, a processor had only a few thousand transistors. Moore's Law predicts that the number of transistors that can be squeezed onto a computer chip will double every two years, and a modern processor now has billions of them.
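The doubling described above compounds quickly. A quick sketch, starting from roughly 2,300 transistors (the Intel 4004, 1971, used here as an illustrative baseline):

```python
# Sketch: Moore's Law modeled as a doubling every two years.
# The 1971 baseline of ~2,300 transistors is the Intel 4004;
# the projection is illustrative, not a count of any real chip.

def transistors(year: int, base_year: int = 1971, base_count: int = 2300) -> int:
    """Projected transistor count, doubling every two years since base_year."""
    return base_count * 2 ** ((year - base_year) // 2)

print(transistors(1971))  # 2300
print(transistors(2011))  # 2411724800 -- i.e. billions, forty years later
```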
While Moore's Law held for many years, the growth in the number of transistors that can be put on a chip has slowed since about 2005. This is not because engineers can't shrink transistors any further, but because the reduced size doesn't translate into a proportional decrease in the overall power required to operate the chip. Due to the fundamental physics of a transistor's operation, its minimum supply voltage has remained at about 1 volt for the past decade. Below that, the current is not strong enough to switch the transistor between its on and off states.
This constant power intake translates into a roughly constant amount of heat generated per transistor, regardless of its size. The more of them you pack onto a chip, the hotter the processor gets, and the harder it is for engineers to dissipate the heat fast enough. At some point, they can't keep shrinking the transistors without risking overheating.
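The heat argument above can be made concrete with the standard dynamic-power relation for switching circuits, P = C x V^2 x f (a textbook formula, not one stated in the article). Because voltage enters as a square, lowering the supply voltage pays off disproportionately:

```python
# Sketch: dynamic switching power, P = C * V^2 * f.
# This is the standard CMOS dynamic-power relation; the capacitance and
# frequency values below are illustrative, not from any real chip.

def dynamic_power(c_load: float, v_supply: float, freq: float) -> float:
    """Dynamic power in watts for load capacitance (F), voltage (V), frequency (Hz)."""
    return c_load * v_supply ** 2 * freq

p_1v = dynamic_power(1e-9, 1.0, 1e9)    # at the ~1 V floor
p_half = dynamic_power(1e-9, 0.5, 1e9)  # if the floor could be halved
print(p_1v, p_half)  # halving the voltage cuts dynamic power by 4x
```

This squared dependence is why a material that lowers the voltage floor matters more than one that merely shrinks the transistor.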
Salahuddin and his team proposed modifying current transistors to incorporate ferroelectric materials in their design. Potentially, as with the capacitor, this change would help create a sufficient charge from a smaller voltage, letting engineers build transistors that generate less heat--which means they could continue shrinking them further.
In layman's terms, this is similar to making a car engine deliver the same horsepower and torque on less gasoline, without overheating at high speeds or during long stretches of driving. Apart from transistors, Salahuddin said ferroelectric materials could also be used in other applications, such as system memory, energy storage devices, electric car chargers, and other electronics.