The findings mean that dramatic reductions in power consumption are possible: as little as one-millionth of the energy per operation used by transistors in modern computers.
This is critical for mobile devices, which demand powerful processors that can run for a day or more on small, lightweight batteries.
On a larger industrial scale, as computing increasingly moves into 'the cloud', the electricity demands of giant cloud data centres are multiplying, collectively consuming a growing share of the country's, and the world's, electrical power.
"We wanted to know how small we could shrink the amount of energy needed for computing," said senior author Jeffrey Bokor, a UC Berkeley professor of electrical engineering and computer sciences.
"The biggest challenge in designing computers and, in fact, all our electronics today is reducing their energy consumption," he added in a paper appeared in the peer-reviewed journal Science Advances.
Lowering energy use is a relatively recent shift in focus for chip manufacturing, after decades of emphasis on packing ever greater numbers of smaller, faster transistors onto chips.
"Making transistors go faster was requiring too much energy," said Bokor, who is also the deputy director the Centre for Energy Efficient Electronics Science, a Science and Technology Centre at UC Berkeley funded by the National Science Foundation. "The chips were getting so hot they'd just melt."
Magnetic computing emerged as a promising candidate because the magnetic bits can be differentiated by direction, and it takes just as much energy to get the magnet to point left as it does to point right.
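To put the 'one-millionth' figure in rough context, a worked number (an illustrative assumption, not a quantity stated in this article) is the thermodynamic minimum energy needed to erase a single bit, known as the Landauer limit, evaluated at room temperature:

$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693 \approx 2.9 \times 10^{-21}\ \mathrm{J}$

If that limit is the benchmark the researchers have in mind, then transistor-based logic dissipating roughly a million times more per operation would be consistent with the article's 'one-millionth' comparison.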