The findings mean that dramatic reductions in power consumption are possible, down to as little as one-millionth the amount of energy per operation used by transistors in modern computers.
This is critical for mobile devices, which demand powerful processors that can run for a day or more on small, lightweight batteries.
On a larger industrial scale, as computing increasingly moves into 'the cloud', the electricity demands of giant cloud data centres are multiplying, collectively consuming a growing share of the country's, and the world's, electrical supply.
"We wanted to know how small we could shrink the amount of energy needed for computing," said senior author Jeffrey Bokor, a UC Berkeley professor of electrical engineering and computer sciences.
"The biggest challenge in designing computers and, in fact, all our electronics today is reducing their energy consumption," he added in a paper that appeared in the peer-reviewed journal Science Advances.
Lowering energy use is a relatively recent shift in focus in chip manufacturing after decades of emphasis on packing greater numbers of increasingly tiny and faster transistors onto chips.
"Making transistors go faster was requiring too much energy," said Bokor, who is also the deputy director of the Centre for Energy Efficient Electronics Science, a Science and Technology Centre at UC Berkeley funded by the National Science Foundation. "The chips were getting so hot they'd just melt."
Magnetic computing emerged as a promising candidate because the magnetic bits can be differentiated by direction, and it takes just as much energy to get the magnet to point left as it does to point right.