The gold standard of semiconductors looks like it might just be copper. Really, really thin bits of copper. A team of scientists from multiple universities across South Korea and the US has collaborated on a way to create thin films of copper that resist corrosion and come in at just a single atom thick. But why should we give a pair of silicon ingots about some flimsy copper sheets?
Because there's a very real possibility that dropping these skinny copper films into the semiconductor industry could drive costs down, reduce the amount of power devices demand, and potentially increase the lifespan of those devices, too. And it's all about ridding chips of that pesky gold stuff.
That's a triple whammy of good things at a time when shrinking down transistors is getting harder and harder, and is no longer necessarily the magic bullet for keeping Moore's Law ticking along.
"Oxidation-resistant Cu could potentially replace gold in semiconductor devices," says Professor Se-Young Jeong, project lead at Pusan National University, "which would help bring down their costs. Oxidation-resistant Cu could also reduce electrical consumption, as well as increase the lifespan of devices with nanocircuitry."
The collaboration between Pusan National University and Sungkyunkwan University, in South Korea, and Mississippi State University, in the US, has produced both a method for manufacturing this atom-thick sheet of copper and a way to stop it from oxidising, and therefore corroding.
Copper is one hell of a good conductor, which is why we use it in PC cooling, and it's not just heat it's good at conducting, but electricity, too. You'll find copper used in all sorts of electronics, but the issue is that it oxidises, and that corrosion gets in the way of its conductivity.