We all know that quantum computers are still in their infancy; they’re big, bulky, and super chill (literally: they are generally kept at just 0.02 degrees above absolute zero, which is no easy task). But what if we could take the power of qubits – units of quantum information that, unlike a bit’s strict 1 or 0, can also exist in a combination of both states at once, known as superposition – and integrate them into a silicon-based microprocessor?
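To make superposition a little more concrete, here is a toy sketch (plain Python, nothing quantum about it) of how a qubit is described mathematically: two amplitudes whose squared magnitudes give the odds of measuring 0 or 1. The variable names are purely illustrative.

```python
import math

# A qubit's state is a pair of amplitudes: alpha for |0>, beta for |1>.
# An equal superposition means a 50/50 chance of either outcome on measurement.
alpha = 1 / math.sqrt(2)  # amplitude for the |0> state
beta = 1 / math.sqrt(2)   # amplitude for the |1> state

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1

print(p0, p1)  # both 0.5 -- the qubit is "both" until it is measured
```

A classical bit would have one of the two probabilities equal to 1; the in-between values are what a qubit adds.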
Researchers from the University of New South Wales (UNSW) have designed a method to do just that. The team essentially proposes to re-purpose silicon transistors to handle qubits, and then use existing silicon manufacturing processes to produce quantum-based microprocessors.
As Menno Veldhorst, one of the lead researchers on the project, explains: “Our design incorporates conventional silicon transistor switches to turn on operations between qubits in a vast two-dimensional array, using a grid-based bit select protocol similar to what’s used to select bits in a conventional computer memory chip. By selecting an electrode above a qubit we can control its spin.” Additionally, “by selecting electrodes between the qubits, we can perform two-qubit logic.”
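The addressing idea above can be sketched in a few lines of Python. To be clear, this is a loose analogy to how DRAM picks a bit by row and column, not the UNSW team’s actual design: the class and method names are invented, and spins are reduced to plain 0/1 values with a toy controlled-flip standing in for real two-qubit logic.

```python
class QubitGrid:
    """Toy 2D grid of spins addressed by (row, col), DRAM-style."""

    def __init__(self, rows, cols):
        # All spins start in the 0 ("down") state.
        self.spins = [[0] * cols for _ in range(rows)]

    def flip_spin(self, row, col):
        # Selecting the electrode *above* qubit (row, col) controls its spin.
        self.spins[row][col] ^= 1

    def two_qubit_gate(self, row, col):
        # Selecting the electrode *between* (row, col) and its right-hand
        # neighbour couples them -- here, a toy controlled flip.
        if self.spins[row][col] == 1:
            self.spins[row][col + 1] ^= 1


grid = QubitGrid(4, 4)
grid.flip_spin(2, 1)       # single-qubit operation via the top electrode
grid.two_qubit_gate(2, 1)  # two-qubit logic via the in-between electrode
print(grid.spins[2][1], grid.spins[2][2])  # -> 1 1
```

The point of the grid scheme is that you only need one control line per row and per column, rather than dedicated wiring to every qubit, which is what makes it plausible to scale with existing chip-manufacturing techniques.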
Don’t know what any of that means? That’s okay, neither do we. But it sounds promising. One of the major hurdles in the world of quantum computing right now is scalability, and bringing the qubit into a conventional microchip design could considerably speed up the adoption of this technology. The research team admits there are still plenty of engineering hurdles to overcome, but the pay-off would be enormous. I doubt my next system is going to be quantum-fied, but the one after that… who knows.
The original article was published in Nature Communications. For a more complete explanation from the researchers themselves: