Hype surrounding the rise of ChatGPT and the ground Google is supposedly losing to Microsoft Corp. and OpenAI in the search wars has overshadowed more important developments in computing, progress that will have far greater implications than which website serves up better tax advice.
Quantum computing is a holy grail for scientists and researchers, but it is still decades away from reality. Google's parent company, Alphabet Inc., however, moved the ball down the field last month with news that it has found ways to mitigate one of the biggest problems facing the nascent field: accuracy.
To date, virtually all computing has been binary. A piece of information is stored as either a one or a zero, and these binary digits (bits) are grouped together for further calculation. Four bits are needed to store the number eight (1000 in binary), for example. It's slow and clunky, but at least it's simple and accurate. Silicon chips have been holding and processing bits for almost seven decades.
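To make that concrete, here is a minimal Python sketch of classical binary storage; the snippet is purely illustrative and assumes nothing beyond the standard library.

```python
# Classical binary storage: the number eight occupies four bits (1000).
n = 8
bits = format(n, "b")  # '1000'
print(f"{n} in binary is {bits}, using {len(bits)} bits")

# Each bit is strictly 0 or 1; recombining the bits recovers the value exactly.
assert int(bits, 2) == n
```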
Quantum bits, or qubits, can store data in more than two forms (a qubit can be both 1 and 0 at the same time), which means larger chunks of information can be processed in a given amount of time. Among the many downsides: the physical manifestation of a qubit requires temperatures just above absolute zero, and qubits are susceptible to even the minutest interference, such as stray light. They are also error prone, which is a big problem in computing.
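The "both 1 and 0 at the same time" idea can be simulated on an ordinary computer. Below is a toy state-vector sketch of a single qubit in equal superposition; it illustrates the probability math only, not real quantum hardware, and all the names are my own.

```python
import random

# Toy model of one qubit: two amplitudes, one for |0> and one for |1>.
# In an equal superposition, each measurement outcome has probability 1/2.
alpha, beta = 2 ** -0.5, 2 ** -0.5  # |alpha|^2 + |beta|^2 = 1

def measure():
    """Collapse the superposition: return 0 or 1 with probability |amplitude|^2."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements land on 0 and 1 roughly half the time each.
samples = [measure() for _ in range(10_000)]
print("P(0) is roughly", samples.count(0) / len(samples))
```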
In a paper published in Nature last month, Google claimed a major breakthrough in an important sub-field called quantum error correction. The approach is conceptually simple: instead of relying on individual physical qubits, scientists store information across many physical qubits and then treat the group as a single "logical" qubit, so that an error in any one physical qubit can be detected and corrected.
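Quantum states cannot simply be copied, so Google's actual scheme is far subtler, but the underlying intuition resembles a classical repetition code: spread one bit across several physical bits, then take a majority vote. Here is a hedged sketch of that classical analogy, with illustrative function names of my own; it is not Google's method.

```python
from collections import Counter

# Classical repetition code: one logical bit stored across five physical bits.
def encode(bit, copies=5):
    return [bit] * copies

def corrupt(bits, flip_index):
    noisy = list(bits)
    noisy[flip_index] ^= 1  # flip a single physical bit, simulating noise
    return noisy

def decode(bits):
    # Majority vote recovers the logical bit despite the isolated error.
    return Counter(bits).most_common(1)[0][0]

logical = 1
noisy = corrupt(encode(logical), flip_index=2)
print(noisy, "->", decode(noisy))  # [1, 1, 0, 1, 1] -> 1
```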