As I’ve been reading more about quantum computing, I’ve been wondering what the physical manifestation of a quantum computer would be. I know what modern-day silicon chips look like, but I have a hard time imagining what a quantum computer would look like. So let’s start at the smallest component — a qubit.
As Wikipedia so helpfully describes, there are many physical implementations of qubits. My simple take-away is that all of them involve very small things, e.g. a single atom, an electron, or a photon. But these very small things require 1) supercooling (to reduce noise and keep each qubit in its quantum state), and 2) lasers (to make measurements), so while each individual element is tiny, the entire apparatus is large and very complex. Kind of like vacuum tubes and mainframes, in my imagination. As the technology advances, I wonder whether it's even possible for quantum hardware to shrink the way modern-day transistors have, since the atomic scale is just so different, or whether quantum computing will only ever be available through the cloud.
I’ve also noticed that different teams are experimenting with different types of qubits and different approaches to quantum computing: Google and UCSB with superconducting qubits, Microsoft with topological qubits, etc. So it’s not yet obvious that there is a dominant technology or approach in the field.
After skimming the paper from Google and UCSB, I’m still unclear how qubits actually translate into computational work. It seems like after you prepare and measure the state many times, you get out a probability distribution over the possible outcomes (extend this to multiple qubits). So while a qubit can be in a superposition of all its basis states at the same time (in real life), as soon as you measure it, you’ve digitized it, or effectively collapsed the qubit into a normal bit. And therefore, similar to how sampling analog music to create a digital representation loses data, I would assume that measuring qubits to figure out their state must also lose some (valuable?) information. Given all the hype, I’m probably missing some key understanding of the field, so I definitely plan to read more papers and articles. And maybe that is why error correction is so critical in quantum computing?
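To convince myself of the “measurement gives you a distribution” picture, here’s a rough sketch in plain Python (no quantum libraries), simulating repeated measurement of a single qubit. The amplitudes and shot count are made-up values for illustration, not anything from the Google/UCSB paper.

```python
import random
from collections import Counter

# A single-qubit state |psi> = alpha|0> + beta|1>, with alpha^2 + beta^2 = 1.
# These amplitudes are invented purely for illustration.
alpha, beta = 0.6, 0.8          # P(measure 0) = 0.36, P(measure 1) = 0.64

def measure(p0):
    """Simulate one projective measurement: the state collapses to 0 or 1."""
    return 0 if random.random() < p0 else 1

# Each "shot" prepares the same superposition and measures it once.
shots = 10_000
counts = Counter(measure(alpha ** 2) for _ in range(shots))

print(counts)                                      # roughly {1: 6400, 0: 3600}
print({k: v / shots for k, v in counts.items()})   # estimated probabilities
```

If my understanding is right, all you ever recover from many shots is this distribution of outcomes, not the amplitudes themselves (or their relative phase), which is roughly what I mean above by losing information when you sample.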