
Can the difference between qubit and bit be explained with a simple code example?

The only places I know of where you can play with quantum computing are the Google Quantum Playground and IBM's Quantum Experience. While the first uses the QScript language and the second uses QASM (both are easy to learn), programming with them still does not differ much from regular programming, apart from a few quantum-specific functions. Here's the Wikipedia explanation:

A qubit has a few similarities to a classical bit, but is overall very different. There are two possible outcomes for the measurement of a qubit—usually 0 and 1, like a bit. The difference is that whereas the state of a bit is either 0 or 1, the state of a qubit can also be a superposition of both. It is possible to fully encode one bit in one qubit. However, a qubit can hold even more information, e.g. up to two bits using superdense coding.

For a system of n components, a complete description of its state in classical physics requires only n bits, whereas in quantum physics it requires 2^n − 1 complex numbers.

That is more or less clear. But how can this be shown with a code example?
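For concreteness on the 2^n scaling in that quote: a general n-qubit state assigns one complex amplitude to every one of the 2^n bit strings (the quote's 2^n − 1 comes from discounting normalization and global phase; the raw vector has 2^n entries). A small illustration in Python, with variable names of my own choosing:

import numpy as np

n = 3

# n classical bits: the full state is just n bits, e.g. the string 101.
classical_state = 0b101

# n qubits: a general state needs one complex amplitude per bit string,
# i.e. 2**n complex numbers (here 8).
quantum_state = np.zeros(2 ** n, dtype=complex)
quantum_state[0b101] = 1.0  # the classical state 101 is one basis vector
print(quantum_state.size)   # -> 8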

asked Jun 21 '17 by npocmaka



1 Answer

Here is some classical code that flips coins and counts how many heads you get:

from random import random

def coin_count():
    bit = False
    counter = 0
    for _ in range(500):
        bit ^= random() < 0.5  # False → 50% False, 50% True
                               #  True → 50% False, 50% True
        if bit:
            counter += 1
    return counter

If you run this code many times and make a histogram of the results, you get approximately a binomial distribution:

[Figure: classical binomial distribution of head counts]
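For example, one way to produce such a histogram (a sketch: matplotlib is assumed to be available, and the 10,000 runs and bin count are arbitrary choices of mine):

import matplotlib.pyplot as plt

results = [coin_count() for _ in range(10_000)]
plt.hist(results, bins=50)
plt.xlabel("heads out of 500 flips")
plt.ylabel("runs")
plt.show()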

Now here is some pseudo-code that does essentially the same thing, except the coin is replaced by a qubit. We "flip the qubit" by applying the Hadamard operation to it.

def hadamard_coin_count():
    qubit = qalloc()
    counter = 0
    for _ in range(500):
        apply Hadamard to qubit # |0⟩ → √½|0⟩ + √½|1⟩
                                # |1⟩ → √½|0⟩ - √½|1⟩
        if qubit:  # (not a measurement; controls nested operations)
            counter += 1  # (happens only in some parts of the superposition)
    return measure(counter)  # (note: counter was in superposition)

Do this many times, plot out the distribution, and you get something very different:

[Figure: quantum walk distribution of measured counter values]
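Since the answer names what this is, a quantum walk, you can reproduce the plot without quantum hardware by simulating the pseudo-code's joint state directly. A minimal state-vector simulation sketch in plain numpy (the variable names and the use of numpy/matplotlib are my own choices, not the answer's code):

import numpy as np
import matplotlib.pyplot as plt

n_steps = 500

# Joint state of (qubit, counter): amps[q, c] is the amplitude for the
# qubit being q and the counter holding value c.
amps = np.zeros((2, n_steps + 1), dtype=complex)
amps[0, 0] = 1.0  # start as |qubit=0, counter=0>

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

for _ in range(n_steps):
    amps = H @ amps                # apply Hadamard to the qubit
    amps[1] = np.roll(amps[1], 1)  # counter += 1 in the qubit=1 branch
                                   # (no wraparound: counter stays below
                                   # n_steps until the final step)

# Measuring the counter yields value c with probability probs[c].
probs = (np.abs(amps) ** 2).sum(axis=0)

plt.plot(probs)
plt.xlabel("measured counter value")
plt.show()

This computes the exact measurement distribution rather than sampling; drawing many samples from probs and histogramming them would reproduce the plotted shape. Unlike the classical case, the amplitudes interfere, so the result is not a bell curve centered near 250.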

Clearly these code snippets are doing very different things despite their surface similarity. Quantum walks don't act the same as classical random walks. This difference is useful in some algorithms.

answered Sep 17 '22 by Craig Gidney