Introduction to Quantum Computing and Qiskit
Traditional vs. quantum computers
Since the 1960s, computing power has grown exponentially, allowing computers to become both smaller and more powerful.
A computer is made up of very simple components: computer chips contain modules, which contain logic gates, which in turn contain transistors.
A transistor is the simplest form of data processor in a computer: essentially, it is an electric switch that can either block or open the way for information to flow. That information is made up of bits, the smallest units of information, each of which can be set to either 0 or 1.
These switches keep shrinking as technology improves. Today, a typical transistor scale is 14 nm, about 500 times smaller than a red blood cell, and they can be made even smaller.
At such scales, however, physics works quite differently from the predictable classical behaviour we are used to: quantum effects such as tunneling let electrons slip through barriers that should block them, and traditional transistor designs stop making sense. We are approaching a real physical barrier to our technological progress. Scientists are therefore trying to turn these unusual quantum properties to their advantage by building quantum computers.
Quantum computers do not use bits but qubits, which can be created using electrons, atoms, photons, or even molecules. A qubit can be in any proportion of the 0 and 1 states at once, a property called superposition.
However, as soon as a qubit's value is tested, for example by sending a photon through a filter, it must decide to be either vertically or horizontally polarized. So, as long as it is unobserved, the qubit is in a superposition of probabilities for 0 and 1, and you cannot predict which it will be; but the instant you measure it, it collapses into one definite state.
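To make the idea concrete, superposition and measurement can be simulated classically with a few lines of NumPy (an illustrative sketch, not Qiskit code): a qubit is described by two complex amplitudes, and the measurement probabilities are their squared magnitudes.

```python
import numpy as np

# A qubit state is a 2-component complex vector: amplitudes for |0> and |1>.
# The equal superposition |+> = (|0> + |1>) / sqrt(2):
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(plus) ** 2  # ≈ [0.5, 0.5]: 0 and 1 are equally likely

# Sampling a measurement collapses the state to one definite outcome.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
```

Before measurement, both amplitudes coexist; after sampling, only a single 0 or 1 remains, exactly as described above.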
In 1935, Erwin Schrödinger devised a well-known thought experiment, now known as Schrödinger's cat, which highlighted this dissonance between quantum mechanics and classical physics. For more about quantum superposition, visit the following link (https://www.youtube.com/watch?v=lZ3bPUKo5zc).
Another property qubits can have is entanglement: a close connection that makes each entangled qubit react to a change in the other's state instantaneously, no matter how far apart they are. This means that by measuring just one entangled qubit, you can directly deduce properties of its partners without having to look at them.
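The simplest entangled state, the Bell state (|00⟩ + |11⟩)/√2, shows this correlation directly. Again, a NumPy sketch rather than real quantum hardware:

```python
import numpy as np

# Two-qubit states live in a 4-dimensional space:
# amplitudes for |00>, |01>, |10>, |11>, in that order.
# The Bell state (|00> + |11>) / sqrt(2) is maximally entangled.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2  # ≈ [0.5, 0, 0, 0.5]
# Only |00> and |11> are possible: the two qubits always agree.
# If the first qubit is measured as 0, the only consistent outcome is |00>,
# so the second qubit is certainly 0 as well -- without measuring it.
```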
Qubit manipulation is also very interesting: a normal logic gate takes a simple set of inputs and produces one definite output, while a quantum gate takes a superposition as input, rotates its probability amplitudes, and produces another superposition as output. For further details, check this link (https://towardsdatascience.com/demystifying-quantum-gates-one-qubit-at-a-time-54404ed80640).
So, in summary, a quantum computer sets up some qubits, applies quantum gates to entangle them and manipulate their probabilities, then finally measures the outcome, collapsing the superpositions into an actual sequence of 0s and 1s. This means that the entire set of calculations possible with your setup is carried out at the same time.
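This whole prepare-entangle-measure pipeline can be sketched as a small matrix simulation (illustrative NumPy, not Qiskit; H and CNOT are the standard Hadamard and controlled-NOT gates):

```python
import numpy as np

# Gates are unitary matrices acting on the amplitude vector.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],      # controlled-NOT: flips the second qubit
                 [0, 1, 0, 0],      # when the first qubit is 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)  # 1. prepare |00>
state = np.kron(H, I) @ state                  # 2. superposition on qubit 1
state = CNOT @ state                           # 3. entangle -> Bell state

# 4. measure: collapse the superposition into a definite bitstring.
probs = np.abs(state) ** 2
rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
# Only "00" and "11" ever appear, in roughly equal numbers.
```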
So, while quantum computers will probably not replace our home computers, in some areas they are vastly superior. One of them is database searching.
To find something in an unstructured database, a normal computer may have to test every single one of its entries; a quantum algorithm such as Grover's search needs only about the square root of that time. Moreover, the data-storing capacity of quantum computing is incredible: 4 classical bits can be in one of 16 possible combinations at a time, while 4 qubits in superposition can be in all those 16 combinations at once. This number grows exponentially with each extra qubit, so 20 of them can already store a million values in parallel. This is why a 500-qubit quantum computer can store more amplitudes than there are atoms in the observable universe.
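The exponential growth of the state space is easy to check numerically (a trivial sketch that counts amplitudes rather than simulating them):

```python
import numpy as np

# n qubits are described by 2**n complex amplitudes.
def num_amplitudes(n_qubits: int) -> int:
    return 2 ** n_qubits

print(num_amplitudes(4))   # 16      -- all combinations of 4 bits at once
print(num_amplitudes(20))  # 1048576 -- about a million values in parallel

# A uniform superposition over 4 qubits really is a vector of 16 equal amplitudes:
state = np.full(16, 1 / np.sqrt(16), dtype=complex)
assert abs(np.sum(np.abs(state) ** 2) - 1) < 1e-12  # probabilities sum to 1

# 500 qubits: more amplitudes than the ~1e80 atoms in the observable universe.
assert 2 ** 500 > 10 ** 80
```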
Examples of quantum computers
In recent years, companies like Google and IBM have invested heavily in quantum computing. In 2019, Google built the first machine to achieve quantum supremacy, that is, the first to outperform a classical supercomputer on a specific task. Then Jian-Wei Pan's team at the University of Science and Technology of China developed the world's most powerful quantum computer, Jiuzhang, which can perform a task 100 trillion times faster than the world's fastest supercomputer. Jiuzhang is reported to be 10 billion times faster than Google's machine and to perform, in just 200 seconds, calculations that would take a traditional computer 600 million years.
However, China has not built a fully functional quantum computer: there are many challenges in building a practical one. For example, the qubits must be created and stored at a temperature close to absolute zero, and the machines must be isolated from atmospheric pressure and from Earth's magnetic field.
However, if you are interested in having a first try at quantum programming, you can do it now. Qiskit is an open-source framework that provides tools for implementing and manipulating quantum programs and for running them on prototype quantum devices, as well as on simulators on a local computer.
The primary version of Qiskit uses the Python programming language, and here (https://qiskit.org/textbook/preface.html) you can find a textbook for learning how it works. Quantum computing is new, and writing quantum algorithms can be very tricky, but Qiskit helps make it simpler and more visual.
“The best way to learn is by doing. Qiskit allows users to run experiments on state-of-the-art quantum devices from the comfort of their homes. The textbook teaches not only theoretical quantum computing but the experimental quantum physics that realises it.” Qiskit.org
Now, let’s try to use Qiskit by combining it with a machine learning process with the help of the textbook.
Example of quantum machine learning (QML)
Quantum machine learning (QML) is an area of research that aims to exploit the advantages of quantum computing to enhance machine learning algorithms. In this example, we will see how to create a hybrid quantum-classical neural network in Python using PyTorch and Qiskit. To do this, we have to insert a quantum node inside a classical neural network.
The quantum node is a hidden layer of the network placed between two classical ones. It is implemented as a parameterized quantum circuit: a quantum circuit in which the rotation angle of each gate is specified by the output vector of the previous classical layer. The measurements made by the quantum circuit are then collected and used as inputs for the following layer.
Let's see how we can implement and test our hybrid network. First of all, we need to define the quantum circuit (the "quantum layer") and the specific functions for the forward-propagation and back-propagation steps.
The quantum circuit is defined by the class "QuantumCircuit" using Qiskit: it requires specifying how many trainable quantum parameters and how many measurement shots we want to use. In this case, for simplicity, we will use a 1-qubit circuit with one trainable quantum parameter θ, applying an RY-rotation by the angle θ to train its output. This output is measured in the z-basis, computing the σz expectation value.
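What this 1-qubit circuit computes can be reproduced analytically: RY(θ) applied to |0⟩ gives cos(θ/2)|0⟩ + sin(θ/2)|1⟩, so the σz expectation value is cos θ. A NumPy sketch of that computation (not the textbook's Qiskit code):

```python
import numpy as np

def ry(theta: float) -> np.ndarray:
    """RY gate: rotation by theta around the Y axis of the Bloch sphere."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def sigma_z_expectation(theta: float) -> float:
    """<psi| sigma_z |psi> for |psi> = RY(theta)|0>; analytically cos(theta)."""
    state = ry(theta) @ np.array([1, 0], dtype=complex)  # start in |0>
    p0, p1 = np.abs(state) ** 2                          # measurement probabilities
    return float(p0 - p1)                                # sigma_z eigenvalues: +1, -1

print(sigma_z_expectation(0.0))    # 1.0  (state |0>)
print(sigma_z_expectation(np.pi))  # -1.0 (state |1>)
```

On real hardware, the expectation value would be estimated from a finite number of measurement shots rather than computed exactly.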
The functions for forward propagation and back propagation are defined by the class "HybridFunction", and the quantum layer of the network is defined by the class "Hybrid".
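The key difficulty in the backward step is that a measured expectation value is not an ordinary differentiable operation, so the gradient is estimated by re-running the circuit at shifted parameter values. Below is an illustrative NumPy sketch of this idea using the parameter-shift rule, which happens to be exact for the RY circuit above; the textbook implements a similar shift-based scheme inside "HybridFunction", not this exact code:

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def expectation(theta):
    # Forward pass of the quantum node: <sigma_z> for RY(theta)|0>, i.e. cos(theta).
    state = ry(theta) @ np.array([1, 0], dtype=complex)
    p = np.abs(state) ** 2
    return float(p[0] - p[1])

def gradient(theta, shift=np.pi / 2):
    # Backward pass: evaluate the same circuit at theta + shift and
    # theta - shift, then combine the two expectation values.
    return (expectation(theta + shift) - expectation(theta - shift)) / 2

# For this circuit the exact derivative of cos(theta) is -sin(theta):
theta = 0.7
print(abs(gradient(theta) + np.sin(theta)) < 1e-9)  # True
```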
We want to develop an image classifier using the first two categories of the Fashion-MNIST dataset: T-shirt/top and Trouser. So, now we have to split our dataset into training and test sets and create our hybrid network.
We create a simple Convolutional Neural Network consisting of two convolutional layers, one dropout layer, two fully-connected layers, and finally the quantum layer. Since our quantum circuit contains one parameter, we must ensure that the network condenses the neurons down to size 1, which happens in the second fully-connected layer. The value of this last neuron is used as the parameter θ of the quantum circuit to compute the σz measurement and obtain the final prediction.
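The hand-off from the classical head to the quantum layer can be sketched as follows. This is a NumPy illustration only: the layer sizes and the mapping from the expectation value to class probabilities are assumptions made for this sketch, not the textbook's exact code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed final stage of the hybrid network: a fully-connected layer
# condenses the features down to one neuron, whose value becomes theta.
features = rng.normal(size=64)         # output of the earlier classical layers
w, b = rng.normal(size=64) * 0.1, 0.0  # second fully-connected layer, 1 output
theta = float(features @ w + b)        # single neuron -> quantum parameter

# Quantum layer: <sigma_z> of RY(theta)|0> is cos(theta), a value in [-1, 1].
expectation = np.cos(theta)

# Map the expectation value to the two class probabilities
# (T-shirt/top vs Trouser) -- one simple choice of mapping.
p_class0 = (1 + expectation) / 2
p_class1 = 1 - p_class0
```

During training, the gradient flows back through the quantum layer into the classical weights, so the whole hybrid network is optimized end to end.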
Now we have all the elements to train and test our hybrid network.
This is a simple example of what we can do using Qiskit; for the complete and editable version of the code, click here (https://qiskit.org/textbook/ch-machine-learning/machine-learning-qiskit-pytorch.html).
References
- https://www.youtube.com/watch?v=-eQ7zgwgTdI, consulted on 12.12.2020
- https://www.youtube.com/watch?v=JhHMJCUmq28, consulted on 23.12.2020
- https://qiskit.org/*, consulted on 28.12.2020
- https://www.theverge.com/circuitbreaker/2016/10/6/13187820/one-nanometer-transistor-berkeley-lab-moores-law, consulted on 02.12.2020
- https://qiskit.org/textbook/ch-machine-learning/machine-learning-qiskit-pytorch.html, consulted on 29.12.2020
- Feynman, R. P., Leighton, R. B., Sands, M. (1965), The Feynman Lectures on Physics, § 1-1
Article by Carla Melia and Monica Mura, data scientists at Orbyta Srl, 11.01.2021