Quantum AI - Part 2 - The dice have been cast!
Previously in the Quantum AI column, we saw that quantum computing could, in theory, allow certain complex machine learning algorithms to run in a “reasonable” time (less than several years…). But what makes quantum computers so different from today’s computers? The purpose of this article is not to go into theoretical detail but simply to illustrate, as faithfully as possible, the fundamental differences between classical and quantum computers, in particular through the example of a quantum algorithm.

Understanding the bit concept with the coin analogy

To understand why calculations are potentially much faster on a quantum computer, let’s first recall how conventional computers work. They use bits to encode information and are built from electronic circuits: when current flows through a circuit, the bit is worth 1; otherwise, it is worth 0. You can also consider the electrical voltage rather than the current; the approach remains the same.
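To make the analogy concrete, here is a minimal Python sketch (the function and variable names are my own, purely illustrative) showing that a classical bit, like a coin lying flat on a table, is always in exactly one of two states:

import random

# A classical bit is always in exactly one of two states: 0 or 1.
# In hardware this corresponds to "no current (or low voltage)" vs.
# "current flowing (or high voltage)".

def read_classical_bit(current_is_flowing: bool) -> int:
    """Map the physical state of the circuit to a logical bit value."""
    return 1 if current_is_flowing else 0

# Coin analogy: a coin resting on the table shows either heads (1) or
# tails (0), never both at once. Flipping it simply selects one of the
# two possible outcomes.
coin = random.choice([0, 1])
print("Bit read from the circuit:", read_classical_bit(current_is_flowing=True))
print("Coin lying on the table shows:", coin)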