A digestible introduction to how quantum computers work and why they're essential to the evolution of AI and ML systems. Gain an intuitive understanding of the quantum principles that power these machines.

Quantum computing is a rapidly accelerating field with the potential to revolutionize artificial intelligence (AI) and machine learning (ML). As the demand for bigger, better, and more accurate AI and ML accelerates, standard computers will be pushed to the limits of their capabilities. Rooted in parallelization and capable of managing far more complex algorithms, quantum computers will likely be the key to unlocking the next generation of AI and ML models. This article aims to demystify how quantum computers work by breaking down some of the key principles that enable quantum computing.

A quantum computer is a machine that can perform many tasks in parallel, giving it incredible power to solve very complex problems very quickly. Although traditional computers will continue to serve the day-to-day needs of the average person, the rapid processing capabilities of quantum computers have the potential to revolutionize many industries far beyond what is possible with traditional computing tools. With the ability to run millions of simulations simultaneously, quantum computing could be applied to:

- **Chemical and biological engineering:** complex simulation capabilities may allow scientists to discover and test new drugs and materials without the time, risk, and expense of in-laboratory experiments.
- **Financial investing:** market fluctuations are incredibly difficult to predict because they are influenced by a vast number of compounding factors. The nearly infinite possibilities could be modeled by a quantum computer, allowing for more complexity and better accuracy than a standard machine.
- **Operations and manufacturing:** a given process may have thousands of interdependent steps, which makes optimization problems in manufacturing cumbersome. With so many permutations of possibilities, it takes immense compute to simulate manufacturing processes, and often assumptions are required to narrow the range of possibilities to fit within computational limits. The inherent parallelism of quantum computers would enable unconstrained simulations and unlock an unprecedented level of optimization in manufacturing.

Quantum computers rely on the concept of superposition. In quantum mechanics, superposition is the idea of existing in multiple states simultaneously. A condition of superposition is that it cannot be directly observed, because the observation itself forces the system to take on a single state. While in superposition, there is a certain probability of observing any given state.

## Intuitive understanding of superposition

In 1935, in a letter to Albert Einstein, physicist Erwin Schrödinger shared a thought experiment that encapsulates the idea of superposition. In this thought experiment, Schrödinger describes a cat that has been sealed into a container with a radioactive atom that has a 50% chance of decaying and emitting a deadly amount of radiation. Schrödinger explained that until an observer opens the box and looks inside, there is an equal probability that the cat is alive or dead. Before the box is opened and an observation is made, the cat can be considered to exist in both the living *and* dead states simultaneously. The act of opening the box and viewing the cat is what forces it to take on a single state of dead *or* alive.

## Experimental understanding of superposition

A more tangible experiment that demonstrates superposition was performed by Thomas Young in 1801, though the implication of superposition was not understood until much later. In this experiment, a beam of light was aimed at a screen with two slits in it. The expectation was that for each slit, a beam of light would appear on a board placed behind the screen. However, Young observed several peaks of intensified light and troughs of minimized light instead of just the two spots of light. This pattern allowed Young to conclude that the photons must be acting as waves as they pass through the slits in the screen. He drew this conclusion because he knew that when two waves intersect, if they are both peaking, they add together and the resulting unified wave is intensified (producing the bands of light). In contrast, when two waves are in opposing positions, they cancel out (producing the dark troughs).
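Young's reasoning about adding and cancelling waves can be sketched numerically. The snippet below is a minimal illustration (the sinusoidal model and sample point are chosen for simplicity, not drawn from the original experiment): two identical waves in phase reinforce each other, while two waves half a cycle apart cancel.

```python
import math

def wave(amplitude, phase, t):
    """Displacement of a simple sinusoidal wave at time t."""
    return amplitude * math.sin(t + phase)

t = math.pi / 2  # sample at a moment when the reference wave is peaking

# Two waves peaking together add up: constructive interference (bright band).
in_phase = wave(1.0, 0.0, t) + wave(1.0, 0.0, t)

# Two waves in opposing positions cancel: destructive interference (dark band).
out_of_phase = wave(1.0, 0.0, t) + wave(1.0, math.pi, t)

print(in_phase)                       # ~2.0
print(abs(round(out_of_phase, 9)))    # ~0.0
```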

While this conclusion of wave-particle duality persisted, as technology evolved, so did the meaning of this experiment. Scientists discovered that even when a single photon is emitted at a time, the wave pattern still appears on the back board. This means that the single particle is passing through both slits and acting as two waves that interfere with each other. However, when the photon hits the board and is measured, it appears as an individual photon. The act of measuring the photon's location has forced it back into a single state rather than the multiple states it was in as it passed through the screen. This experiment illustrates superposition.

## Application of superposition to quantum computers

Standard computers work by manipulating binary digits (bits), which are stored in one of two states, 0 and 1. In contrast, a quantum computer works with quantum bits (qubits). Qubits can exist in superposition, so rather than being limited to 0 or 1, they are both a 0 and a 1, along with many combinations of somewhat-1 and somewhat-0 states. This superposition of states allows quantum computers to process millions of algorithms in parallel.
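As a rough sketch (a textbook model, not any particular hardware's implementation), a qubit can be represented as a pair of amplitudes over the basis states |0⟩ and |1⟩. The squared magnitude of each amplitude gives the probability of observing that state, and measurement collapses the qubit to a single classical bit:

```python
import math
import random

# A qubit state: amplitudes over |0> and |1>, normalized so the squared
# magnitudes sum to 1. An equal superposition -- "somewhat 0, somewhat 1":
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)

# Born rule: each outcome's probability is its amplitude's squared magnitude.
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(round(p0, 6), round(p1, 6))  # 0.5 0.5

def measure(prob_zero):
    """Measurement forces the qubit into a single classical state."""
    return 0 if random.random() < prob_zero else 1

print(measure(p0))  # 0 or 1 -- never both at once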

Qubits are usually constructed from subatomic particles such as photons and electrons, which the double-slit experiment confirmed can exist in superposition. Scientists force these subatomic particles into superposition using lasers or microwave beams.

John Davidson explains the advantage of using qubits rather than bits with a simple example. Because everything in a standard computer is made up of 0s and 1s, when a simulation is run on a standard machine, the machine iterates through different sequences of 0s and 1s (i.e., comparing 00000001 to 10000001). Since a qubit exists as both a 0 and a 1, there is no need to try different combinations. Instead, a single simulation will consider all possible combinations of 0s and 1s simultaneously. This inherent parallelism allows quantum computers to process millions of calculations simultaneously.
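This point can be made concrete with a back-of-the-envelope sketch (pure bookkeeping, not a quantum simulation): a classical search steps through each of the 2^n bit strings one at a time, while an n-qubit register in equal superposition assigns every one of those strings an amplitude within a single state:

```python
from itertools import product

n = 8  # an 8-bit register, as in the 00000001 vs 10000001 example

# A classical machine iterates through the 2**n sequences one at a time.
classical_sequences = ["".join(bits) for bits in product("01", repeat=n)]
print(len(classical_sequences))  # 256 separate inputs to step through

# An n-qubit register is described by one amplitude per bit string. An
# equal superposition gives every combination the same nonzero amplitude,
# so all 256 strings are represented in a single state.
amplitude = 1 / (2 ** n) ** 0.5
state = {seq: amplitude for seq in classical_sequences}
total_probability = sum(abs(amp) ** 2 for amp in state.values())
print(len(state), round(total_probability, 6))  # 256 1.0
```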

In quantum mechanics, the concept of entanglement describes the tendency of quantum particles to interact with each other and become entangled in such a way that they can no longer be described in isolation, because the state of one particle is influenced by the state of the other. When two particles become entangled, their states are dependent regardless of their proximity to each other. If the state of one qubit changes, the state of its paired qubit also changes instantaneously. Einstein famously described this distance-independent partnership as “spooky action at a distance.”

Because observing a quantum particle forces it to take on a single state, scientists have observed that if a particle in an entangled pair has an upward spin, the partnered particle will have an opposite, downward spin. While it is still not fully understood how or why this happens, the implications have been powerful for quantum computing.
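A toy model of such a pair (illustrative amplitudes, not a hardware description) is a two-qubit state in which only the "disagreeing" outcomes 01 and 10 carry amplitude. Sampling joint measurements from it always yields opposite values, mirroring the opposite spins described above:

```python
import random

# An entangled pair: only outcomes where the two qubits disagree have
# nonzero amplitude (each 1/sqrt(2)). This state cannot be factored into
# two independent single-qubit descriptions.
amplitudes = {"00": 0.0, "01": 2 ** -0.5, "10": 2 ** -0.5, "11": 0.0}

def measure_pair(amps):
    """Sample a joint outcome; both qubits collapse together."""
    outcomes = list(amps)
    weights = [abs(a) ** 2 for a in amps.values()]
    return random.choices(outcomes, weights=weights)[0]

for _ in range(5):
    print(measure_pair(amplitudes))  # always '01' or '10', never '00'/'11'
```

Observing the first qubit instantly fixes what the second must read, no matter how far apart the two are.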

In quantum computing, scientists take advantage of this phenomenon. Specially designed algorithms work across entangled qubits to speed up calculations drastically. In a standard computer, adding a bit adds processing power linearly: if the bits are doubled, the processing power is doubled. In a quantum computer, adding qubits increases processing power exponentially, so each added qubit drastically increases computational power.
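One way to see the contrast (a simplification of the full complexity argument): the state of an n-qubit machine is described by 2^n amplitudes, so every added qubit doubles the state space a single operation acts on, whereas an n-bit classical register simply gains one more bit:

```python
# A classical n-bit register holds one n-bit value; capacity grows
# linearly as bits are added. An n-qubit state is described by 2**n
# amplitudes, so the space one operation works over doubles per qubit.
for n in (1, 2, 10, 20, 30):
    classical_bits = n           # linear growth
    quantum_amplitudes = 2 ** n  # exponential growth
    print(n, classical_bits, quantum_amplitudes)

# 30 qubits already span more than a billion simultaneous amplitudes.
print(2 ** 30)  # 1073741824
```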

While entanglement brings an enormous advantage to quantum computing, its practical application comes with a severe challenge. As discussed, observing a quantum particle forces it to take on a specific state rather than continuing to exist in superposition. In a quantum system, any outside disturbance (temperature change, vibration, light, etc.) can act as an 'observation' that forces a quantum particle to assume a specific state. As particles become increasingly entangled and state-dependent, they are especially prone to outside disturbances impacting the system. This is because a disturbance needs only to affect one qubit to have a spiraling effect on many more entangled qubits. When a qubit is forced into a 0 or 1 state, it loses the information contained in its superposition, causing an error before the algorithm can complete. This challenge, called decoherence, has so far prevented quantum computers from being used in practice. Decoherence is measured as an error rate.
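The spiraling effect can be sketched with a toy simulation (the 1% per-qubit disturbance rate is an arbitrary illustrative number, not a measured hardware figure): if a run fails whenever any single qubit in an entangled register is disturbed, the chance of completing a computation falls off rapidly as qubits are added:

```python
import random

random.seed(0)  # reproducible toy simulation

def run_survives(n_qubits, disturb_prob):
    """A run fails if ANY qubit is disturbed: one collapse propagates
    through its entangled partners and corrupts the computation."""
    return all(random.random() > disturb_prob for _ in range(n_qubits))

def survival_rate(n_qubits, disturb_prob=0.01, trials=10_000):
    ok = sum(run_survives(n_qubits, disturb_prob) for _ in range(trials))
    return ok / trials

for n in (1, 10, 100):
    print(n, survival_rate(n))  # roughly 0.99, 0.90, 0.37: errors compound
```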

Certain physical error-reduction techniques have been used to minimize disturbance from the outside world, including keeping quantum computers at freezing temperatures and in vacuum environments, but so far they have not made a meaningful enough difference in quantum error rates. Scientists have also been exploring error-correcting codes to fix errors without affecting the information. While Google recently deployed an error-correcting code that resulted in historically low error rates, the loss of information is still too high for quantum computers to be used in practice. Error reduction is currently the major focus for physicists, as it is the most significant barrier to practical quantum computing.
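Quantum error-correcting codes are considerably more elaborate (quantum states cannot simply be copied), but the classical three-bit repetition code below sketches the core idea of redundancy plus majority voting. The 10% flip rate is an arbitrary illustrative number:

```python
import random
from collections import Counter

def encode(bit):
    """Repetition code: store each logical bit as three physical copies."""
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob):
    """Each physical copy is independently flipped with some probability."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote recovers the bit unless two or more copies flipped."""
    return Counter(bits).most_common(1)[0][0]

random.seed(1)
flip_prob, trials = 0.1, 10_000
raw_error_rate = sum(random.random() < flip_prob for _ in range(trials)) / trials
coded_error_rate = sum(
    decode(noisy_channel(encode(0), flip_prob)) != 0 for _ in range(trials)
) / trials
print(raw_error_rate, coded_error_rate)  # coding cuts the error rate sharply
```

With three copies, an error survives decoding only when at least two copies flip, which drives the logical error rate from about 10% down to roughly 3%.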

Although more work is needed to bring quantum computers to life, it is clear that there are major opportunities to leverage quantum computing to deploy highly complex AI and ML models that enhance a wide variety of industries.

Happy Learning!

## Sources

Superposition: https://scienceexchange.caltech.edu/topics/quantum-science-explained/quantum-superposition

Entanglement: https://quantum-computing.ibm.com/composer/docs/iqx/guide/entanglement

Quantum computers: https://builtin.com/hardware/quantum-computing