
Despite what most people think, quantum mechanics is not a new kid on the technology block.  It is a concept almost 120 years old.

Max Planck, a German theoretical physicist, first introduced quantum theory in 1900, an innovation that earned him the 1918 Nobel Prize in Physics.  Then in 1959, a radical-thinking American theoretical physicist named Richard Feynman planted the seeds of quantum computing. He suggested using quantum mechanics to build a new type of computer.

Since then, despite its complexity, we have made good progress in developing small working quantum computers and a limited menu of quantum algorithms.

Even so, Robin Blume-Kohout and Kevin Young, scientists from Sandia National Laboratories, look at our quantum computing progress from a different perspective. They believe we are at the same stage classical computing was in during the late 1930s.

IBM calls this the Quantum Ready stage. The name suggests it is time to get off the corporate couch and begin preparing our systems, people, and resources for the era of quantum computing.  By all accounts, full-blown quantum computing won’t be a small wave of change; it will be a technological tsunami.

Most researchers agree that quantum computing is still in the experimental stage.  The truth is, a regular computer can do anything today’s quantum computers can do.

However, stay tuned, we have reason to believe that might change very soon.

Quantum supremacy is a techie buzzword.  Until now, it’s been an impossible benchmark for quantum researchers to meet.  It describes the ability of quantum computers to solve problems that classical computers can’t touch. You can look at quantum supremacy as the Mount Everest of quantum computing, except this mountain hasn’t been climbed yet.

Despite the skeptics, Google has hinted that its gate-based 72-qubit quantum processor, called Bristlecone, will achieve quantum supremacy sometime this year. Bristlecone is a scaled-up version of its nine-qubit older brother.  Scaling up qubits generally increases system noise and errors, but Google has done an excellent job of keeping quantum errors in check with Bristlecone.
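To get a feel for why a 72-qubit machine is a plausible supremacy candidate, here is a minimal back-of-the-envelope sketch (my own illustration, not from the article): exactly simulating an n-qubit state on a classical computer requires storing 2^n complex amplitudes, so the memory needed doubles with every qubit added.

```python
# Sketch: memory needed for a brute-force classical simulation of n qubits.
# Assumes one complex128 amplitude (16 bytes) per basis state -- a common
# but not the only way to simulate quantum circuits classically.

def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Bytes required to store a full n-qubit statevector."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 50, 72):
    gib = statevector_bytes(n) / 2**30
    print(f"{n:>2} qubits -> {gib:,.0f} GiB")
```

At 30 qubits the statevector fits in a workstation's RAM (16 GiB); at 72 qubits it would take tens of trillions of GiB, far beyond any classical machine — which is the intuition behind the quantum supremacy benchmark.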

