One of my favourite courses whilst studying Electronics Engineering at Manchester University (many, many, many years ago) covered the operation of microprocessors, and it is with some pride that, at one point in my academic studies, I could explain the operation of a computer from the screen and keyboard right down to the electrons in the transistors that make up the microprocessor. It might take a little longer to do the same today, but the basic principles have stayed with me. So, it is against that backdrop that I have always remained interested in developments in computing, quantum computing being one of the areas I follow most keenly, especially as it relates to its potential impact on crypto and the risk it poses to the underlying cryptographic algorithms.
Quantum computing differs fundamentally from classical computing. While classical computers use ‘bits’ as the smallest unit of data, represented as either 0 or 1, quantum computers use quantum bits, or qubits. Qubits can exist simultaneously in an infinite number of possible states between 0 and 1, a property known as superposition.
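For readers who like to see the notation behind that claim, the sketch below uses the standard textbook way of writing a qubit's state; the symbols α and β are not from the article itself, just the conventional labels for the two amplitudes.

```latex
% Minimal sketch of qubit superposition in standard textbook notation
% (not taken from the original article).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
A classical bit is either $0$ or $1$; a qubit's state is described by
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^{2} + |\beta|^{2} = 1,
\]
where $\alpha$ and $\beta$ are complex amplitudes. Measuring the qubit
returns $0$ with probability $|\alpha|^{2}$ and $1$ with probability $|\beta|^{2}$.
\end{document}
```

Because α and β can take any pair of values satisfying that constraint, a qubit has infinitely many possible states, even though reading it out still yields only a single classical 0 or 1.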