Quantum Computing

Quantum computing describes the application of quantum phenomena to computation in order to dramatically increase the speed at which certain problems can be solved. Unlike traditional computing, which relies on a 'binary system' of bits whose fundamental unit of information is 'ON or OFF', quantum computing uses qubits, which can occupy a superposition of both states simultaneously, allowing algorithms to be designed and performed in novel ways.
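
To make the distinction concrete, here is a minimal sketch, in Python with NumPy, of a single qubit being placed into an equal superposition and then measured. This is a classical simulation for illustration only; the variable names and the 50/50 Hadamard example are choices made here, not anything specific to the source.

```python
import numpy as np

# A qubit's state is a unit vector in a 2-dimensional complex space.
zero = np.array([1, 0], dtype=complex)   # |0>: the classical "OFF" state

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero                          # amplitudes on both 0 and 1 at once

# Measurement collapses the superposition; each outcome's probability is
# the squared magnitude of its amplitude (50/50 here).
probs = np.abs(state) ** 2
outcome = np.random.choice([0, 1], p=probs)
print(f"amplitudes={state}, probabilities={probs}, measured={outcome}")
```

A classical bit would be fully described by one of the two outcomes; the qubit is described by the amplitudes, and only collapses to a definite value when measured.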

Researchers hypothesize that quantum computers could crack several popular cryptographic algorithms, potentially undermining the security of current blockchains. Post-quantum cryptography investigates algorithms designed to remain secure against attacks by quantum computers. So far, however, quantum computing has seen little real-world use or impact: these computers are expensive and difficult to maintain (they must be kept at cryogenic temperatures, for example) and require purpose-built algorithms to be used effectively.
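
The best-known such threat is Shor's algorithm, which can factor the large integers underpinning RSA encryption far faster than any known classical method. The quantum circuit's job is to find the multiplicative 'order' of a number; everything else is classical. The toy Python sketch below brute-forces that order-finding step classically on a tiny modulus, purely to illustrate the reduction from factoring to order finding. The function names and the example modulus are illustrative choices, not part of any real attack.

```python
import math
import random

def find_order(a, n):
    """Brute-force the order r of a mod n, i.e. the smallest r with a**r % n == 1.
    This step is what a quantum computer running Shor's algorithm speeds up;
    the classical loop here is only feasible for tiny moduli."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor(n):
    """Toy version of the order-finding reduction behind Shor's algorithm."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g           # lucky guess: a shares a factor with n
        r = find_order(a, n)
        if r % 2 == 1:
            continue                   # need an even order; try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                   # trivial square root of 1; try again
        p = math.gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(factor(3233))                    # 3233 = 61 * 53
```

Running the order-finding step on a quantum computer at cryptographically relevant key sizes would require far more stable qubits than today's hardware provides, which is one reason the threat remains hypothetical for now.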