
Quantum Computers

  • Writer: Advay Kadam
  • Sep 25, 2022
  • 2 min read


It's fascinating to see how computers have evolved over the past century. Before the 1920s, "computers" were humans, often women, who performed complex mathematical calculations, typically for the military. That changed in the Roaring Twenties, a period of economic boom and technological growth, when the term "computing machine" became prominent, referring to mechanical machines that could perform the tasks of human computers. By the 1940s, when Alan Turing's codebreaking work helped end World War II by cracking the German Enigma machine, "computer" referred to electronic machinery that could perform calculations far faster than any human.


Let's come back to the 21st century, where we now have quantum computers. Technically, the idea dates to the 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. In essence, quantum computers use the laws of quantum mechanics to tackle certain calculations at speeds classical computers simply cannot match.


Aren't there other ways to boost computing speed, though? Well, yes. Supercomputers take a similar aim, using thousands of processing cores to increase speed. However, IBM draws a key distinction between the two ultramodern machines: "A supercomputer might be great at difficult tasks like sorting through a big database of protein sequences. But it will struggle to see the subtle patterns in that data that determine how those proteins behave."


IBM further explains that "Quantum algorithms take a new approach to these sorts of complex problems, creating multidimensional spaces where the patterns linking individual data points emerge."


Unlike classical computers, quantum computers create these computational spaces to uncover patterns in the data being analyzed. In the protein example, for instance, a quantum computer applies the laws of quantum mechanics to search for protein folding patterns efficiently. Traditional computers cannot construct such spaces, so these complex problems remain out of their reach.


Let's look at some fundamental differences between quantum and traditional computers. While classical computers rely on bits to perform operations, quantum computers use something called quantum bits, or qubits. A bit can be either a 0 or a 1, but here's the cool thing about a qubit: it can be both at the same time. How's that possible? Superposition. You may have heard of Schrödinger's cat, which is simultaneously alive and dead; the same logic applies to a qubit, kind of like a coin landing on both heads and tails at the same time.
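To make this concrete, here's a minimal sketch (not from any real quantum library, just plain Python) that models a single qubit as a pair of amplitudes. A qubit in superposition doesn't "hide" a definite answer; it only yields a 0 or a 1, with certain probabilities, when measured:

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta) satisfying
# |alpha|^2 + |beta|^2 = 1. Measuring it gives 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
zero = (1.0, 0.0)   # behaves like the classical bit 0
one = (0.0, 1.0)    # behaves like the classical bit 1

# An equal superposition: "both 0 and 1 at once" until measured.
h = 1 / math.sqrt(2)
superposition = (h, h)

def measure_probabilities(state):
    """Return (P(measure 0), P(measure 1)) for a single-qubit state."""
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

print(measure_probabilities(zero))           # (1.0, 0.0)
print(measure_probabilities(superposition))  # ~ (0.5, 0.5)
```

The coin analogy maps directly onto the last line: the superposed qubit is a fair coin mid-flip, and measurement is the landing.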


Through superposition, a group of n qubits can hold a combination of all 2^n possible bit patterns at once, creating the computational spaces mentioned earlier. Now imagine how much information a group of qubits can encode. It's phenomenal, to say the least.
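That 2^n growth is the whole point, and it's easy to see in a few lines of Python. This toy helper (an illustration, not a real quantum simulator) counts how many amplitudes it takes to fully describe n qubits:

```python
# n classical bits store exactly one n-bit value at a time. Describing
# n qubits in superposition takes 2**n amplitudes, one per possible
# bit string, which is why the state space grows exponentially.
def amplitudes_needed(n_qubits: int) -> int:
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {amplitudes_needed(n)} amplitudes")
```

By 50 qubits you're already at over a quadrillion amplitudes, more than a classical machine can comfortably track, which hints at why simulating quantum computers classically gets hard so fast.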


Wow, I barely scratched the surface of superposition. There's so much more to these computers and how their qubits behave; perhaps I'll write another post about entanglement. Quantum computers are the future. The technology is still under development, but I'm certain we'll be seeing quantum laptops at some point in the future.





 
 
 
