While the concepts of space and time were fundamental to the Newtonian world, centuries of digging deeper into the mechanics of our universe have uncovered that it isn’t all as simple as it seems. From Einstein’s Special Relativity to theories of multi-dimensional time, the science behind space and time has evolved into a complex field.
Why Extra Temporal Dimensions?
The search for extra spatial dimensions raises the question of whether extra temporal dimensions might exist as well. If space can have more dimensions, why can’t time? The motivation to explore extra temporal dimensions arises from a desire to better understand the nature of time and the symmetries between space and time.
Another reason to study extra temporal dimensions is the desire to unify seemingly disconnected descriptions of time. Many frameworks with extra temporal dimensions have revealed previously unnoticed symmetries and relationships between different temporal systems that would not be discovered while working in a single time dimension.
The concept of “complex time” has been proposed to address some of the open problems of quantum mechanics. The idea is to represent time as a complex number rather than a real one, allowing more ways to represent wave-particle duality, entanglement, and other fundamental concepts of quantum physics.
2T-Physics
Proposed by physicist Itzhak Bars, 2T-Physics suggests that the one dimension of time we experience is really just a “shadow” of the real two dimensions of time. The core motivation of 2T-Physics is to reveal the deeper temporal connections that we don’t see in our one-dimensional perspective. In 2T-Physics, two seemingly disconnected temporal systems are actually connected and represent different views or ‘shadows’ of the same two-dimensional time.
2T-Physics unifies a wide range of physical systems using “gauge symmetry”: the property that a set of transformations, called gauge transformations, can be applied to a system without changing any of its physical properties. Bars also showed that the Standard Model can be reproduced within 2T-Physics using four spatial and two temporal dimensions. Not only does this formulation reproduce most of the Standard Model, it also offers solutions to some outstanding quantum issues.
An interesting difference between the Standard Model and the predictions of 2T-Physics concerns the gravitational constant. While the coefficient in gravitational equations is currently taken to be a constant 6.67×10⁻¹¹ m³ kg⁻¹ s⁻², the mathematics of 2T-Physics implies that the gravitational constant can take different values in different eras of our universe (inflation, grand unification, etc.). This opens possibilities for the early expansion of our universe that General Relativity and the Standard Model do not allow. Through this new perspective, 2T-Physics offers a more complete framework for gravity, especially in higher dimensions.
While 2T-Physics is mathematically well-developed, it remains highly theoretical and has had little practical impact. There is as yet no direct evidence supporting the theory, but 2T-Physics predicts connections between different physical systems that could in principle be verified through carefully designed experiments, though none have been conducted so far. Above all, 2T-Physics provides a new perspective on time and the nature of physical law that has opened the eyes of many scientists and will likely inspire future work.
3D Time
One of the most recent papers in the field, by Gunther Kletetschka (2025), proposes a mathematical framework for spacetime that includes three temporal dimensions, offering a new perspective on combining gravity and quantum mechanics. Rather than hiding the extra dimensions of time, Kletetschka theorizes that each temporal dimension represents time at a different scale: the quantum scale, the interaction scale, and the cosmological scale. The other two dimensions are not apparent in our daily lives because they operate at very small (quantum) or very large (cosmological) scales.
A major difference between this theory and conventional physics is that, while conventional physics treats space as something fundamentally different from time, Kletetschka proposes that space emerges as a byproduct of time in these three dimensions rather than being an entirely separate entity. What we experience as mass or energy arises from the curvature of time across the three dimensions. As Kletetschka explored this further, he found surprising consistency in the mathematics, which led to a deeper development of the concept.
The key to avoiding causality issues and instability was the use of ordinary geometry and spatial dimensions instead of exotic constructions that are hard to prove or test. The theory aims to address many long-standing issues in quantum mechanics, and its success so far has made it a prominent proposal in the field.
The theory adds extra temporal dimensions without creating causality problems, something very few theories of its type have managed, and this is due to its structure. The framework is designed so that the three temporal axes share an ordered flow, preventing an event from happening before its cause. Furthermore, the three axes operate at very different scales, leaving little overlap between them. The mathematics of the framework does not allow the past to be altered, something many other theories permit.
The theory offers physical significance and a connection to our world alongside mathematical consistency. Features such as finite quantum corrections, which other theories could not produce, emerge from this model without adding extra complexity.
This mathematical framework predicts several properties and new phenomena that can be experimentally tested, offering pathways to prove or disprove it soon. Meanwhile, many scientists have spoken in support of the theory, considering it a promising step toward a “Theory of Everything” just a few months after its publication.
Conclusion
While the theoretical motivation for extra dimensions is compelling, the reality of their existence remains unconfirmed. Meanwhile, the scientific community works to experimentally prove or disprove their existence through observational evidence.
The Large Hadron Collider (LHC) at CERN is one of the major players on the experimental side. They engage in many experiments, a few of which I have highlighted below.
Tests for Microscopic Black Holes: Many theories that propose extra dimensions predict that gravity becomes much stronger at very short distances. This could manifest physically as microscopic black holes that would evaporate almost instantly via Hawking radiation. The byproducts of this evaporation would be particle showers detectable at the LHC.
The Graviton Disappearance: Another common feature of extra-dimensional theories is that gravity is carried by a particle called the graviton. A graviton could escape into the extra dimensions, taking energy with it, which would appear as missing energy in the total energy balance of a collision.
While experiments have placed tighter constraints on the parameter values these theories allow, they have yet to prove or disprove them.
Meanwhile, it is important to consider what extra dimensions would mean for us and the way we live. The concept of extra dimensions provides multiple philosophical considerations for us as humans. This concept completely changes our worldview and affects our perception of the universe. Dr. Michio Kaku explains this through the analogy of a fish in a pond, unaware of the world outside its simple reality. Our perception of reality is limited, not only by our understanding of physics, but also by the biology of our brains.
The work towards a “Theory of Everything” is not only a physical goal but a philosophical one as well. We strive to understand our universe and everything within it in the simplest way possible. It embodies human desire for ultimate knowledge and drives centuries of physical progress.
Overall, the search for extra dimensions represents one of the most arduous and ambitious goals in human history. While these theories lack experimental confirmation, they motivate people to probe deeper into the nature of our universe and question the very fabric of reality. This drive toward further discovery shows who we are as humans and will continue to motivate generations of physicists to question the nature of everything.
Duff, M. J. (1996). M theory (the theory formerly known as strings). International Journal of Modern Physics A, 11(32), 5623–5641. https://doi.org/10.1142/s0217751x96002583
Kletetschka, G. (2025). Three-dimensional time: A mathematical framework for fundamental physics. Reports in Advances of Physical Sciences, 9. https://doi.org/10.1142/s2424942425500045
Kalligas, D., Wesson, P. S., & Everitt, C. W. F. (1995). The classical tests in Kaluza-Klein gravity. The Astrophysical Journal, 439(2). https://ntrs.nasa.gov/citations/19950044695
Lloyd, S., Maccone, L., Garcia-Patron, R., Giovannetti, V., Shikano, Y., Pirandola, S., Rozema, L. A., Darabi, A., Soudagar, Y., Shalm, L. K., & Steinberg, A. M. (2011). Closed Timelike Curves via Postselection: Theory and Experimental Test of Consistency. Physical Review Letters, 106(4). https://doi.org/10.1103/physrevlett.106.040403
What are Extra Spatial Dimensions?
As scientists explored spacetime further, theories proposing dimensions of space beyond the three we know were suggested as a way to account for phenomena that cannot be explained in three dimensions alone. These ideas gained most of their traction from the effort to combine quantum mechanics with General Relativity, especially problems such as quantum gravity. These theories also attempt to address the rapid growth of the universe after the Big Bang.
What were the motivations to search for Extra Dimensions?
The idea of more dimensions began as a way to unify the fundamental forces of our universe. Modern theories regarding these ideas come from a drive to resolve some of the unaddressed issues of the Standard Model of physics. While the Standard Model is able to describe fundamental particles and the strong, weak, and electromagnetic forces, it is unable to describe gravity. In addition, the Standard Model cannot address dark matter and dark energy, which make up the majority of our universe.
One of the most significant problems in physics is the hierarchy problem: the enormous gap in strength between gravity and the other three fundamental forces. Extra-dimensional models attempt to resolve this by suggesting that gravity may be just as strong as the other forces, but that its strength leaks into the extra dimensions, leaving it diluted in ours.
This search to discover extra dimensions is not only about solving these specific technical issues; it’s about the centuries-long quest to find a Theory of Everything. Physicists constantly strive to find simpler solutions to describe our universe rather than leaning on hyperspecific coefficients/constants.
While there are many theories involving extra spatial dimensions, part 2 will focus on a few of the biggest and most influential theories so far.
Kaluza-Klein Theory
In 1919, Theodor Kaluza proposed a theory with a fourth spatial dimension as an attempt to unify gravity and electromagnetism. The theory was later built upon by Oskar Klein in 1926.
In his attempt to combine these fundamental forces, Kaluza suggested a fourth, unseen spatial dimension. To build this framework, he extended Einstein’s equations into five dimensions (four of space plus time). He found that the five-dimensional version of Einstein’s equations naturally contained the four-dimensional version within it. The five-dimensional metric has fifteen independent components: ten describe our four-dimensional General Relativity, four describe the electromagnetic force through Maxwell’s equations, and the last is a scalar field that had no known role at the time.
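For readers who want the bookkeeping, one common textbook way to write this decomposition (standard modern notation, not Kaluza’s original) is:

\[
\hat{g}_{AB} =
\begin{pmatrix}
g_{\mu\nu} + \phi^2 A_\mu A_\nu & \phi^2 A_\mu \\
\phi^2 A_\nu & \phi^2
\end{pmatrix},
\qquad A,B = 0,\dots,4, \quad \mu,\nu = 0,\dots,3
\]

A symmetric 5×5 metric has fifteen independent components: ten in the spacetime metric g_{μν}, four in the electromagnetic potential A_μ, and one in the scalar field φ.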
A key concept of Kaluza-Klein theory is that electric charge, rather than being seen as simply an event or calculation, is represented as motion along the fifth dimension. The attempt to create the simplest mathematical structure for the five dimensions led to the assumption that no part of the five-dimensional Einstein equations depends explicitly on the fifth coordinate. Its presence was there to unify the forces without disrupting the basic behavior of Einstein’s equations. To achieve this, Kaluza imposed the cylinder condition, requiring all derivatives with respect to the fifth coordinate to vanish, effectively hiding the extra dimension at a macroscopic level and preserving the four dimensions we experience.
Oskar Klein produced a physical explanation for the cylinder condition in 1926. He suggested that the fifth dimension is compactified, curled up into an unobservably small circle, which is why we are unable to witness it.
An interesting way to understand this is to think of a hose. From a distance, the hose looks like a single-dimensional line. However, the hose actually has two dimensions, both a dimension of length as well as a circular dimension.
This theory revolutionized how physicists thought about spacetime. In a 1919 letter to Kaluza, Einstein wrote,
“The idea of achieving unification by means of a five-dimensional cylinder world never dawned on me […]. At first glance, I like your idea enormously. The formal unity of your theory is startling.” (Einstein, 1919)
Over time, Kaluza-Klein theory has been set aside due to several fundamental flaws. Scientists have searched for Kaluza-Klein resonances, particles that would have to exist if the theory were true, and have found none. In addition, Kaluza-Klein theory addresses only gravity and electromagnetism, excluding the strong and weak forces. When combined with quantum mechanics, it predicts incorrect values for otherwise known constants, showing large discrepancies. Despite these issues, Kaluza-Klein theory has long been considered the first step into the exploration of extra dimensions and the precursor to many theories in the decades that followed. Its core idea, that hidden dimensions give rise to forces in our four dimensions, has been crucial to further exploration of spacetime.
String Theory is a very common term, but few people actually know what it means. String theory proposes that instead of the universe being made up of zero-dimensional point particles, it is made up of tiny vibrating strings. The specific vibration of each string determines what particle it appears as (photon, quark, etc.). The theory aims to unify all of these different particles and properties into one object: the string.
When physicists first began to work on String Theory, they ran into many mathematical issues, such as negative probabilities. In four dimensions, the strings do not have enough room to produce the wide range of vibrations needed to account for all the particles of the Standard Model. Superstring Theory therefore places the strings in ten-dimensional spacetime (nine dimensions of space and one of time). A major reason physicists embraced string theory was that it naturally predicted a particle called the ‘graviton’, which would carry the force of gravity. Theoretical physicist Edward Witten has commented on this by saying,
“Not only does [string theory] make it possible for gravity and quantum mechanics to work together, but it […] forces them upon you.” (Edward Witten, NOVA, PBS)
M-Theory is an extension of String Theory that adds one more spatial dimension. Prior to its creation, different groups of physicists had created five versions of String Theory.
However, a true “Theory of Everything” should be one theory, not five possibilities.
M-Theory was created as an attempt to unify these five types of string theory. The key to its development was the discovery of mathematical transformations that take you from one version of String Theory to another, showing that these were not truly separate theories. M-Theory treats the different versions as different approximations of one underlying theory, unified by adding another dimension. Its eleven-dimensional framework unifies the five string theories together with the theory of supergravity.
M-Theory, like Kaluza-Klein theory, proposes that the extra dimensions are curled up and compacted. The best-known compactification shapes are Calabi-Yau manifolds, which generate the physical effects we observe in our four large dimensions from the hidden ones. Calabi-Yau manifolds are highly compact and complex, and they are central to string/M-theory because they allow intricate folding without affecting the overall curvature of our universe, through a property called “Ricci-flatness”. The “holes” within these manifolds are thought to be connected to the number of particle families we observe in the Standard Model. This introduces the key idea that, instead of the fundamental laws of physics being mere rules, they are geometric properties of our universe.
The biggest challenge M-Theory faces is its lack of experimental evidence. Predictions made by this model are not testable with currently available or foreseeable technology because of the microscopic, high-dimensional scales involved. Without testable predictions, the theory remains speculative for the time being.
Despite this lack of proof, many physicists still see M-Theory as a prominent candidate in our search for a “Theory of Everything”. Its mathematical consistency and its ability to unify both gravitational and quantum effects lead to it being considered highly promising.
However, while the math behind M-Theory is highly developed, it is not yet complete. The theory is still a work in progress as research is being conducted to better understand its structure and significance.
Meanwhile, critics argue that M-Theory is fundamentally flawed. Many point to the “Landscape” problem: the theory predicts an enormous number of possible universes, each with its own set of physical laws. Critics argue that this undermines M-Theory’s predictive power and that a true “Theory of Everything” should single out our universe.
Overall, M-Theory has neither been proven nor disproven and remains a crucial area for future exploration.
Newtonian Absolutism
At the dawn of classical mechanics, Newton created the foundation upon which all of modern spacetime theory is built. Space and time were considered to be entirely unrelated and absolute concepts. There was no question in his mind that time moves forward and space exists around us. Space was considered a static body within which we exist, while time was described as flowing in only one direction at a steady rate. Imagine space as a box, where events are contained within, and time as a river whose current pulls us along.
Newton coined the terms ‘absolute space’ and ‘absolute time’ to distinguish these absolutes from the relative quantities we measure. For centuries this view went largely unquestioned, even as some philosophers argued that time and space are not real entities but merely our human way of interpreting the world around us.
Einstein’s Revolution:
Special Relativity
The first true challenge to the Newtonian view of space and time came in the form of Einstein’s Special Relativity. He introduced one revolutionary concept: measurements of space and time are relative, depending on the observer’s frame of reference.
The motivations for Einstein’s work arose from the desire to eliminate the contradiction between Maxwell’s equations and Newtonian Mechanics. A simple way to visualize this contradiction is by imagining the following scenario:
Two rockets in space fly towards each other, each at 500 miles per hour, giving a relative speed of 1,000 miles per hour. If you were to throw a rock forward from one ship at 10 miles per hour, a stationary observer would see it moving at 510 miles per hour: the speeds simply add. Substituting light for the rock breaks this rule, because the speed of light is constant. No matter how fast you travel towards a beam of light, it always approaches you at the same speed: 3×10⁸ m/s, the speed of light.
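Special Relativity replaces simple velocity addition with a formula that keeps the speed of light fixed. The standard relativistic velocity-addition law (not derived in the text, but it makes the rocket example precise) is:

\[
u' = \frac{u + v}{1 + uv/c^2}
\]

At everyday speeds the correction term uv/c² is negligible, so speeds still effectively add; but if u = c, then u' = (c + v)/(1 + v/c) = c for any v, which is exactly the constancy of the speed of light described above.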
Many experiments were performed in attempts to explain away this strange behavior of light. Rather than trying to disprove it, Einstein took the constant speed of light as a fundamental postulate. He did not explain the speed of light; he used it to explain other things. Einstein was willing to give up the time-honored fundamentals of Newton’s laws in favor of the constancy of the speed of light.
He began with the basic definition of speed as distance divided by time. If the speed of light stays constant while the rocket reduces the distance to be travelled, then the time must also shrink to preserve the equality. Working this out mathematically, Einstein arrived at the concept of time dilation, where objects in motion experience time more slowly than objects at rest. Continuing with similar reasoning for other conserved quantities, he found that mass effectively increases with speed and that lengths contract. Einstein’s true genius lay in his willingness to question his own assumptions and give up some of the most basic intuitions about the universe in favor of the constancy of the speed of light.
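The quantitative statement of time dilation (a standard result, included here for concreteness rather than quoted from the text) is:

\[
\Delta t' = \frac{\Delta t}{\sqrt{1 - v^2/c^2}} = \gamma\,\Delta t
\]

A clock moving at speed v ticks more slowly by the Lorentz factor γ; the same factor governs length contraction, L = L₀/γ, and the growth of relativistic momentum with speed.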
General Relativity
Special Relativity, however, did not incorporate gravity. Before Einstein, physicists believed gravity was an invisible force that dragged objects towards one another. Einstein’s General Relativity suggested that this ‘dragging’ is not a force at all but an effect of geometry: massive objects bend the spacetime around them, which brings other objects closer together.
General Relativity defines spacetime as a four-dimensional entity that obeys a set of equations known as Einstein’s field equations. Einstein used these equations to argue that gravity isn’t a force but a name we give to the effects of curved spacetime on the distances between objects. He established a direct relationship between the mass and energy of an object and the curvature of the spacetime around it.
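For reference, Einstein’s field equations (quoted here in their standard form for concreteness) relate curvature on the left to mass-energy on the right:

\[
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
\]

Here G_{μν} encodes the curvature of spacetime, T_{μν} the distribution of mass and energy, and Λ is the cosmological constant.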
His work led him to the conclusion that, in his own words:
“When forced to summarize the general theory of relativity in one sentence: Time and space and gravitation have no separate existence from matter.” -Einstein.
Einstein’s General Relativity predicted many phenomena that were only observed years later. A famous example is gravitational lensing, the bending of light’s path as it passes a massive object. The effect was observed by Sir Arthur Eddington during the 1919 solar eclipse, years after Einstein had predicted it from theory alone.
Closed-Timelike-Curves (CTCs)
Another major prediction arising from Einstein’s General Relativity is the possibility of Closed-Timelike-Curves (CTCs), which come from particular mathematical solutions of Einstein’s equations. Certain solutions, such as those describing massive, spinning objects, create situations in which time could loop back on itself.
In physics, an object traces a specific trajectory through spacetime, its worldline, which gives the object’s position in space at every moment of time. Connecting these positions tells the story of the object’s past, present, and future. An object sitting still has a worldline that points straight along the time direction, while a moving object’s worldline also changes spatial position. Spacetime diagrams draw, at each event, two light cones, one opening into the future and one into the past, with a spatial dimension on the other axis, as seen in figure 1.
A CTC arises when an object’s worldline forms a loop, meaning the object eventually returns to the same point in both space and time. Closed-Timelike-Curves are, in essence, exactly what they sound like: closed loops through spacetime that are everywhere timelike. Traveling in a timelike way means that, locally, the object always moves through time more than through space, never exceeding the speed of light. As seen in Figure 2, the worldline of a CTC is a loop: some point in space and time connects the end back to the beginning.
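“Timelike” has a precise meaning in terms of the spacetime interval (standard notation, added here for clarity):

\[
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
\]

A segment of a worldline is timelike when ds² < 0, i.e. when the time part dominates the space part. A closed timelike curve is a worldline that is timelike at every point yet closes back on itself.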
Two major examples of famous CTC solutions are the Gödel Universe and the Tipler Cylinder:
Gödel Universe: Suggested by mathematician Kurt Gödel in 1949, the Gödel Universe is a rotating universe filled with swirling dust. The rotation is strong enough to drag spacetime around with it as it spins, and that dragging produces closed timelike curves. This was the first solution found to suggest that time travel could be a legitimate possibility rather than just a hypothetical scenario.
Tipler Cylinder: In the 1970s, physicist Frank Tipler suggested an infinitely long, massive cylinder spinning along the vertical axis at an extremely high speed. This spinning would twist the fabric of spacetime around the cylinder, creating a CTC.
Closed timelike curves bring many paradoxes with them, the most famous being the grandfather paradox. Suppose a woman travels back in time and kills her grandfather before her parent is born; then she would never exist. But if she never exists, there is no one to kill her grandfather, so she must exist after all. Yet if she exists, her grandfather dies, and she cannot.
Most importantly, CTCs drove further exploration and directed significant attention to the spacetime field for decades. Scientists who didn’t fully believe Einstein’s General Relativity pointed to CTCs as proof of why it couldn’t be true, leaving those who supported Einstein to search extensively for a way to explain them. This further exploration into the field has laid the foundation for many theories throughout the years.
The prevailing belief among scientists is that CTCs simply do not occur in nature: while they are mathematically possible, the requirements for creating them appear physically unattainable. Many of the known setups require negative energy density and other kinds of ‘exotic matter’ that have never been shown to exist. Furthermore, even if a CTC were to form, the region of spacetime where it forms would be highly unstable, so the CTC could not sustain itself. The situations in which CTCs would arise involve fields of energy that diverge toward infinity at the Cauchy horizon (the boundary beyond which causality can no longer be determined), making these scenarios physically unviable.
“Some experts in the field predict that the first quantum computer capable of breaking current encryption methods could be developed within the next decade. Encryption is used to prevent unauthorized access to sensitive data, from government communications to online transactions, and if encryption can be defeated, the privacy and security of individuals, organizations, and entire nations would be under threat.” – The HIPAA Journal
Introduction
The cybersecurity landscape is facing a drastic shift as the increasing power of quantum computers threatens modern encryption. Experts predict a quantum D-day (Q-day) in the next 5-10 years, when quantum computers will be powerful enough to break through even the strongest of today’s cybersecurity mechanisms. Meanwhile, only a few companies have begun to prepare for the threat by developing quantum-resistant cybersecurity methods. To fully counter the threat, we need to act now.
Encryption Today
Modern cryptography is dominated by two major algorithms that transform ordinary text into ciphertext:
1. Rivest-Shamir-Adleman (RSA)
Dating back to 1977, the RSA algorithm relies on the difficulty of factoring large numbers. An RSA key pair has two parts: a public key and a private key. The public key, used for encoding, is a pair of numbers (n, e), where n is the product of two large prime numbers (p·q = n). The value of e can be any number that is coprime to (p-1)(q-1), meaning that the greatest common divisor of e and (p-1)(q-1) is 1. The private key d, used for decoding, is the modular inverse of e: it is found by solving d·e ≡ 1 (mod (p-1)(q-1)) for d.
For decades, RSA has secured digital data because the sheer size of n makes it practically impossible to recover (p, q) from the public key (n, e). Quantum computing, however, brings the ability to factor large numbers quickly, allowing (p, q) to be determined from the public key alone.
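To make the recipe concrete, here is a minimal sketch of RSA with deliberately tiny primes; the specific values (p = 61, q = 53, e = 17, the message 1234) are illustrative choices, not taken from the article, and real keys use primes hundreds of digits long.

```python
from math import gcd

# Toy RSA key generation following the (p, q, n, e, d) recipe above.
p, q = 61, 53                      # two small primes, kept secret
n = p * q                          # 3233, published as part of the public key
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public exponent, coprime to phi
assert gcd(e, phi) == 1

d = pow(e, -1, phi)                # private exponent: d*e ≡ 1 (mod phi)

m = 1234                           # a message encoded as a number smaller than n
c = pow(m, e, n)                   # encrypt: c = m^e mod n
assert pow(c, d, n) == m           # decrypt: c^d mod n recovers m

print(f"n={n}, e={e}, d={d}, ciphertext={c}")
```

Breaking this toy key only requires factoring 3233 back into 61 × 53; the security of real RSA rests on that factoring step being infeasible for classical computers at 2048-bit sizes.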
2. Elliptic Curve Cryptography (ECC):
Since 1985, ECC algorithms have often been favored over RSA due to their greater complexity and faster encryption, with ECC’s operations proving to be up to ten times faster. ECC algorithms use an elliptic curve of the form y² = x³ + ax + b defined over a finite field Fp rather than the real numbers. The field Fp consists of the numbers from 0 to p-1, where p is prime.
Figure 1: The elliptic curve y² = x³ + 13 over the real numbers
Figure 2: The elliptic curve y² = x³ + 13 over F11
For illustration, take the elliptic curve y² = x³ + 13 and the field F11. Figure 1 shows the curve over the real numbers, while Figure 2 shows the solutions to y² = x³ + 13 (mod 11). The order of the curve is the number of points, including the arbitrary point at infinity, that satisfy the equation over the chosen field (12 points in Figure 2). The private key is some value k between 1 and the order of the curve. The public key is calculated by taking a chosen point, called the generator point G, and multiplying it by k to get kG. The system then encrypts information using the public key kG, and only those who know k can decrypt it.
For example, take k = 5 and the point (9, 4) as the generator point G. Multiplying out 5G gives the point (9, 7), which becomes the public key. However, given just the two points G and kG, it is extremely difficult at realistic sizes to recover the value of k.
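To make the arithmetic concrete, here is a small sketch of the curve operations for exactly these toy parameters; the helper functions add and mul are illustrative, not drawn from any particular ECC library.

```python
# Toy elliptic-curve arithmetic over F_11 for y^2 = x^3 + 13 (13 ≡ 2 mod 11).
# Educational sketch only; real ECC uses ~256-bit primes and vetted libraries.
p = 11
a, b = 0, 13 % p              # curve coefficients
O = None                      # the point at infinity (identity element)

def add(P, Q):
    """Chord-and-tangent point addition on the curve."""
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                        # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p)    # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p)           # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p                     # reflect across the x-axis
    return (x3, y3)

def mul(k, P):
    """Scalar multiplication kP by repeated addition (fine at toy scale)."""
    R = O
    for _ in range(k):
        R = add(R, P)
    return R

G = (9, 4)
print(mul(5, G))              # -> (9, 7), matching the worked example above
```

Going in the other direction, recovering k from G and kG, is the elliptic curve discrete logarithm problem discussed next; it is trivial on a 12-point toy curve but considered infeasible on the enormous curves used in practice.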
ECC algorithms have long been considered practically unbreakable due to the elliptic curve discrete logarithm problem, or ECDLP. The ECDLP asks: given two points P and Q on an elliptic curve, find the integer k such that kP = Q.
The operation at the heart of this is point multiplication. To double a point P, a tangent line is drawn to the curve at P; where that line intersects the curve again gives a point which, reflected across the x-axis (in modular arithmetic, y becomes p - y), yields 2P. Repeated doublings and additions eventually produce kP. While it is straightforward to compute Q given P and k, it is practically impossible to find k given P and Q, because there is no known efficient algorithm that inverts point multiplication to solve for the scalar.
Ultimately, RSA and ECC algorithms are what encrypt all of digital data and communication. They keep everything secure from classified government data to something as simple as a text message. Encryption allows private information to remain private and large national or international systems to continue functioning. It acts as a barrier against bad actors looking to hack or exploit this private data. Without encryption, there would be no safeguard for any data. Imagine if everything you ever put on a device, whether private photos or bank information, suddenly became public. You would no longer be able to trust digital privacy and safety if these algorithms were to fail.
To understand the momentous advances in quantum computing, it helps to step back and examine the field’s origins and how quantum mechanics evolved over time. In 1900, Max Planck’s ‘quantum hypothesis’ proposed that, rather than flowing continuously as conventionally assumed, energy is emitted in discrete packets called quanta. His work laid the foundation for what became the field of quantum mechanics. Einstein’s 1905 work on the photoelectric effect and Niels Bohr’s 1913 model of the atom further supported the idea by introducing quantum leaps and the particle-like behavior of photons.
In 1927, Heisenberg formulated his uncertainty principle, which states that it is impossible to know both the position and the momentum of a particle with perfect accuracy at the same time. In 1935, Einstein, Podolsky, and Rosen published a joint paper questioning quantum mechanics via entanglement, the apparent ability of one particle’s state to influence another’s instantaneously across great distances. Recent work has shown that entanglement can connect particles even between a satellite and the Earth. In 1964, John Bell derived the inequalities that made entanglement experimentally testable, and later experiments searching for violations of the Bell inequalities confirmed it.
In 1926, Schrödinger created a system of wave equations that accurately predicted the energy levels of electrons in atoms. Von Neumann, building on this alongside Hilbert’s work, created the mathematical framework of quantum mechanics, formalizing quantum states and providing a method for understanding the behavior of quantum systems. In the 1940s, Feynman, Schwinger, and Tomonaga developed quantum electrodynamics (QED), which describes the interactions of light and matter.
The 1980 conference of physicists, mathematicians, and computer scientists marked the turning point from quantum theory to quantum applications, laying the foundation for quantum computing. While the first working laser was built in 1960, the computational side of quantum mechanics was not explored much further until Paul Benioff’s 1980 description of a quantum computer, the first step towards quantum computing.
Quantum Computing: What is it and how does it work?
Superposition: The state of being in multiple states or places at once. Superposition is most commonly seen with overlapping waves, but at the quantum level it can be understood as a particle being in both state 1 and state 0 at the same time. However, when measured, the particle must settle into either state 1 or state 0. The best-known analogy is Schrödinger’s cat: if you put a cat inside a box with a substance that has an equal chance of killing or not killing it within an hour, then after one hour the cat can be described as both dead and alive, until you open the box, at which point it must be one or the other.
Entanglement: A phenomenon by which two particles become connected such that the state of one affects the other, irrespective of the distance between them. Prior to any measurement, the entangled pair is in a joint superposition, meaning the particles can be in both state 0 and state 1 at the same time. When one is measured, however, its outcome directly determines the state of the other. This behavior was made experimentally testable by John Bell via the Bell inequalities, and experiments have since confirmed it.
Quantum computing allows more information to be stored and processed more efficiently, creating opportunities to dramatically increase the rate at which many modern machines work. While the technology faces setbacks in these early stages, it makes it possible to explore many computational paths in superposition, rather than executing strictly one operation after another as classical machines do.
Quantum systems use the qubit as the fundamental unit of information instead of the traditional bit. Qubits can exist in superpositions of one and zero, and entangling multiple qubits produces a state space that grows exponentially rather than linearly: n qubits require 2^n amplitudes to describe. For certain problems, this structure lets quantum computers with relatively few qubits outperform the fastest classical computers by enormous factors.
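As a small illustration (mine, not the article’s), qubit states can be written as vectors of complex amplitudes; the snippet below shows a single-qubit superposition, an entangled Bell pair, and why n qubits require 2^n amplitudes to describe.

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)          # |0>
one  = np.array([0, 1], dtype=complex)          # |1>
plus = (zero + one) / np.sqrt(2)                # equal superposition of 0 and 1

# Measurement probabilities are the squared amplitudes: 50/50 here.
print(np.abs(plus) ** 2)                        # [0.5 0.5]

# An entangled Bell pair: (|00> + |11>) / sqrt(2). Measuring one qubit
# as 0 (or 1) forces the other to the same value.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)                        # [0.5 0.  0.  0.5]

# The state of n qubits is a vector of 2**n complex amplitudes,
# which is the exponential growth mentioned above.
for n in (1, 2, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```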
Compare and Contrast: Quantum Computers vs. Traditional Computers
The Quantum Threat to Cryptography
While current computers may not be strong enough to carry out an attack on cryptography, the emerging field of quantum computing poses a risk to all of modern encryption.
Is the threat just theoretical?
Even as an emerging technology, quantum computing poses a very real threat to cryptography. While many would be happy to write it off as a threat of the distant future, that future may be closer than you think. Quantum computing has already demonstrated its potential through algorithms that could eventually compromise sensitive data.
The most prominent algorithm with respect to cryptography is Shor’s factoring algorithm from 1994, which is a major threat to RSA cryptosystems. As I mentioned earlier, RSA relies on publishing a large number that is the product of two primes, basing its security on the infeasibility of factoring that number.
According to Thorsten Kleinjung of the University of Bonn, it would take around two years to factor N = 135066410865995223349603216278805969938881475605667027524485143851526510604859533833940287150571909441798207282164471551373680419703964191743046496589274256239341020864383202110372958725762358509643110564073501508187510676594629205563685529475213500852879416377328533906109750544334999811150056977236890927563 with under 2 GB of memory.
Shor’s Algorithm could exponentially speed this up. It works as follows (a classical sketch of the arithmetic appears after these steps):
Start with the large number N and a guess g. If g is a factor of N, or shares a factor with N, then we have already found a factor.
If g shares no factor with N, we use the fact that for any g coprime to N there exists a power p and an integer m such that g^p = mN + 1. This can be rewritten as (g^(p/2) - 1)(g^(p/2) + 1) = mN, so the search for factors of N becomes a search for the exponent p.
This is where quantum computing makes the vital difference. The quantum system starts in a superposition over many possible exponents x. It then exploits the fact that if g^x mod N = r, then g^(x+p) mod N = r as well, where p is the period of g (the smallest exponent with g^p ≡ 1 mod N). Using superposition, we look for the values of x that produce the same remainder; the spacing between those x values is the period.
From the period we can read off a frequency, f = 1/p.
Here a Quantum Fourier Transform (the quantum analogue of the classical Fourier transform) is applied: once all the constructive and destructive interference of the superposition is accounted for, the surviving frequency is 1/p.
Now that we have a candidate for p, we compute g^(p/2) ± 1 and take their greatest common divisors with N as our factor candidates, repeating the procedure as necessary to correct for quantum error or an unlucky guess.
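As referenced above, here is a purely classical sketch of that arithmetic on a toy number; the period finding is done by brute force, which is exactly the step a quantum computer performs exponentially faster, and the choices N = 15, g = 7 are illustrative.

```python
from math import gcd

def find_period(g, N):
    """Smallest p > 0 with g**p ≡ 1 (mod N); brute force stands in
    for the quantum Fourier transform step."""
    value, p = g % N, 1
    while value != 1:
        value = (value * g) % N
        p += 1
    return p

N = 15          # toy semiprime (3 * 5)
g = 7           # guess coprime to N
assert gcd(g, N) == 1

p = find_period(g, N)          # period of g modulo N (here p = 4)
assert p % 2 == 0              # need an even period to split N

# (g**(p/2) - 1)(g**(p/2) + 1) is a multiple of N, so each factor
# shares a nontrivial divisor with N.
half = pow(g, p // 2, N)
factors = gcd(half - 1, N), gcd(half + 1, N)
print(p, factors)              # 4 (3, 5)
```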
Aside from algorithms, many companies have made recent advances toward building quantum computers. As recently as June 2025, Nord Quantique, a Canadian startup, announced a breakthrough ‘bosonic qubit’ with built-in error correction. This creates the potential for successful, encryption-breaking machines with roughly 1,000 qubits by 2031, far fewer than the one million qubits previously estimated.
The ‘Harvest Now, Decrypt Later’ Tactic
Another major reason quantum computing threatens cryptography is the ‘harvest now, decrypt later’ (HNDL) tactic. As the predicted Q-day nears (around 2035), threat actors have begun to collect and store encrypted data with the goal of decrypting it in the future, once sufficiently powerful quantum machines exist. The attackers may not be able to decrypt the data today, but they can intercept communications and stockpile the ciphertext.
While it is easy to dismiss these attacks as relevant only at the nation-state level, that assumption feeds a false sense of security. Stolen corporate information could let bad actors threaten economic chaos and large-scale disruption. In fact, experts believe these attacks have become increasingly focused on businesses, since businesses hold people’s data and the power to create mass economic instability.
Matthew Scholl, Chief of the Computer Security Division at NIST, described the threat by saying,
“Imagine I send you a message that’s top secret, and I’ve encrypted it using this type of encryption, and that message is going to need to stay top secret for the next 20 years. We’re betting that an adversary a) hasn’t captured that message somehow as we sent it over the internet, b) hasn’t stored that message, and c) between today and 20 years from now will not have developed a quantum machine that could break it. This is what’s called the store-and-break threat.”
The most concerning aspect of HNDL attacks is that it is nearly impossible to know your data has been harvested until quantum decryption becomes possible, by which point the damage will be irreversible. While not all data will still be valuable a decade from now, attackers are targeting data they believe will hold long-term value.
Over the past 10 years, incidents have arisen that resemble HNDL attacks:
In 2016, Canadian internet traffic to South Korea was rerouted through China
In 2020, data from many large online platforms was rerouted through Russia
A study by HP’s Wolf Security discovered that one third of the cyber attacks conducted by nation-states between 2017 and 2020 were aimed at businesses
Post-Quantum Cryptography (PQC)
However, companies and nations have already begun looking into ways to protect data from quantum attacks. Post-quantum encryption algorithms aim to encrypt data in a way that is as difficult for quantum machines to break as it is for classical computers.
The Deputy Secretary of US Commerce, Don Graves, said,
“The advancement of quantum computing plays an essential role in reaffirming America’s status as a global technological powerhouse and driving the future of our economic security. Commerce bureaus are doing their part to ensure U.S. competitiveness in quantum, including the National Institute of Standards and Technology, which is at the forefront of this whole-of-government effort. NIST is providing invaluable expertise to develop innovative solutions to our quantum challenges, including security measures like post-quantum cryptography that organizations can start to implement to secure our post-quantum future. As this decade-long endeavor continues, we look forward to continuing Commerce’s legacy of leadership in this vital space.”
One example of a potentially powerful PQC algorithm is CRYSTALS-Kyber, which NIST selected in 2022 as its primary algorithm for general encryption. NIST has since added HQC to its list of PQC algorithms, bringing the total to five algorithms that have met the standard.
NIST has published its PQC standards and urges organizations to begin incorporating them now, because the full shift to PQC may take as long as building the quantum computers themselves. Its key goals are not only to find algorithms resistant to quantum attack but also to diversify the underlying mathematics, reducing the risk that a single break compromises everything. NIST looks for algorithms that can be implemented and updated easily, preserving ‘crypto-agility’.
Many companies support PQCs and believe that they will safeguard the future of cryptography. Whitfield Diffie, cryptography expert, explains that
“One of the main reasons for delayed implementation is uncertainty about what exactly needs to be implemented. Now that NIST has announced the exact standards, organizations are motivated to move forward with confidence.”
Companies such as Google, Microsoft, IBM, and AWS are actively working on resistance to quantum threats, helping to build some of the most powerful PQC implementations. IBM is advocating for a Cryptography Bill of Materials (CBOM), a new standard for tracking cryptographic assets and introducing more oversight into the system. Microsoft is a founding member of the PQC Coalition, a group whose mission is to provide outreach and education supporting the shift to PQC as the primary form of encryption.
While PQC could be a valuable defense against quantum threats, there are setbacks that make people question the effort. The Supersingular Isogeny Key Encapsulation (SIKE) scheme, one of the NIST candidates for the PQC standard, fell to a successful attack by a classical computer, which overturned many of its underlying mathematical assumptions. In addition, many of these algorithms suffer from a lack of extensive testing and from uncertainty about how much quantum machines will actually be able to accomplish.
Conclusion
While the timeline of PQC development may be uncertain, it is imperative that we act now. Quantum computing is no longer a threat looming in the distant future but an approaching reality with significant impacts. It is imperative that we begin shifting towards these safer systems as a community. We cannot wait until the threat has arrived; we need to prepare now.
Rob Joyce, Director of Cybersecurity at the National Security Agency, has stated that,
“The transition to a secured quantum computing era is a long-term intensive community effort that will require extensive collaboration between government and industry. The key is to be on this journey today and not wait until the last minute.”
Above all, it is crucial to recognize the threat and take action. Educating the people is the first step towards group action. Let awareness be our first line of defense.