While the concepts of space and time were fundamental to the Newtonian world, centuries of digging deeper into the mechanics of our universe have uncovered that it isn’t all as simple as it seems. From Einstein’s Special Relativity to theories of multi-dimensional time, the science behind space and time has evolved into a complex field.
Newtonian Absolutism
At the dawn of classical mechanics, Newton created the foundation upon which all of modern spacetime theory is built. Space and time were considered to be entirely unrelated and absolute concepts. There was no question in his mind that time moves forward and space exists around us. Space was considered a static body within which we exist, while time was described as flowing in only one direction at a steady rate. Imagine space as a box, where events are contained within, and time as a river whose current pulls us along.
Newton coined the terms ‘absolute space’ and ‘absolute time’ to distinguish these absolutes from the relative quantities we actually measure. For centuries this framework remained unquestioned, though many physicists regarded space and time not as real entities in themselves, but as our human way of interpreting the world around us.
Einstein’s Revolution:
Special Relativity
The first true challenge to the Newtonian perspective of space and time came in the form of Einstein’s Special Relativity. He introduced one key revolutionary concept: everything, including space and time, is relative, depending only upon the observer’s frame of reference.
The motivations for Einstein’s work arose from the desire to eliminate the contradiction between Maxwell’s equations and Newtonian Mechanics. A simple way to visualize this contradiction is by imagining the following scenario:
Two rockets in space are flying towards each other, each at a speed of 500 miles per hour, giving a relative closing speed of 1,000 miles per hour. Now, if you were to throw a rock from one ship to the other at 10 miles per hour, it would reach the other ship at a relative speed of 510 miles per hour. Substituting light for the rock, however, breaks this arithmetic, because the speed of light is constant: no matter how fast you travel towards a beam of light, it will always approach you at the same speed, roughly 3×10⁸ m/s.
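One way to see how a constant light speed can be consistent is the relativistic velocity-composition law, w = (u + v)/(1 + uv/c²), which Special Relativity substitutes for plain addition. A minimal sketch (the speeds and unit conversion are illustrative, not from any experiment):

```python
# Relativistic velocity composition: w = (u + v) / (1 + u*v / c^2).
C = 299_792_458.0  # speed of light in m/s

def add_velocities(u: float, v: float) -> float:
    """Compose two velocities relativistically (both in m/s)."""
    return (u + v) / (1 + u * v / C**2)

mph = 0.44704  # one mile per hour, in m/s

# At everyday speeds the result is indistinguishable from Newtonian addition:
print(add_velocities(500 * mph, 10 * mph) / mph)   # ~510 mph

# But light composed with any velocity is still c (up to float rounding):
print(add_velocities(500 * mph, C))
```

At rocket speeds the correction term uv/c² is vanishingly small, which is why Newtonian addition works so well in daily life; only when one of the speeds approaches c does the denominator matter.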
Many tests were done to determine whether the wave-particle duality of light could explain this phenomenon. Rather than trying to disprove or explain it away, Einstein decided to take the constant speed of light as a fundamental postulate. He didn’t explain the speed of light; he used it to explain other things, and he was willing to give up the time-honored fundamentals of Newton’s laws in its favor.
He began with the basic definition of speed as distance divided by time. If the speed of light is to remain constant while a moving rocket shortens the distance the light must travel, then the elapsed time must shrink as well to preserve the equality. Working this out mathematically, Einstein arrived at the concept of time dilation: objects in motion experience time more slowly than objects at rest. Applying similar reasoning to other quantities, such as conserved ones, he found that measured mass increases with speed while length contracts. Einstein’s true genius lay in his willingness to question his own assumptions and give up some of the most basic intuitions about the universe in favor of the constancy of the speed of light.
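The factor that falls out of this reasoning is the Lorentz factor, γ = 1/√(1 − v²/c²): a moving clock that records a proper time Δt is seen by a stationary observer to take γ·Δt. A small sketch with an illustrative speed:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def lorentz_factor(v: float) -> float:
    """gamma = 1 / sqrt(1 - v^2/c^2) for a speed v in m/s."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def dilated_time(proper_time: float, v: float) -> float:
    """Time a stationary observer measures while a clock moving
    at speed v records proper_time."""
    return proper_time * lorentz_factor(v)

# A clock moving at 80% of light speed: gamma = 5/3.
print(round(lorentz_factor(0.8 * C), 4))        # 1.6667
# One hour (3600 s) on the moving clock looks like 6000 s at rest.
print(round(dilated_time(3600, 0.8 * C), 1))    # 6000.0
```

At 0.8c the effect is dramatic; at airplane speeds γ differs from 1 only in the trillionths, which is why we never notice time dilation directly.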
General Relativity
Special Relativity, however, did not incorporate gravity. Before Einstein, physicists believed that gravity was an invisible force that dragged objects towards one another. Einstein’s General Relativity suggested instead that the ‘dragging’ was not a force at all but an effect of geometry: objects bend the space around them, inadvertently bringing other objects closer together.
General Relativity defines spacetime as a four-dimensional entity that must obey a set of equations known as Einstein’s field equations. He used these equations to suggest that gravity isn’t a force but a name we give to the effects of curved spacetime on the distances between objects. Einstein showed a direct relationship between the mass and energy of an object and the curvature of the spacetime around it.
As Einstein himself put it:
“When forced to summarize the general theory of relativity in one sentence: Time and space and gravitation have no separate existence from matter.” -Einstein.
Einstein’s General Relativity predicted many phenomena that were only observed years later. A famous example is gravitational lensing, in which the path of light curves as it passes a massive object. Sir Arthur Eddington observed this effect during the 1919 solar eclipse, yet Einstein had predicted it, with no observational proof, as early as 1912.
Closed-Timelike-Curves (CTCs)
Another major prediction of Einstein’s General Relativity is Closed Timelike Curves (CTCs), which arise from mathematical solutions to Einstein’s equations. Certain solutions, such as those describing massive, spinning objects, create situations in which time could loop back on itself.
In physics, every object traces a specific trajectory through spacetime, called a worldline, which records the object’s position in space and time at every moment. Connected together, these positions tell the story of an object’s past, present, and future. An object sitting still has a worldline that runs straight along the time direction, while a moving object’s worldline also changes spatial position. Spacetime diagrams plot time against a spatial dimension and show two light cones at each event, one opening into the future and one into the past, as seen in Figure 1.
CTCs are created when the worldline of an object forms a loop, meaning that at some point the object travels backwards in time and reconnects with its own starting point. Closed Timelike Curves are, in essence, exactly what they sound like: closed loops through spacetime traversed in a timelike way, meaning that along the curve the change in time always exceeds the change in space (the object never locally exceeds the speed of light). As seen in Figure 2, the worldline of a CTC is a loop: some point in space and time joins the end back to the beginning.
Two major examples of famous CTC solutions are the Gödel Universe and the Tipler Cylinder:
Gödel Universe: Suggested by mathematician Kurt Gödel in 1949, the Gödel Universe is a rotating universe filled with swirling dust. The rotation is powerful enough to drag the surrounding spacetime along as it spins, and that curvature is what forms the CTCs. This was the first solution found suggesting that time travel could be a legitimate possibility, not just a hypothetical scenario.
Tipler Cylinder: In the 1970s, physicist Frank Tipler suggested an infinitely long, massive cylinder spinning along the vertical axis at an extremely high speed. This spinning would twist the fabric of spacetime around the cylinder, creating a CTC.
Closed timelike curves bring many paradoxes with them, the most famous being the grandfather paradox: if a man has a granddaughter who goes back in time to kill her grandfather before her parents are born, then she would never exist. But if she doesn’t exist, there is no one to kill her grandfather, meaning she must exist. Yet if she exists, her grandfather doesn’t survive.
Most importantly, CTCs drove further exploration and directed significant attention to the spacetime field for decades. Scientists who didn’t fully believe Einstein’s General Relativity pointed to CTCs as proof of why it couldn’t be true, leaving those who supported Einstein to search extensively for a way to explain them. This further exploration into the field has laid the foundation for many theories throughout the years.
The prevailing belief among scientists is that CTCs simply don’t exist: while hypothetically possible, the conditions required to create them are not physically attainable. Many of the proposed setups require objects with negative energy density and other kinds of ‘exotic matter’ that have never been shown to exist. Furthermore, even if a CTC were to form, the region of spacetime where it formed would be highly unstable, meaning the CTC could not sustain itself. The situations in which CTCs would be feasible involve energy fields that diverge toward infinity at the Cauchy horizon, the boundary beyond which causality no longer holds, rendering these scenarios physically unviable.
“Some experts in the field predict that the first quantum computer capable of breaking current encryption methods could be developed within the next decade. Encryption is used to prevent unauthorized access to sensitive data, from government communications to online transactions, and if encryption can be defeated, the privacy and security of individuals, organizations, and entire nations would be under threat.” – The HIPAA Journal
Introduction
The cybersecurity landscape is facing a drastic shift as the increasing power of quantum computers threatens modern encryption. Experts predict a quantum D-Day (‘Q-Day’) within the next 5-10 years, when quantum computers will be powerful enough to break through even the strongest cybersecurity mechanisms. Meanwhile, only a few companies have begun to prepare for the threat by developing quantum-resistant cybersecurity methods. To fully combat it, we need to act now.
Encryption Today
Modern cryptography is dominated by two major algorithms that transform ordinary text into ciphertext:
1. Rivest-Shamir-Adleman (RSA)
Dating back to 1977, the RSA algorithm relies on the difficulty of factoring large numbers. RSA can be separated into two parts, a private and a public key. The public key, used for encoding, is a pair of numbers (n, e), where n is the product of two large prime numbers (p·q = n). The value of e can be any number coprime to (p−1)(q−1), meaning that the GCF of (p−1)(q−1) and e is 1. The private key d, used for decoding, is the modular inverse of e: the value satisfying d·e ≡ 1 (mod (p−1)(q−1)).
For decades, RSA has secured digital data because the sheer scale of n, combined with the variability of e, makes it practically impossible to recover (p, q) from (n, e). However, quantum computing brings the ability to factor large numbers quickly, allowing (p, q) to be determined from the public key alone.
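The whole scheme can be sketched with deliberately tiny textbook primes (real deployments use primes hundreds of digits long, so this is for intuition only):

```python
from math import gcd

# Toy RSA with tiny illustrative primes -- utterly insecure, intuition only.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
assert gcd(e, phi) == 1

d = pow(e, -1, phi)        # private exponent: modular inverse of e mod phi
assert (d * e) % phi == 1

message = 65
cipher = pow(message, e, n)   # encrypt with the public key (n, e)
plain = pow(cipher, d, n)     # decrypt with the private key d
print(cipher, plain)          # plain comes back as 65
```

Breaking this toy instance only requires factoring 3233 into 61 × 53; the security of real RSA is exactly the claim that the same step is infeasible when n has 600+ digits.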
2. Elliptic Curve Cryptography (ECC):
Since 1985, ECC algorithms have been favored over RSA due to their harder underlying problem and faster encryption, with ECC proving up to ten times faster in practice. ECC algorithms use an elliptic curve of the form y² = x³ + ax + b over a finite field Fp rather than the real numbers. The field Fp consists of the integers from 0 to p−1, where p is prime, with arithmetic taken modulo p.
Figure 1: The elliptic curve
Figure 2: The elliptic curve over F11
For the purpose of illustration, let us take the elliptic equation y² = x³ + 13 and the field F11. Figure 1 shows the elliptic curve, while Figure 2 shows the solutions to y² = x³ + 13 (mod 11). The order of the curve is the number of points, including the arbitrary one at infinity, that satisfy the equation over a specific field (12 points in Figure 2). The private key is some value k between 1 and the order of the curve. The public key is calculated by taking one of the points, called the generator point G, and multiplying it by k, giving kG. The system then encrypts information using the public key kG, and only those who know k can decrypt it.
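The order of 12 quoted for Figure 2 is small enough to check by brute force (note that 13 mod 11 = 2, so the curve reduces to y² = x³ + 2 over F11):

```python
# Brute-force point count for y^2 = x^3 + 13 over F_11.
p = 11
points = [(x, y) for x in range(p) for y in range(p)
          if (y * y) % p == (x ** 3 + 13) % p]

order = len(points) + 1   # +1 for the point at infinity
print(order)              # 12
print((9, 4) in points)   # True: the generator from the text lies on the curve
```

Real curves use fields where p is hundreds of bits long, so the order is counted with specialized algorithms (e.g. Schoof’s) rather than enumeration; the brute force here only works because the example field is tiny.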
For example, let us take k = 5 and the point (9, 4) as the generator point G. Multiplying out 5G gives the point (9, 7), which becomes the public key. Given only the two points, however, it is extremely difficult to find the value of k.
ECC algorithms have long been considered practically unbreakable due to the elliptic curve discrete logarithm problem, or ‘ECDLP’. The ECDLP asks: given two points P and Q on an elliptic curve, what operation or algorithm could find the constant k such that k multiplied by P equals Q?
The difficulty lies in point multiplication. To double a point P, a tangent line is drawn to the elliptic curve at P; wherever that line intersects the curve again is a point Q′. Reflecting Q′ across the curve’s horizontal axis of symmetry (not necessarily y = 0) gives 2P. This process of doubling and adding continues until kP is reached. While it is straightforward to find Q given P and k, it is practically impossible to find k given P and Q, because no efficient inverse operation is known to solve for the coefficient in point multiplication.
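The doubling-and-reflecting procedure can be sketched on the toy curve from the earlier example (y² = x³ + 13 mod 11); this naive implementation reproduces 5G = (9, 7) for G = (9, 4). It is a minimal sketch, adding G one step at a time; real libraries use double-and-add for efficiency:

```python
# Minimal affine point arithmetic on y^2 = x^3 + 13 over F_11.
P_MOD = 11
O = None  # the point at infinity (group identity)

def add(P, Q):
    """Add two points on the curve (None represents infinity)."""
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                                   # P + (-P) = infinity
    if P == Q:                                     # tangent (doubling) slope
        s = (3 * x1 * x1) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                                          # chord slope
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    y3 = (s * (x1 - x3) - y1) % P_MOD              # reflect across the axis
    return (x3, y3)

def mul(k, P):
    """Compute kP by repeated addition (fine for a toy curve)."""
    R = O
    for _ in range(k):
        R = add(R, P)
    return R

G = (9, 4)
print(mul(5, G))   # (9, 7): the public key from the text's example
```

Running forward from k and G to kG takes a handful of cheap modular operations; the ECDLP is the claim that reversing the arrow, from (G, kG) back to k, has no comparably cheap route.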
Ultimately, RSA and ECC algorithms encrypt nearly all digital data and communication, keeping everything secure from classified government data to something as simple as a text message. Encryption allows private information to remain private and large national or international systems to keep functioning; it acts as a barrier against bad actors looking to hack or exploit that data. Without encryption, there would be no safeguard for any data. Imagine if everything you ever put on a device, whether private photos or bank information, suddenly became public: you could no longer trust digital privacy or safety if these algorithms were to fail.
To understand the momentous advancements in quantum computing, it is important to take a step back and examine the field’s origins and how quantum mechanics has evolved over time. Written in 1900 by Max Planck, the ‘quantum hypothesis’ proposed that rather than flowing continuously, as conventionally accepted, energy is emitted in discrete packets called quanta. His work laid the foundation for what has become the field of quantum mechanics. Einstein’s 1905 work on the photoelectric effect and Niels Bohr’s 1913 model of the atom further supported this claim by suggesting quantum leaps and the particle-like behavior of the photon.
In 1927, Heisenberg formulated his uncertainty principle, which states that it is impossible to simultaneously know both the position and the momentum of a particle with perfect accuracy. In 1935, Einstein, Podolsky, and Rosen jointly published a paper questioning quantum mechanics via entanglement, the apparent instantaneous influence of the state of one particle on the state of another across great distances. John Bell formalized the question with his inequalities in 1964, and later experiments demonstrating violations of those inequalities confirmed that entanglement is real. Recent work has even shown that entanglement can link particles between a satellite and the Earth.
In 1926, Schrödinger created a system of wave equations that accurately predicted the energy levels of electrons in atoms. Von Neumann, building on this alongside Hilbert’s work, created the mathematical framework of quantum mechanics, formalizing quantum states and providing a rigorous way to describe the behavior of quantum systems. In the 1940s, Feynman, Schwinger, and Tomonaga developed Quantum Electrodynamics (QED), which describes the interactions of light and matter.
The 1980 conference of physicists, mathematicians, and computer scientists marked the turning point from quantum theory to quantum application, laying the foundation for all of quantum computing. While the first working laser was created in the 1950s, quantum computation was not explored in earnest until Paul Benioff’s 1980 description of a quantum computer, the first step towards the field.
Quantum Computing: What is it and how does it work?
Superposition: The state of being in multiple states or places at once. Superposition is most commonly seen in overlapping waves, but at the quantum level it can be understood as a particle being in both state 0 and state 1 at the same time. When measured, however, the particle must settle into either state 0 or state 1. The best-known analogy is Schrödinger’s cat: if you put a cat inside a box with a substance that has an equal chance of killing or not killing it within an hour, then after one hour the cat can be said to be both dead and alive until you look, at which point it must be one or the other.
Entanglement: A phenomenon by which two particles become connected such that the fate of one affects the other, irrespective of the distance between them. Prior to measurement, the entangled pair is in a superposition, meaning each particle can be in both state 0 and state 1 at once. When one particle is measured, however, its state directly determines the state of the other. This behavior was formalized in Bell’s inequalities, whose experimentally observed violations confirmed it.
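As a toy illustration (a hand-rolled two-qubit state vector, not a real quantum simulator), preparing the Bell state (|00⟩ + |11⟩)/√2 and sampling it shows the perfect correlation described above:

```python
import math, random

# Two-qubit state vector, amplitudes indexed as [00, 01, 10, 11].
state = [1.0, 0.0, 0.0, 0.0]          # start in |00>

# Hadamard on the first qubit: |00> -> (|00> + |10>) / sqrt(2).
h = 1 / math.sqrt(2)
state = [h * (state[0] + state[2]), h * (state[1] + state[3]),
         h * (state[0] - state[2]), h * (state[1] - state[3])]

# CNOT (control = first qubit, target = second): swaps |10> and |11>.
state[2], state[3] = state[3], state[2]

def measure():
    """Sample a basis state with probability |amplitude|^2."""
    r, total = random.random(), 0.0
    for i, amp in enumerate(state):
        total += amp * amp
        if r < total:
            return i >> 1, i & 1       # (first bit, second bit)
    return 1, 1                        # guard against float rounding

# Only |00> and |11> carry weight, so the two bits always agree.
samples = [measure() for _ in range(1000)]
print(all(a == b for a, b in samples))   # True
```

Each individual bit is random (0 or 1 with equal probability), yet the pair is perfectly correlated, which is exactly the behavior that puzzled Einstein, Podolsky, and Rosen.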
Quantum computing allows denser information storage and more efficient processing, creating opportunities to dramatically increase the speed at which many modern machines work. While quantum computers face setbacks in these developing stages, they make it possible to perform many operations simultaneously rather than being limited to the sequential operations of most modern machines.
Quantum systems use qubits as the fundamental unit of information instead of the traditional bit. Qubits allow superpositions of ones and zeros, letting a quantum computer with relatively few qubits explore many computational paths at once; for certain problems, this advantage grows far beyond what the best classical computers on the market can match. In addition, entangling multiple qubits means that information capacity grows exponentially with the number of qubits rather than linearly.
Compare and Contrast: Quantum Computers vs. Traditional Computers
The Quantum Threat to Cryptography
While current computers may not be strong enough to carry out an attack on cryptography, the emerging field of quantum computing poses a risk to all of modern encryption.
Is the threat just theoretical?
Even as an emerging technology, quantum computing poses a very real threat to cryptography. While many would be willing to write it off as a threat of the future, that future may be closer than you believe. Quantum computing has demonstrated its strength through several algorithms that could potentially result in the compromise of sensitive data.
The most prominent algorithm with regard to cryptography is Shor’s factoring algorithm from 1994, a major threat to RSA cryptosystems in particular. As mentioned earlier, RSA relies on large numbers created as the product of two primes, basing its security on the inability to efficiently factor those numbers.
According to Thorsten Kleinjung of the University of Bonn, it would take around two years to factor N = 135066410865995223349603216278805969938881475605667027524485143851526510604859533833940287150571909441798207282164471551373680419703964191743046496589274256239341020864383202110372958725762358509643110564073501508187510676594629205563685529475213500852879416377328533906109750544334999811150056977236890927563 with under 2 GB of memory.
Shor’s Algorithm could exponentially speed this up by working as follows:
Start with the large number N and a guess g. If g is a factor of N, or shares a factor with N, then we have already found a factor.
If g is coprime to N, we use the property that for any two coprime numbers a and b there exists a power n and a multiple m such that a^n = mb + 1. Applying this here gives g^n = mN + 1, which we can rewrite as (g^(n/2) − 1)(g^(n/2) + 1) = mN. We can now change our objective from searching for values of g to searching for the value of n.
This is where quantum computing makes the vital difference. The quantum system starts in a superposition over many possible values of x and evaluates g^x (mod N) for all of them at once. We then take advantage of the fact that if g^x mod N = r, then g^(x+p) mod N = r, where p is the period of the function (g^p ≡ 1 mod N). Using superposition, we look for the values of x that produce the same remainder, as the spacing between those x values is the period.
From the period we can derive a frequency, f = 1/p.
Here we apply a Quantum Fourier Transform (the quantum analogue of the classical Fourier transform): once all the constructive and destructive interference in the superposition is accounted for, the surviving frequency is 1/p.
Now that we have a candidate for p, we compute g^(p/2) − 1 and g^(p/2) + 1, take their greatest common divisors with N to recover the factors, and iterate as necessary to correct for quantum error.
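The classical scaffolding around the quantum step can be sketched as follows. Here the period is found by brute force, which is exactly the part that is exponentially slow without a quantum computer; the function names are illustrative:

```python
from math import gcd

def find_period(g: int, n: int) -> int:
    """Smallest r with g^r = 1 (mod n). A quantum computer finds this
    efficiently; classically we can only brute-force it."""
    x, r = g % n, 1
    while x != 1:
        x = (x * g) % n
        r += 1
    return r

def shor_factor(n: int, g: int):
    """Try to split n using guess g, following the steps in the text.
    Returns a factor pair, or None if this g is unlucky (retry with
    another guess)."""
    d = gcd(g, n)
    if d > 1:
        return d, n // d            # lucky: g already shares a factor
    r = find_period(g, n)
    if r % 2:                       # odd period: pick a different g
        return None
    y = pow(g, r // 2, n)           # g^(r/2) mod n
    if y == n - 1:                  # g^(r/2) = -1 mod n: also unlucky
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))   # (3, 5)
```

For N = 15 and g = 7 the period is 4, so 7² ± 1 = 48 and 50 share the factors 3 and 5 with 15; Shor’s contribution is replacing the `find_period` loop with a polynomial-time quantum subroutine.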
Aside from algorithms, many corporations have made recent advancements towards building quantum computers. As recently as June 2025, Nord Quantique, a Canadian startup, announced a breakthrough ‘bosonic qubit’ with built-in error correction. This creates the potential for successful, encryption-breaking 1,000-qubit machines by 2031, far more efficient than the previously estimated one million qubits.
The ‘Harvest Now, Decrypt Later’ Tactic
Another major reason quantum computing threatens cryptography is the ‘harvest now, decrypt later’ (HNDL) tactic. As the predicted Q-Day (around 2035) nears, threat actors have begun to collect and store encrypted data with the goal of decrypting it in the future with sufficiently powerful quantum machines. The attackers cannot decrypt the data yet, but they can intercept communications and stockpile the ciphertext.
While it is easy to dismiss these attacks as something that could only be effective at the nation-state level, this assumption only feeds a false sense of security. For bad actors, corporate information could be leverage for economic chaos and large-scale disruption. In fact, experts believe these attacks increasingly target businesses, which hold both people’s data and the power to create mass economic instability.
Matthew Scholl, Chief of the Computer Security Division at NIST, described the threat:
“Imagine I send you a message that’s top secret, and I’ve encrypted it using this type of encryption, and that message is going to need to stay top secret for the next 20 years. We’re betting that an adversary a) hasn’t captured that message somehow as we sent it over the internet, b) hasn’t stored that message, and c) between today and 20 years from now will not have developed a quantum machine that could break it. This is what’s called the store-and-break threat.”
The most concerning aspect of these HNDL attacks is that it is nearly impossible to know when your data has been stolen, until it comes into effect with the rise of quantum computing. By then, the damage will be irreversible. While not all data will be of high value over a decade from now, attackers are threatening specific data that they believe will hold long-term value.
Over the past 10 years, incidents have arisen that resemble HNDL attacks:
In 2016, Canadian internet traffic bound for South Korea was rerouted through China.
In 2020, data from many large online platforms was rerouted through Russia.
A study by HP’s Wolf Security found that one-third of the cyber attacks conducted by nation-states between 2017 and 2020 were aimed at businesses.
Post-Quantum Cryptography (PQC)
However, companies and nations have already begun to look into ways of protecting data from quantum attacks. Post-quantum encryption algorithms aim to encrypt data in a way that is as difficult for quantum machines to break as it is for classical computers.
The Deputy Secretary of US Commerce, Don Graves said,
“The advancement of quantum computing plays an essential role in reaffirming America’s status as a global technological powerhouse and driving the future of our economic security. Commerce bureaus are doing their part to ensure U.S. competitiveness in quantum, including the National Institute of Standards and Technology, which is at the forefront of this whole-of-government effort. NIST is providing invaluable expertise to develop innovative solutions to our quantum challenges, including security measures like post-quantum cryptography that organizations can start to implement to secure our post-quantum future. As this decade-long endeavor continues, we look forward to continuing Commerce’s legacy of leadership in this vital space.”
One example of a potentially powerful PQC algorithm is CRYSTALS-Kyber, which NIST selected as its primary algorithm for general encryption in 2022. NIST added HQC to its list of PQC algorithms in March 2025, bringing the total to five algorithms that have met the standard.
NIST has published its PQC standards and urges organizations to begin incorporating them now, because the full shift to PQC may take as long as building the quantum computers themselves. Its key goals are not only to find algorithms resistant to quantum attack, but also to diversify the types of mathematics involved, mitigating the risk that a single break compromises everything. NIST also looks for algorithms that can be easily implemented and improved, maintaining ‘crypto-agility’.
Many companies support PQC and believe it will safeguard the future of cryptography. Cryptography pioneer Whitfield Diffie explains:
“One of the main reasons for delayed implementation is uncertainty about what exactly needs to be implemented. Now that NIST has announced the exact standards, organizations are motivated to move forward with confidence.”
Companies such as Google, Microsoft, IBM, and AWS are actively working to develop better resistance to quantum threats, helping to build some of the most powerful PQC algorithms. IBM is currently advocating for a Cryptography Bill of Materials (CBOM), a new standard for tracking cryptographic assets and introducing more oversight into the system. Microsoft is a founding member of the PQC Coalition, a group whose mission is to provide outreach and education supporting the shift to PQC as the primary form of encryption.
While PQC could be a valuable defense against quantum threats, there are still setbacks that make people question the validity of the whole effort. The Supersingular Isogeny Key Encapsulation (SIKE) algorithm, a candidate in NIST’s PQC process, was broken by an attack running on a classical computer, invalidating many of its fundamental mathematical assumptions. In addition, many of these algorithms suffer from a lack of extensive testing and from uncertainty about how much quantum machines will actually be able to accomplish.
Conclusion
While the timeline of PQC development may be uncertain, it is imperative that we act now. Quantum computing is no longer a threat looming in the future but a developing reality with significant impacts. We must begin shifting towards these safer systems as a community; we cannot wait until the threat has arrived.
Rob Joyce, Director of Cybersecurity at the National Security Agency, has stated:
“The transition to a secured quantum computing era is a long-term intensive community effort that will require extensive collaboration between government and industry. The key is to be on this journey today and not wait until the last minute.”
Above all, it is crucial to recognize the threat and take action. Educating the people is the first step towards group action. Let awareness be our first line of defense.