
  • A Deep Dive into Computer Aided Design

    By Grace Liu

    ~10 minutes


    Computer Aided Design, or CAD, is essentially a platform for users to design, modify, and analyze a digital model. Its speed and efficiency rival traditional design methods, and the capabilities of CAD are continuously growing as technology advances. It is a space for unlimited creativity and endless possibility, and it is crucial to have an in-depth understanding of CAD to be able to fully harness its potential.

    How does it work?

    At the center of a CAD software program is its graphics kernel, the processing core that generates and manipulates geometry. It works together with the graphical user interface (GUI), which has extensive uses on electronic devices beyond CAD. The GUI takes input from the user and passes the data to the graphics kernel, which then generates the geometry and displays it on screen.

    Types of CAD

    There are two main categories of CAD: 2D and 3D. 2D design is closer to digital art with a different set of tools, often seen in digital drawing and sketching. The key difference is the use of measurements and parameters: a parameter sets a variable to a certain value in a design so it can be referenced later in other constraints. Parameters are extremely useful for creating an adjustable, flexible design. 2D Computer Aided Design is commonly used for landscaping, floorplans, and blueprints. 3D modeling, on the other hand, offers more complex and realistic designs, and will be the focus of this article. It comes in many different forms, including direct modeling, surface modeling, 3D wireframe, and freeform CAD.

    Direct modeling is a type of CAD that doesn’t use parameters, relying purely on pushing and pulling the surfaces of unconstrained objects. It allows more freedom than parametric modeling, but makes adjusting a design much more difficult. For example, say you need to make an object twice as large as it currently is. In parametric modeling you would simply enlarge the base parameters for the lengths you set, and every constraint using those parameters would adjust automatically. In direct modeling you would need to manually scale each surface to size up the object.
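The contrast can be sketched in a few lines of Python (a toy stand-in for a real parametric engine; the parameter and dimension names are made up for illustration):

```python
# Toy parametric model: every dimension is derived from a named base
# parameter, so changing that one value propagates to all constraints.
params = {"base_length": 10.0}

def box_dims(p):
    # Constraints reference the parameter instead of hard-coded numbers.
    return {
        "length": p["base_length"],
        "width":  p["base_length"] / 2,
        "height": p["base_length"] / 4,
    }

print(box_dims(params))     # {'length': 10.0, 'width': 5.0, 'height': 2.5}

params["base_length"] *= 2  # "make the object twice as large"
print(box_dims(params))     # {'length': 20.0, 'width': 10.0, 'height': 5.0}
```

In direct modeling there is no `base_length` to change; each face would be scaled by hand.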

    Another form of CAD is surface modeling, which focuses on manipulating intricate external surfaces, more like a shell instead of a full 3D object. It uses curves and lines defined by mathematical formulas, calculated by the computer using input from the graphical workspace in the CAD program. Surface modeling helps display texture, material, and overall aesthetics for the design.

    A step down from surface modeling is 3D wireframe, which removes the surfaces entirely and models 3D structures using only lines and curves. Without any actual surfaces or bodies, the design looks like the skeleton of the object or a wire framework, hence the name. It acts as the first 3D visualization of a concept or design, providing a foundation that can be built into a full model later on. These designs are often the first pitch to an outside party for feedback on the base sketch, an efficient and effective way to communicate a design idea without having to fully create it.

    A unique but often overlooked type of CAD is freeform CAD. It acts more like clay, letting the user be more artistic and creative with their design. It utilizes digital brushes or styluses to sculpt the object, with a different set of tools and abilities in the workspace compared to the more common forms of CAD. Freeform CAD often involves the use of haptic devices instead of a mouse and keyboard. These devices will transmit the digital output from the computer to a physical attachment on the device through touch sensation that allows the user to “feel” their design as they sculpt. The physical attachment typically mimics brushes or scrapers, and can sometimes even be equipped with vibration.

    Different CAD platforms:

    The foundation of every CAD platform is similar, but each one has its own unique features. An overview of the platforms can help a user determine which one best suits their needs. Five of the most common are Autodesk Fusion 360, Onshape, Blender, TinkerCAD, and SolidWorks.

    September 4, 2019 Product Update – What’s New / Keqing Song / Autodesk Fusion ©

    Autodesk offers a multitude of CAD programs, but its most popular and versatile one is Fusion 360. It’s industry-level CAD software that combines different tools and abilities in one place, allowing for nearly unlimited creation. Fusion 360 contains a variety of workspaces, including Design and Generative Design, Rendering, Simulation, Animation, Electronics, and 2D Drawing. Within the Design workspace alone, Fusion 360 has hundreds of techniques to choose from, such as freeform, surface, parametric, and direct modeling, along with sheet metal, mesh, and plastic tools. The platform allows for smooth collaboration by storing all files directly in the cloud with easy updates across designs, reducing the time it takes to combine multiple designs. Fusion 360 is flexible and perfect for rapid prototyping, with an extensive toolkit full of shortcuts to make designing and modifying faster. Autodesk also offers a free education license for students and educators, making it accessible to a larger audience.

    Onshape, The CAD Of The Future / Nuts and Bolts / Substack ©

    Onshape is another one of the leading CAD platforms in the industry today, a top competitor to Fusion 360. Onshape includes diverse customization tools like FeatureScript, a programming language specific to Onshape that lets users create custom CAD features or shortcuts for their designs. For example, you can code a custom feature that creates a mold on a separate body for any design, saving the time it takes to manually create a mold each time. FeatureScript lays the groundwork for Onshape’s modeling: standard functions like Extrude, Fillet, and Helix are themselves written as FeatureScript functions, ready to study when you branch out and create your own. Onshape has a built-in Product Data Management (PDM) system that lets teams edit the same design simultaneously, a feature not many CAD platforms can achieve. Alongside increasing efficiency, this also makes parts and assemblies easier to store by eliminating separate files. You can log into your account anywhere and have full access to all your designs in Onshape. Another unique tidbit: Onshape does not require manual application updates; all updates run automatically in the background, so you don’t have to worry about running the correct version of Onshape when fixing bugs in your design.

    Beginner’s Guide to 3D Character Creation Using Blender / DEZPAD ©

    Blender is a slightly different sort of CAD platform; it focuses on perfecting the aesthetics of 3D modeling. It’s best for rendering and shading, animation, simulation, visual effects, and game development. Blender ships with two main rendering engines: Eevee and Cycles. Eevee is a real-time engine, best for quick renders and fast iteration. In short, a real-time rendering engine computes the lighting, materials, and other components of the image continuously at about 30-120 frames per second, providing an interactive output that lets the user adjust settings on the fly. Cycles is a path-tracing engine with high-quality, realistic renders, but it takes much longer. A path-tracing engine simulates the physical behavior of light rays on the object, frame by frame, to create a realistic image. Cycles would typically be used for the final render, pristine and life-like, whereas Eevee would be used between iterations to help make improvements. Blender’s extensive simulation workspace can mimic complex natural phenomena like fluids, smoke, and fire. Another benefit of Blender is that it’s completely free, perfect for hobbyists and students.

    TinkerCAD Basics: A Hands-On Workshop for Beginners! / San Carlos Life ©

    TinkerCAD is a much simpler CAD platform, which makes it best for beginners with its clear, straightforward layout. It consists of a few tabs with a set collection of 3D shapes along with other tools. It includes basic electronics simulation and serves as a good introduction to circuits and to coding a real mechanism rather than just a computer program. TinkerCAD is very popular in schools, with built-in lessons and hands-on projects on top of its easy format. Since it was designed to teach beginners, TinkerCAD has limited capabilities: it lacks complex curves, restricts freedom in building custom shapes, and produces lower-resolution models. TinkerCAD also has no advanced rendering, simulation, or animation, so it might not be the best option for realistic modeling. These intentional restrictions keep TinkerCAD kid-friendly and focused on teaching the basics of 3D modeling before the transition to something more advanced. It’s also compatible with online models in standard file formats, so you can learn from designs on the internet and transfer them into TinkerCAD. It’s good for simple 3D printing and laser cutting, allowing a full introduction to the basics of engineering for beginners.

    Solidworks 2025 / DEVELOP3D ©

    SolidWorks is an industry-grade CAD platform, similar in many respects to Fusion 360. A key difference is that SolidWorks targets large engineering companies like Tesla and Lockheed Martin, while Fusion 360 focuses on hobbyists, students, and startups that want a simple but effective platform for 3D designs not quite as complex as a plane. SolidWorks makes drawing complicated 2D blueprints with details and labels much easier: it can pull views, measurements, and calculations from the 3D design and transfer them to the 2D drawing. SolidWorks also has a powerful simulation workspace for the motion, stress, heat, and real-life scenarios that designs like cars, planes, and bridges need to withstand. Comparatively, SolidWorks is one of the more sophisticated 3D design platforms and takes time to get familiar with, but it offers lessons, tutorials, and even courses to help shorten the learning curve.

    Real-world applications of CAD

    Building Information Modeling (BIM) Explained / KENNMAR ©

    The most common place you see Computer Aided Design is in engineering, where it has become integrated throughout the design process, from designing and prototyping to manufacturing the product. It’s also present in architecture, so much so that there’s a type of CAD created specifically for 3D models of buildings. Building Information Modeling (BIM) creates a 3D model of all the components in a real-world building and also replicates the entire timeline of the building, from construction to long-term maintenance. It’s a digital version of the building’s whole life cycle, helping to check safety and functionality beforehand. CAD also pops up in unexpected places, like interior and exterior design, fashion, and game design. Interior and exterior design involves much the same process as industrial design, though with fewer moving parts in the assembly. Fashion mostly uses 2D CAD to make drawing and sketching faster and more efficient. Game design, as mentioned when discussing Blender, mostly uses the design, animation, and rendering workspaces of 3D CAD to make characters and objects look as realistic as possible. 3D modeling also appears in medicine, specifically in imaging and x-rays: hospital machines are being equipped to reconstruct 3D models of bones and structures within the human body, helping doctors better treat patients.


  • The Quantum Encryption Crisis

    By Aashritha Shankar

    ~18 minutes


    “Some experts in the field predict that the first quantum computer capable of breaking current encryption methods could be developed within the next decade. Encryption is used to prevent unauthorized access to sensitive data, from government communications to online transactions, and if encryption can be defeated, the privacy and security of individuals, organizations, and entire nations would be under threat.” – The HIPAA Journal

    Introduction

    The cybersecurity landscape is facing a drastic shift as the increasing power of quantum computers threatens modern encryption. Experts predict a quantum D-day (Q-day) in the next 5-10 years, when quantum computers will be sufficiently powerful to break through even the strongest of cybersecurity mechanisms. Meanwhile, only a few companies have begun to prepare against the threat by developing quantum-resistant cybersecurity methods. To fully combat the threat, we need to act now.

    Encryption Today

    Modern cryptography is dominated by two major algorithms that transform ordinary text into ciphertext:

    1. Rivest-Shamir-Adleman (RSA)

    Dating back to 1977, the RSA algorithm relies on the difficulty of factoring large numbers. RSA can be separated into two parts, a public and a private key. The public key, used for encoding, is a pair of numbers (n, e), where n is the product of two large prime numbers (p·q = n). The value of e can be any number that is coprime to (p-1)(q-1), meaning that the GCD of e and (p-1)(q-1) is 1. The private key d, used for decoding, is the modular inverse of e: the number satisfying d·e ≡ 1 (mod (p-1)(q-1)).

    For decades, RSA has provided security for digital data because the sheer size of n, combined with the variability of e, makes it practically impossible to recover (p, q) from (n, e). However, quantum computing brings the ability to quickly factor large numbers, allowing (p, q) to be determined from the public key alone.
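This key setup can be sketched in a few lines of Python, using toy primes far too small for real security:

```python
from math import gcd

# Toy primes -- real RSA uses primes hundreds of digits long.
p, q = 61, 53
n = p * q                    # n = 3233; (n, e) is the public key
phi = (p - 1) * (q - 1)      # 3120

e = 17                       # any e coprime to (p-1)(q-1)
assert gcd(e, phi) == 1

# Private key d: the modular inverse of e, so d*e = 1 (mod phi)
d = pow(e, -1, phi)
assert (d * e) % phi == 1

msg = 42
cipher = pow(msg, e, n)      # encode with the public key (n, e)
plain = pow(cipher, d, n)    # decode with the private key d
assert plain == msg
```

(`pow(e, -1, phi)` needs Python 3.8+; real RSA also uses padding schemes omitted here.)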

    2. Elliptic Curve Cryptography (ECC): 

    Since 1985, ECC algorithms have been favored over RSA due to their greater complexity and faster encryption, with ECC proving to be up to ten times faster. ECC algorithms use an elliptic curve of the form y^2 = x^3 + ax + b over a finite field Fp rather than the real numbers. The field Fp consists of the integers from 0 to p-1, where p is prime.

    Figure 1: The elliptic curve y^2 = x^3 + 13
    Figure 2: The elliptic curve y^2 = x^3 + 13 over F11

    For the purpose of illustration, let us take the elliptic curve y^2 = x^3 + 13 and the field F11. Figure 1 shows the curve itself, while figure 2 shows the solutions to y^2 = x^3 + 13 (mod 11). The order of the curve is the number of points, including the arbitrary one at infinity, that satisfy the equation over a specific field (12 points in figure 2). The private key is some value k between 1 and the order of the curve. The public key is calculated by taking one of the points, called the generator point G, and multiplying it by k to get kG. The system then encrypts information using the public key kG, and only those who know k can decrypt it.

    For example, let us take a value of k = 5 and the point (9, 4) as the generator point G. Multiplying, 5G gives the point (9, 7), which becomes the public key. However, given just these two points, it is extremely difficult to recover the value of k.

    ECC algorithms have long been considered nearly unbreakable due to the elliptic curve discrete logarithm problem, or ECDLP. The ECDLP is a mathematical problem that asks: given two points P and Q on an elliptic curve, what operation or algorithm could find the specific constant k such that kP = Q?

    The key difficulty lies in point multiplication. To double a point P, a tangent line is drawn to the curve at P; wherever that line intersects the curve again is a point Q'. Reflecting Q' across the x-axis gives 2P. Repeating this process, combined with point addition, eventually reaches kP. While it is straightforward to find Q given P and k, it is nearly impossible to find k given P and Q, because there is currently no known inverse operation that solves for the coefficient in point multiplication.
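The worked example above can be checked with a short Python sketch of this point arithmetic (naive repeated addition over F11, purely illustrative; real ECC uses enormous fields and the efficient double-and-add method):

```python
# Elliptic-curve arithmetic over F_p for y^2 = x^3 + 13 (mod 11).
# None represents the point at infinity, the group identity.
P_MOD = 11
A, B = 0, 13

def ec_add(P1, P2):
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                       # P + (-P) = infinity
    if P1 == P2:
        # Tangent-line slope for point doubling
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (s * s - x1 - x2) % P_MOD
    y3 = (s * (x1 - x3) - y1) % P_MOD     # reflect across the x-axis
    return (x3, y3)

def ec_mul(k, P):
    # Repeated addition; real implementations use double-and-add.
    R = None
    for _ in range(k):
        R = ec_add(R, P)
    return R

G = (9, 4)                # generator point from the example
print(ec_mul(5, G))       # -> (9, 7), the public key for k = 5
```

Recovering k = 5 from (9, 4) and (9, 7) alone is the ECDLP; on this tiny curve it is trivial to brute-force, but on real curves the search space is astronomically large.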

    Ultimately, RSA and ECC algorithms are what encrypt all our digital data and communication. They keep everything secure, from classified government data to something as simple as a text message. Encryption allows private information to remain private and large national or international systems to continue functioning. It acts as a barrier against bad actors looking to hack or exploit this private data. Without encryption, there would be no safeguard for any data. Imagine if everything you ever put on a device, whether private photos or bank information, suddenly became public: you could no longer trust digital privacy and safety if these algorithms were to fail.

    A Brief History of Quantum

    Timeline of quantum / Quantum computing review / Fiveable ©

    To understand the momentous advancements in quantum computing, it is important to take a step back and examine the field’s origins, as well as how quantum mechanics has evolved over time. Written in 1900 by Max Planck, the ‘Quantum Hypothesis’ explored the idea that, rather than flowing continuously as conventionally accepted, energy is actually emitted in discrete packets called quanta. His work laid the foundation for an exploration into what has become the field of quantum mechanics. Both Einstein’s 1905 work on the photoelectric effect and Niels Bohr’s 1913 model of the atom further supported this claim by suggesting quantum leaps and the particle-like behavior of the photon.

    In 1927, Heisenberg formulated his uncertainty principle, which states that it is impossible to simultaneously know both the position and the momentum of a particle with perfect accuracy. In 1935, Einstein, Podolsky, and Rosen published their famous paper questioning quantum mechanics via entanglement, the apparent instantaneous influence of the state of one particle on the state of another over great distances. Recent work has shown that entanglement can connect particles even between a satellite and the Earth. John Bell formulated his inequalities in 1964, and later experiments searching for violations of them confirmed that entanglement is real.

    In 1926, Schrodinger created a system of wave equations that accurately predicted the energy levels of electrons in atoms. Von Neumann built on this, alongside Hilbert’s work, to create the mathematical framework for quantum mechanics, formalizing quantum states and providing a method to understand the behavior of quantum systems. In the 1940s, Feynman, Schwinger, and Tomonaga developed quantum electrodynamics (QED), which describes the interactions of light and matter.

    The 1980 conference of physicists, mathematicians, and computer scientists was the turning point from quantum theory into quantum applications, laying the foundation for all of quantum computing. While the first working laser was created in the 1950s, quantum mechanics was not explored much further until Paul Benioff’s 1980 description of a quantum computer, the first step towards quantum computing.

    Quantum Computing: What is it and how does it work?

    Figure 3: Entanglement of 2 particles / Quantum explained / NIST ©
    Figure 4: Superposition with and without measurement / Quantum explained / NIST ©

    Quantum computing is based on two key principles:

    1. Superposition: The state of being in multiple states or places at once. Superposition is most commonly seen with overlapping waves, but at the quantum level it can be understood as a particle being in both state 1 and state 0 at the same time. When measured, however, the particle must settle into either state 1 or state 0. The best-known analogy is Schrodinger’s cat: if you put a cat inside a box with a substance that has an equal chance of killing or not killing it within an hour, then after one hour the cat can be said to be both dead and alive until you open the box to check, at which point it must be either dead or alive.
    2. Entanglement: A phenomenon by which two particles become connected such that the fate of one affects the other, irrespective of the distance between them. Prior to any measurement, the entangled particles remain in superposition, each in both state 0 and state 1 at the same time. When one particle is measured, however, its state directly determines the state of the other. This principle was confirmed through experiments based on John Bell’s inequalities.

    Quantum computing allows more information to be stored and processed far more efficiently, creating opportunities to dramatically increase the rate at which many modern machines work. While quantum computers face setbacks at this developing stage, they make it possible to perform many operations simultaneously, rather than the largely sequential operations of classical machines, whose ever-shrinking transistors are already running up against physical limits such as quantum tunneling.

    Quantum systems use qubits as the fundamental unit of information instead of the traditional bit. Qubits allow for the superposition of ones and zeros, making it possible for quantum computers with very few qubits to explore enormous numbers of states at once and, for certain problems, to far outpace the best classical computers on the market today. In addition, the entanglement of multiple qubits means that information capacity grows exponentially with qubit count rather than linearly.
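Both ideas can be illustrated with a tiny NumPy sketch (not a real quantum simulator; it just samples from the Bell state’s probability distribution):

```python
import numpy as np

# A classical description of n qubits needs 2**n complex amplitudes --
# the state space grows exponentially with qubit count.
n = 10
state_size = 2 ** n        # 1024 amplitudes for just 10 qubits

# Bell state (|00> + |11>)/sqrt(2): two maximally entangled qubits.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell) ** 2  # Born rule: probability of each outcome

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)

# Measuring one qubit fixes the other: only "00" and "11" ever occur.
assert set(outcomes) <= {"00", "11"}
```

Each measurement individually looks random, yet the two qubits always agree, which is the correlation Bell’s inequalities test for.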

    Compare and Contrast: Quantum Computers vs. Traditional Computers

    The Quantum Threat to Cryptography

    While current computers may not be strong enough to carry out an attack on cryptography, the emerging field of quantum computing poses a risk to all of modern encryption.

    Is the threat just theoretical?

    Even as an emerging technology, quantum computing poses a very real threat to cryptography. While many people would be more than willing to write it off as a threat of the future, that future may be closer than you believe. Quantum computing has shown its strength through many algorithms that could potentially result in the compromise of sensitive data.

    The most prominent algorithm in regard to cryptography is Shor’s factoring algorithm from 1994. Specifically, Shor’s Factoring Algorithm (SFA) is a major threat to RSA cryptography systems. As I mentioned earlier, RSA systems rely on the creation of large numbers as the product of two primes, basing security on the inability to efficiently factor those numbers.

    According to Thorsten Kleinjung of the University of Bonn, it would take around two years to factor N = 135066410865995223349603216278805969938881475605667027524485143851526510604859533833940287150571909441798207282164471551373680419703964191743046496589274256239341020864383202110372958725762358509643110564073501508187510676594629205563685529475213500852879416377328533906109750544334999811150056977236890927563 with under 2 GB of memory.

    Shor’s Algorithm could exponentially speed this up by working as follows:

    1. Start with the large number N and a guess g. If g is a factor of N or shares a factor with N, then we have already found the factors.
    2. If g shares no factor with N, we use the property that for any two coprime numbers (a, b) there exists some power n and some multiple m such that a^n = mb + 1. Applying this here we get g^n = mN + 1, which can be rewritten as (g^(n/2) - 1)(g^(n/2) + 1) = mN. We can now change our objective from searching for values of g to searching for values of n.
    3. This is where quantum computing makes the vital difference. To test many possible values of n at once, the quantum system starts in a superposition of states. We then take advantage of the fact that if g^x mod N = r, then g^(x+p) mod N = r, where p is the period of the function (g^p ≡ 1 mod N). Using superposition, we look for values of x that produce the same remainder, since the distance between those x values is the period.
    4. From the period we can derive the frequency f = 1/p.
    5. Here we apply a quantum Fourier transform (similar to a classical Fourier transform): when all the constructive and destructive interference of the superposition is absorbed, 1/p is the remaining frequency.
    6. Now that we have a candidate for p, we plug it back into step 2, taking our best guess at the factors from g^(p/2) ± 1, and iterate as necessary to correct quantum error.
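The classical skeleton of these steps can be sketched in Python, with brute-force period finding standing in for the quantum subroutine (which is exactly the part Shor’s algorithm accelerates):

```python
from math import gcd
from random import randrange

def shor_classical(N):
    """Factor N via Shor's reduction, using brute-force period finding
    in place of the quantum Fourier transform step."""
    while True:
        g = randrange(2, N)
        if gcd(g, N) > 1:           # lucky guess: g already shares a factor
            return gcd(g, N)
        # Find the period p of g^x mod N (the quantum speedup step).
        p, r = 1, g % N
        while r != 1:
            r = (r * g) % N
            p += 1
        if p % 2:                   # odd period: pick a new g
            continue
        half = pow(g, p // 2, N)    # g^(p/2) mod N
        if half == N - 1:           # trivial root: pick a new g
            continue
        return gcd(half - 1, N)     # (g^(p/2) - 1) shares a factor with N

f = shor_classical(15)
print(f, 15 // f)   # the two factors of 15, in some order
```

The brute-force period loop takes exponential time classically; the quantum Fourier transform finds p in polynomial time, which is the whole threat to RSA.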

    Aside from algorithms, many corporations have made recent advancements towards building quantum computers as well. As recently as June 2025, Nord Quantique, a Canadian startup, announced a breakthrough ‘bosonic qubit’ with built-in error correction. This creates the potential for successful, encryption-breaking 1000-qubit machines by 2031, far more efficient than the previously estimated 1 million qubits.

    The ‘Harvest Now, Decrypt Later’ Tactic

    Another major reason quantum computing is a threat to cryptography is the ‘harvest now, decrypt later’ (HNDL) tactic. As the predicted Q-day nears (2035), threat actors have begun to collect and store encrypted data, with the goal of decrypting it in the future with sufficiently powerful quantum machines. The attackers may not be able to decrypt the data yet, but they can intercept communications and steal it in encrypted form.

    While it is easy to dismiss these attacks as something that could only be effective at the nation-state level, this assumption only feeds a false sense of security. For bad actors, corporate information could enable economic chaos and large-scale disruption. In fact, experts believe that these attacks have become increasingly focused on businesses, as they hold the people’s data and the power to create mass economic instability.

    Matthew Scholl, Chief of the Computer Security Division at NIST, described the threat by saying,

    “Imagine I send you a message that’s top secret, and I’ve encrypted it using this type of encryption, and that message is going to need to stay top secret for the next 20 years. We’re betting that an adversary a) hasn’t captured that message somehow as we sent it over the internet, b) hasn’t stored that message, and c) between today and 20 years from now will not have developed a quantum machine that could break it. This is what’s called the store-and-break threat.”

    The most concerning aspect of these HNDL attacks is that it is nearly impossible to know when your data has been stolen until the rise of quantum computing makes decryption possible. By then, the damage will be irreversible. While not all data will be of high value a decade or more from now, attackers are targeting specific data that they believe will hold long-term value.

    Over the past 10 years, incidents have arisen that resemble HNDL attacks:

    • In 2016, Canadian internet traffic to South Korea was being rerouted through China
    • In 2020, data from many large online platforms was rerouted through Russia
    • A study by HP’s Wolf Security discovered that one third of the cyber attacks conducted by nation-states between 2017 and 2020 were aimed at businesses 

    Post-Quantum Cryptography (PQC)

    Companies and nations have already begun to look into ways to protect data from quantum attacks. Post-quantum encryption algorithms focus on encrypting data in ways that are as difficult for quantum machines to break as they are for classical computers.

    The Deputy Secretary of US Commerce, Don Graves said,

    “The advancement of quantum computing plays an essential role in reaffirming America’s status as a global technological powerhouse and driving the future of our economic security. Commerce bureaus are doing their part to ensure U.S. competitiveness in quantum, including the National Institute of Standards and Technology, which is at the forefront of this whole-of-government effort. NIST is providing invaluable expertise to develop innovative solutions to our quantum challenges, including security measures like post-quantum cryptography that organizations can start to implement to secure our post-quantum future. As this decade-long endeavor continues, we look forward to continuing Commerce’s legacy of leadership in this vital space.”

    One example of a potentially powerful PQC algorithm is CRYSTALS-Kyber, which NIST declared the best for general encryption in 2022. NIST added HQC to its list of standardized PQC algorithms in March 2025, giving us a grand total of five algorithms that have met the standard.

    NIST has published its standards for PQC and urges organizations to work towards incorporating them now, because the full shift to PQC may take as long as developing those quantum computers will. Its key goals in this endeavor are not only to find algorithms that are resistant to quantum computing, but to diversify the types of mathematics involved to mitigate the risk of compromised data. NIST looks for algorithms that can be easily implemented and improved, so that they maintain ‘crypto-agility’.

    Many companies support PQC and believe that it will safeguard the future of cryptography. Whitfield Diffie, cryptography expert, explains:

    “One of the main reasons for delayed implementation is uncertainty about what exactly needs to be implemented. Now that NIST has announced the exact standards, organizations are motivated to move forward with confidence.”

    Companies such as Google, Microsoft, IBM, and AWS are actively working to develop better resistance to quantum threats, helping to build some of the most powerful PQC algorithms. IBM is currently advocating for a Cryptography Bill of Materials (CBOM), a new standard to keep tabs on cryptographic assets and introduce more oversight into the system. Microsoft is one of the founding members of the PQC Coalition, a group whose mission is to provide outreach and education to support the shift towards PQC as the primary form of encryption.

    While PQC could be a valuable resource against quantum threats, there are still setbacks that make people question the validity of the whole effort. The Supersingular Isogeny Key Encapsulation (SIKE) algorithm, one of the NIST finalists for the PQC standard, failed after a successful attack by a classical computer rendered many of its fundamental mathematical assumptions false. In addition, many of these algorithms suffer from a lack of extensive testing and from uncertainty about how much quantum machines will actually be able to accomplish.

    Conclusion

    While the timeline of PQC development might be uncertain, it is imperative that we act now. Quantum computing is no longer a threat looming in the future, but a present reality with significant impacts. We must begin shifting towards these safer systems as a community; we cannot wait until the threat has arrived to prepare.

    Rob Joyce, the National Security Agency’s Director of Cybersecurity, has stated:

    “The transition to a secured quantum computing era is a long-term intensive community effort that will require extensive collaboration between government and industry. The key is to be on this journey today and not wait until the last minute.”

    Above all, it is crucial to recognize the threat and take action. Educating the people is the first step towards group action. Let awareness be our first line of defense.


    References

    Bakhtiari, M., & Mohd Aizaini Maarof. (2012). Serious Security Weakness in RSA Cryptosystem. International Journal of Computer Science Issues, 9(1). https://www.researchgate.net/publication/267941681_Serious_Security_Weakness_in_RSA_Cryptosystem
    Caltech. (2023a). What Is Entanglement and Why Is It Important? Caltech Science Exchange. https://scienceexchange.caltech.edu/topics/quantum-science-explained/entanglement
    Caltech. (2023b). What Is Superposition and Why Is It Important? Caltech Science Exchange. https://scienceexchange.caltech.edu/topics/quantum-science-explained/quantum-superposition
    Elliptic Curve Cryptography (ECC). (2020). Cryptobook.nakov.com. https://cryptobook.nakov.com/asymmetric-key-ciphers/elliptic-curve-cryptography-ecc
    Elliptic curve point multiplication algorithms | Elliptic Curves Class Notes | Fiveable. (2024). Fiveable. https://library.fiveable.me/elliptic-curves/unit-8/elliptic-curve-point-multiplication-algorithms/study-guide/LCuiAxqzJAcbiQYY
    Quantumpedia. (2023, April 2). A Brief History of Quantum Computing. Medium. https://quantumpedia.uk/a-brief-history-of-quantum-computing-e0bbd05893d0
    Keyfactor. (2024). Harvest Now, Decrypt Later: A New Form of Attack. https://www.keyfactor.com/blog/harvest-now-decrypt-later-a-new-form-of-attack/
    Historical context and motivation for quantum computing | Quantum Computing Class Notes | Fiveable. (2016). Fiveable. https://library.fiveable.me/quantum-computing/unit-1/historical-context-motivation-quantum-computing/study-guide/YmiqGYK4ig1skLMI
    Hughes, O. (2025, June 25). “A first in applied physics”: Breakthrough quantum computer could consume 2,000 times less power than a supercomputer and solve problems 200 times faster. Live Science. https://www.livescience.com/technology/computing/a-first-in-applied-physics-breakthrough-quantum-computer-could-consume-2-000-times-less-power-than-a-supercomputer-and-solve-problems-200-times-faster
    Iberdrola. (2021, April 22). All About Quantum Computing. http://iberdrola.com/innovation/what-is-quantum-computing
    Nation States, Cyberconflict and the Web of Profit. (2021, April 8). HP Threat Research. https://threatresearch.ext.hp.com/web-of-profit-nation-state-report/
    NIST. (2022). NIST Announces First Four Quantum-Resistant Cryptographic Algorithms. NIST, 1(1). https://www.nist.gov/news-events/news/2022/07/nist-announces-first-four-quantum-resistant-cryptographic-algorithms
    NIST. (2024, August 13). What Is Post-Quantum Cryptography? https://www.nist.gov/cybersecurity/what-post-quantum-cryptography
    NIST. (2025, March 18). Quantum Computing Explained | NIST. NIST. https://www.nist.gov/quantum-information-science/quantum-computing-explained
    Osborne, M., Moskvitch, K., & Janechek, J. (2024, August 13). NIST’s post-quantum cryptography standards are here. IBM Research; IBM. https://research.ibm.com/blog/nist-pqc-standards
    Patil, K. (2024, October 29). What You Need to Know About “Harvest-Now, Decrypt-Later” Attacks. AppViewX. https://www.appviewx.com/blogs/what-you-need-to-know-about-harvest-now-decrypt-later-attacks/
    Post-Quantum Cryptography Coalition. (2025). https://pqcc.org/
    Classiq. (n.d.). Quantum Algorithms: Shor's Algorithm. https://www.classiq.io/insights/quantum-algorithms-shors-algorithm
    Quantum Computing vs Classical Computing: Key Differences | SpinQ. (2025). Spinquanta.com. https://www.spinquanta.com/news-detail/quantum-computing-vs-classical-computing-full-breakdown
    Quantum Insider. (2020, May 26). The History of Quantum Computing You Need to Know [2022]. The Quantum Insider. https://thequantuminsider.com/2020/05/26/history-of-quantum-computing/
    SIKE. (n.d.). Supersingular Isogeny Key Encapsulation. https://sike.org/
    Tanguyvans. (2024, December 22). RSA and ECC encryption – Tanguyvans – Medium. Medium. https://medium.com/@tanguyvans/rsa-and-ecc-encryption-b182e67a872f
    The Nobel Prize. (2019). The Nobel Prize in Physics 1933. NobelPrize.org; The Nobel Prize. https://www.nobelprize.org/prizes/physics/1933/schrodinger/facts/