Have you ever wondered what life would be like if it were possible to revive extinct animals? To see a woolly mammoth, or a dodo bird? Thanks to modern genetic technology, these doors are beginning to open.
The dire wolf is a species of canine that went extinct about 13,000 years ago, differing from the modern gray wolf in its larger body, more massive skull, and smaller brain. In 2021, a company called Colossal Biosciences began extracting dire wolf DNA from ancient fossils. Using this DNA to identify specific dire wolf genes, the scientists made 20 edits to the genome of the gray wolf, the dire wolf’s closest living relative, until they produced an animal with the same key features as a dire wolf. After creating embryos from these edited cells, they implanted them into surrogate canine mothers.
Soon after this, three healthy baby wolves were born, carrying the key traits of dire wolves. These three wolves are now known as the first successful use of de-extinction, sparking much debate over whether this practice should be continued.
The Pros of De-extinction:
De-extinction is a powerful tool for animal conservation and ecosystem restoration. Bringing back extinct keystone species could restore degraded habitats that have withered without them, opening doors to revive grasslands and other ecosystems. Beyond ecosystem restoration, keystone species could influence the climate and weather of their habitat by affecting carbon storage and moisture regulation.
This technology could also target endangered species, allowing scientists to save and protect animals at risk. By altering extinct genes to restore genetic diversity in a threatened species, scientists could avoid the extinction of important keystone species, keeping the ecosystem’s equilibrium steady.
Along with these two pros, de-extinction has led to significant scientific breakthroughs, specifically in biology and genetics. If it continues to be explored, de-extinction could lead to other discoveries and raise awareness of the importance of protecting species and biodiversity.
The Cons of De-extinction:
Yet, this useful new technology also harbors many risks. Dr. Meachen, a vertebrate paleontologist and morphologist, stated that she is wary of this new process, saying,
“I have questions. We have trouble with the wolves we have today.”
The de-extinction process is costly and requires funds that the private sector may not be able to provide, meaning governments may have to assume funding. In that case, the money would likely come out of existing conservation budgets, draining support from present-day conservation efforts. Endangered species facing immediate threats would then be at greater risk, resulting in biodiversity loss.
Placing extinct animals back into their environments might also have drawbacks, as most extinct animals’ ecosystems have changed since they became extinct, and there is no guarantee that they will be able to adapt back. This could lead to potentially invasive species, as their habitats may lack natural predators to keep the revived population in check. Reintroducing a species might also create conflict within the ecosystem, impacting the stability and equilibrium.
Finally, many ethical questions come with de-extinction. By providing a way to return lost species to the planet, it may unintentionally condone extinction and excuse harm done to species. Many critics also believe it is not our place to “play God” and create new life.
In Conclusion:
De-extinction has driven substantial progress in science and has opened doors to new ways of conserving animals and habitats. However, many disadvantages come with it, posing the question: should de-extinction continue to be used, and if so, should there be limits on what scientists can and cannot do with the genetic engineering of extinct animals?
While the concepts of space and time were fundamental to the Newtonian world, centuries of digging deeper into the mechanics of our universe have uncovered that it isn’t all as simple as it seems. From Einstein’s Special Relativity to theories of multi-dimensional time, the science behind space and time has evolved into a complex field.
Why Extra Temporal Dimensions?
The search for extra spatial dimensions raises the question of whether extra temporal dimensions might exist as well. If space can have more dimensions, why can’t time? The motivation to explore extra temporal dimensions arises from a desire to better understand the nature of time and the symmetries between space and time.
Another reason to study extra temporal dimensions is the desire to unify seemingly disconnected aspects of time. Many frameworks for extra temporal dimensions have revealed previously unnoticed symmetries and relationships between different temporal systems that would not be discovered while working in only one dimension.
The concept of “complex time” is used to fix some of the problems of quantum mechanics. This idea suggests that time should be represented as a complex value rather than a real number. It would allow more ways to represent wave-particle duality, entanglement, and other fundamental concepts of quantum physics.
2T-Physics
Proposed by physicist Itzhak Bars, 2T-Physics suggests that the one dimension of time we experience is really just a “shadow” of the real two dimensions of time. The core motivation of 2T-Physics is to reveal the deeper temporal connections that we don’t see in our one-dimensional perspective. In 2T-Physics, two seemingly disconnected temporal systems are actually connected and represent different views or ‘shadows’ of the same two-dimensional time.
2T-Physics unifies a wide range of physical systems using “gauge symmetry”: the property of a system that a set of transformations, called gauge transformations, can be applied to it without changing any of its measurable physical properties. Bars also showed that the Standard Model can be reproduced within 2T-Physics using four spatial dimensions. Not only does this model predict most of the Standard Model, but it also resolves some long-standing quantum issues.
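The textbook example of a gauge symmetry, which 2T-Physics generalizes, comes from electromagnetism (a standard illustration in conventional notation, not taken from Bars’s papers):

```latex
A_\mu \;\longrightarrow\; A_\mu + \partial_\mu \lambda(x),
\qquad
F_{\mu\nu} \equiv \partial_\mu A_\nu - \partial_\nu A_\mu \;\longrightarrow\; F_{\mu\nu}
```

Shifting the potential \(A_\mu\) by the gradient of any function \(\lambda(x)\) leaves the field strength \(F_{\mu\nu}\), and hence every measurable prediction, completely unchanged: a gauge transformation alters the mathematical description without altering the physics.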
An interesting difference between the Standard Model and the predictions of 2T-Physics concerns the gravitational constant. While it is currently established that the coefficient in gravitational equations is a constant, G ≈ 6.67 × 10⁻¹¹ N·m²/kg², the mathematics of 2T-Physics implies that this “constant” takes different values during different periods of our universe (inflation, grand unification, etc.). This opens possibilities for the early expansion of our universe that General Relativity and the Standard Model do not allow. Through these new perspectives, 2T-Physics offers a more complete framework for gravity, especially in higher dimensions.
While 2T-Physics is mathematically well developed, it remains highly theoretical and has had little practical impact. There is no evidence directly supporting the theory, but it predicts certain connections between different physical systems that could in principle be verified through complex experiments, though none have been conducted so far. Above all, 2T-Physics provides a new perspective on time and the nature of the laws of physics that has opened the eyes of many scientists and will likely inspire future discoveries.
3D Time
One of the most recent papers in the field, by Gunther Kletetschka (2025), proposes a mathematical framework of spacetime that includes three temporal dimensions, offering a new perspective on combining gravity and quantum mechanics. Instead of two hidden dimensions of time, Kletetschka theorizes that each dimension represents time at a different scale: the quantum scale, the interaction scale, and the cosmological scale. He explains that the other two dimensions are not visible in daily life because they operate at very small (quantum) or very large (cosmological) scales.
A major difference between this theory and conventional physics is that while conventional physics considers space to be vastly different from time, Kletetschka proposes that space is a byproduct of time in these three dimensions rather than an entirely separate entity. What we experience as mass or energy actually arises from the curvature of time across the three dimensions. As Kletetschka explored this further, he found surprising consistency in the mathematics, prompting a deeper exploration of the concept.
The key to avoiding causality issues and instability in the theory was the use of regular geometry and ordinary spatial dimensions rather than exotic constructions that are hard to prove or test. The theory aims to address many long-standing problems in quantum mechanics, and its success so far makes it a prominent theory in the field.
The theory is able to add extra temporal dimensions without causing causality issues, something very few theories of its type have been able to grapple with. This is due to its structure. The theory is designed so that the three axes share an ordered flow, preventing an event from happening before its cause. Furthermore, these three axes operate at very different scales, leaving very little overlap between them. The mathematics of the framework does not allow for the alteration of events in the past, something that many other theories allow.
The theory offers physical significance and a connection to our world alongside mathematical consistency. Features such as finite quantum corrections, which other theories could not produce, emerge from this model without adding extra complexity.
This mathematical framework predicts several properties and new phenomena that can be experimentally tested, opening pathways to prove or disprove it soon. Meanwhile, many scientists have spoken in support of the theory, considering it a promising candidate for a “Theory of Everything” just a few months after its publication.
Conclusion
While the theoretical motivation for extra dimensions is compelling, the reality of their existence remains unconfirmed. Meanwhile, the scientific community works to experimentally prove or disprove their existence through observational evidence.
The Large Hadron Collider (LHC) at CERN is one of the major players on the experimental side. It hosts many experiments, a few of which are highlighted below.
Tests for Microscopic Black Holes: Many theories that propose extra dimensions predict that gravity becomes much stronger at very short distances. One physical manifestation would be microscopic black holes that dissipate almost instantly through Hawking radiation. The byproducts of this dissipation would be sprays of particles detectable by the LHC.
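The near-instantaneous dissipation follows from Hawking’s temperature formula (a standard result, not specific to any one extra-dimension model):

```latex
T_H \;=\; \frac{\hbar c^{3}}{8\pi G M k_B}
```

The temperature is inversely proportional to the mass \(M\), so a black hole far lighter than any star would be extraordinarily hot and would radiate away its entire mass in a tiny fraction of a second, ending in a burst of particles that detectors can search for.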
The Graviton Disappearance: Another common feature of extra-dimensional theories is that gravity is carried by a particle called the graviton. A graviton could escape into the extra dimensions, taking energy with it and leaving a measurable imbalance in the total energy of the collision.
While experiments have managed to provide more limitations for potential values that would work in certain theories, they have yet to prove or disprove them.
Meanwhile, it is important to consider what extra dimensions would mean for us and the way we live. The concept of extra dimensions provides multiple philosophical considerations for us as humans. This concept completely changes our worldview and affects our perception of the universe. Dr. Michio Kaku explains this through the analogy of a fish in a pond, unaware of the world outside its simple reality. Our perception of reality is limited, not only by our understanding of physics, but also by the biology of our brains.
The work towards a “Theory of Everything” is not only a physical goal but a philosophical one as well. We strive to understand our universe and everything within it in the simplest way possible. It embodies human desire for ultimate knowledge and drives centuries of physical progress.
Overall, the concept of extra dimensions represents one of the most arduous and ambitious goals in human history. While they lack proof, these theories motivate people to probe deeper into the nature of our universe and question the very fabric of our reality. This exploration truly shows who we are as humans and will continue to motivate generations of physicists to question the very nature of everything.
DUFF, M. J. (1996). M THEORY (THE THEORY FORMERLY KNOWN AS STRINGS). International Journal of Modern Physics A, 11(32), 5623–5641. https://doi.org/10.1142/s0217751x96002583
Gunther Kletetschka. (2025). Three-Dimensional Time: A Mathematical Framework for Fundamental Physics. Reports in Advances of Physical Sciences, 09. https://doi.org/10.1142/s2424942425500045
Kalligas, D., Wesson, P. S., & Everitt, C. W. F. (1995). The classical tests in Kaluza-Klein gravity. The Astrophysical Journal, Part 1, 439(2). https://ntrs.nasa.gov/citations/19950044695
Lloyd, S., Maccone, L., Garcia-Patron, R., Giovannetti, V., Shikano, Y., Pirandola, S., Rozema, L. A., Darabi, A., Soudagar, Y., Shalm, L. K., & Steinberg, A. M. (2011). Closed Timelike Curves via Postselection: Theory and Experimental Test of Consistency. Physical Review Letters, 106(4). https://doi.org/10.1103/physrevlett.106.040403
While the concepts of space and time were fundamental to the Newtonian world, centuries of digging deeper into the mechanics of our universe have uncovered that it isn’t all as simple as it seems. From Einstein’s Special Relativity to theories of multi-dimensional time, the science behind space and time has evolved into a complex field.
What are Extra Spatial Dimensions?
As scientists explored further into spacetime, theories of more dimensions of space, beyond the three we know, were suggested as a way to explain many of the phenomena that we cannot explain with only three dimensions. These ideas gained most of their traction from the pursuit to combine quantum mechanics with General Relativity, especially issues such as quantum gravity. These theories also attempt to address the rapid growth of the universe after the Big Bang.
What were the motivations to search for Extra Dimensions?
The idea of more dimensions began as a way to unify the fundamental forces of our universe. Modern theories regarding these ideas come from a drive to resolve some of the unaddressed issues of the Standard Model of physics. While the Standard Model is able to describe fundamental particles and the strong, weak, and electromagnetic forces, it is unable to describe gravity. In addition, the Standard Model cannot address dark matter and dark energy, which make up the majority of our universe.
One of the most significant problems in physics is the hierarchy problem: the massive gap in strength between gravity and the other three fundamental forces. Extra-dimensional theories attempt to resolve this by suggesting that while gravity may be just as strong as the other forces, its strength leaks into the extra dimensions, diluting it in ours.
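This “leaking” argument can be made quantitative. In the large-extra-dimensions picture of Arkani-Hamed, Dimopoulos, and Dvali, the Planck scale we observe, \(M_{\mathrm{Pl}}\), is related to the true fundamental gravity scale \(M_*\) by (schematically):

```latex
M_{\mathrm{Pl}}^{2} \;\sim\; M_{*}^{\,n+2}\, R^{n}
```

where \(n\) is the number of extra dimensions and \(R\) their size. If \(M_*\) sits near the scale of the other forces, gravity only appears weak to us because its influence spreads through the extra volume \(R^{n}\).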
This search to discover extra dimensions is not only about solving these specific technical issues; it’s about the centuries-long quest to find a Theory of Everything. Physicists constantly strive to find simpler solutions to describe our universe rather than leaning on hyperspecific coefficients/constants.
While there are many theories involving extra-spatial dimensions, part 2 will focus on a few of the biggest and most influential theories so far.
Kaluza-Klein Theory
In 1919, Theodor Kaluza proposed a theory with four dimensions of space as an attempt to combine gravity and electromagnetism. This theory was later built upon by Oskar Klein in 1926.
In Kaluza’s attempt to combine these fundamental forces, he suggested a fourth, unseen spatial dimension. To build this system, he extended Einstein’s equations into five dimensions. He found that the five-dimensional version of Einstein’s equations naturally contained the four-dimensional version within it. The five-dimensional metric had fifteen independent components: ten described our four-dimensional General Relativity, four described the electromagnetic force through Maxwell’s equations, and the last was a scalar field with no known use at the time.
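This fifteen-component bookkeeping is usually displayed through the Kaluza-Klein ansatz for the five-dimensional metric (a common modern form, simplified here; conventions and factors vary between texts):

```latex
\hat{g}_{AB} \;=\;
\begin{pmatrix}
g_{\mu\nu} + \phi\, A_\mu A_\nu & \phi\, A_\mu \\
\phi\, A_\nu & \phi
\end{pmatrix}
```

The symmetric \(4\times 4\) block \(g_{\mu\nu}\) carries the ten components of the ordinary spacetime metric, the column \(A_\mu\) carries the four components of the electromagnetic potential, and \(\phi\) is the single scalar field, giving \(10 + 4 + 1 = 15\).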
A key concept of Kaluza-Klein theory is that, rather than treating electric charge as simply a given quantity, it is represented as motion along the fifth dimension. The attempt to create the simplest mathematical structure that could represent the five dimensions led to the assumption that no part of the five-dimensional Einstein equations depends explicitly on the fifth coordinate. Its presence was there to resolve other issues without disrupting the basic workings of Einstein’s equations. To enforce this, Kaluza imposed the “cylinder condition,” requiring that all derivatives with respect to the fifth coordinate vanish, effectively hiding that dimension at the macroscopic level while preserving the four dimensions that we experience.
Oskar Klein provided a physical explanation for the cylinder condition in 1926. He suggested that the fifth dimension is compactified, curled up into an unobservable circle with an incredibly small radius, which explains why we are unable to perceive it.
An interesting way to understand this is to think of a hose. From a distance, the hose looks like a single-dimensional line. However, the hose actually has two dimensions, both a dimension of length as well as a circular dimension.
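Klein’s circle has a concrete, testable consequence. Because a particle’s wave function must return to the same value after going once around the circle of radius \(R\), momentum along the fifth dimension is quantized (a standard result, stated here in simplified form):

```latex
p_5 \;=\; \frac{n\hbar}{R}, \quad n = 0, \pm 1, \pm 2, \ldots
\qquad\Longrightarrow\qquad
m_n \;\simeq\; \frac{|n|\,\hbar}{R\,c}
```

From our four-dimensional viewpoint, this fifth-dimensional momentum looks like mass: every known particle should come with a tower of increasingly heavy copies, the very Kaluza-Klein resonances that experiments have since searched for.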
This theory revolutionized how physicists thought about spacetime. In a letter to Kaluza that same year, Einstein wrote,
“The idea of achieving unification by means of a five-dimensional cylinder world never dawned on me […]. At first glance, I like your idea enormously. The formal unity of your theory is startling.” (Einstein, 1919)
Over time, Kaluza-Klein theory has been abandoned due to several fundamental flaws. Scientists have searched for Kaluza-Klein resonances, particles that would have to exist if the theory were true, and have found none. In addition, Kaluza-Klein theory only addresses gravity and electromagnetism, excluding the strong and weak forces. When combined with quantum mechanics, it predicts incorrect values for otherwise known constants, showing massive discrepancies. Despite these issues, Kaluza-Klein theory has long been considered the first step into the exploration of extra dimensions and the precursor to many theories in the decades that followed. Its core idea, that hidden dimensions give rise to forces in our four dimensions, has been crucial to further exploration of the concept of spacetime.
String Theory is a very common term, but few people know what it actually means. String Theory proposes that instead of the universe being made up of zero-dimensional point particles, it is made up of tiny vibrating strings. The specific vibration of a string determines which particle it appears as (photon, quark, etc.). The theory aims to unify all of these different particles and properties into one object: the string.
When physicists first began to work on String Theory, they found many mathematical issues, such as negative probabilities. In four dimensions, strings don’t have enough room to produce the wide range of vibrations needed to create all the particles of the Standard Model. Thus, Superstring Theory suggests that these strings live in ten dimensions (nine of space and one of time). A major reason physicists embraced string theory at the time was that it naturally predicted a particle called the ‘graviton’, which would produce the same effect as the force of gravity. Theoretical physicist Edward Witten has commented on this by saying,
“Not only does [string theory] make it possible for gravity and quantum mechanics to work together, but it […] forces them upon you.” (Edward Witten, NOVA, PBS)
M-Theory is an extension of String Theory that adds one more spatial dimension. Prior to its creation, different groups of physicists had created five versions of String Theory.
However, a true “Theory of Everything” should be one theory, not five possibilities.
M-Theory was created as an attempt to unify these five types of String Theory. The key to its development was the discovery of mathematical transformations that map one version of String Theory onto another, showing that these were not truly separate theories. M-Theory proposed that the different versions were just different approximations of the same underlying theory, which could be unified by adding another dimension. M-Theory’s eleven-dimensional framework allowed the unification of these five theories alongside the theory of supergravity.
M-Theory, like Kaluza-Klein Theory, proposes that the extra dimensions are curled up and compactified. String compactifications typically use a specific class of geometric shapes, known as Calabi-Yau manifolds, to generate the physical effects we observe in our four dimensions from the hidden ones. Calabi-Yau manifolds are highly compact, complex shapes that are foundational here because they allow intricate folding without affecting the overall curvature of our universe, through a property called “Ricci-flatness”. The “holes” within these manifolds are thought to correspond to the number of families of particles we observe in the Standard Model. This introduces a key idea: instead of the fundamental laws of physics being arbitrary rules, they may actually be geometric properties of our universe.
The biggest challenge facing M-Theory is its lack of experimental evidence. Its predictions cannot be tested by currently available or foreseeable technology because of the high-dimensional, microscopic scales involved. Without testable predictions, the theory remains unverified for the time being.
Despite this lack of proof, many physicists still see M-Theory as a prominent candidate in our search for a “Theory of Everything”. Its mathematical consistency and its ability to unify both gravitational and quantum effects lead to it being considered highly promising.
However, while the math behind M-Theory is highly developed, it is not yet complete. The theory is still a work in progress as research is being conducted to better understand its structure and significance.
Meanwhile, critics believe that M-Theory is fundamentally flawed. Many point to the “Landscape” problem: the theory predicts a vast number of possible universes, each with its own set of physical laws. Critics argue that this prediction undermines M-Theory’s reliability and that a true “Theory of Everything” should apply uniquely to our universe.
Overall, M-Theory has neither been proven nor disproven and remains a crucial area for future exploration.
Mist is composed of tiny droplets of water suspended in the air. It often appears white or grey and seems to float over the land. Mist forms when warmer air over water meets cooler air: the rapid cooling condenses invisible water vapor into tiny droplets. It can also form when warm air over land meets cooler air from the ocean. These droplets, held near the Earth’s surface by condensation, scatter light, which is what makes them visible. Fun Fact: While fog and mist are similar, they are not the same thing. Mist tends to be less dense than fog and does not last as long.
Crepuscular rays look like sunbeams raining down from a single point, with alternating dark and light areas. They are often orange or red and form when sunlight shines through gaps in the clouds, typically during sunrise or sunset, which gives them their color. The rays are visible because the sunlight strikes vapor, dust, and other particles as it passes through the clouds, and the contrast between shadow and light is high enough for us to notice. The particles scatter the sunlight and create distinct beams. Fun Fact: The rays are actually parallel; perspective makes them appear to converge.
Mammatus clouds are rounded pouches of cloud that hang from the underside of a larger cloud. They often form during the warmer months when cool air sinks into warmer, drier air. Mammatus clouds get their unique look when cooler air containing ice crystals and water droplets descends, and the moisture condenses into pouch-like shapes. They are associated with storms because the cooler, sinking air typically comes from cumulonimbus clouds connected to thunderstorms. Fun Fact: They form in the opposite way from most clouds (which form as air rises and cools), and aircraft avoid them because they can indicate storm activity and severe thunderstorms.
Anticrepuscular rays look like horizontal crepuscular rays. This phenomenon appears when rays of light and shadow converge at the point opposite the sun, making the rays appear to diverge horizontally even though they are parallel.
Virga is streaks of precipitation that fall from a cloud but evaporate before reaching the ground. It looks like wispy trails and is most often seen in deserts or other hot, dry places. Although the precipitation never reaches the ground, it is often picked up by radar as rain.
References
“Crepuscular Rays and Light Scattering.” Nasa.gov, NASA Earth Observatory, 17 July 2022, earthobservatory.nasa.gov/images/150090/crepuscular-rays-and-light-scattering.
“Mammatus Clouds | Center for Science Education.” Scied.ucar.edu, scied.ucar.edu/image/mammatus-clouds.
Office, Met. “Virga Clouds.” Met Office, 21 June 2018, weather.metoffice.gov.uk/learn-about/weather/types-of-weather/clouds/other-clouds/virga
SpatialNasir. “What’s the Difference between Cloud, Fog, Haze and Mist?” Medium, 7 Sept. 2019, geoafrikana.medium.com/whats-the-difference-between-cloud-fog-haze-and-mist-a06c7cf0cbf3. Accessed 2 Aug. 2025.
Advancements in genetic engineering have brought revolutionary tools to the forefront of biotechnology, with CRISPR leading as one of the most precise and cost-effective methods of gene editing. CRISPR, which stands for Clustered Regularly Interspaced Short Palindromic Repeats, allows scientists to alter DNA sequences by targeting specific sections of the genome. Originally discovered as part of a bacterial immune system, CRISPR systems have now been adapted to serve as programmable gene-editing platforms. This paper explores how CRISPR works, its current uses, its future potential, and the ethical considerations surrounding its application in both human and non-human systems.
How the CRISPR System Works
The CRISPR-Cas system operates by combining a specially designed RNA molecule with a CRISPR-associated protein, such as Cas9 or Cas12a. The RNA guides the protein to a specific sequence in the genome, where the protein then cuts the DNA. Once the strand is cut, natural repair mechanisms within the cell are activated. Researchers can either allow the cell to disable the gene or insert a new gene into the gap. As described by researchers at Stanford University,
“The system is remarkably versatile, allowing scientists to silence genes, replace defective segments, or even insert entirely new sequences.” (CRISPR Gene Editing and Beyond)
This mechanism has been compared to a pair of molecular scissors that can cut with precision. For example, the Cas9 protein is programmed with a guide RNA to recognize a DNA sequence of about 20 nucleotides. Once it finds the target, it makes a double-stranded cut. The repair process that follows enables gene knockouts, insertions, or corrections. This technology has dramatically reduced the time and cost associated with gene editing, making previously complex tasks achievable in weeks rather than months. According to a 2020 review,
“CRISPR/Cas9 offers researchers a user-friendly, relatively inexpensive, and highly efficient method for editing the genome.” (Computational Tools and Resources Supporting CRISPR-Cas Experiments)
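The roughly 20-nucleotide targeting rule described above can be illustrated with a toy script. The sketch below is illustrative only (real guide design uses dedicated tools and considers both DNA strands, off-target sites, and efficiency scores; the sequence is made up): it simply scans a DNA string for 20-letter protospacers immediately followed by Cas9’s NGG PAM motif.

```python
# Sketch: naive scan for candidate Cas9 target sites in a DNA sequence.
# A "site" here is a 20-nucleotide protospacer immediately followed by an
# NGG PAM (any base, then two guanines). Toy example, not a real genome.

def find_cas9_sites(seq, protospacer_len=20):
    """Return (position, protospacer, pam) for every NGG PAM site found."""
    seq = seq.upper()
    sites = []
    # Stop early enough that both the protospacer and 3-letter PAM fit.
    for i in range(len(seq) - protospacer_len - 2):
        pam = seq[i + protospacer_len : i + protospacer_len + 3]
        if pam[1:] == "GG":  # NGG: first base can be anything
            sites.append((i, seq[i : i + protospacer_len], pam))
    return sites

demo = "ATGCGTACCGTTAGCATCGATCGGTACGATCGTAGCTAGG"
for pos, spacer, pam in find_cas9_sites(demo):
    print(pos, spacer, pam)
```

In this toy sequence the scan finds two candidate sites, one ending in a CGG PAM and one in an AGG PAM; in a real genome the same rule yields thousands of candidates, which is why ranking tools exist.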
CRISPR’s influence extends across many fields, but its role in medicine has attracted the most attention. Scientists are using CRISPR to treat genetic diseases such as sickle cell anemia by editing patients’ own stem cells outside the body and then reinserting them. In 2023, researchers published results showing that a single treatment could permanently alleviate symptoms for some patients with these genetic diseases (Zhang 4). Another area of exploration includes its potential for treating cancers by modifying immune cells to better recognize and destroy cancerous tissue. According to Molecular Cancer,
“Gene editing technologies have successfully demonstrated the correction of mutations in hematopoietic stem cells, offering hope for long-term cures.” (Zhang 3)
Beyond human health, CRISPR has transformed agricultural practices. Scientists are using it to develop crops that resist pests, drought, or disease without the need for traditional genetic modification methods that insert foreign DNA. One of the slower traditional methods is conjugation, in which genetic material is transferred between bacterial cells through direct contact; it is just one of many traditional genetic modification techniques.
CRISPR has been used to produce tomatoes with longer shelf lives and rice varieties that can survive in low-water environments. According to the World Economic Forum,
“CRISPR can help build food security by making crops more resilient and nutritious.” (CRISPR Gene Editing for a Better World)
Such developments are increasingly critical in addressing global food demands and climate challenges.
Research is also underway to apply CRISPR in animal breeding and disease control. In mosquitoes, scientists are testing ways to spread genes that reduce malaria transmission. In livestock, researchers are working to produce animals that are more resistant to disease. These experiments, while promising, require cautious monitoring to ensure ecosystem stability and safety.
Future Potential
Looking ahead, new techniques are refining CRISPR’s capabilities. Base editing allows researchers to change a single letter of DNA without cutting the strand entirely, reducing off-target effects. Prime editing, a newer method, uses an engineered protein to insert new genetic material without causing double-stranded breaks. These tools provide even finer control. According to the Stanford report,
“Prime editing may become the preferred approach for correcting single-point mutations, which are responsible for many inherited diseases.” (CRISPR Gene Editing and Beyond)
Possible Concerns
Despite its potential, CRISPR also raises important ethical concerns. One of the most debated topics is germline editing, or the modification of genes in human embryos or reproductive cells. Changes made at this level can be passed down to future generations, leading to unknown consequences. In 2018, the birth of twin girls in China following germline editing sparked international outrage and led to widespread calls for stricter regulation. The scientific community responded swiftly, with many organizations calling for a global prohibition on clinical germline editing. As CRISPR & Ethics – Innovative Genomics Institute (IGI) states,
“Without clear guidelines, genome editing can rapidly veer into ethically gray areas, particularly in germline applications.”
Another concern is the potential for unintended consequences, known as off-target effects. These are accidental changes to parts of the genome that were not intended to be edited, which could lead to harmful mutations or unforeseen health problems. Researchers are actively developing tools to better predict and detect such errors, but long-term safety remains a topic of study. The possibility of using CRISPR for non-therapeutic purposes, such as enhancing physical or cognitive traits, raises further ethical questions.
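A crude way to see why off-target effects are hard to avoid is to count near-matches of a guide sequence elsewhere in the genome. The toy screen below is a sketch only (real tools such as Cas-OFFinder model PAM variants, bulges, and genome-scale data; all sequences here are made up): it flags every same-length window of a genome string within a few mismatches of the guide.

```python
# Sketch: a naive off-target screen using mismatch counts.
# Flags every window of the genome whose Hamming distance to the
# guide is at most max_mismatches. Illustrative sequences only.

def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def off_target_candidates(guide, genome, max_mismatches=3):
    """Return (position, window, mismatches) for near-matching windows."""
    guide, genome = guide.upper(), genome.upper()
    hits = []
    for i in range(len(genome) - len(guide) + 1):
        window = genome[i : i + len(guide)]
        d = hamming(guide, window)
        if d <= max_mismatches:
            hits.append((i, window, d))
    return hits

guide = "ACGTACGTAC"
genome = "TTACGTACGTACGGACGAACGTAC"
for pos, window, mismatches in off_target_candidates(guide, genome):
    print(pos, window, mismatches)
```

Even in this short toy genome, the exact target site is joined by several near-matches with one or two mismatches, which is exactly the kind of site an editing enzyme might cut by accident.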
Cost and accessibility are also significant factors. Although the CRISPR tools themselves are affordable for research institutions, the cost of CRISPR-based therapies remains high. According to Integrated DNA Technologies,
“Therapies based on CRISPR currently cost hundreds of thousands of dollars per patient, limiting their availability.” (CRISPR-Cas9: Pros and Cons)
Bridging this gap requires investments in infrastructure, policy development, and global partnerships to ensure that developing countries are not left behind.
In conclusion, CRISPR is reshaping the landscape of genetics and biotechnology. It has already brought major advances to medicine, agriculture, and environmental science. While the technology is still evolving, its precision offers a glimpse into the future of human health. CRISPR has the potential to unlock solutions to some of humanity’s most pressing challenges.
Lino, Cathryn A., et al. “Delivering CRISPR: A Review of Methods and Applications.” Drug Delivery and Translational Research, vol. 8, no. 1, 2020, pp. 1–14. PubMed Central, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7427626/. Accessed 31 July 2025.