Understanding Biotech: A Primer on Small-Molecule Drugs
Since the earliest human records on medicine in the Mesopotamian civilizations of Sumer and Babylon, we have sought to alter the course and progression of human ailments with external substances found in nature. These civilizations, and most since, lacked the comprehensive scientific understanding of biology, chemistry, and physics we have today, so where progress was made, it was by happenstance. For example, Egyptian records show willow bark being used to treat pain, inflammation, and fever; in the 19th century, scientists extracted salicin from the bark and created acetylsalicylic acid, or aspirin. While it is impressive that ancient civilizations stumbled upon effective remedies, these were few and far between. It took centuries or millennia for new effective remedies to be found and for the cruel, unusual, and ineffective remedies, which were even more common, to be weeded out. Just two centuries ago, for example, the practice of bloodletting is believed to have been a significant contributor to George Washington’s death from a throat infection.
The evolution of biology, driven by the invention of microscopes, the discovery of cells, and the elucidation of the role of DNA, has led to an explosion of progress within the medical sciences. Scientific theories are explanations of why certain phenomena arise in the world. They assert the existence of real things and processes that can be tested by experiment. Thus they form a scaffold for future research and provide a mechanism for their own overthrow by competing theories. When Darwin formulated his theory of evolution, implicit in it was some way for information to be passed down from generation to generation, even though he acknowledged in On the Origin of Species that the exact mechanism was unknown. Armed with the knowledge that something must exist to fulfill this role, later scientists worked out how organisms store heritable information. With the unit of selection found in the genes encoded by DNA, we had for the first time a foundational understanding of how observed phenomena in the body arise.
Since the 20th century, with a better understanding of cells and biological processes, we’ve dreamed of powerful ways to counteract abnormal states in the body. Pharmaceuticals and biologics began to appear and advance rapidly. When this coincided with the information and computer age after World War II, we dreamed of powerful microscopic machines that could live in our bodies and counteract damage and decay in real time. In popular culture and media, these appeared as scaled-down machines, like little ships that could navigate the body.
In this post and two follow-ups I will provide a primer on the three major categories of biotechnology that I alluded to above: drugs, biologics, and nanotechnology. For each, I will outline what they are and how they differ from each other, trace how they have evolved over the years with a case study on a flagship example, and touch on some of the modern innovations around their development. My goal is for these primers to serve as a scaffold for future posts within the biotechnology domain.
Definitions:
Small Molecule Drugs:
Chemically synthesized molecules that are designed to interact with specific biological pathways in the body.
Some substances that would otherwise be classed as biologics, like antibiotics, are considered small-molecule drugs because of their small size, their relative simplicity to create, and the major differences between their development processes and those of other biologics.
Examples include aspirin, antibiotics, and statins
Biologics:
Substances that are derived from living organisms or their cells
In some cases biologics can be whole cells like in the case of engineered T cells
Other examples include hormones like insulin, antibodies, and vaccines
Nanotechnology:
The creation of materials or substances, like nanoparticles, that can perform functions at the 1–100 nanometer scale in cells, such as delivering particles into cells or repairing damage. For reference, a DNA double helix is around 2 nm in diameter and a water molecule is about 0.275 nm.
Examples include the nanoparticles that carry mRNA vaccines and other therapeutics into cells.
Small Molecule Drugs:
The concept of drug use, that is, external chemical compounds administered to the body, stretches as far back as written history. Initially, use was based on rules of thumb that certain natural substances appeared to have a given effect. The next big breakthroughs consisted of isolating the active ingredients within those natural substances. Beginning in the 1800s, the scientific field of chemistry enabled the first isolations and syntheses of these compounds. In 1805, morphine was isolated from opium, and near the end of the same century, in 1897, aspirin was artificially synthesized by Bayer. This synthesis marked the beginning of modern pharmacology. Throughout the 20th century, advances in the scientific understanding of biology and chemistry led to the mass production of antibiotics like penicillin, statins, chemotherapeutics, and many more.
The defining characteristic of drugs, also known as small-molecule drugs, is that they can be isolated, synthesized, and modified. As we’ll discuss with biologics, the first methods of creating these compounds were highly inefficient and revolved around isolating substances from naturally occurring processes.
The History of Aspirin
The naturally occurring precursor to aspirin, salicin, was originally found in and extracted from willow bark. Salicin is converted by the body to salicylic acid, which blocks the production of prostaglandins and thereby inhibits pain, fever, and inflammation responses. Extracting salicin for pharmaceutical use involved a multi-step process: grinding up the bark, dissolving the compounds in water or ethanol, filtering out unwanted insoluble matter like cellulose, and several rounds of crystallizing/precipitating the salicin out of the rest of the mixture to purify it. In 1829, the French chemist Henri Leroux isolated 30 grams of purified salicin from 1.5 kilograms of willow bark.
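To put the inefficiency of early extraction in perspective, Leroux's own figures imply a mass yield of only about 2%, a quick calculation using the numbers above:

```python
# Mass yield of Leroux's 1829 salicin extraction, using the figures above.
salicin_g = 30           # purified salicin recovered (grams)
bark_g = 1.5 * 1000      # willow bark processed (1.5 kg, in grams)

yield_pct = salicin_g / bark_g * 100
print(f"Extraction yield: {yield_pct:.1f}%")  # → Extraction yield: 2.0%
```

In other words, roughly 98% of the processed bark was discarded as waste, which is why extraction alone could never scale commercially.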
This process, while a major milestone for pharmaceutical science, was far too inefficient for commercial use. Additionally, pure salicin caused gastric distress, which led to a desire to modify its structure. In the 1850s, another French chemist, Charles Gerhardt, experimented with adding an acetyl group to salicylic acid to form acetylsalicylic acid (ASA). His process, more an experiment than a method, did not produce a stable or pure substance and was never commercialized. Around the same time, chemical synthesis of salicylic acid via the Kolbe synthesis, using sodium phenolate, finally liberated the process from natural extraction from willow bark. Building on this, in 1897 the German chemist Felix Hoffmann at Bayer developed a scalable process for creating ASA, using acetic anhydride to add the acetyl group to synthesized salicylic acid, mass producing what has been marketed ever since as Aspirin.
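The acetylation reaction Hoffmann scaled up converts salicylic acid and acetic anhydride into ASA plus acetic acid in a 1:1 molar ratio, so the theoretical output is easy to compute from standard molar masses. The 100 g input below is a hypothetical batch size chosen for illustration:

```python
# Theoretical yield of the acetylation step (stoichiometry only):
#   salicylic acid + acetic anhydride -> acetylsalicylic acid + acetic acid
# Molar masses are standard values in g/mol.
MW_SALICYLIC = 138.12    # C7H6O3
MW_ASA = 180.16          # C9H8O4

salicylic_g = 100.0                  # hypothetical input mass of salicylic acid
moles = salicylic_g / MW_SALICYLIC   # 1:1 molar ratio with ASA
asa_g = moles * MW_ASA               # ASA mass at 100% conversion
print(f"Theoretical ASA yield: {asa_g:.1f} g")  # ≈ 130.4 g
```

Real batches fall short of this theoretical maximum, of course; how close a process gets is one measure of its efficiency.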
Small Molecule Drug Manufacture: A Case Study on Aspirin
Taking aspirin as an example, let’s look at the evolution of small-molecule drug manufacture. Since chemistry is chemistry, the overall requirements of the process, in terms of reactants and reactions, generally remain the same; the primary differences lie in the technology that enables the process. The historical Bayer process was mostly manual. The reactants (salicylic acid and acetic anhydride) were added in measured quantities to large glass or iron vessels along with a catalyst of either sulfuric or phosphoric acid and heated to 90 degrees Celsius. The ASA was then dissolved in water or ethanol and repeatedly, and manually, cooled, crystallized, and filtered to make it as pure as possible. The crystals were dried in ovens or left to air-dry. Aspirin was originally sold as a powder, but when it switched to tablet form, the powder was mixed with binders like starch and pressed into molds by hand or by steam-operated machines. Quality control was limited to visual inspection or testing of a small subset of batches. Batches were quite small by today’s standards given the equipment available and could take hours or days to complete given the highly manual processes involved.
Contrast this with the highly automated and technologically advanced drug factories of today. Recently, the concept of continuous manufacturing has taken off as a way to drastically increase yield and efficiency. To enable continuous manufacturing, the process has become highly automated. The automation helps control quantity, temperature, reaction speed, and other sources of human error. With aspirin, for example, reactants might be continuously fed into a machine called a plug flow reactor; as they progress along the tube, the products of the reaction are formed. The plug flow reactor allows for continuous production because it is constantly being fed reactants, and the progression through the system creates a precise time interval for the reaction that guarantees high conversion. (Conversion is the fraction of reactants that successfully react to form the product.) The addition of reactants and catalyst is precisely metered by automated systems to avoid human error in chemical concentrations.
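The link between that "precise time interval" and conversion can be sketched with the idealized plug-flow-reactor model: residence time is reactor volume divided by flow rate, and for a first-order reaction the conversion is 1 − e^(−k·τ). The rate constant, volume, and flow rate below are illustrative placeholders, not real aspirin process data:

```python
import math

# Idealized plug flow reactor: residence time determines conversion.
# For a first-order reaction, conversion X = 1 - exp(-k * tau),
# where tau = reactor volume / volumetric flow rate.
k = 0.05      # rate constant, 1/s (hypothetical)
V = 200.0     # reactor volume, L (hypothetical)
Q = 2.0       # feed flow rate, L/s (hypothetical)

tau = V / Q                  # residence time in the tube: 100 s
X = 1 - math.exp(-k * tau)   # fraction of reactant converted
print(f"tau = {tau:.0f} s, conversion = {X:.1%}")  # → tau = 100 s, conversion = 99.3%
```

This is why the tube geometry and flow rate matter: together they fix how long every bit of reactant spends reacting, so conversion is consistent from one moment to the next rather than varying batch to batch.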
After the initial reaction, the crystals make their way through a series of additional machines that would make the early Bayer chemists envious. Instead of manual filtration and purification, large-scale systems filter the aspirin out of the impure slurry produced upstream, send it to be evenly cooled and crystallized, and then to a centrifuge to further separate the crystallized aspirin from the rest of the mixture. From there, the crystals are dried in trays under circulating air, or by suspending the crystals in hot air, within systems that continuously monitor heat and moisture content. In modern factories, these machines are typically connected to allow for continuous manufacture and can be automatically tested for quality control with integrated tools like inline spectrometers. These spectrometers shine light onto the sample, collect data on how it is absorbed, refracted, or scattered, and run the data through predefined models to automatically monitor for deviations in purity, concentration, or other metrics. Finally, automated tablet presses make thousands of tablets per minute and coat them with a fine spray in a tumbling drum to make the drug easier to swallow.
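One of the simplest "predefined models" an inline spectrometer can apply is the Beer–Lambert law, which relates measured absorbance to concentration. The sketch below shows the idea of flagging off-spec samples automatically; the molar absorptivity, target, and tolerance values are hypothetical, not real calibration data:

```python
# Beer-Lambert sketch of inline quality monitoring:
#   absorbance A = epsilon * path_length * concentration
# so concentration c = A / (epsilon * path_length).
EPSILON = 1200.0   # molar absorptivity, L/(mol*cm) (hypothetical)
PATH_CM = 1.0      # optical path length of the flow cell, cm

def concentration(absorbance: float) -> float:
    """Infer concentration (mol/L) from a measured absorbance."""
    return absorbance / (EPSILON * PATH_CM)

def in_spec(absorbance: float, target: float = 0.010, tol: float = 0.0005) -> bool:
    """Flag whether the inferred concentration is within tolerance of target."""
    return abs(concentration(absorbance) - target) <= tol

print(in_spec(12.0))   # c = 0.010 mol/L, on target → True
print(in_spec(13.2))   # c = 0.011 mol/L, drifted   → False
```

In a real line, the model would be calibrated against reference samples and the out-of-spec signal would trigger an alarm or divert the material, but the core logic of measure, infer, compare is the same.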
Conclusion
As we can see, a lot has changed in the history of drugs and drug manufacture. In stages we learned to:
Consume willow bark for its salicin content that helped with pain and inflammation
Extract the salicin from the willow bark
Chemically synthesize the salicylic acid that the body converts salicin into
Efficiently modify a molecule (salicylic acid) with an acetyl group to ease side effects
Automate nearly the entire process to supply the world with cheap aspirin
We also saw in this case study the emergence of continuous manufacturing principles in small-molecule drug manufacture, which emphasize automation and continual flow through the process. This makes the process much more efficient and significantly increases yield. In a healthcare system dominated by high prescription drug prices, it becomes ever more important to increase efficiency and supply to help drive costs down for everyone. In the next primer, I will turn my attention to biologics and dive into another case study to demonstrate key principles around their development.