What Is Biological Computing?

April 28, 2023 - Lou Farrell

Revolutionized is reader-supported. When you buy through links on our site, we may earn an affiliate commission. Learn more here.

How many computers or computer-adjacent products do you use daily? Most people use a desktop computer, laptop, cellphone or Kindle. These are just a few possible answers, especially if you include smart appliances and the Internet of Things.

When most people hear the word computer, they think of an amalgamation of copper and silicon, ranging in size from desktop towers to smartphones in their pockets. Modern computing is inherently synthetic, but researchers are working to change this stereotype. Here’s more on biological computing and how it could change the future of technology as we know it.

What Is Biological Computing?

For decades, neuroscience has fed into the metaphor that the human brain is nothing more than a biological computer. Modern research shows that isn’t the case, but the descriptor has stuck. What is biological computing if we’re not talking about the human brain?

Biological computing is an interdisciplinary science that strives to use organisms to complete computational tasks. Researchers are working on using DNA or proteins within cells to carry out basic calculations. For example, biological computer chips could eventually be implanted to monitor everything from blood sugar to mental health, adjusting the body’s chemical releases and activity from a simple input. Because of that practicality, the field is likely to have its biggest impact on the health sector.

Biological computing, or biocomputing, can occur in a lab setting or digitally. Depending on the applications, the microscopic scale of these computers could also make them a branch of nanotechnology or nanobiotechnology. 

How Biocomputing Works

Instead of sending information down wires as ones and zeros, the way a binary computer does, biological computing relies on chemical inputs. The computer must register its surroundings and its objective to create an output, and humans can change these inputs depending on the biocomputer’s application. As the system works toward the solution to a problem, it produces or degrades molecules that represent the result.
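To make the produce-or-degrade idea concrete, here is a toy model of a biochemical AND gate. This is purely illustrative: real biocomputers use engineered genetic circuits, and the inducer names and threshold here are hypothetical, not drawn from any specific experiment.

```python
# Toy model of a biochemical logic gate. Inputs are chemical
# concentrations; the "output" is whether the cell produces or
# degrades a reporter molecule. Illustrative only -- real systems
# implement this with engineered gene circuits, not Python.

def molecular_and_gate(inducer_a: float, inducer_b: float,
                       threshold: float = 0.5) -> str:
    """Both chemical inputs must exceed a threshold for expression."""
    if inducer_a > threshold and inducer_b > threshold:
        return "produce reporter molecule"
    return "degrade reporter molecule"

print(molecular_and_gate(0.9, 0.8))  # both signals present -> produce
print(molecular_and_gate(0.9, 0.1))  # one signal missing -> degrade
```

Changing which inducers the gate responds to is the biological analogue of reprogramming the computer for a new application.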

Biotechnology would be responsible for changing the directives of proteins. Imagine if a biological computer chip told amino acids to craft the exact protein people with particular ailments needed. Other scientists are exploring different uses for these organic functions.

In 2017, scientists “rebranded” E. coli bacteria to make it store movies and images. CRISPR technology is the driving force behind these advancements and a technology of growing interest in the biocomputing world. Other researchers have performed similar experiments, concluding that there’s a lot of storage space in a microscopic bacterium’s DNA. Some question the value of storing videos or books in bacteria because nobody has posed a practical use for it yet.

These biological computers are rudimentary at best, offering capabilities closer to the earliest electronic computers than to even a modern mobile phone. What makes them such an exciting advancement is that they are entirely biological.

Computational Biology vs. Biological Computing

Alan Turing is considered one of the fathers of modern computing. Beyond his World War II work building machines that cracked Axis codes and helped the Allies achieve victory in many engagements, Turing was also one of the first to use computers to paint a clearer picture of the biological world.

Turing developed a mathematical model to study morphogenesis, the biological process that causes cells to take a specific shape during embryonic development. In 1952, he published “The Chemical Basis of Morphogenesis,” a paper that is still cited today. Turing also laid the groundwork for modern artificial intelligence by arguing that machines could be designed to learn and function like the human brain. He didn’t see it then, but he had created the foundation for both biological computing and computational biology.

They sound similar, but the two fields are distinctly different. Turing’s study of morphogenesis is an example of one of the earliest applications of computational biology — using computers to understand complex biological events. While distantly related to biological computing, it relies on traditional computer hardware rather than biological processors. 

Benefits of Biological Computers

If they lack the processing power of current computing systems, why are scientists so fixated on creating functional biological computers?

Proliferation is one of the most significant benefits of biological computing over traditional computing. There’s no need to source materials for printed circuit boards or processors. Instead, once a cell is programmed, it’s much more cost-effective to grow billions of clones carrying out the same task. In a world on the cusp of Web3 and maximizing automation, this could push AI and machine learning algorithms to the next level. If biology keeps those costs manageable, industrial adoption of these technologies would likely increase.

Organic cells also have markers that encourage them to create functional systems, reducing the work biological programmers must put into their projects. That could ease talent acquisition and skills gaps in related sectors: training is simplified by a project’s inherent biological advantages, and fewer formal requirements are necessary for professionals to advance in the field.

Biological computers may also be easier to maintain. Think of the human body. More than 1 million cells die every second, even in healthy bodies. These deaths aren’t a problem because your body always makes new cells. Imagine a computer that never needs new parts because it constantly regrows the cells necessary to keep it running. This could be a game-changer on multiple fronts for complex structures like data centers. Their intricacies require constant, expensive maintenance, and their intense energy consumption damages the environment. Biological computing could amplify a data center’s usefulness while easing its most negative side effects.

Challenges of Biological Computing

One of the biggest challenges of biological computing is getting the two fields — computer science and biotechnology — to mesh. Instead of reverse-engineering what Mother Nature has already built for us, this field aims to move things forward, which is never easy. 

Applying these technologies in anything other than a laboratory setting is challenging. Biology can be fickle, and anything from environmental conditions to the nutrition in the system could cause a cascade failure that could shut down the entire computer network. 

Plus, the amount of DNA required to execute complex problems with vast solution spaces raises the issue of scaling. The most famous biocomputing example happened in 1994, when Leonard Adleman used biological computing to solve the Hamiltonian path problem for seven cities — and it worked with astounding energy efficiency compared to a supercomputer. However, as the number of cities increases, the amount of DNA necessary grows exponentially, quickly moving the experiment outside the realm of possibility.
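For readers unfamiliar with the problem Adleman solved, here is a brute-force sketch of the directed Hamiltonian path problem with seven cities. The city graph below is hypothetical, not Adleman’s actual map; his DNA strands explored every candidate path in parallel, while this code checks permutations one at a time, which hints at why DNA parallelism looked so attractive.

```python
# Brute-force Hamiltonian path: find an ordering that visits every
# city exactly once, following only the allowed directed edges.
# Adleman's 1994 experiment solved an instance like this with DNA.

from itertools import permutations

def hamiltonian_path(cities, edges, start, end):
    """Return a path visiting every city exactly once, or None."""
    for order in permutations(cities):
        if order[0] != start or order[-1] != end:
            continue
        # Every consecutive pair must be a legal directed edge.
        if all((a, b) in edges for a, b in zip(order, order[1:])):
            return order
    return None

cities = range(7)  # seven cities, as in Adleman's experiment
edges = {(i, i + 1) for i in range(6)} | {(0, 2), (3, 5)}  # hypothetical map
print(hamiltonian_path(cities, edges, start=0, end=6))
# (0, 1, 2, 3, 4, 5, 6)
```

With n cities there are n! orderings to check — about 5,000 for seven cities but more than 10^18 for twenty — which is the same exponential wall the DNA approach hits in the volume of molecules required.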

Biological computer systems also don’t have anything near the processing power of modern computers. It’d be like trying to run Twitter or TikTok on the machine Alan Turing used to crack the Axis code in World War II. It’s a neat idea but lacks practical applications in its current form.

Applications for Biological Computing

Who needs nanobots or other nanotechnology when you can program a cell to do precisely what you need it to? This kind of biocomputing will be vital in medicine for creating targeted treatments. Taking a sample of a malignant mass would enable doctors to program cells to target and treat just that mass without damaging any of the surrounding tissue. Cells could also be programmed to identify specific biomarkers that indicate the presence of a disease or genetic condition. That could prove useful for detecting the genetic markers of recessive disorders that haven’t yet manifested or may never manifest, giving potential new parents a heads-up.

For the foreseeable future, most of the applications for biological computing will be in medicine and research. Station B is Microsoft’s project focused on biological computing. It works with Princeton University and two biotechnology companies — Oxford BioMedica and Synthace — in hopes of using biological computing to reduce the cost of gene therapies and related products. This application could make these treatments more affordable and accessible for those who need them most.

In the study above, which took advantage of CRISPR and E. coli to store data, everyone from content creators to cybersecurity analysts could come up with inventive ways to maximize that dense storage space. Could companies hide secure data in a petri dish, away from the hands of cybercriminals? Could environmentally conscious content creators store countless terabytes of video footage within a single bacterial culture instead of stacking up external hard drives?

Shaping the Future of Computing

We know that modern computers are getting smaller, but we don’t think anyone anticipated them going microscopic — at least not this soon. Biological computing is still in its infancy. It will be a long time before you can opt for a biological computer instead of one made of copper and silicon. 

The potential applications for biological computing are nearly limitless. For the moment, we’ll likely see them focused in laboratory settings and medicine. Still, as this technology continues to evolve, we could potentially see it shape the future of computing as we know it. 

Editor’s Note: This article was originally published on February 7, 2023 and was updated April 28, 2023 to provide readers with more updated information.
