Information Theory and Biology

Claude Shannon, April 1953

I explored how information theory could be applied to understand biological systems and genetics, laying groundwork for modern bioinformatics.

Genetic Information as a Channel

DNA can be viewed as an information channel. The storage and transmission of genetic information follow principles similar to those of communication systems:

Key Parallels

  1. Redundancy: Biological systems use error-correcting codes. Redundant genetic sequences provide protection against mutations.

  2. Efficiency: Genetic information is highly compressed. The genome uses efficient encoding schemes to pack vast amounts of information.

  3. Channel Capacity: There are limits on mutation rates and genetic diversity, analogous to channel capacity limits.
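The redundancy parallel can be made concrete with a small sketch: compute the Shannon entropy of a DNA string and compare it to the 2-bit-per-symbol maximum of a four-letter alphabet. The gap between the two is the sequence's redundancy. The sequence below is a toy example, not real genomic data.

```python
from collections import Counter
from math import log2

def entropy(seq: str) -> float:
    """Shannon entropy of a sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

dna = "ATGCGATACGCTTAAGGCTATATATACGCG"  # toy sequence for illustration
h = entropy(dna)
redundancy = 1 - h / 2  # 2 bits/symbol is the maximum for {A, C, G, T}
```

A perfectly uniform sequence has entropy 2 bits per symbol and zero redundancy; any bias toward particular bases or motifs lowers the entropy, and that slack is what error-correcting structure can exploit.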

The Mathematical Framework

For a genetic channel with:

  • Input: Parent DNA sequences
  • Noise: Mutations, copying errors
  • Output: Offspring DNA

We can analyze capacity and error rates using information theory.
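As a minimal model of this channel, suppose each base is copied incorrectly with probability p and, when a mutation occurs, it lands uniformly on one of the three other bases. This is a 4-ary symmetric channel (an assumption of this sketch, not a claim from the paper), and its capacity has a closed form:

```python
from math import log2

def symmetric_channel_capacity(p: float, q: int = 4) -> float:
    """Capacity in bits/symbol of a q-ary symmetric channel: each symbol
    is corrupted with probability p, uniformly into one of the q-1 others."""
    if p == 0:
        return log2(q)  # noiseless: full log2(q) bits per symbol
    return log2(q) + (1 - p) * log2(1 - p) + p * log2(p / (q - 1))

# A low mutation rate costs little capacity; at p = (q-1)/q the output
# is independent of the input and capacity drops to zero.
```

For DNA (q = 4), the noiseless capacity is 2 bits per base, and capacity falls to zero when p = 3/4, the point at which offspring sequences carry no information about their parents.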

Neural Systems

The nervous system can be analyzed as an information processing network:

  • Information rate: How many bits per second the brain can process
  • Channel capacity: Limits on sensory input (eye, ear, etc.)
  • Memory as storage: Information storage in neural networks
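For an analog sensory channel, the information rate can be bounded with the Shannon–Hartley formula, C = B log2(1 + SNR). The bandwidth and signal-to-noise figures below are hypothetical placeholders chosen only to show the calculation, not measured properties of any sensory organ.

```python
from math import log2

def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity in bits per second for an analog channel
    of the given bandwidth and linear signal-to-noise ratio."""
    return bandwidth_hz * log2(1 + snr_linear)

# Hypothetical sensory channel: 3 kHz bandwidth, 30 dB SNR (linear 1000x)
c = channel_capacity_bps(3000, 1000)  # roughly 30,000 bits per second
```

Whatever the true numbers for a given sense, the formula shows the trade-off this section describes: capacity grows linearly with bandwidth but only logarithmically with signal quality.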

This work influenced early research into neural networks and computational neuroscience.

Legacy

This paper helped establish the field of bioinformatics and showed that information theory provides a universal language for understanding both living and non-living systems.


The unity of information across all biological systems continues to reveal new insights into the nature of life itself.
