How to apply Shannon's information theory to biology.

Cells, from bacteria to human cells, constantly take up, store, retrieve and communicate information, and make decisions based on it. How they realise all this computation using very unreliable components is still largely an open question. Instead of transistors they have to employ proteins, but proteins are constantly degraded and re-built, making their copy numbers fluctuate. If cellular signalling is impaired, severe diseases such as cancer or epilepsy can result. Because cellular communication is so pervasive and essential, researchers have started to look into this information flow in biological systems in more detail. My research group at the BioQuant centre, Heidelberg University, is also active in this area, an area I would call Information Biology: the study of how biological systems deal with information.

I will show you how to apply Shannon's information theory to biological systems. For this we need three ingredients: dynamic models of biological pathways, stochastic simulation algorithms (which take into account the intrinsic fluctuations in molecular numbers), and, of course, Shannon's theory of information. I will give brief and user-friendly introductions to all three. After that I am going to talk about a number of use cases, such as:

- How much memory does a bacterium have? And how long can it remember things?
- How many bits per second can a liver cell process via its calcium signalling pathway?
- How must signalling pathways be constructed, structurally and dynamically, for certain stimuli to be decoded?
- and others…

I will also give links to (open-source) software being developed in my group, which you can use to simulate and play around with biochemical pathways, estimate information flows and do information biology yourself.

FYI: The research I am talking about here is part of a research area called Computational Systems Biology.
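To give a flavour of the second ingredient mentioned above, here is a minimal sketch of a stochastic simulation in the style of Gillespie's direct method, applied to a toy birth-death process for a single protein. The rate constants, species and function names are illustrative assumptions for this sketch, not models from the talk:

```python
import random

def gillespie_birth_death(k_prod=10.0, k_deg=0.1, x0=0, t_end=100.0):
    """Gillespie direct method for a toy birth-death process:
         0 --k_prod--> X   (production)
         X --k_deg---> 0   (degradation)
    Returns the jump times and protein copy numbers."""
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1 = k_prod       # propensity of production
        a2 = k_deg * x    # propensity of degradation
        a0 = a1 + a2
        if a0 == 0:
            break
        t += random.expovariate(a0)    # waiting time to the next reaction
        if random.random() * a0 < a1:  # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        counts.append(x)
    return times, counts

times, counts = gillespie_birth_death()
print(f"final copy number: {counts[-1]}")
```

Running this repeatedly shows the intrinsic fluctuations in molecular numbers that deterministic rate equations average away.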
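And as a flavour of the third ingredient, a minimal sketch of Shannon entropy and mutual information estimated from discrete samples, with mutual information measuring how many bits a response carries about a stimulus. The toy stimulus/response data are illustrative assumptions:

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), estimated from samples."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y): the bits Y carries about X."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy example: a noisy binary channel from stimulus to cellular response.
stimulus = [0, 0, 1, 1, 0, 1, 0, 1]
response = [0, 0, 1, 0, 0, 1, 1, 1]  # two of eight symbols are flipped
print(f"I(stimulus; response) = {mutual_information(stimulus, response):.3f} bits")
```

Estimating such quantities from simulated pathway trajectories, rather than toy lists, is the kind of analysis the talk's use cases build on.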
| Name | Role |
| --- | --- |
| Jürgen Pahle | Director |