The Historical Evolution of Modern Systems Biology
1 Introduction
Biological phenomena are among the most complex known. Even the smallest cell or the smallest related set of biochemical reactions consists of many diverse elements that engage in numerous, complicated, and often incompletely understood interactions with one another. Because of this complexity, it is vital that biology adopt an approach that integrates these diverse elements and complex interactions into a unified framework. However, past attempts at such integration have achieved only limited success, due to the complexity of the natural world and to computational limitations.
Fortunately, some of the problems faced by modern systems biologists are not without precedent. The idea of viewing a collection of elements and their interactions with one another as an integrated whole, or a system, is not new. From Newton's dynamical systems to modern systems theory, where elements are expressed by nonlinear differential equations and stochastic methods, systems ideas have been applied to many phenomena and have evolved with each application [Figure 1]. An examination of past applications of systems ideas, the specific issues that spurred each development, and the conditions that led to the success or failure of each application will provide a roadmap of sorts for modern systems biologists and will hopefully offer some insight into the unique problems faced by systems biology.
Figure 1: Knowledge Flow During the Evolution of Systems Biology
2 The Foundations of Systems Ideas
The fundamental components of modern systems theory are four millennia old. Chinese medicine applied systems ideas to physiology as early as 2500 B.C. [0.5], and the idea of treating the body as a unified whole was also popular with Western physiologists of the ancient world. The earliest recorded attempts at formulating a theory of systems in the Western world date to Aristotle and other Greek philosophers around 300 B.C. Aristotle proposed that nature was made up of primary, indivisible constituents and that these constituents possessed both intrinsic properties and extrinsic properties, or interactions [1]. Aristotle's idea of teleological causation, which postulated that organisms have natural goals and that evolution can only be understood in terms of these goals, is still one of the fundamental assumptions of biology [2].
Mathematical rigor was not introduced into systems thinking until Newton's work on dynamical systems, which placed all system behavior within a set of cause-and-effect rules [3]. Originally developed for astronomy, the idea of dynamical systems migrated to engineering, thermodynamics, ecology, and genetics, and evolved with each application to form some of the fundamental components of modern systems theory. Computational difficulties stemming from both system size and system complexity first appeared in these applications, and this prompted the development of mathematical techniques that are still in use in the present day.
Figure 2: A Timeline of Key Events During the Classical Era
2.1 Newtonian Physics and Engineering
Newtonian physics and engineering provided the most fertile ground for systems thinking. The advantageous conditions provided by early Newtonian physics and engineering stemmed from the popular scientific and philosophical paradigm of the Renaissance and Industrial Revolution popularized by Descartes and other philosophers, which held that the universe operated in a mechanistic fashion and all behavior was a result of cause-and-effect interactions [4]. Given this viewpoint, it is not surprising that Newtonian physics and engineering would be the first fields to provide a general, if rudimentary, mathematical study of systems.
Physics and engineering of the time possessed several advantages that contributed to the successful application of systems ideas. Because precise instruments were not available, the only systems that could be effectively studied were those composed of obviously discrete components. In addition, most systems studied in these fields were composed of relatively few elements. For those systems composed of a larger number of elements, the resolution of the available instruments was such that many interactions of small magnitude could be ignored. The end result of these three advantages was simple models of the systems in question, which enabled the systems to be studied as a whole with ease and with accuracy relative to observation.
Engineering possessed additional advantages that natural systems did not. While philosophers grappled with the nature of God and the purpose of the universe, engineers designed their systems for a known, predefined purpose of their own choosing. This allowed them to fully specify every element, in some cases to define elements arbitrarily, and to constrain the number of interactions or remove unwanted ones to ensure a level of simplicity; it also provided a framework for understanding the evolution of the state of the system. These advantages account for the continued success of industrial design techniques [5] and control theory [6] today.
2.2 Thermodynamics
Thermodynamics is particularly interesting to systems scientists for two major reasons. First, it was the field in which complexity was first recognized as a serious problem for scientists. Second, the concepts of a system were refined and solidified there. This refinement was necessitated by the problems faced by early thermodynamicists, which were unique at the time. Of particular interest was the dual approach to thermodynamics, which used both top-down and bottom-up techniques to define thermodynamic systems. Although mathematical systems ideas had appeared in Newtonian mechanics, thermodynamics can be considered the true birthplace of the mathematical study of systems.
Complexity first became a problem as scientists increased the resolution of their instruments and probed more deeply into the natural world. It was observed that certain systems did not obey the conservation of energy predicted by the simple Newtonian models. Early efforts to explain this discrepancy made use of what are now known as top-down techniques and attempted to develop a theoretical model from macro-scale observations. While attempts by Carnot, Joule, and others to place the discrepancy in a theoretical framework by adding interactions via materialistic constructs like caloric theory [7] yielded little, these efforts did produce a successful axiomatic theory of thermodynamics.
The framework provided by the axiomatic theory defined the problem well enough to allow Clausius to redefine the elemental component of the system. Clausius proposed that heat was the product of the movement of a large number of molecules [7]. This theory implied that nature was significantly more complex than previously believed, and that realization led directly to the heart of the problem: the number of molecules in even the smallest system was far too large to deal with given the computational aids of the period. Maxwell and Boltzmann dealt with this problem by proposing that the molecules and interactions were highly similar and that statistical methods were therefore applicable [7]. By grouping the molecules and interactions by property into a relatively small number of categories, and by using the framework provided by statistics and some synthetic concepts, such as entropy, to manipulate and measure these categories, thermodynamicists were able to create relatively simple models from which reliable predictions could be computed.
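The statistical reformulation can be summarized in a single relation. In later notation, Boltzmann's entropy formula connects the macroscopic quantity S to the number W of microscopic arrangements consistent with an observed macrostate:

    S = k_B \ln W

An intractable census of individual molecules is thereby replaced by a count over categories of equivalent states.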
Both the axiomatic theories and the statistical theories proved valuable to thermodynamics, and both are still in use today. Gibbs was one of the primary researchers of the period and was active in both the axiomatic work and the molecular theory; it was he who finally unified the two in 1902 [8]. These two theories gave systems theory much of its vocabulary, including the concept of system boundaries, and sparked later research. The importance of thermodynamics in the evolution of systems theory cannot be overstated.
2.3 Ecology and Genetics
Early research in ecology took place during roughly the same period as early research in thermodynamics. While thermodynamicists probed the intricacies of the atomic world, social scientists and ecologists faced similar intricacies when studying population dynamics. However, the problems faced by ecologists were not identical to those found in thermodynamics, and the differences, although small, were significant. The habitats that ecologists studied contained a large number of diverse animals. In addition, the full range of interactions among the fauna and flora of a habitat was seldom known. These two obstacles translated into systems with a large number of components and a large number of interactions, many of them unknown.
The primary difference between ecology and thermodynamics is that animals are complex entities that are assumed to be somewhat consciously goal-oriented and adaptable. Because of this, animals are nowhere near as constrained in the number of interactions with each other and their habitat as are simple molecules of a gas. This leads to a different and potentially more troubling type of system complexity. The numbers of animals in most habitats were not so vast as the number of molecules in any thermodynamic system. However, animals tend to move and hide where gas molecules do not, which makes counting animals just as problematic as counting molecules. Even though it is often assumed that animals of the same species have certain identical characteristics and goals, the study of nature has provided much evidence that animals will often achieve the same goals in diverse ways. As a result of these differences, ecological systems are complex systems not only because of the number of elements, but because of the number of possible interactions between elements. For early ecologists, these problems necessitated a slightly different approach than those of thermodynamics.
The statisticians Fisher [9] and Pearson [10] heavily promoted the use in ecology of statistical methods already successfully applied to thermodynamics. However, because the problems differed, the applications of statistics differed as well. Regression, chi-square testing, and concepts like the standard deviation were developed to mask interaction complexity [9], [10]. Sampling techniques came into widespread use in ecology as a way to measure the number of elements in a system. Verhulst's work on population equations [11] and the work of Volterra and Lotka [12] on predator-prey dynamics made heavy use of both types of techniques.
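These classical models are compact enough to state directly. The following minimal sketch, with arbitrary illustrative parameter values rather than anything drawn from the original papers, integrates Verhulst's logistic equation and a Lotka-Volterra predator-prey system with a simple forward-Euler step:

    # Sketch: logistic growth (Verhulst) and predator-prey (Lotka-Volterra)
    # dynamics, integrated with a forward-Euler step. All parameter values
    # are arbitrary and chosen only for illustration.

    def logistic_step(n, r=0.5, k=1000.0, dt=0.01):
        # dN/dt = r*N*(1 - N/K): growth slows as N nears carrying capacity K
        return n + dt * r * n * (1.0 - n / k)

    def lotka_volterra_step(prey, pred, a=1.0, b=0.1, c=1.5, d=0.075, dt=0.001):
        # dx/dt =  a*x - b*x*y   (prey reproduce, are eaten)
        # dy/dt = -c*y + d*x*y   (predators starve, grow by eating prey)
        new_prey = prey + dt * (a * prey - b * prey * pred)
        new_pred = pred + dt * (-c * pred + d * prey * pred)
        return new_prey, new_pred

    n, prey, pred = 10.0, 10.0, 5.0
    for _ in range(100000):
        n = logistic_step(n)
        prey, pred = lotka_volterra_step(prey, pred)
    print(n, prey, pred)  # logistic N approaches K; prey and predators cycle

Even this toy version shows the character of the early models: the population variables stand for averaged counts, with all individual variation masked by the statistical treatment.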
Although genetics and evolutionary theory did not immediately introduce any systems problems, they are worthy of mention, both because they evolved directly from attempts to apply the concepts of teleological causation to ecological systems and because of the tremendous effect they have had on modern biology. Beginning with von Goethe's morphology [12.5], genetics and evolution provided a new framework within which to place ecology and, with Mendel's work on inheritance [13], pioneered the use of stochastic modeling in biology. Genetics and evolution would prove very important to the biologists of the future.
3 The Quantum Revolution
The theory of quantum mechanics evolved directly out of efforts by Planck and others to solve the blackbody radiation problem of thermodynamics around the turn of the 20th century [14]. Quantum theory heavily influenced the evolution of systems theory, although the primary contribution of the quantum revolution to systems thinking was philosophical. As thermodynamicists had previously questioned the possibility of a simple universe, the new physicists questioned the possibility of a certain universe. The success of quantum mechanics brought a new knowledge-oriented scientific paradigm into the mainstream. It was from this paradigm that many of the ideas in formal systems theory were born.
The ideas of nondeterminism, abstraction, and relativity, introduced into the scientific mainstream by physicists, exerted a more significant influence on the philosophy of science than on physics itself. The non-intuitive models of a quantum system and relativity, as well as the resulting debates over the meaning of the results given by these models, brought into question the nature of human knowledge and introduced a new knowledge-oriented, relativistic philosophy of science. This new philosophical approach towards science heavily influenced the evolution of computational theory and the development of the computer and was a major driving force behind the later development of General Systems Theory and cybernetics.
Quantum theory is a difficult case to analyze in this context because the debate over the theory's physical significance is ongoing. Although the Copenhagen interpretation is favored by many, others argue for different interpretations. Since the debate is not fully resolved, it is difficult to determine the ultimate impact of quantum theory on science and on the philosophy of science. It is still too early to set this revolution properly into historical perspective, but it seems that a significant shift of some sort has occurred in the way humanity views science as a result of these theories.
Figure 3: A Timeline of Key Events During the Quantum Era
3.1 Modern Physics
Modern physics is a difficult case in that the debates surrounding the interpretations of its theories are not yet completely resolved. Thus, it cannot be said with certainty what quantum theory and relativity's ultimate contributions to systems theory will be, but the debates surrounding the theories have had immeasurable impact on both the philosophy of science and the development of systems theory. Not only did these theories extend existing system concepts via nondeterministic methods, but the subsequent debates helped introduce those methods into the mainstream. The debates over the interpretation of the quantum model and relativity, as well as over how far these new theories could be extended, called into question the nature of scientific knowledge and provided the modern philosophical context in which systems theory and systems biology currently reside.
By 1900, physicists had discovered subatomic particles and had overturned the idea that the atom was the fundamental component of the universe. Early attempts by Bohr, Planck, and others to reconcile the orbits of electrons with Newtonian mechanics met with limited success and brought up troubling inconsistencies [14]. To resolve these inconsistencies, two competing but equivalent models were proposed in the 1920s. The wave equation model introduced by Schrödinger and the algebraic model introduced independently by Dirac and Heisenberg, while useful, were very different from anything previously imagined [15]. In these quantum models, observables existed in a state of superposition and were assigned probability functions instead of values. The exact state of an observable was unknown until the wave function, which encoded all possible states of the observables, was collapsed. The concept of replacing a classical observable value with a probability function was similar to the statistical manipulations found in thermodynamics, but the way in which nondeterminism was applied to a single particle was unprecedented in the world of physics. The potential implications of the quantum model started a series of debates that is still unresolved today.
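In modern notation, Born's 1926 rule makes this replacement precise: for a system prepared in state \psi, the probability of observing the outcome associated with state \phi is

    P(\phi) = |\langle \phi \mid \psi \rangle|^2

so that the theory assigns a probability, rather than a definite value, to each possible result of a measurement.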
Meanwhile, in addition to the new quantum theories under development, Einstein proposed his theory of general relativity in 1916 [16]. In the relativistic model, time is formalized as a fourth dimension and the philosophical concept of frames of reference is applied to physical phenomena. The concepts of observers and multiple frames of reference were common in philosophy, but had not been applied formally in a scientific theory before that time. The concept of relativity provided a twist to the debates raging over quantum theory, being used by both Bohr and Einstein to support their interpretations of the quantum model. Later debates also arose over the extensibility of the relativistic model; although a large amount of data supported the theory, concepts such as Gödel loops [16.5] were as difficult to accept as superposition.
The debate over the interpretation of these models, in particular quantum theory, has been ongoing for the better part of a century. Bohr and his disciples championed the Copenhagen interpretation, which proposed that the universe was actually nondeterministic. Einstein and Schrödinger held that their interpretation, which proposed that an unknown deterministic process lay beneath quantum theory, was the correct one. Feynman, among others, claimed it was an unimportant detail. Regardless of which argument is correct, no one can deny the utility of quantum mechanics and relativity. Likewise, no one can deny the immense impact the debate had on the philosophy of science. Bohr's argument brought up the unpleasant idea that humanity might never possess complete knowledge of the universe, destroying the mechanistic philosophy of the preceding decades. Meanwhile, Einstein's arguments focused attention on the philosophy of science and brought the concept of relativism into the scientific mainstream. The nature of the Bohr-Einstein debate served to focus attention on the proper role of philosophical relativism and of nondeterminism in science. Although the Copenhagen interpretation is currently favored among most physicists, some would still disagree.
Likewise, the full impact of the quantum revolution on systems theory cannot be measured. Nevertheless, the impact to date has been immense. These theories extended system concepts already in use in other disciplines, and the nature of the debates surrounding them served to focus attention on the underlying assumptions behind such concepts, leading to further refinement. The philosophical ramifications of quantum theory and relativity, as well as the techniques developed within these models, heavily influenced the later development of systems theory and systems biology.
3.2 The Vienna Circle
The Vienna circle, a group of mathematicians and philosophers, was formed in the early 1920s under the guidance of Moritz Schlick. In an attempt to bring to philosophy the rigor of scientific work, Schlick, Carnap, Neurath, and others developed a doctrine of logical positivism based on Russell and Whitehead's mathematical logic [17.9]. Logical positivism is a reductionist approach that strives to assess statements by determining whether they are meaningful, that is, whether they can be verified logically or by sensory observation [18]. Although criticized harshly by Popper [18.5], logical positivism helped to formalize the conception of a scientific theory. Because it dealt with the nature of philosophical knowledge, logical positivism was naturally applied to the nature of scientific knowledge.
The Vienna circle drew much of its inspiration from the new ideas in logic and philosophy of the time and from the quantum debates. Through the Vienna circle, these ideas and the philosophies behind them were transmitted to the founders of formal systems theory, many of whom were products of or were familiar with the Vienna circle. The resulting agreement and disagreement with the philosophies advocated by the Vienna Circle played an instrumental role in shaping formal systems theory.
3.3 Ludwig von Bertalanffy and Organismic Biology
Von Bertalanffy is considered by many to be the father of modern systems theory [18.9] and should also be considered the father of modern systems biology. He was the first to argue that biological organisms should be viewed as an integrated whole, a result of his recognition that biological organisms were both complex and dynamic. His two earliest books on the subject, 1928's Modern Theories of Development [19] and 1932's Theoretical Biology [20], proposed that biological systems, which he termed “organismic systems”, could be modeled as self-regulating, open systems [20]. Von Bertalanffy further proposed that, as open thermodynamic systems, biological organisms were self-organizing. Such systems were characterized by their ability to interact with their environment and to gain new emergent properties through evolution [21].
Unfortunately, von Bertalanffy's proposals were far ahead of their time; there was then only a rudimentary understanding of the molecular basis of biology. While he attracted attention and generated a burst of effort in the area, his theories were also criticized as pseudo-scientific and were compared to vitalism. Without an experimental basis to support and test his theories, there was little defense against these charges and little progress in developing the theories themselves. As a result, von Bertalanffy's ideas generated lasting interest in only a few, most notably the physicist Schrödinger [21.5].
Despite the lack of progress in developing his biological theories, von Bertalanffy went on to found General Systems Theory, and his ideas profoundly influence modern systems biology. Von Bertalanffy was a product of philosophy in general and of the Vienna circle specifically. He was also an ecologist by training and was knowledgeable about thermodynamics. His work reflects all of these influences: much of his terminology was borrowed from thermodynamics, his emphasis on interactions arose out of his background in ecology, and his writings reference philosophy, notably Aristotle and Hartmann. Even ideas from the Vienna circle appear in organismic biology and, later, in General Systems Theory. Von Bertalanffy and the rest of the systems movement were ardently opposed to reductionism as proposed by logical positivism, yet von Bertalanffy took an interest in constructing a formal language to represent scientific theory and eventually created general systems theory as that proposed language. He was a key influence behind the philosophy that drove the evolution of cybernetics, and some of his biological theories based on thermodynamic systems are the focus of research in modern systems biology [21.6].
4 Cybernetics, General Systems Theory, and Computational Science
The new knowledge-oriented philosophy introduced by the quantum revolution led to the investigation of the nature of human knowledge, which increased the study of logic, computation, and the process by which scientific models were created. These investigations generated a collection of general ideas about the structure of scientific models and the representation of knowledge. They also led to metaphors comparing human processes with mechanical processes, which were rapidly expanded beyond the original computational metaphor introduced by logicians.
The study of logic and computation inspired the modern computer via the Turing machine and the associated analogies to the human brain, while the expanded set of man-machine analogies came to be named cybernetics and influenced a number of fields before fracturing and having a number of fragments absorbed by General Systems Theory (GST) [21]. GST, which promised to provide a unifying language for the construction of scientific models, did not gain much favor in the scientific community, although it did benefit from the later influx of cyberneticians. Unfortunately, cybernetics, GST, and early computational biology were beset by the same challenges that afflicted von Bertalanffy’s original attempt at systems biology. The difficulties of dealing with large amounts of data and computing large and complex problems were not realized until attempts were made to actually solve the problems. Although the molecular basis of biology was better understood during these attempts than during von Bertalanffy’s first attempt at an integrative theory of biology, the techniques required for data collection and the mechanisms needed for data integration and computation on an appropriate scale were still not available, nor were the difficulties of complexity completely understood.
Despite the challenges, cybernetics and GST were the first attempts at developing a systematic theory of biology, and they formalized the concepts of a system. Both fields have left a rich legacy. Artificial intelligence, the behavioral and cognitive schools of psychology, Simon's work on decision making in economics, systems analysis, neurobiology, and physiology all bear the imprint of these fields, although the connection is often not recognized. Cybernetics and GST have heavily influenced modern science and modern systems biology, despite their historical difficulties.
Figure 4: A Timeline of Key Events During the Systems Era
4.1 Cybernetics
Cybernetics, the study of self-regulating and adaptive systems, can be viewed as the second attempt at formulating a systematic theory of biology. Although cybernetics was an interdisciplinary field from the beginning, its birth can be traced to the earliest study of computation. In 1943, the neurobiologists McCulloch and Pitts published an article that proposed a theory of how ideas arise from the activity of neurons in the human brain [22]. This article, along with an article by Rosenblueth, Wiener, and Bigelow [23] on the philosophical issues of teleology and purpose, marked the birth of cybernetics. From the limited analogy between the human brain and early computers grew a host of analogies between humans and machines drawn from a diverse interdisciplinary background. This collection of analogies evolved into a generalized study of self-regulating systems. Feedback and control theory were borrowed from engineering to describe self-regulating processes in biology, information theory was borrowed from electrical engineering and used to describe the communication process required for control, system ideas and terminology were imported from physics and thermodynamics, and many other ideas from fields as diverse as linguistics and biology also found a home in the discipline.
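The borrowed notion of negative feedback is easy to make concrete. The sketch below regulates a state variable toward a set point with a purely proportional controller; the "plant" (a leaky accumulator) and all gains are invented for illustration and drawn from no particular historical model:

    # Sketch: proportional negative feedback driving a state toward a set
    # point. The plant is a leaky accumulator; the gain and leak rate are
    # illustrative values only.

    def simulate_feedback(set_point=37.0, gain=0.8, leak=0.05, steps=200):
        state = 20.0
        for _ in range(steps):
            error = set_point - state        # measure the deviation
            control = gain * error           # respond in proportion to it
            state += control - leak * state  # plant integrates control, leaks
        return state

    print(simulate_feedback())  # settles just below the set point

As the output hints, purely proportional control leaves a small steady-state offset; analyzing behavior of exactly this kind is what control theory was developed to do.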
In Cybernetics: Control and Communication in the Animal and the Machine, published in 1948 [24], Wiener credits von Bertalanffy's organismic biology, information theory, and neurology as the foundations of cybernetics. Wiener's outlook on cybernetics, as a fundamentally nondeterministic field interested more in the interactions between things than in the things themselves, was shaped by his wartime work on directing anti-aircraft fire, which used stochastic modeling to predict the path of an airplane weaving randomly to avoid that fire [36]. Cybernetics, particularly Wiener's view of it, achieved significant popularity among biologists, including the neurobiologists McCulloch and Pitts, as well as Ashby, who wrote an excellent book covering most of the field from the perspective of a biologist [25].
Although Wiener, Ashby, von Foerster, and others argued for the need for stochastic mechanisms in cybernetics, many cyberneticians came from an engineering background and were more familiar with engineering problems and problem-solving techniques. As a result, cybernetics was infused with many deterministic techniques, making use of differential equations and mechanistic engineering concepts such as feedback loops and Shannon's information theory, which dealt with the effects of signal and noise on communication channels and was a product of electrical engineering [26]. This view of cybernetics, while useful for problems of a manageable size and defined purpose, had the side effect of fostering a 'culture of determinism' within the field. When more complex problems were encountered, this culture resisted the adoption of nondeterministic techniques and contributed to the fracturing of cybernetics. Since the deterministic methods proved sufficient for the needs of many cyberneticians, they returned to their own disciplines and abandoned what were considered, at the time, irrelevant problems. Cybernetics suffered the problem from which many interdisciplinary projects suffer, the transitory nature of the participants' interests, and fractured as researchers returned to their original disciplines.
4.2 General Systems Theory
General Systems Theory evolved out of von Bertalanffy's background in the Vienna circle around the same period that the popularity of cybernetics peaked. General systems theory was intended to be a formal language in which scientific models could be constructed, despite the charge by its critics that it was attempting to be a universal scientific theory itself. In fact, there were universal elements to the theory, and it was abused in that form, but von Bertalanffy never billed it as any more than it was. GST was primarily concerned with the properties and organization of systems in general, as opposed to a specific collection of analogies. Von Bertalanffy considered cybernetics merely a mechanistic form of general systems, and this view came to be shared by some of those involved with cybernetics as they encountered more complex problems that could not be solved by mechanistic means.
The principal characteristics of GST included the study of isomorphisms between systems that would lead to general theoretical laws about systems, an emphasis on the organization and function of systems, an emphasis on hierarchy within systems, and the study of emergent properties, which are behaviors that arise from the interactions among individual parts [21]. Cybernetics quickly began to parallel general systems theory, beginning with von Foerster's work in second-order cybernetics [27]. The urging of Wiener and Ashby for more use of nondeterministic methods and the search for motifs common to cybernetic systems also paralleled common themes in general systems theory. Upon the fracturing of the field of cybernetics, this parallel work was absorbed into GST. Despite the absorption of cybernetics, general systems theory was never affected by the culture of determinism present in early cybernetics, because most of those who migrated to the field were those who required nondeterministic techniques to solve their problems. After the absorption of cybernetics, work continued in general systems by the likes of Simon [27.5], Weinberg [28], and Klir [29].
The primary charge leveled at GST by critics is understandable in hindsight. GST operated at a more theoretical level, and many disciplines had acquired their deterministic metaphors from cybernetics by the time of the absorption. These fields found the emphasis on theory and the move away from a mechanistic view to be less than useful; certainly neither the behavioral school of psychology nor engineering control theory had any use for a non-mechanistic view at the time. The problems those two areas dealt with were generally not recognized as complex enough to require much in the way of nondeterminism. As a result, this drift into nondeterministic methods, along with GST's pronounced penchant for finding isomorphisms, was decried by many as a sign of a lack of relevance to real-world applications.
Russian cybernetics, or what would more appropriately be called Russian General Systems Theory, is a more interesting case than the Western applications of GST. In the Soviet Union, cybernetics was accepted as a unified language of science, in the role that general systems theory was proposed to fill, and most scientific research took place in this context [29.1]. Its effect on biological research was particularly interesting. Despite government repression of genetic research, Liapunov had proposed ideas bearing a striking resemblance to those in modern systems biology as early as 1960, including the need for high-throughput data collection [29.2], an information-theoretic study of the transmission of genetic information [29.3], and a hierarchy of biological control systems, with each system functioning as an element of a higher-level system [29.4]. The Russians were also more accepting of stochastic methods, perhaps because their countrymen Markov and Kolmogorov had pioneered many stochastic modeling techniques.
Unfortunately, GST failed to progress much further than cybernetics or organismic biology because the field was still grappling with the same problems of complexity that systems biology faces today. To compound the difficulty, these problems were more pronounced during GST's infancy than they are now: computing technology was much more primitive, large amounts of biological data were unavailable, and computational complexity was not yet fully understood. Unlike the earlier attempts at formulating an integrated theory of biology, however, GST has been able to adapt and absorb new concepts like fuzzy sets [34] and stochastic modeling. Despite the challenges it has faced, GST has persisted and exerts a direct and continuing influence on modern systems biology [34.2].
4.3 Computational Science and Early Computational Biology
Computational complexity has always accompanied attempts to compute scientific problems, even when the phenomenon was not immediately understood or recognized because of other limitations. The earliest attempts at computational biology were hindered not directly by complexity but by the available hardware of the time and by limited data on biochemical processes. The first work in computational biology was Tarski and J.H. Woodger's work on axiomatic biology [30], an attempt to represent biological knowledge as a logical system in which biological truths could be found. This early work paralleled the research of Turing, Church, and Gödel in the 1930s and was spurred by the general study of human knowledge and logic during that period. Upon the creation of the first electronic computers, physicists, chemists, and biologists alike quickly recognized the potential of the new machines. Physicists and chemists put these earliest computers to work on their problems with success, and several useful biochemical simulation packages and statistical analysis packages were also developed. Simulation software such as Biobell [31], which combined nonlinear differential equations with discrete-event simulation, was the norm for the next three decades. These programs neatly illustrate the immense problems early computational biology faced: the vast majority were designed for small-scale simulation of a sequence of reactions, and the few attempts at larger-scale simulation, like metabolic control theory and biochemical systems theory [32], differed only in degree.
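The flavor of this style of simulation is easy to convey. The sketch below, a modern illustration and not Biobell's actual approach or code, integrates mass-action rate equations for a two-step reaction chain A -> B -> C with a forward-Euler step; the rate constants are invented:

    # Sketch: ODE-style biochemical simulation. Mass-action kinetics for
    # the chain A -> B -> C, integrated with a forward-Euler step.
    # Rate constants are illustrative, not measured values.

    def simulate_chain(a0=1.0, k1=0.3, k2=0.1, dt=0.01, t_end=50.0):
        a, b, c = a0, 0.0, 0.0
        for _ in range(int(t_end / dt)):
            ra = k1 * a      # rate of A -> B
            rb = k2 * b      # rate of B -> C
            a += dt * (-ra)
            b += dt * (ra - rb)
            c += dt * rb
        return a, b, c

    print(simulate_chain())  # mass is conserved: a + b + c stays ~1.0

Scaling this approach up is exactly where the trouble began: each added species adds an equation, each added interaction adds coupling terms, and the system of nonlinear equations quickly outgrew both the hardware and the available kinetic data.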
Attempts at modeling biological systems on a large scale foundered for two major reasons. First, the hardware of the time was not capable of handling the complex nonlinear differential equations that were the hallmark of biological modeling. In addition, constructing these large systems of equations was hindered by incompletely specified interactions between the biochemical species. This imposed severe limitations on what computational scientists and biologists could do. Second, the very nature of some problems made them difficult to compute. This problem was not limited to biologists; it appeared throughout computational science and led to the eventual investigation of computational complexity.
The formal study of computational complexity began in the early 1960s [32.1] and dealt with the resource cost, in space and time, of solving classes of problems. However, it did not receive much attention from the computational science community at large until Cook's [32.2] and Karp's [32.3] work on NP problems in the 1970s. Complexity theory splits problems into hierarchical classes based on fundamental properties of the problem, such as its level of interaction, and shows that some problems are inherently more difficult to solve than others. It also links the resource requirement to the problem's input size. An understanding of complexity theory explains why biologists have such difficulty in dealing with large amounts of biological data and in constructing and computing biological models. Likewise, an understanding of the difficulty of computing some biological problems underscores the points made by complexity theory.
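The link between input size and resource requirement can be felt directly. As a small illustration, the sketch below exhaustively enumerates every subset of n elements, the kind of brute-force search a naive analysis of all possible interaction sets would require; the work grows as 2^n, so each added element doubles the running time:

    # Sketch: the cost of enumerating all subsets of n elements, as a
    # brute-force search over candidate interaction sets would require.
    # The count (and the work) is 2**n, doubling with each added element.

    from itertools import combinations

    def count_subsets(n):
        total = 0
        for size in range(n + 1):
            for _ in combinations(range(n), size):
                total += 1
        return total

    for n in (10, 20):              # try n = 30 and the doubling gets painful
        print(n, count_subsets(n))  # prints 1024, then 1048576

No foreseeable improvement in hardware outruns this kind of growth, which is why complexity theory shifted attention from faster machines to better problem formulations.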
5 Nondeterminism and Chaos
In the late 1960s and early 1970s, work in computational complexity demonstrated that some problems were either not computable by current techniques or were extremely difficult to compute. The ramifications of complexity theory ended the last remaining shreds of hope that the world could be viewed deterministically or with certainty. A host of techniques was adopted in an attempt to work around the difficult problems, including fuzzy logic, stochastic systems, and chaos theory. Systems biologists have adopted many of these techniques, although this process is not yet complete and no final conclusions can be drawn from their use. In many cases, under what circumstances and how these techniques might be applied to biology are not yet fully defined. Nevertheless, they have been applied with some success and are worthy of further investigation, so a short overview of their uses in modern systems biology is given here.
5.1 Fuzzy Computation
Fuzzy logic is simply a form of logical system in which variables can take on more than the standard two values. Fuzzy computation is the use of fuzzy logic to solve problems. Although multi-valued logics had been studied earlier, it was not until Zadeh's paper Fuzzy Sets, published in 1965 [33], that any serious interest was taken. Even then, fuzzy logic was largely ignored until researchers in artificial intelligence and general systems discovered it a few years later. Klir is one of the more important figures in fuzzy systems and has co-authored a book on the subject, Fuzzy Sets and Fuzzy Logic [34].
Long used in engineering and control theory, fuzzy computation has recently been used in the analysis of genetic data obtained from microarray experiments, where comparisons can be more qualitative than quantitative. Fuzzy computation has also been used to deliberately fuzz variable values in order to allow the use of a simpler model. There are commercially available clustering algorithms that make use of fuzzy logic, such as the FANNY algorithm [34.1]. In general, though, there have been few applications of fuzzy logic in biology, as there are problems involved in the use of fuzzy computation, not the least of which is the difficulty of defining the membership functions. Nevertheless, given the qualitative nature of some of the comparisons given by microarray technology, fuzzy logic does seem applicable in some cases and has been applied with some success.
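To make the idea concrete, the sketch below classifies a log expression ratio from a microarray-style comparison into qualitative categories using triangular membership functions; the category boundaries are invented for illustration, and choosing them well is precisely the difficulty noted above:

    # Sketch: triangular membership functions assigning a log expression
    # ratio partial membership in qualitative categories. The boundaries
    # are invented; defining them well is the hard part in practice.

    def triangular(x, left, peak, right):
        # Membership rises linearly to 1.0 at `peak`, then falls back to 0.
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    def classify(log_ratio):
        return {
            "down":      triangular(log_ratio, -3.0, -1.5, 0.0),
            "unchanged": triangular(log_ratio, -1.0,  0.0, 1.0),
            "up":        triangular(log_ratio,  0.0,  1.5, 3.0),
        }

    print(classify(0.7))  # partly "unchanged", partly "up", not "down"

Unlike a hard threshold, the value 0.7 here belongs partially to two categories at once, which matches the qualitative character of such comparisons.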
5.2 Stochastic Systems
Stochastic techniques are old, tracing their foundations at least as far back as early thermodynamics. However, they were not widely adopted by scientists until much later, after some of the philosophical issues of science had been laid to rest. The Russian mathematician Markov constructed the probabilistic chains that bear his name in the early 1900s, and some early cyberneticians readily adopted stochastic control circuits. In fact, many of the early ideas of cybernetics can be found in Wiener's wartime work on controlling and predicting anti-aircraft fire [36] and his earlier work on Brownian motion [36], both of which used stochastic techniques to model what seemed to be random phenomena.
Stochastic techniques gained early acceptance in the West for use in biology and physics and were also heavily used in the Soviet Union for biochemical modeling. Adoption of these methods lagged in the engineering disciplines, partly due to advantages inherent in engineering problems that reduced the need for them. However, with the increasing complexity of modern technology, there has been increased interest in stochastic modeling in engineering. Currently, a number of biochemical modeling software packages make use of stochastic modeling, and stochastic techniques are also often used in the analysis of genetic data. The wide scope of application prevents a comprehensive survey here; however, this widespread application and increased interest speak to the potential utility of the methods.
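As one concrete instance of the stochastic style used by such packages, the sketch below runs a Gillespie-type simulation of a birth-death process, production 0 -> X and degradation X -> 0; this family of algorithms is standard for stochastic biochemical simulation, but the rate constants here are invented:

    # Sketch: Gillespie-style stochastic simulation of a birth-death
    # process: 0 -> X at rate k_birth, X -> 0 at rate k_death * x.
    # Rate constants are illustrative only.

    import random

    def gillespie_birth_death(k_birth=10.0, k_death=0.1, t_end=100.0, seed=1):
        rng = random.Random(seed)
        t, x = 0.0, 0
        while t < t_end:
            rates = (k_birth, k_death * x)
            total = rates[0] + rates[1]
            t += rng.expovariate(total)          # waiting time to next event
            if rng.random() * total < rates[0]:  # choose which event fired
                x += 1
            else:
                x -= 1
        return x

    print(gillespie_birth_death())  # fluctuates around k_birth/k_death = 100

Where a deterministic rate equation would report only the mean of 100 molecules, the stochastic run exhibits the fluctuations around it, which matter precisely when molecule counts are small.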
5.3 Chaos Theory
The first research into chaos theory grew out of early efforts in weather prediction, although Poincaré first noticed the phenomenon in the three-body problem around 1900 [36.6]. In the early 1960s, Lorenz noticed that weather prediction models diverged wildly when very small changes were made to their input parameters. After further exploration, chaos theory became an integral part of stability analysis and is now in heavy use in any field dealing with nonlinear systems. Biology, where chaotic systems were first noticed in ecology [36.7], is among these fields.
Chaos theory is currently used heavily in biological modeling. It has found a role in modeling emergent properties and the stability of large systems of nonlinear differential equations. Of particular note, it has been proposed that changes in the chaotic character of a biological system affect the fitness of the system [37]; this technique has been used to diagnose electrocardiogram data and predict heart failure [37]. Chaos has also been used to model data encoding in DNA molecules [37.2]. In some circles, chaos theory has attained a reputation for over-exposure, but it is applicable enough to biology that it should not be ignored.
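The hallmark of chaos, sensitive dependence on initial conditions, can be shown with the logistic map, the simple ecological model in which May observed chaotic behavior; at the illustrative parameter value r = 4.0 the map is fully chaotic:

    # Sketch: sensitive dependence on initial conditions in the logistic
    # map x -> r*x*(1-x). At r = 4.0, two trajectories starting 1e-10
    # apart diverge to order-one separation within a few dozen steps.

    def iterate(x, r=4.0, steps=50):
        for _ in range(steps):
            x = r * x * (1.0 - x)
        return x

    a = iterate(0.2)
    b = iterate(0.2 + 1e-10)
    print(a, b, abs(a - b))  # the tiny initial gap has grown enormously

The same sensitivity limits long-range weather prediction, and it is one reason to doubt that large biological models can ever be integrated numerically to arbitrary accuracy.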
6 Balance Sheet and Historical Conclusions
Biology is currently a field undergoing rapid change and development. Computer technology, high-throughput instruments like the microarray, and greater knowledge of small-scale biological processes have given biologists large quantities of data as well as the techniques to manipulate those data. However, as more data become available, more evidence demonstrates the complexity of biological phenomena. This complexity poses many problems for the modern biologist that cannot be overcome by traditional methods.
The historical evolution of systems ideas shows a constant struggle against complexity. Thus, the role of complexity in modern systems biology should not be ignored. Modern biology has developed several categories of techniques for dealing with the various facets of the complexity inherent in biological systems and the problems this complexity generates. These categories are loosely grouped under varying names, such as Bioinformatics or Systems Biology. In general, Bioinformatics deals with the collection, management, and analysis of biological data. Systems biology, which also goes by the names mathematical biology and computational biology, deals with in silico modeling and the mathematics of biological systems. The interconnected nature of biological phenomena and the need for each of these categories make the distinctions between the terms rather small.
Figure 5: What is Systems Biology?
From Kitano, "Systems Biology: A Brief Overview" [37.5]
6.1 The Role of Bioinformatics
The importance of modern data collection techniques in genetics and molecular biology cannot be overstated. Although the role of incomplete information has been downplayed through much of this paper, it is as serious a problem as the modeling of biological systems; both problems stem from the complexity of biological systems. One difference between current efforts and past efforts is that, until the latter half of the 20th century, biochemistry and genetics were not well understood, so there was a lag in developing the techniques needed to gather this data.
The microarray is arguably one of the most important innovations in biology in recent times. Gene chip technology allows side-by-side comparison of a vast number of data points. While the precision of the chips, and thus their analytic capability, is still limited, they allow the collection of enough data to make large-scale biological analysis possible. Gathering such data in the traditional way would take many years, and this was one of the primary problems with previous attempts at formulating a systematic theory of biology.
Yet high-throughput data collection is only half of the solution. Once biological data is gathered, it must be analyzed and put to use. A wide variety of statistical techniques and some fuzzy logic have been used to these ends. Also significant is the use of standardized XML languages and databasing techniques to manage the information and promote dissemination and portability among researchers. SBML [39] and CellML [40] are notable languages for the storage of biological models, while there are quite a few databases for protein and genetic information ranging from the small to the enormous.
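To give a flavor of what these model-exchange formats look like, the sketch below reads species and reactions from a simplified, SBML-flavored document. The markup is an invented stand-in that only imitates the structure of such languages (real SBML is namespaced and considerably richer), and only the Python standard library is used:

    # Sketch: reading species and reactions from a simplified,
    # SBML-flavored XML document. The markup is an invented stand-in;
    # real SBML documents are namespaced and far richer.

    import xml.etree.ElementTree as ET

    MODEL = """
    <model id="toy_pathway">
      <listOfSpecies>
        <species id="glucose" initialAmount="100"/>
        <species id="pyruvate" initialAmount="0"/>
      </listOfSpecies>
      <listOfReactions>
        <reaction id="glycolysis" reactant="glucose" product="pyruvate"/>
      </listOfReactions>
    </model>
    """

    root = ET.fromstring(MODEL)
    for sp in root.iter("species"):
        print("species:", sp.get("id"), "amount:", sp.get("initialAmount"))
    for rx in root.iter("reaction"):
        print("reaction:", rx.get("id"), rx.get("reactant"), "->", rx.get("product"))

The point of such standardized formats is portability: the same model description can move between simulation packages and databases without manual translation.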
6.2 Current Work in Systems Biology
To say that systems biology is a young field is an understatement at best; its fundamental concepts, however, are not. As the historical evolution of systems theory shows, these concepts have been evolving for centuries. Without the fortuitous intersection of readily available biological data and the results of this centuries-long evolution of mathematical techniques designed to deal with this type of complexity, a systems approach to genetics and biochemistry would be impossible.
A report published by the World Technology Evaluation Center (WTEC) details current work in systems biology [41]. The definition of systems biology given by this report shows a discipline driven by an emphasis on the dynamic properties of biological networks. A wide variety of techniques is in use, including nonlinear differential equations, Bayesian networks, Petri nets, cybernetic motifs, and optimization-based thermodynamic models. Work in systems biology is centered in the US, Japan, and Western Europe. Two centers for the study of systems biology have been established, one in Japan [42] and the other in the U.S. [43], and many more programs have recently opened or are planned. Systems biology, while young, is growing rapidly.
6.3 Systems Biology and Computational Complexity
Systems biology is the product of centuries of evolution in systems ideas. Although a direct link between systems biology and general systems theory is evident, systems biology can also benefit from lessons learned in thermodynamics and ecology. Problems in modern biology share elements of the problems in both fields: a biological system has a large number of elements, and these elements appear to be complex and highly interactive. Thermodynamics and ecology achieved different levels of success in dealing with these problems; it remains to be seen whether systems biology can successfully deal with both. It should be noted that cybernetics foundered on this type of complexity, and the same fate almost befell general systems theory. Mathematical and computational techniques have vastly improved since, but it is as yet unclear how to use them properly.
For better or for worse, it seems likely that the future of systems biology is tied to complexity. Biologists might do well to take to heart the philosophical lessons learned by physicists and engineers and focus more on problem characteristics and model constraints than on searching for universal correctness. In this approach, computational science might be of some assistance. Computational science has already helped biologists with this complexity in collecting and managing biological data. As computer science has a long history of dealing with complexity, although admittedly with limited success, a partnership on an even higher level than is currently common would be productive for both fields. Many biologists lack training in complexity, and many computer scientists lack training in biology. However, computational science, with its emphasis on problem analysis, could offer assistance to biologists seeking to avoid the difficulties associated with complexity. Likewise, three emerging trends in computer science are biologically inspired computing, fault tolerance, and self-organizing systems; an understanding of biological systems would greatly benefit all three, as biological systems are among the most complex known yet seem to make use of these principles fairly well. One ironic example along this line of thought is the employment of genetic algorithms, originally inspired by evolutionary concepts, to analyze data from genetic experiments.
More immediately, computational issues limit the amount of data biologists can integrate and study as a whole. Biological systems are composed of a large number of diverse elements which interact with each other in many diverse ways. Although it is currently unknown how closely biological systems approach the upper boundary of system complexity of 2^n established by Weinberg [28], it seems certain that biological systems come closer than many other systems. This complexity both makes high-throughput data collection necessary and limits its utility at the same time.
In the long term, biological systems might be extremely difficult or impossible to model deterministically. Proteomics is still a young science and the definition of what constitutes an element in a biological system is still in doubt. It may be the case that biologists are faced with hard problems nested inside hard problems or that some problems are not computable in the Turing model of computation. There is also no guarantee that computers can handle the complex differential equations to the needed degree of accuracy, especially since biological systems seem to be chaotic in some sense. Computer technology advances rapidly, but there are theoretical limits or difficulties that cannot necessarily be overcome easily.
Despite these potential problems, systems biology is uniquely positioned to provide some answers. It seems likely that once properly matured, the field will yield useful results in multiple fields, not merely in biology. Systems biology and computational science evolved from the same background, and both face the same issues of complexity and computability. The historical evolution of their shared systems ideas paints a picture of a struggle against complexity. The difference now is that, with evolving mathematical techniques, increased understanding of biochemistry and genetics, and high-throughput data collection, progress in systems biology seems inevitable.
7 References
[1] Lloyd, G. E. R. Early Greek Science: Thales to Aristotle. Norton, New York, 1970.
[2] Taylor, Henry Osborn. Greek Biology and Medicine. Cooper Square Publishers, New York, 1922.
[3] Newton, Isaac. The Mathematical Papers of Isaac Newton. Derek T. Whiteside, ed. 1967.
[4] Descartes, Rene. The Philosophical Writings of Descartes, Vol. 1. Cottingham, J, Stoothoff, R., and Murdoch, D., eds. Cambridge University Press, 1985.
[5] Pahl, G. Engineering Design: A Systematic Approach, 2nd ed. Springer, 1999.
[6] Engleberg, S. A Mathematical Introduction to Control Theory. WSPC, 2005.
[7] Keenan, J.H. and Hatsopoulos, G. Principles of General Thermodynamics. Pg xv-xli, 1965
[8] Gibbs, J.W. The Collected Works, Vol. 1. 1948.
[9] Fisher, R. A. The Genetical Theory of Natural Selection. Clarendon, 1930.
[10] Pearson, E.S (ed.). Karl Pearson’s Early Statistical Papers. Cambridge University Press, 1948.
[11] Verhulst, P. Recherches mathématiques sur la loi d'accroissement de la population. Nouveaux mémoires de l'académie royale des sciences et belles-lettres de Bruxelles, 18:1-38, 1838.
[12] Lotka, A.J. Elements of Physical Biology. Williams and Wilkins, 1925.
[13] Bateson, W. Mendel's Principles of Heredity, a Defense, First Edition, London: Cambridge University Press, 1902.
[14] Bohr, N. The Philosophical Writings of Niels Bohr, Vol. 1. Ox Bow Press, 1987.
[15] Jammer, Max. The Conceptual Development of Quantum Mechanics. Wiley, 1974.
[16] Einstein, A. Relativity: The Special and the General Theory. Henry Holt, 1920.
[19] Bertalanffy, L. von. Modern Theories of Development: An Introduction to Theoretical Biology. Oxford, 1933. (translated).
[20] Bertalanffy, L. von. Theoretische Biologie. 2 Bde., Berlin, 1932.
[21] Von Bertalanffy, L. General Systems Theory: Foundation, Development, Applications. George Braziller, 1976.
[22] McCulloch, Warren, and Pitts, Walter. “A Logical Calculus of the Ideas Immanent in Nervous Activity”, Bulletin of Mathematical Biophysics, Vol. 5 (115-133). 1943.
[23] Rosenblueth, A., Wiener, Norbert, and Bigelow, Julian. “Behavior, Purpose, and Teleology.” Philosophy of Science, Vol. 10 (18-24). 1943.
[24] Wiener, N. Cybernetics: Control and Communication in the Animal and the Machine. MIT Press, 1965.
[25] Ashby, R. Introduction to Cybernetics. Routledge, Kegan, and Paul, 1964.
[26] Shannon, Claude E., and Weaver, Warren. The Mathematical Theory of Communication. University of Illinois Press, 1963.
[27] Von Foerster, H. Understanding Understanding: An Epistemology of Second Order Concepts. Aprendizagem/Desenvolvimento. Pg 83-85, 1981.
[28] Weinberg, G. An Introduction to General Systems Thinking. Dorset House Publishing, 2001.
[29] Klir, G. An Approach to General Systems Theory. Van Nostrand Reinhold Co., 1969.
[30] Woodger, J.H. The Axiomatic Method in Biology. Cambridge University Press, 1937.
[31] Stevenson, D.E., Warner, D. D., and Brown, T.R. Biobell: A Simulation System for Biochemistry and Biophysics. Comput. Biol. Med., Vol. 14, No. 1. Pg 35-46, 1984.
[32] Savageau, et al. Biochemical Systems Theory and Metabolic Control Theory: Fundamental Similarities and Differences. 1987.
[33] Zadeh, L.A. Fuzzy Sets. Information and Control, June 1965. Pg. 338-353.
[34] Klir, G., and Yuan, B. Fuzzy Sets and Fuzzy Logic: Theory and Application. Prentice Hall, 1995.
[36] Masani, P. Norbert Wiener: Collected Works. MIT Press, 1990.
[37] Denton, T.A., et al. Fascinating Rhythm: A Primer on Chaos Theory and Its Application to Cardiology. Am Heart J, 120, 1419-1440 (1990).
[39] SBML website. Accessed 08/30/2005.
[40] CellML website. Accessed 08/30/2005.
[41] WTEC Report on Systems Biology. World Technology Evaluation Center, Oct. 2005.
[42] The Systems Biology Institute. Tokyo, Japan. sbi.jp.
[43] Institute for Systems Biology. Seattle, WA, USA.
[16.5] Gödel, K. "An Example of a New Type of Cosmological Solutions of Einstein's Field Equations of Gravitation." Rev. Mod. Phys. 21, 447, 1949.
[29.1] Gerovitch, S. From Newspeak to Cyberspeak: A History of Soviet Cybernetics. MIT Press, 2002.
[29.2] Sobolev, Sergei L. "Vystuplenie na soveshchanii" in Philosophical Problems, ed. Fedoseev, p. 266 (from Gerovitch).
[29.3] Liapunov and Iablonskii. "Theoretical Problems in Cybernetics."
[29.4] Liapunov. "Ob upravliaiushchikh sistemakh" (From Gerovitch)
[32.1] Hartmanis, J., and Stearns, R.E. "On the Computational Complexity of Algorithms." Transactions of the American Mathematical Society, 1965.
[32.2] Cook, S. "The Complexity of Theorem-Proving Procedures." Proceedings of the Third Annual ACM Symposium on Theory of Computing, 151-158, 1971.
[32.3] Karp, R. "Reducibility Among Combinatorial Problems." Complexity of Computer Computations, 1972.
[34.1] L. Kaufmann and P. J. Rousseeuw. Finding Groups in Data: An Introduction to Cluster Analysis. Wiley and Sons, 1990.
[0.5] Veith, Ilza, ed. The Yellow Emperor’s Classic of Internal Medicine. University of California Press, 1966.
[21.5] Schrödinger, E. What Is Life? Cambridge University Press, 1944.
[21.6] Schneider, E.D., and Kay, J.J. "Life as a Manifestation of the Second Law of Thermodynamics." Mathematical and Computer Modelling, Vol. 19, No. 6-8, pg. 25-48.
[27.5] Simon, H. A. The Sciences of the Artificial, 2nd Ed. MIT Press, 1981.
[36.6] Barrow-Green, J. Poincaré and the Three Body Problem. American Mathematical Society, November 1996.
[36.7] May, R. “When two and two do not make four: nonlinear phenomena in ecology.” Proceedings of the Royal Society, 1986, Vol. B228, pg. 241.
[37.2] Coffey, D. “Self-organization, Complexity, and Chaos: The New Biology for Medicine.” Nature Medicine, 4, pg. 882-885, 1998.
[37.5] Kitano, H. "Systems Biology: A Brief Overview." Science, Vol 295. March 1, 2002.
[17.9] Russell, B., and Whitehead, A. Principia Mathematica. Cambridge University Press, 1910-1913.
[18.9] Davidson, M. Uncommon Sense: The Life and Thought of Ludwig von Bertalanffy, Father of General Systems Theory. 1983.
[34.2] Wolkenhauer, O. "Systems Biology: The Reincarnation of Systems Theory Applied in Biology?" Briefings in Bioinformatics, 2001.