Latest Articles Include:
- Editorial board
- J Theor Biol 258(1):IFC (2009)
- How could the Gompertz–Makeham law evolve
- J Theor Biol 258(1):1-17 (2009)
In line with the origin of life from the chemical world, biological mortality kinetics is suggested to originate from the chemical decomposition kinetics described by the Arrhenius equation k=A*exp(−E/RT). Another chemical legacy of living bodies is that, by using the appropriate properties of their constituent molecules, they incorporate all their potencies, including adverse ones. In early evolution, acquiring the ability to use new molecules to increase the disintegration barrier E might be associated with new adverse interactions, yielding products that might accumulate in organisms and compromise their viability. Thus, the main variable of the Arrhenius equation changed from T in chemistry to E in biology; mortality came to rise exponentially as E declined with increasing age; and survivorship patterns came to feature a slow initial and fast late descent, making the bulk of each finite cohort expire within a short final period of its lifespan. Numerical modelling shows that such acquisition of new functions associated with faster functional decline may increase the efficiency of investing resources into progeny, in line with the antagonistic pleiotropy theory of ageing. Any evolved time trajectories of functional changes are translated into changes in mortality through the exponent according to the generalised Gompertz–Makeham law μ=C(t)+Λ*exp[−E(t)], which reduces to the conventional form when E(t)=E0−γt and C is constant. The proposed model explains the origin of the linear mid-age functional decline followed by its deceleration at later ages, and the positive correlation between the initial vitality and the rate of ageing.
- Long range clustering of oligonucleotides containing the CG signal
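The generalised Gompertz–Makeham law quoted in the Gompertz–Makeham abstract above, μ=C(t)+Λ*exp[−E(t)] with E(t)=E0−γt and constant C, is easy to evaluate numerically. The sketch below uses illustrative parameter values, not values from the paper, to show the characteristic exponential rise of mortality as the barrier E declines with age.

```python
import math

def mortality(t, C=0.001, Lam=1.0, E0=10.0, gamma=0.08):
    """Generalised Gompertz-Makeham hazard: mu = C + Lam*exp(-E(t)),
    with a linearly declining disintegration barrier E(t) = E0 - gamma*t.
    All parameter values here are illustrative assumptions."""
    E = E0 - gamma * t
    return C + Lam * math.exp(-E)

ages = [0, 20, 40, 60, 80]
mus = [mortality(t) for t in ages]

# The Gompertz part (mu - C) grows by a constant factor per fixed age step,
# here exp(gamma * 20) per 20 years -- the exponential mortality rise:
ratios = [(mus[i + 1] - 0.001) / (mus[i] - 0.001) for i in range(len(mus) - 1)]
```

With C constant, the hazard minus the Makeham term is a pure exponential in age, which is the conventional Gompertz behaviour the abstract refers to.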
- J Theor Biol 258(1):18-26 (2009)
The distance distributions between successive occurrences of the same oligonucleotides in chromosomal DNA are studied in different classes of higher eukaryotic organisms. A two-parameter model is applied to the distance distributions of quintuplets (sequences of five bps) and hexaplets (sequences of six bps); the first parameter k captures the short range exponential decay of the distributions, whereas the second parameter m captures the power law behavior. A two-dimensional scatter plot representing the model equation demonstrates that the points corresponding to the distance distributions of oligonucleotides containing the CG consensus sequence (promoter of the RNA polymerase II) cluster together (group α), apart from all other oligonucleotides (group β). This is shown for the available chordata Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, Gallus gallus and Danio rerio. This clustering is less evident in lower Animalia and plants, such as Drosophila melanogaster, Caenorhabditis elegans and Arabidopsis thaliana. Moreover, in all organisms the oligonucleotides which contain any consensus sequence are found to be described by long range distributions, whereas all others show a stronger influence of short range decay. Various measures are introduced and evaluated to numerically characterize the clustering of the two groups. The one which most clearly discriminates the two classes is shown to be the proximity factor.
- Limiting similarity and niche theory for structured populations
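The oligonucleotide abstract above only states that k governs short range exponential decay and m a power law tail; a common way to combine the two regimes (an assumed form for illustration, not the paper's actual equation) is a power law modulated by an exponential cutoff. In the pure power law limit, doubling the distance rescales the frequency by the constant factor 2**-m:

```python
import math

def rel_freq(d, k, m):
    """Hypothetical mixed distribution for inter-occurrence distances d:
    a power-law factor d**-m modulated by exponential decay exp(-k*d).
    This functional form is an assumption for illustration only."""
    return d ** (-m) * math.exp(-k * d)

# Long range limit: with k negligible, each doubling of d multiplies the
# frequency by the constant factor 2**-m -- the power-law signature that
# distinguishes the CG-containing group in the abstract.
m = 1.5
pure_power = [rel_freq(d, k=0.0, m=m) for d in (100, 200, 400)]
```

At short range a nonzero k makes the exponential factor dominate instead, which is the behaviour attributed to oligonucleotides without a consensus sequence.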
- J Theor Biol 258(1):27-37 (2009)
We develop the theory of limiting similarity and niche for structured populations with a finite number of individual states (i-states). In line with a previously published theory for unstructured populations, the niche of a species is specified by the impact and sensitivity niche vectors. They describe the population's impact on, and sensitivity towards, the variables involved in population regulation. Robust coexistence requires sufficient segregation of the impact niche vectors, as well as of the sensitivity niche vectors. The connection between population-level impact and sensitivity and the impact/sensitivity of the specific i-states is developed. Each i-state contributes to the impact of the population in proportion to its frequency in the population. Sensitivity of the population is composed of the sensitivity of the rates of demographic transitions, weighted by the frequency and by the reproductive value of the initial and final i-states of the transition, respectively. Coexistence in a multi-patch environment is studied. This analysis is interpreted as spatial niche segregation.
- A simple model for adaptive variation in the sex ratios of mammalian offspring
- J Theor Biol 258(1):38-42 (2009)
We present a simple mathematical model that describes how primary and secondary sex ratios of offspring may vary adaptively in order to maintain equal numbers of the sexes at the age of reproductive maturity. The model postulates that the sex of an offspring depends probabilistically on a weighted linear combination of maternal testosterone and male vulnerability. The model operates at the population level, and is based on three physiological phenomena: first, that maternal testosterone in follicular fluid is normally distributed, with levels above the mean more likely to be associated with the conception of males; secondly, that males are more vulnerable than females from conception onwards; and thirdly, that under conditions of chronic stress, increased secretion of female testosterone coincides with increased male vulnerability. Thus during times of chronic stress, more males are conceived, but their number of live births is moderated by increased male loss. Variations in secondary sex ratios should therefore be related not only to the stressfulness of environmental conditions, but also to the timing of changes in stressfulness.
- Sensitivity analysis to identify key parameters influencing Salmonella infection dynamics in a pig batch
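The sex-ratio abstract above can be sketched as a Monte Carlo simulation. Everything quantitative below is an assumption for illustration: the logistic link, the weights, and the way chronic stress shifts maternal testosterone and male vulnerability are hypothetical stand-ins for the paper's weighted linear combination.

```python
import math
import random

def p_male(testosterone, vulnerability, w_t=0.8, w_v=0.5):
    """Probability of conceiving a male from a weighted linear combination
    of maternal testosterone (standardised) and male vulnerability.
    The logistic link and weights are illustrative assumptions."""
    z = w_t * testosterone - w_v * vulnerability
    return 1.0 / (1.0 + math.exp(-z))

def simulate(n, stress=0.0, seed=1):
    """Chronic stress shifts the testosterone distribution upward and
    raises male prenatal loss (both modelling assumptions)."""
    rng = random.Random(seed)
    conceived_male = 0
    for _ in range(n):
        t = rng.gauss(stress, 1.0)           # normally distributed testosterone
        vulnerability = 0.2 + 0.3 * stress   # males more vulnerable under stress
        if rng.random() < p_male(t, vulnerability):
            conceived_male += 1
    return conceived_male / n                # primary sex ratio (males conceived)

primary_calm = simulate(20000, stress=0.0)
primary_stress = simulate(20000, stress=1.0)
```

Under this toy parameterisation, more males are conceived under chronic stress, mirroring the abstract's prediction for the primary sex ratio; the secondary ratio would then be moderated by the larger male loss.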
- J Theor Biol 258(1):43-52 (2009)
In the context of managed herds, epidemiological models usually take into account relatively complex interactions involving a high number of parameters. Some parameters may be uncertain and/or highly variable, especially epidemiological parameters. Their impact on the model outputs must then be assessed by a sensitivity analysis, making it possible to identify key parameters. The prevalence over time is an output of particular interest in epidemiological models, so sensitivity analysis methods adapted to such a dynamic output are needed. In this paper, such a sensitivity analysis method, based on principal component analysis and on analysis of variance, is presented. It allows a generalised sensitivity index to be computed for each parameter of a model representing Salmonella spread within a pig batch. The model is a stochastic discrete-time model describing the batch dynamics and movements between rearing rooms, from birth to slaughterhouse delivery. Four health states were introduced: Salmonella-free, seronegative shedder, seropositive shedder and seropositive carrier. Indirect transmission was modelled via an infection probability function depending on the quantity of Salmonella in the rearing room. Simulations were run according to a fractional factorial design enabling the estimation of main effects and two-factor interactions. For each of the 18 epidemiological parameters, four values were chosen, leading to 4096 scenarios. For each scenario, 15 replications were performed, leading to 61 440 simulations. The sensitivity analysis was then conducted on the seroprevalence output. The parameters governing the infection probability function and residual room contaminations were identified as key parameters. To control the Salmonella seroprevalence, efficient measures should therefore target these parameters. Moreover, the shedding rate and the maternal protective factor also had a major impact. Therefore, further investigation of the protective effect of maternal or post-infection antibodies would be needed.
- Basic networks: Definition and applications
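The experimental-design arithmetic in the Salmonella abstract above checks out: a full factorial over 18 parameters at 4 levels would be astronomically large, which is exactly why a 4096-run (4**6) fraction is used.

```python
# Checking the design sizes quoted in the abstract.
n_params, n_levels = 18, 4

full_factorial = n_levels ** n_params    # every combination: 4**18 scenarios
fractional = 4096                        # scenarios actually simulated (= 4**6)
replications = 15                        # stochastic replications per scenario
simulations = fractional * replications  # total model runs

reduction = full_factorial // fractional  # saving from the fractional design
```

The fraction cuts the scenario count by a factor of 4**12 while still allowing main effects and two-factor interactions to be estimated.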
- J Theor Biol 258(1):53-59 (2009)
We define basic networks as the undirected subgraphs with a minimal number of units in which the distances (geodesics, minimal path lengths) among a set of selected nodes, which we call seeds, in the original graph are conserved. The additional nodes required to draw the basic network are called connectors. We describe a heuristic strategy to find the basic networks of complex graphs. We also show how the characterization of these networks may help to obtain relevant biological information from highly complex protein–protein interaction data.
- Regulated transport as a mechanism for pattern generation: Capabilities for phyllotaxis and beyond
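The definition in the basic-networks abstract above can be illustrated with a naive sketch: keep the union of one shortest path per seed pair, so seed-to-seed distances are conserved, and call the extra nodes connectors. This is not the paper's heuristic, only a minimal stdlib illustration of the definition on a hypothetical toy graph.

```python
from collections import deque
from itertools import combinations

def shortest_path(adj, src, dst):
    """BFS shortest path in an undirected, unweighted graph."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    return None

def basic_network(adj, seeds):
    """Naive sketch: union of one geodesic per seed pair. Nodes added
    beyond the seeds are the connectors."""
    kept = set(seeds)
    for a, b in combinations(seeds, 2):
        kept.update(shortest_path(adj, a, b))
    connectors = kept - set(seeds)
    sub = {u: [v for v in adj[u] if v in kept] for u in kept}
    return sub, connectors

# Toy interaction-like graph with hypothetical node names:
adj = {
    "A": ["X", "Y"], "B": ["X"], "C": ["Y", "Z"],
    "X": ["A", "B", "Y"], "Y": ["A", "X", "C"], "Z": ["C"],
}
sub, connectors = basic_network(adj, seeds=["A", "B", "C"])
```

Here X and Y become connectors, Z is dropped, and the seed-to-seed geodesic lengths in the subgraph match those in the original graph.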
- J Theor Biol 258(1):60-70 (2009)
Large-scale pattern formation is a frequently occurring phenomenon in biological organisms, and several local interaction rules for generating such patterns have been suggested. A mechanism driven by feedback between the plant hormone auxin and its polarly localized transport mediator PINFORMED1 has been proposed as a model for phyllotactic patterns in plants. It has been shown to agree with current biological experiments at a molecular level as well as with respect to the resulting patterns. We present a thorough investigation of variants of models based on auxin-regulated polarized transport and use analytical and numerical tools to derive requirements for these models to drive spontaneous pattern formation. We find that auxin concentrations in neighboring cells can feed back either on exocytosis or endocytosis and still produce patterns. In agreement with mutant experiments, the active cellular efflux is shown to be more important for pattern capabilities as compared to active influx. We also find that the feedback must originate from neighboring cells rather than from neighboring walls and that intracellular competition for the transport mediator is required for patterning. The importance of model parameters is investigated, especially regarding robustness to perturbations of experimentally estimated parameter values. Finally, the regulated transport mechanism is shown to be able to generate Turing patterns of various types.
- Dynamics of the interlocked positive feedback loops explaining the robust epigenetic switching in Candida albicans
- J Theor Biol 258(1):71-88 (2009)
Two-element mutual activation and mutual inhibition positive feedback loops are common motifs that occur in many biological systems, in both isolated and interlocked form, as for example in the cell division cycle and thymus differentiation in eukaryotes. The properties of three-element interlocked positive feedback loops that embed both mutual activation and inhibition are studied in depth for their bistable properties by performing bifurcation and stochastic simulations. Codimension-one and -two bifurcations reveal important properties like robustness to parameter variations and adaptability under various conditions, through the circuit's ability to fine tune the threshold over a wide range of values and to maintain a wide bistable regime. Furthermore, we show that in the interlocked circuit, mutual inhibition controls the decision to switch from the OFF to the ON state, while mutual activation enforces the decision. This view is supported through a concrete biological example, Candida albicans, a human fungal pathogen that can exist in two distinctive cell types: one in the default white state and the other in an opaque form. Stochastic switching between these two forms takes place due to the epigenetic alternation induced by the transcriptional regulators in the circuit, albeit without any rearrangement of the nuclear chromosomes. The transcriptional regulators constitute interlocked mutual activation and inhibition feedback circuits that provide an adaptable threshold and a wide bistable regime. These positive feedback loops are shown to be responsible for robust noise-induced transitions without chattering, persistence of particular phenotypes for many generations, and selective exhibition of one particular form of phenotype when mutated. Finally, we propose that synthetic biology constructs use interlocked positive feedback loops instead of two-element positive feedback loops, because they are better controlled than isolated mutual activation and mutual inhibition feedback circuits.
- When the exception becomes the rule: The disappearance of limiting similarity in the Lotka–Volterra model
- J Theor Biol 258(1):89-94 (2009)
We investigate the transition between limiting similarity and coexistence of a continuum in the competitive Lotka–Volterra model. It is known that there exist exceptional cases in which, contrary to the limiting similarity expectation, all phenotypes coexist along a trait axis. Earlier studies established that the distance between surviving phenotypes is of the order of the niche width 2σ provided that the carrying capacity curve differs from the exceptional one significantly enough. In this paper we study the outcome of competition for small perturbations of the exceptional (Gaussian) carrying capacity. We find that the average distance between the surviving phenotypes goes to zero as the perturbation vanishes. The number of coexisting species in equilibrium is proportional to the negative logarithm of the perturbation. Nevertheless, the niche width provides a good order of magnitude for the distance between survivors if the perturbations are larger than 10%. Therefore, we conclude that limiting similarity is a good framework for biological thinking despite the lack of an absolute lower bound on similarity.
- Phylogenetic information complexity: Is testing a tree easier than finding it?
- J Theor Biol 258(1):95-102 (2009)
Phylogenetic trees describe the evolutionary history of a group of present-day species from a common ancestor. These trees are typically reconstructed from aligned DNA sequence data. In this paper we analytically address the following question: Is the amount of sequence data required to accurately reconstruct a tree significantly more than the amount required to test whether or not a candidate tree was the 'true' tree? By 'significantly', we mean that the two quantities do not behave the same way as a function of the number of species being considered. We prove that, for a certain type of model, the amount of information required is not significantly different; while for another type of model, the information required to test a tree is independent of the number of leaves, while that required to reconstruct it grows with this number. Our results combine probabilistic and combinatorial arguments.
- Genetic model for longitudinal studies of aging, health, and longevity and its potential application to incomplete data
- J Theor Biol 258(1):103-111 (2009)
Many longitudinal studies of aging collect genetic information only for a sub-sample of participants of the study. These data also do not include recent findings, new ideas and methodological concepts developed by distinct groups of researchers. The formal statistical analyses of genetic data ignore this additional information and therefore cannot utilize the entire research potential of the data. In this paper, we present a stochastic model for studying such longitudinal data in joint analyses of genetic and non-genetic sub-samples. The model incorporates several major concepts of aging known to date and usually studied independently. These include age-specific physiological norms, allostasis and allostatic load, stochasticity, and decline in stress resistance and adaptive capacity with age. The approach allows for studying all these concepts in their mutual connection, even if respective mechanisms are not directly measured in data (which is typical for longitudinal data available to date). The model takes into account dependence of longitudinal indices and hazard rates on genetic markers and permits evaluation of all these characteristics for carriers of different alleles (genotypes) to address questions concerning genetic influence on aging-related characteristics. The method is based on extracting genetic information from the entire sample of longitudinal data consisting of genetic and non-genetic sub-samples. Thus it results in a substantial increase in the accuracy of statistical estimates of genetic parameters compared to methods that use only information from a genetic sub-sample. Such an increase is achieved without collecting additional genetic data. Simulation studies illustrate the increase in the accuracy in different scenarios for datasets structurally similar to the Framingham Heart Study. Possible applications of the model and its further generalizations are discussed.
- Inheritance of epigenetic chromatin silencing
- J Theor Biol 258(1):112-120 (2009)
Maintenance of alternative chromatin states through cell divisions poses some fundamental constraints on the dynamics of histone modifications. In this paper, we study the systems biology of epigenetic inheritance by defining and analyzing general classes of mathematical models. We discuss how the number of modification states involved plays an essential role in the stability of epigenetic states. In addition, DNA duplication and the consequent dilution of marked histones act as a large perturbation for a stable state of histone modifications. The requirement that this large perturbation fall into the basin of attraction of the original state sometimes leads to additional constraints on effective models. Two such models, inspired by two different biological systems, are compared in how they fulfil the requirements of multistability and of recovery after DNA duplication. We conclude that in the presence of multiple histone modifications that characterize alternative epigenetic stable states, these requirements are more easily fulfilled.
- The effective size of bryophyte populations
- J Theor Biol 258(1):121-126 (2009)
Bryophytes, with their dominant haploid stage, conform poorly to the life cycles generally treated in population-genetic models. Here we make a detailed analysis of the effective sizes of bryophyte model populations as a function of their breeding system. It is found that the effective size is rarely much smaller than the scored number of haploid gametophytic individuals, even when the limited number of diploids (sporophytes) formed is taken into account. The most severe decrease in effective size occurs when unisexual gametophytic females produce only a small number of fertile diploid sporophytes in male-biased populations; this effect is due to the restricted sampling of male gametophytic individuals that then occurs. It is shown that the harmonic mean of diploid sporophytes formed per haploid gametophytic individual is the relevant measure in these calculations, and not the standard (and generally larger) arithmetic mean.
- A game theoretical model of deforestation in human–environment relationships
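The bryophyte abstract's closing point, that the harmonic mean of sporophytes per gametophyte is the relevant quantity and is generally smaller than the arithmetic mean, is easy to illustrate. The counts below are hypothetical:

```python
def arithmetic_mean(xs):
    return sum(xs) / len(xs)

def harmonic_mean(xs):
    """Harmonic mean: dominated by the smallest values, which is why a few
    gametophytes producing very few sporophytes depress the effective size
    far more than the arithmetic mean would suggest."""
    return len(xs) / sum(1.0 / x for x in xs)

# Hypothetical sporophyte counts per haploid gametophytic individual:
counts = [1, 1, 2, 8, 8]
am = arithmetic_mean(counts)
hm = harmonic_mean(counts)
```

Here the arithmetic mean is 4.0 but the harmonic mean is about 1.8, pulled down by the two low producers, matching the abstract's warning against using the standard mean.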
- J Theor Biol 258(1):127-134 (2009)
We studied a two-person game regarding deforestation in human–environment relationships. Each landowner manages a single land parcel where the state of land-use is forested, agricultural, or abandoned. The landowner has two strategies available: forest conservation and deforestation. The choice of deforestation provides a high return to the landowner, but it degrades the forest ecosystem services produced on a neighboring land parcel managed by a different landowner. Given spatial interactions between the two landowners, each landowner decides which strategy to choose by comparing the expected discounted utility of each strategy. Expected discounted utility is determined by taking into account the current and future utilities to be received, according to the state transition on the two land parcels. The state transition is described by a Markov chain that incorporates a landowner's choice about whether to deforest and the dynamics of agricultural abandonment and forest regeneration. By considering a stationary distribution of the Markov chain for land-use transitions, we derive explicit conditions for Nash equilibrium. We found that a slow regeneration of forests favors mutual cooperation (forest conservation). As the forest regenerates faster, mutual cooperation transforms to double Nash equilibria (mutual cooperation and mutual defection), and finally mutual defection (deforestation) leads to a unique Nash equilibrium. Two different types of social dilemma emerge in our deforestation game. The stag-hunt dilemma is most likely to occur under an unsustainable resource supply, where forest regenerates extremely slowly but agricultural abandonment happens quite rapidly. In contrast, the prisoner's dilemma is likely under a persistent or circulating supply of resources, where forest regenerates rapidly and agricultural abandonment occurs slowly or rapidly. These results show how humans and the environment mutually shape the dilemma structure in forest management, implying that solutions to dilemmas depend on environmental properties.
- To grow or not to grow? Intermediate and paratenic hosts as helminth life cycle strategies
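The deforestation game above rests on the stationary distribution of a Markov chain over the land-use states forested, agricultural, and abandoned. A minimal sketch of that computation follows; the transition probabilities are hypothetical, chosen only to show how deforestation, abandonment, and slow regeneration combine into long-run land-use fractions.

```python
def stationary(P, iters=2000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# States: 0 = forested, 1 = agricultural, 2 = abandoned.
# Hypothetical single-parcel transitions under a deforesting landowner:
P = [
    [0.7, 0.3, 0.0],  # forested: persists or is cleared for agriculture
    [0.0, 0.8, 0.2],  # agricultural: persists or is abandoned
    [0.4, 0.0, 0.6],  # abandoned: forest regenerates slowly, or persists
]
pi = stationary(P)
```

For this matrix the long-run fractions are 4/13 forested, 6/13 agricultural, and 3/13 abandoned; in the paper such stationary distributions feed the expected discounted utilities that determine the Nash equilibria.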
- J Theor Biol 258(1):135-147 (2009)
Larval helminths in intermediate hosts often stop growing long before their growth is limited by host resources, and do not grow at all in paratenic hosts. We develop our model [Ball, M.A., Parker, G.A., Chubb, J.C., 2008. The evolution of complex life cycles when parasite mortality is size- or time-dependent. J. Theor. Biol. 253, 202–214] for optimal growth arrest at larval maturity (GALM) in trophically transmitted helminths. This model assumes that on entering an intermediate host, larval death rate initially has both time- (or size-) dependent and time-constant components, the former increasing as the larva grows. At GALM, mortality changes to a new and constant rate in which the size-dependent component is proportional to that immediately before GALM. Mortality then remains constant until death or transmission to the definitive host. We analyse linear increasing and accelerating forms for time-dependent mortality to deduce why there is sometimes growth (intermediate hosts) and sometimes no growth (paratenic hosts). Calling i the intermediate or paratenic host, and j the definitive host, conditions favouring paratenicity are: (i) high values in host i for size at establishment, size-related mortality, expected intensity, (ii) low values in host i for size-independent mortality rate, potential growth rate, transmission rate to j, and ratio of death rate in j/growth rate in j. Opposite conditions favour growth in the (intermediate) host, either to GALM or until death without GALM. We offer circumstantial evidence from the literature supporting some of these predictions. In certain conditions, two of the three possible growth strategies (no growth; growth to an optimal size then growth arrest (GALM); unlimited growth until larval death) can exist as local optima. The effect of the discontinuity in death rate after GALM is complex and depends on mortality and growth parameters in the two hosts, and on the mortality functions before and after GALM.
- The invisible niche: Weakly density-dependent mortality and the coexistence of species
- J Theor Biol 258(1):148-155 (2009)
Weakly density-dependent effects, characterized by fractional scaling exponents close to one, are rarely studied in the ecological literature. Here, we consider the effect of an additional weakly density-dependent term on a simple competition model. Our investigation reveals that weak density-dependence opens up an "invisible niche". This niche does not constitute a new mechanism for coexistence, but is a previously unexplored consequence of known mechanisms. In the invisible niche a weaker competitor can survive at very low density. Coexistence thus requires large habitat size. Such niches, if found in nature, would have a direct impact on species-area laws and species-abundance curves and should therefore receive more attention.
- Samuel Butler and human long term memory: Is the cupboard bare?
- J Theor Biol 258(1):156-164 (2009)
Memory studies in biological systems distinguish three informational processes that are generally sequential—production/acquisition, storage, and retrieval/use. Identification of DNA as a storage form for hereditary information accelerated progress in that field. Assuming the path of successful elucidation in one memory field (heredity) to be heuristic for elucidation in another (brain), then progress in neuroscience should accelerate when a storage form is identified. In the 19th century Ewald Hering and Samuel Butler held that heredity and brain memory both involved the storage of information and that the two forms of storage were the same. Hering specified storage as 'molecular vibrations' but, while making a fuller case, Butler was less committal. In the 20th century, the ablation studies of Karl Lashley failed to identify unique sites for storage of brain information, and Donald Hebb's 'synaptic plasticity' hypothesis of distributed storage over a neuronal network won favor. In the 21st century this has come under attack, and the idea that brain and hereditary information are stored as DNA is advocated. Thus, albeit without attribution, Butler's idea is reinstated. Yet, while the case is still open, the synaptic plasticity and DNA hypotheses have problems. Two broad alternatives remain on the table. Long term memory is located: (1) in the brain, either in some other macromolecular form (e.g. protein, lipid) or in some sub-molecular form (e.g. quantum computing and 'brain as holograph' hypotheses) or (2) outside the brain. The suggestion of the medieval physician Avicenna that the brain 'cupboard' is bare—i.e. the brain is a perceptual, not storage, organ—is consistent with a mysterious 'universe as holograph' model. Understanding how Butler came to contribute could be heuristic for future progress in a field fraught with 'fractionation and disunity'.