Staff Publications

    'Staff publications' is the digital repository of Wageningen University & Research

    'Staff publications' contains references to publications authored by Wageningen University staff from 1976 onward.

    Publications authored by the staff of the Research Institutes are available from 1995 onwards.

    Full-text documents are added when available. The database is updated daily and currently holds about 240,000 items, of which 72,000 are in open access.


    Sensitivity analysis methodologies for analysing emergence using agent-based models
    Broeke, Guus ten - 2017
    Wageningen University. Promotor(en): J. Molenaar, co-promotor(en): G.A.K. van Voorn; A. Ligtenberg. - Wageningen : Wageningen University - ISBN 9789463436991 - 211
    mathematics - computational mathematics - mathematical models - dynamic modeling - sensitivity analysis - adaptation - methodology - simulation

    Many human and natural systems are highly complex, because they consist of many interacting parts. Such systems are known as complex adaptive systems (CAS). Understanding CAS is possible only by studying the interactions between constituent parts, rather than focussing only on the properties of the parts in isolation. Often, the possibilities for systematically studying these interactions in real-life systems are limited. Simulation models can then be an important tool for testing what properties may emerge, given various assumptions on the interactions in the system. Agent-based models (ABMs) are particularly useful for studying CAS, because ABMs explicitly model interactions between autonomous agents and their environment.

    Currently, the utility of ABMs is limited by a lack of available methodologies for analysing their results. The main tool for analysing CAS models is sensitivity analysis. Yet, standard methods of sensitivity analysis are not well-suited to deal with the complexity of ABMs. Thus, there is a need for sensitivity analysis methodologies that are specifically developed for analysing ABMs. The objective of this thesis is to contribute such methodologies. Specifically, we propose methodologies for (1) detecting tipping points, (2) analysing the effects of agent adaptation, and (3) analysing resilience of ABMs.

    Chapter 2 introduces traditional methods of sensitivity analysis. These methods are demonstrated by applying them to rank the most influential parameters of an ODE model of predator-prey interaction. Furthermore, the role of sensitivity analysis in model validation is discussed.

    In Chapter 3 we investigate the use of sensitivity analysis for detecting tipping points. Whereas bifurcation analysis methods are available for detecting tipping points in ODE models, these methods are not applicable to ABMs. Therefore, we use an ODE model to verify the results from sensitivity analysis against those of bifurcation analysis. We conclude that one-factor-at-a-time sensitivity analysis (OFAT) is a helpful method for detecting tipping points. However, OFAT is a local method that considers only changes in individual parameters. It is therefore recommended to supplement OFAT with a global method to investigate interaction effects. For this purpose, we recommend all-but-one-at-a-time sensitivity analysis (ABOS) as a graphical sensitivity analysis method that takes into account parameter interactions and can help with the detection of tipping points.
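    The OFAT procedure described above can be illustrated on a minimal model with a known tipping point. The sketch below is our own illustration, not code from the thesis: it varies a single harvest parameter h of a logistic growth model with constant harvesting, dx/dt = r·x·(1 − x/K) − h. The long-run population collapses once h exceeds r·K/4, and an OFAT sweep picks this up as a sudden jump in the model output.

```python
import numpy as np

def long_run_population(h, r=1.0, K=1.0, x0=0.9, dt=0.01, steps=10000):
    """Integrate dx/dt = r*x*(1 - x/K) - h with forward Euler, clamped at 0."""
    x = x0
    for _ in range(steps):
        x = max(0.0, x + dt * (r * x * (1 - x / K) - h))
    return x

# OFAT sweep over the single parameter h, all other parameters at defaults.
h_values = np.linspace(0.0, 0.4, 41)
outputs = np.array([long_run_population(h) for h in h_values])

# A tipping point shows up as the largest jump between consecutive outputs;
# for r = K = 1 the fold bifurcation sits at h = r*K/4 = 0.25.
jump_at = h_values[np.argmax(np.abs(np.diff(outputs)))]
```

    As the chapter notes, such a sweep is local: it would miss a tipping point reachable only by changing two parameters jointly, which is why a global supplement such as ABOS is recommended.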

    In Chapter 4 we introduce a basic ABM of agents competing in a spatial environment for a renewable resource. This basic model is extended in the subsequent chapters and serves as a test case for various sensitivity analysis methods. In Chapter 4, it is used to assess the utility of existing sensitivity analysis methods for ABMs. The results show that traditional methods of sensitivity analysis are not sufficient to analyse the ABM, due to the presence of tipping points and other strong non-linearities in the model output. In contrast, OFAT is found to be helpful for detecting tipping points, as was suggested in Chapter 3. Based on these outcomes, OFAT is recommended as a starting point for sensitivity analysis of ABMs, preferably supplemented by a global method to investigate interaction effects.

    In Chapter 5 we extend the ABM of Chapter 4 by adding agent adaptation in the form of a mechanism of natural selection. On short time-scales, the model behaviour appears similar to that of the non-adaptive model version. On longer time-scales, agent adaptation causes the state of the model to change gradually as agents continue to adapt to their surroundings. We propose a sensitivity analysis method to measure the effects of this adaptation, based on a quantification of the difference between the probability density functions of model versions with and without adaptation. Using this method, we show that adaptation increases the resilience of the system by giving it the flexibility needed to respond to pressures.
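    The idea of comparing probability density functions of model output can be sketched with a generic distance between two output samples. The histogram-based total-variation estimate below is our own minimal illustration of that idea, not the protocol from the thesis.

```python
import numpy as np

def pdf_distance(samples_a, samples_b, bins=30):
    """Estimate the total-variation distance between the densities
    underlying two samples, via histograms on a common range.
    Returns a value in [0, 1]: 0 = identical, 1 = disjoint."""
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    pa, _ = np.histogram(samples_a, bins=bins, range=(lo, hi), density=True)
    pb, _ = np.histogram(samples_b, bins=bins, range=(lo, hi), density=True)
    width = (hi - lo) / bins
    return 0.5 * np.sum(np.abs(pa - pb)) * width
```

    Applied to repeated runs of an ABM with and without adaptation, a distance near zero would indicate that adaptation leaves the output distribution unchanged, while larger values quantify its effect.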

    In Chapter 6 we further extend the test case by giving agents the option to harvest either cooperatively or individually. Cooperation increases the potential yields, but introduces the risk of defection by the interaction partner. It is shown that ecological factors, which are usually not considered in models of cooperation, strongly affect the level of cooperation in the system. For example, low levels of cooperation lead to a decreased population size and cause the formation of small groups of agents with a higher level of cooperation. As a result, cooperation persists even without any mechanisms to promote it. Nevertheless, the inclusion of such mechanisms in the form of indirect reciprocity does further increase the level of cooperation. Furthermore, we show that the resulting high levels of cooperation, depending on the circumstances, can increase the resilience of the agent population against shocks.

    To conclude, in this thesis several methodologies have been proposed to help with ABM analysis. Specifically, OFAT and ABOS are recommended for detecting tipping points in ABMs, and in Chapter 5 a protocol is introduced for quantifying the effects of adaptation. By suggesting these methodologies, this thesis aims to contribute to the utility of ABMs, especially for studying CAS.

    Using probabilistic graphical models to reconstruct biological networks and linkage maps
    Wang, Huange - 2017
    Wageningen University. Promotor(en): F.A. Eeuwijk, co-promotor(en): J. Jansen. - Wageningen : Wageningen University - ISBN 9789463431538 - 150
    probabilistic models - models - networks - linkage - mathematics - statistics - quantitative trait loci - phenotypes - simulation

    Probabilistic graphical models (PGMs) offer a conceptual architecture where biological and mathematical objects can be expressed with a common, intuitive formalism. This facilitates the joint development of statistical and computational tools for quantitative analysis of biological data. Over the last few decades, procedures based on well-understood principles for constructing PGMs from observational and experimental data have been studied extensively, and they thus form a model-based methodology for analysis and discovery. In this thesis, we further explore the potential of this methodology in systems biology and quantitative genetics, and illustrate the capabilities of our proposed approaches by several applications to both real and simulated omics data.

    In quantitative genetics, we partition phenotypic variation into heritable (genetic) and non-heritable (environmental) parts. In molecular genetics, we identify chromosomal regions that drive genetic variation: quantitative trait loci (QTLs). In systems genetics, we would like to answer the question of whether relations between multiple phenotypic traits can be organized within wholly or partially directed network structures. Directed edges in those networks can be interpreted as causal relationships, causality meaning that the consequences of interventions are predictable: phenotypic interventions in upstream traits, i.e. traits occurring early in causal chains, will produce changes in downstream traits. The effect of a QTL allele can be considered to represent a genetic intervention on the phenotypic network. Various methods have been proposed for statistical reconstruction of causal phenotypic networks exploiting previously identified QTLs. In chapter 2, we present a novel heuristic search algorithm, the QTL+phenotype supervised orientation (QPSO) algorithm, to infer causal relationships between phenotypic traits. Our algorithm shows good performance in the common, but so far not covered, case where some traits come without QTLs. Our algorithm is therefore especially attractive for applications involving expensive phenotypes, like metabolites, where relatively few genotypes can be measured and population size is limited.

    Standard QTL mapping typically models phenotypic variation observable in nature in relation to genetic variation in gene expression, regardless of multiple intermediate-level biological variations. In chapter 3, we present an approach integrating Gaussian graphical modeling (GGM) and causal inference for simultaneous modeling of multilevel biological responses to DNA variations. More specifically, for ripe tomato fruits, the dependencies of 24 sensory traits on 29 metabolites, and the dependencies of all the sensory and metabolic traits further on 21 QTLs, were investigated by three GGM approaches: (i) lasso-based neighborhood selection in combination with a stability approach to regularization selection, (ii) the PC-skeleton algorithm, and (iii) the lasso in combination with stability selection, each followed by the QPSO algorithm. The inferred dependency network, though not essentially representing biological pathways, suggests how the effects of allele substitutions propagate through multilevel phenotypes. Such simultaneous study of the underlying genetic architecture and multifactorial interactions is expected to enhance the prediction and manipulation of complex traits. It is applicable to a range of population structures, including offspring populations from crosses between inbred parents and outbred parents, association panels and natural populations.
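    The conditional-independence reasoning behind GGM approaches such as neighborhood selection and the PC-skeleton can be seen in a tiny synthetic example (our own, not the tomato data): in a chain X → Y → Z, X and Z are marginally correlated, but their partial correlation given Y, read off the inverse covariance (precision) matrix, is near zero, so no direct X–Z edge is inferred.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)
y = 0.8 * x + rng.normal(size=n)           # Y depends on X
z = 0.8 * y + rng.normal(size=n)           # Z depends on Y only
data = np.column_stack([x, y, z])

# In a Gaussian graphical model, zeros of the precision matrix
# (inverse covariance) encode conditional independencies.
precision = np.linalg.inv(np.cov(data, rowvar=False))

def partial_corr(prec, i, j):
    """Partial correlation of variables i and j given all others."""
    return -prec[i, j] / np.sqrt(prec[i, i] * prec[j, j])

marginal_xz = np.corrcoef(x, z)[0, 1]       # clearly non-zero
partial_xz = partial_corr(precision, 0, 2)  # near zero: X independent of Z given Y
```

    Lasso-based neighborhood selection pursues the same goal by regressing each variable on all others with an L1 penalty and keeping the variables with non-zero coefficients as neighbors.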

    In chapter 4, we report a novel method for linkage map construction using probabilistic graphical models. It has been shown that linkage map construction can be hampered by the presence of genotyping errors and chromosomal rearrangements such as inversions and translocations. Our proposed method is proven, both theoretically and practically, to be effective in filtering out markers that contain genotyping errors. In particular, it carries out marker filtering and ordering simultaneously, and is therefore superior to the standard post-hoc filtering using nearest-neighbour stress. Furthermore, we demonstrate empirically that the proposed method offers a promising solution to genetic map construction in the case of a reciprocal translocation.

    In the domain of PGMs, Bayesian networks (BNs) have proven, both theoretically and practically, to be a promising tool for the reconstruction of causal networks. In particular, the PC algorithm and the Metropolis-Hastings algorithm, which are representative of mainstream methods for BN structure learning, have reportedly been applied successfully in biology. In view of the fact that most biological systems take the form of random or scale-free networks, in chapter 5 we compare the performance of the two algorithms in constructing both random and scale-free BNs. Our simulation study shows that for either type of BN, the PC algorithm is superior to the M-H algorithm in terms of timeliness; the M-H algorithm is preferable to the PC algorithm when the completeness of reconstruction is emphasized; but when the fidelity of reconstruction is taken into account, which of the two algorithms is better varies from case to case. Moreover, whichever algorithm is adopted, larger sample sizes generally permit more accurate reconstructions, especially with regard to the completeness of the resulting networks.

    Finally, chapter 6 presents a further elaboration and discussion of the key concepts and results involved in this thesis.

    Proceedings of the XII Global Optimization Workshop, Mathematical and Applied Global Optimization, MAGO 2014
    Casado, L.G. ; García, I. ; Hendrix, E.M.T. - 2014
    Almería : Universidad de Almería - ISBN 9788416027576 - 164
    optimization - mathematics
    Kantelpunten en alternatieve evenwichten
    Hemerik, L. ; Nes, E. van; Pol, T.J. van de - 2014
    Amsterdam : Epsilon Uitgaven (Zebra-reeks 40) - ISBN 9789050411424 - 60
    hysteresis - equilibrium - tipping points - mathematics - biology - ecology - models
    Why is it so difficult to make peace again after a war has broken out? The tipping point from peace to war lies at a very different point than the tipping point from war to peace. Situations of this kind, exhibiting so-called 'hysteresis', occur in many other systems, such as ecosystems, financial markets and climate systems. How this hysteresis arises in simple models is the subject of this booklet. It clearly explains how to detect tipping points in a system that changes gradually, and how alternative equilibria can arise as a result of a slowly changing environmental variable.
    Emergent Results of Artificial Economics
    Osinga, S.A. ; Hofstede, G.J. ; Verwaart, T. - 2011
    Heidelberg : Springer Verlag (Lecture Notes in Economics and Mathematical Systems 652) - ISBN 9783642211072
    economics - management science - game theory - mathematics - computer techniques - social sciences
    Wiskunde in Werking: van A naar B
    Gee, M. de - 2011
    Utrecht : Epsilon Uitgaven (Epsilon uitgaven 70) - ISBN 9789050411271 - 480
    mathematics - equations - integral equations - differential equations - textbooks
    Introduction to Nonlinear and Global Optimization
    Hendrix, E.M.T. ; Tóth, B. - 2010
    Germany : Springer (Springer Optimization and Its Applications 37) - ISBN 9780387886695 - 208
    mathematics - operations research - programming - differential geometry
    This self-contained text provides a solid introduction to global and nonlinear optimization, giving students of mathematics and interdisciplinary sciences a strong foundation in applied optimization techniques. The book offers a unique hands-on and critical approach to applied optimization, with numerous algorithms, examples, and illustrations designed to improve the reader's intuition and develop the analytical skills needed to identify optimization problems, classify the structure of a model, and determine whether a solution fulfills optimality conditions. Key features of "Introduction to Nonlinear and Global Optimization": - Offers insights into relevant concepts such as "regions of attraction", "branch-and-bound", and "cross-cutting" methods, as well as many other useful methodologies. - Exhibits numerical examples and exercises that develop the reader's familiarity with the terminology and algorithms frequently encountered in the scientific literature. - Presents various heuristic and stochastic optimization techniques, demonstrating how each can be applied to a variety of models from biology, engineering, finance, chemistry, and economics. This book is intended to serve as a primary text in an advanced undergraduate or graduate course on nonlinear and global optimization, and requires an understanding of basic calculus and linear algebra.
    Methods for robustness programming
    Olieman, N.J. - 2008
    Wageningen University. Promotor(en): Paul van Beek, co-promotor(en): Eligius Hendrix. - S.l. : S.n. - ISBN 9789085048763 - 176
    mathematics - operations research - estimation - programming - monte carlo method - computational mathematics
    Robustness of an object is defined as the probability that the object will have the required properties. Robustness Programming (RP) is a mathematical approach to Robustness estimation and Robustness optimisation. An example, in the context of designing a food product, is finding the best composition of ingredients such that the product is optimally safe and satisfies all specifications. Another example is investment in a portfolio of stock-market shares. The number of shares to invest in is typically a controllable factor; the future share prices and the resulting portfolio return are typically uncontrollable factors. It is of interest to find the composition of shares for which the probability of reaching a predefined target return is as high as possible.
    In this research, alternative methods for Robustness Programming are developed with favourable optimisation properties for finding a design with a Robustness that is as high as possible. Some of these methods are generally applicable, while others exploit specific problem characteristics. A framework for Robustness Programming is developed for modelling design problems from a wide application area and for selecting the applicable RP methods for such design problems.
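    The robustness notion above, the probability that a design meets its target, can be estimated by plain Monte Carlo sampling of the uncontrollable factors. The portfolio sketch below is our own illustration of that idea, with made-up return distributions and an independence assumption; it is not a method from the thesis.

```python
import numpy as np

def estimate_robustness(weights, means, stdevs, target, n_samples=20000, seed=0):
    """Monte Carlo estimate of P(portfolio return >= target).

    weights: the controllable design (fraction invested per share);
    share returns are drawn independently per share (a simplifying
    assumption for this sketch)."""
    rng = np.random.default_rng(seed)
    returns = rng.normal(means, stdevs, size=(n_samples, len(weights)))
    portfolio = returns @ np.asarray(weights)
    return np.mean(portfolio >= target)

# Compare two designs against a 0% target return:
safe = estimate_robustness([1.0, 0.0], [0.05, 0.10], [0.01, 0.20], target=0.0)
risky = estimate_robustness([0.0, 1.0], [0.05, 0.10], [0.01, 0.20], target=0.0)
```

    Robustness optimisation then amounts to searching over the controllable weights for the design whose estimated probability is highest.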
    Geschiedenis van de Wiskunde in de Twintigste Eeuw, van verzamelingen tot complexiteit
    Berg, J.C. van den - 2006
    Utrecht : Epsilon (Epsilon uitgaven 57) - ISBN 9789050410885 - 146
    mathematics - history
    The semi-continuous quadratic mixture design problem
    Hendrix, E.M.T. ; Casado, L.G. ; García, I. - 2006
    Wageningen : Mansholt Graduate School of Social Sciences (Working paper / Mansholt Graduate School : Discussion paper) - 17
    operations research - algorithms - decision making - mathematics
    The semi-continuous quadratic mixture design problem (SCQMDP) is described as a problem with linear, quadratic and semi-continuity constraints. Moreover, a linear cost objective and an integer-valued objective are introduced. The research question is how to deal with the SCQMD problem from a Branch-and-Bound perspective while generating robust solutions. An algorithm is outlined which is rigorous in the sense that it identifies instances where decision makers tighten requirements such that no ε-robust solution exists. The algorithm is tested on several cases derived from industry.
    Variation in rank abundance replicate samples and impact of clustering
    Neuteboom, J.H. ; Struik, P.C. - 2005
    NJAS Wageningen Journal of Life Sciences 53 (2005)2. - ISSN 1573-5214 - p. 199 - 221.
    data processing - mathematics - probability - statistics - population dynamics - population ecology - graphs - species-area - occupancy - patterns
    Calculating a single-sample rank abundance curve by using the negative-binomial distribution provides a way to investigate the variability within rank abundance replicate samples and yields a measure of the degree of heterogeneity of the sampled community. The calculation of the single-sample rank abundance curve is used in combination with the negative-binomial rank abundance curve-fit model to analyse the principal effect of clustering on the species-individual (S-N) curve and the species-area curve. With the usual plotting of S against log N or log area, assuming that N is proportional to area, S-N curves and species-area curves are the same curves with only a shifted horizontal axis. Clustering results in a lower recorded number of species in a sample and stretches the S-N curve and species-area curve over the horizontal axis to the right. In contrast to what is suggested in the literature, we surmise that the effect of clustering on both curves will gradually fade away with increasing sample size. Since the slopes of the curves are not constant, they cannot be used as species diversity indices or as site discriminants. S-N curves and species-area curves cannot be extrapolated.
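    The effect of clustering on the recorded number of species can be made concrete with the standard negative-binomial zero term: if individuals of a species occur with mean density m per sample and clustering parameter k, the probability of missing the species is (1 + m/k)^(−k), which grows as k shrinks (stronger clustering). The snippet below illustrates this textbook relation; it is our own sketch, not the authors' curve-fit model.

```python
def prob_absent(m, k):
    """Negative-binomial probability of recording zero individuals of a
    species with mean m per sample and clustering parameter k."""
    return (1.0 + m / k) ** (-k)

def expected_species(mean_abundances, k):
    """Expected number of species recorded in one sample."""
    return sum(1.0 - prob_absent(m, k) for m in mean_abundances)

community = [1.0] * 50                              # 50 species, mean 1 each
clustered = expected_species(community, k=0.5)      # strong clustering
random_like = expected_species(community, k=100.0)  # near-Poisson (random)
```

    With identical true richness, the clustered community yields a markedly lower expected species count per sample, which is the stretching of the S-N curve described in the abstract.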
    Current Themes in Theoretical Biology : A Dutch Perspective
    Reydon, T.A.C. ; Hemerik, L. - 2005
    [S.l.] : Springer - ISBN 9781402029011 - 310
    biology - philosophy - evolution - statistics - ecology - mathematics
    Wilcoxon twee steekproeven toets: het toetsen van verschillen
    Hemerik, L. - 2003
    Wageningen : VWO-campus (Lesbrieven VWO-campus 6) - 21
    mathematics - statistics - probability analysis
    Automatic differentiation algorithms in model analysis
    Huiskes, M.J. - 2002
    Wageningen University. Promotor(en): J. Grasman; A. Stein; M. de Gee. - S.l. : S.n. - ISBN 9789058086013 - 152
    mathematics - mathematical models - differentiation - algorithms - computer analysis - statistical inference - statistical analysis - sensitivity
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods are combined to develop efficient tools for model analysis. Automatic differentiation algorithms comprise a class of algorithms aimed at the derivative computation of functions that are represented as computer code. Derivative-based methods that may be implemented using these algorithms are presented for sensitivity analysis and statistical inference, particularly in the context of nonlinear parameter estimation.
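    The core idea of forward-mode automatic differentiation, propagating a derivative value through every operation the code performs, fits in a few lines. The dual-number sketch below is a generic illustration of the principle, our own and not the thesis's C++ library.

```python
class Dual:
    """A dual number carrying a value and its derivative with respect
    to the chosen input; arithmetic applies the chain rule step by step."""
    def __init__(self, value, deriv=0.0):
        self.value = value
        self.deriv = deriv
    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)
    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __sub__(self, other):
        other = self._wrap(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)
    def __mul__(self, other):  # product rule
        other = self._wrap(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with a unit derivative seed and read off df/dx."""
    return f(Dual(x, 1.0)).deriv
```

    Unlike finite differences, the derivative obtained this way is exact up to floating-point rounding, which is what makes AD attractive for sensitivity analysis and parameter estimation.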

    Local methods of sensitivity analysis are discussed for both explicit and implicit relations between variables. Particular attention is paid to propagation of uncertainty, and to the subsequent uncertainty decomposition of output uncertainty in the various sources of input uncertainty.

    Statistical methods are presented for the computation of accurate inferential information for nonlinear parameter estimation problems by means of higher-order derivatives of the model functions. Methods are also discussed for the assessment of the appropriateness of model structure complexity in relation to quality of data.

    To realize and demonstrate the potential of routines for model analysis based on automatic differentiation a software library is developed: a C++ library for the analysis of nonlinear models that can be represented by differentiable functions in which the methods for parameter estimation, statistical inference, model selection and sensitivity analysis are implemented. Several experiments are performed to assess the performance of the library. The application of the derivative-based methods and the routines of the library is further demonstrated by means of a number of case studies in ecological assessment. In two studies, large parameter estimation procedures for fish stock assessment are analyzed: for the Pacific halibut and North Sea herring species. The derivative-based methods of sensitivity analysis are applied in a study on the contribution of Russian forests to the global carbon cycle.

    Striking the metapopulation balance : mathematical models & methods meet metapopulation management
    Etienne, R.S. - 2002
    Wageningen University. Promotor(en): J.A.P. Heesterbeek; J. Grasman. - S.l. : S.n. - ISBN 9789058085986 - 205
    mathematics - mathematical models - populations - nature conservation - ecology - fragmentation - population dynamics - networks

    There are two buzz words in nature management: fragmentation and connectivity. Not only (rail) roads, but also agricultural, residential and industrial areas fragment previously connected (or even continuous) habitat. Common sense tells us that the answer to habitat fragmentation is defragmentation and hence much effort is put into building corridors, of which fauna crossings are just one example. Corridors are conduits connecting two pieces of habitat through an environment of hostile non-habitat. As such, the use of corridors need not be restricted to the animal kingdom; plants can also use them as stepping-stones for their seeds, enabling them to colonize distant habitat. Although corridors may not only act as conduits but also as habitat, filters or even as barriers, in most cases they are constructed primarily for their conduit function. Connectivity is nowadays taken to its extreme in the "Ecologische Hoofdstructuur" (Ecological Main Structure) in The Netherlands. This is a plan in operation to create an extensive ecological structure by connecting a substantial part of the remaining "natural" habitat, which includes conduits of decommissioned farmland bought by the government. Similar plans exist in other parts of the world.

    Needless to say, there are good reasons for building corridors and plans involving them. Yet, there are some valid arguments against connecting everything. The risk of spreading of infectious diseases through these corridors is one of the most prominent arguments. The spread of the effects of (natural) catastrophes such as fire is another. But even when dismissing such negative effects of connectivity, there may be other mitigating measures which are much more efficient (and less expensive) than building corridors. The question whether this is the case and how alternatives should be compared stimulated the work for this thesis.

    A theory that is well suited for predicting the effects of fragmentation is metapopulation theory. As almost every text on metapopulations will tell you, this theory was conceived by Richard Levins in 1969-1970, although its roots may be found in earlier work. The core of the theory is the following observation. Populations are assumed to live in distinct habitat fragments, called patches. These local populations can go extinct relatively quickly, but immigration from other patches can lead to recolonization of empty patches. Thus, the whole population of populations, the metapopulation, can potentially persist if these recolonizations outweigh the extinctions of local populations. In a sense, the population spreads the risk of extinction by spatial separation. The basic model of the theory captures these processes in a simple ordinary differential equation.
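    The simple ordinary differential equation referred to is the Levins model, dp/dt = c·p·(1 − p) − e·p, where p is the fraction of occupied patches, c the colonization rate and e the local extinction rate; the nontrivial equilibrium p* = 1 − e/c exists, and the metapopulation persists, precisely when c > e. A minimal numerical check of both regimes:

```python
def simulate_levins(c, e, p0=0.1, dt=0.01, steps=10000):
    """Forward-Euler integration of the Levins model
    dp/dt = c*p*(1-p) - e*p, returning the final occupancy."""
    p = p0
    for _ in range(steps):
        p += dt * (c * p * (1.0 - p) - e * p)
    return p

persisting = simulate_levins(c=0.5, e=0.1)  # approaches p* = 1 - e/c = 0.8
declining = simulate_levins(c=0.1, e=0.2)   # c < e: occupancy decays to 0
```

    The ratio c/e plays the role of a basic reproduction number for patches, which is the interpretation behind the colonization potential used later in the thesis.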

    This thesis consists of eight chapters and is divided into four parts. Part I, containing only one chapter, can be regarded as a review of fundamental metapopulation processes, set in the context of a persistent problem in conservation science, the SLOSS problem. This problem, of which the acronym stands for Single Large Or Several Small, raises the question whether the optimal design of a habitat network consists of a single large nature reserve or several small reserves. Although this question was initially concerned with biodiversity (which design can contain the largest number of species?), it can be equally well applied to a single species living in a metapopulation for which the question becomes: which design optimizes the persistence of the species?

    Defined thus, it represents a fine example of opposing processes requiring mathematical modelling. On the one hand, patches must be as large as possible to minimize the risk of local extinction; on the other hand, there must be as many patches as possible to maximize the probability of recolonization and to minimize the risk of simultaneous extinction. Precise mathematical formulation of these thoughts can in principle lead to a solution of the problem. Sometimes the mathematical formulation requires that the question be expressed differently or more clearly. In chapter 1 SLOSS is replaced by the more neutral FLOMS, short for Few Large Or Many Small, because in the chosen framework a single large patch is not really possible (it exists only in a limit).

    Which design is optimal turns out, not completely surprisingly, to depend upon the measure one employs for metapopulation persistence. Two measures are introduced: the metapopulation extinction time and the colonization potential, which is a type of basic reproduction number (the number of patches colonized by a local population during its lifetime in an environment where all other patches are empty). These measures return in subsequent chapters.

    Which design is optimal also depends on how designs with different size and number of patches are compared. In chapter 1 this is done such that the amount of habitat per unit area is constant. This implies that few large patches have larger interpatch distances than many small patches.

    The two measures are functions of the extinction and colonization rates of the metapopulation. Several mechanisms for the extinction and colonization processes are formulated, from which the dependence of these rates on patch size is calculated. It turns out that the metapopulation extinction time generally increases with patch size for all mechanisms, which supports a preference for few large patches. However, the colonization potential supports this preference only in the case of some special, rather unrealistic, mechanisms. In many other, more realistic, cases an intermediate patch size exists for which metapopulation persistence measured by the colonization potential is optimal.

    Part II concentrates on the Levins model. Models are often considered inadequate because the underlying assumptions are thought to be unrealistic. Yet, these assumptions can be formulated in a way that is stronger than necessary for the development of the mathematical model. Therefore, they need to be subjected to careful scrutiny, such that all superfluous elements are eliminated. If the model is still discarded, at least it is so for the right reasons.

    In chapter 2 the assumptions of the Levins model are examined. One of the assumptions as it often appears in the literature proves to be too strong: After colonization, the newly born population need not grow to the carrying capacity. It is sufficient if local dynamics are fast enough for a steady population size distribution to be established. It follows that patches need not all have the same extinction and colonization rates, but merely that these form a steady distribution depending on the population size distribution. The extinction and colonization rates in the Levins model are weighted averages over these distributions. Although this does not make the model much more realistic, it does remove restrictions on more realistic extensions of the Levins model. Three such extensions are studied: two extensions in chapter 2, involving the rescue effect and the patch preference effect, and one in chapter 3, dealing with the Allee effect. The first and third are attempts at a more careful and more mechanistic formulation of already existing models. Although the conclusions remain basically the same in these new formulations, they provide more insight in the responsible processes and are scientifically and aesthetically more satisfactory.

    The second extension of the Levins model in chapter 2 incorporates preference for occupied or empty patches. Preference for occupied patches may arise through conspecific attraction; preference for empty patches seems plausible for territorial species. Preference for empty patches is shown to increase patch occupancy; preference for occupied patches lowers it. Chapter 2 also briefly studies patch preference and the rescue effect simultaneously, because it is not a priori evident how the rescue effect interacts with patch preference. On the one hand, empty patches should be preferred, because colonization of empty patches is the only way in which the metapopulation can reproduce. On the other hand, additional colonization of occupied patches prolongs survival of the local population through the rescue effect. It turns out that the effects are almost additive as far as patch occupancy is concerned. This could, however, be quite different if the metapopulation extinction time is taken as the measure of metapopulation persistence.
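    One simple way to encode patch preference, offered here purely as an illustrative sketch (the exact formulation of chapter 2 is not reproduced in this summary), is to weight the fraction of colonization effort that reaches empty patches: f(p) = w_e(1-p) / (w_e(1-p) + w_o p), giving dp/dt = c p f(p) - e p. The weights w_e and w_o are hypothetical parameters for this sketch.

```python
# Illustrative patch-preference variant of the Levins model (NOT necessarily
# the formulation of chapter 2).  Colonizers land on empty patches with
# weight w_empty and on occupied patches with weight w_occupied, so a
# fraction f(p) = w_empty*(1-p) / (w_empty*(1-p) + w_occupied*p) of the
# colonization effort reaches empty patches: dp/dt = c*p*f(p) - e*p.

def equilibrium_occupancy(c, e, w_empty, w_occupied, tol=1e-9):
    """Solve c*f(p) = e for p in (0, 1) by bisection (f is decreasing in p)."""
    def g(p):
        f = w_empty * (1 - p) / (w_empty * (1 - p) + w_occupied * p)
        return c * f - e
    lo, hi = 1e-12, 1.0 - 1e-12
    if g(lo) <= 0:            # colonization too weak even at low occupancy
        return 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

c, e = 0.5, 0.2
no_pref    = equilibrium_occupancy(c, e, 1.0, 1.0)  # classical Levins: 0.6
empty_pref = equilibrium_occupancy(c, e, 2.0, 1.0)  # prefers empty patches
occ_pref   = equilibrium_occupancy(c, e, 1.0, 2.0)  # prefers occupied patches
print(no_pref, empty_pref, occ_pref)
```

    With equal weights the classical equilibrium 1 - e/c is recovered; raising w_e increases occupancy and raising w_o lowers it, in line with the qualitative result stated above.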

    Most metapopulation models, particularly Levins-type models, are used only to study equilibria. The last chapter of part II, chapter 4, deals with non-equilibria and their consequences for metapopulation management, using both the Levins model and its stochastic counterpart. These non-equilibria are created by imposing sudden changes in patch number and in the colonization and extinction parameters on systems in equilibrium. One of the most striking results is that if we want to counteract the effects of habitat loss or increased dispersal resistance, the optimal conservation strategy is not to restore the original situation (that is, to create habitat or decrease resistance against dispersal), but rather to improve the quality of the remaining habitat in order to decrease the local extinction rate. Optimality here pertains to the metapopulation extinction time computed with the stochastic model. Chapter 4 also shows that using the relaxation time of the deterministic Levins model as a surrogate for the metapopulation extinction time is not always warranted, which is not totally surprising, yet still somewhat disappointing, because the metapopulation extinction time is often hard to compute.
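    The stochastic counterpart referred to here is commonly formulated as a birth-death process on the number k of occupied patches, and the metapopulation extinction time can be estimated with a Gillespie-type simulation. A minimal sketch with illustrative parameter values (the rate expressions are the standard finite-N analogue of the Levins model, not necessarily the thesis's exact formulation):

```python
import random

# Stochastic Levins-type model for N patches: with k patches occupied,
# colonization of an empty patch occurs at total rate c*k*(N-k)/N and
# local extinction at rate e*k.  A Gillespie simulation gives realizations
# of the metapopulation extinction time.  Parameter values are illustrative.

def extinction_time(N, c, e, k0, rng):
    """One realization; returns the time at which the last patch empties."""
    k, t = k0, 0.0
    while k > 0:
        col = c * k * (N - k) / N
        ext = e * k
        total = col + ext
        t += rng.expovariate(total)      # exponential waiting time
        if rng.random() < col / total:   # colonization event
            k += 1
        else:                            # local extinction event
            k -= 1
    return t

rng = random.Random(1)
times = [extinction_time(N=10, c=0.3, e=0.2, k0=3, rng=rng) for _ in range(100)]
print(sum(times) / len(times))  # estimated mean metapopulation extinction time
```

    Unlike the deterministic model, this process always goes extinct eventually; the mean extinction time is the persistence measure against which chapter 4 evaluates conservation strategies.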

    Chapter 4 forms a bridge between parts II and III: it introduces the stochastic approach used in the subsequent chapters and already provides a rule of thumb for metapopulation conservation, as stated above. In part III, rules of thumb that can guide the management of metapopulations play a central role. First, in chapter 5, rules of thumb are derived at the abstract level of colonization and extinction probabilities. Then, in chapter 6, some of these rules are tested at the less abstract level of two landscape characteristics that often largely determine the probabilities of colonization and extinction, viz. patch size and interpatch distance.

    The rules of thumb generated in chapter 5 can be summarized as follows: to optimize the metapopulation extinction time, decreasing the risk of local extinction is preferable to increasing the colonization probability, and this should generally be done in the least extinction-prone patches; if changing the local extinction risk is impossible, then increasing the colonization probability between the two least extinction-prone patches is most preferable. When extinction and colonization are related to patch size and interpatch distance in chapter 6 by mechanistic submodels of the corresponding processes, the last two of these rules become: the preferred strategies to optimize the metapopulation extinction time and the basic reproduction number are, firstly, increasing the size of the largest patch (which is least extinction-prone) and, secondly, decreasing the effective interpatch distance between the two largest patches. These rules are less strongly supported than those of chapter 5, and the first is even reversed if absolute (instead of relative) increases in patch size are considered. The reason is that in the mechanistic submodel for local extinction a large patch requires a large increase in size to substantially alter its local extinction probability. Since it is not a priori clear whether increases in patch size should be compared on an absolute or a relative basis, final conclusions cannot be drawn. Thus, chapters 5 and 6 are two parts of a trilogy that would be completed by a socio-politico-economic chapter taking into account, for example, the costs of habitat creation in relation to the size of the patch to which habitat is added. That is, it would then be almost completed, because there should also be an additional section on the important biological question of how ecoducts and the like change the effective interpatch distance; this is usually hidden in the parameters.
    Although the trilogy is not complete, at least more light has been shed on the range of possible final conclusions and, more importantly, on the conditions under which they are valid.

    Whereas the first three parts of this thesis deal with general models of hypothetical metapopulations, and are somewhat academic, part IV concentrates on (statistical) methodology that assists in making model predictions, illustrated by two real case studies. Chapter 7 shows, using uncertainty analysis, how the (relative) impact of human interventions can be predicted despite data of poor quality, for two amphibian species threatened by the reinstatement of an old railway track. Again, the measures employed, in this case the metapopulation extinction time and the occupancy of each local population, play a crucial role in deciding which intervention scenario is most preferable. It is also noted that the optimal scenario may differ between species, which complicates the decision-making process, because each species must then be assigned a quantity representing its importance. Furthermore, the most important source of uncertainty is not the uncertainty in the effects of the railway track on extinction and colonization, as one might expect, but the uncertainty due to the inherent stochastic nature of the model combined with the uncertainty about the default parameter settings.

    Chapter 8 demonstrates how Bayesian inference using Markov chain Monte Carlo (MCMC) simulation can help in obtaining (estimates of posterior) probability distributions of metapopulation model parameters from a dataset typical of metapopulation studies: a few years of occupancy data (presence or absence) of the tree frog in 202 patches, with many missing values. Parameter estimation methods already existed for such datasets (and surely inspired this new method), but none of them could both use all the information in the dataset and provide a joint probability distribution of the parameters rather than a point estimate. Such a joint probability distribution is necessary for model predictions that take into account the uncertainties about the model parameters. It does take considerable time to compute, however; so much that until recently it would not have been feasible within a reasonable time. The appendix of chapter 8 therefore also supplies an efficient algorithm.
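    The idea of sampling a joint posterior rather than reporting a point estimate can be illustrated with a deliberately simplified Metropolis sampler. This is not the model or algorithm of chapter 8: the synthetic occupancy counts, the binomial observation model, the equilibrium assumption p* = 1 - e/c, and the uniform prior are all assumptions made only for this sketch.

```python
import math
import random

# Toy Metropolis sampler for the joint posterior of (c, e) in a Levins-type
# model.  Assumptions for this sketch only: yearly occupied-patch counts are
# Binomial(N, p*) draws with p* = 1 - e/c, and the prior is uniform on the
# region 0 < e < c < 2.  The counts below are synthetic, not real data.

N = 202                        # number of patches, as in the tree-frog study
counts = [118, 125, 121, 130]  # synthetic yearly occupancy counts

def log_likelihood(c, e):
    if not (0.0 < e < c < 2.0):    # outside the prior support
        return -math.inf
    p = 1.0 - e / c
    ll = 0.0
    for k in counts:               # binomial coefficients cancel in MH ratios
        ll += k * math.log(p) + (N - k) * math.log(1.0 - p)
    return ll

rng = random.Random(42)
c, e = 0.5, 0.2
ll = log_likelihood(c, e)
samples = []
for step in range(20000):
    c_new = c + rng.gauss(0.0, 0.05)   # symmetric random-walk proposal
    e_new = e + rng.gauss(0.0, 0.05)
    ll_new = log_likelihood(c_new, e_new)
    if rng.random() < math.exp(min(0.0, ll_new - ll)):
        c, e, ll = c_new, e_new, ll_new
    if step >= 5000:                   # discard burn-in
        samples.append((c, e))

p_samples = [1.0 - ei / ci for ci, ei in samples]
mean_p = sum(p_samples) / len(p_samples)
print(round(mean_p, 3))  # posterior mean occupancy, close to the data mean 0.611
```

    Because the likelihood depends on (c, e) only through p*, the chain explores a ridge in parameter space; the joint samples, rather than a single best-fit pair, are what prediction under parameter uncertainty requires.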

    What does this thesis contribute to metapopulation theory and to metapopulation management? Aware that I may not be the right person to answer this question, I will nevertheless endeavor to provide an answer, at the risk of being pretentious.

    As far as metapopulation theory is concerned, I hope to have drawn attention to some underexposed aspects (the necessity of a careful definition of the SLOSS problem and the constant realization that different measures may yield different conclusions). Furthermore, I hope to have shown how existing models may be adjusted to a more satisfactory form that can be more easily extended (by incorporating the rescue and Allee effects into the Levins model). I also hope to have built more solid foundations and intermodel connections (by formulating more precise assumptions of the Levins model and examining the extensions which result when one of these assumptions is violated, by comparing the stochastic and deterministic versions of the Levins model, and by studying different modifications of the discrete-time stochastic model) and to have made some fairly original additions to the theory (patch preference, non-equilibria).

    As far as metapopulation management is concerned, I would be content if, as a result of my work, those responsible for metapopulation management thought twice before deciding upon, for example, building an ecoduct. At the same time, I would be disappointed if they followed the rules of thumb mindlessly. Along with many skeptical scientists, particularly biologists, I do not believe that there are rules of thumb that can be relied upon unconditionally. Yet, far from wanting to dispose of them altogether, I think they are very important; their value lies in summarizing a large part of our knowledge, the importance of which evidently increases with the robustness of the rules, and in provoking discussion. These discussions already commence in chapters 5 and 6, and will hopefully be taken up by others. They should address the many assumptions underlying the rules of thumb, when these assumptions are (approximately) valid and when they are clearly violated, and the extent to which such violations entail a change in the rules of thumb.

    Furthermore, I would be pleased if uncertainty analysis of metapopulation model predictions became standard, especially in situations where expert judgment is the most significant source for parameterizing a model. I hope that chapter 7 makes clear that sophisticated yet easily understandable and implementable techniques exist. Likewise, I would be satisfied if our Bayesian parameterization method came into vogue in cases where data are available. With the example of a non-standard incidence function model, I hope to have demonstrated its generality.

    Wiskunde in Werking deel 3 : functies van verscheidene variabelen
    Gee, M. de - \ 2001
    Utrecht : Epsilon Uitgaven - ISBN 9789050410694 - 357 p.
    mathematics - quantitative analysis - textbooks - functional analysis
    Wiskunde in Werking Dl. 1: vectoren en matrices toegepast
    Gee, M. de - \ 2001
    Utrecht : Epsilon (Epsilon uitgaven 48) - ISBN 9789050410632 - 288
    mathematics - vectors - matrices - textbooks - vector analysis
    The first four chapters of this book correspond, in slightly modified form, to chapters of Wiskunde in Werking (volume 28 in this series). The fifth chapter goes somewhat deeper. The classical core material from differential and integral calculus and linear algebra is divided into relatively small units that students can master through self-study. The lecturer plays a guiding rather than a lecturing role. This self-study also devotes much attention to working through concrete examples and exercises. The mathematical techniques introduced are embedded directly in applications, so that the connection between application and mathematics is at least as important as the mathematics itself.
    An analysis of the calculation of leaching and denitrification losses as practised in the NUTMON approach
    Willigen, P. de - \ 2000
    Wageningen : Plant Research International - 20
    leaching - losses - nitrogen - soil - nitrogen balance - mathematics - simulation - models - dates - nitrogen fertilizers - denitrification
    Bioinformatica : surfen op DNA
    Stiekema, W.J. - \ 1999
    Wageningen : Wageningen Universiteit - 30
    biology - computers - mathematics - data processing - dna - genes - gene mapping - bioinformatics
    Inaugural address, Wageningen University, 25 November 1999
    Fundamentals in the Design and Analysis of Experiments and Surveys - Grundlagen der Planung und Auswertung von Versuchen und Erhebungen
    Rasch, D. ; Verdooren, L.R. ; Gowers, J.I. - \ 1999
    München [etc.] : Oldenbourg - ISBN 9783486249668 - 253
    statistical analysis - mathematics - experimental design