# Masters Degrees (Mathematical Sciences)

## Browse

### Browsing Masters Degrees (Mathematical Sciences) by Title

Now showing 1 - 20 of 308


- 3D numerical techniques for determining the foot of a continental slope (Stellenbosch : Stellenbosch University, 2004-12) Pantland, Nicolette Ariana; Van Vuuren, J. H.; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: The United Nations Convention on the Law of the Sea (UNCLOS) provides an opportunity for qualifying coastal signatory states to claim extended maritime estate. The opportunity to claim rests on the precept that in certain cases a continental shelf extends beyond the traditionally demarcated two hundred nautical mile (200M) Exclusive Economic Zone (EEZ) mark. In these cases a successful claim results in states having sovereign rights to the living and non-living resources of the seabed and subsoil, as well as the sedentary species, of the area claimed. Where the continental shelf extends beyond the 200M mark, the Foot of the Continental Slope (FoS) has to be determined as one of the qualifying criteria. Article 76 of UNCLOS defines the FoS as ". . . the point of maximum change in the gradient at its base." Currently Caris Lots is the most widely used software which incorporates public domain data to determine the FoS as a step towards defining the offshore extent of an extended continental shelf. In this software, existing methods to compute the FoS are often subjective, typically involving an operator choosing the best perceived foot point during consideration of a two dimensional profile of the continental slope. These foot points are then joined by straight lines to form the foot line to be used in the desktop (feasibility) study. The purpose of this thesis is to establish a semi-automated and mathematically based three dimensional method for determination of the FoS using South African data as a case study. Firstly, a general background of UNCLOS is given (with emphasis on Article 76), including a brief discussion of the geological factors that influence the characteristics of a continental shelf and thus factors that could influence the determination of the FoS.
Secondly, a mathematical method for determination of the surfaces of extremal curvature (on three dimensional data), originally proposed by Vanicek and Ou in 1994, is detailed and applied to two smooth, hypothetical sample surfaces. A discussion of the bathymetric data to be used for application introduces the factors to be taken into account when using extensive survey data as well as methods to process the raw data for use. The method is then applied to two sets of gridded bathymetric data of differing resolution for four separate regions around the South African coast. The ridges formed on the resulting surfaces of maximum curvature are then traced in order to obtain a foot line definition for each region and each resolution. The results obtained from application of the method are compared with example foot points provided by the subjective two dimensional method of computation within the Caris Lots software suite. A comparison of the results for the different resolutions of data is included to provide insight as to the effectiveness of the method with differing spatial coarseness of data. Finally, an indication of further work is provided in the conclusion to this thesis, in the form of a number of recommendations for possible adaptations of the mathematical and tracing methods, and improvements thereof.
- 3D position estimation of sports players through multi-view tracking (Stellenbosch : University of Stellenbosch, 2010-12) Vos, Robert (Robbie); Brink, Willie; Herbst, Ben; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: Extracting data from video streams and using the data to better understand the observed world allows many systems to automatically perform tasks that ordinarily needed to be completed by humans. One such problem with a wide range of applications is that of detecting and tracking people in a video sequence. This thesis looks specifically at the problem of estimating the positions of players on a sports field, as observed by a multi-view camera setup. Previous attempts at solving the problem are discussed, after which the problem is broken down into three stages: detection, 2D tracking and 3D position estimation. Possible solutions to each of the problems are discussed and compared to one another. Motion detection is found to be a fast and effective solution to the problem of detecting players in a single view. Tracking players in 2D image coordinates is performed by implementing a hierarchical approach to the particle filter. The hierarchical approach is chosen as it improves the computational complexity without compromising on accuracy. Finally 3D position estimation is done by multi-view, forward projection triangulation. The components are combined to form a full system that is able to find and locate players on a sports field. The overall system that is developed is able to detect, track and triangulate player positions. The components are tested individually and found to perform well. By combining the components and introducing feedback between them the results of the individual components as well as those of the overall system are improved.
- A 3D Virtual Environment Development Platform for ASD Therapy Tools (Stellenbosch : University of Stellenbosch, 2009-12) Chamberlain, Morne Edward; Van Zijl, L.; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences. Computer Science.
ENGLISH ABSTRACT: The aim of this thesis is to develop a generic 3D virtual environment development platform for autism spectrum disorder (ASD) therapy tools. The potential of using computerised therapy tools for ASD therapy is well known. However, the development of such tools is expensive and time-consuming, and is language and culture specific. This work intends to alleviate these problems. The design of the platform is based on known game engine designs, but adapted for the requirements of ASD therapy tools. It supports standard features such as 3D rendering, animation and audio output. Specific features, aimed at ASD therapy tools and educational games, included in our engine are: replays, data capturing, remote monitoring over a network and language localisation. We also implemented an input hardware abstraction layer to allow support for non-standard input peripherals in the future, without modifying existing game implementations. Furthermore, to separate the development of games and tools from the engine, we include wrapper libraries in our engine for Lua and Java. We successfully developed our engine and implemented a number of prototype therapy tools and educational games. These implementations confirmed that the engine works as expected. Some of these programs are currently in use at a local primary school.
- An adaptive feature-based tracking system (Stellenbosch : University of Stellenbosch, 2008-03) Pretorius, Eugene; Herbst, B. M.; Hunter, K. M.; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences. Applied Mathematics.
In this thesis, tracking tools are developed based on object features to robustly track the object using particle filtering. Automatic on-line initialisation techniques use motion detection and dynamic background modelling to extract features of moving objects. Automatically adapting the feature models during tracking is implemented and tested.
- Adaptive occupancy grid mapping with measurement and pose uncertainty (Stellenbosch : Stellenbosch University, 2012-12) Joubert, Daniek; Herbst, B. M.; Brink, Willie; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: In this thesis we consider the problem of building a dense and consistent map of a mobile robot’s environment that is updated as the robot moves. Such maps are vital for safe and collision-free navigation. Measurements obtained from a range sensor mounted on the robot provide information on the structure of the environment, but are typically corrupted by noise. These measurements are also relative to the robot’s unknown pose (location and orientation) and, in order to combine them into a world-centric map, pose estimation is necessary at every time step. A SLAM system can be used for this task. However, since landmark measurements and robot motion are inherently noisy, the pose estimates are typically characterized by uncertainty. When building a map it is essential to deal with the uncertainties in range measurements and pose estimates in a principled manner to avoid overconfidence in the map. A literature review of robotic mapping algorithms reveals that the occupancy grid mapping algorithm is well suited for our goal. This algorithm divides the area to be mapped into a regular lattice of cells (squares for 2D maps or cubes for 3D maps) and maintains an occupancy probability for each cell. Although an inverse sensor model is often employed to incorporate measurement uncertainty into such a map, many authors merely state or depict their sensor models. We derive our model analytically and discuss ways to tailor it for sensor-specific uncertainty. One of the shortcomings of the original occupancy grid algorithm is its inability to convey uncertainty in the robot’s pose to the map. We address this problem by altering the occupancy grid update equation to include weighted samples from the pose uncertainty distribution (provided by the SLAM system). The occupancy grid algorithm has been criticized for its high memory requirements.
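The occupancy grid update referred to here is commonly implemented in log-odds form. The sketch below is a minimal illustration with an assumed inverse sensor model; the thesis derives its own model analytically, so the probabilities used here are purely illustrative.

```python
import numpy as np

def logit(p):
    """Log-odds of a probability p."""
    return np.log(p / (1.0 - p))

def update_cell(l_prev, p_occ_given_z, l0=0.0):
    """Standard log-odds occupancy update for one grid cell:
    l_t = l_{t-1} + logit(P(occ | z_t)) - l0, with prior log-odds l0."""
    return l_prev + logit(p_occ_given_z) - l0

# Illustrative inverse sensor model values (assumed, not from the thesis):
# a cell near the measured range is likely occupied (0.7),
# a cell the beam passed through is likely free (0.3).
l = 0.0                                # prior: P(occupied) = 0.5
for p in (0.7, 0.7, 0.3):              # two "hit" readings, one "miss"
    l = update_cell(l, p)

posterior = 1.0 - 1.0 / (1.0 + np.exp(l))   # back to probability
```

Because the updates are additive in log-odds, conflicting readings simply cancel; here the net effect of two hits and one miss is the same as a single hit.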
Techniques have been proposed to represent the map as a region tree, allowing cells to have different sizes depending on the information received for them. Such an approach necessitates a set of rules for determining when a cell should be split (for higher resolution in a local region) and when groups of cells should be merged (for lower resolution). We identify some inconsistencies that can arise from existing rules, and adapt those rules so that such errors are avoided. We test our proposed adaptive occupancy grid algorithm, that incorporates both measurement and pose uncertainty, on simulated and real-world data. The results indicate that these uncertainties are included effectively, to provide a more informative map, without a loss in accuracy. Furthermore, our adaptive maps need far fewer cells than their regular counterparts, and our new set of rules for deciding when to split or merge cells significantly improves the ability of the adaptive grid map to mimic its regular counterpart.
- An age structured model for substance abuse dynamics in the Western Cape Province of South Africa (Stellenbosch : Stellenbosch University, 2017-03) Chinake, Filister; Nyabadza, Farai; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH SUMMARY: The substance abuse problem has escalated in the Western Cape province of South Africa. This has resulted in high rates of gangsterism and other problems associated with substance abuse. The problem has evolved as gangsters have aggressively extended their turf by recruiting school learners to sell drugs within school premises. The effect is that more age groups are being recruited into substance abuse. In order to reverse the current trends of substance abuse it is imperative that the dynamics of the problem are fully understood. More insight can be gained if age structure is incorporated into substance abuse models, as processes like initiation, escalation into problematic substance abuse and quitting are influenced by age. Thus we propose an age structured model of substance abuse. A form of the reproduction number R0 is calculated and the model is shown to be well posed. A suitable finite difference scheme is discussed for the numerical solution of our partial differential equations. Sensitivity analysis is undertaken using Latin Hypercube Sampling and the Partial Rank Correlation Coefficient. Parameters for the model are obtained by fitting the model to the age structured data for individuals in the rehabilitation centres. The dynamics of the model are described by the results from the numerical simulations. The model is used to predict the dynamics of substance abuse until the year 2020. Substance abuse is predicted to increase with time, with higher incidence of substance abuse expected for the older age groups.
- An algebraic framework for reasoning about security (Stellenbosch : Stellenbosch University, 2013-03) Rajaona, Solofomampionona Fortunat; Sanders, J. W.; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: Stepwise development of a program using refinement ensures that the program correctly implements its requirements. The specification of a system is “refined” incrementally to derive an implementable program. The programming space includes both specifications and implementable code, and is ordered with the refinement relation which obeys some mathematical laws. Morgan proposed a modification of this “classical” refinement for systems where the confidentiality of some information is critical. Programs distinguish between “hidden” and “visible” variables and refinement has to bear some security requirement. First, we review refinement for classical programs and present Morgan’s approach for ignorance-preserving refinement. We introduce the Shadow Semantics, a programming model that captures essential properties of classical refinement while preserving the ignorance of hidden variables. The model invalidates some classical laws which do not preserve security while it satisfies new laws. Our approach is algebraic: we propose algebraic laws to describe the properties of ignorance-preserving refinement, thus completing previously proposed laws. Moreover, we show that the laws are sound in the Shadow Semantics. Finally, following the approach of Hoare and He for classical programs, we give a completeness result for the program algebra of ignorance-preserving refinement.
- Algebraic points in tame expansions of fields (Stellenbosch : Stellenbosch University, 2021-12) Harrison-Migochi, Andrew; Boxall, Gareth John; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: We investigate the behaviour of algebraic points in several expansions of the real, complex and p-adic fields. We build on the work of Eleftheriou, Günaydin and Hieronymi in [17] and [18] to prove a Pila-Wilkie result for a p-adic subanalytic structure with a predicate for either a dense elementary substructure or a dense dcl-independent set. In the process we prove a structure theorem for p-minimal structures with a predicate for a dense independent set. We then prove quantifier reduction results for the complex field with a predicate for the singular moduli and the real field with an exponentially transcendental power function and a predicate for the algebraic numbers, using a Schanuel property proved by Bays, Kirby and Wilkie [5]. Finally we adapt a theorem by Ax [2] about exponential fields, key to the proof of the Schanuel property for power functions, to power functions.
- American Monte Carlo option pricing under pure jump Lévy models (Stellenbosch : Stellenbosch University, 2013-03) West, Lydia; Ouwehand, P. W.; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: We study Monte Carlo methods for pricing American options where the stock price dynamics follow exponential pure jump Lévy models. Only stock price dynamics for a single underlying are considered. The thesis begins with a general introduction to American Monte Carlo methods. We then consider two classes of these methods. The first class involves regression: we briefly consider the regression method of Tsitsiklis and Van Roy [2001] and analyse in detail the least squares Monte Carlo method of Longstaff and Schwartz [2001]. The variance reduction techniques of Rasmussen [2005] applicable to the least squares Monte Carlo method are also considered. The stochastic mesh method of Broadie and Glasserman [2004] falls into the second class we study. Furthermore, we consider the dual method, independently studied by Andersen and Broadie [2004], Rogers [2002] and Haugh and Kogan [March 2004], which generates a high-bias estimate from a stopping rule. The rules we consider are estimates of the boundary between the continuation and exercise regions of the option. We analyse in detail how to obtain such an estimate in the least squares Monte Carlo and stochastic mesh methods. These models are implemented using both a pseudo-random number generator, and the preferred choice of a quasi-random number generator with bridge sampling. As a base case, these methods are implemented where the stock price process follows geometric Brownian motion. However the focus of the thesis is to implement the Monte Carlo methods for two pure jump Lévy models, namely the variance gamma and the normal inverse Gaussian models. We first provide a broad discussion on some of the properties of Lévy processes, followed by a study of the variance gamma model of Madan et al. [1998] and the normal inverse Gaussian model of Barndorff-Nielsen [1995]. We also provide an implementation of a variation of the calibration procedure of Cont and Tankov [2004b] for these models.
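The least squares Monte Carlo method of Longstaff and Schwartz can be sketched as below for the geometric Brownian motion base case mentioned in this abstract. The contract parameters, step count and quadratic polynomial basis are illustrative assumptions, not the thesis’s choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed): a Bermudan put under GBM.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
steps, paths = 50, 20000
dt = T / steps
disc = np.exp(-r * dt)

# Simulate GBM paths (one column per exercise date).
z = rng.standard_normal((paths, steps))
S = S0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

payoff = np.maximum(K - S[:, -1], 0.0)      # value if held to maturity
for t in range(steps - 2, -1, -1):
    payoff *= disc                          # discount back one step
    itm = K - S[:, t] > 0                   # regress on in-the-money paths only
    if itm.any():
        x = S[itm, t]
        A = np.column_stack([np.ones_like(x), x, x**2])   # quadratic basis
        coef, *_ = np.linalg.lstsq(A, payoff[itm], rcond=None)
        cont = A @ coef                     # estimated continuation value
        exercise = (K - x) > cont
        payoff[itm] = np.where(exercise, K - x, payoff[itm])

price = disc * payoff.mean()
```

The regression at each date replaces the unknown conditional expectation of holding with a cheap polynomial fit, which is what makes backward induction feasible on simulated paths.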
We conclude with an analysis of results obtained from pricing American options using these models.
- Analysing ranking algorithms and publication trends on scholarly citation networks (Stellenbosch : Stellenbosch University, 2014-12) Dunaiski, Marcel Paul; Visser, Willem; Geldenhuys, Jaco; Stellenbosch University. Faculty of Science. Department of Mathematical Sciences.
ENGLISH ABSTRACT: Citation analysis is an important tool in the academic community. It can aid universities, funding bodies, and individual researchers to evaluate scientific work and direct resources appropriately. With the rapid growth of the scientific enterprise and the increase of online libraries that include citation analysis tools, the need for a systematic evaluation of these tools becomes more important. The research presented in this study deals with scientific research output, i.e., articles and citations, and how they can be used in bibliometrics to measure academic success. More specifically, this research analyses algorithms that rank academic entities such as articles, authors and journals to address the question of how well these algorithms can identify important and high-impact entities. A consistent mathematical formulation is developed on the basis of a categorisation of bibliometric measures such as the h-index, the Impact Factor for journals, and ranking algorithms based on Google’s PageRank. Furthermore, the theoretical properties of each algorithm are laid out. The ranking algorithms and bibliometric methods are computed on the Microsoft Academic Search citation database which contains 40 million papers and over 260 million citations that span across multiple academic disciplines. We evaluate the ranking algorithms by using a large test data set of papers and authors that won renowned prizes at numerous Computer Science conferences. The results show that using citation counts is, in general, the best ranking metric. However, for certain tasks, such as ranking important papers or identifying high-impact authors, algorithms based on PageRank perform better.
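The PageRank family of algorithms evaluated in this study can be illustrated with a short power iteration on a toy citation graph; the damping factor, dangling-node handling and the graph itself are assumptions for illustration, not the study’s configuration.

```python
import numpy as np

def pagerank(adj, d=0.85, iters=100):
    """Power iteration for PageRank; adj[i][j] = 1 if paper i cites paper j.
    Dangling papers (no outgoing citations) are treated as citing everyone."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    A[A.sum(axis=1) == 0] = 1.0            # handle dangling rows
    M = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1.0 - d) / n + d * (M.T @ r)  # rank flows along citations
    return r

# Toy citation graph: papers 1 and 2 both cite paper 0.
r = pagerank([[0, 0, 0],
              [1, 0, 0],
              [1, 0, 0]])
```

On this toy graph the cited paper accumulates the most rank, which is the basic intuition behind using PageRank instead of raw citation counts: a citation from a highly ranked paper is worth more.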
As a secondary outcome of this research, publication trends across academic disciplines are analysed to show changes in publication behaviour over time and differences in publication patterns between disciplines.
- Analysis of partner turnover rate and the lifetime number of sexual partners in Cape Town using generalized linear models (Stellenbosch : Stellenbosch University, 2017-12) Olojede, Christianah Oyindamola; Delva, Wim; Stellenbosch University. Faculty of Science. Department of Mathematical Sciences.
ENGLISH ABSTRACT: A large number of analyses have been carried out to investigate how sexually active people contracted human immunodeficiency virus (HIV) by using common indicators like the number of new sexual partners in a given year and the lifetime number of partners. In this study, the objective is to show, using generalized linear models such as the Poisson and negative binomial regression models, that these are not always good indicators, because what people report for these two indicators is neither accurate nor consistent. Generalized linear models are models that allow the distribution of the response variable to be non-normal. A cross-sectional sexual behavioural survey was conducted in communities with a high prevalence of HIV in Cape Town, South Africa, in 2011–2012. We examined the effects of age and gender on the rate at which sexual partnerships are formed, using count data regression models. The age range of respondents was 16–40 years. The highest number of new sexual relationships formed in the year preceding the survey was 11 and the highest lifetime number of sexual partners was 15. A generalized linear regression model was used to examine the consistency between the reported number of new sexual partners formed in the year preceding the survey and the reported lifetime number of partners. We also assessed the predictive power of these two indicators for the respondent’s HIV status. We found that these indicators are not consistent, and we conclude that they are not good indicators for predicting HIV status.
- Analysis of the effects of growth-fragmentation-coagulation in phytoplankton dynamics (Stellenbosch : Stellenbosch University, 2011-12) Omari, Mohamed; Banasiak, Jacek; Rewitzky, Ingrid; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences. Mathematics Division.
ENGLISH ABSTRACT: An integro-differential equation describing the dynamical behaviour of phytoplankton cells is considered in which the effects of cell division and aggregation are incorporated by coupling the coagulation-fragmentation equation with growth, and the McKendrick-von Foerster renewal model of an age-structured population. Under appropriate conditions on the model parameters, the associated initial-boundary value problem is shown to be well posed in a physically relevant Banach space using the theory of strongly continuous semigroups of operators, the theory of perturbation of positive semigroups and the semilinear abstract Cauchy problems theory. In particular, we provide sufficient conditions for honesty of the model. Finally, the results on the effects of the growth-fragmentation-coagulation on the overall evolution of the phytoplankton population are summarised.
- Analytic methods in combinatorial number theory (Stellenbosch : Stellenbosch University, 2015-12) Baker, Liam Bradwin; Wagner, Stephan; Stellenbosch University. Faculty of Science. Department Mathematical Sciences (Mathematics)
ENGLISH ABSTRACT: Two applications of analytic techniques to combinatorial problems with number-theoretic flavours are shown. The first is an application of the real saddle point method to derive second-order asymptotic expansions for the number of solutions to the signum equation of a general class of sequences. The second is an application of more elementary methods to yield asymptotic expansions for the number of partitions of a large integer into powers of an integer b where each part has bounded multiplicity.
- Analytic models of TCP performance (Stellenbosch : University of Stellenbosch, 2011-10) Kassa, Debassey Fesehaye; Krzesinski, A. E.; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences. Institute for Applied Computer Science.
ENGLISH ABSTRACT: The majority of traffic on the Internet uses the Transmission Control Protocol (TCP) as a transport layer protocol for the end-to-end control of information transfer. Measurement, simulation and analytical models are the techniques and tools that can be used to understand and investigate the Internet and its performance. Measurements can only be used to explore existing network scenarios, and otherwise become costly and inflexible with the growth and complexity of the Internet. Simulation models do not scale with the growth of network capacities and the number of users. Computationally efficient analytical models are therefore important tools for investigating, designing, dimensioning and planning IP (Internet Protocol) networks. Existing analytical models of TCP performance are either too simple to capture the internal dynamics of TCP or are too complex to be used to analyze realistic network topologies with several bottleneck links. The literature shows that the fixed point algorithm (FPA) is a very useful way of solving analytical models of Internet performance. This thesis presents fast and accurate analytical models of TCP performance with the FPA used to solve them. Apart from what is observed in the experimental literature, no comprehensive proof of the convergence and uniqueness of the FPA has been given. In this thesis we show how the FPA of analytical models of reliable Internet protocols such as TCP converges to a unique fixed point. The thesis specifies the conditions necessary in order to use the FPA for solving analytical models of reliable Internet protocols. We also develop a general implementation algorithm of the FPA of analytical models of TCP performance for realistic and arbitrary network topologies involving heterogeneous TCP connections crossing many bottleneck links.
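For network sub-models of the kind this work iterates over, the blocking (packet-drop) probability of an M/M/1/K queue has a simple closed form. The sketch below is a textbook formula under the usual assumptions (Poisson arrivals at rate λ, exponential service at rate μ, ρ = λ/μ), not the thesis’s full sub-model.

```python
def mm1k_blocking(rho, K):
    """Blocking probability of an M/M/1/K queue:
    P_K = (1 - rho) * rho**K / (1 - rho**(K + 1))  for rho != 1,
    P_K = 1 / (K + 1)                              for rho == 1."""
    if rho == 1.0:
        return 1.0 / (K + 1)
    return (1.0 - rho) * rho**K / (1.0 - rho**(K + 1))

# A router interface with room for K packets in total (one in service):
p_drop = mm1k_blocking(0.8, 10)
```

In a fixed point iteration, a drop probability like this would feed back into the TCP sub-model’s loss rate, and the TCP sub-model’s offered load would feed back into ρ, until the two agree.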
The models presented in this thesis give Internet performance metrics, assuming that only basic network parameters such as the network topology, the number of TCP connections, link capacity, distance between network nodes and router buffer sizes are known. To obtain the performance metrics, TCP and network sub-models are used. A closed network of ·/G/1 queues is used to develop each TCP sub-model, where each queue represents a state of a TCP connection. An M/M/1/K queue is used for each network sub-model, which represents the output interface of an IP router with a buffer capacity of K − 1 packets. The two sub-models are iteratively solved. We also give closed form expressions for important TCP performance values and distributions. We show how the geometric, bounded geometric and truncated geometric distributions can be used to model reliable protocols such as TCP. We give models of the congestion window cwnd size distribution by conditioning on the slow start threshold ssthresh distribution and vice versa. We also present models of the probabilities of TCP timeout and triple duplicate ACK receptions. Numerical results based on comparisons against ns2 simulations show that our models are more accurate, simpler and computationally more efficient than another well known TCP model. Our models can therefore be used to rapidly analyze network topologies with several bottlenecks and obtain detailed performance metrics.
- An application of photogrammetry in the petrochemical industry (Stellenbosch : Stellenbosch University, 2008-03) Singels, Wynand; Hunter, K. M.; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences. Applied Mathematics.
When building or improving a petrochemical plant, drawings are used extensively in the design process. However, existing petrochemical plants seldom match their drawings, or the drawings are lost, forcing the need to generate a 3D model of the structure of the plant. In this thesis photogrammetry is investigated as a method of generating a digital 3D model of an existing plant. Camera modeling, target extraction and 3D reconstruction are discussed in detail, and a real-world system is investigated.
- Applications of change of numéraire for option pricing (Stellenbosch : University of Stellenbosch, 2007-12) Le Roux, Gawie; Kopp, P. E.; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences.
The word numéraire refers to the unit of measurement used to value a portfolio of assets. The change of numéraire technique involves converting from one measurement to another. The foreign exchange markets are natural settings for interpreting this technique (but are by no means the only examples). This dissertation includes elementary facts about the change of numéraire technique. It also discusses the mathematical soundness of the technique in the abstract setting of Delbaen and Schachermayer’s Mathematics of Arbitrage. The technique is then applied to financial pricing problems. The right choice of numéraire could be an elegant approach to solving a pricing problem or could simplify computation and modelling.
- The architecture of antagonistic networks (Stellenbosch : Stellenbosch University, 2013-03) Nuwagaba, Savannah; Hui, Cang; Stellenbosch University. Faculty of Science. Department of Mathematical Sciences.
ENGLISH ABSTRACT: Designing a mechanistic model that can give rise to realistic architecture of ecological networks is central to the understanding of how species assemble and function in ecosystems. As species are constantly adjusting their diets in an antagonistic network, we here incorporate this adaptive behaviour of diet choice into a bipartite network model, with the effect of antagonistic interactions between species depicted by Holling’s type II functional response. Predictions of this model fit extremely well with the observed levels of nestedness, modularity and node-degree distributions for 61 real host-parasitoid and plant-herbivore networks. We further examined two specific scenarios of our model (species with identical [neutral] demographic parameters, and interactions with identical [neutral] benefit in the network) and found that the demography-neutral scenario overestimated observed modularity, whilst the benefit-neutral scenario overestimated observed nestedness. Relationships between nestedness, modularity and connectance were found to be strong. Moreover, in contrast to the common belief of high modularity in antagonistic networks, most real networks (> 80%) are significantly nested, whilst nearly 40% of the real networks are surprisingly less compartmentalized than random networks generated from null models. Regardless of the controversy on whether antagonistic networks are nested or compartmentalized, the proposed model captured the essence of the dynamic nature of structural emergence in antagonistic networks. Due to its predictive power, this model was further used to investigate robustness in antagonistic networks. Predictions showed that the robustness of a network is determined by many factors, such as connectance, resource degree distribution, resource-consumer ratio, diversity, nestedness and compartmentalisation.
Surprisingly, the manner of network response to species loss was independent of the sequence followed while removing species from a network. Variations were only noticed in the intensity of the effect resulting from the removals. In addition, we also showed that species extinction procedures which ignore the interaction switch underestimate the effect of any loss of species in these networks. We must therefore value our knowledge of possible adaptive processes in the ecosystem as they may be important for resolving the diversity-stability debate.
- Assessment of a measles outbreak response vaccination campaign, and two measles parameter estimation methods (Stellenbosch : Stellenbosch University, 2018-03) Azam, James Mba; Pulliam, Juliet R. C.; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
ENGLISH ABSTRACT: Measles is highly transmissible, and is a leading cause of vaccine-preventable death among children. Consequently, it is regarded as a public health issue worldwide and has been targeted for elimination by five of the six WHO regions by 2020, the exception being the WHO Africa region. The hope of achieving this target, however, seems bleak as regular outbreaks continue to occur. Data from these outbreaks are useful for pursuing important questions about measles dynamics and control. This thesis is structured to investigate two questions. The first is how well the time series susceptible-infected-recovered (TSIR) model and the removal method perform when they are used to estimate parameters from poor-quality data on measles epidemics. We simulate stochastic epidemics for four spatial patches, resembling data collected in low-income countries where resources for properly collecting and reporting data on measles epidemics are limited. We then obtain, from the simulated data sets, the size of the initial susceptible population S0 and the basic reproductive ratio R0 for the TSIR model; and S0 and either the effective reproduction number Reff or the basic reproductive ratio R0 for the removal method, depending on the simulation assumptions. To assess performance, we quantify the biases that result when we tweak some of the simulation assumptions and modify the data to put it in a form usable by each of the two methods. We find that the performance of the methods depends on the assumptions underlying the data-generation process, the degree of spatial aggregation, the chosen method of modifying the data, and the parameter being fitted. The removal-method S0 estimates at the patch level are almost unbiased when the population is naive, but are biased when aggregated to the population level, whether the population is initially naive or not.
Furthermore, the removal-method R0 and Reff estimates are generally biased. The TSIR model, on the other hand, seems more robust in estimating both S0 and R0 for non-naive populations. These findings are useful because they give us an idea of the biases in the fits of these methods to actual data of the same nature as the simulated epidemics. For the second question, we assess the impact of an outbreak response vaccination campaign organised in reaction to a measles outbreak in an all-boys high school in Stellenbosch, South Africa. We achieve this by formulating a discrete stochastic susceptible-exposed-infected-recovered (SEIR) model with daily time steps, ignoring births and deaths. Using the model, we analyse multiple scenarios that allow us to estimate the cases averted, and to predict the cases remaining until the epidemic ended and the time frame within which those cases would occur. Summarizing across scenarios, we estimate that a median of 255 cases (range 60–493) were averted. A median of 15 remaining cases (range 1–33) and a median of 4 remaining weeks (range 1–16) were expected until the epidemic ended. We conclude that the campaign was successful in averting many potential cases.

- Item: Audio-visual automatic speech recognition using Dynamic Bayesian Networks (Stellenbosch : University of Stellenbosch, 2011-03) Reikeras, Helge; Herbst, B. M.; Engelbrecht, H. A.; Du Preez, J. A.; University of Stellenbosch. Faculty of Science. Dept. of Mathematical Sciences.
Please refer to the full text to view the abstract.

- Item: Automated brick sculpture construction (Stellenbosch : Stellenbosch University, 2008-12) Smal, Eugene; Van Zijl, Lynette; Stellenbosch University. Faculty of Science. Dept. of Mathematical Sciences.
In this thesis we consider the modelling of a particular layout optimisation problem, namely, the LEGO construction problem. The LEGO construction problem, in short, concerns the optimal layout of a set of LEGO bricks to represent a given object. Our goal is to develop a software package which LEGO enthusiasts can use to construct LEGO sculptures for any real-world object. We therefore consider not only the layout optimisation problem, but also the generation of the input data required by the LEGO construction problem. We show that by using 3D geometric models to represent the real-world object, our implemented voxelisation technique delivers accurate input data for the LEGO construction problem. The LEGO construction problem has previously been solved with optimisation techniques based on simulated annealing, evolutionary algorithms, and a beam search approach. These techniques all indicate that it is possible to generate LEGO building instructions for real-world objects, albeit not necessarily in reasonable time. We show that the LEGO construction problem can be modelled easily with cellular automata, provided that cells are considered as clusters which can merge or split during each time step of the evolution of the cellular automaton. We show that the use of cellular automata gives comparable layout results in general, and improves the results in many respects. The cellular automata method requires substantially less memory and generally uses fewer LEGO bricks to construct the LEGO sculpture at comparable execution times.
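The cluster-merge step described in the abstract above, where neighbouring filled cells combine into longer bricks, can be sketched for a single row of voxels. This is a minimal illustration, not the thesis's algorithm: the allowed brick lengths and the greedy longest-first policy are assumptions made here for demonstration.

```python
# Illustrative sketch: greedily merge adjacent filled voxels in one
# row into bricks of allowed lengths. The thesis's cellular automaton
# merges and splits clusters across the whole layout per time step;
# this shows only a single-row merge with assumed brick lengths.
ALLOWED_LENGTHS = (4, 3, 2, 1)  # assumed brick sizes, longest first

def merge_row(row):
    """row: list of 0/1 voxel flags.
    Returns a list of (start_index, length) bricks covering every
    filled cell, preferring the longest allowed brick at each step."""
    bricks = []
    i = 0
    while i < len(row):
        if not row[i]:
            i += 1
            continue
        # measure the contiguous run of filled cells starting at i
        j = i
        while j < len(row) and row[j]:
            j += 1
        run = j - i
        # tile the run with the longest bricks that fit
        while run > 0:
            length = next(l for l in ALLOWED_LENGTHS if l <= run)
            bricks.append((i, length))
            i += length
            run -= length
    return bricks
```

A run of five filled cells followed by a gap and one more cell, `[1, 1, 1, 1, 1, 0, 1]`, merges into a 4-brick, a 1-brick, and a final 1-brick; repeated application of such merges (and the corresponding splits) is what drives the automaton's evolution toward a good layout.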