SUNScholar
SUNScholar is a leading digital archive for the preservation and promotion of the research output of Stellenbosch University.

Recent Submissions
Item
The effect of human immunodeficiency virus on the clinical course and treatment of gestational trophoblastic disease
(Stellenbosch : Stellenbosch University, 2025-03) Burns, Lucinda Carmen; Butt, Jennifer; Stellenbosch University. Faculty of Medicine and Health Sciences. Dept. of Obstetrics and Gynaecology.
Objectives: The objective of the study was to describe HIV-positive patients with gestational trophoblastic disease (GTD) and to determine whether it is safe to adopt a shorter surveillance strategy in these patients.
Methods: This was a retrospective descriptive study carried out at the Gynaecologic Oncology unit at Tygerberg Hospital. All cases of gestational trophoblastic disease registered and managed at this unit over the 4-year period from 1 January 2017 to 31 December 2020 were included in the study.
Results: Ninety-one women were diagnosed with GTD. Fourteen patients were excluded from the analysis. The mean age of the patients was 31.6 years. All patients had a known HIV status: eleven (14.2%) were HIV-positive and sixty-six (85.7%) were HIV-negative. The presenting features did not differ between HIV-positive and HIV-negative patients. Histologically, fifteen patients (19.5%) had partial moles and sixty (77.8%) had complete moles. There was no difference in the type of gestational trophoblastic disease between HIV-positive and HIV-negative patients (p=0.343). At 56 days after evacuation of the molar pregnancy, none of the HIV-positive patients had a BhCG level <5 mIU/ml. There was a significant difference in the mean time to BhCG <5 mIU/ml between HIV-positive and HIV-negative women (p=0.009). Persistent trophoblastic neoplasia (PTN) occurred in 16.9% (13) of the 77 included women, and 9.1% had choriocarcinoma. All persistent disease resolved after chemotherapy.
Conclusion: A shorter BhCG follow-up time of 6 months post evacuation of a complete molar pregnancy can be adopted in HIV-positive patients. In our study, however, none of the HIV-positive patients met the criterion of BhCG <5 mIU/ml within 56 days for this shorter follow-up to be beneficial. There were not enough HIV-positive patients with partial moles for meaningful analysis. It is essential that all patients be advised to adhere to BhCG follow-up, especially HIV-positive women, for whom the follow-up may be lengthy.
Item
Process Development for Ethanol Production from Cellulose-Rich Bagasse Residues after Furfural Production
(Stellenbosch : Stellenbosch University, 2025-03) Bunga, Godrick Emmanuel; van Rensburg, Eugene; Gorgens, Johann F; Stellenbosch University. Faculty of Engineering. Dept. of Chemical Engineering.
Cellulosic ethanol production is often costly, where the pretreatment process and enzymatic
hydrolysis contribute to high capital and operational costs. In this study, cellulose-rich furfural
residues (FRs) from a South African sugar mill were used as feedstock for cellulosic ethanol
production. The FRs used are the by-product of the furfural production process, which uses
sugarcane bagasse as raw material. Furfural is produced industrially by treating bagasse at 184 °C
and 10 bar for 80-90 minutes, using the by-products, acetic and formic acid, to catalyse the
process. After removal of furfural by steam-stripping, the resulting cellulose-rich FRs have the
potential to be converted into ethanol and to provide process energy. The FRs can eliminate the
need for a pretreatment process, provided that the residues exhibit excellent enzymatic
digestibility. Conversion of FRs can add value to the sugar industry, which is under increasing
pressure from economic forces in the sugar markets. This study investigated critical aspects of the
FRs-to-2G-ethanol process, aiming to achieve acceptable process performance while minimising the
operational costs of enzymes and yeasts. The latter was essential since the
hydrolytic enzymes that digest the cellulose into fermentable sugars are by far the greatest
operational cost, which, if left unchecked, would render this process economically infeasible. To
achieve this target, four commercial enzyme cocktails, namely, Cellic® CTec3, SacchariSEB C6L Plus,
Viscamyl™ Flow, and Cellic® CTec3 HS, were screened to identify the most efficient and least costly
options, together with seven Saccharomyces cerevisiae strains, of which two, namely
Cellusec® 2.0 and Cellusec® 3.3, secreted cellulase enzymes. These two strains were essential
to minimise commercial enzyme dosages while achieving attractive ethanol yield, concentration and
productivity, thereby reducing operational costs. Fermentations were performed in fed-batch
simultaneous saccharification and fermentation (SSF) mode to maximise the solids loading and
maintain an ethanol titre of >30 g/L. Whereas ethanol concentrations of 36 g/L at 89%
yield could be achieved at a high enzyme dosage of 5 FPU/g DS and a solid loading of 15% (w/w),
which served as a benchmark, the enzyme-producing yeast S. cerevisiae strain Cellusec® 3.3 could
reduce the exogenous enzyme dosage to 2 FPU/g DS, albeit at a
decreased ethanol concentration of 24.9 g/L (60% yield) at a solid loading of 15% (w/w).
Further decrease of exogenous enzyme dosage to 1.5 FPU/g DS was achieved at a high solid loading of
25% (w/w), but this was associated with accumulation of fermentation inhibitors, thus limiting the
final ethanol concentration to 19 g/L at 21% yield, with residual glucose of 45 g/L remaining at
the end of fermentation. Whereas research is ongoing on further process development to circumvent
the issue of residual glucose accumulation, an ethanol titre of 41 g/L was theoretically possible
should all the residual glucose at the end of fermentation be converted into ethanol. The study
successfully lowered commercial enzyme dosage, which could lead to reduced operational costs
associated with exogenous enzymes. Consolidated bioprocessing (CBP) yeasts that secrete cellulase
enzymes should be engineered to withstand inhibitors, to maintain
viable yeast cells and increase ethanol production.
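The theoretical 41 g/L figure quoted above can be sanity-checked with simple stoichiometry; the sketch below assumes the standard maximum fermentation yield of 0.511 g ethanol per g glucose, which the abstract does not state:

```python
# Back-of-the-envelope check of the theoretical ethanol titre quoted above.
# Assumed (not stated in the abstract): the stoichiometric maximum yield of
# glucose-to-ethanol fermentation, 0.511 g ethanol per g glucose.
final_ethanol = 19.0     # g/L measured at the end of fermentation
residual_glucose = 45.0  # g/L left unconverted
stoich_yield = 0.511     # g ethanol per g glucose (theoretical maximum)

theoretical_titre = final_ethanol + stoich_yield * residual_glucose
print(round(theoretical_titre, 1))
```

This gives roughly 42 g/L, in line with the reported 41 g/L; the small gap presumably reflects a slightly sub-stoichiometric yield assumption in the original analysis.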
Item
A Hybrid Deep Learning based Algorithm for Gamma Spectroscopy Analysis
(Stellenbosch : Stellenbosch University, 2025-03) Buckton, Calib; Wyngaardt, Shaun; Ngxande, Mkhuseli; Stellenbosch University. Faculty of Science. Dept. of Physics.
There exists a select group of unstable nuclei which undergo radioactive decay
by the emission of highly energetic photons, or γ-rays. γ spectroscopy, an
effective tool for analysing γ-emitting radioactive sources, is a common experimental
technique used in high-energy physics and environmental monitoring
of radiation. Analysis of naturally occurring radionuclides and their
decay products through γ spectroscopy has provided a basis for health and
safety regulation. The technique has seen substantial improvement in efficiency
and optimization over the past decade, driven largely by developments in
detection and analysis that build upon existing experimental methods.
Recently, there have been developments in
machine-learning-based approaches to automated and efficient detection and
radioisotope "fingerprinting"; that is, methods which use the characteristic
spectra measured from radioactive isotopes to identify them. These
methods commonly centre on a computational component, a
deep-learning algorithm known generally as a deep neural network (DNN),
which is trained on a representative dataset of energy spectra from different
isotopes, often simulation-based. Convolutional neural networks (CNNs) are the most
prominent in this application, owing to their ability to learn useful features
effectively from high-quality data and to classify or recognise radioactive species
with very good accuracy. However, a challenge arises when multiple sources are
present in a sample, as opposed to a single source: the accuracy of the network
often decreases as the number of sources increases. Additionally, developing a
CNN model which is computationally efficient, yet also accurate and robust to
noise and other environmental influences, can be difficult.
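As a toy illustration of the "fingerprinting" idea, the sketch below applies a single hand-crafted, zero-sum convolution filter (the kind of feature detector a CNN would learn from data) to a synthetic spectrum; the 1 keV-per-channel calibration, peak parameters, and isotope library are all hypothetical:

```python
import math
import random

random.seed(0)

def peak(x, centre, sigma, amp):
    """Gaussian photopeak shape."""
    return amp * math.exp(-0.5 * ((x - centre) / sigma) ** 2)

# Toy spectrum: falling continuum + a Cs-137 photopeak at channel 662 + noise.
spectrum = [200 * math.exp(-c / 300) + peak(c, 662, 4.0, 500) + random.uniform(0, 5)
            for c in range(1024)]

# Zero-sum Gaussian kernel: responds to peak-shaped features while
# suppressing the smooth continuum (a matched filter, the simplest
# analogue of one learned convolutional layer).
kernel = [math.exp(-0.5 * (k / 4.0) ** 2) for k in range(-15, 16)]
mean_k = sum(kernel) / len(kernel)
kernel = [w - mean_k for w in kernel]

def conv_valid(signal, kern):
    """1D correlation with a symmetric kernel, 'valid' region only."""
    half = len(kern) // 2
    return [sum(signal[c + j] * kern[j + half] for j in range(-half, half + 1))
            for c in range(half, len(signal) - half)]

response = conv_valid(spectrum, kernel)
peak_channel = response.index(max(response)) + len(kernel) // 2

# "Fingerprinting": match the located peak to a characteristic energy.
library = {"Cs-137": 662, "Co-60": 1173, "K-40": 1461}  # keV, illustrative
isotope = min(library, key=lambda n: abs(library[n] - peak_channel))
print(isotope, peak_channel)
```

A trained CNN replaces the hand-crafted kernel with many learned filters and a classifier head, which is what lets it handle overlapping peaks from multiple sources far better than a single matched filter can.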
Item
A meta-learning framework for link prediction algorithm selection based on network structure analysis
(Stellenbosch : Stellenbosch University, 2025-03) Brown, Lienke Marie; van Vuuren, J. H.; Nel, G. S.; Stellenbosch University. Faculty of Engineering. Dept. of Industrial Engineering.
Link prediction is an important problem in the multi-disciplinary field of network science, which is
focused on systematically inferring potential (or missing) links in graph models of networks. This
problem spans multiple domains, including social networks, biological systems, and recommender
systems, to name but a few. Link prediction is aimed at uncovering relationships that are not
immediately apparent in networks associated with these domains by leveraging techniques from
the research areas of network analysis and machine learning. The algorithms utilised for this
task may be classified into three main paradigms, namely similarity-based methods, machine
learning classifiers, and embedding-based methods. The increasing availability and structural
diversity of network data, coupled with the extensive variety of the algorithms available, demands
a systematic approach towards selecting a suitable algorithm for link prediction purposes in a
given graph model. Existing approaches towards selecting such an algorithm are often ad hoc.
As such, there is a lack of structured frameworks facilitating comprehensive
algorithmic comparisons in view of the diverse structural graph properties inherent to different
network types.
The link prediction algorithm selection problem arises from a need to identify the most appropriate
algorithm for a given network, taking into account its structural characteristics and the relative
performances of available algorithms. It has been shown that no single link prediction algorithm
consistently outperforms others with respect to all networks. In this dissertation, a framework,
called the LinkPAL framework, is proposed as a comprehensive solution to this problem. This
framework adopts a meta-learning approach towards recommending suitable link prediction
algorithms based on an analysis of relevant network meta-features, such as community structure,
degree distribution, and assortativity, to name but a few. The framework is able to predict
algorithmic performance and provides tailored recommendations for new, unseen link prediction
problem instances by training a meta-learner on a curated suite of benchmark network data
sets. The novelty of the LinkPAL framework pertains to its ability to address the link prediction
algorithm selection problem according to a structured, data-driven methodology with a view to
enhancing algorithmic performance when presented with a network data set exhibiting specific
structural graph properties.
The proposed framework operates in two distinct phases, namely an offline phase
and an online phase. During the execution of the offline phase, meta-features are extracted from
a benchmark data suite, and meta-learners are subsequently trained in respect of these data.
During the online phase, the trained meta-learners generate suitable algorithm recommendations
for new networks. This two-phased approach ensures that computationally intensive processes
are handled offline, allowing for efficient and accurate algorithm selection recommendation during
the online phase. In addition, the modular design of the framework ensures its adaptability
to future advancements in the field.
The methodological utility of the LinkPAL framework is demonstrated by means of a computerised
instantiation as a proof-of-concept decision support system by utilising open-source benchmark
problem instances. The instantiation verifies the functional correctness of the framework and
evaluates algorithmic performance both quantitatively by means of statistical analyses, and
qualitatively through visualisations of, and contextual reflections on, the output of the algorithms.
The quality of the algorithm recommendations is assessed in three evaluation scenarios, each
representing a different distribution of structural graph properties. These scenarios serve to
verify and validate the recommendations produced by the framework.
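The offline/online split described above can be illustrated with a deliberately tiny sketch; the meta-features, benchmark values, and algorithm labels below are hypothetical placeholders, not the framework's actual meta-feature suite or benchmark data:

```python
import math

def meta_features(edges, n_nodes):
    """Toy meta-feature vector for an undirected graph:
    (density, mean degree, degree variance)."""
    deg = [0] * n_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    density = 2 * len(edges) / (n_nodes * (n_nodes - 1))
    mean_deg = sum(deg) / n_nodes
    # degree variance as a crude proxy for degree-distribution shape
    var_deg = sum((d - mean_deg) ** 2 for d in deg) / n_nodes
    return (density, mean_deg, var_deg)

# Offline phase (hypothetical): benchmark networks whose best-performing
# link prediction algorithm is already known from prior evaluation.
benchmark = [
    ((0.50, 5.0, 1.0), "common-neighbours"),
    ((0.01, 2.0, 9.0), "node2vec"),
    ((0.10, 4.0, 2.0), "adamic-adar"),
]

def recommend(features):
    """Online phase: 1-nearest-neighbour lookup over benchmark meta-features."""
    return min(benchmark, key=lambda row: math.dist(row[0], features))[1]

# A new, unseen network: a 5-node cycle (every node has degree 2).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(recommend(meta_features(edges, 5)))
```

In the actual framework the meta-learner would be a trained model over many structural meta-features (community structure, assortativity, and so on) rather than a nearest-neighbour lookup, but the division of labour is the same: expensive feature extraction and training happen offline, while the online phase is a cheap prediction.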
Item
Conceptualizing headship and embodiment in Ephesians: A cognitive-linguistic approach to contexts, construal, and constructed meaning
(Stellenbosch : Stellenbosch University, 2025-03) Brown, Joel Stephen; Nagel, Peter; Stellenbosch University. Faculty of Theology. Dept. of Old and New Testament.
The laborious search for the locus of meaning continues to confound philosophers and hermeneuts alike. Cognitive Linguistics’ nascent philosophy of embodied realism promises much by grounding meaning firmly in the body, but the academic polemics continue, and cognitive-linguistic methods have not been widely embraced in theological research. The prominent Conceptual Metaphor and Conceptual Integration Theories have revitalized investigation into the unique and powerful ways humans construct meaning using metaphors. No longer viewed simply as a rhetorical device, metaphors are foundational for human creativity and conceptualization and are inextricably pervasive in the hermeneutical effort referred to as reading the Scriptures. In preparation for a journey into the ocean of human meaning-making, this study first considers the implications of embracing epistemic humility towards the frenetic conversation partner found in contemporary cognitive science.
The exploration begins with a series of cognitive-linguistic analyses of the profound construal of divinity and humanity found in the metaphors of embodiment and headship in Ephesians 1:22-23—that the universal church is a body with the glorified Jesus Christ as its head. These theologically significant utterances, and the cognitive contexts which make them meaningful, are assessed by handling papyrus 𝔓46, the oldest extant manuscript containing these linguistic promptings, as a cognitive artefact of live human communication. This study illuminates how the teleological and ontological natures of a mediative Christ are constructed through emergent conceptual metaphors that reach deeply into cultural experiences of beneficent rule, honor, and body.
Following from this, emergent blends of divine headship and embodiment are tested for their tectonic characteristics using Robert Masson’s formulation of how metaphoric conceptualizations transform conventionalized ideologies and enable the creation of new knowledge, inferences, and theological truths. A conceptual-integrative analysis of κεφᾰλή, conducted by investigating the meaningful cognitive contexts that substantiate its comprehension, illuminates the ways in which the human head is the most conceptually important, spiritually significant, physically elevated, vitally crucial, and socially identifiable container of the body. Then, the tectonic activity resulting from the conceptual blends emergent from ΠΡΟΣΕΦΕΣΙΟΥΣ in 𝔓46 is measured and verified within the ideological and theological landscapes of Early Christianity. The study then undertakes, laying aside the tools of lexical and conceptual analysis, an investigation of the realities actualized by the body itself. A synthesis of cognitive-scientific research regarding the body’s cognitive sense of self, the spiritual realities experienced in recalled experiences of death, and the epistemics of embodied cognition establishes that the body is far more capable, implicated, and involved in constructing and interpreting reality than is traditionally believed.
To bring the journey to an end, this study posits a hermeneutic of realization which recognizes our bodies not as the prison of creativity, but as the very gift of it.