Browsing by Author "Jonck, Petronella"
Now showing 1 - 2 of 2
- Item: A micro-level outcomes evaluation of a skills capacity intervention within the South African public service: towards an impact evaluation (AOSIS, 2018)
  Jonck, Petronella; De Coning, Riaan; Radikonyana, Paul S.
  Orientation: Interest in measuring the impact of skills development interventions has increased in recent years.
  Research purpose: This article reports on an outcomes evaluation, under the ambit of an impact assessment, with reference to a research methodology workshop.
  Motivation for the study: A paucity of studies could be found measuring workshop outcomes, especially as they pertain to training interventions within the public service.
  Research approach/design and method: A pretest–post-test research design was implemented. A paired-sample t-test was used to measure the increase in knowledge, while the influence of previous training was controlled for by means of an analysis of variance and a multiple regression analysis.
  Main findings: Results indicated that the increase in research methodology knowledge was statistically significant. Previous training accounted for only 0.8% of the model, which was not statistically significant.
  Practical/managerial implications: It is recommended that the suggested framework and methodology be utilised in future research, as well as in monitoring and evaluation endeavours covering various training interventions.
  Contribution/value-add: The study provides evidence of the impact generated by a training intervention within the South African public service, thereby addressing a research gap in the corpus of knowledge.
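  The abstract does not publish the underlying scores or analysis code; the sketch below is only an illustration, in Python, of the paired-sample pretest–post-test comparison it describes. All data values and variable names are hypothetical.

```python
# Illustrative sketch of a pretest-post-test paired-sample t-test,
# as described in the abstract above. The scores are hypothetical;
# the study's actual data are not published in the abstract.
import numpy as np
from scipy import stats

# Hypothetical research-methodology knowledge scores for the same
# participants, measured before and after the workshop.
pre = np.array([12, 15, 9, 14, 11, 13, 10, 16, 12, 14])
post = np.array([17, 19, 14, 18, 15, 18, 13, 20, 16, 19])

# Paired-sample t-test: tests whether the mean within-person change
# (post - pre) differs significantly from zero.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean gain = {np.mean(post - pre):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```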
- Item: A quasi-experimental evaluation of a skills capacity workshop in the South African public service (AOSIS, 2020)
  Jonck, Petronella; De Coning, Riaan
  Background: A paucity of evaluation studies could be identified that investigated the impact of training. This lacuna should be viewed in light of austerity measures, as well as the inability to measure return on investment for training expenditure, which is substantial year on year, especially in the context of the public service.
  Objectives: This article reports on an impact evaluation of a research methodology skills capacity workshop.
  Method: A quasi-experimental evaluation design was employed in which comparison groups were utilised to evaluate the impact of a research methodology skills development intervention. A paired-sample t-test was used to measure the increase in knowledge, whilst controlling for the influence of comparison groups by means of an analysis of variance. A hierarchical multiple regression analysis was performed to determine how much of the variance in research methodology knowledge could be attributed to the intervention whilst controlling for facilitator effect.
  Results: Results indicated that the intervention had a statistically significant impact on research methodology knowledge. Furthermore, the intervention group differed statistically significantly from the control and comparison groups with respect to research methodology knowledge. Facilitator effect was found to be a moderating variable. A hierarchical regression analysis performed to isolate the impact of the intervention in the absence of facilitator effect revealed a statistically significant result.
  Conclusion: The study augments the corpus of knowledge by providing evidence of training impact within the South African public service, notably by utilising a quasi-experimental pretest–post-test research design and isolating the facilitator effect from the intervention itself.
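  Again, the study's data are not published in the abstract; the sketch below only illustrates the hierarchical regression step it describes, in which the facilitator is entered first as a control and the intervention indicator is added second, so the change in R-squared isolates the intervention's contribution. All data, column names, and effect sizes are hypothetical.

```python
# Illustrative sketch of a two-step hierarchical regression controlling
# for facilitator effect, as described in the abstract above.
# Data, column names, and coefficients are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 90
df = pd.DataFrame({
    "facilitator": rng.integers(0, 3, n),   # three hypothetical facilitators
    "intervention": rng.integers(0, 2, n),  # 1 = workshop group, 0 = comparison
})
# Hypothetical post-test knowledge score with facilitator and
# intervention effects plus noise.
df["knowledge"] = (
    10 + 1.5 * df["facilitator"] + 4.0 * df["intervention"]
    + rng.normal(0, 2, n)
)

# Step 1: facilitator effect only (the control block).
step1 = smf.ols("knowledge ~ C(facilitator)", data=df).fit()
# Step 2: add the intervention whilst holding facilitator constant.
step2 = smf.ols("knowledge ~ C(facilitator) + intervention", data=df).fit()

print(f"R2 step 1 = {step1.rsquared:.3f}")
print(f"R2 step 2 = {step2.rsquared:.3f}")
print(f"Delta R2 attributable to the intervention = "
      f"{step2.rsquared - step1.rsquared:.3f}")
```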