Markerless vision-based localisation for autonomous inspection drones.

Date
2024-03
Publisher
Stellenbosch : Stellenbosch University
Abstract
ENGLISH ABSTRACT: Drones are becoming an increasingly popular tool for inspections because they can navigate difficult terrain and vary their altitude for comprehensive views. To move safely through an environment, a vehicle depends on its ability to localise itself. In environments where conventional Global Positioning System (GPS) localisation falls short, such as indoors, alternative strategies are needed. This thesis presents a markerless vision-based localisation algorithm that enables an autonomous inspection vehicle to determine its pose relative to a known inspection target. The algorithm estimates pose in six degrees of freedom (6DOF), which makes it well suited to unmanned aerial vehicles (UAVs). It consists of two distinct phases: an offline mapping phase, during which the environment or target object is meticulously mapped, and an online localisation phase, which provides real-time position information during inspections. In the mapping phase, video footage of the environment is captured and processed to extract features, whose 3D positions are stored in a catalogue. A convolutional neural network (CNN) is trained on feature hotspots so that these hotspots can be quickly identified and segmented during the localisation phase. The localisation phase serves fully autonomous scenarios and provides continuous localisation during inspection operations. Practical results show that the vision-based algorithm, together with the implemented extended Kalman filter (EKF), has the potential to provide real-time solutions to localisation problems. The algorithm's performance was validated in two distinct scenarios: precise localisation around inspection objects and reliable localisation within inspected environments. This research contributes to enhanced drone-led inspection capabilities across diverse environments.
AFRIKAANSE OPSOMMING: Hommeltuie word 'n toenemend gewilde instrument vir inspeksies, as gevolg van hul vermoë om moeilike terrein te navigeer en hoogtes te verander vir omvattende uitsigte. Om veilig in 'n omgewing rond te beweeg, is die voertuig afhanklik van sy vermoë om homself te lokaliseer. In omgewings waar die konvensionele Globale Posisioneringstelsel (GPS) lokalisering te kort skiet, soos binnenshuise omgewings, ontstaan die behoefte aan alternatiewe strategieë. Hierdie tesis bied 'n merkerlose visie-gebaseerde lokaliseringsalgoritme om 'n outonome inspeksievoertuig in staat te stel om sy houding relatief tot 'n bekende inspeksieteiken te bepaal. Die algoritme bied pose in ses grade van vryheid (6DOF) wat dit ideaal maak vir onbemande lugvoertuie (UAV's). Die algoritme bestaan uit twee afsonderlike fases: 'n vanlyn karteringsfase, waartydens die omgewing of teikenvoorwerp noukeurig gekarteer word, en 'n aanlyn lokaliseringsfase, wat intydse posisie-inligting tydens inspeksies bied. In die karteringsfase word beeldmateriaal van die omgewing geneem en verwerk om kenmerke te ontgin. Die kenmerke se posisies word bepaal en in 'n 3D-katalogus gestoor. 'n Konvolusionele neurale netwerk (CNN) word opgelei op kenmerk-groepe wat gebruik word om hierdie groepe vinnig te identifiseer en te segmenteer tydens die lokaliseringsfase. Die lokaliseringsfase word gebruik vir ten volle outonome scenario's en bied deurlopende lokalisering tydens inspeksiebedrywighede. Die praktiese resultate toon dat die visie-gebaseerde algoritme saam met die geïmplementeerde uitgebreide Kalman-filter (EKF) die potensiaal het om intydse oplossings vir lokaliseringsprobleme te verskaf. Die algoritme se werkverrigting is bevestig oor twee verskillende scenario's: presiese lokalisering relatief tot inspeksie-voorwerpe en betroubare lokalisering binne geïnspekteerde omgewings. Hierdie navorsing dra by tot verbeterde hommeltuig-geleide inspeksievermoëns oor diverse omgewings.
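The abstract describes an EKF that fuses the vision-based 6DOF pose measurements into a continuous real-time estimate. The sketch below is a minimal illustration of that fusion step only, not the thesis implementation: it assumes a random-walk motion model and direct full-pose measurements, and the class and parameter names (PoseEKF, process_var, meas_var) are hypothetical.

import numpy as np

# Minimal sketch (assumed, not from the thesis): EKF over a 6DOF pose
# x = [x, y, z, roll, pitch, yaw], fusing direct pose measurements from
# a vision-based localiser. Angles are treated as Euclidean for brevity.

STATE_DIM = 6

class PoseEKF:
    def __init__(self, process_var=1e-3, meas_var=1e-2):
        self.x = np.zeros(STATE_DIM)               # pose estimate
        self.P = np.eye(STATE_DIM)                 # estimate covariance
        self.Q = process_var * np.eye(STATE_DIM)   # process noise
        self.R = meas_var * np.eye(STATE_DIM)      # measurement noise

    def predict(self):
        # Random-walk model: predicted pose is unchanged, uncertainty grows.
        self.P = self.P + self.Q

    def update(self, z):
        # The vision measurement observes the full pose directly (H = I).
        H = np.eye(STATE_DIM)
        y = z - H @ self.x                         # innovation
        S = H @ self.P @ H.T + self.R              # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(STATE_DIM) - K @ H) @ self.P
        return self.x

if __name__ == "__main__":
    # Simulated noisy vision fixes around a fixed true pose.
    ekf = PoseEKF()
    rng = np.random.default_rng(0)
    true_pose = np.array([1.0, 2.0, 0.5, 0.0, 0.0, 0.1])
    for _ in range(50):
        z = true_pose + rng.normal(scale=0.1, size=STATE_DIM)
        ekf.predict()
        ekf.update(z)
    print("estimated pose:", np.round(ekf.x, 3))

In a real system the predict step would use the drone's motion model and IMU inputs, and orientation would be handled on the rotation manifold rather than as plain Euler angles; the sketch only shows how noisy per-frame vision poses are smoothed into a stable estimate.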
Description
Thesis (MEng)--Stellenbosch University, 2024.