Electronic Thesis and Dissertation Repository

Degree

Doctor of Philosophy

Program

Electrical and Computer Engineering

Supervisor

K. McIsaac

2nd Supervisor

G. R. Osinski

Joint Supervisor

Abstract

Advances in the capabilities of robotic planetary exploration missions have increased the wealth of scientific data they produce, presenting challenges for mission science and operations imposed by the limits of interplanetary radio communications. These data budget pressures can be relieved by increased robotic autonomy, both for onboard operations tasks and for decision-making in response to science data.

This thesis presents new techniques in automated image interpretation for natural scenes of relevance to planetary science and exploration, and elaborates autonomy scenarios under which they could be used to extend the reach and performance of exploration missions on planetary surfaces.

Two computer vision techniques are presented. The first is an algorithm for autonomous classification and segmentation of geological scenes, allowing a photograph of a rock outcrop to be automatically divided into regions by rock type. This important task, currently performed by specialists on Earth, is a prerequisite to decisions about instrument pointing, data triage, and event-driven operations. The approach uses a novel technique to seek distinct visual regions in outcrop photographs. It first generates a feature space by extracting multiple types of visual information from the image. Then, in a training step using labeled exemplar scenes, it applies Mahalanobis distance metric learning (in particular, Multiclass Linear Discriminant Analysis) to discover the linear transformation of the feature space which best separates the geological classes. With the learned representation applied, a vector clustering technique is then used to segment new scenes.
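As an illustration of this pipeline, and not the thesis implementation itself, the following Python sketch assumes per-pixel feature vectors have already been extracted from an outcrop image. It fits a multiclass LDA projection to labelled exemplar features, standing in for the learned Mahalanobis-style transform, and then clusters the projected features of a new scene to produce a segmentation map. The function names, the use of scikit-learn, and the choice of k-means as the clustering step are illustrative assumptions.

```python
# Minimal sketch (not the thesis implementation) of metric-learning-based
# outcrop segmentation: learn a multiclass LDA projection from labelled
# exemplar features, then cluster a new scene's features in the learned space.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans

def learn_projection(exemplar_features, exemplar_labels, n_components=None):
    """Fit multiclass LDA on labelled exemplar features; the resulting linear
    projection plays the role of a learned Mahalanobis-style transform that
    best separates the geological classes."""
    lda = LinearDiscriminantAnalysis(n_components=n_components)
    lda.fit(exemplar_features, exemplar_labels)
    return lda

def segment_scene(lda, scene_features, n_regions, image_shape):
    """Project per-pixel features of a new scene into the learned space and
    cluster them to divide the image into regions."""
    projected = lda.transform(scene_features)              # (n_pixels, n_components)
    labels = KMeans(n_clusters=n_regions, n_init=10).fit_predict(projected)
    return labels.reshape(image_shape)

# Hypothetical usage (features could be colour, texture-filter responses, etc.):
#   X_train: (n_samples, n_features) features from labelled exemplar scenes
#   y_train: (n_samples,) rock-type labels
#   X_scene: (H*W, n_features) features for every pixel of a new outcrop image
#   seg = segment_scene(learn_projection(X_train, y_train),
#                       X_scene, n_regions=3, image_shape=(H, W))
```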

The second technique interrogates sequences of images of the sky to extract, from the motion of clouds, the wind vector at the condensation level — a measurement not normally available for Mars. To account for the deformation of clouds and the ephemerality of their fine-scale features, a template-matching technique (normalized cross-correlation) is used to mutually register images and compute the clouds’ motion.
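A minimal sketch of this idea follows, again illustrative rather than the thesis implementation. It uses scikit-image's normalized cross-correlation (match_template) to measure how far a patch of cloud texture from one sky frame has shifted by the next frame; the function name, template size, and the conversion to a wind vector noted in the comments are assumptions.

```python
# Minimal sketch (not the thesis implementation) of cloud-motion estimation by
# normalized cross-correlation: match a template cut from one sky frame against
# the following frame and read off the pixel displacement of the cloud field.
import numpy as np
from skimage.feature import match_template

def cloud_displacement(frame_a, frame_b, template_size=64):
    """Estimate the (dy, dx) pixel shift of cloud features between two
    grayscale sky frames taken a short time apart."""
    h, w = frame_a.shape
    cy, cx = h // 2, w // 2
    half = template_size // 2
    # Template: a patch of cloud texture from the centre of the first frame.
    template = frame_a[cy - half:cy + half, cx - half:cx + half]
    # Normalized cross-correlation of the template against the second frame;
    # pad_input=True makes the peak index correspond to the template centre.
    ncc = match_template(frame_b, template, pad_input=True)
    peak_y, peak_x = np.unravel_index(np.argmax(ncc), ncc.shape)
    return peak_y - cy, peak_x - cx

# Hypothetical usage: given the image scale (metres per pixel at the cloud
# altitude) and the frame interval, the pixel shift converts to a wind vector.
#   dy, dx = cloud_displacement(sky_t0, sky_t1)
#   wind = (dx * metres_per_pixel / dt_seconds, dy * metres_per_pixel / dt_seconds)
```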

Both techniques are tested successfully on imagery from a variety of relevant analogue environments on Earth, and on data returned from missions to Mars. For both, scenarios are described for their use in autonomous science data interpretation, thereby automating certain steps in the process of robotic exploration.
