A Fully-Connected Neural Network Derived from an Electron Microscopy Map of Olfactory Neurons in Drosophila melanogaster for Odor Classification
Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
The fruit fly (Drosophila melanogaster) is well-studied; the organism has served scientists for decades in all manner of biological research - most notably, perhaps, in genetics. However, much of the neuronal "middleware" of the fruit fly is unknown: for instance, how its neural architecture gives rise to functionalities such as odor categorization. Moreover, the fruit fly neural network (FFNN) architecture holds potential as a model for Artificial Neural Networks (ANNs) - the former having been "crafted" over time by generations of evolutionary adaptation. In this work we hope to gain insight into both problem domains: first, understanding the "middleware" of the fruit fly neural network; second, constructing FFNN-derived ANNs. In particular, we recognize a new opportunity to explore these problem domains in light of recent work on the EM (Electron Microscopy) "hemibrain" - the most comprehensive (to date) EM-derived digital reconstruction of the fruit fly brain, comprising 25,000 neurons (with labels for all neurons and synapses). Using the hemibrain, we turn to the fruit fly olfactory system both to explore its neural architecture and to create an odor classifier. Our FFNN-derived ANN is - at present - fully-connected and uses the 800 most prevalent neurons in the fruit fly olfactory circuit (Antennal Lobe, Mushroom Body Calyx, and Lateral Horn); its weight values are assigned based on the number of synapses between neurons (an assumption made by the hemibrain authors). Our initial dataset for odor classification comprises 33 samples, each with 16 input components (individual resistance values from an array of 16 metal oxide sensors) and one of 4 output classes (air, ethanol, acetone, or mixed). We augment the dataset to size 33,033 by adding input noise drawn from a Gaussian (normal) distribution.
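The augmentation step above can be sketched as follows. Note that 33 samples plus 1,000 noisy copies of each give 33 × 1,001 = 33,033 samples, which matches the stated dataset size; the noise scale `noise_std` is an illustrative assumption, as the text does not specify it.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(samples, copies_per_sample=1000, noise_std=0.05):
    """Augment an (n, 16) array of sensor resistance readings with Gaussian
    input noise.

    Each original sample is kept, and `copies_per_sample` noisy copies are
    appended, so 33 samples become 33 * (1 + 1000) = 33,033 rows.
    `noise_std` is a hypothetical choice; the abstract does not state it.
    """
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    # Broadcast each sample against `copies_per_sample` independent noise draws.
    noisy = samples[:, None, :] + rng.normal(0.0, noise_std, size=(n, copies_per_sample, d))
    return np.concatenate([samples, noisy.reshape(-1, d)], axis=0)
```

Class labels would simply be repeated alongside the inputs, since additive input noise does not change a sample's class.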
Our current prototype yields greater-than-random test accuracy (37.5%, against a 25% chance baseline for four classes) after 100 epochs of training.
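The synapse-count-based weight assignment described above might be sketched as follows, as one plausible reading of the approach. The connectivity table format, neuron IDs, and the max-normalization are illustrative assumptions, not details from the paper.

```python
import numpy as np

def weights_from_synapses(synapse_counts):
    """Build an initial weight matrix from a hemibrain-style connectivity table.

    `synapse_counts` maps (pre_id, post_id) -> synapse count between two of
    the selected olfactory neurons. Weight magnitude is taken proportional to
    synapse count (the assumption the abstract attributes to the hemibrain
    authors); normalizing by the maximum count is an illustrative choice.
    Returns the weight matrix and a neuron-ID -> row/column index mapping.
    """
    ids = sorted({nid for pair in synapse_counts for nid in pair})
    idx = {nid: k for k, nid in enumerate(ids)}
    W = np.zeros((len(ids), len(ids)))
    for (pre, post), count in synapse_counts.items():
        W[idx[pre], idx[post]] = count
    max_count = W.max()
    if max_count > 0:
        W /= max_count
    return W, idx
```

In a fully-connected setting such as the one described, this matrix would serve as the initialization for the trainable layer over the 800 selected neurons, with training then adjusting the weights for the classification task.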