Orientation processing by synaptic integration across first-order tactile neurons

Document Type

Article

Publication Date

12-2-2020

Journal

PLoS Computational Biology

Volume

16

Issue

12

URL with Digital Object Identifier

https://doi.org/10.1371/journal.pcbi.1008303

Abstract

Our ability to manipulate objects relies on tactile inputs from first-order tactile neurons that innervate the glabrous skin of the hand. The distal axon of these neurons branches in the skin and innervates many mechanoreceptors, yielding spatially complex receptive fields. Here we show that synaptic integration across the complex signals from the first-order neuronal population could underlie the human ability to accurately (< 3°) and rapidly process the orientation of edges moving across the fingertip. We first derive spiking models of human first-order tactile neurons that fit and predict responses to moving edges with high accuracy. We then use the model neurons to simulate the peripheral neuronal population that innervates a fingertip. We train classifiers that perform synaptic integration across the neuronal population activity, and show that synaptic integration across first-order neurons can process edge orientations with high acuity and speed. In particular, our models suggest that integration of fast-decaying (AMPA-like) synaptic inputs within short timescales is critical for discriminating fine orientations, whereas integration of slow-decaying (NMDA-like) synaptic inputs supports discrimination of coarser orientations and maintains robustness over longer timescales. Taken together, our results provide new insight into the computations occurring in the earliest stages of the human tactile processing pathway and how they may be critical for supporting hand function.
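As an illustrative sketch only, and not the authors' published implementation, the Python snippet below shows how fast- and slow-decaying exponential synaptic kernels, loosely standing in for the AMPA-like and NMDA-like inputs described in the abstract, shape the signal a downstream unit obtains by integrating spikes from a simulated afferent population. The time constants, population size, and toy edge-onset stimulus are all assumed placeholder values.

# Illustrative sketch of synaptic integration across a simulated afferent
# population: spikes are convolved with a fast ("AMPA-like") or slow
# ("NMDA-like") exponential kernel and summed across neurons.
import numpy as np

rng = np.random.default_rng(0)

DT = 0.001            # simulation step, 1 ms
T = 0.3               # 300 ms window
N_AFFERENTS = 50      # hypothetical population size
TAU_AMPA = 0.002      # fast-decaying time constant, 2 ms (assumed)
TAU_NMDA = 0.100      # slow-decaying time constant, 100 ms (assumed)

def exp_kernel(tau, dt=DT, length=0.5):
    """Causal exponential synaptic kernel with decay constant tau."""
    t = np.arange(0.0, length, dt)
    return np.exp(-t / tau)

def synaptic_drive(spikes, tau):
    """Convolve each afferent's spike train with the synaptic kernel
    and sum across the population (linear synaptic integration)."""
    kernel = exp_kernel(tau)
    drive = np.array([np.convolve(s, kernel)[: spikes.shape[1]] for s in spikes])
    return drive.sum(axis=0)

# Toy spike trains: Poisson spikes whose rate steps up as a moving "edge"
# reaches each afferent's receptive field at a slightly different time.
n_steps = int(T / DT)
time = np.arange(n_steps) * DT
onsets = rng.uniform(0.05, 0.15, size=N_AFFERENTS)
rates = 80.0 * (time[None, :] > onsets[:, None])              # Hz after onset
spikes = (rng.random((N_AFFERENTS, n_steps)) < rates * DT).astype(float)

fast = synaptic_drive(spikes, TAU_AMPA)   # sharp, short-lived population signal
slow = synaptic_drive(spikes, TAU_NMDA)   # smoother, longer-lasting signal

print(f"peak fast drive: {fast.max():.1f}, peak slow drive: {slow.max():.1f}")

In this toy setting, the fast kernel preserves fine spike timing across the population (useful for discriminating small orientation differences quickly), while the slow kernel accumulates evidence over a longer window, mirroring the trade-off the abstract attributes to AMPA-like versus NMDA-like integration.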
