Advanced computations by neurons in the skin

A fundamental feature of the first neurons in the touch-processing pathway is that they branch in the skin and have many transduction sites. This study shows that this branching constitutes a peripheral neural mechanism for extracting information about the geometric features of touched objects, a capacity previously thought to be a hallmark of sophisticated processing in the cerebral cortex.

HFSP Long-Term Fellow Andrew Pruszynski and colleagues
authored on Mon, 06 October 2014

Visual and tactile perception both involve neural mechanisms that extract high-level geometric features, like the orientation of an edge, by integrating information across many low-level inputs. This feature extraction process is traditionally attributed to neurons in the cerebral cortex. However, recent evidence from the visual system suggests that this type of feature extraction occurs very early in the processing pathway, even at the level of bipolar neurons in the retina. Research carried out in the Johansson lab at Umeå University in Sweden – recently published in Nature Neuroscience – demonstrates that feature extraction also begins very early in the tactile processing pathway, literally at the most distal portion of first-order tactile neurons as they branch in the skin.

It has long been known that neurons in the tactile periphery branch extensively and innervate multiple specialized mechanoreceptive end organs, yielding complex receptive fields with many highly sensitive zones. But the functional consequences of this arrangement have remained unknown. Johansson and Pruszynski show that, for two types of neurons in the tactile periphery, integration across their complex receptive fields allows them to signal information about a canonical geometric feature – edge orientation – via both intensity and temporal codes. That is, the dendrite-like arborization of first-order tactile neurons, situated in the skin, permits them to perform computations previously thought to require integration across multiple neurons in the brain.
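The intuition behind these two codes can be sketched with a toy model (this is an illustrative simplification, not the authors' analysis): if a neuron's transduction hotspots are scattered across the skin, then a straight edge sweeping over the receptive field will activate those hotspots in an orientation-dependent sequence. The spread of activation times (a temporal code) and the number of hotspots recruited within a short window (an intensity code) both vary with edge orientation. The hotspot layout, sweep speed, and integration window below are all arbitrary assumptions for illustration.

```python
import numpy as np

# Hypothetical layout: one neuron with several transduction hotspots
# scattered across its receptive field (positions in mm).
rng = np.random.default_rng(0)
hotspots = rng.uniform(-1.0, 1.0, size=(8, 2))

def edge_response(theta, speed=10.0, window=0.05):
    """Sweep a straight edge of orientation `theta` (radians) across the
    receptive field and summarize the neuron's hypothetical response.

    The edge moves along its own normal at `speed` mm/s; a hotspot is
    'activated' when the edge passes over it, so its activation time is
    its projected position along the sweep direction divided by speed.

    Returns (latency spread in s, hotspot count within `window` s),
    standing in for a temporal code and an intensity code respectively.
    """
    normal = np.array([np.cos(theta), np.sin(theta)])
    projections = hotspots @ normal                   # position along sweep axis
    times = (projections - projections.min()) / speed # activation times, first = 0
    spread = times.max() - times.min()
    n_early = int(np.sum(times < window))
    return spread, n_early

# Different edge orientations yield different response signatures.
for deg in (0, 45, 90):
    spread, n_early = edge_response(np.radians(deg))
    print(f"edge at {deg:3d} deg: latency spread {spread:.4f} s, "
          f"{n_early} hotspots within 50 ms")
```

Because the hotspot positions are irregular, no two orientations produce the same activation sequence, so a downstream reader of this single neuron's spike train could in principle discriminate edge orientation.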

These findings help redefine the role of first-order tactile neurons, from passive wires that merely transmit touch information to the brain for further processing, to active components of the feature extraction processes critical to sensory perception and motor control. Indeed, although this study focused on the orientation of straight edges, the same mechanism likely permits first-order tactile neurons to signal information about other aspects of touched objects currently thought to require cortical processing, such as the curvature of an edge or its direction of motion and speed across the skin.

Reference

Pruszynski, J.A. & Johansson, R.S. (2014, in press). Edge-orientation processing in first-order tactile neurons. Nature Neuroscience. doi:10.1038/nn.3804.
