Bilateral tactile input patterns decoded at comparable levels but different time scales in neocortical neurons

A new article from our lab was published today in the Journal of Neuroscience. It is called ‘Bilateral tactile input patterns decoded at comparable levels but different time scales in neocortical neurons’ and investigates to what extent individual neurons of the primary somatosensory cortex can decode contralateral and ipsilateral input.

We demonstrate that the spiking activity of single neocortical neurons in the somatosensory cortex of the rat can be used to decode patterned tactile stimuli delivered to the distal ventral skin of the second forepaw digits on both sides of the body. Although comparable levels of decoding were reached faster for contralateral input, given sufficient integration time each neuron decoded ipsilateral input with comparable accuracy. That neocortical neurons could decode ipsilateral inputs despite such small differences between the patterns suggests that S1 cortex has access to very precise information about ipsilateral events. The findings shed new light on possible network mechanisms underlying bimanual haptic processing.
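The idea that decoding accuracy grows with integration time can be illustrated with a toy simulation. This is a hedged sketch, not the paper's actual analysis: the firing rates, trial counts, and nearest-centroid decoder below are all invented for illustration, and responses are drawn as Poisson spike counts.

```python
import numpy as np

# Hypothetical illustration (not the published analysis pipeline):
# two tactile patterns evoke Poisson spiking at different assumed rates,
# and we ask how decoding accuracy grows with the integration window.
RATES_HZ = {"pattern_A": 15.0, "pattern_B": 40.0}  # assumed firing rates
N_TRIALS = 200                                      # trials per pattern

def spike_count(rate_hz, window_s, n, rng):
    """Simulate Poisson spike counts within one integration window."""
    return rng.poisson(rate_hz * window_s, size=n)

def decoding_accuracy(window_s, rng):
    """Classify each trial by the nearer class-mean spike count
    (nearest-centroid decoder; no cross-validation, just a sketch)."""
    counts = {k: spike_count(r, window_s, N_TRIALS, rng)
              for k, r in RATES_HZ.items()}
    means = {k: c.mean() for k, c in counts.items()}
    correct = 0
    for label, c in counts.items():
        for x in c:
            pred = min(means, key=lambda k: abs(x - means[k]))
            correct += (pred == label)
    return correct / (2 * N_TRIALS)

rng = np.random.default_rng(0)
for w in (0.05, 0.2, 0.8):  # seconds of integration
    print(f"window {w:4.2f} s: accuracy {decoding_accuracy(w, rng):.2f}")
```

With short windows the spike-count distributions overlap heavily and accuracy is modest; as the window lengthens the counts separate and accuracy approaches ceiling, mirroring the qualitative effect described above.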

Artificial spatiotemporal touch inputs reveal complementary decoding in neocortical neurons

A new article was published today in Scientific Reports. The article, ‘Artificial spatiotemporal touch inputs reveal complementary decoding in neocortical neurons’, unveils central brain mechanisms for touch with widespread applications for neuroprosthesis design and for understanding neurological disease.

It is the result of a collaboration with a lab from the Sant’Anna School of Advanced Studies in Pisa, Italy. Together, as the Pisa–Lund group, we generated artificial touch experiences with a bionic fingertip currently used for robotic upper-limb neuroprostheses. These artificial touch experiences were delivered to the touch-sensing nerves of the skin, as a kind of neuroscientific playback of information to the brain. Using a high-resolution analysis of how individual neurons and their connected brain networks processed this touch information, designed by neurocomputational scientist Alberto Mazzoni and physicist Anton Spanne, the groups gained an unexpected insight into the brain's representations of the external world experienced through touch. Single neurons in our brains can convey much more information than was previously thought, and can interact to generate exceptionally rich representations of sensory stimuli. This knowledge will be embodied in a novel generation of sensitive robotic hands able to convey fine tactile information to amputees. Moreover, robotic arms with human-like richness of touch could be used to perform complex tasks in services and industry.
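How much a single neuron "says" about a stimulus is commonly quantified as the mutual information between the stimulus label and the neuron's response. The sketch below is a generic plug-in estimator on simulated spike counts; the rates, trial numbers, and two-stimulus setup are assumptions for illustration, not values from the study.

```python
import numpy as np
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    probs = np.asarray([p for p in probs if p > 0])
    return -np.sum(probs * np.log2(probs))

def mutual_information(responses_by_stim):
    """Plug-in estimate I(S;R) = H(R) - H(R|S) from spike-count samples,
    assuming equiprobable stimuli (an illustrative simplification)."""
    all_counts = np.concatenate(list(responses_by_stim.values()))
    marg = Counter(all_counts)
    h_r = entropy([c / len(all_counts) for c in marg.values()])
    h_r_given_s = 0.0
    for counts in responses_by_stim.values():
        cond = Counter(counts)
        h_r_given_s += entropy([c / len(counts) for c in cond.values()]) \
            / len(responses_by_stim)
    return h_r - h_r_given_s

# Invented example: two stimuli with different assumed mean spike counts.
rng = np.random.default_rng(0)
N = 5000  # trials per stimulus
responses = {
    "A": rng.poisson(3.0, N),
    "B": rng.poisson(9.0, N),
}
print(f"I(S;R) = {mutual_information(responses):.2f} bits")
```

With two equiprobable stimuli the information is bounded by 1 bit per trial; richer stimulus sets and interacting neurons, as discussed above, raise that ceiling.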

Moreover, what started out as an approach to understanding how the brain processes artificial touch experiences also pointed to a potentially novel methodology for high-precision analysis of the integrity and health of the brain. Sensory processing depends on proper structure and dynamics in the brain's neuronal networks, and neurological diseases and psychiatric disorders disrupt these network properties. Because our method quantifies the integrity of network function, it could be applied to high-resolution, quantitative evaluation of such diseases.

The research was carried out by SSSA/EPFL, with lead scientists Calogero Oddo and Silvestro Micera, and by LU, with lead scientist Henrik Jörntell.

And for our Swedish friends, there is a press release from the Faculty of Medicine at Lund University to read.

Attention from media: