Research Highlights

Touch and sight in action

Published online 20 March 2019

Understanding the roles of touch and sight in day-to-day activities may help develop rehabilitation aids.

Biplab Das

Chores, such as holding a jar with one hand, reaching for it with the other, and then twisting off its lid, can often be performed without sight. But are actions performed using either touch or sight as accurate as those guided by both together?

Neuroscientists Robert Volcic and Ivan Camponogara of New York University Abu Dhabi in the United Arab Emirates found that using sight and touch together substantially improves reach-to-grasp actions, suggesting that touch plays an important role in our movements.

The team compared grasping actions in a group of young, healthy participants. When participants could both see and touch an object, their movements were faster and their grips were smaller and more accurate. The researchers say sight and touch can be flexibly combined to optimize grasping movements because each sense provides information about an object's size and position.

These findings could be used to develop new rehabilitation aids for people with Parkinson's disease or stroke survivors, who often struggle to perform basic reaching and grasping movements. They could also help people with curable congenital blindness who are treated only later in life to learn how to use touch to guide their actions. "By providing them with a rehabilitation protocol in which touch educates vision, we could speed up their learning process," explains Camponogara.

Next, the team plans to explore other facets of how the brain guides movements by integrating information from external cues sensed through vision and touch. The resulting data might be useful for building robots that learn and execute movements by interacting with their environment, says Volcic.


Camponogara, I. & Volcic, R. Grasping movements toward seen and handheld objects. Sci. Rep. 9, 3665 (2019).