1997
Abstract
We find it effortless to reach toward or avoid nearby objects. However, the spatial and visuo-motor computations must be quite complicated, especially since our eyes, head, limbs, body, and the objects themselves may be continually changing positions. How does the brain construct a representation of the visual space surrounding the body, and how does this representation guide movement?
1996
Abstract
In the macaque, neurons in ventral premotor cortex and in the putamen have tactile receptive fields on the face or arms, and corresponding visual receptive fields that extend outward from the tactile fields into the space near the body. For cells with tactile receptive fields on the arm, when the arm is moved, the corresponding visual receptive fields move with it. When the eyes move, however, the visual receptive fields remain stationary, “attached” to the arm. We suggest that these “arm-centered” visual responses play a role in visuo-motor guidance. We predict that other portions of the somatotopic map in premotor cortex and the putamen contain similar receptive fields, centered on the corresponding body parts. This “body-part-centered” representation of space is only one of several ways in which space is represented in the brain.
1995
Abstract
How are we able to reach accurately toward objects near us and avoid ones that are approaching, even though the objects and our own eyes, head, limbs and body may be continually changing positions? How does the brain construct a representation of the visual space surrounding the body, and how does this representation guide movement?
Abstract
Lesions of posterior parietal cortex have long been known to produce visuo-spatial deficits in both humans and monkeys. Yet there is no known “map” of space in parietal cortex. Posterior parietal cortex projects to a number of other areas that are involved in specialized spatial functions. In these areas space is represented at the level of single neurons, and in many of them there is a topographically organized map of space. These extra-parietal areas include premotor cortex and the putamen, involved in visuomotor space; the frontal eye fields and the superior colliculus, involved in oculomotor space; the hippocampus, involved in environmental space; and dorsolateral prefrontal cortex, involved in mnemonic space. In many of these areas, space is represented by means of a coordinate system that is fixed to a particular body part. Thus the processing of space is not unitary but is divided among several brain areas and several coordinate systems in addition to those in posterior parietal cortex.
1994
Abstract
Cells in the dorsal division of the medial superior temporal area (MSTd) have large receptive fields and respond to expansion/contraction, rotation, and translation motions. These same motions are generated as we move through the environment, leading investigators to suggest that area MSTd analyzes optical flow. One influential idea is that navigation is achieved by decomposing the optical flow into the separate, discrete channels mentioned above: expansion/contraction, rotation, and translation. We directly tested whether MSTd neurons perform such a decomposition by examining whether there are cells preferentially tuned to intermediate spiral motions, which combine expansion/contraction and rotation components. The finding that many cells in MSTd are preferentially selective for spiral motions indicates that this simple three-channel decomposition hypothesis for MSTd is incorrect. Instead, there appears to be a continuum of motion patterns to which MSTd cells are selective. In addition, we find that MSTd cells maintain their selectivity when stimuli are moved to different locations within their large receptive fields. This position invariance indicates that MSTd cells selective for expansion cannot give precise information about the retinal location of the focus of expansion. Thus individual MSTd neurons cannot code the direction of heading by using the location of the focus of expansion; the only way this navigational information could be derived from MSTd is through a coarse population encoding. Position invariance and selectivity for a wide array of stimuli suggest that MSTd neurons encode patterns of motion per se, regardless of whether these motions are generated by moving objects or induced by observer locomotion.
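The continuum described above can be sketched numerically: an expansion field and a rotation field are simply the radial and tangential vector fields around a center, and a spiral stimulus is a weighted combination of the two. The sketch below is only illustrative (the function and parameter names are ours, not a description of the actual stimuli used):

```python
import numpy as np

def flow_field(theta, xs, ys):
    """Velocity field for a point on the expansion/rotation continuum.

    theta = 0 gives pure expansion, theta = pi/2 gives pure rotation,
    and intermediate angles give spiral motions. Illustrative sketch;
    not the authors' stimulus-generation code.
    """
    # radial (expansion) component mixed with tangential (rotation) component
    vx = np.cos(theta) * xs - np.sin(theta) * ys
    vy = np.cos(theta) * ys + np.sin(theta) * xs
    return vx, vy

# dot positions on a grid centered on the field's center
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
vx, vy = flow_field(np.pi / 4, xs, ys)  # 45 deg: an expanding spiral
```

Sampling theta continuously, rather than at the three cardinal values, is what distinguishes a continuum of preferred patterns from a three-channel decomposition.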
Abstract
We propose that extrapersonal space is represented in the brain by bimodal, visual-tactile neurons in: 1) inferior area 6 in the frontal lobe, 2) area 7b in the parietal lobe, and 3) the putamen. In each of these areas there are cells that respond to both tactile and visual stimuli. In each area, the tactile receptive fields are arranged to form a somatotopic map. The visual receptive fields are usually adjacent to the tactile ones and extend outward from the skin about 20 cm. Thus each area contains a somatotopically organized map of the visual space that immediately surrounds the body. These three areas are monosynaptically interconnected and may form a distributed system for representing extrapersonal visual space. For many neurons with tactile receptive fields on the arm or hand, when the arm was moved, the visual receptive field moved with it. These neurons therefore appear to code the location of visual stimuli in arm-centered coordinates. More generally, we suggest that the bimodal cells represent near, extrapersonal space in a body-part-centered fashion, rather than in an exclusively head-centered or trunk-centered fashion.
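The arm-centered coding described above can be illustrated with simple vector arithmetic: adding eye position to a stimulus's retinal location gives a head-centered location, and subtracting the arm's position re-expresses that location relative to the arm. In such a scheme the arm-centered location is unchanged by eye movements but shifts when the arm moves, matching the receptive-field behavior reported here. This is a schematic sketch under those assumptions, not the authors' model; all names and numbers are illustrative:

```python
import numpy as np

def arm_centered(retinal_loc, eye_pos, arm_pos):
    """Schematic transform: retinal location + eye position gives an
    (approximately) head-centered location; subtracting arm position
    gives a location relative to the arm. Names are ours."""
    head_centered = np.asarray(retinal_loc) + np.asarray(eye_pos)
    return head_centered - np.asarray(arm_pos)

# A fixed stimulus viewed with the eyes straight ahead...
a = arm_centered([10.0, 0.0], eye_pos=[0.0, 0.0], arm_pos=[5.0, -20.0])
# ...and the same stimulus after an eye movement of (15, 15), which
# shifts its retinal location by (-15, -15) but leaves the arm-centered
# location unchanged.
b = arm_centered([-5.0, -15.0], eye_pos=[15.0, 15.0], arm_pos=[5.0, -20.0])
```

Changing `arm_pos` while holding the stimulus fixed does shift the result, which is the signature of an arm-centered rather than head-centered code.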
Abstract
The left fielder squints at the baseball as it curves toward him. He adjusts his hand and body, and the ball lands in his mitt. Somehow, the changing pattern of light on his retina was transformed into a motor command that brought his hand to the correct location for catching the ball. How was this accomplished? Is there a map of visual space in the brain that encodes the location of the ball and the fielder's glove? In this article, we review some recent experiments in monkeys on visuo-motor transformations in the brain. We ask how neurons represent the location of a stimulus for the purposes of looking at it, reaching toward it, or avoiding it.
1993
1992
Abstract
Dr. Stein begins by claiming that "very little evidence has been found for the existence of a topographic map of perceptual space" and ends by stating that "indeed there is no evidence for a region...where egocentric space is represented topographically." In lieu of a map, he offers us a "neural network," that is, "a distributed system of rules for information processing that can be used to transform signals from one coordinate system to another." Such a computational scheme might indeed work; however, it is quite unnecessary, since a neuronal topographic map of visual space does exist, at least for the region adjacent to the body, i.e., immediate extrapersonal space. As there is good evidence for more than one such map in the primate brain, the question would seem to be what their different functions are, rather than how we can erect a computational network to do without them.