- Visual perception projects
- Perception of size and distance. We know a great deal about the depth cues that help the visual system recover the hidden third dimension embedded in the retinal image. But how do these cues operate over different distances (see the disparity example after this list), and is our perception of the space around us as high fidelity as we imagine?
- Understanding peripheral vision. Our visual apparatus is truly dreadful just a few degrees away from the centre of the visual field, yet it never seems that way. For example, keep your eyes fixed on this word and see how many of the words around it you can make sense of, without moving your eyes! (If you are not convinced, look down to the text below, fixating directly on the red word. How many surrounding words can you read?) Now try reading the passage from the beginning. The research questions here are how we compensate for our dreadful peripheral vision and what, in fact, peripheral vision is good for anyway; the eccentricity-scaling sketch after this list puts some rough numbers on the problem.
- Using peripheral vision. A great deal is known about the central two degrees of visual perception, since this is the area of the retina stimulated by typical visual displays in psychophysical experiments. However, the visual field is about 180 degrees wide, and although we know that spatial resolution and many other capacities of normal human vision decline considerably away from the centre of the retina, we do not yet fully understand how this degraded peripheral retina is used. This series of projects will combine immersive 3D environments in the ALIVE facility with specialised eye-tracking goggles that allow us to mask off selected parts of the visual field under computer control (a sketch of the masking idea follows this list). Using this technique, we will learn how different parts of the visual field are used by the visual system in tasks including navigation, physical interaction with the world, and depth and surface perception.
- The limiting form of visual representation. We know a great deal about the initial stages of visual analysis, but the properties of these neuronal analysers do not accord with our direct visual experience. (For example, we cannot read off the three-cone ‘colour’ code of our retinal photoreceptors; that is the job of the colour-opponent cells in the lateral geniculate nucleus. But we cannot read off that code either…) What, then, is the representational stage that sentient human beings are able to access to drive their behaviour? Just which parts of the brain do ‘we’ experience?
- Visual search in 3D. Our visual perceptions lead to actions: what, then, are our strategies when we lose our keys in a room? What do we do well, and what do we do badly? Could robots be programmed to search better than we do?
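For the size-and-distance project above, here is a concrete illustration of how one depth cue degrades with distance. The sketch computes the smallest depth interval binocular disparity can resolve at several viewing distances, using the small-angle geometry disparity ≈ IPD·Δd/d². The interpupillary distance and the 20 arcsec stereoacuity threshold are illustrative assumptions, not values from the project.

```python
# Sketch: how binocular disparity (one depth cue) scales with viewing
# distance. Assumed values: interpupillary distance 0.063 m and a
# stereoacuity threshold of 20 arcsec -- illustrative, not measured.
import math

IPD = 0.063                          # interpupillary distance in metres (assumed)
THRESH = 20 / 3600 * math.pi / 180   # 20 arcsec threshold, in radians (assumed)

def min_detectable_depth(d):
    """Smallest depth difference (m) detectable from disparity alone at
    viewing distance d, using disparity ~= IPD * delta_d / d**2."""
    return THRESH * d ** 2 / IPD

for d in (0.5, 1, 2, 5, 10, 20):
    print(f"at {d:5.1f} m: delta-d ~ {min_detectable_depth(d) * 100:8.2f} cm")
```

The quadratic dependence on distance means stereopsis is exquisite within arm's reach but contributes little beyond tens of metres, which is one reason the relative weighting of depth cues must change across the space around us.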
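The peripheral-vision reading demonstration above can also be given rough numbers. A common summary of peripheral limits is linear eccentricity scaling: the smallest legible letter grows as S(E) = S0·(1 + E/E2). The foveal threshold S0 and the constant E2 below are ballpark assumptions for illustration only.

```python
# Rough illustration of why reading fails away from fixation: the
# smallest legible letter grows roughly linearly with eccentricity.
S0, E2 = 0.1, 2.0   # degrees; illustrative values, not fitted data

def critical_letter_size(ecc_deg):
    """Approximate smallest legible letter size (deg) at a given
    eccentricity, using linear eccentricity scaling."""
    return S0 * (1 + ecc_deg / E2)

for e in (0, 2, 5, 10, 20, 40):
    print(f"{e:3d} deg out: letters must be ~{critical_letter_size(e):5.2f} deg")
```

Even 10 degrees from fixation, letters must be several times their foveal threshold size to be read, which is consistent with the demonstration.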
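Finally, a minimal sketch of the gaze-contingent masking idea used in the peripheral-vision projects. The frame, the gaze sample, and the display geometry are all mocked here; in the ALIVE facility these would come from the rendering engine and the eye-tracking goggles, whose interfaces are not shown.

```python
# Gaze-contingent masking sketch: occlude (or reveal) a chosen region of
# the visual field around the current gaze position. Eye tracker and
# display loop are mocked with placeholder values.
import numpy as np

def radial_mask(shape, gaze_xy, inner_deg, outer_deg, px_per_deg):
    """Boolean mask that is True where the image stays visible:
    everything closer than inner_deg or farther than outer_deg from
    gaze is blanked, leaving an annulus of visible field."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    ecc = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1]) / px_per_deg
    return (ecc >= inner_deg) & (ecc <= outer_deg)

# Mock frame loop: blank the central 5 deg to force reliance on periphery.
frame = np.random.rand(600, 800)           # stand-in for a rendered frame
gaze = (400, 300)                          # stand-in for a tracker sample
visible = radial_mask(frame.shape, gaze, inner_deg=5, outer_deg=90,
                      px_per_deg=20)       # assumed display geometry
masked = np.where(visible, frame, 0.0)     # blanked region set to black
```

Inverting the logic (show only the centre and blank the surround, or vice versa) lets the same machinery test what each part of the visual field contributes to a given task.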
- Social vision projects
- Awareness of gaze direction in others. How good are we at covert assessment of body language and is there a special status for detecting when people are looking at us?
- Perception of biological motion. We know that the brain is very good at detecting biological motion, but what are the invariants to which we are tuned? (A sketch of the point-light stimulus typically used in this work follows this list.)
- Perception of our self in another. We know that we favour others who are similar to ourselves, and whose perception of us resembles our own perception of our self. Can we find ways of quantifying this in virtual reality?
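Biological-motion studies typically use point-light displays, in which a handful of dots placed at the joints is enough to evoke a vivid percept of a walking figure. The sketch below generates such a stimulus from crude sinusoidal joint trajectories; a real experiment would use motion-capture data, and every amplitude and frequency here is made up for illustration.

```python
# Minimal point-light-walker sketch (Johansson-style stimulus).
import numpy as np

def walker_frame(t, step_hz=1.0):
    """Rough (x, y) positions of 11 dots for a side-view walker at
    time t; arms and legs swing sinusoidally in antiphase."""
    ph = 2 * np.pi * step_hz * t
    leg, arm = 0.35 * np.sin(ph), 0.25 * np.sin(ph + np.pi)
    bob = 0.03 * np.sin(2 * ph)                  # vertical body bounce
    dots = [
        (0.00, 1.70 + bob),                      # head
        (0.00, 1.45 + bob),                      # shoulder
        (arm * 0.5, 1.20 + bob),                 # elbow (near arm)
        (arm, 1.00 + bob),                       # wrist (near arm)
        (-arm * 0.5, 1.20 + bob),                # elbow (far arm)
        (-arm, 1.00 + bob),                      # wrist (far arm)
        (0.00, 0.95 + bob),                      # hip
        (leg * 0.5, 0.55),                       # knee (near leg)
        (leg, 0.10),                             # ankle (near leg)
        (-leg * 0.5, 0.55),                      # knee (far leg)
        (-leg, 0.10),                            # ankle (far leg)
    ]
    return np.array(dots)

# One second of frames at 60 Hz, ready to feed to any renderer.
frames = [walker_frame(i / 60) for i in range(60)]
```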
- Cognition projects
- Decision making under stress. This project uses virtual reality to investigate how experts make decisions under stress. Police Officers (Authorised Firearms Officers) take part in virtual simulations for training purposes while their actions and EEG are recorded. By analysing behavioural decision making and brain oscillations at critical decision points in the scenarios, we aim to learn how experts handle stressful situations (the intended style of EEG analysis is sketched after this list). Virtual reality improves on existing video-based training programmes by providing fully immersive, realistic environments with equipment that is intuitive and comfortable to use, leading to natural behaviour within the simulated scenarios.
- Attention switching. This project investigates age-related changes in our ability to switch between different types of attention while driving. Scenarios in the driving simulator require switching between spatial and temporal attention while attending to rapidly changing events on the road ahead and searching the environment for a place name on a road sign. Analysis of behaviour is supported by electrophysiological (EEG) measurement of the neural signatures that accompany attention switching during the task.
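The kind of analysis intended for the decision-making and attention-switching projects can be sketched as follows: cut the continuous EEG into epochs around event markers (a decision point, an attention-switch cue) and compare spectral band power across conditions. The sampling rate, channel count, event times, and choice of theta band below are all placeholder assumptions.

```python
# Sketch: epoch EEG around event markers and compute theta-band power.
import numpy as np
from scipy.signal import welch

FS = 500  # sampling rate in Hz (assumed)

def epochs_around(eeg, event_samples, pre=0.5, post=1.0):
    """Slice a (channels, samples) array into epochs around events."""
    a, b = int(pre * FS), int(post * FS)
    return np.stack([eeg[:, s - a:s + b] for s in event_samples])

def band_power(epoch, lo=4.0, hi=8.0):
    """Mean theta-band (4-8 Hz) power of one (channels, samples) epoch."""
    f, pxx = welch(epoch, fs=FS, nperseg=256)
    band = (f >= lo) & (f <= hi)
    return pxx[:, band].mean()

# Toy data: 32 channels, 60 s of noise, three 'decision point' markers.
eeg = np.random.randn(32, 60 * FS)
events = [5 * FS, 20 * FS, 40 * FS]
theta = [band_power(ep) for ep in epochs_around(eeg, events)]
```

In the real projects the markers would come from the scenario software, and the comparison of interest is between epochs at critical decision points (or attention switches) and matched baseline periods.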
- Clinical projects
The aetiology and possible control of myopia: the importance of whole-field visual signals. In many regions of Asia, myopia has reached epidemic proportions, and its prevalence is rising rapidly in non-Asian countries as well. This rapid rise suggests that changing environmental factors (less outdoor play, more video-game playing) are influencing current patterns of refractive error. Myopia is a leading cause of permanent visual impairment (myopic eyes are at increased risk of cataract, glaucoma, chorioretinal degeneration, and retinal detachment), and it has become a substantial economic burden on society.
Recent data from primates show that the quality of the peripheral retinal image can modulate axial eye growth, with the ametropic shift proportional to the sign and magnitude of peripheral image blur. In short, it is believed that hyperopic defocus in the peripheral retina may lead to axial elongation (i.e. myopia).
There has been a long-standing assumption that foveal signals dominate refractive development. This is logical given that: (i) acuity is highest at the fovea; (ii) the fovea is most sensitive to optical defocus; and (iii) accommodation is largely controlled by visual signals from the fovea. However, recent studies show:
- Visual signals from the fovea are not essential for many aspects of vision-dependent growth. Indeed, foveal signals can be eliminated in young monkeys without significantly interfering with emmetropization.
- Optically imposed peripheral errors can alter the refractive state of primates. Note that pharmacological or surgical sectioning of the optic nerve prevents neither form-deprivation myopia (usually induced with lid suture or translucent goggles) nor the compensating responses to optically imposed defocus.
- Changes in refractive development operate in a regionally selective manner – e.g. form deprivation restricted to the nasal visual field in primates produces elongation in the corresponding (temporal) region of the retina.
- When conflicting signals exist between the central and peripheral retina, peripheral visual signals can dominate refractive development.
Myopic eyes are less oblate than emmetropic eyes. In consequence, uncorrected myopes experience myopic blur at the fovea and relatively hyperopic blur in the periphery. Traditional correction with negative lenses removes the central myopic error but typically increases the degree of peripheral hyperopic defocus. This peripheral hyperopic defocus is thought to be a key signal driving further axial elongation, increasing the severity of the myopia.
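The argument in the previous paragraph is easy to put numbers on. Below is a first-order bookkeeping sketch with illustrative values (a −3.00 D myope with +1.00 D of relative peripheral hyperopia at around 30 degrees); it deliberately ignores the additional peripheral hyperopia that real negative spectacle lenses introduce off-axis.

```python
# First-order sketch of residual defocus after conventional correction.
# All dioptre values are illustrative assumptions, not measurements.
central_error = -3.00         # foveal refractive error (myopic), dioptres
relative_peripheral = +1.00   # peripheral minus foveal refraction at ~30 deg

peripheral_error = central_error + relative_peripheral    # -2.00 D

lens = central_error                          # -3.00 D lens corrects the fovea
residual_fovea = central_error - lens         #  0.00 D: sharp central image
residual_periphery = peripheral_error - lens  # +1.00 D: hyperopic defocus

print(f"fovea: {residual_fovea:+.2f} D, periphery: {residual_periphery:+.2f} D")
```

So the very act of correcting the fovea leaves the periphery hyperopically defocused, which on the account above is precisely the signal thought to drive further elongation.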
Some questions
- Is eye growth a visually guided process? Yes, but apparently only in the sense that the guiding processes operate locally within the retina (consistent with the optic-nerve-section results above) rather than via central visual pathways.
- Does blur detection occur in the retina? The answer appears to be YES. How, though, does the retina differentiate the sign of blur (i.e. hyperopic versus myopic defocus)? (a) Simple (spherical) defocus is not sign-specific, but higher-order aberrations may be (see the sketch after this list). (b) There is some evidence that amacrine cells respond differentially to the sign of defocus (Fischer et al., 1999, Nature Neuroscience).
- What is the source of noise in the peripheral retina that gives rise to poor detection performance?
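Point (a) in the list above can be checked numerically. In the Fourier-optics toy model below, flipping the sign of pure defocus leaves the point-spread function unchanged, so blur alone carries no sign information; adding a fixed higher-order term (spherical aberration) breaks that symmetry. Units and coefficients are arbitrary illustrative choices.

```python
# Toy Fourier-optics demo: is the PSF different for +/- defocus?
import numpy as np

N = 256
x = (np.arange(N) - N // 2) / (N // 2)       # symmetric pupil grid
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2
pupil = (R2 <= 1.0).astype(float)            # circular aperture

def psf(defocus, spherical=0.0):
    """PSF for wavefront W = defocus*r^2 + spherical*r^4 (in waves)."""
    W = defocus * R2 + spherical * R2**2
    field = pupil * np.exp(2j * np.pi * W)
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

d_no_sa = np.abs(psf(+0.5) - psf(-0.5)).max()
d_with_sa = np.abs(psf(+0.5, 0.25) - psf(-0.5, 0.25)).max()
print(f"max PSF difference, defocus only:        {d_no_sa:.3e}")   # ~0
print(f"max PSF difference, with spherical aber: {d_with_sa:.3e}") # clearly > 0
```

With defocus alone the two PSFs are numerically identical, so a retina that only measured blur magnitude could not tell hyperopic from myopic defocus; with spherical aberration present, the two signs produce measurably different images.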
From the above it is clear that a better understanding of peripheral vision and its interaction with foveal vision is needed if we are to understand, and ultimately control, myopia. Experimental work using the CAVE and an eye tracker will help us do this.