Aston Laboratory for Immersive Virtual Environments (ALIVE) Research Facility
A combination of motion capture, virtual reality and EEG recording equipment.
ALIVE is an Aston University research facility. The combination of motion capture, virtual reality, and EEG recording equipment provides an ideal platform for combining research into visual perception, social cognition, and social vision. This is a cross-school facility, designed to encourage collaboration and synergy. Our projects fall broadly into five areas:
We open our eyes and we see. The task is effortless, but the computational burden it imposes on our brains is enormous; this is why so much of the brain is involved in processing visual information. Precise figures are difficult to estimate, but roughly 25% of the brain is dedicated to vision (about ten times as much as for hearing), and another 40% is involved in processes that combine vision with other senses (e.g. hearing and proprioception) and with knowledge-based input. We are interested in how the nervous system (eyes and brain) goes from the isolated ‘pixels’ recorded by the receptors that ‘look’ at the flat retinal images at the back of each eye to a seamless impression of the 3D world. It is all in the head, yet we experience the world out there — immediate, accurate and in high fidelity. How does the brain achieve this remarkable feat?
The outside world provides an immense amount of information – significantly more than our brain can process at any given time. So how do we judge and select which parts of this information overload are most relevant and need to be acted upon? How do we store relevant information, and how do we recall it at the right time? Answering these questions requires rich, realistic environments, such as those provided by VR. These allow us to study essential everyday cognitive function in complex situations such as driving, and to understand how we navigate our complex multisensory world with apparent ease.
Interactions with other human beings are central to our lives, but how do we accrue, store, and use information about other people? How rapidly and effectively do we do this, and are we even aware that we are doing it? We address these basic research questions by designing experiments to reveal the cognitive mechanisms involved, and by using brain recording techniques such as EEG to better understand the operation and inter-connectedness of the underlying brain networks.
Social vision is the interface between visual perception and social cognition. Our social behaviour is directed by numerous internally generated goals, needs and desires, but these are all modulated by external events, environments and agents. Our interpretations of those external situations reach us through our senses, so understanding how our senses are tuned to such information helps us to better understand what is and is not important, and how these factors contribute to, and compare with, internally generated models of ourselves, of others, and of our situation in general.
This research develops autonomous interactive virtual humans (VHs). The VHs are used as embodiments of psychological and neurobiological models of human cognition, affect, personality, and behaviour regulation. From an applied perspective, the goal is to develop virtual humans into systems that can provide 'social interaction as a service' in any domain where humans interact with each other, for example education, games, caregiving, and training. The ALIVE CAVE is used for running immersive interaction studies in which participants interact with virtual humans.
The following organisations have supported the development of ALIVE: