Research
Our research program combines ethological approaches with behavioral, computational, and neurophysiological methods to study the neural bases of auditory perception and cognition. By generating models of neural activity, our laboratory takes a systems-level circuit approach to test how behaviorally relevant information is coded, represented, and transformed along defined neural circuits. The long-term goal of this research is to understand how brain areas interact and how these interactions relate to sensation, perception, and action at the level of brain networks.
Higher-order mechanisms needed to process a vocalization's meaning
Communication is a fundamental component of both human and non-human animal behavior because it plays a central role in socioecology. For example, the species-specific vocalizations of rhesus monkeys transmit information about the identity and age of the caller and often convey the caller's sex and emotional or motivational state. Some species-specific vocalizations have also been described as functionally referential: based on acoustic structure alone, monkeys can extract information about a vocalization's function or, more loosely, its apparent meaning. Examples of such referents include different kinds of predators, social relationships, and food. While there is a long and rich history of using species-specific vocalizations as probes of neural function in non-human primates, these studies have focused almost exclusively on how particular acoustic features of a vocalization are coded by cortical neurons. No studies, though, have identified the higher-order mechanisms, such as categorization, needed to process a vocalization's function. We have begun to fill this gap by designing behavioral and neurophysiological experiments that test how monkeys discriminate between and categorize vocalizations based on their apparent meaning; the neurophysiological experiments focus on activity in the prefrontal cortex and the superior temporal sulcus.
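As a purely illustrative sketch of what "categorization" means in this setting, the toy Python example below decodes a vocalization's referent category from simulated population firing rates with a cross-validated linear classifier. The category labels, neuron counts, and tuning values are invented for illustration and are not the laboratory's data or analysis pipeline.

```python
# Hypothetical sketch (not the lab's actual analysis): decode a vocalization's
# referent category from simulated single-trial firing rates with a linear
# classifier. All labels and parameters below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_neurons, n_trials_per_cat = 40, 60
categories = ["food_call", "predator_call"]  # assumed referent classes

# Simulate population responses: each category evokes a different mean-rate
# profile across the population, plus Poisson trial-to-trial variability.
mean_rates = {c: rng.uniform(5, 30, size=n_neurons) for c in categories}
X = np.vstack([rng.poisson(mean_rates[c], size=(n_trials_per_cat, n_neurons))
               for c in categories])
y = np.repeat(np.arange(len(categories)), n_trials_per_cat)

# If category membership is linearly readable from population activity,
# cross-validated accuracy will sit well above the 50% chance level.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, X, y, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```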
Auditory spatial and non-spatial information processing
An important concept in cognitive neuroscience is that the spatial and non-spatial processing of auditory and visual stimuli occurs in parallel pathways: a "dorsal" pathway processes a stimulus's spatial attributes, and a "ventral" pathway processes its non-spatial attributes. Because the lateral intraparietal area (area LIP) is part of the dorsal pathway, it should preferentially process the spatial attributes of an auditory stimulus; because the ventral prefrontal cortex (vPFC) is part of the ventral pathway, it should preferentially process the non-spatial attributes. We test the hypothesis that these pathways are not strictly segregated and that, instead, mixing spatial and non-spatial information benefits the computations that form unified perceptual events and accurately guide goal-directed behavior.
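One way to make the segregation question concrete, sketched below with invented numbers, is to ask how much of a single neuron's response variance is explained by a sound's location versus its identity; a neuron with appreciable variance on both factors carries mixed spatial and non-spatial information. The simulation and effect sizes are hypothetical illustrations, not recordings from LIP or vPFC.

```python
# Hypothetical illustration: partition a simulated neuron's firing-rate variance
# between a spatial factor (location) and a non-spatial factor (identity) using
# a simple eta-squared computation. All numbers are made up.
import numpy as np

rng = np.random.default_rng(1)
locations = np.array([-20, 0, 20])   # assumed azimuths (degrees)
stimuli = np.array([0, 1])           # assumed non-spatial identities
n_reps = 50

rates, loc_lab, stim_lab = [], [], []
for loc in locations:
    for stim in stimuli:
        mu = 10 + 0.2 * loc + 6 * stim   # mixed spatial + non-spatial tuning
        rates.append(rng.normal(mu, 2.0, n_reps))
        loc_lab.append(np.full(n_reps, loc))
        stim_lab.append(np.full(n_reps, stim))
rates, loc_lab, stim_lab = map(np.concatenate, (rates, loc_lab, stim_lab))

def eta_squared(values, labels):
    """Fraction of total variance explained by a factor's group means."""
    grand = values.mean()
    ss_between = sum(len(values[labels == g]) * (values[labels == g].mean() - grand) ** 2
                     for g in np.unique(labels))
    return ss_between / ((values - grand) ** 2).sum()

print(f"variance explained by location: {eta_squared(rates, loc_lab):.2f}")
print(f"variance explained by identity: {eta_squared(rates, stim_lab):.2f}")
```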
Role of salience and attention in adaptive, goal-directed spatial behavior
Goal-directed behavior is the essence of adaptation because it allows humans and other animals to respond flexibly to different environmental scenarios. It can be thought of as the formation of dynamic links between stimuli and actions. Goal-directed behavior can be as simple as shifting your gaze toward an auditory stimulus or as complex as deciding to learn calculus. We test the role that area LIP plays in forming links between stimuli of different modalities and actions. In particular, we focus on the roles that context, behavioral salience, and attention play in forming these links. We also examine more fundamental computational issues, such as reference frames, that arise in a cortical area that mediates stimuli from different modalities.
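The reference-frame problem can be stated compactly: an auditory target is initially encoded relative to the head (from interaural cues), whereas a gaze shift is specified relative to the current eye position, so the two must be reconciled. The one-dimensional toy function below only illustrates that coordinate transformation; it is not a model of LIP responses.

```python
# Toy sketch of the reference-frame issue (illustrative only): convert a
# head-centered auditory target azimuth into eye-centered coordinates by
# subtracting the current horizontal eye-in-head position.
def head_to_eye_centered(target_azimuth_deg: float, eye_position_deg: float) -> float:
    """Eye-centered azimuth of a head-centered target, in one dimension."""
    return target_azimuth_deg - eye_position_deg

# A sound 15 deg right of the head, while gaze is already 10 deg right,
# requires only a 5 deg rightward gaze shift.
print(head_to_eye_centered(15.0, 10.0))  # -> 5.0
```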
Affiliations
Departments of Otorhinolaryngology, Neuroscience, & Bioengineering
Neuroscience Graduate Group
Psychology Graduate Group
Bioengineering Graduate Group
Institute for Research in Cognitive Science
Computational Neuroscience Initiative