The creativity of CCNR-based research is reflected in a historically diverse research funding portfolio covering both applied and basic research.
Selected current research projects in the CCNR.
- Odor Objects - HFSP
Odor Objects is a three-year, four-institute collaboration investigating the mechanisms of odor-background segregation by insects. The project brings together leading researchers from neurobiology, insect behaviour, computational neuroscience and bio-inspired robotics.
- Department of Neurobiology, Universität Konstanz, Germany
- School of Engineering and Informatics, University of Sussex, UK [Prof Thomas Nowotny]
- School of Life Sciences, Arizona State University, USA
- Research Center for Advanced Science and Technology, University of Tokyo, Japan
- Distributed neural processing of self-generated visual input in a vertebrate brain - BBSRC
Chris Buckley (PI)
Most of what we do on a day-to-day basis involves the ongoing and fluid coordination between our senses and our actions, and understanding this ability is a central goal of modern neuroscience. While a partial understanding of the underlying processes in specific circuits has been achieved, a full understanding would ideally require recordings of neural activity from across the brain of a behaving animal. This type of experiment has been impossible in the past for two reasons: 1. Neural recording techniques require a great degree of stability between recording devices and neural tissue; thus most experiments involve heavily restrained, or anaesthetized, animals, which prevents meaningful brain/environment interactions. 2. Typical brain recordings have been limited to either small numbers of neurons at cellular resolution or indirect recordings from large areas of brain tissue at low spatial and/or temporal resolution. To address these challenges we will combine advanced techniques in experimental and computational neuroscience. First, a virtual reality for a swimming larval zebrafish; this will allow us to record from a non-moving brain while preserving fictive behaviour. Second, light-sheet microscopy, a technique that can simultaneously image tens of thousands of neurons from across the zebrafish brain. Third, distributed computing techniques, which will enable us to analyse the enormous data sets acquired from these experiments.
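As a rough illustration of the closed-loop virtual-reality component, the sketch below couples a fictive swim signal to the visual feedback a model fish receives. It is a minimal toy loop, assuming a 1-D environment; the stub functions (read_fictive_swim_signal, render_optic_flow) and the gain parameter are illustrative placeholders, not the project's actual rig software.

```python
# A toy closed-loop virtual-reality loop for a fictively swimming larval zebrafish.
# The stubs stand in for the real rig: in an actual experiment the swim signal would
# come from motor-nerve recordings and the optic flow would be rendered to a display.

import numpy as np

def read_fictive_swim_signal(step):
    """Stub: a brief swim bout once per simulated second."""
    return 1.5 if (step % 200) < 40 else 0.0

def render_optic_flow(velocity):
    """Stub: a real setup would update the projected visual scene here."""
    pass

def run_closed_loop(n_steps=10_000, dt=0.005, gain=1.0, drift_speed=1.0):
    """Couple fictive swimming to visual feedback in a 1-D virtual environment.

    drift_speed : baseline backward optic flow, as if water sweeps past the fish.
    gain        : how strongly fictive swimming counteracts that flow; lowering
                  it mimics a viscosity-like sensorimotor mismatch.
    """
    position = 0.0
    trajectory = np.zeros(n_steps)
    for step in range(n_steps):
        swim_power = read_fictive_swim_signal(step)    # fictive behaviour
        velocity = drift_speed - gain * swim_power     # self-motion vs. imposed drift
        position += velocity * dt
        render_optic_flow(velocity)
        trajectory[step] = position
    return trajectory
```

Lowering gain in such a loop is one simple way to emulate the viscosity-induced sensorimotor mismatch discussed below.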
We will use these tools to address three fundamental questions about brain function in behaving animals. First, when animals actively engage the world the brain receives two types of sensory input: sensory input caused by changes in the external world, e.g. the optic flow experienced by a fish as water sweeps past its retina, and sensory input that is a consequence of the animal's own actions, e.g. the optic flow experienced by the fish as a result of its own swimming. These two types of input convey different kinds of information but arrive together at the retina. Thus a central question we will ask is: what are the brain-wide circuits that allow the fish to distinguish between them? Second, animals readily adapt their behaviour when the sensory inputs caused by their own actions do not meet their expectations. For example, fish modulate the strength of their swimming when changes in water viscosity cause a mismatch between the actual and expected consequences of their swimming, i.e. when their swimming does not propel them as far as they expect. We will ask what the distributed neural circuits are that allow fish to detect these mismatch errors. Finally, we will combine our results to produce a biologically plausible model of closed-loop control in an actively swimming fish that reproduces experimental observations and could be used to inspire robotic control systems.
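A minimal numerical sketch of the distinction above, assuming a simple forward model: an efference copy of the swim command predicts the self-generated (reafferent) optic flow, subtracting that prediction isolates the externally caused component, and any residual that appears when the true and expected motor gains differ is the mismatch error. The variable names and the additive-flow assumption are illustrative, not the project's model.

```python
# Toy efference-copy / forward-model calculation for separating self-generated
# from external optic flow and exposing a mismatch error.

import numpy as np

def mismatch_demo(n_steps=1000, true_gain=0.6, expected_gain=1.0, noise=0.05):
    """Return swim bouts and the per-step prediction error of a model fish."""
    rng = np.random.default_rng(0)
    swim = (rng.random(n_steps) < 0.1).astype(float)      # sparse swim bouts
    external_flow = rng.normal(0.0, noise, n_steps)       # exafferent input
    actual_flow = external_flow - true_gain * swim        # what the retina receives
    predicted_reafference = -expected_gain * swim         # efference-copy prediction
    mismatch = actual_flow - predicted_reafference
    return swim, mismatch

swim, mismatch = mismatch_demo()
# With matched gains the residual during bouts is just the external flow; with
# true_gain < expected_gain a systematic positive error appears during bouts,
# a candidate signal for adapting swim strength.
print(mismatch[swim == 1].mean(), mismatch[swim == 0].mean())
```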
- Brains on Board - EPSRC
What if we could design an autonomous flying robot with the navigational and learning abilities of a honeybee? Such a computationally and energy-efficient autonomous robot would represent a step-change in robotics technology, and is precisely what the 'Brains on Board' project aims to achieve. Autonomous control of mobile robots requires robustness to environmental and sensory uncertainty, and the flexibility to deal with novel environments and scenarios. Animals solve these problems through having flexible brains capable of unsupervised pattern detection and learning. Even 'small'-brained animals like bees exhibit sophisticated learning and navigation abilities using very efficient brains of only up to 1 million neurons, 100,000 times fewer than in a human brain. Crucially, these mini-brains nevertheless support high levels of multi-tasking and they are adaptable, within the lifetime of an individual, to completely novel scenarios; this is in marked contrast to typical control engineering solutions. This project will fuse computational and experimental neuroscience to develop a ground-breaking new class of highly efficient 'brain on board' robot controllers, able to exhibit adaptive behaviour while running on powerful yet lightweight General-Purpose Graphics Processing Unit hardware, now emerging for the mobile devices market. This will be demonstrated via autonomous and adaptive control of a flying robot, using an on-board computational simulation of the bee's neural circuits; an unprecedented achievement representing a step-change in robotics technology.
'Brains on Board' is a £4.8M programme grant bringing together Sussex, the University of Sheffield and QMUL. Paul Graham, Andy Philippides and Thomas Nowotny are all investigators on this project.
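To give a flavour of the style of computation an on-board neural controller must carry out, the sketch below integrates a small population of leaky integrate-and-fire neurons in plain NumPy. It is a toy, CPU-only illustration of a generic spiking neuron model with arbitrary parameter values; it is not the project's GPU code, which targets simulations of bee brain circuits on lightweight GPGPU hardware.

```python
# Toy leaky integrate-and-fire (LIF) population, Euler-integrated in NumPy.

import numpy as np

def simulate_lif_population(n_neurons=1000, n_steps=2000, dt=1e-4,
                            tau=0.02, v_rest=-0.065, v_thresh=-0.050,
                            v_reset=-0.065, input_current=2e-10,
                            capacitance=2e-10):
    """Integrate dv/dt = (v_rest - v)/tau + I/C, resetting v on each spike."""
    rng = np.random.default_rng(1)
    v = np.full(n_neurons, v_rest)
    drive = input_current * (0.5 + rng.random(n_neurons))   # heterogeneous input
    spike_counts = np.zeros(n_neurons, dtype=int)
    for _ in range(n_steps):
        v += dt * ((v_rest - v) / tau + drive / capacitance)
        fired = v >= v_thresh
        spike_counts[fired] += 1
        v[fired] = v_reset
    return spike_counts

counts = simulate_lif_population()   # spike counts over 0.2 s of simulated time
```

Each neuron's update within a time step is independent of the others, which is why this style of model maps naturally onto the massively parallel GPU hardware the project targets.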
- Neuromorphic Implementations of Multivariate Classification Inspired by the Olfactory System - Human Brain Project EU
The overall goal of the proposed work is to implement a scalable spiking neuronal network for multivariate classification on the large-scale neuromorphic system developed in the Heidelberg group within the HBP. We will base our work on a study by Schmuker & Schneider (2007) in which a firing-rate model for multivariate classification inspired by insect olfaction was devised. We will use the virtual receptor approach from this study to efficiently encode real-valued multivariate data sets into firing-rate representations that are suitable for processing on spiking neuromorphic systems. Classification will be accomplished in a three-step process: multivariate data is first encoded using broadly tuned virtual receptors. Their responses are then filtered by lateral inhibition in a decorrelation layer inspired by the insect antennal lobe. This layer projects onto a winner-take-all circuit and the weights of this projection are learnt in a supervised fashion using a perceptron learning rule. Our neuronal network designs will be tested on massively parallel digital graphics processing unit (GPU) super-computing platforms, and our network will provide a working proof-of-concept that the analysis of "Big Data" (in the sense of high-dimensional multivariate data sets) is feasible on large-scale neuromorphic platforms. Our work will furthermore expose the specific benefits of a massively parallel neuromorphic approach for Big Data processing, and identify challenges to be addressed in future projects. The main thrust of the proposed project aligns perfectly with HBP WP 113 "Future Computing", since it provides a bio-inspired design for a computing system that solves computational tasks outside the realm of biology.
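As a rough, rate-based illustration of the three-step pipeline described above (broadly tuned virtual receptors, a lateral-inhibition decorrelation layer, and a winner-take-all readout trained with a perceptron rule), the sketch below runs the whole chain on a toy 2-D data set. The Gaussian receptor tuning, the subtractive form of lateral inhibition and all parameter values are illustrative assumptions; the project itself targets spiking implementations on neuromorphic hardware rather than this NumPy approximation.

```python
import numpy as np

def virtual_receptor_responses(X, centres, width):
    """Broadly tuned 'virtual receptors': Gaussian tuning over the input space."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def lateral_inhibition(R, strength=0.5):
    """Subtract a fraction of the mean population response (decorrelation layer)."""
    return np.maximum(R - strength * R.mean(axis=1, keepdims=True), 0.0)

def train_perceptron_wta(A, labels, n_classes, epochs=20, lr=0.01):
    """Supervised perceptron rule on the projection to a winner-take-all layer."""
    W = np.zeros((A.shape[1], n_classes))
    for _ in range(epochs):
        for a, y in zip(A, labels):
            pred = np.argmax(a @ W)           # winner-take-all readout
            if pred != y:
                W[:, y] += lr * a             # strengthen the correct class
                W[:, pred] -= lr * a          # weaken the erroneous winner
    return W

# Example: three Gaussian clusters in 2-D as a stand-in 'multivariate data set'.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(m, 0.3, (50, 2)) for m in ([0, 0], [2, 0], [0, 2])])
labels = np.repeat([0, 1, 2], 50)
centres = rng.uniform(-1, 3, (20, 2))          # receptor centres sampled at random
A = lateral_inhibition(virtual_receptor_responses(X, centres, width=1.0))
W = train_perceptron_wta(A, labels, n_classes=3)
accuracy = np.mean(np.argmax(A @ W, axis=1) == labels)
```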
- Visual navigation in ants: from visual ecology to brain - BBSRC
Paul Graham and Jeremy Niven in collaboration with Barbara Webb (School of Informatics, University of Edinburgh)
We will use complementary methodological approaches to understand the nature of the visual memory that supports navigation in ants. We will develop procedures for carrying out brain lesions in ants, targeting a range of locations in the central complex and mushroom body, and investigate the consequences for visual navigation tasks. Subsequent histology will allow us to correlate lesion locations with behavioural deficits. In parallel, we will establish a new experimental system using a compensatory treadmill to allow precise control over the visual stimulation provided to freely walking ants. This method will enable extremely rapid and minimally invasive transfer of ants from a conventional arena training paradigm to this controlled testing paradigm, supporting high-throughput experiments. These experimental methods will be coupled with analytical approaches to the information content of natural scenes from the ant habitat, to refine the stimulus paradigms and provide realistic input to computational models.

An agent model (a simulated ant moving through a virtual world) will allow us to test specific algorithms for visual navigation under precisely parallel conditions to the animal, and thus allow us to devise crucial paradigms for the experimental system under which alternative models make different predictions. In particular, we will examine which eye regions are critical, what image information content is essential, and which encoding and retrieval schemes most efficiently and effectively account for navigational behaviour. In the same agent model, we will also test more detailed models of the relevant brain circuitry, to understand how it could support such processing, and close the loop with predictions for new trackball and lesion studies and potential extensions towards single-cell electrophysiology of neurons in relevant brain regions.
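One family of visual-navigation algorithms such an agent model can test is view-based (snapshot) matching, in which the agent compares its current panoramic view with a view stored at the goal and turns towards the heading that minimises the difference. The sketch below implements the rotational image-difference idea on a toy 1-D panorama; the representation, the RMS difference measure and the parameters are illustrative assumptions rather than the project's committed model.

```python
import numpy as np

def best_heading(current_view, snapshot):
    """Rotational image-difference matching over a 1-D panoramic view.

    Returns the azimuthal shift (in pixels) that minimises the root-mean-square
    difference between the rolled current view and the stored goal snapshot,
    together with that minimum difference.
    """
    n = len(current_view)
    diffs = np.array([
        np.sqrt(np.mean((np.roll(current_view, s) - snapshot) ** 2))
        for s in range(n)
    ])
    return int(np.argmin(diffs)), float(diffs.min())

# Toy usage: a 360-pixel 'panorama' of brightness values and a rotated, noisy copy.
rng = np.random.default_rng(2)
snapshot = rng.random(360)
current = np.roll(snapshot, -40) + rng.normal(0, 0.02, 360)
shift, residual = best_heading(current, snapshot)   # shift recovered as ~40 pixels
```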
- Updating of memories during memory consolidation - BBSRC
Thomas Nowotny is a co-investigator on this BBSRC project.
Animals learn by exploring the environment and assessing the predictive value of a particular experience by trial and error. This basic form of learning, called associative conditioning, enables animals to respond adaptively to sensory signals that are associated with reward or punishment. Following the initial formation of an associative memory there is a period of consolidation, during which the memory becomes progressively more permanent. During consolidation there are brief periods of amnesia, or lapses, during which the progression to long-term memory storage can be blocked or altered by unanticipated changes in the sensory environment. Recently we provided further evidence for this flexibility of consolidation by showing that a more important or recent experience during these lapse periods can fundamentally change the fate of the original association; indeed, it can result in the replacement of the first association with a second one. Although lapses may look like non-adaptive deficits in memory storage, we claim that they are important in providing opportunities for alteration of the memory trace. Updating the memory at lapse points has the advantage of preventing the costly consolidation of an obsolete memory while providing an opportunity for new or more important experience to be incorporated into, or even replace, the original memory.

The aim of this project is to advance our understanding of this fundamental type of memory updating, in which one memory replaces another. We will exploit the advantages offered by the relatively simple brain and behaviour of the mollusc Lymnaea. While it has a CNS of only ~20,000 neurons, this animal nevertheless shares all of the basic features of associative memory formation displayed by far more complex animals, including humans. This allows us to study a universal form of behavioural adaptation and flexibility at a level of cellular and molecular detail that would not be possible in more complex animals. Because the process of memory formation and its underlying neural and molecular mechanisms are evolutionarily highly conserved, this research is likely to reveal principles and mechanisms of behavioural flexibility that apply equally to simple and complex animals.