Neural basis of sensory integration and decision-making in zebrafish

Supervisor: Armin Bahl

The Bahl lab is looking for a Ph.D. candidate to explore the neural basis of sensory integration and decision-making in zebrafish.
As animals move through their environment, they continuously acquire information from their senses, process, evaluate, and remember these cues, and make ongoing decisions about what to do next. The computational principles and neural implementation underlying such behavior remain poorly understood. To study these processes, our lab employs and develops closed-loop behavioral paradigms in immersive virtual reality, combined with whole-brain functional imaging, molecular biology, optogenetics, electrophysiology, and mathematical modeling, using zebrafish as a model system. We generally follow a reductionist approach, first describing the behavioral primitives in detail. We then use computer simulations to make experimentally testable predictions of how the identified algorithmic rules and neural network dynamics can give rise to more complex emergent behaviors.

We are particularly interested in the behavioral and neural mechanisms underlying temporal integration:
● How are working memories mechanistically implemented in the brain?
● How does sensory experience shape neural circuit dynamics and decision-making?
● How do animals integrate information in the context of social behaviors?
The successful candidate will work on one of these questions, employing a combination of modern behavioral and neural circuit dissection tools. Candidates are also warmly encouraged to propose their own research direction.
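As a purely illustrative sketch (not part of the advertised project, and with all parameter values chosen for demonstration only), temporal integration of noisy sensory evidence is often modeled as a bounded accumulator, or drift-diffusion process, in which evidence builds up until it crosses a decision threshold:

```python
import random

def simulate_drift_diffusion(drift=0.1, noise=1.0, bound=3.0,
                             dt=0.01, max_time=50.0, seed=0):
    """Simulate one drift-diffusion decision trial.

    Evidence x accumulates a constant drift plus Gaussian noise
    until it hits +bound (choice 1) or -bound (choice 0), or the
    trial times out (choice None). Parameters are illustrative.
    """
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while abs(x) < bound and t < max_time:
        # Euler-Maruyama step: deterministic drift + scaled Gaussian noise
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    if x >= bound:
        choice = 1
    elif x <= -bound:
        choice = 0
    else:
        choice = None
    return choice, t

choice, reaction_time = simulate_drift_diffusion()
```

Models of this kind yield testable predictions, for example how choice accuracy and reaction time should trade off as stimulus strength (the drift rate) varies.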
The position can begin as early as April 2023 (or by agreement) and will be fully funded for three years (65%, salary scale 13 TV-L), with the possibility of extension by one more year. Students will be living in Konstanz, Germany, and will become part of the International Max Planck Research School (IMPRS).

We are an interdisciplinary team of behavioral neuroscientists at the University of Konstanz
(www.neurobiology-konstanz.com) and an integral part of the Excellence Cluster “Centre for the
Advanced Study of Collective Behaviour” (www.exc.uni-konstanz.de/collective-behaviour), with tight links
to the “Max Planck Institute of Animal Behavior” (www.ab.mpg.de). Our ongoing collaborations with
Harvard University (www.mcb.harvard.edu) provide students with early international scientific exposure
and the possibility for extended research exchange visits.
The University of Konstanz and the Max Planck Society are equal opportunity employers that are
committed to providing employment opportunities to all qualified applicants without regard to race, color,
religion, age, sex, sexual orientation, gender identity, or national origin. They seek to increase the
number of women in those areas where they are underrepresented and therefore explicitly encourage
women to apply (https://www.uni-konstanz.de/en/equalopportunities/). Individuals with
disabilities are explicitly encouraged to apply. They will be given preference if appropriately qualified
(contact +49 7531 88 4016).
Requirements:
● MSc in Neuroscience, Computer Science, Physics, or a related field.
● Excellent communication and writing skills in English.
● Strong programming and data analysis skills, using at least one programming language (preferably Python), or the willingness to develop these abilities.
● Applicants with hands-on experience in multi-photon microscopy, electrophysiology, molecular
techniques, and/or instrumentation control are particularly welcome to apply.
Application:
Interested candidates should apply via the IMPRS online platform (www.ab.mpg.de/3228/imprs).
Applications should include a CV, a research statement (less than 1 page with academic background,
research experience, interests, and goals), and the contact details of two potential referees.
Main supervisor:
Armin Bahl, University of Konstanz (armin.bahl@uni-konstanz.de).
Keywords:
Zebrafish, animal behavior, decision-making, two-photon calcium imaging, electrophysiology.
