#5 Bernstein conference 2023: Computational neuroscience posters

10/10/23 • 87 min

Brain Space Time Podcast

Two weeks ago, I visited the Bernstein conference in Berlin. I had lots of fun, particularly at the poster sessions, where I met William, Movitz, and Shervin. I met each of them later and recorded the following conversations (on park benches again^^).

William Walker (Gatsby Computational Neuroscience Unit, London) had a poster on 'Representations of State in Hippocampus Derive from a Principle of Conditional Independence'. We discuss how current deep learning struggles with generalization, lacks priors, and could benefit from learning latent, conditionally independent representations (similar to place cells).
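
Not William's actual model, but a minimal Python sketch of the conditional-independence idea we discuss: two noisy observations of the same latent variable (think two sensory readouts of the animal's position) are strongly correlated on their own, yet become nearly uncorrelated once you condition on the latent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Latent 'state' (e.g., the animal's position) and two noisy observations of it.
z = rng.normal(0.0, 1.0, size=100_000)        # latent variable
x1 = z + rng.normal(0.0, 0.5, size=z.shape)   # observation 1, a noisy view of z
x2 = z + rng.normal(0.0, 0.5, size=z.shape)   # observation 2, a noisy view of z

# Marginally, the two observations are strongly correlated...
print("corr(x1, x2):      ", np.corrcoef(x1, x2)[0, 1])

# ...but within a narrow slice of the latent they are nearly uncorrelated,
# which is the conditional-independence structure the poster's title refers to.
near_zero = np.abs(z) < 0.05
print("corr(x1, x2 | z~0):", np.corrcoef(x1[near_zero], x2[near_zero])[0, 1])
```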

Movitz Lenninger (KTH Royal Institute of Technology, Stockholm) had a poster on 'Minimal decoding times for various shapes of tuning curves'. He was puzzled as to why neurons with periodic tuning curves (such as grid cells) are so rare in the brain, given their superior accuracy. He posits that there may be a trade-off between accuracy and decoding time.
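
To make that trade-off concrete, here is a hypothetical Python sketch (my own toy setup, not Movitz's model): a population with single-peaked 'bump' tuning and one with periodic tuning are each decoded with a maximum-likelihood decoder from Poisson spike counts. The intuition is that with very short decoding windows (few spikes), the periodic code's multiple peaks can produce large, ambiguous errors, while over longer windows its sharper tuning pays off.

```python
import numpy as np

rng = np.random.default_rng(1)
s_grid = np.linspace(0, 1, 200, endpoint=False)   # candidate stimulus values on a circle

def rates(s, periodic, n_neurons=20, peak=30.0):
    """Population firing rates (Hz) for a stimulus s in [0, 1)."""
    centers = np.linspace(0, 1, n_neurons, endpoint=False)
    if periodic:
        # grid-like: three firing fields per neuron, shifted phases
        return peak * (0.5 + 0.5 * np.cos(2 * np.pi * 3 * (s - centers)))
    d = np.minimum(np.abs(s - centers), 1 - np.abs(s - centers))  # circular distance
    return peak * np.exp(-d**2 / (2 * 0.1**2))                    # single-peaked bump

def decode_error(periodic, T, trials=500):
    """RMS error of a maximum-likelihood decoder after a window of T seconds."""
    R = np.stack([rates(s, periodic) for s in s_grid])            # (n_s, n_neurons)
    errs = []
    for _ in range(trials):
        s_true = rng.uniform()
        counts = rng.poisson(rates(s_true, periodic) * T)         # Poisson spike counts
        loglik = (counts * np.log(R * T + 1e-12) - R * T).sum(axis=1)
        s_hat = s_grid[np.argmax(loglik)]
        err = min(abs(s_hat - s_true), 1 - abs(s_hat - s_true))   # circular error
        errs.append(err**2)
    return np.sqrt(np.mean(errs))

for T in (0.02, 0.5):   # short vs long decoding window (seconds)
    print(f"T={T:>4}s  bump: {decode_error(False, T):.3f}  "
          f"periodic: {decode_error(True, T):.3f}")
```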

Shervin Safavi (Max Planck Institute for Biological Cybernetics, Tübingen) had a poster on linking efficient coding and criticality. We introduce those concepts and talk about why noise is a feature, not a bug. Shervin is also starting a new lab at TU Dresden, where he wants to understand the computational machinery of cognitive processes and he is looking for interdisciplinary-minded applicants! For Apple Podcast users, find books/papers links at: https://akseliilmanen.wixsite.com/home/post/pod05
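
For readers new to 'criticality', here is a toy Python sketch (a generic branching-process illustration, not Shervin's model): each spike triggers on average m further spikes, and near the critical point m ≈ 1 the resulting 'avalanches' of activity become heavy-tailed, one classic signature that people look for in neural data.

```python
import numpy as np

rng = np.random.default_rng(2)

def avalanche_sizes(m, n_avalanches=10_000, max_size=10_000):
    """Avalanche sizes of a branching process with branching ratio m."""
    sizes = []
    for _ in range(n_avalanches):
        active, size = 1, 1
        while active > 0 and size < max_size:
            active = rng.poisson(m * active)   # active units jointly spawn ~Poisson(m*active) spikes
            size += active
        sizes.append(size)
    return np.array(sizes)

for m in (0.7, 1.0):   # subcritical vs (near-)critical branching ratio
    s = avalanche_sizes(m)
    print(f"m={m}: mean size {s.mean():.1f}, P(size > 100) = {(s > 100).mean():.4f}")
```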

Not familiar with place, grid and head direction cells? Here is my 5min primer.

  • William's publications:
    • Walker et al., 2023 - Unsupervised representation learning with recognition-parametrised probabilistic models preprint
    • Walker et al., 2023 - Prediction under Latent Subgroup Shifts with High-Dimensional Observations preprint
  • Movitz's LinkedIn
  • Movitz's poster from another conference:
  • Movitz's publications:
    • Lenninger et al., 2022 - How short decoding times, stimulus dimensionality and spontaneous activity constrain the shape of tuning curves: A speed-accuracy trade-off preprint
    • Lenninger et al., 2023 - Are single-peaked tuning curves tuned for speed rather than accuracy? paper
  • Shervin's Website
  • Shervin's Twitter: @neuroprinciples
  • For Shervin's new lab: interest mailing list
  • Shervin's publications:
    • Safavi et al., 2022 - Multistability, perceptual value, and internal foraging paper
    • Safavi et al., 2023 - Signatures of criticality in efficient coding networks preprint
  • Synchronization of metronomes video
  • My Twitter @akseli_ilmanen
  • Email: akseli.ilmanen[at]gmail.com
  • The Embodied AI Podcast, my blog, other stuff
  • Music: Space News, License: Z62T4V3QWL

(00:00:00) - Intro

(00:02:53) - William Walker

(00:32:53) - Movitz Lenninger

(00:55:04) - Shervin Safavi


Previous Episode

#4 Paul Middlebrooks: BrainInspired & Podcasting

An episode with my favourite podcast host, Paul Middlebrooks. Paul and I met in Berlin and talked about his journey away from (and back into) academia and why he started his podcast BrainInspired. Yes, there is a lot of podcast meta-talk in this episode: for example, how science podcasts give you a glimpse into another field (as an outsider), and some advice for fellow podcast hosts. We also get into productivity, self-learning, and some big-picture questions about what's holding neuroscience back.

For Apple Podcast users, find books/papers links at: https://akseliilmanen.wixsite.com/akseli-ilmanen/post/pod04

Timestamps:

(00:00:00) - Intro

(00:01:51) - Interesting conversations Paul had at the conference

(00:07:57) - The why and how of podcasting

(00:11:16) - Changing one's mind in science

(00:19:34) - Paul's NeuroAI course

(00:20:46) - Podcasts for self-learning & productivity fallacies

(00:26:15) - Podcast advice

(00:30:58) - Paul is back in academia

(00:38:48) - Neuroscience needs theory (beyond manifolds)

(00:45:50) - Saying thank you to Paul

Next Episode

#6 Kate Jeffery: Grid cells in 3D, entropy & climate change

Kate Jeffery is the head of the School of Psychology & Neuroscience at the University of Glasgow (formerly at UCL). This episode is all about grid cells (background info), which Kate was already recording in the 1990s. We discuss how grid cells' rate maps differ when rats climb through 3D spaces. Here we cover everything from cross-species comparisons (bats, birds) to self-organizing dynamics and symmetry breaking. Kate also shares her (maybe unpopular) view that the hexagonal grid regularity is not functional but a by-product. We also get physics-y by discussing entropy, evolution, complexity, and how they link to memory and the arrow of time. At the end, there is career advice and some thoughts on climate change.

For Apple Podcast users, find books/papers links at: https://akseliilmanen.wixsite.com/home/post/pod06

Not familiar with place, grid or head direction cells? Here is my 5min primer.

  • Kate's Website
  • Kate's publications:
    • Jeffery et al., 2015 - Neural encoding of large-scale three-dimensional space—properties and constraints paper
    • Casali et al., 2019 - Altered neural odometry in the vertical dimension paper
    • Jeffery et al., 2019 - On the Statistical Mechanics of Life: Schrödinger Revisited paper
    • Jeffery et al., 2020 - Transitions in Brain Evolution: Space, Time and Entropy paper
    • Grieves et al., 2021 - Irregular distribution of grid cell firing fields in rats exploring a 3D volumetric space paper
    • Jeffery, 2022 - Symmetries and asymmetries in the neural encoding of 3D space paper
    • Rae et al., 2022 - Climate crisis and ecological emergency: Why they concern (neuro)scientists, and what we can do paper
  • Other reading mentioned:
    • Cheng, 1986 - A purely geometric module in the rat's spatial representation paper
    • My article on Michel Foucault and climate change deniers
  • My Twitter @akseli_ilmanen
  • Email: akseli.ilmanen[at]gmail.com
  • The Embodied AI Podcast, my blog, other stuff
  • Music: Space News, License: Z62T4V3QWL

Timestamps:

(00:00:00) - Intro

(00:02:14) - Missing out on a Nobel Prize

(00:11:05) - Place cells & grid cells interactions

(00:15:19) - Grid cells and rats climbing in 3D

(00:27:24) - (Spatial) ecological niches of rats, bats and birds

(00:32:55) - Self-organizing dynamics

(00:35:36) - 'Speed' in navigating physical vs abstract spaces

(00:40:19) - 3D = 2D planes stitched together?

(00:46:22) - Symmetry breaking in

(00:50:20) - 'A purely geometric module' (Cheng, 1986)

(01:01:24) - Why are grid cells grid-like?

(01:05:22) - Kate's (grid cell) secrets

(01:08:18) - Entropy, evolution, and complexity

(01:17:45) - Memory as metastable states

(01:22:07) - Entropy, memory & the arrow of time

(01:25:03) - Career Advice

(01:28:35) - Climate change & sociology

(01:38:07) - New position in Glasgow
