#8 Uri Hasson: Language in the real world for brains and AI

Brain Space Time Podcast

12/13/23 • 56 min

Uri Hasson runs a lab at Princeton, where he investigates the neural basis of natural language acquisition and processing as it unfolds in the real world. As Uri visited Tübingen (where I am doing my master's), we were able to meet in person. Originally, I planned to talk about his idea of temporal receptive windows and how different brain regions (e.g. the default mode network) operate at different timescales. However, we ended up talking more about Wittgenstein, evolution, and ChatGPT. An underlying thread throughout the conversation was that, for both biological and artificial agents, language is not clever symbol and rule manipulation but a brute-force fitting to statistics across (Wittgensteinian) 'contexts'. This view is best articulated in Uri's Direct Fit paper. We also connect this to transformers and discuss what's missing in AI: multimodal integration, episodic memory, and interactive sociality. At the end, I ask Uri about his 1000 days project, talking to crows, and "understanding" in neuroscience/AI.

For Apple Podcast users, find books/papers links at: https://akseliilmanen.wixsite.com/home/post/pod08

  • Uri's Website
  • Twitter: @HassonLab
  • Uri's publications & talks:
    • Hasson et al., 2015 - Hierarchical process memory: memory as an integral component of information processing (temporal receptive windows paper)
    • Hasson et al., 2020 - Direct Fit to Nature: An Evolutionary Perspective on Biological and Artificial Neural Networks paper
    • Yeshurun et al., 2021 - The default mode network: where the idiosyncratic self meets the shared social world paper
    • Goldstein et al., 2022 - The Temporal Structure of Language Processing in the Human Brain Corresponds to The Layered Hierarchy of Deep Language Models preprint
    • Nguyen et al., 2022 - Teacher student neural coupling during teaching and learning paper
    • Goldstein et al., 2022 - Shared computational principles for language processing in humans and deep language models paper
  • Also mentioned:
    • Podcast episode with Tony Zador on Genomic Bottlenecks link
  • My Twitter @akseli_ilmanen
  • Email: akseli.ilmanen[at]gmail.com
  • Brain Space Time Podcast, my blog, other stuff
  • Music: Space News, License: Z62T4V3QWL

Timestamps:

(00:00:00) - Intro

(00:04:52) - Studying language in the real world

(00:07:57) - Wittgenstein

(00:11:10) - Evolution and the default mode network

(00:20:54) - Overparameterized deep learning works

(00:25:02) - Direct Fit paper and generalization

(00:39:37) - Episodic memory and sociality in language models

(00:47:15) - 1000 days project and talking to crows

(00:52:14) - "Understanding" in neuroscience
