Deep Papers is a podcast series featuring deep dives into today's seminal AI papers and research. Hosted by AI Pub creator Brian Burns and Arize AI founders Jason Lopatecki and Aparna Dhinakaran, each episode profiles the people and techniques behind cutting-edge breakthroughs in machine learning.
In this episode, we interview Dan Fu and Tri Dao, inventors of "Hungry Hungry Hippos" (aka "H3"). This language modeling architecture performs comparably to transformers while allowing much longer contexts: for the technically inclined, it scales as O(n log n) with sequence length rather than the O(n^2) of attention. Listen to learn about the major ideas and history behind H3, state space models and what makes them special, what products can be built with long-context language models, and hints at Dan and Tri's future (unpublished) research.
Learn more about AI observability and evaluation in our course, join the Arize AI Slack community, or get the latest on LinkedIn and X.
02/13/23 • 41 min