
#007: AI, Open Source, and the Future of Embedded Development: How Much Code Will We Actually Write?

04/29/25 • 55 min

Coredump Sessions

In today's Coredump Session, we dive into a wide-ranging conversation about the intersection of AI, open source, and embedded systems with the teams from Memfault and Golioth. From the evolution of AI at the edge to the emerging role of large language models (LLMs) in firmware development, the panel explores where innovation is happening today — and where expectations still outpace reality. Listen in as they untangle the practical, the possible, and the hype shaping the future of IoT devices.

Speakers:

  • François Baldassari: CEO & Founder, Memfault
  • Thomas Sarlandie: Field CTO, Memfault
  • Jonathan Beri: CEO & Founder, Golioth
  • Dan Mangum: CTO, Golioth

Key Takeaways:

  • AI has been quietly powering embedded devices for years, especially in edge applications like voice recognition and computer vision.
  • The biggest gains in IoT today often come from cloud-based AI analytics, not necessarily from AI models running directly on devices.
  • LLMs are reshaping firmware development workflows but are not yet widely adopted for production-grade embedded codebases.
  • Use cases like audio and video processing have seen the fastest real-world adoption of AI at the edge.
  • Caution is warranted when integrating AI into safety-critical systems, where determinism is crucial.
  • Cloud-to-device AI models are becoming the go-to for fleet operations, anomaly detection, and predictive maintenance.
  • Many promising LLM-based consumer products struggle because hardware constraints and cloud dependence create friction.
  • The future of embedded AI may lie in hybrid architectures that balance on-device intelligence with cloud support.

Chapters:

00:00 Episode Teasers & Welcome

01:10 Meet the Panel: Memfault x Golioth

02:56 Why AI at the Edge Isn’t Actually New

05:33 The Real Use Cases for AI in Embedded Devices

08:07 How Much Chaos Are You Willing to Introduce?

11:19 Edge AI vs. Cloud AI: Where It’s Working Today

13:50 LLMs in Embedded: Promise vs. Reality

17:16 Why Hardware Can’t Keep Up with AI’s Pace

20:15 Building Unique Models When Public Datasets Fail

36:14 Open Source’s Big Moment (and What Comes Next)

42:49 Will AI Kill Open Source Contributions?

49:30 How AI Could Change Software Supply Chains

52:24 How to Stay Relevant as an Engineer in the AI Era

Join the Interrupt Slack

Watch this episode on YouTube

Follow Memfault

Other ways to listen:

Apple Podcasts

iHeartRadio

Amazon Music

Goodpods

Castbox

Visit our website


Previous Episode

#006: Pebble’s Code is Free: Three Former Pebble Engineers Discuss Why It's Important (PART 2/2)

In today’s Coredump Session, the team reunites to unpack the behind-the-scenes lessons from their time building firmware at Pebble. This episode dives into the risks, decisions, and sheer grit behind a near-disastrous OTA update—and the ingenious hack that saved a million smartwatches. It’s a candid look at the intersection of rapid development, firmware stability, and real-world consequences.

Key Takeaways:

  • Pebble’s open approach to developer access often came at the cost of security best practices, reflecting early startup trade-offs.
  • A critical OTA update bug almost bricked Pebble devices—but the team recovered using a clever BLE-based stack hack.
  • Lack of formal security measures at the time (e.g., unsigned firmware) unintentionally enabled recovery from a serious update failure.
  • Static analysis and test automation became top priorities following the OTA scare to prevent repeat incidents.
  • The story reveals how firmware constraints (like code size and inline functions) can lead to high-stakes bugs.
  • Investing in robust release processes—including version-to-version OTA testing—proved vital.
  • Real security risks included impersonation on e-commerce platforms and potential ransom via malicious OTA compromise.
  • The importance of "hiring your hackers" was humorously noted as a de facto security strategy.

Chapters:

00:00 Episode Teasers & Welcome

01:22 Why Pebble’s Firmware Was Open (and Unsigned)

05:01 The Security Tradeoffs That Enabled Speed

11:00 The OTA Bug That Could Have Bricked Everything

15:26 Hacking Our Way Out with BLE Stack Overflow

17:47 Lessons Learned: Test Automation & Static Analysis

26:30 How Pebble Built a Developer Ecosystem

29:56 CloudPebble, Watchface Generator & Developer Tools

42:55 Backporting Pebble 3.0 to Legacy Hardware

49:02 The Bootloader Rewrite & Other Wild Optimizations

53:31 Simulators, Robot Arms & Debugging in CI

56:40 Firmware Signing, Anti-Rollback & Secure Update

1:06:10 Coding in Rust? What We’d Do Differently Today

1:08:28 Where to Start with Open Source Pebble Development

Join the Interrupt Slack

Watch this episode on YouTube

Follow Memfault

Other ways to listen:

Apple Podcasts

iHeartRadio

Amazon Music

Goodpods

Castbox

Visit our website
