
Inductor - Post-grad FX passes

PyTorch Developer Podcast

04/12/24 • 24 min

The post-grad FX passes in Inductor run after AOTAutograd has functionalized and normalized the input program into separate forward/backward graphs. As such, they can generally assume that the graph in question is functionalized, except for some mutations to inputs at the end of the graph. At the end of the post-grad passes, special passes reintroduce mutation into the graph before it proceeds to the rest of Inductor lowering, which is generally aware of mutation. The post-grad FX passes are varied, but they are typically domain-specific passes that make local changes to specific parts of the graph.
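To make "local changes to specific parts of the graph" concrete, here is a minimal sketch of an FX graph pass in the same style: it walks the node list, pattern-matches one construct, and rewrites it in place. The pass itself (`replace_double_add`) and the toy rewrite rule are illustrative inventions, not an actual Inductor pass; Inductor's real post-grad passes are more involved, but they use the same `torch.fx` node-manipulation APIs shown here.

```python
import torch
import torch.fx

# Toy module whose traced graph we will rewrite.
class M(torch.nn.Module):
    def forward(self, x):
        return torch.relu(torch.add(x, x))

# A hypothetical local pass: rewrite add(x, x) -> mul(x, 2).
def replace_double_add(gm: torch.fx.GraphModule) -> torch.fx.GraphModule:
    for node in list(gm.graph.nodes):
        if node.op == "call_function" and node.target is torch.add:
            a, b = node.args
            if a is b:  # matched the add(x, x) pattern
                # Insert the replacement node right after the match,
                # redirect all users to it, then delete the old node.
                with gm.graph.inserting_after(node):
                    new_node = gm.graph.call_function(torch.mul, (a, 2))
                node.replace_all_uses_with(new_node)
                gm.graph.erase_node(node)
    gm.graph.lint()   # sanity-check the rewritten graph
    gm.recompile()    # regenerate the Python code from the graph
    return gm

gm = replace_double_add(torch.fx.symbolic_trace(M()))
x = torch.randn(4)
assert torch.allclose(gm(x), torch.relu(x * 2))
```

The pattern, redirect the users, erase the old node, lint, recompile structure is the common shape of such passes; because the graph is functionalized at this stage, a pass like this can rewrite nodes freely without worrying about aliasing or mutation semantics.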
