
Responsible AI panel at DevFest Silicon Valley 2024
11/14/24 • 46 min
Join People of AI host Ashley Oldacre as she moderates a panel on Responsible AI at DevFest Silicon Valley. Ashley teams up with four industry experts to discuss their personal journeys in technology and explore the future of AI. Learn how these experts keep responsible AI at the forefront of emerging technologies.
Resources:
Learn more about the amazing panelists:
Jigyasa Grover, Lead, AI & Research at Bordo AI → https://goo.gle/3Z42QJb
Deepa Subramanian, Google Developer Expert → https://goo.gle/3AE5P1v
Daniel Goncharov, AI Research Engineer, Stanford University School of Medicine → https://goo.gle/40M2hF4
Vikram Tiwari, Lead ML Engineer, Assembled → https://goo.gle/4eJB3m1
Previous Episode

3-step approach to mobile app compliance with Checks co-founders Fergus Hurley and Nia Castelly
Meet Fergus Hurley and Nia Castelly, co-founders of Checks, Google's AI-powered compliance platform. Fergus Hurley, Product Management Director at Google Cloud as well as General Manager and Co-founder of Checks, and Nia Castelly, Co-founder and Head of Legal at Checks, share the inner workings of Checks. Discover how they built Checks, initially leveraging natural language processing (NLP) and now Gemini, to review and analyze privacy policies at scale. Learn about their three-step approach to detecting what an app says it does, how they test apps to see what's actually going on, AI safety measures, Android vitals, and much more in this episode of People of AI.
Resources: Checks → https://goo.gle/4ejR81h
Code compliance → https://goo.gle/3YGOH30
Blog: Checks, Google’s AI-powered privacy platform → https://goo.gle/4fszoSu
The new new thing → https://goo.gle/3ChItPK
GDPR → https://goo.gle/3NVO0hg
Android Vitals → https://goo.gle/3YTs9xu
The Singularity is Near → https://goo.gle/4fzcMQ5
Next Episode

NotebookLM with Steven Johnson and Raiza Martin
Explore the fascinating world of AI and its potential to transform how we work, learn, and create with NotebookLM. Join guests Steven Johnson, Editorial Director of NotebookLM, and Raiza Martin, Senior Product Manager at Google Labs leading NotebookLM, for a deep dive into its inspiration, development, practical use cases, and more in this People of AI episode.
Resources:
A.I. Is Mastering Language. Should We Trust What It Says? → https://goo.gle/3Cub1Wd
NotebookLM website → https://goo.gle/3UZOwPe
Create your first notebook → https://goo.gle/3OflIPc
Adjacent possible newsletter → https://goo.gle/3AGpe21
#TensorFlow #PeopleofAI