Co-Pilot and Misconfigured Permissions - A Looming Threat?
The Security Swarm Podcast · 02/14/24 • 32 min
The use of Large Language Models (LLMs), like ChatGPT, has skyrocketed, infiltrating many facets of modern life. In today's podcast episode, Andy and Paul Schnackenburg explore Microsoft 365 Co-Pilot and some surprising risks it can surface. Microsoft 365 Co-Pilot is more than just a virtual assistant: it's a powerhouse of productivity. It is a versatile generative AI tool embedded within various Microsoft 365 applications, and as such, it can execute tasks across different software platforms in seconds.
Amidst discussions about Co-Pilot’s unique features and functionalities, many wonder: How does M365 Co-Pilot differ from other LLMs, and what implications does this hold for data security and privacy? Tune in to learn more!
Timestamps:
(4:16) – How is Co-Pilot different from other Large Language Models?
(11:40) – How are misconfigured permissions a special danger with Co-Pilot?
(16:53) – How do M365 tenant permissions get so “misconfigured”?
(21:53) – How can your organization use Co-Pilot safely?
(26:11) – How can you easily right-size your M365 permissions before enabling Co-Pilot?
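On the topic of right-sizing permissions before enabling Co-Pilot, the core idea is auditing existing sharing grants for over-broad scopes (anonymous or organization-wide links) that Co-Pilot could surface in answers. Below is a minimal, hypothetical Python sketch of that triage step. The dictionaries mimic the general shape of Microsoft Graph "permission" resources, but the data and the `flag_overshared` helper are invented for illustration, not part of any tool discussed in the episode.

```python
# Illustrative sketch only: flag over-broad sharing entries before
# enabling Co-Pilot. The permission records below are invented sample
# data shaped loosely like Microsoft Graph permission resources.

def flag_overshared(permissions):
    """Return entries whose sharing link is anonymous or organization-wide."""
    risky_scopes = {"anonymous", "organization"}
    return [
        p for p in permissions
        if p.get("link", {}).get("scope") in risky_scopes
    ]

if __name__ == "__main__":
    sample = [
        {"id": "1", "link": {"scope": "anonymous", "type": "view"}},
        {"id": "2", "link": {"scope": "users", "type": "edit"}},
        {"id": "3", "link": {"scope": "organization", "type": "view"}},
    ]
    for p in flag_overshared(sample):
        print(f"Review sharing entry {p['id']} (scope: {p['link']['scope']})")
```

In practice you would feed this kind of filter with real permission data pulled from your tenant, then review and tighten each flagged entry before turning Co-Pilot on.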
Episode Resources:
Paul’s article on preparing for Co-Pilot
Webinar with demo showcasing the theft of M365 credentials