Oops! Microsoft AI team accidentally spills 38 TB of data
Microsoft swears it’s not as bad as it sounds.
Picture this: You’re a massive tech company, and you decide to share some open-source code and AI models on GitHub. But in a plot twist that would make M. Night Shyamalan proud, you end up exposing a mammoth 38 terabytes of personal data.
That’s exactly what happened to a Microsoft AI research team.
Cybersecurity firm Wiz stumbled upon a digital Pandora’s box while perusing the shared files—a link that led straight to backups of Microsoft employees’ computers.
This wasn’t just any link, folks. It was the proverbial golden ticket, granting access to Microsoft service passwords, secret keys, and even over 30,000 internal Teams messages from hundreds of Microsoft employees.
The Silver Lining
Before you start panicking, Microsoft was quick to clarify that no customer data was compromised, and no other internal services were jeopardized.
“No customer data was exposed, and no other internal services were put at risk because of this issue. No customer action is required in response to this issue,” Microsoft wrote in a blog post.
So, it seems like this was less of a Titanic-sized iceberg and more of an inconvenient speed bump.
The inclusion of the link wasn’t an oversight but a deliberate move. The researchers were using an Azure Storage feature called “SAS tokens” (shared access signatures) to create shareable links.
This feature allows users to control the access level, ranging from a single file to their entire storage. In this case, the researchers went full Oprah and gave away access to the whole storage account.
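The mechanics are worth a quick sketch. A SAS token is, in essence, a bundle of query parameters (permissions, expiry, scope) signed with HMAC-SHA256 using the storage account key; whoever holds the resulting URL holds the access it grants. The stdlib-only Python below is a simplified illustration of that signing scheme — `make_sas_token` and its two-field string-to-sign are a hypothetical reduction, not Azure’s actual token format, which signs many more fields:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def make_sas_token(account_key_b64: str, permissions: str, expiry: str) -> str:
    """Build a simplified SAS-style token: query parameters plus an
    HMAC-SHA256 signature computed with the storage account key.

    Illustrative only -- the real Azure string-to-sign includes more fields.
    """
    string_to_sign = "\n".join([permissions, expiry])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    # sp = permissions, se = expiry, sig = signature (mirroring SAS naming)
    return urlencode({"sp": permissions, "se": expiry, "sig": sig})

token = make_sas_token(
    account_key_b64=base64.b64encode(b"demo-account-key").decode(),
    permissions="r",  # read-only: the conservative choice
    expiry="2023-06-23T00:00:00Z",
)
print(token)
```

The key point the sketch makes visible: the token is self-contained. Nothing on the server tracks who was given it, so scope and expiry chosen at creation time are the only guardrails.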
The security lapse was spotted by Wiz on June 22 and reported to Microsoft, which promptly revoked the SAS token by the next day.
Microsoft also said that it regularly scans all its public repositories. However, this particular link had been erroneously marked as a “false positive” by its system.
Learning From Mistakes
Microsoft has since rectified the issue and enhanced its system to detect overly permissive SAS tokens in the future. But the potential for data leaks and serious privacy issues still looms wherever SAS tokens are mishandled.
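What might “detecting overly permissive SAS tokens” look like? As a hedged sketch (not Microsoft’s actual tooling — `audit_sas_url` and its thresholds are hypothetical), a scanner can parse a SAS URL’s query string and flag permission flags beyond read-only or expiry dates set absurdly far in the future:

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs, urlparse

# Hypothetical red flags: write-capable permission letters and multi-year expiry.
RISKY_PERMISSIONS = set("wdacu")  # write, delete, add, create, update

def audit_sas_url(url: str, now: datetime) -> list[str]:
    """Return a list of findings for an overly permissive SAS-style URL."""
    params = parse_qs(urlparse(url).query)
    findings = []
    perms = params.get("sp", [""])[0]
    if RISKY_PERMISSIONS & set(perms):
        findings.append(f"grants more than read access: sp={perms}")
    expiry = params.get("se", [""])[0]
    if expiry:
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        if (exp - now).days > 365:
            findings.append(f"expiry more than a year out: se={expiry}")
    return findings

now = datetime(2023, 6, 22, tzinfo=timezone.utc)
# Made-up URL with full read/write/delete access and a far-future expiry.
url = ("https://example.blob.core.windows.net/models"
       "?sp=racwdl&se=2051-10-05T00:00:00Z&sig=abc")
for finding in audit_sas_url(url, now):
    print(finding)
```

A scanner like this can only see tokens that surface in public code, which is exactly why Microsoft’s “false positive” misclassification mattered: once the automated check waves a token through, nothing else stands between it and the open internet.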
Microsoft stressed the importance of proper creation and handling of SAS tokens and shared a list of best practices for using these tokens—a guide it is presumably (and hopefully) following itself.
So, while this incident was more of an “oopsie” than a full-blown disaster, it serves as a reminder that even tech giants can trip over their own shoelaces. Let’s hope Microsoft has learned its lesson and tied those laces a little tighter.