Oops! Microsoft AI team accidentally spills 38 TB of data

Microsoft swears it’s not as bad as it sounds.

Picture this: You’re a massive tech company, and you decide to share some open-source code and AI models on GitHub. But in a plot twist that would make M. Night Shyamalan proud, you end up exposing a mammoth 38 terabytes of personal data.

That’s exactly what happened to a Microsoft AI research team.

Cybersecurity firm Wiz stumbled upon a digital Pandora’s box while perusing the shared files—a link that led straight to backups of Microsoft employees’ computers.

This wasn’t just any link, folks. It was the proverbial golden ticket, granting access to Microsoft service passwords, secret keys, and even over 30,000 internal Teams messages from hundreds of Microsoft employees.

The Silver Lining

Before you start panicking, Microsoft was quick to clarify that no customer data was compromised, and no other internal services were jeopardized.

“No customer data was exposed, and no other internal services were put at risk because of this issue. No customer action is required in response to this issue,” Microsoft wrote in a blog post.

So, it seems like this was less of a Titanic-sized iceberg and more of an inconvenient speed bump.

The inclusion of the link wasn’t an oversight but a deliberate move. The researchers were using an Azure feature called SAS (Shared Access Signature) tokens to create shareable links.

This feature allows users to control the access level, ranging from a single file to their entire storage. In this case, the researchers went full Oprah and gave away access to the whole storage account.
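To make the scope problem concrete, here’s a minimal sketch of how an account-level SAS token is typically generated with the azure-storage-blob Python SDK. The account name, key, and expiry are placeholders for illustration, not details from the actual incident:

```python
from datetime import datetime, timedelta
from azure.storage.blob import (
    generate_account_sas,
    AccountSasPermissions,
    ResourceTypes,
)

# Hypothetical account name and key -- placeholders, not real credentials.
ACCOUNT_NAME = "airesearchstorage"
ACCOUNT_KEY = "<storage-account-key>"

# An account-level SAS like this lets anyone holding the link reach
# everything in the storage account until the expiry passes.
overly_broad_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(read=True, write=True, list=True),
    expiry=datetime.utcnow() + timedelta(days=365 * 10),  # years-long expiry
)

# The resulting URL is the "golden ticket": possession of the link is the
# only credential needed.
share_url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/?{overly_broad_sas}"
```

Because the token itself carries the permissions, there’s no account to revoke and no password to rotate; whoever has the URL has the access.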

Damage Control

The security lapse was spotted by Wiz on June 22 and reported to Microsoft, which revoked the SAS token the next day.

Microsoft also said that it regularly scans all its public repositories. However, this particular link had been erroneously marked as a “false positive” by its system.

Learning From Mistakes

Microsoft has since rectified the issue and enhanced its system to detect overly permissive SAS tokens in the future. But the potential for data leaks and serious privacy issues still looms whenever SAS tokens are mishandled.

Microsoft stressed the importance of proper creation and handling of SAS tokens and shared a list of best practices for using these tokens—a guide it is presumably (and hopefully) following itself.
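For contrast with the sketch above, here’s what a more defensively scoped token might look like: limited to a single blob, read-only, and short-lived. This is a hedged example reflecting commonly recommended SAS practices, not a reproduction of Microsoft’s own guide; the container and file names are made up:

```python
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

ACCOUNT_NAME = "airesearchstorage"      # placeholder
ACCOUNT_KEY = "<storage-account-key>"   # placeholder

# Scope the token to one blob, grant read-only access, and keep the
# expiry short so a leaked link goes stale quickly.
scoped_sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="open-models",
    blob_name="model-weights.bin",
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

download_url = (
    f"https://{ACCOUNT_NAME}.blob.core.windows.net/"
    f"open-models/model-weights.bin?{scoped_sas}"
)
```

Had the shared link been scoped this way, the worst case would have been an hour of read access to one public model file rather than years of access to an entire storage account.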

So, while this incident was more of an “oopsie” than a full-blown disaster, it serves as a reminder that even tech giants can trip over their own shoelaces. Let’s hope Microsoft has learned its lesson and tied those laces a little tighter.

Have any thoughts on this? Drop us a line below in the comments, or carry the discussion to our Twitter or Facebook.
