Microsoft Leaked Secrets

"An employee error at Microsoft led to the exposure of a number of secrets and 38 terabytes of sensitive data."🚨.

Wiz, a cloud security startup, announced that it had discovered an exposure tied to a Microsoft AI GitHub repository; the leaked data included more than 30,000 internal Microsoft Teams messages. The root cause was a secret (a SAS token) published on GitHub.

The repository belongs to Microsoft's AI research team, and its purpose was to provide open-source code and AI models for image recognition. Within the repository, users were directed to download the AI models via an Azure Storage URL, and the SAS token was mistakenly included in that Blob storage URL, which was then shared in the public GitHub repository.

Furthermore, the token was misconfigured: instead of allowing access only to the specific files, it granted access to the entire storage account.

✅ Because of the misconfigured SAS token, anyone who found the URL in the repository could access 38TB of data: additional secrets, passwords, and internal chat messages from the Microsoft team.

Worse, the token was not read-only: it granted "full control" permissions, allowing an attacker to delete or overwrite files.
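Both problems are visible in the token itself, because a SAS URL carries its permissions (`sp`) and expiry (`se`) as plain query parameters. A minimal triage sketch in Python — the URL below is hypothetical, modelled loosely on the reported incident, and the helper is an assumption, not a Microsoft or Azure tool:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import parse_qs, urlparse

def audit_sas_url(url: str) -> list[str]:
    """Flag risky settings visible in a SAS URL's query parameters."""
    params = parse_qs(urlparse(url).query)
    findings = []
    # 'sp' lists the granted permissions: r=read, w=write, d=delete, l=list, ...
    perms = params.get("sp", [""])[0]
    if set("wd") & set(perms):
        findings.append(f"grants write/delete (sp={perms}) instead of read-only")
    # 'se' is the expiry; anything more than an hour out is long-lived
    # by Azure's own guidance
    expiry = params.get("se", [""])[0]
    if expiry:
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        if exp - datetime.now(timezone.utc) > timedelta(hours=1):
            findings.append(f"long-lived token (se={expiry})")
    return findings

# Hypothetical URL: full permissions and an expiry decades in the future
url = ("https://example.blob.core.windows.net/models/model.ckpt"
       "?sv=2021-08-06&sr=c&sp=rwdl&se=2051-10-06T07:00:00Z&sig=REDACTED")
print(audit_sas_url(url))
```

Checking these two parameters alone would have flagged the leaked token as both over-privileged and effectively non-expiring.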

This setup left the door open for an attacker to inject malicious code into the AI models, which could have propagated the risk to every user of those models. In response, Microsoft ran a full secret scan of all of its public GitHub repositories and those of partner organisations, and extended its detection to cover all SAS tokens.

💡Azure Storage recommends the following best practices when working with SAS URLs: (Source: MSRC blog)

1️⃣ Apply the principle of least privilege: Scope the SAS URL to the smallest set of resources the client needs (e.g., a single blob) and limit permissions to only what the application needs (e.g., read-only, write-only).

2️⃣ Use short-lived SAS: Always use a near-term expiration time when creating a SAS, and have clients request a new SAS URL when needed. Azure Storage recommends one hour or less for all SAS URLs.

3️⃣ Handle SAS tokens carefully: SAS URLs grant access to data and should be treated like application secrets. Expose SAS URLs only to clients who need access to the storage account.

4️⃣ Have a revocation plan: To granularly revoke SAS within a container, associate the SAS token with a storage access policy. Be prepared to remove storage access policies or rotate storage account keys if SAS or shared keys are compromised.

5️⃣ Application monitoring and auditing: Enable Azure Monitor and Azure Storage logs to track how requests to your storage account are authenticated. Enable SAS expiration policies to detect clients using long-lived SAS URLs.
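Putting 1️⃣ and 2️⃣ together: a read-only SAS, scoped to a single blob, that expires in one hour. This is an illustrative stdlib sketch — the string-to-sign below is deliberately simplified and is *not* Azure's real signing format; in practice you would call `generate_blob_sas` from the `azure-storage-blob` SDK, which the sketch only imitates:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def make_scoped_sas(account_key_b64: str, resource_path: str) -> str:
    """Build a read-only, one-hour SAS query string for a single blob.

    NOTE: the signature here is a simplified illustration of HMAC-based
    signing, not Azure's actual string-to-sign; use the azure-storage-blob
    SDK's generate_blob_sas in production.
    """
    expiry = (datetime.now(timezone.utc)
              + timedelta(hours=1)).strftime("%Y-%m-%dT%H:%M:%SZ")
    fields = {
        "sp": "r",     # least privilege: read-only
        "sr": "b",     # scope: a single blob, not the whole account
        "se": expiry,  # short-lived: one hour, per Azure's guidance
    }
    to_sign = "\n".join([fields["sp"], fields["se"], resource_path])
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, to_sign.encode(), hashlib.sha256).digest()).decode()
    return urlencode({**fields, "sig": sig})

sas = make_scoped_sas(base64.b64encode(b"demo-key").decode(),
                      "/models/model.ckpt")
print(sas)
```

The leaked token inverted every one of these choices: account-wide scope, full-control permissions, and an expiry decades away.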

🧐 As the Microsoft incident shows, secrets exposed in widely accessible places are easy targets, and public source code repositories such as those on GitHub are a particularly common channel for internal information leaks.

And it's not just source code repositories: secrets are exposed everywhere, including internal systems and collaborative SaaS solutions, and each exposure can lead to excessive privileges and privilege escalation.

🔍 Secret Detection: The best prevention is a robust engine that detects secrets in real time and enforces secret-driven security. Committing tokens to source code repositories, scoping tokens to an entire storage account, granting high-privilege SAS permissions, setting expirations that never arrive: all of these come down to human error, and process alone can only reduce it so much.
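A starting point for such detection is scanning whatever leaves the organisation — commits, chat exports, config files — for SAS-shaped strings. A minimal sketch: the regex keys on the `sig=` parameter that every SAS URL carries; a real engine would add entropy checks and many more secret patterns (the sample text is hypothetical):

```python
import re

# Every SAS URL carries a 'sig=' parameter holding a base64-encoded
# (URL-escaped) HMAC signature; flag any line containing one.
SAS_SIG = re.compile(r"[?&]sig=[A-Za-z0-9%+/=]{20,}")

def find_sas_leaks(text: str) -> list[tuple[int, str]]:
    """Return (line_number, line) pairs that look like embedded SAS tokens."""
    return [(i, line) for i, line in enumerate(text.splitlines(), 1)
            if SAS_SIG.search(line)]

sample = """\
MODEL_URL = "https://example.blob.core.windows.net/models/m.ckpt?sv=2021-08-06&sp=r&sig=abc123DEF456ghi789JKLmno%3D"
README = "no secrets here"
"""
leaks = find_sas_leaks(sample)
print(leaks)
```

Run as a pre-commit hook or CI gate, a check like this catches the token before it ever reaches a public repository.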