The token used by Microsoft not only unintentionally allowed access to additional storage through its overly broad access scope, but it was also misconfigured to grant “full control” permissions instead of read-only, enabling a potential attacker not just to view the private files but to delete or overwrite existing files as well.
In Azure, a SAS token is a signed URL granting customizable access to Azure Storage data, with permissions ranging from read-only to full control. It can cover a single file, a container, or an entire storage account, and the creator can set an optional expiration time, including setting it to never expire.
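For illustration, here is a minimal sketch of how such tokens are minted with the azure-storage-blob Python SDK. The account name, key, and blob path are placeholders, not values from the incident; the second call shows how the same API can just as easily produce the dangerous variant:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    ResourceTypes,
    generate_account_sas,
)

# Placeholder credentials for illustration only.
ACCOUNT_NAME = "examplestorage"
ACCOUNT_KEY = "<account-key>"

# A tightly scoped token: read-only, objects (blobs) only, one-hour lifetime.
narrow_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(object=True),
    permission=AccountSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# The risky variant: full permissions on every resource type, with an
# expiry decades in the future.
broad_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True, create=True
    ),
    expiry=datetime(2050, 1, 1, tzinfo=timezone.utc),
)

# Either token is simply appended to a storage URL as a query string.
url = f"https://{ACCOUNT_NAME}.blob.core.windows.net/models/model.ckpt?{narrow_sas}"
```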
The full-access configuration “is particularly interesting considering the repository’s original purpose: providing AI models for use in training code,” Wiz said. The format of the model data file meant for downloading is ckpt, a format produced by the TensorFlow library. “It’s formatted using Python’s Pickle formatter, which is prone to arbitrary code execution by design. Meaning, an attacker could have (also) injected malicious code into all the AI models in this storage account,” Wiz added.
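To see why this is dangerous by design rather than a bug, consider a minimal, harmless sketch of the standard mechanism (not the actual payload from any attack): pickle lets an object declare, via __reduce__, any callable to be invoked when the data is loaded.

```python
import pickle


class MaliciousPayload:
    # __reduce__ tells pickle how to rebuild the object on load. It may
    # return any callable plus arguments, so unpickling runs
    # attacker-chosen code.
    def __reduce__(self):
        import os
        return (os.system, ("echo code executed during unpickling",))


# An attacker embeds the payload in a file that looks like model data...
blob = pickle.dumps(MaliciousPayload())

# ...and the victim triggers it simply by loading the "model".
pickle.loads(blob)  # runs the shell command above
```

Anyone who downloaded a tampered model and loaded it would execute the embedded code, which is why a writable storage account full of pickle-based model files is such an attractive target.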
SAS tokens are hard to manage
The granularity of SAS tokens opens up the risk of granting too much access. In the Microsoft GitHub case, the token allowed full control permissions, on the entire account, forever.
Microsoft’s repository used an Account SAS token, one of three types of SAS tokens; the other two, Service SAS and User Delegation SAS, grant service (application) and user access, respectively.
Account SAS tokens are extremely risky because they are weak in terms of permissions, hygiene, management, and monitoring, Wiz noted. Permissions on SAS tokens can grant a high level of access to storage accounts, either through excessive permissions or through broad access scopes.
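By contrast, a User Delegation SAS is signed with a short-lived key tied to an Azure AD identity rather than the account key, so it can be audited and revoked. A minimal sketch, assuming the azure-storage-blob and azure-identity packages and placeholder account, container, and blob names:

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobSasPermissions, BlobServiceClient, generate_blob_sas

ACCOUNT_NAME = "examplestorage"  # placeholder

# Authenticate with Azure AD instead of the storage account key.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)

now = datetime.now(timezone.utc)
expiry = now + timedelta(hours=1)

# The delegation key itself is short-lived and bound to an AD identity,
# unlike an account key, which is long-lived and all-powerful.
delegation_key = service.get_user_delegation_key(now, expiry)

sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="models",   # placeholder container
    blob_name="model.ckpt",    # placeholder blob
    user_delegation_key=delegation_key,
    permission=BlobSasPermissions(read=True),  # read-only, single blob
    expiry=expiry,
)
```

Because the resulting token is scoped to one blob, read-only, and expires within an hour, a leak of this URL would expose far less than the account-wide, full-control, never-expiring token in the Microsoft incident.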