A recent Forbes investigation revealed that Microsoft has allegedly been handing over Bitlocker encryption recovery keys to law enforcement when served with warrants. Microsoft says it receives about 20 such requests annually. Taken narrowly, this may appear to be a routine case of lawful compliance. On closer inspection, it raises a consequential question about how modern digital systems are designed and who ultimately controls the data they hold.
The debate centers on data sovereignty: whether individuals and organizations truly control their own data, or whether that control can be involuntarily transferred because of architectural choices made by technology providers.
BitLocker's cryptography itself is strong. Federal investigators have acknowledged they cannot defeat it cryptographically; access depends on possession of the recovery key. The issue, then, is not the strength of the encryption, but where the keys reside.
Microsoft commonly recommends that users back up BitLocker recovery keys to a Microsoft account for convenience. That choice means Microsoft may retain the technical ability to unlock a customer’s device. When a third party holds both encrypted data and the keys required to decrypt it, control is no longer exclusive. Data sovereignty has already been diluted—long before any warrant is issued.
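The custody problem can be reduced to a toy model. The sketch below is illustrative only, it is not real cryptography and not how BitLocker works internally; the XOR keystream, function names, and scenario variables are all invented for this example. The point it demonstrates is architectural: whoever holds a copy of the key can read the data, so escrowing the key with a provider means the provider can be compelled to decrypt.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (SHA-256 keystream XOR) -- for illustration only,
    NOT real cryptography. The same function encrypts and decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The customer generates a key and encrypts locally.
customer_key = secrets.token_bytes(32)
plaintext = b"quarterly financials"
ciphertext = keystream_xor(customer_key, plaintext)

# Scenario 1: key escrowed with the provider ("back up your recovery key
# to your account"). The provider can now decrypt on demand.
provider_key_copy = customer_key
assert keystream_xor(provider_key_copy, ciphertext) == plaintext

# Scenario 2: customer-held keys only. The provider stores ciphertext but
# no key, so compelled disclosure yields nothing readable.
provider_holds = {"ciphertext": ciphertext}  # note: no key present
wrong_key = secrets.token_bytes(32)
assert keystream_xor(wrong_key, ciphertext) != plaintext
```

The asymmetry between the two scenarios is the whole argument: the mathematics is identical, and only the key-custody decision determines who can read the data.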
Before founding Virtru, I served as the lead technology policy adviser at the White House National Economic Council, where I participated in the early debates around the Patriot Act. Those discussions were framed as exceptional—extraordinary access in extraordinary circumstances. What experience and history have shown since is that access tends to expand, requests become more routine, and oversight struggles to keep pace with technologies that were never designed to limit access in the first place.
We have seen the consequences of this design pattern for more than two decades. From the Equifax breach, which exposed the financial identities of nearly half the U.S. population, to repeated leaks of sensitive communications and health data during the COVID era, the pattern is consistent: centralized systems that retain control over customer data become systemic points of failure. These incidents are not anomalies. They reflect a persistent architectural flaw.
That is why the BitLocker issue matters beyond a single investigation. When systems are built so that providers can be compelled to unlock customer data, lawful access becomes a standing feature of the architecture rather than an exceptional outcome governed by narrow circumstances.
Other large technology companies have demonstrated that a different approach is possible. Apple has designed systems that limit its own ability to access customer data, even when doing so would ease compliance with government demands. Google offers client-side encryption models that allow customers to retain exclusive control of encryption keys. These companies still comply with the law, but when they do not hold the keys, they cannot unlock the data. That is not obstruction. It is a design choice.
Microsoft could make similar choices. Retaining decryption capability is not a technical inevitability; it is a product and business decision. Defaults matter, and when convenience is the default, most users—individuals and enterprises alike—will unknowingly trade control for ease of use.
There is no such thing as a risk-free backdoor or a universally safe key escrow. Encryption does not distinguish between authorized and unauthorized access. Any system designed to be unlocked on demand will eventually be unlocked by unintended parties.
These risks are magnified in an era of persistent nation-state cyber activity. Every additional entity capable of decrypting data increases the attack surface.
The Salt Typhoon debacle underscores the point. Even when attackers target networks or infrastructure instead of endpoints, the goal remains the same: to gain access to systems where large volumes of sensitive data can be accessed or decrypted at scale. Architectures that concentrate decryption authority magnify the consequences of inevitable breaches; architectures that enforce data-level key ownership sharply limit the blast radius.
For global companies, the issue is not only U.S. legal process, but the possibility of conflicting demands across jurisdictions—some with far weaker protections for civil liberties and commercial confidentiality.
The lesson is straightforward: organizations cannot outsource responsibility for their most sensitive data and assume that third parties will always act in their best interest. Encryption only fulfills its purpose when the data owner is the sole party capable of unlocking it.
Microsoft has an opportunity to address this by making customer-controlled keys the default and by designing recovery mechanisms that do not place decryption authority in Microsoft’s hands. True data sovereignty—personal and organizational—requires systems that make compelled access technically impossible, not merely contractually discouraged.
In the meantime, this episode should serve as a warning. Executives and boards should ask a simple question of their technology stack: Where do our encryption keys live?
The answer increasingly determines who truly owns the data—and who does not. Because if you do not control the keys, you do not control the data.
John Ackerly is the CEO and Co-Founder of Virtru.
The post If you don’t control your keys, you don’t control your data appeared first on CyberScoop.