Lawsuit Claims Microsoft Teams Collected Voiceprints Illegally

Microsoft Teams has had a rough few years. First, it was the era of “Why is this installed on my computer again?” Then came the era of “Why does this meeting link not open?” And now Teams has entered a new chapter. A class action lawsuit claims the platform has been collecting users’ voice data without permission.

According to a new complaint filed in Illinois, Microsoft is accused of quietly capturing and storing users’ voiceprints through Teams. That would violate the state’s famously strict Biometric Information Privacy Act, better known as BIPA. If you know anything about BIPA, you know it does not play around. Companies have been hit with massive settlements for far less than “we may have recorded your voice and built a biometric profile.”

Here is what is going on, why Illinois is involved, and how this fits into Microsoft’s increasingly messy privacy track record.

What the Lawsuit Claims

The complaint argues that Microsoft collected users’ voice data during Teams calls and used that audio to create voiceprints, all without securing the written consent that Illinois law requires. It also claims the company never disclosed how long the data would be stored or what it would be used for. Under BIPA, companies must clearly explain why they are gathering biometric identifiers such as voiceprints and must obtain explicit written permission before doing so.

The lawsuit argues that Microsoft did none of this.

One of the most striking allegations is that Teams captured and analyzed users' unique vocal characteristics to improve Microsoft's AI and speech recognition systems. If true, that would mean everyday Teams meetings were quietly being used as training data for the company's models. The same meetings where someone inevitably forgets to mute.

Illinois plaintiffs are not shy about suing over biometric violations. Facebook, Google, Snapchat, and many others have learned this the hard way.

Why Illinois Is Involved

Illinois' Biometric Information Privacy Act is one of the strongest privacy laws in the United States. Crucially, it gives private individuals the right to sue, not just the state, and it provides statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation.

If a court decides that every single Teams call counts as a violation, Microsoft may need to start budgeting for a new line item called “Oops.”

Microsoft’s Recent Privacy Stumbles

Microsoft’s recent privacy stumbles make it clear that this lawsuit is not happening in a vacuum. The company has been tripping over its own data practices for years, and the pattern is becoming harder to ignore. The Windows Recall controversy was the most visible example, where Microsoft introduced an AI feature that quietly captured screenshots of nearly everything happening on a user’s PC. Security researchers quickly pointed out that this looked less like innovation and more like a keylogger with better branding, prompting Microsoft to pause the rollout after widespread backlash.

At the same time, European regulators have continued to question how Microsoft handles user data, especially in enterprise and education settings, with several agencies warning that the company’s data practices remain opaque. Those concerns only intensified as Microsoft pushed Copilot into every corner of its product lineup, a pace critics say has outstripped the company’s ability to conduct proper privacy and security reviews.

Even outside its core products, Microsoft has faced scrutiny, such as when LinkedIn was forced to disable certain ad-targeting categories after regulators found it was using sensitive data in ways users never agreed to. And lingering in the background is the Xbox voice data incident, where Microsoft admitted that contractors listened to user recordings to improve Cortana.

The company apologized, but the episode still serves as a reminder that voice data has long been a tempting resource for Microsoft’s AI ambitions. Taken together, these missteps paint a picture of a company racing ahead on AI while struggling to keep its privacy obligations from falling behind.

Microsoft has spent years positioning itself as the responsible tech giant. The one that takes privacy seriously. The one enterprises can trust. The one that is not Facebook.

But trust erodes quickly when users feel like they are being turned into training data without their knowledge.

This lawsuit will not be the last. Not as long as AI models need more data, and not as long as companies keep collecting it faster than they can explain why.
