Meta Ray-Bans Are Mass Surveillance Outsourced to Kenya

An investigation found that thousands of workers in Nairobi review smart glasses footage of people who are naked, using the bathroom, or having sex

Meta Ray-Bans can record everything the wearer sees and send the footage to thousands of contractors in Kenya, who review clips showing naked people and credit card details


Meta Ray-Ban smart glasses capture footage of naked people, sensitive information, and violent acts, and that footage is reviewed by Meta's AI and an army of human employees. An investigation by Svenska Dagbladet found that the footage is collected and seen by human AI trainers working for Sama, a Nairobi-based subcontractor of Meta. Anonymous sources at the company revealed that they have seen footage of people at their most private and vulnerable, in exactly the moments when wearers of Meta's smart glasses would rather not have cameras rolling. Thousands of workers draw boxes on their screens, identifying items in images and video to train Meta's AI models.

One man said he could see people going to the bathroom or removing their clothes. He didn't know whether the users were aware they were being captured, and said they probably wouldn't want the footage seen elsewhere. Other employees detailed what they saw. One described a man placing the glasses on a bedside cabinet and leaving the room, only for a woman to come in and change her clothes. The videos shown to data annotators also include credit card details and sexual acts. Some footage may have been filmed deliberately, but much of it was captured without the wearer or the people on camera being fully aware, all in sensitive situations that would be compromising if leaked.

Using humans to train AI on collected data is standard practice in the industry. Companies rely on teams of workers to sort through data and correct accuracy issues, often in countries where it is cheap to hire small armies of employees. Meta assures the public that its systems are private: employees working on the footage sign non-disclosure agreements, and the company says the sensitive data reviewers see is not meant to be used to train its AI models. Anonymization systems are supposed to cover faces, but they are not entirely reliable. One former Meta employee explained that faces are sometimes visible, with difficult lighting conditions sometimes to blame.

Meta was asked how long voice recordings and video clips are stored, among other privacy questions. It took two months to respond, and the answer only explained how data is transferred from the glasses to the mobile app, referring reporters to Meta's AI privacy policy. Apple dealt with a similar scandal in 2019, when audio recorded after Siri activations was sent to third-party contractors working on improving the assistant's accuracy. Those recordings included private conversations between doctors and patients, and even drug deals. Apple was still paying millions of dollars in settlements as recently as 2025.

Meta Ray-Bans can record everything the wearer sees. The footage goes to Meta's servers, where it is processed by AI and reviewed by thousands of low-paid workers in Kenya. Those workers see people naked, changing clothes, using the bathroom, having sex, and entering credit card details. Wearers may not know the glasses are recording; the people being recorded almost certainly don't. Meta's privacy policy does not explain how long this footage is stored or who has access to it, and the company took two months to respond to privacy questions without providing meaningful answers. Anonymization systems are supposed to blur faces but don't work reliably, so workers still see identifiable people in compromising situations.

The glasses have a small LED that lights up when recording, but it can be covered, disabled, or simply not noticed by the people being recorded. Someone recording with Ray-Bans in a public bathroom, locker room, or bedroom captures everyone around them, whether those people consent or not. That footage goes to Meta. That footage goes to thousands of contractors in Nairobi. Meta claims the sensitive footage is not used to train AI models, but the claim misses the point. The footage still exists on Meta's servers. The footage is still reviewed by humans. The footage still captures people in situations where they have a reasonable expectation of privacy. Whether or not Meta trains AI on bathroom footage does not change the fact that bathroom footage sits on Meta's servers and is viewed by Meta's contractors.

Sama employees sign NDAs prohibiting them from discussing what they see. NDAs do not prevent data breaches. NDAs do not prevent employees from saving footage. NDAs do not prevent footage from being leaked or sold. Meta is relying on NDAs signed by low-paid workers in Kenya to protect footage of naked people, credit card numbers, and sexual acts, captured by cameras worn on people's faces in private spaces. Smart glasses with cameras are marketed as a convenient way to capture life hands-free. The reality is a camera on your face that records what you see, uploads the footage to corporate servers, and exposes it to thousands of contractors. People buy these glasses to record their own experiences. They end up recording everyone around them, without consent, in bathrooms, bedrooms, and changing rooms.

Meta assures users their data is handled sensitively while taking two months to avoid answering basic privacy questions. The company operates a system in which thousands of workers in Nairobi review footage of people naked, having sex, and entering credit card details. Apple faced the same problem with Siri in 2019: contractors heard private doctor-patient conversations and drug deals, and Apple paid millions in settlements. Apple is now reportedly working on AirPods with infrared cameras and its own smart glasses. If it ships these products with the same privacy failures as Siri, the same scandals will repeat. Cameras on faces recording everything and uploading it to corporate servers for review by thousands of contractors is not a product feature. It is mass surveillance.

Blackout VPN exists because privacy is a right. Your first name is too much information for us.


FAQ

Who reviews Meta Ray-Ban footage?

Thousands of workers employed by Sama, a Meta subcontractor based in Nairobi, Kenya, review footage from Meta Ray-Bans to train AI models. These workers see people naked, using the bathroom, changing clothes, having sex, and entering credit card details.

Does Meta use anonymization to protect privacy?

Meta has anonymization systems in place to blur faces, but they are not entirely reliable. One former Meta employee confirmed that faces are sometimes visible, with difficult lighting conditions sometimes to blame.

How long does Meta store Ray-Ban footage?

Meta was asked how long voice recordings and video clips are stored. It took two months to respond, and the answer only explained how data is transferred from the glasses to the mobile app, referring reporters to Meta's AI privacy policy without answering the question.

Are people aware they're being recorded?

The glasses have a small LED that lights up when recording, but it can be covered, disabled, or simply not noticed. Wearers may not know the glasses are recording, and the people being recorded almost certainly don't know their footage is being sent to thousands of contractors.

What protects Ray-Ban footage from leaking?

Sama employees sign NDAs prohibiting them from discussing what they see. NDAs do not prevent data breaches, do not prevent employees from saving footage, and do not prevent footage from being leaked or sold.