Wannabe pickup artists and other creeps are reportedly using Meta’s smart glasses to record video of their interactions with women without consent, then posting the encounters online — prompting advocacy groups to urge the company not to add facial recognition to the devices.
A growing number of aspiring influencers are using the smart glasses to turn real-life encounters into content — prowling nightlife strips, shopping centers and city streets to film their unsolicited approaches to women, Wired reported.
The videos follow a familiar script — a compliment, a pickup line, a push for a name or number — with the footage later blasted across TikTok and Instagram for views, often without the subject ever realizing they were on camera.
In many cases, the interactions veer from awkward to aggressive, with women visibly rejecting advances while still being recorded. The clips are designed to provoke reactions and fuel engagement, critics say — and have reportedly earned the tech the nickname “pervert glasses.”
The trend has drawn backlash from observers who describe the behavior as “predatory,” as creators exploit the glasses’ discreet, first-person recording to capture and monetize encounters with unsuspecting targets.
Kassy Zanjani, a resident of Vancouver, British Columbia, didn’t realize anything was off when a stranger struck up a casual conversation during a night out earlier this year — until a friend later sent her a viral video of the encounter.
The man had been wearing smart glasses and secretly recorded the entire exchange, posting it online where it racked up tens of thousands of views.
“When I saw it, I was in shock and it definitely brought up a lot of anxiety,” Zanjani told CTV, adding that she felt “humiliated” by a clip she never consented to — one she believes was meant to “degrade women” for cheap viral clicks.
The disturbing reality behind Meta’s smart glasses goes far beyond viral pickup videos.
An investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten found that footage captured on the devices can include people using the bathroom, undressing and even having sex — often without realizing they were being recorded.
The personal footage isn’t just stored — it’s being reviewed by human contractors tasked with training Meta’s AI systems, according to Wired.
Workers in Kenya told the newspapers they regularly see “everything — from living rooms to naked bodies,” describing a steady stream of intimate clips from users who appear unaware their private moments are being captured and analyzed.
The contractors said the videos sometimes expose highly sensitive information, including bank cards, private conversations and explicit content.
Some clips even show people accidentally recording sexual encounters or partners undressing in the background.
The investigation also found that safeguards meant to protect privacy don’t always work.
While faces are supposed to be blurred, workers said the system frequently fails — leaving people identifiable in footage that is circulated internally for AI training purposes.
“People are responsible for following the law, whether or not they’re wearing Ray-Ban Metas,” a Meta spokesperson told The Post.
“Unlike smartphones, our glasses have an LED light that activates whenever someone captures content, so it’s clear the device is recording.”
More than 70 civil liberties and advocacy groups are now sounding the alarm, warning that Meta’s smart glasses could take the trend from creepy to outright dangerous if new features are rolled out.
In a letter to CEO Mark Zuckerberg, the coalition urged the company to scrap plans for facial-recognition technology that would allow users to identify strangers in real time.
“Our competitors offer this type of facial recognition product, we do not,” a Meta spokesperson told The Post.
“If we were to release such a feature, we would take a very thoughtful approach before rolling anything out.”
Advocacy groups including the ACLU and the Electronic Privacy Information Center warned the feature could let “stalkers, scammers, [and] abusers” silently uncover a person’s identity and personal details — from their workplace to their home address — without their knowledge or consent.
They cautioned that pairing discreet, always-on cameras with instant identification would “exacerbate abuse, harassment, and stalking,” particularly for women and other vulnerable groups, effectively stripping people of the ability to move through public spaces anonymously.
“People should be able to move through their daily lives without fear” of being secretly identified and tracked, the coalition wrote, calling the technology a “red line society must not cross.”
