Voyager Labs allegedly used the data alongside its own software, which the company claims can predict future criminal behavior.
Meta has filed a legal complaint against a company for allegedly creating tens of thousands of fake Facebook accounts to scrape user data and provide surveillance services for clients.
The firm, Voyager Labs, bills itself as “a world leader in advanced AI-based investigation solutions.” In practice, that means analyzing social media posts en masse to make claims about individuals. In 2021, for example, The Guardian reported that Voyager Labs had sold its services to the Los Angeles Police Department, with the company claiming it could predict which individuals were likely to commit crimes in the future.
Meta announced the legal action in a blog post on January 12th, claiming that Voyager Labs violated its terms of service. According to a legal filing issued on November 11th, Meta alleges that Voyager Labs created over 38,000 fake Facebook user accounts and used its surveillance software to gather data from Facebook and Instagram without authorization. Voyager Labs also collected data from sites including Twitter, YouTube, and Telegram.
Meta says Voyager Labs used the fake accounts to scrape information from more than 600,000 Facebook users between July and September 2022. The company says it disabled more than 60,000 Voyager Labs-related Facebook and Instagram accounts and pages “on or about” January 12th.
Meta is demanding that Voyager Labs stop violating its terms of service and is asking the courts to ban the firm from using Facebook, Instagram, and services related to those platforms. Meta is also seeking compensation for Voyager Labs’ “ill-gotten profits in an amount to be proven at trial,” claiming the firm unjustly enriched itself at Meta’s expense.
Voyager Labs is one of many companies — including the likes of Palantir — that claim to be able to predict future criminal activity based on an individual’s past behavior and online activity. Experts say these technologies are flawed and that the algorithms are too simple to effectively predict crime. In 2019, the LAPD conducted an internal audit of one of its data-driven programs, revealing that the tech was inconsistent and racially biased.
“Companies like Voyager are part of an industry that provides scraping services to anyone regardless of the users they target and for what purpose, including as a way to profile people for criminal behavior,” said Jessica Romero, Meta’s director of platform enforcement and litigation. “This industry covertly collects information that people share with their community, family and friends, without oversight or accountability, and in a way that may implicate people’s civil rights.”