Meta Mined Workers' Keystrokes Before Firing 8,000

Meta rolled out the Model Capability Initiative in late April, installing software that captures keystrokes, clicks, and screenshots across hundreds of apps on every US work laptop. The same week, the company announced 8,000 layoffs effective May 20. On May 12, workers distributed flyers calling the offices an "Employee Data Extraction Factory" and started signing an NLRA petition. European employees were exempt, which is the loudest part of the whole story.

[Image: a worker's hands at a corporate keyboard in a low-lit open-plan office, a security camera in soft focus behind the figure, an empty cardboard moving box on the adjacent chair.]

Key Takeaways

  • The Program: Meta’s Model Capability Initiative (MCI) captures mouse movements, clicks, keystrokes and periodic screenshots from US work laptops across “several hundred” sites, including Google, LinkedIn, Wikipedia, GitHub and Slack. There is no opt-out on a company-issued machine.
  • The Calendar: MCI was disclosed to staff in late April. Forty-eight hours later, Meta announced 8,000 layoffs (about 10% of the workforce) effective May 20, 2026, plus 6,000 unfilled roles closed.
  • The Revolt: On Tuesday, May 12, workers placed flyers on vending machines and toilet paper dispensers at multiple US offices labeling Meta an “Employee Data Extraction Factory” and pointing to a petition citing the National Labor Relations Act.
  • The Geography: European employees are exempt from MCI; GDPR and national worker-protection laws make this kind of tracking illegal without explicit consent. The exemption is the clearest signal of what the program actually is.

The Calendar

On or around April 21, 2026, Meta’s Chief Technology Officer Andrew Bosworth informed staff in the company’s Superintelligence Labs that a new internal program, the Model Capability Initiative (MCI), would begin capturing what employees do on their work-provided laptops. The software records mouse movements, clicks, keystrokes and occasional snapshots of screen contents across what reporting describes as several hundred websites and applications, including Google, LinkedIn, Wikipedia, Microsoft’s GitHub, Salesforce’s Slack, and Atlassian.

Two days later, on April 23, Meta’s Chief People Officer Janelle Gale sent a separate memo. Roughly 8,000 jobs, about 10% of the global workforce, would be cut. Another 6,000 open roles would be closed. The departure date for the laid-off staff is May 20, 2026.

On Tuesday, May 12, at multiple US Meta offices, the workers who had spent roughly three weeks being recorded started taping flyers to vending machines, meeting room walls and toilet paper dispensers. The flyers asked, in plain words, “Don’t want to work at the Employee Data Extraction Factory?” and directed staff to an online petition. The petition’s animating sentence is short: “Workers are legally protected when they choose to organize for the improvement of working conditions.” That is a direct invocation of the right Section 7 of the National Labor Relations Act protects: concerted activity by employees for mutual aid or protection.

Eight days after the flyers, the layoff date hits.

The structural reading is hard to miss. Meta installed surveillance software, announced layoffs forty-eight hours later, and is set to remove 8,000 people from its payroll less than a month after that, having spent the intervening weeks logging exactly how the people on the way out type, click and read. The site has already covered the financial half of this trade in Meta Cut 8,000 People to Spend $135 Billion on AI. MCI is the other half.

What the Software Actually Does

MCI is not a hypothetical. It is a deployed agent on US-issued Meta laptops. The captured data points are specific: every keystroke, every mouse movement, every click, and “occasional snapshots of the contents of their screens.”

The site list is the most revealing technical detail. Meta is not just logging activity inside its own internal tools, which would be a defensible IT-security posture. Reporting confirms the captured domains include external services where work happens: Google (search and Workspace), LinkedIn, Wikipedia, GitHub, Slack and Atlassian. Threads and the Manus AI agent (whose acquisition by Meta was announced in late 2025) are also on the list, which is described as still in flux.
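Meta's internal implementation is not public, but the mechanism described in reporting, an agent that records input events and filters by an allowlist of approved domains, is easy to sketch. The following is a purely hypothetical illustration; every name, schema field, and domain-matching rule here is invented, not Meta's code:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from urllib.parse import urlparse

# Hypothetical allowlist mirroring the domains named in reporting.
APPROVED_DOMAINS = {
    "google.com", "linkedin.com", "wikipedia.org",
    "github.com", "slack.com", "atlassian.com",
}

@dataclass
class CaptureEvent:
    """One unit of captured activity: a keystroke, click, or screenshot."""
    timestamp: str
    event_type: str   # "keystroke" | "click" | "mouse_move" | "screenshot"
    url: str
    payload: str

def domain_is_approved(url: str) -> bool:
    """True if the page's hostname matches an allowlisted domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in APPROVED_DOMAINS)

def record(event: CaptureEvent, sink: list) -> bool:
    """Append the event to the sink only when its domain is approved."""
    if domain_is_approved(event.url):
        sink.append(asdict(event))
        return True
    return False

events: list = []
now = datetime.now(timezone.utc).isoformat()
record(CaptureEvent(now, "click", "https://mail.google.com/inbox", "button#send"), events)
record(CaptureEvent(now, "click", "https://example-bank.com/login", "input#pw"), events)
print(len(events))  # the bank click is filtered; the Gmail click lands in the sink
```

Note what the sketch makes visible: a domain allowlist is content-blind. `mail.google.com` matches `google.com`, so personal email lands in the sink exactly as work does, which is the scope problem Bosworth's Gmail advisory concedes below.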

A Meta spokesman framed the purpose this way: “If we’re building agents to help people complete everyday tasks using computers, our models need real examples of how people actually use them, things like mouse movements, clicking buttons, and navigating dropdown menus.” Read literally, that statement is true. A computer-use agent that has to drive a real browser, fill a real form, and click a real “Submit” button needs training data that looks like a human driving a real browser. Synthetic data and Mechanical Turk-style contractors do not capture the cognitive flow of a real worker mid-task.

What the spokesman’s framing does not address is who is paying for the data. The answer is: the workers themselves, in the currency of their own keystrokes, while a parallel memo cuts 8,000 of them off the payroll and closes another 6,000 open roles that will not be filled.

The Opt-Out That Isn’t

The clearest direct quotes belong to Bosworth, Meta’s CTO. Asked whether employees could opt out, he answered: “No there is no opt out on your work provided laptop.” Asked about Gmail being inside the capture surface, he advised: “Gmail is an approved context so if you have concerns it may be best not to check personal email on your work computer.”

The Gmail line is the tell. A program advertised as work-only does not normally require employees to be reminded that personal email will be ingested. The advisory is an unintentional admission of scope: MCI captures whatever is on the screen, full stop, and the employee’s defense is to change their own behavior around the capture surface.

Bosworth’s third notable comment, in response to a question about privacy review, was: “This project completed a privacy review. Not sure ‘what kind’ you mean but, the usual kind?” That is a candid admission that “privacy review” inside Meta does not mean what an employee would assume it means.

And the system’s stated payoff is not subtle. Bosworth has described future agents as the entities that will “primarily do the work,” with the human employees who survive demoted to “direct, review and help them improve.” The data being collected in April and May 2026 is the training corpus for the system that does the same work afterward. The workers who go to the door on May 20 are inside that corpus.

The European Exception

European Meta employees are not enrolled in MCI. Reporting on Bosworth’s announcement is explicit: “European privacy laws and worker protections prevent invasive tracking of the sort represented by MCI.” The General Data Protection Regulation requires a lawful basis for processing personal data; where the basis is consent, that consent must be freely given, specific, informed and unambiguous, and EU member states layer national worker-protection rules on top, making blanket workplace surveillance very hard to defend.

A defense of MCI as “standard enterprise IT logging” runs straight into that exemption. If MCI were genuinely standard, it would already be deployed in Europe, where Meta has thousands of engineers running the same workflows on the same applications. The fact that Meta chose to spend legal effort carving European employees out of the capture surface is the cleanest possible evidence that Meta’s own counsel reads MCI as something more than ordinary monitoring. Where the law makes it hard, the program stops. Where the law makes it easy, the program runs.
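That where-the-law-allows-it logic reduces to a one-function rollout gate. This is a hypothetical sketch of the pattern, not anything Meta has published; the country list is an illustrative, partial set of GDPR member states:

```python
# Hypothetical rollout gate: capture runs only where blanket workplace
# monitoring is treated as lawful without freely given consent.
REQUIRES_EXPLICIT_CONSENT = {"DE", "FR", "IE", "NL", "ES", "IT"}  # partial, illustrative

def capture_enabled(country_code: str, employee_consented: bool = False) -> bool:
    """Enable capture unless the jurisdiction demands consent that a
    no-opt-out mandate on a work laptop cannot, by definition, supply."""
    if country_code in REQUIRES_EXPLICIT_CONSENT:
        return employee_consented
    return True  # default-deploy everywhere else

print(capture_enabled("US"))  # True: no consent gate
print(capture_enabled("DE"))  # False: GDPR jurisdiction, no consent given
```

The gate is the whole argument in miniature: the variable that decides whether a worker is captured is the country code, not the work.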

That asymmetry is protection by geography rather than by role. An EU engineer doing the same job for the same company on the same code is excluded from the training corpus. The US engineer is the input.

The Steelman: Agents Really Do Need Real Data

The strongest defense of MCI is technical, and it deserves to be heard cleanly. Computer-use AI agents, the systems OpenAI, Anthropic, Google and Meta itself are racing to build, need training data that captures the messy reality of how humans drive a computer. The synthetic alternative (a model trained on screenshots and structured DOM trees) consistently fails to handle the basic chaos of real interfaces: half-loaded modals, weird focus behavior, the user who tabs twice instead of clicking. Meta’s spokesman is correct that “real examples” are what’s missing.
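To make the technical claim concrete: computer-use agents are commonly trained by behavior cloning over (observation, action) pairs, a screenshot of the screen paired with the human input that followed it. A hypothetical conversion from a raw capture log into such pairs might look like this (the log schema is invented for illustration):

```python
from typing import Iterator

# A raw log as a capture agent might emit it: screenshots interleaved
# with the input events that followed them (schema is invented).
raw_log = [
    {"type": "screenshot", "png_ref": "frame_001.png"},
    {"type": "click", "x": 412, "y": 230, "target": "button#submit"},
    {"type": "screenshot", "png_ref": "frame_002.png"},
    {"type": "keystroke", "key": "Tab"},
    {"type": "keystroke", "key": "Tab"},  # the user who tabs twice instead of clicking
]

def to_training_pairs(log: list[dict]) -> Iterator[tuple[str, dict]]:
    """Yield (observation, action) pairs: each screenshot paired with
    every input event recorded before the next screenshot."""
    observation = None
    for event in log:
        if event["type"] == "screenshot":
            observation = event["png_ref"]
        elif observation is not None:
            yield observation, event

pairs = list(to_training_pairs(raw_log))
print(len(pairs))  # 3: one click after frame_001, two keystrokes after frame_002
```

This is why the double-Tab matters: a synthetic generator would emit the canonical click, while a real log preserves the idiosyncratic path a human actually took, which is exactly the signal an agent needs to drive messy real interfaces.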

Other labs have publicly addressed similar problems through contracted, consent-based data labeling, typically via vendors like Scale AI or Surge. Those approaches are slower and more expensive than ingesting your own workforce’s daily activity, and the resulting datasets are smaller. Meta’s design choice was to use captive in-house users instead. That choice is what’s at issue, not the underlying technical need.

The other piece of the defense is the safeguards claim. A Meta representative told Fortune that “safeguards are in place to protect sensitive content and that the data will not be used for any other purpose.” Internal employee questions reported in Platformer cut against that reassurance: staff asked how the company would avoid capturing “users’ personally identifying information, or their own health- or finance-related data, particularly given that the tool is allowed to observe them on Gmail.” “Safeguards” is doing a lot of work in a sentence about a program with no opt-out and a CTO who advises employees to keep their personal email off their work machine.

The Taylor Inversion

The historical rhyme here is Frederick Winslow Taylor’s 1911 The Principles of Scientific Management, the founding document of corporate workplace surveillance. Taylor put a stopwatch on a factory floor worker to time every motion, then used the data to redesign the job for higher throughput from the same worker. The point of the stopwatch was to optimize a person.

MCI inverts the mechanism. The keystrokes are timed and recorded, but the optimization target is not the person being recorded. It is the agent that will replace the person. Taylor’s stopwatch made the worker more productive. MCI makes the worker more dispensable.

The closer industrial parallel sits a generation earlier, in the late-18th and early-19th-century British textile industry. Edmund Cartwright’s power loom, patented in 1785, was designed to mechanize work that hand-loom weavers had performed for generations; by the 1810s, the displacement of those weavers had produced the Luddite uprisings. The pattern (study the human worker, build the machine, retire the worker) is older than software. MCI just runs it at the speed of a corporate IT rollout.

The site’s White-Collar Assembly Line piece described how agentic AI fragments knowledge work into Taylor-like discrete steps. MCI is the data-collection layer underneath. You cannot Taylorize a knowledge worker until you have measured what the knowledge worker actually does. Meta is measuring.

What the May 12 Action Is Really About

The flyer campaign is not, in its own framing, primarily a privacy protest. The flyers and the petition both lead with the National Labor Relations Act, not with GDPR or California’s privacy statutes. That is a deliberate legal choice.

Section 7 of the NLRA protects concerted activities by employees for collective bargaining or mutual aid, and Section 8(a)(1) makes it an unfair labor practice for an employer to interfere with, restrain, or coerce employees in the exercise of those rights, a framework under which the NLRB has long treated workplace surveillance as potential interference. A program that captures every keystroke on every work machine, with no opt-out, is the textbook definition of a surveillance regime that could chill protected organizing. If a Meta engineer drafts a union-card sign-up form in a Google Doc, MCI is in a position to know about it.

The petition is the opening move in a likely NLRB filing. The flyers’ “Employee Data Extraction Factory” line is rhetorical, but the legal theory underneath it is concrete: when capture is universal and opt-out is impossible, the labor-law analysis treats the capture itself as interference, regardless of whether Meta ever reads the data with that intent.

In the UK, the same workforce has begun a separate organizing push with the United Tech and Allied Workers (UTAW) branch of the Communication Workers Union. UTAW’s interest is structural: a UK contract is going to look different from a US contract specifically because the work-product extraction question is contestable under UK and EU law.

What Comes After May 20

Three things are about to happen, in this order.

First, on May 20, the 8,000 cuts take effect. The departing workers’ three-to-four weeks of MCI capture stays with Meta. There is no published policy for purging an ex-employee’s training data, and the spokesman’s “data will not be used for any other purpose” clause does not commit to deletion.

Second, the petition and the UTAW campaign are on track to feed into one or more legal actions. An NLRB unfair-labor-practice charge runs through the agency's regional offices; the substantive hooks are Sections 7 and 8 of the National Labor Relations Act, the same statute the petition's framing invokes. A UK certification application via UTAW would run on a similar timescale.

Third, every other large tech employer is watching. Meta has gone early on a question the entire industry needs to answer (how to build computer-use training data at scale without paying contractors) and has chosen the path of least resistance. If MCI survives the NLRB and the European-exemption pattern holds, the playbook for the rest of the sector is now visible: deploy in the US, exempt the EU, fold the workforce's keystrokes into the model, cut the workforce. On May 14, the same day Meta workers were still tallying flyer signatures, Cisco announced roughly 4,000 cuts alongside record quarterly revenue and a strategic pivot toward AI infrastructure, a restructuring built on logic adjacent to Meta's.

The piece of this story that should not get lost is the geography. There is a version of MCI that is technically necessary, deployed everywhere, and accepted by employees in exchange for something concrete: equity, severance, a clear deletion policy, an opt-out clause that means anything. That version of MCI exists in zero jurisdictions. The version that exists is the one that runs precisely where the law allows it and stops at the border with the law that doesn’t. The European employee and the American employee are doing the same work on the same machines for the same company. Only one of them is in the training set.

The Bottom Line

What Meta is doing is, by all available indications, legal under current US labor and privacy law. That is the indictment of US labor and privacy law, not a defense of Meta. The Model Capability Initiative captures a workforce’s daily cognition, the same workforce gets a layoff memo forty-eight hours after the program is disclosed, and Meta has openly described the captured data as the training input for agents that will, in its CTO’s own framing, “primarily do the work.” The May 12 flyer action is notable as a coordinated US tech-workforce objection to being mined for AI training data while still on the payroll of the company training those agents.

Six days from now, the people inside that training set walk out for the last time. The data stays.
