Flock Uses Overseas Gig Workers to Build Its Surveillance AI
Punchy take: This is one of those stories where the mechanics of surveillance are laid out in plain sight: cheap gig labour, leaked dashboards, and US citizens' movements being parsed, annotated and used to train systems that police already rely on. Read up; it matters.
Summary
An accidental exposure of internal material shows that Flock, the company behind the automatic licence-plate-reading and AI surveillance cameras installed in thousands of US communities, has been using overseas gig workers to annotate footage and train its machine-learning models. The leaked panel and worker guides, reported by 404 Media and WIRED, indicate that tasks included transcribing licence plates, categorising vehicle make, colour and type, labelling people and clothing, and even labelling audio for events such as screams, gunshots or car wrecks.
The exposed data listed annotators, some of whom appear to be based in the Philippines and to have been hired via Upwork. Screenshots used in worker guides show footage clearly shot in the US. After being contacted by reporters, Flock took the exposed panel offline and declined to comment. The revelations raise questions about who can access sensitive footage and where the people reviewing it are located, at a time when law-enforcement searches of Flock data often occur without warrants and civil-rights groups are challenging widespread deployments.
Key Points
- An accidental leak revealed that Flock used overseas gig workers to annotate the surveillance footage that trains its AI models.
- Workers appearing in the exposed material are reportedly based in the Philippines and were hired through Upwork.
- Annotation tasks included transcribing licence plates, tagging vehicle make, colour and model, and labelling people and clothing.
- Audio annotation instructions included selecting labels such as ‘car wreck’, ‘gunshot’ or ‘screaming’ and noting confidence levels.
- Flock’s cameras allow nationwide searches for vehicles and are used by police; searches often occur without warrants.
- Flock patent materials and slides mention detection of sensitive attributes such as ‘race’, amplifying privacy concerns.
- The exposed dashboard showed large volumes of annotations completed over short periods, highlighting reliance on human labour for training.
- After being contacted by reporters, Flock took the panel offline and declined to comment, leaving questions about oversight and access unanswered.
Why should I read this?
Because this is how the sausage is made: your town might have Flock cameras, and your car journeys could end up training AI, labelled by gig workers overseas. It's worrying, and you should know who is looking at the footage and how it's being labelled. We've cut through the noise and pulled out the bits that explain why this actually affects privacy, policing and public oversight, so you don't have to dig through the leak yourself.
Source
Source: https://www.wired.com/story/flock-uses-overseas-gig-workers-to-build-its-surveillance-ai/