    September 25, 2025

    A “Massive” Difference: Why Facial Recognition for Surveillance Isn’t the Same as Authentication

    By: Amy Osteen, General Counsel of Alcatraz

    Music and surveillance don’t usually end up in the same headline. Yet that’s exactly what happened after Massive Attack’s September 14 show in London’s Victoria Park, where the band turned facial recognition into stage art.

    I have fond memories of Massive Attack from college. Overalls, braids, a “2 + 2 = 5” bumper sticker, and their Teardrop CD on repeat in my car. 

    Fast-forward to now, and Massive Attack is still blending music with message. Only this time, it wasn’t a song coming through a car stereo. Thousands of faces were scanned, tagged, and projected onto giant screens in real time.

    People started sending me the clips, asking, “Isn’t this what you do?”

    No. There is a Massive difference in what we do.

    The Concert

    On September 14, cameras swept across Victoria Park. The feeds were piped into facial recognition software that extracted faces, matched them against a dataset, and displayed individuals’ names, and sometimes professions, above the stage. The audience itself became part of the visuals.

    It was provocative and unsettling. Some attendees cheered when they saw themselves on screen; others were unnerved at being identified without consent. Headlines called it both amazing and terrifying.

    Social media posts revealed the same mix of emotions. Some people found the spectacle thought-provoking and admired the band’s critique of surveillance culture. Others felt uneasy or violated at having their biometric data captured and projected without consent. One attendee remarked online that it was “one more reason to not leave the house.”

    The lack of transparency around the database, the uncertainty of its sources, and the very public display of personal information led several privacy advocates and fans to question whether the performance was a bold piece of art or a demonstration of how easily mass surveillance can be conducted.

    AI-generated image — not an actual event photo.

    What They Actually Did

    From reports and audience clips, the mechanics looked like this (a rough code sketch follows the list):

    • Live cameras scanned the crowd in real time
    • Faces were detected and pulled from the video
    • The recognition engine compared those faces against a database
    • Matches were displayed instantly as part of the stage visuals
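
    For readers who think in code, here is a minimal, hypothetical sketch of that 1:N flow. The band has not disclosed its software, database, or thresholds, so every name, structure, and number below is an assumption for illustration only.

```python
# Hypothetical sketch of a 1:N (one-to-many) pipeline like the one
# reported at the show. Identity, best_match, and label_crowd are
# illustrative names; the actual stack and data sources are unknown.
from dataclasses import dataclass

import numpy as np


@dataclass
class Identity:
    name: str
    profession: str
    embedding: np.ndarray  # face template from an undisclosed source


def best_match(probe: np.ndarray, database: list, threshold: float = 0.6):
    """Compare ONE detected face against EVERY identity in the database."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    if not database:
        return None
    scored = [(cosine(probe, ident.embedding), ident) for ident in database]
    score, ident = max(scored, key=lambda pair: pair[0])
    return ident if score >= threshold else None  # None: stranger stays unlabeled


def label_crowd(detected_faces: list, database: list) -> list:
    """Run every face pulled from the live feed against the whole dataset;
    nobody in the crowd enrolled or consented."""
    labels = []
    for face in detected_faces:
        match = best_match(face, database)
        if match is not None:
            labels.append(f"{match.name} ({match.profession})")  # shown on screen
    return labels
```

    The key property is the loop: every face pulled from the feed is compared against the entire dataset, whether or not that person ever agreed to be in it.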

    The band hasn’t disclosed what database was used, whether outside sources were tapped, or whether any data was stored after the show. That uncertainty is exactly what makes lawyers and privacy advocates sit forward. If the system reached into external sources, even for labels, then what looked like art was also a live demonstration of surveillance.

    Why It All Looks the Same (But Isn’t)

    Here’s where people understandably get confused.

    From the outside, all facial recognition use cases can look alike. A camera points at you, your face gets analyzed, and a result comes back. If you don’t live in privacy law or biometrics daily, why wouldn’t you assume it’s all the same?

    But it isn’t the same, and this distinction matters to us, a lot.

    • One-to-many (1:N) comparisons like Massive Attack used scan crowds against large datasets, often without consent. The goal is to figure out who someone is, even if they never signed up.
    • One-to-one (1:1) comparisons verify a single person against their own enrolled profile. Think about unlocking your phone.
    • One-to-few (1:few) comparisons check one face against a small group of authorized people. Think employee check-in or VIP access. (A minimal sketch of all three modes follows this list.)
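
    As a rough sketch, assuming a generic template-similarity function rather than any vendor's real API, the three modes differ mainly in what a face is compared against:

```python
import numpy as np


def match(a: np.ndarray, b: np.ndarray, threshold: float = 0.6) -> bool:
    """Toy similarity check between two face templates."""
    sim = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sim >= threshold


# 1:1 -- verify ONE claimed identity against that person's own template.
def verify_1_to_1(probe: np.ndarray, own_template: np.ndarray) -> bool:
    return match(probe, own_template)


# 1:few -- check against a small, explicitly authorized group.
def verify_1_to_few(probe: np.ndarray, authorized: list) -> bool:
    return any(match(probe, template) for template in authorized)


# 1:N -- ask "who is this stranger?" against a large dataset of people
# who may never have enrolled at all. Same math, very different question.
def identify_1_to_n(probe: np.ndarray, dataset: dict):
    for name, template in dataset.items():
        if match(probe, template):
            return name  # first hit; real systems rank by score
    return None
```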

    So far, so good. But here’s the tricky part. Sometimes, authentication can look like one-to-many math. A workplace with 200 enrolled employees might compare a face at the door against 200 stored templates. From the outside, that feels like the same thing as surveillance.

    The difference is not the math. The difference is where the data comes from and who controls it.

    • One-to-many surveillance pulls from outside sources like government registries, scraped social media, or other datasets people never knowingly provided. No consent. No control.
    • One-to-many closed-system authentication only checks against templates inside the organization’s system. Everyone in that dataset enrolled voluntarily. No strangers. No scraping.

    The math might look similar. The stakes are completely different.
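
    To make the closed-system point concrete, here is a minimal sketch with hypothetical class and method names (illustrative only, not Alcatraz's actual code). The only templates the matcher can ever see are ones created through explicit, revocable enrollment:

```python
class ClosedEnrollmentStore:
    """Illustrative closed system: templates exist only because someone
    explicitly enrolled, and enrollment can be revoked at any time."""

    def __init__(self):
        self._templates = {}  # user_id -> template, filled ONLY via enroll()

    def enroll(self, user_id: str, template, consented: bool) -> None:
        if not consented:
            raise ValueError("enrollment requires explicit consent")
        self._templates[user_id] = template

    def revoke(self, user_id: str) -> None:
        # Users stay in control: revoking deletes the template entirely.
        self._templates.pop(user_id, None)

    def authorized_templates(self) -> list:
        # No scraped photos, no external registries: nothing enters this
        # store except through enroll(), so every match is against someone
        # who opted in.
        return list(self._templates.values())
```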

    Two Very Different Models

    The facial recognition technology used by Massive Attack at their concert operated as a surveillance-style 1:N system. That is fundamentally different from how Alcatraz works.

    Massive Attack: 1:N Surveillance

    • The software scanned hundreds or thousands of faces in the crowd and tried to match each one against a database, potentially identifying anyone present without consent
    • It happened in real time with no opt-in — individuals did not enroll, weren’t actively interacting, and may not even have realized they were being processed
    • Faces and names were displayed publicly on stage as part of the spectacle, turning mass identification into art
    • In the real world, this kind of system could be used for tracking, profiling, or cataloging people without their knowledge

    Alcatraz: 1:1 or 1:few Authentication

    • Built for secure, privacy-first access control, not surveillance
    • Every user has to explicitly enroll with the system in advance and provide consent
    • When someone approaches a door, their face is compared only against their own stored profile (1:1) or against a very small group in special workplace settings (1:few)
    • No crowds, no scanning strangers — only authorized individuals can gain entry
    • The system never stores face images; instead, it converts each face into an encrypted mathematical template using a one-way, non-reversible algorithm (a rough sketch follows this list)
    • Processing is done locally at the device
    • Biometric data is not shared with the company, and employee PII is not shared with Alcatraz
    • Users keep control of their data, with the ability to opt out or revoke at any time
    • Data is never shared for tracking or marketing
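
    Here is a hedged sketch of the template idea, assuming a generic embedding model passed in as a callable. The actual model, encryption layer, and thresholds are not public, so everything below is illustrative, not Alcatraz's actual pipeline:

```python
import numpy as np


def to_template(face_pixels: np.ndarray, model) -> np.ndarray:
    """One-way step: pixels in, compact vector out. The vector cannot be
    inverted back into a face image. `model` is an assumed embedding
    function standing in for the real (undisclosed) one."""
    template = model(face_pixels)
    # The raw image reference goes out of scope here; only the template
    # is ever returned or retained.
    return template


def authenticate_at_door(live_pixels: np.ndarray,
                         enrolled_template: np.ndarray,
                         model, threshold: float = 0.6) -> bool:
    """1:1 check, run locally on the device: the live face is compared
    only against the requester's own enrolled template."""
    probe = to_template(live_pixels, model)
    sim = float(probe @ enrolled_template /
                (np.linalg.norm(probe) * np.linalg.norm(enrolled_template)))
    return sim >= threshold
```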

    A Quick Side-by-Side

    | Feature | Massive Attack (Concert) | Alcatraz AI Access Control |
    | --- | --- | --- |
    | Type of matching | 1:N (one-to-many) | 1:1 or 1:few |
    | Purpose | Mass surveillance and spectacle | Secure authentication |
    | Consent | None, public scanning | Explicit opt-in enrollment |
    | Data use | Public display and potential tracking | Private encrypted templates |
    | Privacy controls | Almost none | Strong and user controlled |

    Massive Attack used facial recognition to showcase the sweeping, invasive potential of surveillance. Alcatraz shows how the same underlying technology can be designed for the opposite — protecting rights and privacy while providing practical security and convenient access.

    Questions to Ask if You’re Evaluating Biometric Systems

    If you’re a lawyer, compliance officer, or even just a business leader thinking about biometrics, here are the questions you need to press.

    • How is consent obtained? Is it written, informed, and revocable?
    • What is the exact purpose? Is the system locked to access control, or open to secondary uses?
    • Where does the dataset come from? Did everyone knowingly enroll?
    • How long are templates kept? What is the deletion policy?
    • Who owns the data, the organization or the vendor?
    • Is processing local or in the cloud?
    • How is transparency provided? What notice is given, and when?

    If the answer to dataset origin is “outside sources” or “we don’t know,” you’re looking at surveillance, not authentication.
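
    That last rule is mechanical enough to write down. A toy encoding, with illustrative field names:

```python
from dataclasses import dataclass


@dataclass
class VendorAnswers:
    dataset_origin: str      # "enrolled_users", "outside_sources", or "unknown"
    consent_revocable: bool
    processing_local: bool


def classify(answers: VendorAnswers) -> str:
    # The dataset-origin answer alone separates the two models.
    if answers.dataset_origin in ("outside_sources", "unknown"):
        return "surveillance"     # strangers identifiable without consent
    return "authentication"       # closed system of people who opted in
```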

    Why This Distinction Matters

    When people hear the phrase “facial recognition,” they often imagine the worst version: the one-to-many surveillance model. That’s why public reaction to the Massive Attack show was so visceral.

    Some audience members left calling it “amazing and terrifying.” Others joked that it was “one more reason to not leave the house.” Another quipped online, “I love Massive Attack, but I’m not looking to get doxed at a concert.”

    Those reactions matter. They show how quickly people conflate all uses of facial recognition with the surveillance-style systems they fear most. If you felt unsettled in that crowd, would you really care about the nuances of one-to-many versus one-to-one? Probably not.

    That is exactly why education is needed. Not to make anyone feel dumb, but to show that the difference is real. Surveillance systems identify strangers without consent. Authentication systems only work for people who chose to enroll.

    Back to Victoria Park

    Which brings us back to the Massive Attack concert. The band wanted to provoke. They succeeded. People left talking about what it feels like to be watched and labeled in public. The stunt made surveillance visible.

    But the fact that so many people immediately asked me, “Isn’t this what you do?” proves why education matters. Without careful explanation, all forms of facial recognition look the same from the outside.

    Surveillance asks: Who is this stranger among thousands?
    Authentication asks: Is this the person who enrolled for this door?

    That is the Massive difference.

    In college, Massive Attack was the soundtrack of my late-night drives. Their music was moody, unsettling, and ahead of its time. Last week, their show was moody and unsettling in a different way. It forced thousands of people to feel the mechanics of surveillance in their own skin.

    As art, it was effective. As a model for technology, it was a warning.

    The lesson is clear for anyone evaluating biometrics. Look closely at where the data comes from, who controls it, and whether people opted in. If the system pulls from the outside world without consent, it is surveillance. If it only works within a closed system of enrolled people, it is authentication.

    Massive Attack made a crowd feel watched. At Alcatraz, our job is to make sure people understand the difference and that companies choose the path that builds trust instead of fear. 

    Let’s Rock.
