The New PPP: Public-Private + Privacy

Written by Alcatraz | Oct 21, 2025 9:56:27 PM

By: Amy Osteen, General Counsel of Alcatraz

Years ago, before Alcatraz, I was deep in negotiations with a foreign government hosting one of the world’s biggest sporting events. It was the kind of project that eats your calendar alive. Midnight calls, dozens of faces on Zoom, lawyers who always needed “just one more turn.”

The recurring argument was about data.

The government wanted everything. Every credential, every vendor record, every biometric template, every scan of anyone who went near the stadium. They promised to keep it “safe” and “local” to protect citizens from outsiders. Back then, it sounded sensible. Governments controlled roads, water, airspace. Why not data?

That was another time.

Today, the idea of a single entity owning everything about you feels different. People expect to know who has their information and why. They want proof it isn’t being misused. They want to feel protected, not managed.

That shift, from control to consent, has changed almost everything. It’s rewriting how we think about power, responsibility, even security itself.

The Trust Crisis

Traditionally, PPP meant Public-Private Partnership — governments and private companies joining forces to build physical infrastructure. But in 2025, the partnerships that matter most connect data, people, and machines.

And that adds complexity. 

Recently, the World Economic Forum published an article titled “The AI Trust Crisis.” (1)

According to the report, approximately six in ten executives believe AI is being used responsibly, and about half of employees agree. Ninety-five percent of AI projects never make it past the pilot stage, and the report points to trust issues, not technical failures, as a primary factor.

Trust isn’t code. It’s a feeling. The quiet permission we give to the systems that run our lives. Once it cracks, the whole machine stalls.

Cities delay security upgrades because they’re afraid of surveillance headlines. Companies hesitate to deploy new tools because someone says, “What if this looks creepy?” Employees quietly ignore software they don’t believe was built for them.

The problem isn’t AI itself. It’s how we build it, govern it, and explain it. Projects don’t stall because the algorithms fail; they stall because people don’t trust them.

The New PPP: An Answer to the Trust Crisis

The old PPP built infrastructure. The new one must build confidence too. 

We call it Public-Private plus Privacy.

The kind of partnership that forces both sides to stay honest. Governments can’t regulate fast enough, and companies can’t keep saying, “Just trust us.” It has to be a partnership with clear boundaries. Public agencies bring oversight and credibility. Private companies bring speed and creativity. The public brings a much-needed gut check on both.

When those forces align, progress earns its right to happen. But privacy has to come first. The moment people feel watched, the trust evaporates. Privacy isn’t paperwork. It’s a design choice people can sense, even if they can’t see it.

But design only works if everyone knows their role.

If we want the new PPP to hold up, we need more than good intentions—we need clarity about who does what.

Think of it as a meta RACI for trust.

Governments set the rules. Private companies build the tools. The public decides whether to use them.

When any of those roles blur, trust collapses.

| Role | Government | Private Sector | The Public |
| --- | --- | --- | --- |
| Responsible | Setting guardrails through laws, procurement standards, and oversight of data use | Building and maintaining technology that meets privacy-by-design standards | Using systems responsibly and reporting misuse |
| Accountable | For the ethical, lawful deployment of AI and surveillance technologies | For truthful claims about performance, security, and privacy | For holding both sides accountable through civic and consumer feedback |
| Consulted | Industry, civil society, and privacy experts on policy and deployment | Regulators and standards bodies on emerging requirements | Through public comment, transparency initiatives, and user councils |
| Informed | About system risks and incident outcomes | About new regulations and consent expectations | About how their data is used, stored, and protected |

This isn’t bureaucracy. It’s balance.

A way to make trust operational—to move from hoping it exists to proving it does.

The report also outlines three levers for building trust: governance, assurance, and inclusion. The words sound academic, but the ideas hold up.

  • Governance is about clarity. When governments require privacy-by-design, innovation doesn’t slow. It becomes safer. Everyone knows who owns the data and who is accountable. That’s how we built The Rock at Alcatraz. It authenticates people without creating surveillance. It gives organizations control without collecting what they don’t need.
  • Assurance is proof. Real proof. Not marketing copy, but independent audits and performance that stands up under sunlight.
  • And inclusion. The quiet one that decides who benefits. I’ve watched technology reach wealthy markets first while public institutions wait years to catch up. Privacy shouldn’t be a privilege. If it only protects people who can afford it, we’ve failed.

A lot has changed since my first biometrics deal.

The same debate about data and control still happens, but the answers sound very different.

At Alcatraz, we recently worked with a quasi-government agency that manages critical infrastructure. Thousands of employees. Sensitive facilities. Complex oversight.

Their position was the opposite of that old sporting event deal. They didn’t want to store personal data, they didn’t want their people feeling like they were being monitored, and they didn’t want the liability or the risk.

It was a relief. A sign that the mindset is changing.

The Rock was built for that kind of thinking. It separates who you are from what you can access. Instead of sending facial images to the cloud or saving them in a central database, it creates an encrypted facial signature that never leaves the device. The matching happens locally, right at the door, and the data stays under the organization’s control.

That means no bulk storage, no searchable image libraries, and no central repository that could be misused or breached. Each interaction is simply a question—is this the right person, right now? And once answered, it’s gone.
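To make that pattern concrete, here is a minimal sketch of the check-and-discard flow, assuming a hypothetical on-device matcher. The function names, the cosine-similarity comparison, and the threshold are illustrative assumptions for this post, not the Rock’s actual implementation.

```python
# Illustrative sketch only: a local "is this the right person, right now?" check.
# The embedding model, threshold, and template store are hypothetical stand-ins.
import numpy as np

MATCH_THRESHOLD = 0.8  # illustrative similarity cutoff


def to_embedding(capture: np.ndarray) -> np.ndarray:
    """Stand-in for an on-device model that reduces a capture to a unit vector."""
    vec = capture.astype(float).ravel()
    return vec / (np.linalg.norm(vec) + 1e-9)


def is_right_person(capture: np.ndarray, local_templates: list[np.ndarray]) -> bool:
    """Compare a transient capture against templates held only on the device,
    answer yes or no, and persist nothing from the interaction."""
    probe = to_embedding(capture)
    # Cosine similarity of unit vectors reduces to a dot product.
    matched = any(float(probe @ template) >= MATCH_THRESHOLD
                  for template in local_templates)
    del probe  # drop the local reference; nothing is stored or transmitted
    return matched
```

In a design like this, the enrolled templates live only in the organization’s own device-side store, so deleting a template is the entire revocation step; there is no cloud copy to chase down.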

It’s a quiet, almost invisible kind of security. People walk through as themselves, not as a dataset. Systems that know who belongs without cataloging everyone who doesn’t. Less collection, more consent. Progress through design restraint.

What used to be a fight over who owns the data is turning into a shared understanding that protecting privacy protects everyone.

And if we don’t protect privacy? If we’re left with PPP minus the privacy?

The WEF estimates that if AI doesn’t earn public trust, the global economy could lose more than $4 trillion by 2033. It’s hard to picture a number that size, but you can feel the loss in smaller ways.

Every stalled project. Every delayed upgrade. Every “maybe next quarter” that postpones a fix that could have prevented something bad.

Mistrust isn’t just a drag on innovation. It’s a risk multiplier. Every delay leaves a door unlocked a little too long.

Public-Private + Privacy

So the new PPP spreads that responsibility. The best partnerships now aren’t built on shared infrastructure, but shared integrity.

That’s why everything we build at Alcatraz starts from a simple idea: Security should protect people, not monitor them.

The Rock makes that idea real. Physical security that behaves like digital privacy. A checkpoint that feels more like a handshake than a spotlight.

That’s what partnership looks like when it works. Technology that helps governments keep people safe while keeping their trust intact.

The first PPPs built roads and power lines. The next ones will build trust.

Learn More: https://www.alcatraz.ai/resources/privacy-trust-center

______________

(1) See The AI Trust Crisis: Why Public-Private Partnerships Are Key to Responsible AI, World Economic Forum (Sept. 16, 2025), https://www.weforum.org/stories/2025/09/ai-trust-crisis-public-private-partnerships/