Resistance is (currently) futile. Maybe it doesn’t have to be.

By Charlie Warzel

Mr. Warzel is an Opinion writer at large.

This article is part of a limited-run newsletter. You can sign up here.

You can’t solve a problem that you can’t define. That’s why I love this dual definition of privacy by Maciej Ceglowski, one of my favorite writers and thinkers on technology. It’s from his written testimony submitted to the Senate Banking Committee last week.

The first definition is the classic one, that data privacy is “the idea of protecting designated sensitive material from unauthorized access.” Easy enough. His second is much more profound and, as he puts it, “until recently was so common and unremarkable that it would have made no sense to try to describe it.” Here it is:

That is the idea that there exists a sphere of life that should remain outside public scrutiny, in which we can be sure that our words, actions, thoughts and feelings are not being indelibly recorded. This includes not only intimate spaces like the home, but also the many semiprivate places where people gather and engage with one another in the common activities of daily life — the workplace, church, club or union hall.


What Ceglowski is really talking about is the ability to “opt out.” It’s a phrase that big tech companies love to use. Just toggle this button and you’re free! David, the user, has control, not Goliath. This is, of course, quite disingenuous. As Ceglowski argues:

A characteristic of this new world of ambient surveillance is that we cannot opt out of it, any more than we might opt out of automobile culture by refusing to drive. However sincere our commitment to walking, the world around us would still be a world built for cars. We would still have to contend with roads, traffic jams, air pollution, and run the risk of being hit by a bus. Similarly, while it is possible in principle to throw one’s laptop into the sea and renounce all technology, it is no longer possible to opt out of a surveillance society.

We’ve built a society and economy that runs on surveillance, a world where the price for participation is tracking, targeting and disclosure of data. “Opting out” might as well mean heading to Walden Pond (and even then it’s likely that, in preparation for your journey to Thoreau’s cabin, you’d be targeted by ads for self-reliance books on Amazon, freeze-dried prepper meals and 12 different iPhone meditation apps).

I called up Ceglowski after his trip to Washington to inquire about the experience and what he thinks we can do to make opting out less of a pipe dream. Like anyone with a decent understanding of how the web works, he has a healthy skepticism that we’ll rein in privacy violations, but his one potential area of optimism really stuck with me. It’s the concept of positive regulation.

The gist is that Google, Facebook and the other entrenched platforms are truly vulnerable in only one area: privacy. He argues for a legally binding framework with harsh penalties (criminal liability) for playing fast and loose with data. The logic is that big tech companies are so reliant on invasive privacy practices and deal with so much information that there’s no way they could play by such rules. But new entrants — companies that are smaller and that actually put a premium on privacy — might be able to differentiate themselves and disrupt the space.

“If we use privacy constructively and create a legal framework, we can incentivize those who want to go up against the entrenched players by marketing themselves as explicitly privacy-focused,” he told me.

Perhaps most important, Ceglowski’s approach would finally test the idea of just how much internet users value data privacy. “It is possible that the tech giants are right, and people want services for free, no matter the privacy cost. It is also possible that people value privacy, and will pay extra for it, just like many people now pay a premium for organic fruit,” he wrote in his statement.


Over the phone, he explained that, while it might seem small, if real people on the internet vote with their wallets to use privacy-focused services over big data-sucking platforms like Facebook and Google, the effect could be profound. He cited the telemarketing wars of the early 2000s as an example.

“When telemarketers were fighting the ‘do not call’ list they argued that people loved having the opportunity to hear about great deals and products via phone during dinner time,” he said. “But once the regulation passed, everyone signed up for that list and it became obvious that the industry’s argument was laughable.”

So far, nobody’s been able to poke a hole in big tech companies’ argument that we enjoy their services enough that we’re O.K. with constant privacy violations. Perhaps that’s only because, as Ceglowski suggests, it’s all we know. “In the best case, you could have companies who can make the argument that real people care about privacy, as long as they’re given a realistic option,” he said.

https://www.nytimes.com/2019/05/14/opinion/data-privacy