Ian’s note: When I started writing this, I had no idea I’d find what I did. I have friends who work at Facebook who do good things there, who work hard in support of Facebook’s mission to give people the power to share and to make the world more open and connected. I did not communicate with them for this piece. Also, I am still a Facebook subscriber. It’s an indispensable utility that allows me to keep up with friends I otherwise would have lost touch with years ago and to see the joys of my family unfold in pictures and posts.

——————————

Our lawmakers can’t keep up, and so big tech’s opportunists are not criminals. Our personal thoughts, shared as private messages on Facebook through what has recently been revealed to be a meaningless sieve of privacy settings, belong to Facebook and its corporate partners, some of the world’s largest technology companies.

Facebook’s two top executives, Mark Zuckerberg and Sheryl Sandberg, failed to protect the data of millions of its users, and despite a series of warnings, they failed to act to stem the proliferation of disinformation and “fake news,” which undermines democracy and has led to brutal beatings, murder, and genocide.

The Cost of Slow

To ramp up ad sales, increase profit, and build shareholder value, Zuckerberg and Sandberg repeatedly looked the other way upon hearing warnings of privacy breaches and civil unrest caused by the malicious use of their platform. On October 29 and 30, the PBS series Frontline spelled it out clearly in its two-part documentary, “The Facebook Dilemma.” Facebook’s primary talking point when asked about its social irresponsibility continues to be the admission that it has been slow to understand how the platform can be used illicitly. Check these out:

Russia uses Facebook as a weapon against U.S. democracy.

“We didn’t see it fast enough.” — Nathaniel Gleicher, Facebook Head of Cybersecurity Policy (from Frontline’s “The Facebook Dilemma”)

Philippines President Rodrigo Duterte uses paid Facebook followers and fake Facebook accounts to target critics of his drug war. Through the use of Facebook, the war grows into one the U.N. has labeled a crime under international law and one that Human Rights Watch calls government-sanctioned butchery.

“I think we were too slow.” — Monika Bickert, Facebook Vice President of Global Policy Management (from Frontline’s “The Facebook Dilemma”)

In 2014, Facebook learns that Buddhists in Myanmar use Facebook to target the Muslim minority with hate speech and fake news stories of Muslim men raping Buddhist women, leading to brutal beatings and the murder of innocent men. By 2017, a massive wave of violence displaces over 150,000 people in what the U.N. labels genocide and in which it finds the Facebook platform has played a central role.

“We have been slow to really understand the ways in which Facebook might be used for bad things.” — Naomi Gleit, Facebook Vice President of Social Good (from Frontline’s “The Facebook Dilemma”)

In March 2018, the New York Times and other news outlets report that personal Facebook data of approximately 87 million users (the original number was 50 million) has been acquired by British political consulting firm Cambridge Analytica in a breach of Facebook terms of service. The data is later accessed from Russia and is used to target political ads on behalf of the Trump campaign in the 2016 U.S. presidential election.

“One of my greatest regrets in running the company is that we were slow in identifying the Russian information operations in 2016.” — Mark Zuckerberg on April 10, 2018, at a joint hearing between the Senate Judiciary Committee and Senate Committee on Commerce, Science and Transportation, concerning the Cambridge Analytica leak.

Thoughtless Opportunism

Two weeks after “The Facebook Dilemma” aired, the New York Times published its scorching 6,000-word front-page feature “Delay, Deny and Deflect: How Facebook’s Leaders Fought Through Crisis.” The article opens with the story of Facebook’s stymied security chief, Alex Stamos. In September 2017, more than a year after Facebook had discovered Russian activity on its platform, Stamos informs Facebook’s board of directors that the company has yet to contain the crisis. Stamos had been trying for months to get Facebook’s leadership to take the matter more seriously. Sandberg’s reaction in a later private meeting with Stamos and Zuckerberg says everything about the thoughtlessness of that leadership: “You threw us under the bus!” she screams at Stamos.

Turns out, Zuckerberg and Sandberg are not slow to understand. They’re quick to cover up. They hid Stamos’s warnings from the public and focused on growing Facebook’s number of users and finding ways to extract more data from them. Zuckerberg and Sandberg are the most thoughtless opportunists in tech.

Sandberg, who is Facebook’s chief operating officer, was vice president of global online sales and operations at Google. There, she was responsible for online sales of Google’s advertising and publishing products and Google’s consumer products. She started with four people; when she left Google, she had a team of 4,000.

In 2012, when sales at Facebook began to flatten before the company’s IPO, Sandberg did something that left many privacy watch groups concerned. She created Facebook partnerships with large data broker firms that for decades have collected information about individuals from public records and private sources — identifying data, sensitive identifying data, demographic data, data from court and public records, home and neighborhood data, general interest data, vehicle data, travel data, purchasing behavior data.

Facebook then began matching this data with the data it collects from its users in order to serve those users highly targeted ads. Sure enough, advertisers loved it and ad sales went up. (But the company’s IPO was still a shaky one.) For many, this is an acceptable business model, and many users prefer it to seeing irrelevant ads. If Facebook’s data practices and policies ended there, things would be okay, sort of. But they don’t.

Making the World More Open and Connected

Most of the information we share with Facebook can be accessed by any app developer. The Cambridge Analytica scandal came about because an app developer had access to the personal data of 270,000 users — data that also included information on those users’ friends. This was all legit and within Facebook’s terms of service (a fact that has been a revelation to many). But then the app developer provided that data to another party, breaching those terms. This is how Cambridge Analytica came into possession of it. The total take for CA was personal data on 87 million users.

Once a developer takes data from Facebook, Facebook has no control over it. And once that data is out there, it can go anywhere. There’s no getting it back. The only thing Facebook can do is hold up a terms-of-service agreement and say, “See, we told them not to share it!”

Today, Facebook’s mission — to give people the power to share and to make the world more open and connected — seems quaint and anachronistic. When you build what amounts to a public utility, and that utility is exploited to erode democracy and spill blood, you have a new responsibility. It’s not just to your shareholders, and your book publisher, and a snarky ego that should have been left in a college dorm room. It’s to a global population affected by the 2.2 billion people, real and fake, who use your platform.

Big Tech Buddies

Not only did Sandberg and Zuckerberg conceal warning signs of the Russia scandal from the public; Facebook also remained focused on growing its user base and driving profits through the strategic use of profile data with corporate partners. And it did so after insisting to lawmakers that it had put privacy protections in place.

Drawing on hundreds of pages of records from Facebook’s internal system for tracking partnerships, the Times, in a December 18, 2018, article, underscored how personal data has become the new currency of the digital age. Facebook has become ruthless at extracting more data from its users, making what it can offer its corporate partners more valuable. Here are some of the ways Facebook lets partners use our data: Microsoft’s Bing search engine can see the names of Facebook users’ friends without consent; Netflix and Spotify can read Facebook users’ private messages; Amazon can collect users’ names and contact information through their friends; and as recently as this summer, Yahoo could view streams of users’ friends’ posts.

“We are focused on privacy. We care the most about privacy. Our business model is, by far, the most privacy-friendly to consumers.” — Sheryl Sandberg (from Frontline’s “The Facebook Dilemma”)

Zuckerberg has repeatedly said that we have complete control over anything we share on Facebook. This simply isn’t true.

Regulation and the Blow from the Future

We still feel entitled to those parts of our privacy that have been furtively vacuumed into the commercial domain. That domain itself is private, and so the irony is galling. But we got beat in a trade, fair and square according to the law — or lack of law. We traded for convenience, and because of its unforeseen utility, that convenience has become indispensable. Can we unplug? Can we go without internet service and a cell phone and dispense with privacy settings and user agreements? We’d get most of our privacy back, but we’d be living in 1984.

I bring this up because the sad truth is that today’s privacy protection under the law maps to that time period. We’ve been hit with a blow from the future. Our lawmakers can’t keep up. It’s the wild west; as a good friend used to say, it’s the Wild Wild Web. This, of course, will change. Our laws will catch up. But when they do, we’ll have a new collective understanding of what privacy means in a more open and connected world. And what it means to place it in the hands of thoughtless opportunists.