In today’s open and connected world, we’re negotiating a new understanding of privacy. Advancements in communication technology are thrusting us into new realms of awareness, access, and commerce, and they’re doing so at such an accelerated pace we can barely hang on. The information we share about ourselves fuels these advancements, igniting first as content and then firing repeatedly as marketing data. In the process, our privacy has become less consequential.

Even though we underwrite many of these advancements by trading personal information for benefits and conveniences, many of us aren’t ready to accept new norms of privacy. We remain concerned about unforeseen costs, particularly given the poor stewardship of our personal data in this Wild West era of Big Tech. But history shows us two things about advancements in human communication: they are evolutionary, and they redefine privacy.

The Rise of the Eco-community

Recently, Amazon purchased Eero, a router company you’ve likely never heard of. Eero makes a mesh router system that covers hard-to-reach Wi-Fi dead zones in your home. It joins the growing legion of data-collecting devices Amazon already has in the home: Echo, the Ring security camera, plugs, thermostats, lights, microwaves, vacuums. Many of them include Alexa, who can hear everything you say.

The Amazon eco-community (“eco-“ for economics) is a good example of how the single-click acceptance of every user agreement we’ve never read has led to the subordination of privacy to convenience.

Eero also collects and reports quite a bit of data from the home, including which devices you use to connect to the internet, when you’re most active online, and the streaming patterns of those devices. So now Amazon can build a comprehensive profile of you in both the physical and digital worlds, while at the same time gathering intelligence on competitors whose devices you use in your home. Amazon does this so it can provide you with better, more personalized products and services. Lucky you!

This freaks a lot of people out. But for decades, marketers have collected and used behavioral data in creative ways to increase the chances that we will buy their products, and they’ve used this data to come up with, well, better products. Today, that data marketplace is growing globally at more than 30% per year as new sources of higher-quality consumer data become available. We are those new sources, wittingly or not. We’re the ones agreeing to provide most of this higher-quality data, which raises legal and ethical questions about who owns our data, who has rights to it, and how private it really is.

We’re Not Entitled to Privacy

There is no clear definition of privacy. Aristotle drew a distinction between government and politics (the public sphere) and individuals and families (the private or domestic sphere). And in the U.S., there was no constitutional right to privacy until 1965, when the Supreme Court, in Griswold v. Connecticut, derived one only after cobbling together other, more clearly stated constitutional protections.

Two essential forms of privacy frame the debate. One is a shield behind which we’re free from the interference of others. The other is a cloak beneath which we hide inappropriate or illegal activity. When we talk about privacy and privacy protection, we usually speak in terms of a hostile action and a victim.

The debate over entitlement to privacy is not an easy one. Griswold established a right to privacy, but not an entitlement. Online, our entitlement is at the discretion of Big Tech. Any time we open an app, post on social media, create a Google Doc, say Alexa’s wake phrase, or send a text message, the level of privacy we’re entitled to is a single terms-of-service revision away from being anybody’s guess. Those who continue to think we will once again enjoy the privacy we had just a year ago will find themselves increasingly disappointed. The genie is out of the bottle. And a look at history shows it’s not the first time.

The Face with Tears of Joy

Advances in communication signpost human evolution. The advent of human languages some 50,000 years ago, for example, let us pass toolmaking knowledge from generation to generation. Look where that brought us. And then there’s the Gutenberg printing press. It gave us an incentive to learn to read and let us expand our knowledge beyond what we saw and heard in our local villages. Gutenberg’s press was the single most important catalyst for the Renaissance, the Industrial Revolution, the Technological Revolution, and Western democracy as we know it today.

These improvements in our communication opened windows not only into knowledge of things, but also into knowledge of ourselves and each other. The advent of languages and Gutenberg’s printing press represented massive invasions of privacy, and it’s silly to ask whether they were worth it.

Set these two advancements against my favorite in recent times: the return of symbolism to language on platforms that encourage innovation and deviation. In 2015, Oxford Dictionaries’ Word of the Year was an emoji. The dictionary chose the Face with Tears of Joy emoji as “the ‘word’ that best reflected the ethos, mood, and preoccupations of 2015.” Caspar Grathwohl, the president of Oxford Dictionaries, said at the time that emoji are becoming “an increasingly rich form of communication, one that transcends linguistic borders.”

Rich indeed. Our communication technologies have sped up a key determinant of human advancement: the process of building empathy. Today, I can instantly build common understanding with people on any landmass by posting a symbol.

Look at the #MeToo movement. We’ve distilled a complex injustice into a single hashtag that is helping us gain traction to push for change. A great amount of pain and action (good and bad) underpins that hashtag, but thankfully we can now build empathy around this issue faster than ever before. The ability to broadcast through blogs, social media, and messaging tools allows a single voice to resonate with the like-minded and the like-challenged, and to convince everyone else that the waters are not calm.

Accepting our differences is something humans have been slow to do, especially along racial and gender lines. But now, the speed and repetition with which we can become aware of, understand, process, and respond to our differences allow us to move past them more quickly than ever and on to the more important challenges we face as a single people.

The Big Data Bubble

We have never had the potential to build empathy this quickly, yet we face a privacy paradox: we’ve lost trust in the brokers who provide us the information. Big Tech has exploited our data with alarming stealth and carelessness, and now there’s an emerging resistance to sharing it. We’re in the middle of a big data bubble, in which Big Tech has shaped markets to its advantage by commercializing the information it collects on us and by inflating that information’s value with the claim that it contains something it really doesn’t: our permission.

But the bubble might pop soon. Last week, Mark Zuckerberg did an about-face and announced that Facebook would shift toward private messaging and away from the public-broadcasting model of social media it helped pioneer. He said he wants to turn Facebook into a “living room” instead of a “town square,” admitting that Facebook doesn’t “currently have a strong reputation for building privacy protective services.” The platform, he said, will evolve in the direction people care about: private messaging. What we haven’t heard from Zuckerberg is how Facebook will monetize this shift and how the platform will overcome its reliance on the trade of information from that town square.

Also last week, Sir Tim Berners-Lee, at an event hosted by The Washington Post marking the 30th anniversary of the World Wide Web, reminded us of the web’s potential but pointed out that it has been “hijacked by crooks.” Last November, Berners-Lee announced his open-source project, Solid, which lets us retain control over our own data by storing it in a “pod” that grants Big Tech access on our terms, not theirs.
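To make the pod idea concrete, here is a minimal conceptual sketch of owner-controlled storage with revocable access grants. It illustrates the model only; the names (DataPod, AccessGrant, grantAccess) are hypothetical and are not the actual Solid API.

```typescript
// Hypothetical illustration of the "pod" idea: data stays with its owner,
// and outside parties read it only under grants the owner can revoke.
// These names are illustrative; this is NOT the Solid API.

type Permission = "read" | "write";

interface AccessGrant {
  agent: string;            // who gets access, e.g. a company's app URL
  resource: string;         // which piece of data, e.g. "/streaming/history"
  permissions: Permission[];
  expiresAt?: number;       // optional expiry, as a Unix timestamp (ms)
}

class DataPod {
  private resources = new Map<string, unknown>();
  private grants: AccessGrant[] = [];

  constructor(public readonly owner: string) {}

  // The owner stores data in their own pod rather than on a vendor's servers.
  store(resource: string, data: unknown): void {
    this.resources.set(resource, data);
  }

  // The owner decides who may see what, and for how long.
  grantAccess(grant: AccessGrant): void {
    this.grants.push(grant);
  }

  // Access can be withdrawn at any time.
  revokeAccess(agent: string, resource: string): void {
    this.grants = this.grants.filter(
      (g) => !(g.agent === agent && g.resource === resource)
    );
  }

  // A third party reads data only if a matching, unexpired "read" grant exists.
  read(agent: string, resource: string): unknown {
    const allowed = this.grants.some(
      (g) =>
        g.agent === agent &&
        g.resource === resource &&
        g.permissions.includes("read") &&
        (g.expiresAt === undefined || g.expiresAt > Date.now())
    );
    if (!allowed) {
      throw new Error(`Access denied: ${agent} has no grant for ${resource}`);
    }
    return this.resources.get(resource);
  }
}

// Usage: the data never leaves the owner's pod; a retailer reads it on the
// owner's terms and loses access the moment the grant is revoked.
const pod = new DataPod("https://alice.example/profile");
pod.store("/streaming/history", { hoursStreamed: 12 });
pod.grantAccess({
  agent: "https://retailer.example/app",
  resource: "/streaming/history",
  permissions: ["read"],
});
pod.read("https://retailer.example/app", "/streaming/history"); // allowed
pod.revokeAccess("https://retailer.example/app", "/streaming/history");
```

The point of the design, under these assumptions, is that the data never changes hands; companies query it where it lives, under permissions the owner sets and can take back.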

When the big data bubble pops, there will be the usual economic fallout as Big Tech grapples with new business models involving our personal data. But we will arrive at a new understanding of privacy.

That understanding will no longer excuse our ignorance of what data exists about us, how it’s collected, where it’s stored, or how it’s used. Transparency mechanisms like the European Union’s GDPR will prompt and remind us each time we are about to share data, and tools like Solid will give us ways to seal off our data from those we don’t want to share it with. But these mechanisms are mere safety measures in a world in which people with good intentions and bad have increasing degrees of connection and knowledge, a world in which we each must resume the maintenance of our own privacy.