The default business model of the internet is advertising spyware. What does that mean for grassroots organizing and independent media?

coat button with camera spyware

During the Cold War, the East German secret police known as the Stasi could turn a mundane object like a jacket button into a spying apparatus. A typical page on today's internet is a modern version of this device, populated with virtual buttons tied to advertising platforms that are watching, whether you choose to click the buttons or not. But who are the advertising platforms tied to? And what does that mean for grassroots organizing and independent media?

The following is not a commentary about right or left politics. It is about the unbridled potential of automated bureaucratic systems of control and what it means when they fall into centralized systems of power that tend toward kakistocracy: rule by the worst, least qualified and/or most unscrupulous citizens.

What is "Advertising Spyware"?
How much does "free" cost?
Journalism's Big Data contradiction
It's personal
"Just a tool"
Don't like the game? Change the rules.
Gotchas in the fine print
More Reading

What is "Advertising Spyware"?

The default business model of today's internet is the harvesting of as much data from connected devices as possible in the name of delivering more effective "Return on Investment" (ROI) for advertisers. In other words: Spying on consumers.

How much does "free" cost?

Behavioral advertising uses the data collected by this spyware to create a feedback loop: an endless carousel of algorithms whose rhythms dictate what your eyeballs see.

Behavioral advertising does this so effectively that it now fuels most of your "free" experiences in the connected world. But what are we really paying for those free experiences? What is the real cost of free?

Journalism's Big Data contradiction

According to conventional wisdom, journalism is a public-interest vocation. In practice, we recall that Reuters began as a pricey news service available to traders who subscribed to stock prices delivered by carrier pigeon to get ahead of commodity markets in the mid-1800s.

Now that news is ubiquitous, technology platforms have benefited public-interest news outlets and political organizing tools by using behavior-driven personalization algorithms to deliver audiences beyond the reach of traditional broadcast media. The Faustian bargain, however, does not end when a new audience member "likes" or discovers a great news site or organizing tool. In fact, at that point it has just begun. If the website or tool uses one of the many behavioral advertising platforms available to monetize content and pay staff, then the new visitor, and all of that visitor's subsequent movements within the system, become the property of the system, available for rent.

A telephone with control board at the Stasi Museum in Berlin

It's personal

Seemingly mundane information about individuals is harnessed by these complex systems at mega-scales to produce the outcomes desired by system operators. It's easy to shrug off the significance of each signal you enter into the system as an individual. We lull ourselves into the belief that each signal is a tiny, worthless bead we've surrendered for free digital conveniences (if we're even aware of this exchange at all). But the hidden power of technocratic systems is in the massive power they bestow upon their operators at scale, camouflaged by the mundane nature of the signals they gather.

A common, profitable result of behavioral ad tailoring is that you might end up seeing different (higher or lower) prices for big-ticket items like airfare purchases than someone else would, based on the income bracket the algorithms think you fall into.

"Free" services are also harvesting data that you leave behind in other places, and no data point is too small or insignificant. A personal example: lately I've been enjoying a specific brand of locally produced kombucha tea. Sometimes I'll buy a couple at a time, and my kombucha commute shrinks to a round trip to the refrigerator and back. One day recently, something eerie happened. I got up, went to the fridge for my daily tea, came back to a conversation I was having with a friend on Facebook, and absentmindedly scrolled straight into an ad for the very same locally produced tea I was holding in my hand at that moment. I had never seen an ad for this tea anywhere before and had assumed they barely marketed at all.

We are creatures of habit. After a brief existential meditation on this, I became consciously aware that this ritual occurred every day around the same time (right before the store closes), and that I usually pay for the tea with a credit card. Facebook hadn't read my mind; it had probably just tracked the habit by acquiring additional data about my purchases from one of the many cloud services that monetize such data by selling it to ad platforms like Facebook.

The feedback loop was flowing now: I had unknowingly fed my habit into the system by purchasing tea sufficiently often at the same time of day that the habit produced a signal in the noise. And when the system detected a missing signal one day, it delivered a targeted nudge at the exact moment when it would be most likely to produce the desired outcome: my daily purchase, and a higher ROI for the business owner's advertising dollar. Connecting advertisers with niche audiences through data mining is what this system is all about. I like kombucha tea, so what.
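The mechanics of that loop are simple enough to sketch. Assuming, hypothetically, that the platform sees a stream of timestamped purchase events, it only needs to bucket them by hour of day, notice the recurring bucket, and fire a nudge when the usual bucket stays empty:

```python
from collections import Counter

def usual_purchase_hour(events, min_count=3):
    """Given (day, hour) purchase events, return the hour that recurs
    at least min_count times, or None if no habit is visible yet."""
    if not events:
        return None
    hours = Counter(hour for _day, hour in events)
    hour, count = hours.most_common(1)[0]
    return hour if count >= min_count else None

def should_nudge(events, today_hours, now_hour):
    """Nudge when the habitual hour has passed today without a purchase."""
    habit = usual_purchase_hour(events)
    if habit is None:
        return False
    return now_hour >= habit and habit not in today_hours

# A week of tea runs, almost always around 18:00 (right before the store closes)
history = [(day, 18) for day in range(6)] + [(2, 9)]
print(should_nudge(history, today_hours=[], now_hour=19))    # habit missed -> True
print(should_nudge(history, today_hours=[18], now_hour=19))  # already bought -> False
```

The function names and the event format here are invented for illustration; a real ad platform works at vastly larger scale and with far richer signals, but the principle of detecting a missing signal in a habit is the same.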

In the days leading up to the 2016 US presidential election, Donald Trump's team knew that several key demographics were on the fence about turning out to vote for Hillary Clinton. The team harnessed the power of nudging via Facebook's ad platform, which allowed it to segment potential ad viewers by race and gender, to produce a desired outcome: that potential Clinton voters in key localities stay home on election day (link).

We will never know how effective this novel form of the classic voter-suppression strategy was in influencing the outcome of the election, because that data is the private property of Facebook, of course. But it seems to mark a significant moment, a Rubicon crossed where politics and technology meet: political influence is now a mechanized process available for rent from mercenary tech platforms.

"Just a tool"

The most common rebuttal I encounter when voicing concerns about this from the tech folk who produce these mechanisms is: "It's just a tool." But the problem with tools is that they only work for those who have access to them.

Documentary filmmaker Adam Curtis slyly observes in his film "Hypernormalisation":

Nothing ever seems to change, because the system is designed to funnel the energy of its users back into the service of its operators.

When we are angry about political outcomes, we turn to our social media apps to vent frustration. Put succinctly: Angry people click more. But the clicks only balloon the power of the platforms, owned and operated by the hyper-wealthy and made available for rent to the highest bidder.

It follows that the people best poised to strategically use these tools are the ones who have already been using them in the world of finance for decades. It's no coincidence that perhaps the most influential figure in the 2016 US presidential election was a hedge fund manager: Robert Mercer of Renaissance Technologies. Mercer is also a founding investor and board member of Cambridge Analytica, a political consulting company that has gathered Big Data in the service of both the Brexit and Trump campaigns, and he purchased Cambridge Analytica's parent company, Britain's SCL Elections. Prior to the Trump and Brexit campaigns, SCL was "just a psychological warfare" firm simply doing its thing "in the kind of developing countries that don't have many rules," one former employee told The Guardian. Note that by this definition, a massively unregulated election environment is a "developing country". It is in this kind of environment that digital astroturf thrives. (Astroturf is fake grassroots activism, fake news, and even fake web traffic disseminated through social media; for a deep dive, check out a previous blog post about it here.)

Robert Mercer is the kind of person who honed his signal detection ability by studying the minute effects of seemingly arbitrary signals such as the weather on financial markets. People like this are the virtuosos of Big Data and they are able to play the system like a violin.

Grassroots organizers naively create websites and political action tools that trade their shiny beads of data into this system, hoping that at some point in the Faustian bargain the payoff will be worth the cost of surrendering their data. What these organizers fail to realize is that, from a Big Data perspective, this is like watching a loosely assembled pick-up basketball team from the local park take on the LA Lakers. Grassroots organizers don't stand a chance playing the system's game by the system's rules. Unlike Robert Mercer, they will simply never have a building full of PhDs and math quants who specialize in the level of research it takes to fully exploit Big Data the way those at the top of the system do.

In other words, they are destined to fail. And they will take the rest of us down with them, because it is our data that they are feeding into the system's furnace.


The futility and delusion of feeding grassroots user data into a top-down system that is available for rent to the highest bidder.

Don't like the game? Change the rules.

From a strategic point of view, continuing to play this game is futile if our goal is real democratic political change. But this is the catch-22 that most public-interest media publishers and political organizers find themselves caught in because they are dependent on the audiences and ad-revenue delivered when they play it.

There is a growing backlash from an increasingly savvy internet audience. Part of this personalization backlash takes the form of using ad-blockers to identify and zap the trackers that behavioral ad platforms use to harvest data. Using an ad-blocker is a great way to deprive the system of the fuel it needs to continue steam-rolling grassroots movements.
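At its core, blocklist-based ad blocking is a simple mechanism: compare each outgoing request's host against a list of known tracker domains and refuse to load the ones that match. A minimal sketch, using a hypothetical blocklist (real blockers use community-maintained filter lists with far richer rule syntax):

```python
from urllib.parse import urlparse

# Hypothetical tracker domains; real lists contain tens of thousands of entries
BLOCKLIST = {"tracker.example", "ads.example", "pixel.example"}

def is_blocked(url, blocklist=BLOCKLIST):
    """Block a request if its host is a listed tracker domain
    or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == domain or host.endswith("." + domain)
               for domain in blocklist)

print(is_blocked("https://pixel.example/collect?id=123"))  # True
print(is_blocked("https://cdn.pixel.example/tag.js"))      # True (subdomain)
print(is_blocked("https://news.example/article"))          # False
```

Every request that matches is data the platforms never receive, which is exactly the fuel-deprivation described above.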

Genuine privacy advocates argue that internet privacy is a human right. The complement to that argument put forth here is simple:

Democracy is impossible without internet privacy.

The real cost of "free" internet services is democracy itself.

The downside of taking our privacy back with ad blockers (and carefully auditing where our data flows by getting picky about which services we sign up for) is that we also deprive our favorite publishers of the revenue they need to continue their work.

This trend is exacerbated by the fact that large publishing platforms that use behavioral advertising (e.g. YouTube) are increasingly exercising a form of censorship by demonetizing political content wholesale in an effort to combat the rise of "Fake News". Instead of promoting media literacy, they have opted simply to decide what is news and what is not, consolidating centralized power over political discourse.


One of the most exciting remedies that has emerged in response to these trends is community building via crowd-funding. Independent media outlets and campaigns based on community support are more resilient and can even act as a form of mutual-aid. Audiences go from passive consumers to engaged, active community members.

If you're an audience member, that means using an ad-blocker is only half of the equation. You have to give back. There is no "free" anymore.

Publishers can build community on top of privacy-oriented grassroots organizing tools. Whatever tools publishers decide to use, the key is making sure that they fully own their community's data and don't rely on third-party platforms whose fine print allows them to siphon data away or control narratives by cutting off audiences and/or funding without warning. Publishers must offer multiple ways for the community to support their work financially.

It is also critical to realize that many "free", ubiquitous tools (like Google Analytics and social media "share" buttons) harvest audience data. Publishers can consider running their own analytics instead (Matomo is free and open source software that offers paid plans and the ability to anonymize traffic numbers) and building up email lists of supporters. In these and many other ways, publishers must take control of their data.

Tracker-free, content-based advertising is another route to income that has been insufficiently explored and is ripe for more innovation. It means simply matching ads against content: if you're reading an article about riding bicycles, you see bicycle ads in the sidebar. This advertising model uses no personal information and does not track the user. It will never deliver the same ROI as ad personalization, but maybe that's ok.
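The contrast with behavioral targeting is worth making concrete: contextual matching needs only the article text and the ad inventory, never the reader. A toy sketch, with a hypothetical inventory keyed by keyword lists:

```python
def contextual_ads(article_text, ad_inventory):
    """Match ads by keyword overlap with the article alone --
    no user profile, no tracking, no reader data involved."""
    words = set(article_text.lower().split())
    return [ad for ad, keywords in ad_inventory.items()
            if words & set(keywords)]

# Hypothetical ad inventory
inventory = {
    "City Bike Shop": ["bicycle", "cycling", "bike"],
    "Mountain Gear Co": ["hiking", "camping"],
}

article = "riding a bicycle to work is cheaper and healthier than driving"
print(contextual_ads(article, inventory))  # ['City Bike Shop']
```

Real contextual systems use smarter text analysis than bare keyword overlap, but the point stands: the only input is the page itself.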

Gotchas in the fine print

A segment of the technorati pushing "friendlier" behavioral advertising will push back against the notion that targeted ads undermine democracy. Their mental gymnastics conflate personal privacy with the network effects of targeted advertising. Their preferred solution is regulation of "PII" -- Personally Identifiable Information. PII is a magic wand waved to dismiss concerns about personalization and ad retargeting by decoupling your online behavior from your actual real-life identity. With PII regulation, you become a number instead of a name to behavioral advertisers. But regulating PII alone doesn't change the fundamental fact that the system operators retain total access to your network activity and can still rent out the ability to nudge users en masse. Even worse, the behavioral data is still being gathered, and users can be "unmasked" by those in positions of authority. When the magic wand of PII gets waved, you can bet the Big Data is still intact (and hidden up a sleeve).

PII regulation is a reasonable first measure and an important step toward taking back control. But when it comes to the threat to democracy posed by network effects that siphon data from the political grassroots to the system operators, decoupling PII from real-life identity alone accomplishes almost nothing.

A meeting room at the Stasi headquarters in Berlin

When neoliberal acolytes at The Economist start talking about the need for a "radical rethink" on regulation and call for a reboot of monopoly-busting when it comes to Big Data, it's time to have a hard look at what we're staring down the barrel of.

Luckily, we still have the power to deprive the system of its fuel by installing ad-blockers on our devices, opting out of data-gathering services as much as we can, and paying real media watchdogs what they're worth. Spread the word.

More Reading

The world’s most valuable resource is no longer oil, but data -- from, no byline

The great British Brexit robbery: how our democracy was hijacked -- by Carole Cadwalladr for

Alternet: Trackers to the left of me, surveillance to the right... -- by privacy advocate Aral Balkan, creator of tracker blocker

The "dark ads" election: How are political parties targeting you on Facebook? -- by Maeve McClenaghan for

Rule by Nobody: Algorithms update bureaucracy’s long-standing strategy for evasion -- by Adam Clair for

I'm an ex-Facebook exec: don't believe what they tell you about ads -- by Antonio Garcia-Martinez for

Facebook employs ex-political aides to help campaigns target voters -- by Robert Booth for

These are just links, not embedded share buttons that load 3rd party trackers on the page!

Lauren Garcia is a software developer in the San Francisco Bay Area.