Let me count the ways that the filter bubble distorts your screen— the screen in your mind. It’s the fixed-motion scene— the freeze-frame of focus. It always looks impressive, reinforcing your perspective, and it sounds oh so convincing. But don’t let the filter bubble fool you. It’s a smooth, stalking slayer.
Filter bubble fallout is contaminating the world with extreme prejudice and debilitating discord. It is not the only device-driven dis-ease, but it is one of the reasons why ghastly scenes of unimaginable happenings are increasingly cropping up on your media feeds.1
The Filter Bubble is nothing less than a digitally enhanced form of cognitive bias. As such, it clouds judgment and reduces our ability to think and communicate clearly. No one is immune. It’s poisoning our minds and cutting the world in two— into ‘us’ and ‘them’. And again and again into sharp slivers of false certainty and fanatical factions.
Contrary to popular opinion— ‘what you don’t know can hurt you’! It might be invisible, but there is no doubt that the stubborn bluster of bias can divide families, provoke wars and wreak havoc on your every move— on every decision that you make, and on every action that you take.
Granted, the Filter Bubble is not normal fare for a subliminal website. It doesn’t fit into the new-agey, psycho-pop clique that’s usually associated with subliminality.2 It’s not suggestive image-trickery, and it’s not conspiratorial, but it is supra-subliminal. It’s a new breed of AI-enhanced subliminal influence that’s altering our minds and perceptions on an unprecedented scale.
It’s laced with layers that are all teeming with subliminality. For starters, the Filter Bubble exhibits:
- a high level of algorithmic complexity that’s all but impossible to penetrate. This is especially true regarding filter-bubble-fed SEMEs (Search Engine Manipulation Effects). Even a dedicated team of interdisciplinary scientists and thousands of crowd-sourced volunteers would still have a hard time piercing the veil of its no-paper-trail complexity.3
- tentacles of cryptic coercion that resonate in deep-mind space and are about as transparent as Vantablack— the darkest substance in the world.
- deep roots in the ‘neuromarketing’ branch of Surveillance Capitalism— where it’s promoted as the latest and greatest breakthrough in personalization and convenience.
You wouldn’t notice the Filter Bubble unless it were pointed out, and even then you’d still be captivated— still under the spell of its subliminal sway. This psychodynamic phenomenon would likely have remained a nameless phantom if not for Eli Pariser’s seminal exposé— The Filter Bubble: What the Internet Is Hiding from You.
You see, the algorithms that were developed to control targeted advertising are now exerting control over our lives. “They are constantly creating and refining a theory of who you are”, what you’ll do, and what you’ll want next. “Together, these (prediction) engines create a unique universe of information for each of us.” 4 As a result, this insidious sphere of warped data— Pariser’s Filter Bubble— is subliminally shaping how we learn, what we know, and even how our democracy works.
As Stanford law professor Ryan Calo is so succinctly quoted as saying—
“Every technology has an interface… a place where you end and the technology begins. And when the technology’s job is to show you the world, it ends up sitting between you and reality, like a camera lens. That’s a powerful position. There are lots of ways for it to skew your perception of the world.” 5 And that’s precisely what the filter bubble does.
The Filter Bubble stands at the nexus between computer science and the sciences of the mind, between humanity and the machine. Pariser’s book put the Filter Bubble on the map. It describes how the ‘selective targeting filters’ at the heart of persuasion tech have locked us into an all-pervasive hall of mirrors— an echo chamber of extreme prejudice.
Much of what Pariser discusses in The Filter Bubble is as relevant today as it was when it was first published in 2011. Nevertheless, things have progressed, and it is slightly dated. But a new edition, entitled Filter Bubbles and Targeted Advertising, is slated for release in August 2019. Even so, the Filter Bubble is just the tip of the proverbial iceberg of the ‘instrumentarian power play’. It’s a vast issue with an astronomical degree of insuperable complexity; no one book, however comprehensive, could ever hope to cover it in the full depth and detail that it demands.
Filter Bubbled— Choices, Decisions & Consequences
Isn’t it true that your decisions are what pave the path of your destiny? Aren’t they the building blocks of your future? Can you see how your decisions— what you study, what you do, what you eat, where you live, what you support and who you marry— determine the kind of life you live? And in turn, can you tell how they impact your relationships with loved ones and influence the future of your children?
“The filter bubble can affect your ability to choose how you want to live… When you enter a filter bubble, you’re letting the companies that construct it choose which options you’re aware of.
“You may think you’re the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you’ve clicked on in the past determines what you see next—a Web history you’re doomed to repeat. You can get stuck in a static, ever narrowing version of yourself—an endless you-loop.” 6
The thing is that this manufactured myopia affects everyone, including those of us in positions of power— positions that influence nations, international agreements and the constitution of our social order. Even if the world were not under the spell of media-fueled performance politics, the decisions of our judges, legislators, prime ministers and presidents would still be compromised by subliminal bubbles of bias.
One study, conducted by Robert Epstein— a Harvard-trained psychologist and former editor-in-chief of Psychology Today— found that democratic elections can be swung by a margin of 15% just by biasing the filtering of search results!
“For his experiment… Epstein attempted to shape the perceptions of a random sampling of potential voters in California. The test involved an election most of the subjects knew little about: a close-fought campaign for prime minister of Australia in 2010. The researchers secretly altered the rankings of search results to help favored candidates.
“After 15 minutes of searching and reading linked articles, it was clear that the manipulation had worked, with about 65 percent of subjects favoring the candidate getting elevated rankings, compared with 50 percent among a control group that saw impartial search results… Three out of four subjects, meanwhile, reported no awareness that the search rankings had been altered.” 7
The Filter Bubble Bias Band and the Cyber-Sirens
Google and Facebook are the stars of the Filter Bubble show. While Facebook has recently shown a modicum of responsiveness to widespread privacy concerns and transparency, Google has battened down the hatches and renewed its assault.
Even so, the rush towards an ever greater personalization of the web is being driven by a whole host of digital corporations and government agencies. Jaron Lanier identifies this invisible league of virtual corpor’nations as Siren Servers.
“Siren Servers gather data from the network, often without having to pay for it. The data is analyzed using the most powerful available computers, run by the very best available technical people. The results of the analysis are kept secret, but are used to manipulate the rest of the world to advantage.” 8
I prefer to call them Cyber-Sirens, but the spirit of the term remains the same as Lanier intended. You may recall that the Sirens are the spellbinding nymphs of Homer’s Odyssey, whose song lured sailors to their deaths on the craggy shores of their island. The Sirens fascinated beyond measure, compelling the weather-worn sailors of legend to forgo their senses and meet their doom.
Let’s face it, technology is fascinating. It’s 21st century magic. Dazzled by technology with evangelists chanting their creed, who wouldn’t be compelled by the song of the Cyber-Sirens? Instead of rocky shores, our blind faith in the technological vision of Cyber-Sirens is leading us to extreme bias, unbalanced division and perhaps even technological servitude. At the very least “our excitement with new technologies lulls us into accepting risks that we do not see or understand.” 9
In his book Who Owns the Future, Lanier warns us that: “Sirens might be even more dangerous in inorganic form, because it is then that we are really most looking at ourselves in disguise. It is not the siren who harms the sailor, but the sailor’s inability to think straight. So it is with us and our machines.” 10
We really are ‘looking at ourselves in disguise’, because the Filter Bubble generates a feedback loop of perception and belief! Here’s how it works: the Cyber-Sirens sweep up everything that you do and everything that you show interest in. They use this data to compile a continuously updated digital persona of you— of your preferences and interests. They then feed you media and content that you’ll presumably agree with and like.
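That loop is easy to caricature in code. Here is a minimal sketch, assuming a toy catalog and a click-count profile (every name and scoring rule below is hypothetical, purely to illustrate the mechanism): the recommender ranks content by how often you have clicked its topic before, you click what it shows you, and the profile narrows with every round.

```python
from collections import Counter

# Hypothetical toy catalog: (item name, topic) pairs.
CATALOG = [
    ("cats-video", "pets"), ("stock-tips", "finance"),
    ("kitten-pics", "pets"), ("world-news", "politics"),
    ("puppy-gifs", "pets"), ("tax-guide", "finance"),
]

def recommend(profile, k=3):
    """Rank the catalog by how often the user clicked each item's topic."""
    ranked = sorted(CATALOG, key=lambda item: profile[item[1]], reverse=True)
    return [name for name, _topic in ranked[:k]]

def simulate(first_click, rounds=3):
    """Model a user who always clicks the top recommendation."""
    topics = dict(CATALOG)
    profile = Counter({topics[first_click]: 1})
    for _ in range(rounds):
        feed = recommend(profile)
        choice = feed[0]               # the user can only pick what the filter shows,
        profile[topics[choice]] += 1   # so every click narrows the profile further
    return profile

print(simulate("cats-video").most_common())  # -> [('pets', 4)]
```

Run it and one topic quickly crowds out everything else: the ‘endless you-loop’ in a few lines of ranking logic.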
The Privacy Trade Charade
“The totality of societal observation over the individual is the defining antithesis of freedom, even when that observation is gained through hidden and subtle persuasion…
“We have been overwhelmed with the illusion that surveillance and freedom are compatible. That is because the culture of the Internet, driven by its core economic model, has succeeded in equating privacy with anonymity…
“Even now in the early phase of mapping our minds, this access to our thoughts already exceeds the powers of the most invasive Big Brother government that Orwell imagined.
“At the command of Internet-driven signals, people everywhere in the world have been willing to abandon the concerns and safeguards of privacy, developed painstakingly throughout human history, for the convenience of plucking that perfect item off a virtual shelf and paying for it without looking up from their devices.” 11
Have you noticed the coy way that Silicon Valley appropriates words that sound good and then twists their meaning beyond recognition? Take personalization, for instance. Who doesn’t like personal service? It sounds great, but what Google means by personalization isn’t anything like personalized service. Google is notorious for its rude ‘take it or leave it’ attitude and its glaring lack of customer support. Personalization is simply a pleasant-sounding euphemism, a palatable mask for a social-profiling feedback loop! Privacy, too, takes on a completely different shade of meaning in the digital landscape. Privacy isn’t just not having to worry about someone barging in on you. It is really about personal space— the freedom to have a moment to yourself, a moment to contemplate without interruption or intrusion.
How would you feel if you lived in a glass box on Madison Avenue with a team of shrinks assigned to study and second-guess your every move? What people don’t realize is that when you’re in cyberspace you are virtually living in a glass cage. A league of highly trained specialists— computer scientists, statisticians, social scientists, neuromarketers and hackers— is plugged into the global network of sensors, using sophisticated software and apps to scrutinize and monetize you and your life.
“Companies, emboldened by new capabilities, are eager to use the enormous data sets they’ve amassed to squeeze more money out of their present and future customers.” 12
Besides selling your digital dossier and using your behavior to train their machines, the Cyber-Sirens recycle your data, feeding it back to you as enticing bits of clickable bait. The whole cycle drones on to the beat of refine and repeat.
“Virtually everything Google does revolves around data collection and data personalization” 13
When Eric Schmidt said that using Google is the best way to evade the NSA, I nearly fell out of my chair.14 This statement seems particularly twisted when you consider that Google is bed-partners with the American foreign-policy establishment and a de facto arm of the military-intelligence complex.
“Nobody wants to acknowledge that Google has grown big and bad. But it has. Schmidt’s tenure as CEO saw Google integrate with the shadiest of US power structures as it expanded into a geographically invasive megacorporation…
Long before company founders Larry Page and Sergey Brin hired Schmidt in 2001, their initial research upon which Google was based had been partly funded by the Defense Advanced Research Projects Agency (DARPA). And even as Schmidt’s Google developed an image as the overly friendly giant of global tech, it was building a close relationship with the intelligence community…
In 2004, after taking over Keyhole, a mapping tech start-up co-funded by the National Geo-spatial-Intelligence Agency (NGA) and the CIA, Google developed the technology into Google Maps, an enterprise version of which it has since shopped to the Pentagon and associated federal and state agencies on multimillion-dollar contracts.
In 2008, Google helped launch an NGA spy satellite, the GeoEye-1, into space. Google shares the photographs from the satellite with the US military and intelligence communities…
Emails obtained in 2014 under Freedom of Information requests show Schmidt and his fellow Googler Sergey Brin corresponding on first-name terms with NSA chief General Keith Alexander about ESF (Enduring Security Framework)…
But most reports overlooked a crucial detail. “Your insights as a key member of the Defense Industrial Base,” Alexander wrote to Brin, “are valuable to ensure ESF’s efforts have measurable impact.”” 15
Google’s do-gooder facade is beginning to crack. It’s facing fines for breaking Dutch privacy laws and is under investigation for privacy violations by the European Union. It is also in the midst of a class-action suit in the UK for circumventing Safari’s default privacy settings. Yeah, we all need a sly old fox to guard the chicken coop!
I don’t know where he comes up with them, but Schmidt appears to have an endless supply of absurd one-liners. Speaking about privacy concerns, Schmidt famously declared that:
“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”16
His paternalistic reasoning sounds sensible, doesn’t it? Well, it might until you think about it, at which point you’ll probably realize that it disguises the issue and amounts to little more than a clever use of misdirection. You see, Eric’s statement is irrelevant. It doesn’t apply to anyone except the folks who are up to bad stuff— illegal or grossly perverted activities— in which case they would presumably be using the darknet.
What the Cyber-Sirens are interested in is you— your life and times! They want to learn your behavior, your idiosyncrasies, your activities and your preferences.
“Your life pattern is you. It’s what you do, with whom, and where. It’s the content that fills the vessel of your existence. A few decades ago this content was private, but also forgettable, a stream of experience that flowed into oblivion. It’s now less private and the stream flows to someone’s server.” 17
It is certainly easier to pattern you than it would be to invent the first intelligent machine. Computers aren’t intelligent in the way that you and I are; our intelligence hasn’t even been adequately defined by any of the mind sciences. The computer isn’t smart— it’s just a lightning-fast computational machine. It learns by simply recording billions of behavior signals, signals that it gleans from you and me.
Contextual Personalization and Privacy
The bubble blowing is being hyped up and trumpeted as a sort of ‘computer knows best’— an uppity concierge, a personal assistant that spies on everything you do. OK, that’s not exactly how it’s being billed. The idea is that contextually personalized filters will help you navigate the ‘internet of things’, the next big wave in the digital age! In some areas, like health care, transportation and entertainment, contextual filters could be a huge advantage. Yet without transparency, optional choices and safeguards, they are dangerous and prone to abuse.
The evangelists of what’s being called the Age of Context presume that we are willing to make a trade-off between privacy and convenience— to trade our privacy for personalization, for personalized services. The built-in assumption is that the majority of users want this kind of service and are willing to make the trade. An unbiased insider’s take on privacy suggests that the general consensus is that: “Privacy for ordinary people can be forfeited in the near term because it will become moot anyway.” 18
Normally when you make a trade, you’ve got some idea of the value of what you’re trading, as well as some control over, or input into, the terms. Privacy policies negate any sense of fair trade. They appear to be written to frustrate any attempt to read or understand them. Back in 2008, when the internet was significantly smaller, two researchers at Carnegie Mellon calculated just how much time it would take to read all the privacy policies you encounter in a year. They found that it would take each of us 76 working days.19
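The Carnegie Mellon estimate is just arithmetic, and it is worth seeing how quickly the hours pile up. A back-of-the-envelope sketch, using illustrative round numbers rather than the study’s exact inputs:

```python
# Back-of-the-envelope cost of reading every privacy policy you encounter.
# All inputs are illustrative round numbers (assumed), not the study's figures.
POLICIES_PER_YEAR = 1400   # distinct sites visited per year (assumed)
WORDS_PER_POLICY = 2500    # typical policy length in words (assumed)
READING_SPEED_WPM = 250    # average adult reading speed
WORKDAY_HOURS = 8

total_minutes = POLICIES_PER_YEAR * WORDS_PER_POLICY / READING_SPEED_WPM
work_days = total_minutes / 60 / WORKDAY_HOURS
print(f"{work_days:.0f} eight-hour work days a year")  # -> 29 eight-hour work days a year
```

Even with these conservative inputs the bill comes to roughly six working weeks of reading legalese, and the published figure, built on the study’s measured inputs, comes out even higher.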
Imagine what it would be now. One reason that privacy policies haven’t been codified into nationally accepted standards is that they provide an easy way for digital corporations to take advantage of their users. How hard would it be to model something like the Creative Commons licensing standards for privacy policies? If these policies were really meant to be two-way agreements, instead of underhanded impositions, then this kind of system would surely be far more workable.
So, we are trading our privacy, which is defined in a practically inaccessible manner, for supposedly free services! Besides the fact that you don’t know what your privacy refers to, you also don’t have a clue as to what it’s worth in financial terms. No one does! And if that wasn’t lopsided enough, companies like Google and Facebook can change the policy anytime they like.
The idea of a trade-off under these kinds of terms just doesn’t pan out. Rather than a trade, it looks a lot more like a legal hoodwink. If you tried to make this kind of trade deal with any real commodity, wouldn’t you get booted out of the office? We’ve been looking at privacy in stark and simplistic terms. Yet there are other elements that also come into play, like questions of ethics and hacking— the latter of which hangs over even the most secure digital enterprise like the sword of Damocles.
The Filter Bubble Wrap Up
The Filter Bubble isn’t a popular topic. We have enough unpleasantries to think about. Even so, ignoring the Filter Bubble won’t make it go away. The more you know about how the Cyber-Sirens operate, the more difficult it will be for them to take advantage of you.
It might seem like we have covered the Filter Bubble in great depth and detail, but we’ve barely scratched the surface. There is so much more to it than meets the eye.
- This is no exaggeration, fear-mongering or anything of the like— and it’s only going to get worse. Just take a look at the scathing, divisive vitriol splashing across the headlines in the popular press. At the time of writing this update, March 28th, 2019, Brenton Tarrant had live-streamed an armed assault that left 50 people dead at two mosques in New Zealand. As Murtaza Hussain writes in his article New Zealand Suspect’s Actions Are Logical Conclusion of Calling Immigrants “Invaders”, published on The Intercept— “There are no individuals in his worldview, just faceless masses of ‘us’ and ‘them.’ The latter group is to be kept at a distance at all costs. He approvingly cites the deterrent effect of killing their children.”
- Subliminal is a kaleidoscopic term with a remarkable range of meaning. This is even truer now that the digital realm has opened us up to a whole new world of subliminal media. Unfortunately, I’m not talking about the good stuff here— subliminal software or personal development technology. I’m talking about subliminal manipulation at the speed of light.
- Robert Epstein, The Unprecedented Power of Digital Platforms to Control Opinions and Votes, Digital Platforms and Concentration conference, Chapter 6, April 12, 2018
- Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books; reprint edition (April 24, 2012), p. 32
- Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books; reprint edition (April 24, 2012), p. 38
- Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books; reprint edition (April 24, 2012), p. 57
- Craig Timberg, Could Google tilt a close election?, The Washington Post (March 29, 2013)
- Jaron Lanier, Who Owns the Future?, Simon & Schuster; reprint edition (March 4, 2014), p. 131
- Rebecca MacKinnon, Consent of the Networked: The Worldwide Struggle for Internet Freedom, Basic Books; first trade paper edition (April 23, 2013), p. 57
- Jaron Lanier, Who Owns the Future?, Simon & Schuster; reprint edition (March 4, 2014), p. 131
- Robert Scheer, They Know Everything About You: How Data-Collecting Corporations and Snooping Government Agencies Are Destroying Democracy, Nation Books (Feb. 24, 2015), pp. 10–13
- Patrick Tucker, The Naked Future: What Happens in a World That Anticipates Your Every Move?, Current (March 6, 2014), p. 14
- Evgeny Morozov, Who pays for us to browse the web? Be wary of Google’s latest answer, The Guardian (Nov. 29, 2014)
- Steven Nelson, Schmidt: Google Now Best Way to Evade NSA, U.S. News (Dec. 12, 2014)
- Julian Assange, When Google Met WikiLeaks, OR Books (Sept. 18, 2014), pp. 37–41
- Astra Taylor, Schmidt: Google Now Best Way to Evade NSA, The Guardian (June 16, 2014)
- Patrick Tucker, The Naked Future: What Happens in a World That Anticipates Your Every Move?, Current (March 6, 2014), p. 70
- Jaron Lanier, Who Owns the Future?, Simon & Schuster; reprint edition (March 4, 2014), p. 50
- Alexis C. Madrigal, Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, The Atlantic (Mar. 1, 2012)