The Filter Bubble Rebooted is the first episode of Subliminal Insights, a video series dedicated to making your world a little less subliminal and a little more conscious.
Most people don’t realize that subliminal is a kaleidoscopic term with a remarkable range. This is even truer now that the digital age has introduced new forms of subliminal media.
Unfortunately, I’m not talking about the good stuff here, subliminal software or personal development technology. I’m talking about the new and improved forms of subliminal manipulation, the subliminal influences that are invisibly altering our minds and perceptions on an unprecedented scale.
The Filter Bubble may not appear to be subliminal at first glance, but it is. It’s subliminal because of three main factors:
- its high level of complexity
- its covert implementation
- the neuromarketing1 tactics used to hype it up
You wouldn’t notice the filter bubble unless it’s pointed out, and even then it’s hard to pin down.
If not for the outstanding work of Eli Pariser, the chair of MoveOn.org and co-founder of Upworthy.com, the Filter Bubble would have remained a nameless phantom. He coined the term back in 2011 in his investigative exposé, The Filter Bubble: What the Internet Is Hiding from You.
Pariser recognized the fact that:
“The algorithms that orchestrate our ads are starting to orchestrate our lives.
The basic code at the heart of the new Internet is pretty simple. The new generation of Internet filters looks at the things you seem to like—the actual things you’ve done, or the things people like you like—and tries to extrapolate.
They are prediction engines, constantly creating and refining a theory of who you are and what you’ll do and want next. Together, these engines create a unique universe of information for each of us—what I’ve come to call a filter bubble—which fundamentally alters the way we encounter ideas and information.”2
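To make Pariser’s description concrete, here is a minimal sketch of a “prediction engine” in Python. Everything in it, from the topic tags to the scoring rule to the item names, is a hypothetical illustration rather than any real filter’s algorithm: it simply ranks candidate content by how much it overlaps with the topics you have already clicked on, then serves you more of the same.

```python
from collections import Counter

def personalize(click_history, candidate_items, top_n=2):
    """Toy 'prediction engine': rank candidates by overlap with
    the topics the user has already clicked on."""
    # Build a crude profile: how often each topic appears in the history.
    profile = Counter(topic for item in click_history for topic in item["topics"])

    def score(item):
        # An unseen topic contributes 0, so novel content sinks to the bottom.
        return sum(profile[t] for t in item["topics"])

    return sorted(candidate_items, key=score, reverse=True)[:top_n]

# Hypothetical example data
history = [{"topics": ["politics", "usa"]}, {"topics": ["politics", "media"]}]
candidates = [
    {"title": "Partisan Op-Ed",  "topics": ["politics", "usa"]},
    {"title": "Science Feature", "topics": ["biology"]},
    {"title": "Media Critique",  "topics": ["media"]},
]
print([i["title"] for i in personalize(history, candidates)])
# → ['Partisan Op-Ed', 'Media Critique']
```

Note what never surfaces: the one item about a topic you haven’t clicked before. That is the extrapolation Pariser describes, reduced to a counting rule.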
Pariser’s book offers a comprehensive and compelling picture of how online personalization turned into a pervasive Filter Bubble of influence. He discusses the technology that’s driving it, provides insightful anecdotes, highlights some benefits, and raises a number of troubling concerns and some potentially disastrous consequences.
Filter Bubbled Decisions & Influence
“Ultimately, the filter bubble can affect your ability to choose how you want to live… When you enter a filter bubble, you’re letting the companies that construct it choose which options you’re aware of.
You may think you’re the captain of your own destiny, but personalization can lead you down a road to a kind of informational determinism in which what you’ve clicked on in the past determines what you see next—a Web history you’re doomed to repeat. You can get stuck in a static, ever narrowing version of yourself—an endless you-loop.”3
Your decisions are the building blocks of your future. They determine the kind of life you’ll live and influence the lives of your children and loved ones. Every decision you make paves the path of your destiny.
By limiting your exposure to a wide range of information, the filter bubble narrows your options and inhibits your ability to make the best decisions, well-informed decisions.
The scary thing is that this manufactured myopia affects everyone, including those of us in positions of power, positions that have influence over the lives of others and the structure of social order. The decisions of judges, legislators, and even prime ministers and presidents are now being made inside a Filter Bubble.
In a 2014 New York Times article about voting laws, Richard L. Hasen, an election law expert at the University of California, said that:
“I don’t believe that any of these judges are deciding cases because they want to help… their party win the election,”…
“Rather I think people who are more conservative based especially on what they hear in conservative media are more disposed to believe voter fraud is a major problem…”4
The extreme polarization of public opinion in the United States, and the political stalemate crippling the country, may not have been caused by the filter bubble, but they are certainly exacerbated by it.
One study, conducted by Robert Epstein, a Harvard-trained psychologist and former editor in chief of Psychology Today, found that democratic elections can be swung by a margin of 15 percent simply by biasing the filtering of search results!
“For his experiment… Epstein attempted to shape the perceptions of a random sampling of potential voters in California. The test involved an election most of the subjects knew little about: a close-fought campaign for prime minister of Australia in 2010. The researchers secretly altered the rankings of search results to help favored candidates.
After 15 minutes of searching and reading linked articles, it was clear that the manipulation had worked, with about 65 percent of subjects favoring the candidate getting elevated rankings, compared with 50 percent among a control group that saw impartial search results… Three out of four subjects, meanwhile, reported no awareness that the search rankings had been altered.”5
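The mechanics behind this effect are easy to sketch. Search users click top results far more often than lower ones (the well-documented position bias), so reordering alone shifts exposure. The sketch below uses a simple 1/rank decay as the click-probability model; the weights and ranks are illustrative assumptions, not figures from Epstein’s experiment.

```python
# Position-bias sketch: assume click probability decays as 1/rank.
# These weights are an illustrative model, not data from the study.
weights = [1 / r for r in range(1, 11)]  # ranks 1..10
total = sum(weights)

def attention_share(candidate_ranks):
    """Fraction of expected clicks going to one candidate's pages,
    given which result slots those pages occupy."""
    return sum(weights[r - 1] for r in candidate_ranks) / total

biased   = attention_share([1, 2, 3, 4, 5])   # candidate holds the top 5 slots
shuffled = attention_share([1, 3, 5, 7, 9])   # pages evenly interleaved
print(f"top-ranked: {biased:.0%}, interleaved: {shuffled:.0%}")
```

Under this toy model, simply moving a candidate’s five pages from interleaved slots to the top five slots raises their share of expected clicks from roughly 61% to roughly 78%, without adding or removing a single page.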
Pariser opened the door and put the Filter Bubble into a well-defined context, but it’s a vast issue, and many questions remain unanswered. No one book could be expected to cover it in the full depth and detail that it demands.
It involves a staggering array of interlinked sciences. The Filter Bubble stands at the nexus between computer science and the sciences of the mind, between humanity and the machine.
You see, its tentacles stretch through an elaborate maze of convoluted, complex and critically important areas. A virtual labyrinth of perplexing subjects.
My aim is not to merely rehash the material covered by Pariser in his book, The Filter Bubble, but to build on it and reinvigorate research, discussion and the call for greater transparency. Right now, the Filter Bubble is more relevant than ever because we’re being hit by another major wave of Filter Bubble’isation.
No matter what you call it, ‘Cloud Computing,’ the ‘Age of Context’ or the ‘Internet of Things,’ the next wave of hyper-personalization is upon us.
The Filter Bubble Band: Subliminal Songs of the Cyber-Sirens
Google and Facebook are the stars of the Filter Bubble show. While Facebook has recently shown a modicum of responsiveness to widespread privacy concerns and transparency, Google has battened down the hatches and renewed its assault.
Even so, the rush towards an ever greater personalization of the web is being driven by a whole host of digital corporations and government agencies. Jaron Lanier identifies this invisible league of virtual corpor’nations as Siren Servers.
“Siren Servers gather data from the network, often without having to pay for it. The data is analyzed using the most powerful available computers, run by the very best available technical people. The results of the analysis are kept secret, but are used to manipulate the rest of the world to advantage”.6
I prefer to call them Cyber-Sirens, but the spirit of the term remains the same as Lanier intended. You may recall that the Sirens are the spell-binding women of Homer’s Odyssey. It was the songs of the Sirens that lured sailors to their deaths, crashing their ships upon the craggy shores of the Sirens’ shallows.
The Sirens were apparently much more than simply alluring women. They fascinated beyond measure, compelling the weather-worn sailors of legend to forgo their senses and meet their doom.
Let’s face it, technology is awesome. It’s the 21st century form of magic. Dazzled by technology with evangelists chanting their creed, who wouldn’t be compelled by the song of the Cyber-Sirens?
Instead of rocky shores, our blind faith in the technological vision of Cyber-Sirens is leading us to extreme bias, unbalanced division and perhaps even technological servitude. At the very least “our excitement with new technologies lulls us into accepting risks that we do not see or understand.”7
In his book Who Owns the Future, Lanier warns us that: “Sirens might be even more dangerous in inorganic form, because it is then that we are really most looking at ourselves in disguise. It is not the siren who harms the sailor, but the sailor’s inability to think straight. So it is with us and our machines.”8
We really are ‘looking at ourselves in disguise’, because the Filter Bubble generates a feedback loop of perception and belief! Here’s how it works: the Cyber-Sirens sweep up everything that you do and everything you show interest in. They use this data to compile a continuously updated digital persona of you, of your preferences and interests. They then feed you media and content that you’ll presumably agree with and like.
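That feedback loop, the “you-loop” Pariser warns about, can be simulated in a few lines. This is a deliberately crude toy model, and the topics, scores, and reinforcement rule are all assumptions of mine: the filter serves whatever it currently believes you like best, and every click confirms its belief.

```python
def you_loop(initial_interests, rounds=5, boost=1.0):
    """Toy feedback loop: each round the filter serves the topic it
    believes you like most; your click reinforces that belief."""
    profile = dict(initial_interests)
    served = []
    for _ in range(rounds):
        top = max(profile, key=profile.get)  # the filter's best guess
        served.append(top)
        profile[top] += boost                # your click feeds the model
    return served, profile

# A tiny initial bias toward one topic (1.1 vs 1.0) is all it takes.
served, profile = you_loop({"politics": 1.1, "science": 1.0, "art": 1.0})
print(served)
# → ['politics', 'politics', 'politics', 'politics', 'politics']
```

The point of the sketch: even a minuscule starting bias locks the loop onto a single topic, and nothing else is ever served again. That is the “static, ever narrowing version of yourself” in miniature.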
The Privacy Trade Charade
“The totality of societal observation over the individual is the defining antithesis of freedom, even when that observation is gained through hidden and subtle persuasion…
We have been overwhelmed with the illusion that surveillance and freedom are compatible. That is because the culture of the Internet, driven by its core economic model, has succeeded in equating privacy with anonymity…
Even now in the early phase of mapping our minds, this access to our thoughts already exceeds the powers of the most invasive Big Brother government that Orwell imagined.
At the command of Internet-driven signals, people everywhere in the world have been willing to abandon the concerns and safeguards of privacy, developed painstakingly throughout human history, for the convenience of plucking that perfect item off a virtual shelf and paying for it without looking up from their devices.” 9
Have you noticed the coy way that Silicon Valley appropriates words that sound good and then twists the meaning? Take personalization for instance. Who doesn’t like personal service? It sounds great, but what Google means by personalization isn’t like having personalized service.
Google is notorious for its rude ‘take it or leave it’ attitude and its glaring lack of customer service or support. So, personalization is simply used as a pleasant-sounding euphemism, a palatable mask for a Social Profiling feedback loop!
Privacy takes on a completely different shade of meaning in the digital landscape. Privacy isn’t just one of those times when you don’t have to worry about someone barging in on you. It is really about personal space and the freedom to have a moment to yourself, a moment to contemplate without interruption or intrusion.
Ultimately, the surveillance of your activities, thoughts, and desires is what feeds the Filter Bubble Money Machine. Whenever you access the internet, use your phone or credit cards, take a walk or meet up with friends, you’re being watched, recorded and studied.
How would you feel if you lived in a glass box on Madison Avenue with a team of shrinks assigned to study and second-guess your every move? What people don’t realize is that when you’re in cyberspace, you are virtually living in a glass cage.
A league of highly trained specialists (computer scientists, statisticians, social scientists, neuromarketers, and hackers) is plugged into the global network of sensors. They use sophisticated software programs and apps to scrutinize and monetize you and your life.
“Companies, emboldened by new capabilities, are eager to use the enormous data sets they’ve amassed to squeeze more money out of their present and future customers.” 10
Besides selling your digital dossier and using your behavior to train their machines, the Cyber-Sirens recycle your data, feeding it back to you as enticing bits of clickable bait. The whole cycle drones on to the beat of refine and repeat.
“Virtually everything Google does revolves around data collection and data personalization”11
When Eric Schmidt said that using Google is the best way to evade the NSA, I nearly fell out of my chair.12 This statement seems particularly twisted when you consider that Google is in bed with the American foreign-affairs establishment and is a de facto arm of the military-intelligence complex.
“Nobody wants to acknowledge that Google has grown big and bad. But it has. Schmidt’s tenure as CEO saw Google integrate with the shadiest of US power structures as it expanded into a geographically invasive megacorporation…
Long before company founders Larry Page and Sergey Brin hired Schmidt in 2001, their initial research upon which Google was based had been partly funded by the Defense Advanced Research Projects Agency (DARPA). And even as Schmidt’s Google developed an image as the overly friendly giant of global tech, it was building a close relationship with the intelligence community…
In 2004, after taking over Keyhole, a mapping tech start-up co-funded by the National Geo-spatial-Intelligence Agency (NGA) and the CIA, Google developed the technology into Google Maps, an enterprise version of which it has since shopped to the Pentagon and associated federal and state agencies on multimillion-dollar contracts.
In 2008, Google helped launch an NGA spy satellite, the GeoEye-1, into space. Google shares the photographs from the satellite with the US military and intelligence communities…
Emails obtained in 2014 under Freedom of Information requests show Schmidt and his fellow Googler Sergey Brin corresponding on first-name terms with NSA chief General Keith Alexander about ESF (Enduring Security Framework)…
But most reports over-looked a crucial detail. “Your insights as a key member of the Defense Industrial Base,” Alexander wrote to Brin, “are valuable to ensure ESF’s efforts have measurable impact.”” 13
Google’s do-gooder facade is beginning to crack. It’s facing fines for breaking Dutch privacy laws and is under investigation for privacy violations by the European Union. It is also in the midst of a class action suit in the UK for circumventing Safari’s default privacy settings. Yeah, we all need a sly old fox to guard the chicken coop!
I don’t know where he comes up with them, but Schmidt appears to have an endless supply of absurd one-liners. Speaking about privacy concerns, Schmidt famously declared that:
“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” 14
His paternalistic reasoning sounds sensible, doesn’t it? Well, it might until you think about it. At which point you’ll probably realize that it disguises the issue and amounts to little more than a clever use of misdirection.
You see, Eric’s statement is irrelevant. It doesn’t relate to anyone except the folks who are up to bad stuff, illegal or grossly perverted activities, in which case they would presumably be using the darknet anyway.
What the Cyber-Sirens are interested in is you, your life! They want to learn from and about your behavior, your idiosyncrasies, your activities and your preferences.
“Your life pattern is you. It’s what you do, with whom, and where. It’s the content that fills the vessel of your existence. A few decades ago this content was private, but also forgettable, a stream of experience that flowed into oblivion. It’s now less private and the stream flows to someone’s server.” 15
It is certainly easier to pattern you than it would be to invent the first intelligent machine.
Computers aren’t intelligent in the way that you and I are. Our intelligence hasn’t even been adequately defined by any of the mind sciences. The computer isn’t smart; it’s just a lightning-fast computational machine. It learns by simply recording billions of behavior signals, signals that it gathers from you and me.
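That kind of “learning by counting” can be sketched as a toy next-action predictor. The log and the action names below are hypothetical; the point is that the machine’s “intelligence” is nothing but frequency tallies over recorded behavior.

```python
from collections import defaultdict, Counter

def train(action_log):
    """'Learning' as counting: tally which action tends to follow which."""
    model = defaultdict(Counter)
    for prev, nxt in zip(action_log, action_log[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, last_action):
    """Predict the next action as the most frequently observed follower."""
    followers = model.get(last_action)
    return followers.most_common(1)[0][0] if followers else None

# A hypothetical behavior log, the only 'teacher' this machine has.
log = ["open_app", "scroll", "click_ad",
       "open_app", "scroll", "click_ad",
       "open_app", "scroll", "share"]
model = train(log)
print(predict(model, "scroll"))
# → click_ad
```

No reasoning, no understanding, just a lookup table built from what you did last time. Scale that to billions of signals and you have the substance of most behavioral prediction.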
Contextual Personalization and Privacy
The bubble blowing is being hyped up and trumpeted as a sort of ‘computer knows best’, an uppity concierge, a personal assistant that spies on everything that you do.
OK, that’s not exactly how it’s being billed. The idea is that contextually personalized filters will help you navigate the ‘internet of things’, the next big wave in the digital age! In some areas, like health care, transportation, and entertainment, contextual filters could be a huge advantage. Yet, without transparency, optional choices and safeguards, they are dangerous and prone to abuse.
The evangelists of what’s being called the Age of Context presume that we are willing to make a trade-off between privacy and convenience, to trade our privacy for personalization, for personalized services. The built-in assumption is that the majority of us users want this kind of service and are willing to make the trade.
An unbiased insider take on privacy suggests that the general consensus is that: “Privacy for ordinary people can be forfeited in the near term because it will become moot anyway.” 16
Normally when you make a trade, you’ve got some idea of the value of what you’re trading, as well as some control over, or input into, the terms. Privacy policies negate any sense of fair trade. They appear to be written to frustrate any attempt to read or understand them.
Back in 2008, when the internet was significantly smaller, two researchers at Carnegie Mellon calculated just how much time it would take to read all the privacy policies that you encounter. They found that it would take each of us 76 working days. 17
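The arithmetic behind that kind of estimate is easy to reproduce. The inputs below (policies encountered per year, minutes per policy) are illustrative assumptions chosen to show the shape of the calculation, not the study’s exact figures.

```python
# Back-of-envelope version of the privacy-policy reading-time estimate.
# All inputs are illustrative assumptions, not the study's exact figures.
policies_per_year = 1462      # distinct privacy policies encountered yearly
minutes_per_policy = 25       # time to actually read one policy
hours_per_workday = 8

total_hours = policies_per_year * minutes_per_policy / 60
workdays = total_hours / hours_per_workday
print(f"≈ {workdays:.0f} working days per year")
# → ≈ 76 working days per year
```

Whatever the precise inputs, the order of magnitude is the point: reading the fine print you nominally agree to would consume months of full-time work every year.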
Imagine what it would be now. One reason that privacy policies haven’t been codified into nationally accepted standards is that they provide an easy way for digital corporations to take advantage of their users.
How hard would it be to model something like the Creative Commons Licensing Standards for privacy policies? If privacy policies were really meant to be two-way agreements, instead of underhanded impositions, then this kind of system would surely be much more workable.
So, we are trading our privacy, which is defined in a practically inaccessible manner, for supposedly free services! Besides the fact that you don’t know what your privacy refers to, you also don’t have a clue as to what it’s worth in financial terms. No one does! And if that weren’t lopsided enough, companies like Google and Facebook can change the policy anytime they like.
If you tried to make this kind of trade deal with any kind of real commodity, wouldn’t you get booted out of the office?
The idea of a trade-off under these kinds of terms doesn’t pan out. Rather than a trade, it looks a lot more like a legal hoodwink.
We’ve been looking at Privacy in stark and simplistic terms. Yet, there are other elements that also come into play, like questions of ethics and hacking, the latter of which hangs over even the most secure digital enterprise like the sword of Damocles.
The Filter Bubble Wrap Up
The Filter Bubble isn’t a popular topic. We have enough unpleasantries to think about. Even so, ignoring the Filter Bubble won’t make it go away. The more that you know about how the Cyber-Sirens operate, the more difficult it will be for them to take advantage of you.
It might seem like we have covered the Filter Bubble in great depth and detail, but we’ve barely scratched the surface. There is so much more to it than meets the eye.
References and Resources
1) Neuromarketing uses in-depth cognitive psychology, social psychology and the science of persuasion to manipulate our desires and influence our decisions. In essence and function, neuromarketing is a re-branded form of subliminal marketing.
2) Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books; Reprint edition (April 24, 2012), 32
3) Eli Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Penguin Books; Reprint edition (April 24, 2012), 57
4) Trip Gabriel, Court Decisions on Voting Rules Sow Confusion in State Races, The New York Times (Oct. 7, 2014)
5) Craig Timberg, Could Google tilt a close election?, The Washington Post (March 29, 2013)
6) Jaron Lanier, Who Owns the Future, Simon & Schuster; Reprint edition (March 4, 2014), 131
7) Rebecca MacKinnon, Consent of the Networked: The Worldwide Struggle For Internet Freedom, Basic Books; First Trade Paper Edition edition (April 23, 2013), 57
8) Jaron Lanier, Who Owns the Future, Simon & Schuster; Reprint edition (March 4, 2014), 131
9) Robert Scheer, They Know Everything About You: How Data-Collecting Corporations and Snooping Government Agencies Are Destroying Democracy, Nation Books; Feb. 24, 2015, 10-13
10) Patrick Tucker, The Naked Future: What Happens in a World That Anticipates Your Every Move?, Current; March 6, 2014, 14
11) Evgeny Morozov, Who pays for us to browse the web? Be wary of Google’s latest answer, The Guardian (Nov. 29, 2014)
12) Steven Nelson, Schmidt: Google Now Best Way to Evade NSA, U.S.News (Dec. 12, 2014)
13) Julian Assange, When Google Met WikiLeaks, OR Books; Sept. 18, 2014, 37-41
14) Astra Taylor, Schmidt: Google Now Best Way to Evade NSA, The Guardian (June 16, 2014)
15) Patrick Tucker, The Naked Future: What Happens in a World That Anticipates Your Every Move?, Current; March 6, 2014, 70
16) Jaron Lanier, Who Owns the Future, Simon & Schuster; Reprint edition (March 4, 2014), 50
17) Alexis C. Madrigal, Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, The Atlantic (Mar. 1, 2012)