I think if you take all these filters, if you take all of these algorithms, you get what I call a filter bubble.
And your filter bubble is kind of your own personal unique universe of information that you live in online.
And what’s in your filter bubble depends on who you are and what you do. But the thing is that you don’t decide what gets in, and more importantly you don’t actually see what gets edited out.
Eli Pariser discussing the filter bubble created by online algorithms.
Pariser later described the filter bubble as containing two parts:
Well, in the talk and even more in the book there's two pieces. One is the partisan echo chamber challenge, and the other is: do people get exposed to content about topics that are in the public sphere at all, or is it Miley Cyrus and cats all the way down?
I’m concerned with both of these, but I’m more concerned with being exposed to different ideas, with escaping the echo chamber. It's worth noting that I find Twitter to be more of an echo chamber than Facebook. I use Facebook to communicate with my friends and family; Twitter I use largely to follow and communicate with colleagues and people in my industry. On Twitter, I'm exposed to less content that I disagree with (and to less content that is new and interesting). That this is the case on Twitter is a bit scary, since it is arguably a situation of my own making, and one that I'm trying to correct.
Fascinatingly, there is some research suggesting a physical analogue to the filter bubble: we're increasingly living in mono-neighborhoods with people who share our outlook. This, along with Pariser's filter bubble, seems to impair our ability to listen to those whose opinions differ from our own. Ultimately, this lack of practice in the habit of compromise means our politics is increasingly polarized.