Literacy Now

Teaching With Tech

Don't Click Here: Facebook, Algorithms, and Articles You Won't Be Shown

By W. Ian O'Byrne
 | Nov 18, 2016

How much do you trust what you read online and in social networks? It is likely that much of the digital text you obtain as you read, search, and sift through the internet has been handpicked especially for you. This is because, without your knowing it, you are in a filter bubble, which could limit your—and your students'—worldview and the connections you make online. Although we live in a connected world where the Web affords unprecedented learning opportunities, makes information plentiful, and puts experts at our fingertips, there can be pitfalls along the way.

Living in a filter bubble

Before you even start searching online, websites and search engines have already compiled information about you based on things like your previous search history. Every search shows you the output of a computer-based process called an algorithm. The websites and social networks you use for research, news gathering, and watching cat videos increasingly use algorithms to filter results and make them more personalized. Search engines, news sites, and social networks provide you with what they think you want to see, not with the broad selection of what is out there.

When you search online, search algorithms anticipate what you are looking for to provide results more quickly. However, as these algorithms record which texts you read, you are given an incomplete idea of what is happening in the world around you. When people gravitate to ideas that are familiar and that align with their existing perspectives, they develop a confirmation bias—a "tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities." This is increasingly problematic given the recent rise in fake news and disinformation sites online.

An example from Facebook

If you have ever wondered why you have thousands of Facebook friends but see the same few friends in your feed, it's because Facebook's algorithms have determined from your interaction history that these are the friends whose posts you want to see. If you don't see anything at all from other people, the algorithms have likewise determined you don't want to see posts from those people. Facebook is just one example of the many ways in which algorithms try to determine what you want to see and tailor information to you.
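To make the idea concrete, here is a minimal sketch—hypothetical data, not Facebook's actual algorithm—of how a feed might be ranked by how often you have interacted with each author. Friends you rarely engage with sink to the bottom and effectively disappear from view.

```python
# Hypothetical interaction counts: times you liked or commented on
# each friend's posts. (Illustrative names and numbers only.)
interactions = {
    "Alice": 42,
    "Bob": 3,
    "Carmen": 27,
    "Dev": 0,
}

# Hypothetical posts waiting to be shown in your feed.
posts = [
    ("Bob", "Vacation photos"),
    ("Alice", "New article on literacy"),
    ("Dev", "Local news link"),
    ("Carmen", "Classroom tips"),
]

# Rank posts by how often you've engaged with the author: the more you
# interact with someone, the higher their posts appear in your feed.
ranked = sorted(posts, key=lambda post: interactions.get(post[0], 0),
                reverse=True)

for author, text in ranked:
    print(author, "-", text)
```

Even in this toy version, Dev's post lands last because you have never engaged with Dev—exactly the kind of feedback loop that keeps some friends invisible.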

While on the subject of Facebook, let's dig a bit deeper. You'll have to log in to Facebook to see your ad preferences. Ad preferences are compiled from your posts, likes, and collection of friends on the network. Facebook also tracks you as you move and search elsewhere online: those little buttons you see on other websites asking you to "like" a page on Facebook track you as you move across the Web. You don't even have to be logged in for the social network to keep a record of your history, and Facebook is not the only company doing this.

While you're on your Facebook ad preferences page, click through the different tabs to see what information the social network has about you; this information is used to sell targeted advertising. Facebook even makes its own determination of your political views based on your activity. Under the Interests tab, click on the Lifestyle and Culture tab. In this section, you might find a box titled US Politics (or the appropriate political association for wherever you live); advertisers and political campaigns use this information to send you news, ads, and posts containing their message.

What can you do?

For the most part, there isn't much you can do to break free from this filter bubble other than removing yourself from social media entirely. Many sites increasingly use personalized search tools to shape your feed of information. Within Facebook, Twitter, or other social networks, you can address some of this by showing (or teaching) the algorithm that you want a diverse set of opinions and information—follow individuals or groups that have perspectives different from your own. You can also use a non-tracking search engine such as DuckDuckGo, or routinely use Incognito mode on Google Chrome or the equivalent private mode in other browsers. Finally, there are browser extensions designed to protect your privacy and stop others from tracking you online.

I think the best defense against filter bubbles is simply awareness. Recognize that filter bubbles exist and that they create a very real echo chamber that influences your potential for literacy and learning. You should also discuss this with your students and investigate ways for them to actively interact with individuals or groups whose perspectives differ from their own. You can start this dialogue by watching Eli Pariser describe the dangers of filter bubbles. From there, it falls to each individual to actively resist simply trusting that what they are shown online is the full story. Everyone needs to understand the dangers of confirmation bias and the filter bubble as they become thoughtful, critically aware, literate individuals.

W. Ian O’Byrne, an ILA member since 2007, is an assistant professor of literacy education at the College of Charleston in South Carolina. His research examines the literacy practices of individuals as they read, write, and communicate in online spaces. He blogs at wiobyrne.com. You can subscribe to his newsletter or podcast to stay up to date on literacy, education, and technology.

This article is part of a series from the International Literacy Association’s Technology in Literacy Education Special Interest Group (TILE-SIG).
