The Filter Bubble and how it fits into ELA B10

Outcomes:

CR B10.1 – Comprehend and respond to a variety of visual, oral, print, and multimedia texts that address:
• identity (e.g., Diversity of Being);
• social responsibility (e.g., Degrees of Responsibility); and
• social action (agency) (e.g., Justice and Fairness).

CC B10.2 – Create and present a visual or multimedia presentation supporting a prepared talk on a researched issue, using either digital or other presentation tools.

Course Theme

The World Around and Within Us

Digital Citizenship Continuum Skills:

Weigh the value of online “filter bubbles,” their impact on search results, and their implications for society.

Resources:

Eli Pariser’s Ted Talk about Filter Bubbles- https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?referrer=playlist-how_to_pop_our_filter_bubbles&autoplay=true

Wall Street Journal “Blue Feed, Red Feed”- https://graphics.wsj.com/blue-feed-red-feed/

Ahmadi, Shaherzab. “Lesson Plan: Filter Bubbles.” https://www.dwrl.utexas.edu/2018/01/02/lesson-plan-filter-bubbles/

The Filter Bubble:

“Filter Bubbles” have been a hot topic in digital citizenship for over a decade.

In 2011, internet activist Eli Pariser introduced the concept of the “filter bubble” in his book The Filter Bubble: What the Internet Is Hiding From You. The same year his book was released, Pariser also delivered a TED Talk on the topic, which has since been viewed millions of times.

Pariser was one of the first people to discuss how the algorithms used to recommend content on websites like YouTube and Facebook can damage relationships between people with different political perspectives. He believed that as the internet became more personalised to individual users, information was increasingly being hidden from them, so users no longer saw everything they needed to be fully informed on a given topic. To put it simply, Pariser argued the internet was showing users what they “want to see instead of what we need to see” (Pariser, 2011).

In the decade-plus since Pariser first spoke about the filter bubble, algorithmically recommended content has become even more pervasive and powerful. In this lesson, students are asked to investigate the impact filter bubbles have on them and how those bubbles shape the way they view the world.

This topic fits well with the course theme of “The World Around and Within Us.” The apps that students use, and the algorithms and artificial intelligence these apps rely on to predict what they will want to see, can affect the way students understand the world, so it is important that students gain an understanding of how their online activity can shape their worldviews. Students should also be empowered to engage with their own filter bubbles, thinking about how they can reshape them and take control of the content they see.

Lesson Overview:

Ahmadi recommends one class period for this lesson, including a final project, to introduce the concept of a filter bubble and to allow students to begin demonstrating their skills in the area.

I would recommend a few changes to the lesson to ensure student understanding.

  1. Begin the lesson with Pariser’s TED Talk about “filter bubbles.” Before playing the video, make it clear to students that it comes from 2011, so much has changed since then, but the concept Pariser discusses still holds power today.
  2. After watching the video, discuss the concept with the class and introduce how we can see filter bubbles around us today. For example, students can conduct Google searches on different topics and compare the results they get. As they compare results, ask them what may be influencing the differences they are seeing around the room. They should also start to think about the broader question of how seeing different results on search engines like Google can alter one’s worldview and actions in the world.
  3. Students will then look through the “Blue Feed, Red Feed” service from the Wall Street Journal. This website presents an example of what a Facebook feed can look like for a conservative user and for a liberal user. One drawback is that the service is no longer updated and has been archived since 2019. As students investigate the posts, they should look for examples of the different tones conveyed by the verbs and adjectives used in the headlines. As they read, they should consider the intention behind these word choices and the potential impacts on their audiences.
  4. After investigating the feeds, have students discuss the common patterns they found in the use of language across the conservative and liberal feeds. Ask them about the similarities in language use and how these could impact the worldviews of audiences. Also, have them think about what might influence how these feeds are formulated.
  5. Have students reflect on their own social media bubbles. What kinds of websites and viewpoints dominate? What kinds of people (ethnic background, class, gender, etc.)? How can these representations of the world impact their worldviews?
  6. As a final assessment, students should create an advertisement that tries to reach across the divide and appeal to someone in a different filter bubble. It can be for any type of product or service, but it should be aimed at someone from a different background than their own. This should help students expand their view of others and practise thinking from a different perspective. Ahmadi recommends using Canva to create the advertisements. This service allows students to make a wide array of visual projects and provides templates that can help them get started. Canva also now offers an AI feature called Magic Design that allows students to refine their work even further.

Other Resources that can be used for this topic:

“How Algorithms Spread Human Bias” by Corey Patrick White

  • A TED Talk from 2021 about how algorithms and their recommendations can have widespread impacts on society and can help spread dangerous ideas such as racism. This topic can be quite heavy for students, but it is still important for them to understand the potential impacts of these ideas.

“How TikTok Reads Your Mind” by Ben Smith

  • A New York Times article from 2021 that explains how TikTok’s algorithm has made it one of the most successful apps of the last few years.

References

Ahmadi, S. (2018, Jan. 2). Lesson Plan: Filter Bubbles. Digital Writing and Research Lab. https://www.dwrl.utexas.edu/2018/01/02/lesson-plan-filter-bubbles/.

Couros, A., & Hildebrandt, K. (2015). Digital Citizenship Education in Saskatchewan Schools. Saskatchewan Ministry of Education.

Keegan, J. (2019, Aug. 19). Blue Feed, Red Feed. The Wall Street Journal. https://graphics.wsj.com/blue-feed-red-feed/.

Pariser, E. (2011, March). Beware Online “Filter Bubbles.” TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?referrer=playlist-how_to_pop_our_filter_bubbles&autoplay=true.

Saskatchewan Ministry of Education. (2011). Saskatchewan Curriculum: English Language Arts 10. https://curriculum.gov.sk.ca/CurriculumHome?id=37.

Smith, B. (2021, Dec. 5). How TikTok Reads Your Mind. The New York Times. https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html.

White, C. (2021, May). How Algorithms Spread Human Bias. TED. https://www.ted.com/talks/corey_patrick_white_how_algorithms_spread_human_bias.
