More Than Clicks & Cookies: Information on Algorithms

Algorithms. A term that many of us were unfamiliar with even just a handful of years ago, and now something that can dominate our day-to-day lives, or at minimum our engagement with media platforms. I had some awareness of the basic workings and influences of algorithms, but before this week I did not fully understand just how far their reach extends. Honestly, the first thing that usually pops into my head when thinking about algorithms is this scene with Ron Swanson from Parks & Rec:

Video courtesy of Peacock, retrieved from YouTube

Numbers Dive

I wanted to do a little digging, just to get a feel for how often an algorithm might impact someone on social media. This article reports that internet users spend around 2 hours and 23 minutes per day on social media on average, the same figure reported by this report from Statista. That level has held roughly steady since 2018, with estimates above 2 hours per day going back to 2016 (as a note, the Statista report does put U.S. users at 2:16, slightly below the average, while users in Brazil clock the most time, at 3:49!). Extrapolating that out, an average internet user spends about 870 hours on social media each year, which is about 36 days, or just over 5 weeks (basically, 10% of their time).
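If you want to check my extrapolation, it is just a chain of unit conversions. A minimal sketch (the 2 hours 23 minutes figure comes from the reports cited above):

```python
# Quick check of the annual social media time estimate.
minutes_per_day = 2 * 60 + 23            # 2h23m per day, per the cited reports

hours_per_year = minutes_per_day / 60 * 365
days_per_year = hours_per_year / 24
weeks_per_year = days_per_year / 7
share_of_year = days_per_year / 365

print(f"{hours_per_year:.0f} hours/year")   # ~870 hours
print(f"{days_per_year:.1f} days/year")     # ~36 days
print(f"{weeks_per_year:.1f} weeks/year")   # just over 5 weeks
print(f"{share_of_year:.0%} of the year")   # ~10%
```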


Image Source
Time spent on social media in the last two years (reported January 2024)

This is part of the estimated 6.4 hours per day that Statista says people spend online. And, when it comes to ads, this article from Forbes estimates Americans see between 4,000 and 10,000 ads per day, while this report from USC puts that number around 5,000. So, some fast math tells us that in those 6.4 hours spent online, the average person sees about 13 ads per minute (using the 5,000 figure). I would guess that number goes up on social media, a more ad-driven corner of the internet, but I cannot say for sure.
Image Source
Daily Worldwide Internet Usage, 2015-2023

Okay, but what about me? As someone whose social media use is admittedly more limited, how many ads will I see? As an experiment, I spent one minute each scrolling through Facebook and Instagram at what I would call my typical pace and tracked the number of explicit ads I saw (for both platforms, I used the ‘Sponsored’ indicator, so it’s possible other things I saw would also count as ads). In my minute of scrolling Facebook I saw 11 ads, while I found 9 on Instagram. So, even spending just an hour total per day on these platforms would expose me to 600 ads!
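Both the sourced estimate and my little experiment come down to the same division. A quick sketch, using only the numbers quoted above:

```python
# Ads per minute implied by the cited estimates.
ads_per_day = 5000                  # USC's estimate
minutes_online = 6.4 * 60           # Statista's 6.4 hours/day online
print(round(ads_per_day / minutes_online))   # ~13 ads per minute

# My one-minute scroll experiment, extrapolated to an hour a day.
facebook_ads, instagram_ads = 11, 9
ads_per_minute = (facebook_ads + instagram_ads) // 2   # average of the two minutes
print(ads_per_minute * 60)          # 600 ads in an hour of scrolling
```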

How Algorithms Can Impact Users

I realize that was a lot of math, but in addition to wanting a better feel for the numbers, I included it because that is how multiple sources describe algorithms: a lot of fancy math and data that shapes one’s experience online. Just how much calculating is done? Well, this Washington Post article cites 98 (yes…98…) data points that can be used to target ads on Facebook (which also owns Instagram). These range from basics like location, age, and gender to “Users who are ‘heavy’ buyers of beer, wine or spirits” and “Users who are ‘receptive’ to offers from companies offering online auto insurance, higher education or mortgages, and prepaid debit cards/satellite TV”. Those last two definitely give me the sense that users are being taken advantage of…especially when my own scrolling experiment served me ads for two separate mortgage companies, the University of Phoenix online, Geico, and State Farm (no idea what flagged me as a ‘receptive’ user, but here we are anyway). The same Post article gives the overall sense that Facebook is unique in the depth of its approach to tailoring ads, although the article was published in 2016, so it is possible that other companies have caught up since then.


Image Source
Different aspects of what is accounted for within an algorithm


The potential danger of algorithms is highlighted in this video/article from KQED, which discusses how curated content can lead users into a one-track mind. For example, if you only click on, like, or share news stories from one political viewpoint, websites may continue to feed you only news or pages with that slant. In other words, the time you spend online becomes more about reinforcing potential bias than providing a holistic picture. This can open users up to disinformation, or at a minimum remove them from any sort of multi-sided awareness. To use a personal, less extreme example: last week I clicked on a story on Twitter about how being a dividend investor can be a smart money strategy. After that one story, a variety of other profiles and articles proclaiming this investing strategy as superior suddenly started cropping up in my feed. Anyone familiar with investing knows this is only one of a myriad of strategies for handling money, but it felt like a single click opened up a rabbit hole that would have been very easy to fall down.
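The mechanics of that rabbit hole can be illustrated with a toy model (purely illustrative; no platform publishes its actual ranking logic, and the topics and numbers here are made up): imagine a feed that shows topics in proportion to a weight, and doubles a topic’s weight every time you engage with it. A handful of clicks is enough to crowd out everything else.

```python
# Toy feedback loop: the feed shows topics in proportion to their
# weights, and every click multiplies the clicked topic's weight.
weights = {"dividend investing": 1.0, "index funds": 1.0,
           "real estate": 1.0, "crypto": 1.0}

def feed_share(topic):
    """Fraction of the feed devoted to a topic under proportional ranking."""
    return weights[topic] / sum(weights.values())

print(f"{feed_share('dividend investing'):.0%}")  # 25% before any clicks

# One click doubles the weight; a few more engagements compound it.
for _ in range(4):
    weights["dividend investing"] *= 2.0

print(f"{feed_share('dividend investing'):.0%}")  # ~84% after four clicks
```

Real recommender systems are far more complicated than this, but the compounding dynamic (engagement raises ranking, which invites more engagement) is the same basic shape the KQED piece describes.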


Takeaways for the Classroom

Beyond personal use, as a teacher, I feel as though this has to be accounted for within education. Knowing that teens are likely spending even more time online, it feels like there is a responsibility to make sure they understand how algorithms work, and that every click, like, and share impacts what they see next in their digital journey. This video/aligned worksheet from PBS could be a great starting point for helping students understand how algorithms work and avoid ‘falling down the rabbit hole’ when they are active online. I think this would start a great conversation with students and could lead into a variety of classroom activities, from having students conduct a personal ‘rabbit hole’-style experiment to show just how easily they can fall into an algorithm’s pattern, to incorporating mini-lessons or reflections within a research unit to help students find objective information on both sides of a topic (already standard practice for most teachers, but incorporating social media or other online sites might make it more relevant to students). I also think a greater focus on media literacy, and teaching students to be critical consumers of information, will help them avoid the pitfalls of algorithms, since neither the amount of time we spend online nor the time and money tech companies spend on algorithms seems likely to go down any time soon.

Comments

  1. Empowering students by promoting and teaching media literacy is the only way we are going to change the narrative on social media. We need more critical consumers. Thanks for the detailed math behind algorithms. It was so interesting to read!

  2. I really appreciated your connection to the classroom. The term "rabbit hole" is one that even young learners are using to talk about how their free time is being spent. It is our job as educators to teach them about why this is happening, and what they need to be aware of when it does. I also think it helps students to be more aware of just how much time they are spending online. While not all of that time is necessarily bad, it is important for kids and adults to be aware of their time spent. Do you think those lessons would be appropriate at a young age? At what age do you think kids should learn about their personal data and algorithms?

    1. I do feel like younger ages might still need some introductory exposure to this. I am not 100% sure as to when brain development would dictate that students can realistically understand some of the more nuanced capabilities of algorithms, but I think that idea of making them critical and responsible consumers is an important skill to instruct across age levels. With how young children are being given personal devices these days (or how young they are when information about them is shared online by parents or other family members), and how accessible some might find the internet even at these young ages, I think teaching kids very early on would be a good thing. It almost feels like the digital equivalent of 'don't talk to strangers', and I know those are conversations parents have at very early ages.

  3. It's fascinating to see how algorithms, essentially fueled by complex math and data, shape our online experiences and perceptions. The sheer number of data points that can be used to target ads on platforms like Facebook is mind-boggling, ranging from basic demographics to more nuanced interests like beverage preferences or financial habits. It's concerning how this level of targeting can sometimes feel invasive, especially when ads seem to follow us around based on a single click or interaction. Your example of how a single click on a specific type of news story led to a flood of similar content highlights the potential for algorithms to create echo chambers and reinforce biases, rather than offering a diverse range of perspectives. It's a reminder of the importance of being critical consumers of online content and understanding the algorithms.

  1. I 100% agree; some of these things just made me feel uneasy. For example, if someone purchases alcohol regularly, or visits gambling websites regularly, and that becomes a data point feeding an algorithm that then prompts the user with ads...it feels like it could be feeding an addiction in certain cases. You mentioned reinforcing biases, and political leanings/political engagement are additional data points used within algorithms as well, yet these are exactly the things the sources I read describe as leading people into extreme corners of the internet, or, in a less extreme example, filling feeds with one-sided viewpoints. It raises the question: when apps proclaim they want to keep users informed, or want to be a place where people can connect, using algorithms this way seems to serve the opposite purpose.

  4. I don’t think I have noticed how many ads I get across my social media feeds unless they tick me off, such as diet ads, Temu, political misinformation, and ones with flashing lights due to a family history of seizures. I don’t like Temu due to its history of stealing individuals’ data and ripping off small artists’ designs.
    I have noticed that ad companies do not know who I am. I have gotten ads for everything from alcohol, cars, and gambling to things covered in glitter and bows and anything baby-related. Meanwhile, I’m a gender-non-conforming adult who likes stuffed animals and Legos and reads books about sociology and philosophy while taking the bus.
    I don’t think I have as many issues with algorithmic ad placement as others do, as I either tell companies not to track me or tell them what I do and do not like when it comes to ads. Sometimes they ignore me about specific brands, but those ads appear less often than before I said I did not like the product/brand.

