More Than Clicks & Cookies: Information on Algorithms
Algorithms. A term that many of us were not familiar with even just a handful of years ago, and now something that can dominate our day-to-day lives, or at minimum our engagement with media platforms. I had some awareness of the basic workings and influences of algorithms, but before this week I did not fully understand just how far their reach extended. Honestly, the first thing that usually popped into my head when thinking about algorithms was this Ron Swanson scene from the show Parks & Rec:
Numbers Dive
I wanted to do a little digging, just to get a feel for how often an algorithm might impact someone on social media. This article reports that internet users spend around 2 hours and 23 minutes per day on social media on average, the same figure given in this report from Statista. That is roughly the level seen since 2018, with estimates above 2 hours per day going back to 2016 (as a note, the Statista report does put U.S. users at 2:16, slightly below average, while users in Brazil clock the highest amount, at 3:49!). Extrapolated out, the average internet user spends about 870 hours on social media each year, which is about 36 days, or just over 5 weeks (basically, 10% of the year).
Time spent on social media in the last two years, reported January 2024 (Image Source)
Daily worldwide internet usage, 2015-2023 (Image Source)
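For anyone who wants to check my work, the extrapolation above is just a back-of-the-envelope calculation, starting from the 2:23 daily average reported in the sources:

```python
# Back-of-the-envelope: extrapolate the reported daily average
# (2 hours 23 minutes) out to a full year.
DAILY_MINUTES = 2 * 60 + 23              # 143 minutes per day

yearly_minutes = DAILY_MINUTES * 365
yearly_hours = yearly_minutes / 60
yearly_days = yearly_hours / 24

print(f"{yearly_hours:.0f} hours per year")    # ~870 hours
print(f"{yearly_days:.1f} days per year")      # ~36.2 days
print(f"{yearly_days / 365:.1%} of the year")  # ~9.9%
```

So a single click or like per scroll session adds up fast: the average user hands an algorithm roughly a tenth of their year.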
How Algorithms Can Impact Users
I realize that was a lot of math, but in addition to wanting a better feel for the numbers, I included it because that is how multiple sources described algorithms: a lot of fancy math and data that shapes one’s experience online. Just how much calculating is done? Well, this Washington Post article cites 98 (yes…98…) data points that can be used to target ads on Facebook (which also owns Instagram). These vary from basics like location, age, and gender to “Users who are ‘heavy’ buyers of beer, wine or spirits” and “Users who are ‘receptive’ to offers from companies offering online auto insurance, higher education or mortgages, and prepaid debit cards/satellite TV”. Those last two definitely give me a sense of taking advantage of users…especially when my own scroll experiment brought me ads for two separate mortgage companies, the University of Phoenix online, Geico, and State Farm (no idea what flagged me as a ‘receptive’ user, but here we are anyway). The same Post article gives the overall sense that Facebook is unique in the depth of its approach to tailoring ads, although the article was published in 2016, so it is possible that other companies have caught up since then.
Different aspects of what is accounted for within an algorithm (Image Source)
The potential danger of algorithms is highlighted in this video/article from KQED, which discusses how curated content can lead users into a one-track mindset. For example, if you only click on, like, or share news stories with one type of political viewpoint, websites may continue to feed you only news or pages with that slant. In other words, the time you spend online becomes more about reinforcing potential bias than providing a holistic picture. This can open users up to disinformation, or at a minimum, removes any sort of multi-sided awareness. To use a personal, less extreme example, I can say for certain that last week I clicked on a story on Twitter about how being a dividend investor can be a smart money strategy. After that one story, my feed was suddenly full of other profiles and articles proclaiming this investing strategy as superior. Anybody familiar with investing knows this is only one of a myriad of strategies for handling money, but it felt like a single click opened up a rabbit hole that would have been very easy to follow.
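To make that feedback loop concrete, here is a toy sketch of the dynamic. The topics, scores, and boost factor are all invented for illustration; no platform's actual ranking code looks like this, but the reinforcement pattern is the same one the KQED piece describes:

```python
# Toy feedback loop: each click boosts a topic's score, and the "feed"
# is simply the topics sorted by score. Purely illustrative numbers.
scores = {"dividend investing": 1.0, "index funds": 1.0, "politics": 1.0}

def click(topic):
    """Engagement boosts that topic's future ranking."""
    scores[topic] *= 1.5

def feed():
    """Return topics ordered by score, highest first."""
    return sorted(scores, key=scores.get, reverse=True)

click("dividend investing")   # one click on a story...
click("dividend investing")   # ...then one follow-up click
print(feed())                 # dividend investing now tops the feed
```

Every click widens the gap, which is exactly why a single tap on one story can snowball into a feed full of the same viewpoint.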
Empowering students by promoting and teaching media literacy is the only way we are going to change the narrative on social media. We need more critical consumers. Thanks for the detailed math behind algorithms. It was so interesting to read!
I really appreciated your connection to the classroom. The term "rabbit hole" is one that even young learners are using to talk about how their free time is being spent. It is our job as educators to teach them about why this is happening, and what they need to be aware of when it does. I also think it helps students to be more aware of just how much time they are spending online. While not all of that time is necessarily bad, it is important for kids and adults to be aware of their time spent. Do you think those lessons would be appropriate at a young age? At what age do you think kids should learn about their personal data and algorithms?
I do feel like younger ages might still need some introductory exposure to this. I am not 100% sure as to when brain development would dictate that students can realistically understand some of the more nuanced capabilities of algorithms, but I think that idea of making them critical and responsible consumers is an important skill to instruct across age levels. With how young children are being given personal devices these days (or how young they are when information about them is shared online by parents or other family members), and how accessible some might find the internet even at these young ages, I think teaching kids very early on would be a good thing. It almost feels like the digital equivalent of 'don't talk to strangers', and I know those are conversations parents have at very early ages.
It's fascinating to see how algorithms, essentially fueled by complex math and data, shape our online experiences and perceptions. The sheer number of data points that can be used to target ads on platforms like Facebook is mind-boggling, ranging from basic demographics to more nuanced interests like beverage preferences or financial habits. It's concerning how this level of targeting can sometimes feel invasive, especially when ads seem to follow us around based on a single click or interaction. Your example of how a single click on a specific type of news story led to a flood of similar content highlights the potential for algorithms to create echo chambers and reinforce biases, rather than offering a diverse range of perspectives. It's a reminder of the importance of being critical consumers of online content and understanding the algorithms that shape it.
I 100% agree; some of these things just made me feel uneasy. For example, if someone purchases alcohol regularly, or visits gambling websites regularly, and that is a data point used to feed an algorithm that then prompts the user with ads...it feels like it could be feeding an addiction in certain cases. You mentioned reinforcing biases, and political leanings and engagement in politics are additional data points used within algorithms as well, yet these are the types of things the sources I read describe as leading people into extreme corners of the internet or, in a less extreme example, fueling feeds with one-sided viewpoints. It raises the question: when apps proclaim they want to keep users informed, or to be a place where people can connect, using algorithms in this way seems to serve the opposite purpose.
I don’t think I notice how many ads cross my social media feeds unless they tick me off, such as diet ads, Temu, political misinformation, and ones with flashing lights (due to a family history of seizures). I don’t like Temu due to its history of stealing individuals’ data and ripping off small artists’ designs.
I have noticed that ad companies do not know who I am. I have gotten ads ranging from alcohol, cars, and gambling to things covered in glitter and bows and anything babies. Meanwhile, I’m a gender-nonconforming adult who likes stuffed animals and Legos while reading books about sociology and philosophy on the bus.
I don’t think I have as many issues with algorithmic ad placement as others, since I either tell companies not to track me or tell them what I do and do not like when it comes to ads. Sometimes they ignore me about specific brands, but those ads appear less often than before I said I did not like the product or brand.