Algorithm and Newsfeed – A reflection on networked communication

This blog offers some food for thought on how social networking sites like Facebook and Instagram influence and control newsfeeds, and reflects on the implications for future digital platforms. These thoughts are not entirely my own and, as a firm believer in separating truth from fiction on social media, I have listed links to resources and research for those who would like to read further. It's also important to understand that ideas such as these are never black and white; there are rarely 'right' and 'wrong' answers in any conversation, but reflecting on the impacts of digital communication is important for the future evolution of digital spaces.

This blog is based on a newspaper article, originally published in the Wall Street Journal as part of a series called 'The Facebook Files' (and further discussed in our popular press), in which investigative journalism led to the accusation that Facebook kept quiet findings that Instagram can be harmful to young girls. It only touches the surface of the discussion, and it's only a snapshot of some very specific interactions with social media and its effects in particular circumstances. These blogs, though, are intended to be thought provoking and hopefully lead to constructive conversations about how social media is used and engaged with. I think that reading blogs gives everyone an opportunity both to share understandings and to read alternative points of view. Digital communication (as I will go on to illustrate) has the capacity to be damaging to some; it also gives people like 'us' an opportunity to 'create' media. This opportunity feels normal now, but before digital marketing it was a luxury afforded only to large media conglomerates, so it should be celebrated.

The Report

In 2021, the Wall Street Journal published an article accusing Facebook (which owns Instagram) of keeping quiet findings that show Instagram to be harmful to young girls. The upshot of the article was that Instagram, a predominantly image-sharing app, is largely used by young teens at a time when they are forming ideas about their self-identity. It was argued that the findings showed young teen girls used the platform for constant comparison with other users. Users with eating disorders and users with self-identity issues had their ideas and fears reinforced by the platform: a damaging cycle in which young girls were fed and re-fed online images, which ultimately impacted their mental health. Okay, so this is a very shallow and simple breakdown of the accusations and the research behind the article; it is enough, though, to reflect on how information is shared and ideas reinforced on social media. I should also clarify that Zuckerberg claims that negative body-image issues affect only a very small percentage of Instagram users.

The ‘Business’ of social media

So how do algorithms work? Exactly how they are structured is inevitably hidden behind the secrecy of the large networking conglomerates such as Facebook (now Meta) and Google, but they are a commercial advertising tool, and this is important. The objective of social networking businesses is to extract data from the likes of you and me to sell to advertisers. The more information we give and the more we engage with other users on the platform, the more opportunities advertisers have to connect with potential consumers, and the more data can be extracted. This is how 'the business' of social media works. In exchange for a 'free' social connecting service, we give our data away and we are fed very highly targeted advertising.

Algorithms and Filter Bubbles

So hey, that's okay, right? I don't mind selling my data for free social networking. I can chat to my friends, I can target other users through my business, and it's great; I love being able to share pictures of my stupid dog… so what's the problem? Let's go back to the young girl using Instagram with self-confidence issues. To keep us engaged, algorithms structure our newsfeeds: they give us personalised information they think we want to see. This keeps us engaged and interested. There is no point showing me images of fishing; I want to see other stupid dogs. What appears in newsfeeds is not random, nor is it chosen by us as users.

This is a very simplified illustration of how social media acts as a filter bubble. Algorithms are designed to give us what we want and, importantly, to exclude what we don't want. At face value this is great, but with such tailored information we eventually start seeing only information we agree with, connect with and identify with, to the exclusion of alternatives. If you are a child searching for images on Instagram that show the perfect life, the perfect body shape, the best way to lose weight, to look better, you will see more and more of these images and fewer of others. This same filter-bubble effect has impacts on other issues such as racism, hate speech and misinformation, where some ideas are filtered in to the exclusion of others, because the algorithms are working hard to 'give you what you want to see'. It's also important to reflect that as humans we have a bit of a psychological 'dark side': we do tend to gravitate to the weird, strange and 'interesting' stories. Conspiracy is far more exciting than the idea that nothing unusual is happening at all.
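The personalisation described above can be sketched as a toy ranking function. Everything here (the topic tags, the scoring rule, the example posts) is invented for illustration; no real platform's algorithm is this simple, and none of this is Instagram's actual code:

```python
# Toy sketch of engagement-based personalisation (hypothetical data,
# not any real platform's algorithm).

def score(post, user_interests):
    """Score a post by its overlap with topics the user engaged with before."""
    return len(post["topics"] & user_interests)

def build_feed(posts, user_interests):
    """Rank posts so the most 'relevant' appear first -- reinforcing
    whatever the user already clicks on, and burying everything else."""
    return sorted(posts, key=lambda p: score(p, user_interests), reverse=True)

posts = [
    {"id": 1, "topics": {"fitness", "diet"}},
    {"id": 2, "topics": {"dogs"}},
    {"id": 3, "topics": {"diet", "body image"}},
]

# A user who has previously engaged with diet and body-image content
# sees more of the same at the top of their feed.
feed = build_feed(posts, user_interests={"diet", "body image"})
print([p["id"] for p in feed])  # → [3, 1, 2]
```

Each refresh of a feed built this way narrows the bubble a little further: the posts shown first are the ones most likely to be clicked, and each click feeds back into `user_interests`.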

Algorithm and Ethics

In relation to the Wall Street Journal investigation, it was revealed that a reduction in engagement on Facebook and Instagram during the pandemic, as users left the platform for alternatives, led to Facebook changing the algorithm to prioritise what they call 'long-chain material': posts that generate lots of comments and re-shares. These posts would be pushed to the top of newsfeeds, encouraging users to engage, which would increase participation (and increase profit). What this also does, though, is inadvertently amplify angry voices. Posts that generate a lot of shares tend to be those that are the most divisive and argumentative. Perhaps this is not significant; we are all, of course, aware of information that is hateful or controversial, and we can just ignore it. On the other hand, consider a different situation: the genocide of the Rohingya Muslims in Myanmar. In the period leading up to the genocide, access to mainstream and global news was not available, and social media (Facebook in particular) was the main source of local information. There is an ongoing legal case in which Rohingya victims are suing Facebook for $150 billion, arguing that Facebook's negligence facilitated genocide after the social media network's algorithms amplified hate speech and the platform failed to take down inflammatory posts.
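The algorithm change described above, boosting posts that attract many comments and re-shares, can be illustrated with another toy re-ranking. The weights and numbers here are invented for the sake of the example, not Facebook's actual formula:

```python
# Toy illustration of ranking by comments and re-shares (invented
# weights and engagement numbers, not Facebook's actual formula).

def engagement_score(post, w_comment=2.0, w_reshare=3.0):
    """Weight comments and re-shares heavily, plain likes lightly."""
    return (w_comment * post["comments"]
            + w_reshare * post["reshares"]
            + post["likes"])

posts = [
    {"id": "cute-dog",  "likes": 500, "comments": 10,  "reshares": 5},
    {"id": "outrage",   "likes": 50,  "comments": 400, "reshares": 200},
    {"id": "news-item", "likes": 120, "comments": 40,  "reshares": 30},
]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])  # → ['outrage', 'cute-dog', 'news-item']
```

Even with a tenth of the likes, the divisive post wins, because arguments generate comments and re-shares, which is exactly the amplification effect the article describes.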

Accountability and Opportunity

Of course, these examples are far more complicated than I can explain in a blog, and there are many benefits to digital networked culture. It is, though, worth questioning the responsibility of businesses such as Meta and Google that have control over 'connecting at scale'. The algorithms behind these businesses are based on a model that incentivises profit. That is not wrong in itself, but with it come responsibilities, and it is equally important for society to understand the risks and to ask questions, not just about accountability but also about how this could be done better. If businesses that use algorithms had a duty of care in relation to how communication is shared, rather than being driven by profit alone, there would be an opportunity for better, safer and more trusted communication networks between communities and individuals.

Finally, and a Favour

I hope you found this interesting and food for thought. As I said at the start, these are not all my own ideas, and below is a list of resources and information if you want to read further.

Can I ask a favour? If you have enjoyed this blog, could you share it? I spend a lot of time researching and writing content that I hope is trusted, interesting and good quality. To date I have 43 subscribers to my blogs and would love to share thoughts and ideas more widely. I also encourage you to comment and let me know your opinion, just for fun. I run a business, but this isn't business marketing; media, communication and understanding the relationships we have with our fellow friends on local and global networks are what get me out of bed in the morning.

Of course, if you need engaging and relevant media production to utilise those algorithms to support your marketing strategy… get in touch:

Get in touch with Firetree Here. – Business Media Production

You can subscribe to my blogs in the footer area of the website: www.firetreevisual.com

Kyra.

Sources:

Really good books on algorithms and the filter bubble:

Bridle, J., 2017. Something is wrong on the internet. [Online]
Available at: https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2

Pariser, E., 2011. The Filter Bubble: What the Internet Is Hiding From You. s.l.: Penguin.

A scholarly article on Algorithms:

Cheney-Lippold, J., 2017. We Are Data: Algorithms and the Making of Our Digital Selves. New York: New York University Press.

The Wall Street Journal report (it's behind a paywall, but there is a podcast on it):

The Wall Street Journal, 2021. The Facebook Files: "We Make Body Image Issues Worse". New York: The Wall Street Journal.

Newspaper article on Facebook and algorithms:

Paul, K. & Milmo, D., 2021. Facebook putting profit before public good, says whistleblower Frances Haugen. The Guardian, 4 Oct.