
Reading Response for Week 9



This week’s readings center on the notion of “algorithmic culture.” In “Algorithms, clickworkers, and the befuddled fury around Facebook Trends,” Gillespie makes three main arguments: first, algorithms are not neutral; second, Facebook’s problem lies in its attempt to clickwork the news; third, trends are not the same as news. Gillespie elaborates on some of these arguments in another article, “Facebook Trending: It’s made of people!!,” and calls for social media platforms to take responsibility for their enormous power, because they constitute public life rather than merely represent it.

Personally, I agree with the first point, that algorithms are not neutral. Algorithms are a combination of human activity and computational analysis. As long as programming an algorithm requires value-laden human judgments, which is always the case when designers want an algorithm to handle complicated tasks, its supposed mathematical objectivity cannot be guaranteed. We also discussed the neutrality of algorithms in a previous seminar, when we raised the problem of data. Data and algorithms are inseparable: the Nine Algorithms reading already told us that programming an algorithm often involves the use of data. In other words, not only do criteria imbued with human values play an important role in the programming process, but human-categorized data also constitute an integral part of the algorithm. Faced with this unsolvable problem, Gillespie points to the significance of human endeavor: human effort can be organized to work against biases, and that is exactly what journalism tries to do. Facebook, according to Gillespie, instead treats the work of human curators as “an information processing problem, not an editorial one.” Moreover, as Gillespie argues, trends are not news – “they are expressions of interest” – but Facebook wants them to be news, and users often take them as news. This is where Gillespie introduces the idea that Facebook should not offer its users a biased representation of reality through Trends. It is a matter of human endeavor, and of responsibility.


However, I do not really believe that these monopolistic companies will value responsibility more than political and economic gain. I may be pessimistic, but Solon and Levin also point out how Google search has been manipulated by right-wing groups. Quite apart from the problem of neutrality, algorithms can only classify data; they cannot make sense of it, which leaves them open to human manipulation. Once people understand how a particular algorithm works, they can trick it and spread misinformation. As for Facebook, trending topics are in fact personalized, which means that someone who already receives a large amount of misinformation may receive even more in the future. What makes things worse is that, to date, few Internet users have questioned the reliability or neutrality of these big social media platforms and search engines.

In “Algorithmic Culture,” Striphas first defines the term as the delegation of the work of culture, including the sorting, classifying, and hierarchizing of people, places, objects, and ideas, to computational processes, and worries about “the gradual abandonment of culture’s publicness and the emergence of a strange new breed of elite culture purporting to be its opposite.” Striphas believes that the authoritative power of culture has now been transferred to algorithms, which take on the task of “reassembling the social.” However, the algorithms used by the main social media platforms and search engines are never transparent, which means that the former work of culture, making the decisions that sort, classify, and hierarchize people, places, objects, and ideas, is now carried out privately by algorithms. Many people still believe in the authority of algorithmic culture without realizing that it is becoming the culture of an elite, the people in charge of the algorithms. In fact, even though we always say that our greater access to information in this digital era makes us better informed and freer than people in the past, isn’t it true that the algorithms we rely on also contribute to the mystification of power, much as power was mystified in the past?
