soha

Week 5 Reading Response

The introduction to Safiya Umoja Noble's book strongly resonates with the final project I did for my Political Media course last quarter on the YouTube algorithm. As she points out, instances of sexism, racism, and so on are usually justified as glitches, as errors of an algorithm that otherwise works fine 99.99% of the time. YouTube, which is owned by Google, has always handled its scandals in this fashion: the offending search result or suggestion was an isolated mistake, and otherwise the algorithm works fine. They offer another explanation when similar inappropriate content surfaces repeatedly, for instance a white-supremacist video spreading an Islamophobic message, or, in another famous case involving Google's search engine that Tarleton Gillespie notes in his book Custodians of the Internet, when the auto-complete feature repeatedly offered Islamophobic suggestions to users. In such cases, YouTube (Google) argues that its algorithm is neutral and merely mirrors what users post; that is, as MacCormick explains in another assigned article, it simply matches and ranks. If the search engine suggests such inappropriate search phrases, that is because many other people have searched for them.
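To make the "match and rank" idea concrete, here is a minimal sketch, assuming a toy three-page corpus and a single popularity number per page (my own invented stand-in for the many signals a real engine uses, nothing like Google's actual index): the engine first keeps the pages that contain the query words, then orders them by how much other users have already engaged with them.

    # Toy "match and rank" search, purely for illustration.
    pages = {
        "page_a": {"text": "cooking recipes for pasta", "popularity": 120},
        "page_b": {"text": "pasta history and culture", "popularity": 45},
        "page_c": {"text": "bicycle repair guide", "popularity": 300},
    }

    def search(query):
        words = query.lower().split()
        # Match: keep pages whose text contains every query word.
        matched = [name for name, page in pages.items()
                   if all(w in page["text"].split() for w in words)]
        # Rank: order the matches by how much other users have engaged with them.
        return sorted(matched, key=lambda name: pages[name]["popularity"], reverse=True)

    print(search("pasta"))  # -> ['page_a', 'page_b']

Even in this toy version, the "neutral mirror" defense is visible: the ranking step is driven entirely by what other people have already done.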


However, this justification is flawed for several reasons. It does not take into account that the popularity of those phrases is partly due to the search engine itself, a vicious circle. It is also possible to exploit algorithms that rely on human input. More importantly, we should not forget that the goal of these search engines is not necessarily to lead the user to the most desired result, but to make a profit, that is, to sell advertising. In the case of YouTube, Zeynep Tufekci argues that since YouTube's goal is to show as many ads as possible, its aim is to keep the viewer in front of the screen as long as possible. Tufekci contends that to do so, its algorithm is tailored toward radical, controversial content, because that is what grabs attention. The problem is that, under Section 230 of the Communications Decency Act, these platforms are not legally responsible for the content their algorithms offer: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Yet this law was written in 1996 with ISPs in mind. It is true that search engines do not publish the content they offer; what they do publish is their algorithm, for which they should be held accountable.
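The vicious circle can be sketched as a small simulation; the phrases, counts, and growth rates below are entirely made up for illustration and stand in for no real data. Once a phrase edges ahead, it gets suggested, the suggestion generates more searches for it, and those searches lock in its lead.

    # Hypothetical feedback loop between suggestion and popularity.
    counts = {"neutral phrase": 100, "offensive phrase": 105}  # nearly equal at the start

    def top_suggestion(counts):
        # The autocomplete box simply shows the most-searched phrase.
        return max(counts, key=counts.get)

    for day in range(30):
        suggested = top_suggestion(counts)
        counts[suggested] += 10        # users who accept the suggestion add to its count
        counts["neutral phrase"] += 1  # organic interest in the other phrase grows slowly

    print(top_suggestion(counts), counts)
    # After a month the initially tiny gap has widened sharply:
    # the ranking is not a passive mirror of what people search.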


Looking at HarassMap, I cannot help thinking about an Iranian application that was released in 2016. Gasht-e-Ershad (Guidance Patrol) is a branch of Iran’s police that patrols the streets of major cities to make sure women are following the regime’s dress code for hijab. They park their vans at different spots and surveil pedestrians. Women whose hijab does not meet the state’s strict criteria are detained and put in the patrol van. This patrol has become a major concern for women in Iran, because they never know whether they will run into it during the day. Gershad is an app that, using its users’ reports, crowdsources the patrols’ locations and shows them on a map so that women can make a detour and circumvent them. Such a collective reply, mirroring the state’s gaze back at it, brings to mind the idea of sousveillance mentioned in Dark Matters. Since my own project is based on its users’ data and works with maps against the regime, I was thinking about how such an application could be exploited by the regime’s “soft war officers,” and yesterday I came across an article on a performance that abused the crowdsourcing behind Google Maps’ live traffic feature. The artist cleverly fooled the algorithm into thinking there was heavy traffic on a street, when in fact it was he himself walking up and down the street for hours with a small wagon full of smartphones, each running Google Maps.
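The trick behind that performance can be sketched roughly as follows; the segment name, the minimum device count, and the speed cutoff are my own assumptions for illustration, not Google's actual pipeline, which is far more elaborate. A crowdsourced traffic layer only sees anonymous (road segment, speed) reports, so dozens of phones crawling along one street at walking pace are indistinguishable from a traffic jam.

    # Rough sketch of a crowdsourced congestion estimate (hypothetical thresholds).
    def congestion_level(reports, segment):
        """reports: list of (segment_id, speed_kmh) tuples sent anonymously by phones."""
        speeds = [speed for seg, speed in reports if seg == segment]
        if len(speeds) < 5:                 # too few devices to estimate anything
            return "no data"
        average = sum(speeds) / len(speeds)
        return "heavy traffic" if average < 10 else "free flow"

    # 99 smartphones in a hand-pulled wagon, all reporting walking speed on one street:
    wagon_reports = [("some_street_segment", 4) for _ in range(99)]
    print(congestion_level(wagon_reports, "some_street_segment"))  # -> 'heavy traffic'

The same logic cuts both ways for an app like Gershad: a system that trusts anonymous user reports can be flooded with fabricated ones just as easily.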

