Algorithm Politics (ft. Crawford)

Crawford’s paper argues for the existence of agonistic pluralism – conflict and politics – within public algorithms, and examines the extent to which this affects our daily lives. She also discusses whether a political agenda has any place in algorithms, which are inflexible and predictable by nature, noting how some algorithms are “designed to produce clear ‘winners’ from information contests, often with little visibility or accountability for how those contests are designed” (Crawford 2015, p. 1). Here, Crawford refers to trends such as the bestselling books on Amazon, and how these rankings may not necessarily reflect the personal tastes and nuances of the individual buyer. Thus, Crawford wants her audience to understand that algorithms, while useful in practical situations like cataloguing or stocktaking, cannot necessarily function in a broader social context without human input.

Similarly, Kirkpatrick’s (2016, p. 1) article on computerised algorithms presents how personal bias can affect trends. In the digital age, these algorithms allow advertisers to cater directly to consumers on the basis of their Internet history, churning out content faster and more reliably than a person behind a computer can. In turn, however, these algorithms only produce results based on what we input into their programming, which often excludes minority groups (e.g. the disabled, people of colour, rural populations…) without access to such algorithms, leaving gaps of interest.

Likewise, categorising, Crawford observes, plays a fundamental part in algorithms, as shown in her example of a woman “buying some books written by authors of a conference she’s about to attend”, during which she wonders whether other buyers of a particular book would necessarily act or behave in the same way as her. In this case, the woman uses her perception to imagine an algorithm based on types of Amazon consumers, which is what Amazon does on a large scale by both “invoking and claiming to know a public with which we are invited to feel an affinity” (Crawford 2015, pp. 2-4). By playing to our biases, Amazon’s algorithm is agonistic towards items that do not fit into the woman’s current field of interest (Governing Algorithms), when on another day she might search for something completely different.

As such, Crawford concludes that algorithms do contain some agonistic pluralism in their programming, as machine learning algorithms in themselves “do not encompass all of the algorithms of interest” (Burrell 2016, p. 3). Thus, in order to fill these gaps, there is a need for human experience “to review and address how that data is presented to users, to ensure the proper context and application of that data”.

References

Burrell, J (2016): “How the machine ‘thinks’: Understanding opacity in machine learning algorithms”, Big Data and Society, v3n1, pp. 1-12.

Crawford, K (2015): “Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics”, Science, Technology and Human Values, v41n2, pp. 77-92.

Kirkpatrick, K (2016): “Battling algorithmic bias: how do we ensure algorithms treat us fairly?”, Communications of the ACM, v59n10, pp. 16-17.

