{"667604":{"#nid":"667604","#data":{"type":"news","title":"The Algorithm and the Damage Done","body":[{"value":"\u003Cp\u003EAlgorithms might appear harmless, but some of them are far from it. They gather information and make calculations, but whether they do so in a neutral manner is a debated issue.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe harmful effects of an algorithm can range from labeling and categorizing someone into a box that inaccurately depicts who they really are, to altering one\u2019s future because of the way they answered a question on an exam or job application.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn some cases,\u0026nbsp;\u003Ca href=\u0022https:\/\/www.youtube.com\/watch?v=tv92WWqUQyA\u0022\u003Ealgorithms can reinforce systems that are unjust or oppressive\u003C\/a\u003E, argues Georgia Tech researcher and School of Interactive Computing Ph.D. candidate\u0026nbsp;\u003Ca href=\u0022https:\/\/twitter.com\/UpolEhsan\/status\/1537112310505824256\u0022\u003EUpol Ehsan\u003C\/a\u003E\u0026nbsp;in his paper,\u0026nbsp;\u003Cem\u003EThe Algorithmic Imprint,\u003C\/em\u003E\u0026nbsp;which was presented at the\u0026nbsp;\u003Ca href=\u0022https:\/\/facctconference.org\/\u0022\u003E2022 Association for Computing Machinery\u2019s FAccT Conference\u003C\/a\u003E.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EIn 2020, Ehsan saw a news report about protests occurring in the United Kingdom. Students across the U.K. spoke out against the results of the General Certificate of Education (GCE) A-level examinations, which had been graded by an algorithm for the first time. The A-levels are the final exams taken before university in the U.K. and have a major impact on whether students can attend their desired institutions.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe Office of Qualifications and Examinations Regulation (Ofqual), the GCE exam governing body in the U.K., said the COVID-19 pandemic had made it necessary to pivot from manual grading to using an algorithm. 
Students protested the change, arguing the algorithm was biased against people from poorer economic backgrounds.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EOfqual removed the algorithm from its grading, but that didn\u2019t solve the problem. Ehsan and his colleagues argue the effects of Ofqual\u2019s algorithm lingered long after its removal. The situation is one example of how algorithms can leave hard imprints on the societies in which they are deployed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cMost students we interviewed for the paper had their grades improved,\u201d Ehsan said. \u201cBut they were still angry. That\u2019s when I started thinking, \u2018Why are people still angry even if their results aren\u2019t bad?\u2019\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EAnd why was the U.K. the only country to receive any media coverage when the same exams are administered in more than 160 countries, including Ehsan\u2019s home country of Bangladesh?\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cThis is not a U.K. issue,\u201d he said. \u201cThis is a global issue. If we don\u2019t share people\u2019s stories, their narratives get erased from the historical narrative. If we didn\u2019t bring this up, largely speaking, the Bangladesh narrative would\u2019ve never been captured as the catastrophe that it was.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cProtests were also going on elsewhere. It\u2019s just that they were never covered. These kids were protesting just as much as the U.K. 
kids.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe fact that many of the other countries that use the GCEs are members of the Commonwealth \u2014 meaning they were once occupied by the British Empire \u2014 wasn\u2019t lost on Ehsan.\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe voices in the Global South weren\u2019t being heard after suffering the effects of an algorithm designed by the Global North.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cRight now, the current way we evaluate an algorithm\u2019s impact is from the birth to the death of the algorithm, from its deployment to its destruction,\u201d Ehsan said. \u201cWhen an algorithm is deployed, we do an impact assessment. When it is no longer deployed, we stop it, and that\u2019s where we think this is the end.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cAnd that is the fundamental flaw in our thinking. Even when the algorithm was taken out, its harms persisted.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EThe imprints can be even more damaging when they mimic or reinforce modern and historical systems of discrimination and oppression. Ehsan argues that was the case in the Commonwealth nations that also use the GCEs, where the exams were already considered unfair and biased before the algorithm was deployed.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cIt feels like I\u2019m paying money to my ex-colonizer for a piece of certificate that tells the world I am no dumber than a local UK kid,\u201d said one student whom Ehsan interviewed during his research. \u201cSometimes it\u2019s hard to ignore that reality.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EEhsan compared the effects of colonialism to trying to erase pencil markings from a piece of paper. Even after the eraser has been used, the traces of the pencil markings are still visible.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cOne of the things you see in postcolonialism is that there are remnants of the British infrastructure that we still live with today,\u201d Ehsan said. 
\u201cJust because colonizers leave, does not mean colonialism has left. Just because an algorithm has left, doesn\u2019t mean its impact has left.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EEhsan said the goal of his paper is to raise awareness of the imprints that algorithms can leave so that developers consider an algorithm\u2019s potential impacts before deploying it.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cOne of the things I wanted to do was drive policy changes,\u201d he said. \u201cI didn\u2019t want this to be a research project just to have a research project. I had a moral reason behind it. I was driven by it.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003EEhsan also said he would like to see developers, researchers, and practitioners design algorithms so that their impacts can be controlled and mitigated. If an algorithm harms a group of people, he said, reparations should be considered to atone for the damage.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cAlgorithms leave imprints that are often immutable,\u201d he said. 
\u201cJust because they are made of software, it doesn\u2019t mean there\u2019s an undo button there.\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u201cWe need people to understand that algorithms have consequences that outlive their own existence, and if that doesn\u2019t bring us into a more mindful, ethical way of thinking about deployments, I\u2019d be very sad.\u201d\u003C\/p\u003E\r\n\r\n\u003Cp\u003E\u003Cem\u003EThe Algorithmic Imprint\u003C\/em\u003E was co-authored with Ranjit Singh and Jacob Metcalf from the\u0026nbsp;\u003Ca href=\u0022https:\/\/datasociety.net\/\u0022\u003EData \u0026amp; Society Research Institute\u003C\/a\u003E\u0026nbsp;and Professor Mark Riedl from the School of Interactive Computing.\u003C\/p\u003E\r\n","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EA researcher from the School of Interactive Computing finds that the imprints of a biased or flawed algorithm can be even more damaging when they mimic or reinforce modern and historical systems of discrimination and oppression.\u003C\/p\u003E\r\n","format":"limited_html"}],"field_summary_sentence":[{"value":"A Georgia Tech researcher is working to shine a light on the potential harm algorithms can inflict, even after they are no longer in use."}],"uid":"32045","created_gmt":"2023-05-02 14:29:36","changed_gmt":"2023-05-02 14:31:56","author":"Ben Snedeker","boilerplate_text":"","field_publication":"","field_article_url":"","dateline":{"date":"2023-05-02T00:00:00-04:00","iso_date":"2023-05-02T00:00:00-04:00","tz":"America\/New_York"},"extras":[],"hg_media":{"670711":{"id":"670711","type":"image","title":"Upol Ehsan (1).jpeg","body":null,"created":"1683037795","gmt_created":"2023-05-02 14:29:55","changed":"1683037795","gmt_changed":"2023-05-02 14:29:55","alt":"Georgia Tech Ph.D. candidate 
Upol Ehsan presenting his work, The Algorithmic Imprint","file":{"fid":"253623","name":"Upol Ehsan (1).jpeg","image_path":"\/sites\/default\/files\/2023\/05\/02\/Upol%20Ehsan%20%281%29.jpeg","image_full_path":"http:\/\/tlwarc.hg.gatech.edu\/\/sites\/default\/files\/2023\/05\/02\/Upol%20Ehsan%20%281%29.jpeg","mime":"image\/jpeg","size":48049,"path_740":"http:\/\/tlwarc.hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/2023\/05\/02\/Upol%20Ehsan%20%281%29.jpeg?itok=by6NNqSe"}}},"media_ids":["670711"],"groups":[{"id":"50876","name":"School of Interactive Computing"}],"categories":[{"id":"8862","name":"Student Research"},{"id":"153","name":"Computer Science\/Information Technology and Security"}],"keywords":[],"core_research_areas":[{"id":"39501","name":"People and Technology"}],"news_room_topics":[],"event_categories":[],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[{"value":"\u003Cp\u003ENathan Deen Communications Officer I School of Interactive Computing nathan.deen@cc.gatech.edu\u003C\/p\u003E\r\n","format":"limited_html"}],"email":["nathan.deen@cc.gatech.edu"],"slides":[],"orientation":[],"userdata":""}}}