The social construction of algorithms and the limits of algorithmic transparency

Heiner Heiland, Matthias Sommer
Proceedings of Fourth European Workshop on Algorithmic Fairness, PMLR 294:423-427, 2025.

Abstract

Algorithms increasingly govern social and economic processes, yet their mechanisms often remain opaque to the public. This paper explores the limits of algorithmic transparency by combining theoretical considerations with empirical findings from a study on platform-mediated food delivery work. While transparency is commonly proposed as a solution to mitigate algorithmic opacity and promote fairness, we argue that transparency alone is insufficient because it often fails to translate into practical understanding or meaningful agency for users. Technical transparency may be obstructed by trade secrecy, technical complexity, and the dynamic nature of machine learning systems. Moreover, even when transparency is achieved, users may continue to act on alternative theories about algorithmic functioning, as demonstrated in our study. This highlights that algorithmic governance is co-constructed through social processes, user interpretations, and technological affordances. We conclude that efforts to improve algorithmic accountability must move beyond technical transparency and address the social dynamics that shape algorithmic systems in practice.

Cite this Paper


BibTeX
@InProceedings{pmlr-v294-heiland25a,
  title     = {The social construction of algorithms and the limits of algorithmic transparency},
  author    = {Heiland, Heiner and Sommer, Matthias},
  booktitle = {Proceedings of Fourth European Workshop on Algorithmic Fairness},
  pages     = {423--427},
  year      = {2025},
  editor    = {Weerts, Hilde and Pechenizkiy, Mykola and Allhutter, Doris and CorrĂȘa, Ana Maria and Grote, Thomas and Liem, Cynthia},
  volume    = {294},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--02 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v294/main/assets/heiland25a/heiland25a.pdf},
  url       = {https://proceedings.mlr.press/v294/heiland25a.html},
  abstract  = {Algorithms increasingly govern social and economic processes, yet their mechanisms often remain opaque to the public. This paper explores the limits of algorithmic transparency by combining theoretical considerations with empirical findings from a study on platform-mediated food delivery work. While transparency is commonly proposed as a solution to mitigate algorithmic opacity and promote fairness, we argue that transparency alone is insufficient because it often fails to translate into practical understanding or meaningful agency for users. Technical transparency may be obstructed by trade secrecy, technical complexity, and the dynamic nature of machine learning systems. Moreover, even when transparency is achieved, users may continue to act on alternative theories about algorithmic functioning, as demonstrated in our study. This highlights that algorithmic governance is co-constructed through social processes, user interpretations, and technological affordances. We conclude that efforts to improve algorithmic accountability must move beyond technical transparency and address the social dynamics that shape algorithmic systems in practice.}
}
Endnote
%0 Conference Paper
%T The social construction of algorithms and the limits of algorithmic transparency
%A Heiner Heiland
%A Matthias Sommer
%B Proceedings of Fourth European Workshop on Algorithmic Fairness
%C Proceedings of Machine Learning Research
%D 2025
%E Hilde Weerts
%E Mykola Pechenizkiy
%E Doris Allhutter
%E Ana Maria CorrĂȘa
%E Thomas Grote
%E Cynthia Liem
%F pmlr-v294-heiland25a
%I PMLR
%P 423--427
%U https://proceedings.mlr.press/v294/heiland25a.html
%V 294
%X Algorithms increasingly govern social and economic processes, yet their mechanisms often remain opaque to the public. This paper explores the limits of algorithmic transparency by combining theoretical considerations with empirical findings from a study on platform-mediated food delivery work. While transparency is commonly proposed as a solution to mitigate algorithmic opacity and promote fairness, we argue that transparency alone is insufficient because it often fails to translate into practical understanding or meaningful agency for users. Technical transparency may be obstructed by trade secrecy, technical complexity, and the dynamic nature of machine learning systems. Moreover, even when transparency is achieved, users may continue to act on alternative theories about algorithmic functioning, as demonstrated in our study. This highlights that algorithmic governance is co-constructed through social processes, user interpretations, and technological affordances. We conclude that efforts to improve algorithmic accountability must move beyond technical transparency and address the social dynamics that shape algorithmic systems in practice.
APA
Heiland, H. & Sommer, M. (2025). The social construction of algorithms and the limits of algorithmic transparency. Proceedings of Fourth European Workshop on Algorithmic Fairness, in Proceedings of Machine Learning Research 294:423-427. Available from https://proceedings.mlr.press/v294/heiland25a.html.