Let's Hack On
by David Haselberger
The "free" business model prospers although it betrays basic human rights to privacy. Social media - the capitalist brainchild of the World Wide Web (that was itself brought into existence due to technical needs at CERN at the time; the democratic notion arguably came later) - is a time sink, erodes shared public space, and hollows out democracy.
Large Language Models (LLMs) devour acres and acres of natural habitat for their simulation of answers, and, however tastefully the claim is repeated in the treadmills of techno-lobbyists via personalized ads, the use of technology alone has never eo ipso solved social problems. This is not painting the devil on the wall; it is simply the state of affairs. I am not saying technology is bad - it is not. But tech design and its use have effects, and those can be unforeseen and devastating.
I once organized a workshop on computer use in school with ten-year-old pupils. First we talked about the joy of sending photos to friends (sure, they all had the devices and software) and the possibility of doing this globally. We acted out sending a photo across the ocean, with our tables as countries and the space between them representing water. We highlighted that the data packets making up the picture travel large distances between networked computers in milliseconds - wow, technology! When I introduced the teachers' computer as the server storing their pictures forever, even if they delete them, jaws dropped.
In another workshop organized at a school, a group of 16-year-old girls worked out a dating algorithm inspired by Christian Rudder's TED Talk on the OkCupid algorithm. They invited classmates to try out how they matched on questions such as: "Do you like potatoes?" or "Do you like presents?" Standing in front of the two "users" who voluntarily participated in their algorithmic matchmaking, the experts calculated their score on a piece of paper. It was high. As the girls proclaimed the match percentage, the two prospective lovers looked at each other with an expression of "How can this be?", their faces turning slightly red. We do believe in numbers. A split-second later, one of them angrily shouted: "How did you calculate this? Hand over this sheet!"
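For the curious, the scheme Rudder describes can be played out in a few lines of Python. A minimal sketch, assuming the importance weights from his talk; the questions, answers, and preferences below are illustrative, not the pupils' actual paper calculation:

    import math

    # Importance weights roughly as Rudder describes them (assumed values).
    WEIGHTS = {"irrelevant": 0, "a little": 1, "somewhat": 10,
               "very": 50, "mandatory": 250}

    def satisfaction(answers, prefs_of_other):
        """Weighted fraction of the other user's preferences that `answers` meet."""
        earned, possible = 0, 0
        for question, (acceptable, importance) in prefs_of_other.items():
            weight = WEIGHTS[importance]
            possible += weight
            if answers.get(question) in acceptable:
                earned += weight
        return earned / possible if possible else 0.0

    def match_percentage(a_answers, a_prefs, b_answers, b_prefs):
        # Geometric mean: both sides must be satisfied for a high score.
        return 100 * math.sqrt(satisfaction(a_answers, b_prefs)
                               * satisfaction(b_answers, a_prefs))

    a_answers = {"Do you like potatoes?": "yes", "Do you like presents?": "yes"}
    b_answers = {"Do you like potatoes?": "yes", "Do you like presents?": "yes"}
    a_prefs = {"Do you like potatoes?": ({"yes"}, "somewhat"),
               "Do you like presents?": ({"yes"}, "very")}
    b_prefs = {"Do you like potatoes?": ({"yes"}, "very"),
               "Do you like presents?": ({"yes"}, "somewhat")}

    print(f"{match_percentage(a_answers, a_prefs, b_answers, b_prefs):.0f}% match")

The geometric mean is the telling design choice: a high score requires both sides to satisfy each other's preferences, which is part of what lent the girls' public verdict its weight.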
The key takeaway here was: immediacy created a marked shift in our social dynamics. Pupils cared a great deal about their private relationships and personal choices, and acted upon the threats they could clearly perceive in the aforementioned scenarios. Classmates or teachers could be identified with their products and held directly responsible for them, and were within reach, or at least within shouting distance.
Technology appears to operate outside conscious awareness most of the time. The effects of its use are not directly perceptible, its makers unknown. When technology operates outside consciousness, the forming and organizing of meaning, and with it a sense of choice, are absent. It is not possible to take any action without motivation. Stated differently: when technological systems are our natural environment, it is not possible to think outside the box, as clear concepts for doing so are missing.
What makes them a natural environment?
First, scientific and engineering excellence does what it does and irons out nature's resistance through abstract modeling and basal design. Everyone can use a computer; it is easy. And that is great. Yet smooth functioning renders thorough understanding unnecessary.
Second, tech use shapes language. Everyday speech is full of (nonsense) metaphors such as "cloud" for other people's computers, and terms like "complexity" or "emergence" used as pseudo-scientific filler words for complicated circumstances. This makes tech feel familiar in all kinds of human matters and obfuscates its impact.
Third, the cybernetic idea that "the world is information", with its embedded belief in total wholeness, is deeply soothing, almost anesthetic, in the face of the existential dread of life's inherent unpredictable strangeness. That eases the emotional acceptance of technology.
In other words: assimilation is less effort than accommodation. The cost of this fleeting sensation of control is the self-inflicted subjugation of the subjective experience of self and Other to abstract generalizations in technical models (which in turn shape perception).
In the hope of being recognized, we conform to defined interfaces that strip away analog diversity, and have become what Günther Anders calls "mass soloists."
Lastly, the narrative of the computer as a problem-solving super-brain inspires awe. And: "...the conjuration of spirits avails nothing unless accompanied by belief..." (Freud, 1919, p. 140)
Efficient tech becomes nontransparent and hidden. And cybernetic feedback loops are designed to fend off disturbance: that is their purpose. Conversely, disturbance makes technological systems immediately palpable. When I miss a train because of an app, I get upset and ashamed. At the same time, the technological infrastructure becomes apparent to an extent I had not formerly grasped. I can imagine the impact it could potentially have. Imagination is key to understanding what can in reality be produced with a technological system.
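To make the cybernetic claim concrete, here is a minimal sketch of such a loop: an invented thermostat with made-up numbers, not any particular system.

    # A negative-feedback loop absorbing a constant disturbance.
    target = 21.0        # desired room temperature (deg C)
    temperature = 15.0   # current temperature
    heat_loss = 0.5      # the disturbance: heat lost per tick
    gain = 0.8           # proportional controller gain

    for tick in range(30):
        error = target - temperature        # measure the deviation
        heating = gain * error              # counteract it proportionally
        temperature += heating - heat_loss  # the disturbance is absorbed

    # A purely proportional loop settles slightly below target
    # (steady-state error), but the disturbance itself never shows.
    print(f"after 30 ticks: {temperature:.2f} deg C")

As long as the loop works, the disturbance never surfaces in the output; only when it breaks does the infrastructure become palpable.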
Hacking is the central vehicle of imagination for lifting technology out of its ordinary invisibility. Hacking strives for immediacy. It makes technology and its effects visible (again). I very much enjoy it when kids tell me they want to learn how to hack: hacking is getting to know a technological system and trying to understand how it works and how it integrates into one's Lebenswelt.
It is an endeavor to uncover immediate creative uses of technology, not necessarily aligned with insinuated, often consumerist, purposes. Motivated by the desire to hack, it is possible, for example, to find out which systems are open and accessible, and which are mere bricks of rare, unfairly traded metals. Don't underestimate kids. If they find cracks of possibility, they lean in: they want to explore and learn. So let's take our time to let them.
Finally, there are people deeper into tech than others - like us informaticians and hackers - and in a democratic society, where we all have our share, it is their - our - responsibility to provide safe systems for less informed participants, to open up spaces for discourse and education, and to consider those who have no voice to speak for themselves. These are not my personal morals, but core democratic values.
"It is a widely held but a grievously mistaken belief that civil courage finds exercise only in the context of world-shaking events. To the contrary, its most arduous exercise is often in those small contexts in which the challenge is to overcome the fears induced by petty concerns over career, over our relationships to those who appear to have power over us, over whatever may disturb the tranquility of our mundane existence." (Weizenbaum 1976, p. 276)
In that sense: Let's hack on.
If you're interested in reading more:
This article is inspired by the critique of technoscience put forward by Gadamer (as collected in Marino 2011), and further by:
- The technology critique by Anders (1956/2018a and 1980/2018b), especially his writings on Promethean shame, reprinted in English and succinctly interpreted by Müller (2016).
- Weizenbaum's take on computer power and human reason (1976).
- The Frankfurt School's critique of instrumental reason (Horkheimer and Adorno 1947/2002).
- Habermas' (1983) discourse ethics.
- Freud's comments on magical thinking (Chapter 3 of Totem and Taboo, 1919).
- Piaget's (1971) work on cognitive development.
- Gadamer's (1960/1989) and Benjamin's (1988) reflections on the Other.
- Dickel's (2023) critique of systems theory.
- Pias's (2004) historical exploration of cybernetics.
- Turner's (2008) account of how the computer became personal.
The term "technological unconscious" appears to be coined by Thrift (2004), but also Star (1999) and Latour (1999) discuss similar ideas.
References
Anders, G. (2018a). Die Antiquiertheit des Menschen Bd. I: Über die Seele im Zeitalter der zweiten industriellen Revolution (4th ed.). C.H. Beck. (Original work published 1956)
Anders, G. (2018b). Die Antiquiertheit des Menschen Bd. II: Über die Zerstörung des Lebens im Zeitalter der dritten industriellen Revolution (5th ed.). C.H. Beck. (Original work published 1980)
Benjamin, J. (1988). The Bonds of Love: Psychoanalysis, Feminism, and the Problem of Domination. Pantheon Books.
Dickel, S. (2023). Der kybernetische Blick und seine Grenzen: Zur systemtheoretischen Selbstbeschreibung der digitalen Gesellschaft. Berliner Journal für Soziologie, 33, 197-226.
Freud, S. (1919). Totem and Taboo. Moffat, Yard & Company.
Gadamer, H.-G. (1989). Truth and Method. Continuum. (Original work published 1960)
Habermas, J. (1983). Diskursethik: Notizen zu einem Begründungsprogramm. In Die Herausforderung des Rechts durch die Moral (pp. 78-88). Suhrkamp.
Horkheimer, M., & Adorno, T. W. (2002). Dialectic of Enlightenment: Philosophical Fragments (E. Jephcott, Trans.). Stanford University Press. (Original work published 1947)
Latour, B. (1999). Pandora's Hope: Essays on the Reality of Science Studies. Harvard University Press.
Marino, S. (2011). Gadamer and the Limits of the Modern Techno-Scientific Civilization. Peter Lang CH.
Müller, C. J. (2016). Prometheanism: Technology, Digital Culture and Human Obsolescence (C. J. Müller, Trans.). Rowman & Littlefield International.
Piaget, J. (1971). The Theory of Stages in Cognitive Development. In D. R. Green, M. P. Ford, & G. B. Flamer (Eds.), Measurement and Piaget (pp. 1-11). McGraw-Hill.
Pias, C. (2004). Zeit der Kybernetik - Eine Einstimmung. In C. Pias (Ed.), Cybernetics/Kybernetik: Die Macy-Konferenzen 1946-1953 (Vol. 2, pp. 9-...). Diaphanes.
Star, S. L. (1999). The Ethnography of Infrastructure. American Behavioral Scientist, 43(3), 377-391.
Thrift, N. (2004). Remembering the Technological Unconscious by Foregrounding Knowledges of Position. Environment and Planning D: Society and Space, 22(1), 175-190.
Turner, F. (2008). From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press.
Weizenbaum, J. (1976). Computer Power and Human Reason: From Judgment to Calculation. W.H. Freeman.