The Absolute Otherness of Authentic Human Identity as a Ghost in a Machine by Feride Zeynep Güder

04.12.2024

With the development of invasive neurotechnologies such as human enhancement technologies, the question of how to escape algorithmic surveillance emerges as a significant concern for humanity. The challenges posed by today's rapid digitization are deeply intertwined with algorithmic culture, producing crises marked by new forms of data surveillance and mental intrusion. New developments in AI and neuroscience pose profound challenges to human cognition and identity. As technology opens pathways into the human brain, the true nature of these technologies is increasingly being questioned. The digitalization of the human mind and nervous system could pave the way for dystopian scenarios that diminish anthropocentric paradigms and allow a new elite class to wield power through algorithmic governance.

Against the background of these paradigmatic shifts, this study examines techno-philosophical arguments about the absolute otherness of authentic human identity and the nature of soul and consciousness in the context of transhumanism, highlighting the tension between cybernetic forms and human essence.

In this vein, philosophers of technology have developed different perspectives on the relationship between technology and human beings. The study reviews these rich arguments, beginning with Descartes' soul-body dualism and Ryle's critique of it, and extends to Rouvroy's concept of algorithmic governance, Gille's account of the speed of innovation, and Günther Anders' technological critique.

Insights from neuropsychological studies can illuminate how human interaction evolves as technology permeates cognitive processes. The key challenge of human enhancement technologies is their impact on the human body and cognitive systems, potentially marking a pivotal point in the digitalization of human neurology.

As the study explores 'authenticity' and the notion of 'otherness' within a digital collective consciousness, Levinas' idea of otherness serves to articulate the discussion. On the digitalization of consciousness, Bernard Stiegler's notion of 'epiphylogenesis' is read alongside the Heideggerian concept of 'Dasein'.

The perception of space that transcends human cognition and soul in a cybernetic body is explained through the Foucauldian concept of heterotopia. In his essay "Of Other Spaces", Foucault suggests that subjects do not exist in homogeneous environments but inhabit different perceptions of space. Among millions of digitized artificial consciousnesses, individual consciousnesses lose their unity and become entangled in a collective 'hive mind'. Collective consciousness and cognition across multiple spaces lead to the idea of 'heterocognition', developed from Foucault's terminology of 'heterotopia'. Here another crucial question arises: where can consciousness escape the pervasive algorithmic surveillance of these collective minds?

Within this philosophical framework, the existential challenges of heterocognition shape the sense of freedom, identity, and the intrusion of external desires. As a case study, the analysis turns to Mira, the main character of the film Ghost in the Shell, who embodies the tension between her cybernetic form and her human essence. As a ghost in a machine, Mira raises profound questions about the nature of the soul and consciousness, questions that transcend a simplistic Cartesian dualism and highlight the cultural nuances inherent in the work's original context.

The study draws on Nietzsche's words that "we have seen nothing yet" to synthesize the ongoing discourse of human otherness in the context of evolving technologies. This Nietzschean perspective invites further reflection on the implications of algorithmic governance and on possible futures of human existence in an increasingly automated world: the authenticity of the human mind, with its unique cognition and intricate nervous system, may open up alternative scenarios that resist anti-anthropocentric paradigms and prevent a new elite class from wielding power through algorithmic governance.